[1pt] PR: Retrieve code to process NWS AHPS benchmarks for Alaska #1185
Conversation
@mluck Can you confirm that by having the AHPS benchmark datasets for SXRA2 and SKLA2, we can now include these two sites in the list of sites in
@RyanSpies-NOAA The new outputs of the script for the two sites in Alaska are here: Note that the new outputs do not include your manual addition of upstream/downstream feature IDs as prepared in the previous version. I just created this new version to confirm the changes to the code; if you approve it, we can go ahead and use your manually revised dataset for prod.
Reviewed changes and completed code testing.
This PR brings back the preprocess_ahps_nws.py code to FIM4 and generates new AHPS benchmark datasets for sites SXRA2 and SKLA2 in Alaska. The new AHPS benchmark datasets are available on dev1 here: "/dev_fim_share/foss_fim/outputs/ali_ahps_alaska/AHPS_Results_Alaska/19020302/". This PR closes issue #1130.
To process a new station, follow these steps:
- Include the new station in the /data/inputs/ahps_sites/evaluated_ahps_sites.csv file.

Note that for the "SKLA2" site, the downloaded ESRI-GDB grid files had a datum issue. This issue was manually corrected during the conversion of the GDB files into TIFF files (a hedged sketch of this kind of CRS fix follows after these notes).
Also, a manual modification of the outputs was needed for these Alaska sites. The WRDS API had issues retrieving the upstream/downstream feature IDs of the gages for these sites (used, for example, in the "ahps_skla2_huc_19020302_flows_major.csv" file), so this part was applied manually using QGIS.
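As a minimal sketch of the datum correction mentioned above, assuming the depth grids have already been exported to GeoTIFF, the coordinate values themselves are correct, and only the recorded datum/CRS is wrong (the file name below is hypothetical; this is not the exact code used in the PR):

```python
# Hedged sketch: overwrite the incorrectly recorded CRS on a converted depth grid,
# assuming the target projection is Alaska Albers (EPSG:3338).
import rasterio
from rasterio.crs import CRS

src_tif = "skla2/depth_grid/elev_major.tif"  # hypothetical path

with rasterio.open(src_tif, "r+") as ds:
    # Replace the CRS/datum tag written during GDB-to-TIFF conversion.
    ds.crs = CRS.from_epsg(3338)
```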
Additions
data/nws/preprocess_ahps_nws.py ... retrieved from previous versions of FIM and updated for shapely v2

Changes
tools/tools_shared_functions.py ... updated for shapely v2 (a hedged illustration of this kind of shapely 2 update is sketched below)
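The sketch below is an assumption about the kind of change a shapely 1.x to 2.x update typically involves; it is not the actual diff in this PR:

```python
# Illustrative only: common shapely 1.x -> 2.x adjustments.
from shapely.geometry import MultiPolygon, Polygon
from shapely.ops import unary_union

parts = MultiPolygon([
    Polygon([(0, 0), (1, 0), (1, 1)]),
    Polygon([(2, 2), (3, 2), (3, 3)]),
])

# shapely 1.x allowed iterating a multi-part geometry directly;
# shapely 2.x requires the .geoms accessor.
areas = [p.area for p in parts.geoms]

# shapely 2.x deprecates cascaded_union in favor of unary_union.
merged = unary_union(list(parts.geoms))
```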
Testing
The new code was tested successfully for sites SXRA2 and SKLA2 in Alaska. The code was run with the following inputs:
- inputPath: path to a directory that includes the subdirectories "sxra2" and "skla2". Each of these subdirectories should contain a "depth_grid" folder with TIFF files as described above.
- referenceRaster: path to an arbitrary output TIFF file from a FIM run for an Alaska HUC. Note that for a site in CONUS, this referenceRaster must be from a FIM run for a CONUS HUC (a hedged sketch of how such a reference raster can be used follows below).

Below is a comparison for the site "sxra2" between the generated "major" (stage = 14 ft) benchmark TIFF file (EPSG:3338) and the corresponding plot available on the water.noaa.gov website.
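A minimal sketch of one way a reference raster can be used, assuming it simply supplies the target CRS for the benchmark grids; this is an assumption about the intent of the parameter, not necessarily what preprocess_ahps_nws.py does internally, and the file paths are hypothetical:

```python
# Hedged sketch: reproject a benchmark depth grid onto the CRS of a FIM output raster.
import rasterio
from rasterio.warp import Resampling, calculate_default_transform, reproject

benchmark_tif = "sxra2/depth_grid/elev_major.tif"  # hypothetical paths
reference_tif = "fim_output_alaska_huc.tif"
out_tif = "ahps_sxra2_major_extent.tif"

# Read the target CRS from the reference FIM raster (e.g. EPSG:3338 for Alaska).
with rasterio.open(reference_tif) as ref:
    dst_crs = ref.crs

with rasterio.open(benchmark_tif) as src:
    transform, width, height = calculate_default_transform(
        src.crs, dst_crs, src.width, src.height, *src.bounds
    )
    profile = src.profile.copy()
    profile.update(crs=dst_crs, transform=transform, width=width, height=height)

    with rasterio.open(out_tif, "w", **profile) as dst:
        reproject(
            source=rasterio.band(src, 1),
            destination=rasterio.band(dst, 1),
            src_transform=src.transform,
            src_crs=src.crs,
            dst_transform=transform,
            dst_crs=dst_crs,
            resampling=Resampling.nearest,
        )
```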
Deployment Plan (For developer use)
How do the changes affect the product?
Issuer Checklist (For developer use)
You may update this checklist before and/or after creating the PR. If you're unsure about any of them, please ask; we're here to help! These items are what we are going to look for before merging your code.
- The PR title uses the format: [_pt] PR: <description>
- If submitting a PR to the dev branch (the default branch), you have a descriptive Feature Branch name using the format: dev-<description-of-change> (e.g. dev-revise-levee-masking)
- The feature branch is up to date with the dev branch
- pre-commit hooks were run locally
- Unit tests pass (from /foss_fim/, run: pytest unit_tests/)
- The version number follows the 4.x.x.x convention
Merge Checklist (For Technical Lead use only)
Data Copy Checklist (For Technical Lead use only)
@RobHanna-NOAA
/test_cases/nws_test_cases/validation_data_nws/19020302/)

Also: