v4.5.4.2 Updating bathymetry-preprocessing script (#1125)
hhs732 authored Aug 2, 2024
1 parent fa670d4 commit 3423757
Showing 2 changed files with 24 additions and 2 deletions.
15 changes: 14 additions & 1 deletion data/bathymetry/preprocess_bathymetry.py
@@ -52,6 +52,7 @@ def preprocessing_ehydro(tif, bathy_bounds, survey_gdb, output, min_depth_thresh
bathy_affine = bathy_ft.transform
bathy_ft = bathy_ft.read(1)
bathy_ft[np.where(bathy_ft == -9999.0)] = np.nan
bathy_ft[np.where(bathy_ft <= 0.0)] = 0.000001
survey_min_depth = np.nanmin(bathy_ft)

assert survey_min_depth < min_depth_threshold, (
@@ -81,6 +82,10 @@ def preprocessing_ehydro(tif, bathy_bounds, survey_gdb, output, min_depth_thresh
zs_area = zs_area.set_crs(nwm_streams.crs)
zs_area = zs_area.rename(columns={"sum": "missing_volume_m3"})

# print("------------------------------")
# print(zs_area.ID)
# print("------------------------------")

# Derive slope tif
output_slope_tif = os.path.join(os.path.dirname(tif), 'bathy_slope.tif')
slope_tif = gdal.DEMProcessing(output_slope_tif, bathy_gdal, 'slope', format='GTiff')
@@ -115,14 +120,22 @@ def preprocessing_ehydro(tif, bathy_bounds, survey_gdb, output, min_depth_thresh
)

# Add survey meta data
bathy_nwm_streams['SurveyDateStamp'] = bathy_bounds.loc[0, 'SurveyDateStamp']
time_stamp = bathy_bounds.loc[0, 'SurveyDateStamp']
time_stamp_obj = str(time_stamp)

bathy_nwm_streams['SurveyDateStamp'] = time_stamp_obj # bathy_bounds.loc[0, 'SurveyDateStamp']
bathy_nwm_streams['SurveyId'] = bathy_bounds.loc[0, 'SurveyId']
bathy_nwm_streams['Sheet_Name'] = bathy_bounds.loc[0, 'Sheet_Name']
bathy_nwm_streams["Bathymetry_source"] = 'USACE eHydro'

# Export geopackage with bathymetry
num_streams = len(bathy_nwm_streams)
bathy_nwm_streams = bathy_nwm_streams.to_crs(epsg=5070)

# schema = gpd.io.file.infer_schema(bathy_nwm_streams)
# print(schema)
# print("---------------------------")

if os.path.exists(output):
print(f"{output} already exists. Concatinating now...")
existing_bathy_file = gpd.read_file(output, engine="pyogrio", use_arrow=True)
11 changes: 10 additions & 1 deletion docs/CHANGELOG.md
@@ -1,11 +1,20 @@
All notable changes to this project will be documented in this file.
We follow the [Semantic Versioning 2.0.0](http://semver.org/) format.

## v4.5.4.2 - 2024-08-02 - [PR#1125](https://github.com/NOAA-OWP/inundation-mapping/pull/1125)

This PR updates `preprocess_bathymetry.py` to address three issues: 1) preprocessing SurveyJobs that have negative depth values, 2) changing the SurveyDateStamp format, and 3) including multiple SurveyJobs for a single NWM feature-id when needed.
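
A minimal, self-contained sketch of the negative-depth handling shown in the diff above; the toy array below is illustrative only, whereas the script reads these values from the eHydro survey raster:

```python
import numpy as np

# Toy stand-in for the survey raster band read by preprocess_bathymetry.py.
bathy_ft = np.array([[-9999.0, -0.5, 2.3],
                     [0.0, 1.1, -9999.0]])

bathy_ft[np.where(bathy_ft == -9999.0)] = np.nan   # nodata -> NaN
bathy_ft[np.where(bathy_ft <= 0.0)] = 0.000001     # clamp zero/negative depths
survey_min_depth = np.nanmin(bathy_ft)
print(survey_min_depth)  # 1e-06
```

Clamping zero and negative survey values to a tiny positive depth presumably keeps the subsequent minimum-depth assertion and volume calculations from failing on SurveyJobs with negative depths.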

### Changes
`data/bathymetry/preprocess_bathymetry.py`: Addresses the three issues above: preprocessing SurveyJobs with negative depth values, storing SurveyDateStamp as a string, and concatenating new surveys onto an existing output geopackage so one NWM feature-id can include multiple SurveyJobs (see the sketch below).
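
A minimal sketch of the other two changes, the SurveyDateStamp stringification and the append-to-existing-output path; the GeoDataFrames and column values below are hypothetical stand-ins for the script's `bathy_nwm_streams` and `existing_bathy_file`:

```python
import geopandas as gpd
import pandas as pd
from shapely.geometry import Point

# Hypothetical stand-ins for the geopackage already on disk and the newly
# processed survey rows.
existing_bathy_file = gpd.GeoDataFrame(
    {"SurveyId": ["A1"], "geometry": [Point(0, 0)]}, crs="EPSG:5070"
)
bathy_nwm_streams = gpd.GeoDataFrame(
    {"SurveyId": ["B2"], "geometry": [Point(1, 1)]}, crs="EPSG:5070"
)

# Store the survey date as a plain string rather than a Timestamp object.
bathy_nwm_streams["SurveyDateStamp"] = str(pd.Timestamp("2024-08-02"))

# Concatenate instead of overwriting, so a single NWM feature-id can
# accumulate multiple SurveyJobs across runs.
bathy_nwm_streams = pd.concat([existing_bathy_file, bathy_nwm_streams])
print(bathy_nwm_streams)
```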


<br/><br/>

## v4.5.4.1 - 2024-08-02 - [PR#1185](https://github.com/NOAA-OWP/inundation-mapping/pull/1185)

This PR brings back the `preprocess_ahps_nws.py` code to FIM4 and generates new AHPS benchmark datasets for sites SXRA2 and SKLA2 in Alaska. The new AHPS benchmark datasets are available on dev1 here: "/dev_fim_share/foss_fim/outputs/ali_ahps_alaska/AHPS_Results_Alaska/19020302/"

This PR closes issue #1130.

To process a new station, follow these steps:

