merging with the current dev branch.
Merge remote-tracking branch 'origin/dev' into dev-update-aep
ZahraGhahremani-NOAA committed Jul 23, 2024
2 parents 548e060 + 25135e0 commit 004af73
Showing 15 changed files with 330 additions and 54 deletions.
2 changes: 1 addition & 1 deletion config/deny_branch_zero.lst
@@ -56,7 +56,7 @@ slopes_d8_dem_meters_{}.tif
slopes_d8_dem_meters_masked_{}.tif
sn_catchments_reaches_{}.tif
src_{}.json
src_base_{}.csv
#src_base_{}.csv
#src_full_crosswalked_{}.csv
stage_{}.txt
streamOrder_{}.tif
2 changes: 1 addition & 1 deletion config/deny_branches.lst
@@ -58,7 +58,7 @@ slopes_d8_dem_meters_{}.tif
slopes_d8_dem_meters_masked_{}.tif
sn_catchments_reaches_{}.tif
src_{}.json
src_base_{}.csv
# src_base_{}.csv
# src_full_crosswalked_{}.csv
stage_{}.txt
streamOrder_{}.tif
70 changes: 68 additions & 2 deletions docs/CHANGELOG.md
@@ -11,6 +11,74 @@ This PR updates scripts that use the recurrence flow files. The new flow files (
- `src/src_adjust_ras2fim_rating.py`: The 100-year recurrence was removed since it is not included in the new AEP.
- `src/src_adjust_usgs_rating_trace.py`: The 100-year recurrence was removed since it is not included in the new AEP.
- `tools/rating_curve_comparison.py`: The 100-year recurrence was removed since it is not included in the new AEP. Also, the name of the recurrence flow CSV file was updated.
## v4.5.2.11 - 2024-07-19 - [PR#1222](https://github.com/NOAA-OWP/inundation-mapping/pull/1222)

We are having problems with the overall duration of post-processing taking a long time. This new system captures duration times for each module/section inside fim_post_processing.sh and records them to a file in the output directory. It records them as it progresses and will also help us learn whether fim_post_processing.sh stopped along the way.

Note: When used in code, we call the `Set_log_file_path` shell function with a file name and path (no validation is done at this time). Then, each time a person wants to print to both the screen and the log file, they use the `l_echo` command instead of the native `echo` command. If the log file has not been set, the output will continue to go to the screen, just not the log file.
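
As an illustration, a minimal sketch of how this pair of helpers could behave (hypothetical implementation; the real definitions live in `src/bash_functions.env`):

```bash
#!/bin/bash
# Hypothetical sketch only -- not the actual bash_functions.env code.
# Set_log_file_path remembers where the log lives; l_echo prints to the
# screen and, when a log file has been set, appends the same line to it.

Set_log_file_path () {
    log_file_path=$1                      # no validation, per the note above
    mkdir -p "$(dirname "$log_file_path")"
    touch "$log_file_path"
}

l_echo () {
    echo -e "$1"                          # always goes to the screen
    if [ -n "$log_file_path" ]; then
        echo -e "$1" >> "$log_file_path"  # and to the log when it is set
    fi
}

# Usage mirrors fim_post_processing.sh:
#   Set_log_file_path "$outputDestDir/post_proc.log"
#   l_echo "---- Start of fim_post_processing"
```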

### Changes
- `fim_pipeline.sh`: A couple of minor text output changes.
- `fim_post_processing.sh`: As described above.
- `src/bash_functions.env`: New functions and adjustments to support the new logging system.

<br/><br/>


## v4.5.2.10 - 2024-07-19 - [PR#1224](https://github.com/NOAA-OWP/inundation-mapping/pull/1224)

Addresses warnings in order to reduce the volume of output messages.

### Changes

- `src/`
  - `adjust_thalweg_lateral.py`: fixes a number type
  - `delineate_hydros_and_produce_HAND.sh`: removes a division-by-zero warning
  - `getRasterInfoNative.py`: adds `gdal.UseExceptions()`

<br/><br/>


## v4.5.2.9 - 2024-07-19 - [PR#1216](https://github.com/NOAA-OWP/inundation-mapping/pull/1216)

Adds `NO_VALID_CROSSWALKS` to `FIM_exit_codes`, which is used when the crosswalk table or the output_catchments DataFrame is empty. Removes branches that fail with `NO_VALID_CROSSWALKS`.

### Changes
- `add_crosswalk.py`: Added `NO_VALID_CROSSWALKS` as exit status when crosswalk or output_catchments is empty
- `process_branch.sh`: Removed branches that fail with `NO_VALID_CROSSWALKS`
- `utils/fim_enums.py`: Added `NO_VALID_CROSSWALKS` to `FIM_exit_codes`
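
For orientation, a sketch of the pattern (the member's numeric value below is purely illustrative; see `utils/fim_enums.py` for the real one):

```python
import sys
from enum import Enum


class FIM_exit_codes(Enum):
    # Existing members omitted; 62 is an illustrative placeholder value.
    NO_VALID_CROSSWALKS = 62


def exit_if_empty(df):
    """Exit with the special status so process_branch.sh can recognise and drop the branch."""
    if df.empty:
        print("No relevant streams within HUC boundaries.")
        sys.exit(FIM_exit_codes.NO_VALID_CROSSWALKS.value)
```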

<br/><br/>


## v4.5.2.8 - 2024-07-19 - [PR#1219](https://github.com/NOAA-OWP/inundation-mapping/pull/1219)

Changes non-fatal `ERROR` messages to `WARNING` so that they are not logged as errors.

### Changes

- `src/`
- `bathymetric_adjustment.py`: Changes `WARNING` to `ERROR` in Exception
- `src_roughness_optimization.py`: Changes `ERROR` messages to `WARNING`

<br/><br/>

## v4.5.2.7 - 2024-07-19 - [PR#1220](https://github.com/NOAA-OWP/inundation-mapping/pull/1220)

With this PR, we can run fim_post_processing.sh multiple times on a processed batch without any concern that it may change the hydroTable or src_full_crosswalked files.

### Additions

- `src/update_htable_src.py`

### Changes

- `config/deny_branch_zero.lst`
- `config/deny_branches.lst`
- `fim_post_processing.sh`

<br/><br/>

## v4.5.2.6 - 2024-07-12 - [PR#1184](https://github.com/NOAA-OWP/inundation-mapping/pull/1184)

This PR adds a new script to determine which bridges are inundated by a specific flow. It will assign a risk status to each bridge point based on a specific threshold.
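
As a rough illustration of the kind of logic involved (function, column, and parameter names here are assumptions, not the actual script):

```python
import geopandas as gpd
import rasterio


def flag_bridge_risk(bridge_points_file, inundation_depth_raster, threshold_m=0.5):
    """Sample an inundation depth raster at each bridge point and assign a risk status."""
    bridges = gpd.read_file(bridge_points_file)
    with rasterio.open(inundation_depth_raster) as src:
        coords = [(pt.x, pt.y) for pt in bridges.geometry]
        depths = [float(val[0]) for val in src.sample(coords)]
    bridges["inundation_depth_m"] = depths
    bridges["risk_status"] = ["at risk" if d > threshold_m else "not at risk" for d in depths]
    return bridges
```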
@@ -21,7 +89,6 @@ This PR adds a new script to determine which bridges are inundated by a specific

<br/><br/>


## v4.5.2.5 - 2024-07-08 - [PR#1205](https://github.com/NOAA-OWP/inundation-mapping/pull/1205)

Snaps the crosswalk from the midpoint of DEM-derived reaches to the nearest point on NWM streams within a threshold of 100 meters. DEM-derived streams that do not have any NWM streams within 100 meters of their midpoints are removed from the FIM hydrofabric, and their catchments are not inundated.
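
A simplified sketch of midpoint snapping with a distance cutoff (illustrative only; names and structure are assumptions):

```python
from shapely.ops import nearest_points


def snap_midpoint_to_nwm(reach_midpoint, nwm_streams_gdf, max_dist_m=100.0):
    """Return the nearest point on any NWM stream, or None when every stream is too far away."""
    distances = nwm_streams_gdf.distance(reach_midpoint)
    nearest_idx = distances.idxmin()
    if distances[nearest_idx] > max_dist_m:
        # Reach is dropped from the hydrofabric; its catchment will not be inundated.
        return None
    nwm_geom = nwm_streams_gdf.loc[nearest_idx].geometry
    # nearest_points returns (point on first geometry, point on second geometry).
    return nearest_points(reach_midpoint, nwm_geom)[1]
```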
@@ -32,7 +99,6 @@ Snaps crosswalk from the midpoint of DEM-derived reaches to the nearest point on

<br/><br/>


## v4.5.2.4 - 2024-07-08 - [PR#1204](https://github.com/NOAA-OWP/inundation-mapping/pull/1204)

Bug fix for extending outlets in order to ensure proper flow direction in the depression-filling algorithm. This PR adds a distance criterion: in order for the end of an outlet stream to be snapped to the wbd_buffered boundary, the end point must be less than 100 meters from the WBD boundary.
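
Conceptually, the check is of this form (a sketch under assumed names, not the actual implementation):

```python
from shapely.ops import nearest_points


def maybe_snap_outlet_end(outlet_end_point, wbd_buffered_boundary, max_snap_dist_m=100.0):
    """Snap the end of an outlet stream to the buffered WBD boundary only when it is close enough."""
    if outlet_end_point.distance(wbd_buffered_boundary) < max_snap_dist_m:
        return nearest_points(outlet_end_point, wbd_buffered_boundary)[1]
    return outlet_end_point  # too far away: leave the outlet end where it is
```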
5 changes: 4 additions & 1 deletion fim_pipeline.sh
@@ -116,6 +116,7 @@ echo
echo "---- Unit (HUC) processing is complete"
date -u
Calc_Duration $pipeline_start_time
echo "---------------------------------------------------"

## POST PROCESSING

@@ -126,8 +127,10 @@ rm -d $workDir/$runName
. $projectDir/fim_post_processing.sh -n $runName -j $jobMaxLimit

echo
echo "======================== End of fim_pipeline.sh =========================="

echo "======================== End of fim_pipeline for $runName =========="
date -u
echo "Total Duration is ..."
Calc_Duration $pipeline_start_time
echo

120 changes: 86 additions & 34 deletions fim_post_processing.sh
@@ -73,6 +73,7 @@ if [ "$jobLimit" = "" ]; then jobLimit=1; fi
rm -rdf $outputDestDir/logs/src_optimization
rm -f $outputDestDir/logs/log_bankfull_indentify.log
rm -f $outputDestDir/logs/subdiv_src_.log
rm -f $log_file_name

# load up environmental information
args_file=$outputDestDir/runtime_args.env
@@ -83,33 +84,85 @@ source $outputDestDir/params.env
source $srcDir/bash_functions.env
source $srcDir/bash_variables.env

echo
# Tell the system the name and location of the post processing log
log_file_name=$outputDestDir/post_proc.log
Set_log_file_path $log_file_name

l_echo ""
echo "++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++"
echo "---- Start of fim_post_processing"
echo "---- Started: `date -u`"
l_echo "---- Start of fim_post_processing"
l_echo "---- Started: `date -u`"
T_total_start
post_proc_start_time=`date +%s`

## RUN UPDATE HYDROTABLE AND SRC ##
# Define the counter file

Tstart
COUNTER_FILE="${outputDestDir}/post_processing_attempt.txt"
# Function to clean up
cleanup() {
if [ "$SUCCESS" = true ]; then
if [ -f "$COUNTER_FILE" ]; then
COUNTER=$(cat "$COUNTER_FILE")
if [ "$COUNTER" -eq 1 ]; then
l_echo "Counter is 1. Removing the counter file."
rm "$COUNTER_FILE"
fi
fi
fi
}

# Set up trap to call cleanup on EXIT, ERR, and INT (interrupt signal)
trap cleanup EXIT ERR INT
# Initialize the counter file if it doesn't exist
if [ ! -f "$COUNTER_FILE" ]; then
echo 0 > "$COUNTER_FILE"
fi

# Read the current counter value
COUNTER=$(cat "$COUNTER_FILE")

# Increment the counter
COUNTER=$((COUNTER + 1))

# Save the new counter value
l_echo "$COUNTER" > "$COUNTER_FILE"

# Check if the counter is greater than one
if [ "$COUNTER" -gt 1 ]; then
# Execute the Python file
l_echo "Updating hydroTable & scr_full_crosswalked for branches"
python3 $srcDir/update_htable_src.py -d $outputDestDir
else
l_echo "Execution count is $COUNTER, not executing the update_htable_src.py file."
fi
Tcount


## AGGREGATE BRANCH LISTS INTO ONE ##
echo -e $startDiv"Start branch aggregation"
l_echo $startDiv"Start branch aggregation"
Tstart
python3 $srcDir/aggregate_branch_lists.py -d $outputDestDir -f "branch_ids.csv" -o $fim_inputs
Tcount

## GET NON ZERO EXIT CODES FOR BRANCHES ##
echo -e $startDiv"Start non-zero exit code checking"
l_echo $startDiv"Start non-zero exit code checking"
find $outputDestDir/logs/branch -name "*_branch_*.log" -type f | \
xargs grep -E "Exit status: ([1-9][0-9]{0,2})" > \
"$outputDestDir/branch_errors/non_zero_exit_codes.log" &

## RUN AGGREGATE BRANCH ELEV TABLES ##
echo "Processing usgs & ras2fim elev table aggregation"
l_echo $startDiv"Processing usgs & ras2fim elev table aggregation"
Tstart
python3 $srcDir/aggregate_by_huc.py -fim $outputDestDir -i $fim_inputs -elev -ras -j $jobLimit
Tcount

## RUN BATHYMETRY ADJUSTMENT ROUTINE ##
if [ "$bathymetry_adjust" = "True" ]; then
echo -e $startDiv"Performing Bathymetry Adjustment routine"
# Run bathymetry adjustment routine
l_echo $startDiv"Performing Bathymetry Adjustment routine"
Tstart
# Run bathymetry adjustment routine
python3 $srcDir/bathymetric_adjustment.py \
-fim_dir $outputDestDir \
-bathy $bathymetry_file \
@@ -121,9 +174,9 @@ fi

## RUN SYNTHETIC RATING CURVE BANKFULL ESTIMATION ROUTINE ##
if [ "$src_bankfull_toggle" = "True" ]; then
echo -e $startDiv"Estimating bankfull stage in SRCs"
# Run SRC bankfull estimation routine routine
l_echo $startDiv"Estimating bankfull stage in SRCs"
Tstart
# Run SRC bankfull estimation routine routine
python3 $srcDir/identify_src_bankfull.py \
-fim_dir $outputDestDir \
-flows $bankfull_flows_file \
@@ -133,7 +186,7 @@ fi

## RUN SYNTHETIC RATING SUBDIVISION ROUTINE ##
if [ "$src_subdiv_toggle" = "True" ] && [ "$src_bankfull_toggle" = "True" ]; then
echo -e $startDiv"Performing SRC channel/overbank subdivision routine"
l_echo $startDiv"Performing SRC channel/overbank subdivision routine"
# Run SRC Subdivision & Variable Roughness routine
Tstart
python3 $srcDir/subdiv_chan_obank_src.py \
@@ -146,8 +199,7 @@ fi
## RUN SYNTHETIC RATING CURVE CALIBRATION W/ USGS GAGE RATING CURVES ##
if [ "$src_adjust_usgs" = "True" ] && [ "$src_subdiv_toggle" = "True" ] && [ "$skipcal" = "0" ]; then
Tstart
echo
echo -e $startDiv"Performing SRC adjustments using USGS rating curve database"
l_echo $startDiv"Performing SRC adjustments using USGS rating curve database"
# Run SRC Optimization routine using USGS rating curve data (WSE and flow @ NWM recur flow values)
python3 $srcDir/src_adjust_usgs_rating_trace.py \
-run_dir $outputDestDir \
@@ -161,8 +213,7 @@ fi
## RUN SYNTHETIC RATING CURVE CALIBRATION W/ RAS2FIM CROSS SECTION RATING CURVES ##
if [ "$src_adjust_ras2fim" = "True" ] && [ "$src_subdiv_toggle" = "True" ] && [ "$skipcal" = "0" ]; then
Tstart
echo
echo -e $startDiv"Performing SRC adjustments using ras2fim rating curve database"
l_echo $startDiv"Performing SRC adjustments using ras2fim rating curve database"
# Run SRC Optimization routine using ras2fim rating curve data (WSE and flow @ NWM recur flow values)
python3 $srcDir/src_adjust_ras2fim_rating.py \
-run_dir $outputDestDir \
@@ -176,16 +227,14 @@ fi
## RUN SYNTHETIC RATING CURVE CALIBRATION W/ BENCHMARK POINTS (.parquet files) ##
if [ "$src_adjust_spatial" = "True" ] && [ "$src_subdiv_toggle" = "True" ] && [ "$skipcal" = "0" ]; then
Tstart
echo
echo -e $startDiv"Performing SRC adjustments using benchmark point .parquet files"
l_echo $startDiv"Performing SRC adjustments using benchmark point .parquet files"
python3 $srcDir/src_adjust_spatial_obs.py -fim_dir $outputDestDir -j $jobLimit
Tcount
date -u
fi

## AGGREGATE BRANCH TABLES ##
echo
echo -e $startDiv"Aggregating branch hydrotables"
l_echo $startDiv"Aggregating branch hydrotables"
Tstart
python3 $srcDir/aggregate_by_huc.py \
-fim $outputDestDir \
@@ -198,8 +247,7 @@ date -u

## PERFORM MANUAL CALIBRATION
if [ "$manual_calb_toggle" = "True" ] && [ -f $man_calb_file ]; then
echo
echo -e $startDiv"Performing manual calibration"
l_echo $startDiv"Performing manual calibration"
Tstart
python3 $srcDir/src_manual_calibration.py \
-fim_dir $outputDestDir \
@@ -208,35 +256,39 @@ if [ "$manual_calb_toggle" = "True" ] && [ -f $man_calb_file ]; then
date -u
fi

echo
echo -e $startDiv"Combining crosswalk tables"
# aggregate outputs

l_echo $startDiv"Combining crosswalk tables"
Tstart
python3 $toolsDir/combine_crosswalk_tables.py \
-d $outputDestDir \
-o $outputDestDir/crosswalk_table.csv
Tcount
date -u

echo -e $startDiv"Resetting Permissions"

l_echo $startDiv"Resetting Permissions"
Tstart
find $outputDestDir -type d -exec chmod -R 777 {} +
find $outputDestDir -type f -exec chmod 777 {} + # just root level files
Tcount
date -u

echo
echo -e $startDiv"Scanning logs for errors. Results be saved in root not inside the log folder."

l_echo $startDiv"Scanning logs for errors and warnings. This can take quite a few minutes so stand by."
echo "Results will be saved in root not inside the log folder."
Tstart
# grep -H -r -i -n "error" $outputDestDir/logs/ > $outputDestDir/all_errors_from_logs.log
find $outputDestDir -type f | grep -H -r -i -n "error" $outputDestDir/logs/ > $outputDestDir/all_errors_from_logs.log
Tcount
date -u
find $outputDestDir -type f | grep -H -r -i -n "error" $outputDestDir/logs/ > \
$outputDestDir/all_errors_from_logs.log &
l_echo "error scan done, now on to warnings scan"

find $outputDestDir -type f | grep -H -r -i -n "warning" $outputDestDir/logs/ > \
$outputDestDir/all_warnings_from_logs.log &
l_echo "warning scan done"
Tcount

echo
echo "++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++"
echo "---- End of fim_post_processing"
echo "---- Ended: `date -u`"
l_echo "++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++"
l_echo "---- End of fim_post_processing"
l_echo "---- Ended: `date -u`"
Calc_Duration $post_proc_start_time
echo
9 changes: 7 additions & 2 deletions src/add_crosswalk.py
@@ -9,6 +9,7 @@
from numpy import unique
from rasterstats import zonal_stats

from utils.fim_enums import FIM_exit_codes
from utils.shared_functions import getDriver
from utils.shared_variables import FIM_ID

@@ -87,14 +88,18 @@ def add_crosswalk(
crosswalk = crosswalk.filter(items=['HydroID', 'feature_id', 'distance'])
crosswalk = crosswalk.merge(input_nwmflows[['order_']], on='feature_id')

if len(crosswalk) < 1:
if crosswalk.empty:
print("No relevant streams within HUC boundaries.")
sys.exit(0)
sys.exit(FIM_exit_codes.NO_VALID_CROSSWALKS.value)

if input_catchments.HydroID.dtype != 'int':
input_catchments.HydroID = input_catchments.HydroID.astype(int)
output_catchments = input_catchments.merge(crosswalk, on='HydroID')

if output_catchments.empty:
print("No valid catchments remain.")
sys.exit(FIM_exit_codes.NO_VALID_CROSSWALKS.value)

if input_flows.HydroID.dtype != 'int':
input_flows.HydroID = input_flows.HydroID.astype(int)
output_flows = input_flows.merge(crosswalk, on='HydroID')
