The daily polygons provide good information on the approximate area of increase each day of a fire event, but they don't always produce "progression maps" that plausibly depict a fire's spreading behavior. I think part of this arises from uncertainty in the per-pixel burn date that the algorithm uses for clustering.
I wonder if we might achieve smoother fire progression (perhaps allowing us to more readily calculate something like a linear rate of spread) by incorporating the uncertainty layer that comes with the MCD64A1 product.
My initial thought is to essentially bootstrap the event delineation: pick a random offset for each pixel's reported burn date, run the algorithm to produce daily perimeters, then do a full re-run with a fresh set of random offsets. Each offset would be constrained by the uncertainty layer. So if a pixel's reported burn date is 2021-07-09 and the uncertainty layer for that pixel has a value of 5, then each run of the algorithm would use a random burn date somewhere between 5 days before and 5 days after the reported date.
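To make that concrete, here's a minimal sketch of the perturbation step in Python, assuming the Burn Date and Burn Date Uncertainty layers have already been read into NumPy arrays. The function name and the "values > 0 are burned" convention are just illustrative, and I'm assuming a uniform draw over the ± window; any distribution consistent with the uncertainty layer would do:

```python
import numpy as np

def perturb_burn_dates(burn_date, uncertainty, rng):
    """Draw one bootstrap realization of the per-pixel burn dates.

    burn_date:   2D day-of-year array (values > 0 are burned; 0 and
                 negatives are unburned/unmapped and are left untouched)
    uncertainty: 2D array of per-pixel uncertainty in days
    """
    u = uncertainty.astype(np.int64)   # avoid unsigned wrap-around on -u
    offsets = rng.integers(-u, u + 1)  # uniform draw from [-u, +u], per pixel
    return np.where(burn_date > 0, burn_date + offsets, burn_date)

# A pixel reported on DOY 190 with uncertainty 5 can land anywhere on
# DOY 185-195 in a given realization.
rng = np.random.default_rng(42)
burn_date = np.array([[190, 0], [191, 193]], dtype=np.int16)
uncertainty = np.array([[5, 0], [2, 1]], dtype=np.uint8)
realization = perturb_burn_dates(burn_date, uncertainty, rng)
```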
With enough iterations and enough resulting daily polygons, you could overlay them all to see which polygons are most likely for each day.
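Something like this for the overlay step, assuming each run yields a mapping from day to a shapely geometry and that we rasterize back onto the pixel grid. The names, the 50% default threshold, and the majority-vote rule are all placeholder choices, not anything from the actual codebase:

```python
import numpy as np
import rasterio.features
from rasterio.transform import from_origin
from shapely.geometry import box

def consensus_mask(runs, day, out_shape, transform, threshold=0.5):
    """Pixels that fall inside at least `threshold` of the runs' perimeters
    for a given day.

    runs: list of dicts (one per bootstrap run) mapping day -> shapely geometry
    """
    counts = np.zeros(out_shape, dtype=np.int32)
    n = 0
    for run in runs:
        geom = run.get(day)
        if geom is None:
            continue
        n += 1
        counts += rasterio.features.rasterize(
            [(geom, 1)], out_shape=out_shape, transform=transform, fill=0
        )
    if n == 0:
        return np.zeros(out_shape, dtype=bool)
    return counts >= np.ceil(threshold * n)  # simple majority vote per pixel

# Toy example on a dummy 1-unit grid (not the real MODIS sinusoidal grid):
runs = [{200: box(0, 0, 2, 2)}, {200: box(1, 0, 3, 2)}, {200: box(0, 0, 2, 2)}]
transform = from_origin(0, 4, 1, 1)
mask = consensus_mask(runs, 200, out_shape=(4, 4), transform=transform)
```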
Of course, this would be pretty computationally intensive, so it would make sense to run it only over a small area (e.g., just California, or just the Rim Fire) or over a smaller time window.
Just making note of this here as a potential approach to try in case anyone has interest+capacity for it before I do.
Totally agree on incorporating the uncertainty layer to smooth out the daily progression! I've been thinking about that for years, to be honest. My thought was basically to use the first date and last date layers as upper and lower bounds, and create a median estimation surface that way.
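For what it's worth, here's one naive reading of that in code: take the per-pixel midpoint of the First Day / Last Day bounds and smooth it with a local median filter. The midpoint, the filter size, and the handling of unburned pixels are all just guesses at what a "median estimation surface" might look like:

```python
import numpy as np
from scipy.ndimage import median_filter

def median_surface(first_day, last_day, burn_date, size=3):
    """Per-pixel burn-date estimate bounded by the First Day / Last Day
    layers, smoothed into a surface with a local median filter.

    All inputs: 2D day-of-year arrays; values <= 0 mean unburned/no data.
    """
    bounded = (first_day > 0) & (last_day > 0)
    est = np.where(bounded, (first_day + last_day) // 2, burn_date)
    est = np.where(burn_date > 0, est, 0)  # keep unburned pixels at 0
    # NB: near fire edges the filter mixes in unburned zeros, so a real
    # version would want to mask those out before smoothing.
    return median_filter(est, size=size)
```

The midpoint could just as easily be a random draw between the bounds, which would plug straight into the bootstrap idea above.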