When disaster hits, a quick, coordinated response is needed, and that requires data to assess the nature of the damage, gauge the scale of response required, and plan safe evacuations.
From the ground, this data collection can take days or weeks, but a team of UConn researchers has found a way to drastically cut the lag time for these assessments using remote sensing data and machine learning, bringing disturbance assessment closer to near real-time (NRT) monitoring. Their findings are published in Remote Sensing of Environment.
Su Ye, a post-doctoral researcher in UConn's Global Environmental Remote Sensing Laboratory (GERS) and the paper's first author, says he was inspired by methods used by biomedical researchers to study the earliest symptoms of infections.
"It's a very intuitive idea," says Ye. "For example, with COVID, the early symptoms can be very subtle, and you cannot tell it's COVID until several weeks later when the symptoms become severe and then they confirm infection."
Ye explains that this method is called retrospective chart review (RCR), and it is especially helpful in learning more about infections that have a long latency period between initial exposure and the development of obvious infection.
"This research uses the same ideas. When we're doing land disturbance monitoring of things like disasters or diseases in forests, for example, at the very beginning of our remote sensing observations, we may have very few or only one remote sensing image, so catching the symptoms early could be very beneficial," says Ye.
Several days or weeks after a disturbance, researchers can confirm a change, and much like a patient diagnosed with COVID, Ye reasoned they could trace back and do a retrospective analysis to see if earlier signals could be found in the data and if those data could be used to construct a model for near real-time monitoring.
Ye explains that they have a wealth of data to work with—for example, Landsat data stretches back 50 years—so the team could perform a full retrospective analysis to help create an algorithm that detects changes much faster than current methods, which rely on a more manual approach.
"There is so much data and many good products but we have never taken full advantage of them to retrospectively analyze the symptoms for future analysis. We have never connected the past and the future, but this work is bringing these two together."
Zhe Zhu, Associate Professor in the Department of Natural Resources and the Environment and Director of the GERS Laboratory, says they combined the multitudes of available data with machine learning and physical models to pioneer a technique that pushes the boundary of near real-time detection to, at most, four days, as opposed to a month or more.
Until now, early detection was more challenging, because it is harder to differentiate change in the early post-disturbance stages, says Zhu.
"These data contain a lot of noise caused by things like clouds, cloud shadows, smoke, aerosols, even the changing of the seasons, and accounting for these variations makes the interpretation of real change on the Earth's surface difficult, especially when the goal is to detect those disturbances as soon as possible."