2021 marks a growing trend of putting analytics directly into space. Remote sensing researchers once downloaded “raw” satellite images of the Earth from centralized websites to their own computers for further analysis (and, back in the 1950s, even caught physical film canisters dropped from a satellite ironically named Corona1); now their work is getting much easier. Initiatives like those of the European Space Agency2 use artificial intelligence on board satellites to process images into ready-to-use products before sending them to the ground. In the context of my research, disaster monitoring and assessment, that could mean no more hours spent working with raw images and building my own algorithms to derive the extent of disaster damage from space. Instead, the focus shifts toward using downloaded image products, such as flood masks, in more sophisticated computer models for various applications and integrating them with other datasets.
My current research concerns urban disaster damage assessment that goes beyond simple from-above physical damage identification with satellite images. I am interested in linking the pixels to people3 and understanding the impact of past and ongoing disasters on society. This large-scale analysis is, again, only possible thanks to the computational progress described above, the availability of ready-to-use data, and the emergence of big data for better understanding our changing environment. For disaster assessment, that means gathering and integrating large volumes of near-real-time information from individuals together with field surveys, remotely sensed images of the area, and longer-term census data. For example, Fig. 1 (below) shows this “layer cake” of disparate data as implemented by a team of Ohio State researchers for hurricane flood monitoring. The computer platform ingests many sensor feeds, including data from individuals on Twitter. Another interesting example is the FloodFactor platform5, which, rather than assessing the extent and damage of ongoing disasters, offers predictions of future damage risk for a given residential address. The methodology relies on deriving maps of flood probability and placing them in the context of each building type and historical losses in the area. All in all, while merging datasets for damage assessment is not new, several methodological challenges and key datasets remain to be explored. The one I am focusing on right now is incorporating economic and social geospatial data as “proxies” for physical damage measures in situations of missing data.

Most importantly, I would like to position my work within the context of smart cities. Disaster damage assessment is an integral part of future smart cities that “use connected technology and data to improve the efficiency of city service delivery, enhance quality of life for all, and increase equity and prosperity for residents and businesses”6.
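The kind of overlay described above, combining a downloaded flood-mask product with census population counts to estimate social impact, can be sketched in miniature. The sketch below is purely illustrative: the flood mask is a toy binary grid, the census blocks and their populations are invented, and a real workflow would use geospatial libraries on actual data products rather than hand-listed cells.

```python
# Toy flood mask: 1 = flooded cell, 0 = dry cell (hypothetical product).
flood_mask = [
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]

# Hypothetical census blocks: the grid cells each block covers and its
# resident count. Real census geometries would be polygons, not cell lists.
census_blocks = {
    "block_A": {"cells": [(0, 0), (0, 1), (1, 0), (1, 1)], "population": 1200},
    "block_B": {"cells": [(2, 2), (2, 3), (3, 2), (3, 3)], "population": 800},
}

def affected_population(mask, blocks):
    """Estimate exposed residents per block, assuming population is
    spread uniformly across each block's cells."""
    estimates = {}
    for name, block in blocks.items():
        flooded = sum(mask[r][c] for r, c in block["cells"])
        fraction = flooded / len(block["cells"])
        estimates[name] = round(block["population"] * fraction)
    return estimates

print(affected_population(flood_mask, census_blocks))
# block_A has 3 of 4 cells flooded; block_B has all 4 cells flooded.
```

Even this toy version shows where the “proxy” question enters: when the population layer is missing or outdated, some other socio-economic surface has to stand in for it before the overlay can say anything about people rather than pixels.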
This matters all the more when one recognizes the worsening realities of climate change and the social vulnerabilities in cities across the globe that turn hazards into disasters. The risk and damage assessment of the future that we need is one that relies on interconnected sensors (satellites, social media, field data, etc.), merges data streams, and is put into practice by local municipalities for better decision-making. Accordingly, I seek to contextualize my future findings from local case studies within a broader narrative of smart city development and disaster risk reduction initiatives.
PhD Student, Department of Geography