This week, the Kakhovka (Каховська) dam in Ukraine was destroyed. According to Wikipedia, the reservoir held a total of 18.2 billion m³ of water; if this volume were shaped into a cube, it would measure 2.63 km on each side. Since 2023-06-06, water from the Kakhovka reservoir has been flooding downstream areas. In what follows I will show how freely available satellite (remote sensing) data can be used to better understand, assess, and monitor the flooding occurring downstream from Kakhovka, and flooding in general.
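The cube figure can be checked with a quick back-of-the-envelope computation:

```python
# Side length of a cube holding the Kakhovka reservoir volume (18.2 billion m³).
volume_m3 = 18.2e9
side_m = volume_m3 ** (1 / 3)
print(f"{side_m / 1000:.2f} km")  # ~2.63 km
```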
(Click here if you would like a brief introduction to remote sensing and some of the terms used in this text.)
I used data from the Sentinel-1 and Sentinel-2 missions of the Sentinel satellite program for this analysis. The data can be obtained free of charge from the Sentinel EO Browser.
Visible spectrum: Sentinel-2
The dam
The following pictures show the Kakhovka dam in the center, first on 2023-06-03 at 08:57 UTC (intact) and second on 2023-06-08 at 08:57 UTC (destroyed). While the first image was taken under good atmospheric conditions with only a few clouds, in the second image the atmospheric water content is rather high and the image contrast therefore quite weak. I used a custom script called Sentinel-2 L2A True Color Optimized (CC-SA) to optimize the color depiction (click to enlarge):


Zooming out
Using the same data and the same custom script for visualisation, we can see the dam (near the eastern edge of the image) as well as the area downstream from Kakhovka. At this zoom level, too, the atmospheric conditions differ markedly between the „before“ and the „after“ image (click to enlarge):


Water detection
From these visualisations it is not very clear where the water is, especially in the second image. To depict the inundated areas clearly, we can use custom band combinations and indices. First, I tried the False Color (Urban) band combination, which maps bands B12, B11 and B04 of Sentinel-2 to the red, green and blue channels (R-G-B), respectively. The first two are short-wave infrared (SWIR) bands centered around 2190 nm and 1610 nm wavelength, respectively; B04 lies in the red part of the visible electromagnetic spectrum at around 665 nm wavelength. As water reflects (almost) no SWIR radiation and very little in the red spectrum, it usually shows up as dark blue or even black in this configuration of Sentinel-2 data (click to enlarge):
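The composite described above can be sketched in a few lines of NumPy, assuming the three bands have already been loaded as reflectance arrays scaled to 0–1 (how you load them, e.g. with rasterio, is up to you; the gain value is an illustrative assumption):

```python
import numpy as np

def false_color_urban(b12, b11, b04, gain=2.5):
    """Map Sentinel-2 bands B12, B11, B04 to the R, G, B channels.

    Inputs are assumed to be surface-reflectance arrays in 0..1.
    Water reflects almost no SWIR, so it comes out dark blue/black.
    """
    rgb = np.stack([b12, b11, b04], axis=-1)  # last axis = color channel
    return np.clip(gain * rgb, 0.0, 1.0)      # brighten and clamp to 0..1

# Tiny synthetic example: one bright "urban" pixel, one dark "water" pixel.
b12 = np.array([[0.30, 0.01]])
b11 = np.array([[0.25, 0.01]])
b04 = np.array([[0.20, 0.03]])
img = false_color_urban(b12, b11, b04)
```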


The second approach to showing water more clearly is using a suitable index. The following pictures show the Water in Wetlands (WIW) index (CC-SA). This index applies thresholds to the measured reflectance in the near-infrared (NIR) and shortwave infrared (SWIR) parts of the electromagnetic spectrum and paints the matching areas blue (click to enlarge):
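The thresholding logic behind such an index is simple. Below is a minimal sketch of a WIW-style mask for Sentinel-2; the threshold values are the ones commonly cited for WIW (NIR band B8A ≤ 0.1804, SWIR band B12 ≤ 0.1131), but treat them as assumptions to be tuned for your own scene:

```python
import numpy as np

def wiw_mask(b8a, b12, nir_max=0.1804, swir_max=0.1131):
    """Water in Wetlands (WIW) style mask for Sentinel-2.

    A pixel is flagged as water/wet when both its NIR (B8A) and SWIR (B12)
    reflectance fall below a threshold. The default thresholds are the
    commonly cited WIW values and may need per-scene tuning.
    """
    return (b8a <= nir_max) & (b12 <= swir_max)

# Synthetic example: first pixel dark in NIR and SWIR (water), second not.
b8a = np.array([[0.05, 0.30]])
b12 = np.array([[0.02, 0.25]])
mask = wiw_mask(b8a, b12)
```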


The following pair of images shows the large inundated area in the lower part of the zoomed-out scenes above (click to enlarge):


Cloud cover
There are quite a few clouds obstructing the view, especially in the „after“ images of the central area of the zoomed-out scenes above. I checked whether there is even more recent Sentinel-2 data than the „after“ imagery shown above (from 2023-06-08) that might enable a clearer assessment of the flooding there. There is indeed one more recent dataset: a Sentinel-2 satellite revisited the eastern part of the area of interest on 2023-06-10. However, the cloud cover is even more pervasive and dense; virtually nothing can be gleaned from this data:

Synthetic aperture radar: Sentinel-1
Since the visible range of the spectrum suffers from cloud cover, it may be worthwhile to (also) rely on data from synthetic aperture radar (SAR) sensors. Their mode of operation, specifically the wavelengths of the electromagnetic spectrum at which these sensors operate, enables them to penetrate clouds and „see“ through them.
Below I visualised two Sentinel-1 SAR scenes, acquired on 2023-06-02 and 2023-06-09, respectively (click to enlarge):


As you can see, there are indeed no clouds obstructing the analysis of the flood extent. The scene shows water in blue and black. Qualitatively, it already becomes clear from these simple images that the regions near the center of the area of interest (which were covered in clouds in the visible-spectrum images above) have also been heavily flooded. To use these images in a full-fledged analysis, one would properly despeckle them (i.e., remove SAR-specific noise), segment the water areas, and conduct geospatial analyses from there.
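Such a despeckle-and-segment pipeline can be sketched in pure NumPy. Here a median filter stands in for the despeckling step (real workflows often use Lee or Refined Lee filters), and the -18 dB water threshold is an assumed illustration value that would have to be calibrated per scene and polarisation:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def flood_mask_from_sar(backscatter_db, water_threshold_db=-18.0, size=5):
    """Very simplified SAR water segmentation.

    1. Despeckle with a sliding median filter.
    2. Threshold the backscatter: smooth open water reflects the radar
       signal away from the sensor and therefore appears dark (low dB).
    Both the filter and the -18 dB threshold are illustrative choices.
    """
    pad = size // 2
    padded = np.pad(backscatter_db, pad, mode="edge")
    windows = sliding_window_view(padded, (size, size))
    despeckled = np.median(windows, axis=(-2, -1))
    return despeckled < water_threshold_db

# Synthetic scene: dark (water-like) left half, brighter land-like right half.
scene = np.full((20, 20), -10.0)
scene[:, :10] = -22.0
mask = flood_mask_from_sar(scene)
```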
Thankfully, the UNOSAT Emergency Mapping Service has been activated. Various remote sensing-based information products are available to those in charge of the ongoing flood response.
Remote sensing
„Remote sensing“ denotes the acquisition of information about an object or phenomenon without making physical contact with it. The term is often applied to acquiring information about the Earth's surface; in this context, remote sensing can be understood as a sub-discipline of geography. Using remote sensing technology and techniques, e.g. data from the Sentinel satellite program obtained from the Sentinel EO Browser, any reasonably large-scale phenomenon can be monitored from space.
There is a plethora of commercial and free-to-use remote sensing platforms (i.e., satellites) and every satellite carries at least one, and often several, sensors. The choice of a suitable platform and sensor depends strongly on the application domain. Sensors have varying combinations of imaging or data capturing capabilities. They sense in different regions of the electro-magnetic spectrum.
Sensors may be passive (capturing radiation that is naturally emitted or reflected by the earth or atmosphere) or active (emitting a signal themselves and measuring characteristics of its reflection from the Earth's surface). Sensors in the visual or thermal (infrared) spectrum are passive; active sensors use e.g. radar or LiDAR technology, and radar in particular can typically „see“ through objects such as clouds or even vegetation.
Different sensors also have varying spatial resolution (pixel size) and swath widths (the width of the imaged strip). Combined with platform characteristics, these result in varying revisit times, or temporal resolution (roughly: how often a given sensor captures data of a specific location on Earth). Revisit times may be especially important for time-sensitive applications.
Finally, with remotely sensed data we can use specific combinations of input data, e.g. combinations of spectral bands or polarisations (for radar sensors), possibly with post-processing, and specific visualisation types to better understand the situation on the ground.
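As a concrete example of such a band combination, the classic Normalised Difference Water Index (NDWI) contrasts the green and NIR bands (B03 and B08 on Sentinel-2). A minimal sketch, assuming the bands are already loaded as reflectance arrays:

```python
import numpy as np

def ndwi(green, nir, eps=1e-9):
    """Normalised Difference Water Index: (green - NIR) / (green + NIR).

    Water absorbs NIR and reflects comparatively more green light, so
    water pixels tend toward NDWI > 0 while vegetation and soil are < 0.
    On Sentinel-2, green is band B03 and NIR is band B08; eps avoids
    division by zero on empty pixels.
    """
    green = np.asarray(green, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (green - nir) / (green + nir + eps)

# Synthetic example: a water-like pixel and a vegetation-like pixel.
vals = ndwi([0.10, 0.05], [0.03, 0.30])
```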