Underwater Multispectral Reef Classification





July 13, 2023



While HySpeed Computing's focus is typically on remote sensing applications that view the Earth's surface from high above, sometimes our expertise translates to a much closer perspective. We recently partnered with scientists at the Universidad Nacional Autónoma de México (UNAM) to help develop innovative techniques for underwater multispectral imaging of coral reefs.





Embracing the notion that “you can’t manage what you don’t measure”, scientists currently use a wide variety of techniques to document the health and distribution of the world’s coral reefs. On the imaging side, these range from underwater photography and video surveys to overhead drones, airplanes, and satellites.

No matter the platform, a key ingredient for efficient large-scale analysis is the development and use of automated, or at least semi-automated, image processing applications. By minimizing the need for hands-on manual analysis, which is both time-consuming and prone to bias, automation allows images to be analyzed in significantly greater volume and with higher consistency. This in turn provides a foundation for routine widespread mapping and monitoring of coral reef ecosystems.

With that thought in mind, Dr. Garza-Pérez and his colleagues at UNAM have been working on new techniques for automating the classification of different reef components in underwater imagery. The objective is to develop image-based methods for rapidly quantifying the amount of coral, algae, sponges, and sand within a given area. Rather than relying on standard RGB photos like those captured by our smartphones, however, the UNAM scientists are taking advantage of the greater number of spectral bands in multispectral imagery to improve the classification process.

The sensor being used in UNAM’s research is a MicaSense RedEdge-M, which measures five narrow spectral bands: blue, green, red, red edge, and near-infrared. Though more commonly flown on drones over agricultural fields, here the sensor is deployed within a custom underwater housing and paired with a set of LED underwater video lights to enable image acquisition in the ocean environment.

Analyzing the images acquired by this sensor requires a series of processing steps, from image-by-image calibration and alignment to multi-image orthomosaic construction and classification. As with satellite and airborne data, photogrammetric and remote sensing techniques provide a foundation for performing each of these steps.
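These steps can be pictured as a simple processing chain. The Python sketch below is a toy illustration only, not the actual workflow: the panel-based calibration, the band-stacking stand-in for alignment, the averaging stand-in for mosaicking, and the single-band threshold classifier are all illustrative assumptions.

```python
import numpy as np

def calibrate(raw_dn: np.ndarray, panel_dn: float, panel_reflectance: float) -> np.ndarray:
    """Toy radiometric calibration: scale digital numbers to reflectance
    using an image of a reference panel of known reflectance."""
    return raw_dn * (panel_reflectance / panel_dn)

def align_bands(bands: list[np.ndarray]) -> np.ndarray:
    """Placeholder alignment: bands are assumed already co-registered
    and are simply stacked into a single multi-band cube."""
    return np.stack(bands, axis=-1)

def mosaic(cubes: list[np.ndarray]) -> np.ndarray:
    """Placeholder mosaicking: average overlapping cubes of one reef patch."""
    return np.mean(cubes, axis=0)

def classify(cube: np.ndarray, thresh: float = 0.5) -> np.ndarray:
    """Toy classifier: threshold the first band into two cover classes."""
    return (cube[..., 0] > thresh).astype(np.uint8)

# Run the chain on synthetic 5-band captures (the RedEdge-M records 5 bands).
rng = np.random.default_rng(1)
captures = [[rng.integers(0, 4096, (64, 64)).astype(float) for _ in range(5)]
            for _ in range(3)]
cubes = [align_bands([calibrate(b, panel_dn=3000.0, panel_reflectance=0.5)
                      for b in bands]) for bands in captures]
class_map = classify(mosaic(cubes))
print(class_map.shape)  # (64, 64)
```

The point of the sketch is the ordering: each capture is calibrated and aligned individually before any multi-image mosaicking, and classification operates only on the finished mosaic.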

As an example, here we highlight the need for spatially aligning the individual spectral bands within each image. Rather than using a single central lens to acquire images, the MicaSense RedEdge-M employs a collection of five lenses spaced equally about the face of the instrument, where each lens acquires an independent image for its spectral band. When working with this sensor at close range – the reef images are acquired just 0.5 m from the seafloor – the spacing between the lenses results in offsets between the individual images for each spectral band. And because the seafloor is not flat, but rather a complex three-dimensional surface, these offsets are not uniform between or within the different bands. As a result, the recorded bands for each image are not aligned with one another.
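The scale of these offsets can be estimated with the standard stereo parallax relation, disparity ≈ focal length × lens baseline / distance. The numbers in the sketch below (lens spacing, focal length, pixel pitch) are assumed values for illustration, not the RedEdge-M's actual specifications:

```python
# Illustrative estimate of inter-band pixel offsets caused by lens spacing.
# All numeric values are assumptions, not RedEdge-M specifications.

def disparity_px(baseline_m: float, distance_m: float,
                 focal_mm: float, pixel_um: float) -> float:
    """Pixel offset between two lenses imaging a point at `distance_m`."""
    focal_px = (focal_mm * 1000.0) / pixel_um   # focal length in pixels
    return focal_px * baseline_m / distance_m

# Assumed values: 3 cm lens spacing, 5.4 mm focal length, 3.75 µm pixels.
for d in (0.5, 2.0, 100.0):
    print(f"{d:6.1f} m -> {disparity_px(0.03, d, 5.4, 3.75):7.1f} px offset")
```

Under these assumptions the offset is on the order of tens of pixels at the 0.5 m working distance, but falls below one pixel at a typical drone altitude of 100 m, which is why band alignment is far more demanding underwater than in the sensor's usual agricultural role.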


To help address this issue, HySpeed Computing developed an automated workflow within ENVI/IDL – an industry-leading remote sensing software package – to evaluate different image alignment algorithms. The workflow included the ability to automatically generate and filter tie points between the different bands, align the bands using the resulting tie points, subset the resulting multi-band image to just the overlapping area, and update the image metadata. By testing different methods for generating the tie points and image alignments, we were able to identify an optimal combination of algorithms for automatically aligning the underwater MicaSense images.
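The ENVI/IDL workflow itself is not shown here. As a rough, simplified illustration of the alignment step, the NumPy sketch below recovers a purely translational offset between two bands via phase correlation; real inter-band misalignment over a 3-D seafloor is not a single global shift, which is why the production workflow relies on many tie points and more flexible warping.

```python
import numpy as np

def estimate_shift(ref: np.ndarray, moving: np.ndarray) -> tuple:
    """Return the (row, col) offset of `moving` relative to `ref`,
    estimated by phase correlation (normalized cross-power spectrum)."""
    cross = np.conj(np.fft.fft2(ref)) * np.fft.fft2(moving)
    cross /= np.abs(cross) + 1e-12            # keep phase information only
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Offsets beyond half the image size wrap around to negative values.
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))

# Demo: shift a synthetic "band" by (7, -12) pixels and recover the offset.
rng = np.random.default_rng(0)
band_ref = rng.random((128, 128))
band_moving = np.roll(band_ref, shift=(7, -12), axis=(0, 1))
dy, dx = estimate_shift(band_ref, band_moving)
aligned = np.roll(band_moving, shift=(-dy, -dx), axis=(0, 1))
print((dy, dx))  # (7, -12); rolling back by the estimate re-aligns the band
```

Phase correlation is one common way to generate translational tie-point estimates; feature-based matchers are a typical alternative when rotation or scale differences are also present.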

While this solution was developed specifically for the UNAM coral reef application, a version of the same workflow was also used to align and analyze greenhouse imagery for monitoring plant health and development. Thus, as with many other image analysis techniques, this is a good example of one approach serving multiple applications.

In the UNAM project, once the bands were aligned within each image, multiple images could then be mosaicked together for a given reef area. With this mosaic completed, researchers could then focus on their main objective – developing classification techniques to delineate the different reef components in that area.

To learn more about the research being performed by Dr. Garza-Pérez and his colleagues, please check out their recent publication in Remote Sensing:

---

Garza-Pérez, J.R.; Barrón-Coronel, F. Coral Reef Benthos Classification Using Data from a Short-Range Multispectral Sensor. Remote Sens. 2022, 14, 5782. https://doi.org/10.3390/rs14225782



Images: Garza-Pérez/UNAM