
If a tree falls in a forest and no one is around to hear it, does it make a sound? Maybe not, but we can certainly see it! A recent publication by Christian Thiel, from the German Aerospace Center (DLR), and colleagues, entitled UAS Imagery-Based Mapping of Coarse Wood Debris in a Natural Deciduous Forest in Central Germany (Hainich National Park), examines how an object-based approach using the Trimble eCognition software can automate the mapping of dead wood in forest environments.

Dead wood plays an important role in forest ecosystems. According to the authors, “it provides micro-habitats for several species and nutrients through the contribution of organic matter. Moreover, it is beneficial to forest regeneration, ecosystem stabilization, soil protection, and carbon sequestration”. But it can also negatively impact commercial use of the forest, as it can be the source of insect and disease outbreaks. It is therefore important to map dead wood at scale.

Thiel et al. explain that this can be done at an area or object level; “the latter aims at the direct mapping of individual downed stems, snags and other dead wood debris”, delivering details on the spatial distribution of the wood debris. Wall-to-wall mapping of such objects is difficult. Getting boots on the ground is time-consuming, and positional accuracy is negatively affected by dense forest canopies, making “the subdecimeter positioning accuracy needed to survey” such objects unrealistic. The use of remote sensing via satellite and aerial sensors is also limited, as dead wood objects are often too small to be detected at those resolutions.

In this study, the authors chose to map coarse wood debris (CWD) with UAS imagery, taking advantage of the increased spatial resolution and flexibility that UAS platforms provide. For this purpose, a 75 square kilometer study area in Germany’s Hainich National Park (HNP) was selected. The study area is part of a UNESCO World Heritage Site with unmanaged, primeval beech forests, but also includes a variety of other tree species: ash, alder, sycamore maple, hornbeam, wych elm, common and sessile oak, and chequers.

The UAS data acquisition was conducted during leaf-off conditions (early spring 2019) with overcast skies that rendered “diffuse and consistent illumination conditions”, which helped avoid shadows and illumination differences. To improve the chances of capturing data from the forest floor, considerable image overlap was prescribed. The resulting image products yielded a 4.2 cm resolution, and a dense point cloud with 650 million points was derived. In addition to the original image mosaic, a canopy-free image mosaic was also calculated.

To detect the dead wood objects, the authors developed a novel object-based approach designed to detect linear features “exclusively based on spectral information”. They use the canopy-free orthomosaic as input in eCognition Developer, and a linear extraction algorithm is applied to all three bands (blue, green and red). A variety of features were used to describe the linear structures: line length, line width, border width and line direction. Because the dead wood was not oriented in any standard compass direction, the linear extraction algorithm was embedded in a loop to cover angles from 0 to 179 degrees. The linear feature information was stored as an additional raster layer in eCognition, allowing its inclusion as a segmentation parameter. A multi-threshold segmentation was applied to generate and classify initial image objects as line or non-line.
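The paper does not publish the rule set itself, but the idea of looping a directional line detector over all orientations and keeping the strongest response can be sketched as follows. This is a minimal illustration, not eCognition's actual algorithm: the kernel size, angle step and threshold are assumptions, and the oriented-kernel correlation stands in for the proprietary line extraction.

```python
import numpy as np
from scipy import ndimage

def line_kernel(angle_deg, length=9):
    """One-pixel-wide line kernel at the given angle (array coordinates)."""
    k = np.zeros((length, length))
    c = length // 2
    t = np.radians(angle_deg)
    for d in range(-c, c + 1):
        k[int(round(c - d * np.sin(t))), int(round(c + d * np.cos(t)))] = 1.0
    return k / k.sum()

def line_response(band, angles=range(0, 180, 15)):
    """Maximum correlation over all orientations, mimicking the 0-179 degree
    loop described in the paper (angle step and kernel size are guesses)."""
    responses = [ndimage.correlate(band, line_kernel(a)) for a in angles]
    return np.max(responses, axis=0)

# Synthetic "canopy-free" band: dark forest floor with one bright downed stem.
band = np.zeros((40, 40))
idx = np.arange(5, 35)
band[idx, idx] = 1.0           # a stem running diagonally through the scene

response = line_response(band)
line_mask = response > 0.5     # threshold into line / non-line, per pixel
```

The per-pixel maximum response plays the role of the extra raster layer mentioned above: a compact line strength value that a subsequent segmentation can threshold.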

In a subsequent classification step, objects classified as linear were adapted “to meet certain object criteria and to eliminate misclassifications”. A particular challenge when working with linear features and raster formats is maintaining connectivity between objects belonging to the same cluster – due to the nature of pixels, vector connectivity is sometimes broken, especially with narrow objects like a fallen tree. In order to connect any broken objects, the classified segments were grown into one another based on values in the line layer.
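A simple raster analogue of that growing step is a morphological closing: dilate the classified line pixels so that nearby fragments merge, then erode back to the original width. This toy sketch (my own illustration, not the authors' rule set) shows two fragments of one stem, split by a single-pixel gap, being reconnected into one object.

```python
import numpy as np
from scipy import ndimage

# Two fragments of the same fallen stem, split by a one-pixel gap --
# a toy stand-in for the broken connectivity described above.
mask = np.zeros((10, 20), dtype=bool)
mask[5, 2:9] = True     # first fragment
mask[5, 10:18] = True   # second fragment (gap at column 9)

# Before growing: the fragments label as two separate objects.
_, n_before = ndimage.label(mask)

# Grow the segments into one another; binary closing bridges the gap
# while keeping the object one pixel wide.
grown = ndimage.binary_closing(mask, structure=np.ones((3, 3)))
_, n_after = ndimage.label(grown)
```

After closing, the label count drops from two to one, restoring the connectivity of the fallen tree as a single object.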

Thiel et al. state that it is obvious that the “OBIA approach is capable of identifying linear features” and that visually there is “no risk of confusion with non-elongated objects such as tree stumps and/or patches covered with short green vegetation”.

The accuracy assessment considered both the length and the number of detected objects. To assess length accuracy, the authors measured the correctly detected (true positive, tp length), missed (false negative, fn length) and incorrectly detected (false positive, fp length) portions of CWD objects. The figure below demonstrates how these accuracies were measured.

Of the 6473 meters of dead wood manually mapped in the area, the automated approach successfully identified 4478 m. The authors report that 1995 m were missed by the classification and 887 m were over-detected. This boils down to a precision of 83.5% and a recall of 69.2%. In terms of count, 180 of 225 CWD objects were detected.
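The reported percentages follow directly from the length totals above, using the standard precision and recall definitions:

```python
# Reported CWD lengths in metres, from the study:
tp_length = 4478.0  # correctly detected
fn_length = 1995.0  # missed by the classification
fp_length = 887.0   # over-detected

# Precision: what fraction of the detected length is real dead wood.
precision = tp_length / (tp_length + fp_length)  # ~0.835
# Recall: what fraction of the real dead wood length was detected.
recall = tp_length / (tp_length + fn_length)     # ~0.692
```

Note that tp length plus fn length reproduces the 6473 m of manually mapped dead wood, confirming the figures are internally consistent.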

Errors of omission are a problem for this method, as it requires gaps in the canopy to provide a line of sight to downed trees. Even with leaf-off conditions, the authors note, it is not always possible to get a clear picture of the forest floor.

Nevertheless, this method has its advantages, particularly the flexibility and scalability of the images produced – increased resolution can be achieved by flying lower and the cost is relatively low. In addition, the use of purely spectral information can increase the potential of detecting dead wood that has reached higher levels of decay – in this case the decaying log is still quite visible in the image, but does not result in enough of an elevation profile to get picked up in a purely elevation-based analysis.

It was exciting to read about this work as it demonstrates the continued power that image products can provide – we can extract valuable, accurate information from them! I would like to thank the Tama Group for forwarding this work on to me as well as the authors for presenting it during last year’s eCognition User Conference for the German speaking region. 
