Identifying Species and Monitoring Understorey from UAS-Derived Data: A Literature Review and Future Directions
Abstract
1. Introduction
1.1. Sensors
1.2. Platforms
1.3. Vegetation Classification
1.4. Objective
2. Materials and Methods
3. Results
4. Discussion
4.1. Spatial Resolution
4.2. Spectral Sensitivity
4.3. Spatial Extent
4.4. Temporal Frequency
4.5. Recommendations
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Breckenridge, R.P.; Dakins, M.; Bunting, S.; Harbour, J.L.; Lee, R.D. Using unmanned helicopters to assess vegetation cover in sagebrush steppe ecosystems. Rangel. Ecol. Manag. 2012, 65, 362–370. [Google Scholar] [CrossRef]
- Tehrany, M.S.; Kumar, L.; Drielsma, M.J. Review of native vegetation condition assessment concepts, methods and future trends. J. Nat. Conserv. 2017, 40, 12–23. [Google Scholar] [CrossRef]
- Morsdorf, F.; Mårell, A.; Koetz, B.; Cassagne, N.; Pimont, F.; Rigolot, E.; Allgöwer, B. Discrimination of vegetation strata in a multi-layered Mediterranean forest ecosystem using height and intensity information derived from airborne laser scanning. Remote Sens. Environ. 2010, 114, 1403–1415. [Google Scholar] [CrossRef] [Green Version]
- Wallace, L.; Lucieer, A.; Malenovský, Z.; Turner, D.; Vopěnka, P. Assessment of forest structure using two UAV techniques: A comparison of airborne laser scanning and Structure from Motion (SfM) point clouds. Forests 2016, 7, 62. [Google Scholar] [CrossRef]
- Zhang, Y.; Chen, H.Y.H.; Taylor, A.R. Aboveground biomass of understorey vegetation has a negligible or negative association with overstorey tree species diversity in natural forests. Glob. Ecol. Biogeogr. 2016, 25, 141–150. [Google Scholar] [CrossRef]
- Hamraz, H.; Contreras, M.A.; Zhang, J. Forest understory trees can be segmented accurately within sufficiently dense airborne laser scanning point clouds. Sci. Rep. 2017, 7, 6770. [Google Scholar] [CrossRef] [PubMed]
- Hamraz, H.; Contreras, M.A.; Zhang, J. Vertical stratification of forest canopy for segmentation of understory trees within small-footprint airborne LiDAR point clouds. ISPRS J. Photogramm. Remote Sens. 2017, 130, 385–392. [Google Scholar] [CrossRef] [Green Version]
- Xie, Y.; Sha, Z.; Yu, M. Remote sensing imagery in vegetation mapping: A review. J. Plant Ecol. 2008, 1, 9–23. [Google Scholar] [CrossRef]
- McClelland, M.P.; Hale, D.S.; van Aardt, J. A comparison of manned and unmanned aerial Lidar systems in the context of sustainable forest management. In Proceedings of the SPIE Commercial + Scientific Sensing and Imaging, Orlando, FL, USA, 15–19 April 2018; p. 9. [Google Scholar]
- Richards, J.A. Remote Sensing Digital Image Analysis: An Introduction, 5th ed.; Springer: Berlin/Heidelberg, Germany, 2013. [Google Scholar]
- Yamazaki, F.; Liu, W.; Takasaki, M. Characteristics of shadow and removal of its effects for remote sensing imagery. In Proceedings of the 2009 IEEE International Geoscience and Remote Sensing Symposium, Cape Town, South Africa, 12–17 July 2009; pp. IV-426–IV-429. [Google Scholar]
- Milas, A.S.; Arend, K.; Mayer, C.; Simonson, M.A.; Mackey, S. Different colours of shadows: Classification of UAV images. Int. J. Remote Sens. 2017, 38, 3084–3100. [Google Scholar] [CrossRef]
- Pádua, L.; Vanko, J.; Hruška, J.; Adão, T.; Sousa, J.J.; Peres, E.; Morais, R. UAS, sensors, and data processing in agroforestry: A review towards practical applications. Int. J. Remote Sens. 2017, 38, 2349–2391. [Google Scholar] [CrossRef]
- Chakraborty, A.; Sachdeva, K.; Joshi, P.K. Chapter 4—A reflection on image classifications for forest ecology management: Towards landscape mapping and monitoring. In Handbook of Neural Computation; Academic Press: Cambridge, MA, USA, 2017. [Google Scholar]
- He, K.S.; Bradley, B.A.; Cord, A.F.; Rocchini, D.; Tuanmu, M.-N.; Schmidtlein, S.; Turner, W.; Wegmann, M.; Pettorelli, N. Will remote sensing shape the next generation of species distribution models? Remote Sens. Ecol. Conserv. 2015, 1, 4–18. [Google Scholar] [CrossRef]
- Sanders, A. Mapping the distribution of understorey Rhododendron ponticum using low-tech multispectral UAV derived imagery. In The Roles of Remote Sensing in Nature Conservation: A Practical Guide and Case Studies; Díaz-Delgado, R., Lucas, R., Hurford, C., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 167–181. [Google Scholar]
- Eitel, J.U.H.; Höfle, B.; Vierling, L.A.; Abellán, A.; Asner, G.P.; Deems, J.S.; Glennie, C.L.; Joerg, P.C.; LeWinter, A.L.; Magney, T.S.; et al. Beyond 3-D: The new spectrum of LiDAR applications for earth and ecological sciences. Remote Sens. Environ. 2016, 186, 372–392. [Google Scholar] [CrossRef]
- Dandois, J.P.; Ellis, E.C. High spatial resolution three-dimensional mapping of vegetation spectral dynamics using computer vision. Remote Sens. Environ. 2013, 136, 259–276. [Google Scholar] [CrossRef] [Green Version]
- Zahawi, R.A.; Dandois, J.P.; Holl, K.D.; Nadwodny, D.; Reid, J.L.; Ellis, E.C. Using lightweight unmanned aerial vehicles to monitor tropical forest recovery. Biol. Conserv. 2015, 186, 287–295. [Google Scholar] [CrossRef] [Green Version]
- Aasen, H.; Burkart, A.; Bolten, A.; Bareth, G. Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance. ISPRS J. Photogramm. Remote Sens. 2015, 108, 245–259. [Google Scholar] [CrossRef]
- Dandois, J.; Baker, M.; Olano, M.; Parker, G.; Ellis, E. What is the point? evaluating the structure, color, and semantic traits of computer vision point clouds of vegetation. Remote Sens. 2017, 9, 355. [Google Scholar] [CrossRef]
- Cunliffe, A.M.; Brazier, R.E.; Anderson, K. Ultra-fine grain landscape-scale quantification of dryland vegetation structure with drone-acquired structure-from-motion photogrammetry. Remote Sens. Environ. 2016, 183, 129–143. [Google Scholar] [CrossRef] [Green Version]
- Vuruskan, A.; Yuksek, B.; Ozdemir, U.; Yukselen, A.; Inalhan, G. Dynamic modeling of a fixed-wing VTOL UAV. In Proceedings of the 2014 International Conference on Unmanned Aircraft Systems (ICUAS), Orlando, FL, USA, 27–30 May 2014; pp. 483–491. [Google Scholar]
- Yuksek, B.; Vuruskan, A.; Ozdemir, U.; Yukselen, M.A.; Inalhan, G. Transition flight modeling of a fixed-wing VTOL UAV. J. Intell. Robot. Syst. 2016, 84, 83–105. [Google Scholar] [CrossRef]
- Fletcher, A.T.; Erskine, P.D. Mapping of a rare plant species (Boronia deanei) using hyper-resolution remote sensing and concurrent ground observation. Ecol. Manag. Restor. 2012, 13, 195–198. [Google Scholar] [CrossRef]
- Anderson, K.; Gaston, K.J. Lightweight unmanned aerial vehicles will revolutionize spatial ecology. Front. Ecol. Environ. 2013, 11, 138–146. [Google Scholar] [CrossRef] [Green Version]
- Tansey, K.; Chambers, I.; Anstee, A.; Denniss, A.; Lamb, A. Object-oriented classification of very high resolution airborne imagery for the extraction of hedgerows and field margin cover in agricultural areas. Appl. Geogr. 2009, 29, 145–157. [Google Scholar] [CrossRef]
- Platt, R.V.; Rapoza, L. An evaluation of an object-oriented paradigm for land use/land cover classification. Prof. Geogr. 2008, 60, 87–100. [Google Scholar] [CrossRef]
- Tian, J.; Chen, D.M. Optimization in multi-scale segmentation of high-resolution satellite images for artificial feature recognition. Int. J. Remote Sens. 2007, 28, 4625–4644. [Google Scholar] [CrossRef]
- Tuia, D.; Volpi, M.; Copa, L.; Kanevski, M.; Munoz-Mari, J. A survey of active learning algorithms for supervised remote sensing image classification. IEEE J. Sel. Top. Signal Process. 2011, 5, 606–617. [Google Scholar] [CrossRef]
- Laliberte, A.S.; Rango, A.; Havstad, K.M.; Paris, J.F.; Beck, R.F.; McNeely, R.; Gonzalez, A.L. Object-oriented image analysis for mapping shrub encroachment from 1937 to 2003 in southern New Mexico. Remote Sens. Environ. 2004, 93, 198–210. [Google Scholar] [CrossRef]
- Lopez-Granados, F.; Jurado-Exposito, M.; Pena-Barragan, J.M.; Garcia-Torres, L. Using Remote Sensing for Identification of Late-Season Grass Weed Patches in Wheat. Weed Sci. 2006, 54, 346–353. [Google Scholar] [CrossRef]
- Teillet, P.M.; Staenz, K.; William, D.J. Effects of spectral, spatial, and radiometric characteristics on remote sensing vegetation indices of forested regions. Remote Sens. Environ. 1997, 61, 139–149. [Google Scholar] [CrossRef]
- Antonarakis, A.S.; Richards, K.S.; Brasington, J. Object-based land cover classification using airborne LiDAR. Remote Sens. Environ. 2008, 112, 2988–2998. [Google Scholar] [CrossRef]
- Pickering, C.; Byrne, J. The benefits of publishing systematic quantitative literature reviews for PhD candidates and other early-career researchers. High. Educ. Res. Dev. 2014, 33, 534–548. [Google Scholar] [CrossRef]
- Ahmed, O.S.; Shemrock, A.; Chabot, D.; Dillon, C.; Williams, G.; Wasson, R.; Franklin, S.E. Hierarchical land cover and vegetation classification using multispectral data acquired from an unmanned aerial vehicle. Int. J. Remote Sens. 2017, 38, 2037–2052. [Google Scholar] [CrossRef]
- Bedell, E.; Leslie, M.; Fankhauser, K.; Burnett, J.; Wing, M.G.; Thomas, E.A. Unmanned aerial vehicle-based structure from motion biomass inventory estimates. J. Appl. Remote Sens. 2017, 11, 026026. [Google Scholar] [CrossRef]
- Chisholm, R.A.; Cui, J.; Lum, S.K.Y.; Chen, B.M. UAV LiDAR for below-canopy forest surveys. J. Unmanned Veh. Syst. 2013, 1, 61–68. [Google Scholar] [CrossRef]
- Getzin, S.; Wiegand, K.; Schöning, I. Assessing biodiversity in forests using very high-resolution images and unmanned aerial vehicles. Methods Ecol. Evol. 2012, 3, 397–404. [Google Scholar] [CrossRef]
- Leduc, M.-B.; Knudby, A. Mapping wild leek through the forest canopy using a UAV. Remote Sens. 2018, 10, 70. [Google Scholar] [CrossRef]
- Lopatin, J.; Fassnacht, F.E.; Kattenborn, T.; Schmidtlein, S. Mapping plant species in mixed grassland communities using close range imaging spectroscopy. Remote Sens. Environ. 2017, 201, 12–23. [Google Scholar] [CrossRef]
- Mafanya, M.; Tsele, P.; Botai, J.; Manyama, P.; Swart, B.; Monate, T. Evaluating pixel and object based image classification techniques for mapping plant invasions from UAV derived aerial imagery: Harrisia pomanensis as a case study. ISPRS J. Photogramm. Remote Sens. 2017, 129, 1–11. [Google Scholar] [CrossRef]
- Mandlburger, G.; Wieser, M.; Hollaus, M.; Pfennigbauer, M.; Riegl, U. Multi-temporal UAV-borne LiDAR point clouds for vegetation analysis-a case study. In Proceedings of the EGU General Assembly Conference Abstracts, Vienna, Austria, 17–22 April 2016; p. 7036. [Google Scholar]
- Mitchell, J.J.; Glenn, N.F.; Anderson, M.O.; Hruska, R.C.; Halford, A.; Baun, C.; Nydegger, N. Unmanned aerial vehicle (UAV) hyperspectral remote sensing for dryland vegetation monitoring. In Proceedings of the 2012 4th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS), Shanghai, China, 4–7 June 2012; pp. 1–10. [Google Scholar]
- Müllerová, J.; Brůna, J.; Bartaloš, T.; Dvořák, P.; Vítková, M.; Pyšek, P. Timing is important: Unmanned aircraft vs. satellite imagery in plant invasion monitoring. Front. Plant Sci. 2017, 8, 887. [Google Scholar] [CrossRef]
- Perroy, R.L.; Sullivan, T.; Stephenson, N. Assessing the impacts of canopy openness and flight parameters on detecting a sub-canopy tropical invasive plant using a small unmanned aerial system. ISPRS J. Photogramm. Remote Sens. 2017, 125, 174–183. [Google Scholar] [CrossRef]
- Van Auken, O.W.; Taylor, D.L. Using a drone (UAV) to determine the Acer grandidentatum (bigtooth maple) density in a relic, isolated community. Phytologia 2017, 99, 208–220. [Google Scholar]
- Vepakomma, U.; Cormier, D. Potential of multi-temporal UAV-borne lidar in assessing effectiveness of silvicultural treatments. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 393–397. [Google Scholar] [CrossRef]
- Weil, G.; Lensky, I.; Resheff, Y.; Levin, N. Optimizing the timing of unmanned aerial vehicle image acquisition for applied mapping of woody vegetation species using feature selection. Remote Sens. 2017, 9, 1130. [Google Scholar] [CrossRef]
- PlanetTeam. Planet Application Program Interface: In Space for Life on Earth; PlanetTeam: San Francisco, CA, USA, 2017; Available online: https://api.planet.com (accessed on 21 August 2017).
- Civil Aviation Safety Authority. Unmanned Aircraft and Rocket Operations; CASR Part 101; CASR: Canberra, ACT, Australia, 2003. [Google Scholar]
- Marx, A.; McFarlane, D.; Alzahrani, A. UAV data for multi-temporal Landsat analysis of historic reforestation: A case study in Costa Rica. Int. J. Remote Sens. 2017, 38, 2331–2348. [Google Scholar] [CrossRef]
- Gwenzi, D. LiDAR remote sensing of savanna biophysical attributes: Opportunities, progress, and challenges. Int. J. Remote Sens. 2017, 38, 235–257. [Google Scholar] [CrossRef]
- Cui, J.Q.; Lai, S.; Dong, X.; Chen, B.M. Autonomous navigation of UAV in foliage environment. J. Intell. Robot. Syst. 2016, 84, 259–276. [Google Scholar] [CrossRef]
- Cui, J.Q.; Lai, S.; Dong, X.; Liu, P.; Chen, B.M.; Lee, T.H. Autonomous navigation of UAV in forest. In Proceedings of the 2014 International Conference on Unmanned Aircraft Systems (ICUAS), Orlando, FL, USA, 27–30 May 2014; pp. 726–733. [Google Scholar]
- Johansen, K.; Erskine, P.D.; McCabe, M.F. Using unmanned aerial vehicles to assess the rehabilitation performance of open cut coal mines. J. Clean. Prod. 2019, 209, 819–833. [Google Scholar] [CrossRef]
First Author; Year | Title | Objective (Short); Understorey Assessment | Geographic Location; Ecosystem; Field Validation; Platform | Spectral Range; Spatial Resolution (mm for Spectral, Points per m² for Laser); Extent Covered (ha) | Flight Frequency; Season | Findings/Conclusions |
---|---|---|---|---|---|---|
Ahmed; 2017 [36] | Hierarchical land cover and vegetation classification using multispectral data acquired from an unmanned aerial vehicle | Different classification methods to assess land cover with UAS; partial | Canada; forest, agricultural; spectral calibration; fixed-wing | RGB, RGB+NIR+RE; ND; 1450 | once; summer | They successfully classified land cover using spectral information along with texture and structure. They achieved 95% accuracy at the broadest level (i.e., forest, shrub, herbaceous), 82% identifying overstorey species, tall or short shrubs, and grasses or crops, and 89% identifying shrub and tree species and crop types. |
Bedell; 2017 [37] | Unmanned aerial vehicle-based structure from motion biomass inventory estimates | UAS-based imagery and SfM algorithms to estimate over- and understorey biomass; partial | United States; riparian; vegetation, geoposition; quadcopter | RGB; ND; 0.8 | ND; ND | They were able to count stems at a more spatially representative scale than fieldwork alone. Their use of Structure from Motion (SfM) resulted in a 3D point cloud comparable to those produced by LiDAR and field-based methods. |
Breckenridge; 2012 [1] | Using unmanned helicopters to assess vegetation cover in sagebrush steppe ecosystems | Assess the use of UAS to collect vegetation cover data; yes | United States; semi-arid; vegetation; helicopter | RGB; ND; 0.0084 UAS, 0.00045 traditionally sampled | once; summer | Comparing UAS imagery and fieldwork, they found similar cover areas of grass, litter, bare ground, and dead shrub. However, their UAS method overestimated shrub cover by misclassifying forbs. They concluded that UAS are a cost-effective technique for assessing vegetation cover. |
Chisholm; 2013 [38] | UAV LiDAR for below-canopy forest surveys | Use UAS LiDAR for understorey; yes | Singapore; roadside; vegetation; quadcopter | laser; ND; 0.04 | ND; ND | They reliably detected and measured trees with a DBH >200 mm. They had issues with GPS reception below the canopy, and suggest that monitoring understorey vegetation will be best in ‘forests on flat terrain with an open understorey and large regular-shaped trees’. |
Cunliffe; 2016 [22] | Ultra-fine grain landscape-scale quantification of dryland vegetation structure with drone-acquired structure-from-motion photogrammetry | SfM photogrammetry (point cloud) to quantify biomass in semi-arid rangelands; yes | United States; semi-arid; no; hexacopter | RGB; 10; 10 | once; autumn | Their use of SfM allowed the structural differentiation of individuals from 20-mm grass tussocks to trees. |
Dandois; 2013 [18] | High spatial resolution three-dimensional mapping of vegetation spectral dynamics using computer vision | The use of UAS to develop SfM point clouds; partial | United States; forest, floodplain; no; hexacopter | RGB, laser; 20 to 67 SfM, 1.7 to 45 laser; 18.75 | multiple (16 months); LiDAR leaf-off, UAS over a 16-month period in all seasons | They successfully used SfM to create 3D point clouds comparable to LiDAR, coupled with multispectral information to identify and monitor vegetation based on structural and spectral attributes, at a temporal frequency that allows the assessment of phenological variations. |
Getzin; 2012 [39] | Assessing biodiversity in forests using very high-resolution images and unmanned aerial vehicles | The use of UAS to monitor understorey biodiversity in forest; yes | Germany; forest; vegetation; fixed-wing | RGB; 70; 20 | twice; summer | They found that forest gaps can be used to assess understorey biodiversity from high-resolution imagery, and reported a correlation between forest gaps and vegetation diversity. |
Leduc; 2018 [40] | Mapping wild leek through the forest canopy using a UAV | Assess if UAS-imagery can be used to find and map wild leek; yes | Canada; forest; geoposition, colour calibration; quadcopter | RGB; 50; 8.6 | once; spring | They were able to identify wild leek with 76% accuracy, but suggest this would not be possible in areas where understorey species have similar spectral signatures and phenology. They suggest becoming familiar with temporal variations in the phenological attributes of different species in order to choose flight times suited to identifying each of them. |
Lopatin; 2017 [41] | Mapping plant species in mixed grassland communities using close range imaging spectroscopy | Assess use of UAS to identify grassland species; yes | Germany; botanical garden; vegetation, geoposition; simulated UAS (scaffold) | hyperspectral (61 bands 398 to 957 nm); 3; 6.87 × 10−5 | once; summer | They were only successful in areas with low structural complexity and low canopy overlap. They had trouble identifying species or individuals with great spectral variation due to mixed signals, and suggested that higher spatial resolution could help resolve this issue. |
Mafanya; 2017 [42] | Evaluating pixel and object based image classification techniques for mapping plant invasions from UAV derived aerial imagery: Harrisia pomanensis as a case study | Compare pixel vs. object-based classification for an invasive species; yes | South Africa; semi-arid; geoposition; unspecified UAS | RGB; 36.5; 872 | once; winter | Through object-based classification, the authors successfully identified invasive species based on their phenological characteristics. They noted that their classification was only possible in areas without overstorey. |
Mandlburger; 2016 [43] | Multi-temporal UAV-borne LiDAR point clouds for vegetation analysis-a case study | Assess temporal change in point cloud density (leaf on vs. leaf off); yes | Austria; forest; no; octopter | laser; 267 to 517 on ground, 348 to 757 on canopy; ND | twice; winter, spring, autumn | The method successfully collected data with similar point cloud densities under leaf on and leaf off conditions. |
Mitchell; 2012 [44] | Unmanned aerial vehicle (UAV) hyperspectral remote sensing for dryland vegetation monitoring | Compare classification methods of vegetation including shrubs, based on UAS hyperspectral data; partial | United States; semi-arid; vegetation, geoposition; fixed-wing | hyperspectral; ND; 0.006 | once; spring | They were able to acquire composite images suitable for classification from hyperspectral sensors, albeit with ‘complications’ during data acquisition. To monitor shrub cover, unsupervised classification performed better than supervised methods (a generic classification sketch follows this table). They recommended the acquisition of ground-truthing data, and suggest acquiring time series with a wide spectral range to ‘effectively’ identify understorey species. |
Müllerová; 2017 [45] | Timing is important: Unmanned aircraft vs. satellite imagery in plant invasion monitoring | Assess temporal timing and camera resolution needed to detect invasives based on phenology; yes | Czech Republic; river floodplain, grassland?; geoposition; fixed-wing | multispectral (RGB + modified NIR); 50; 225 | multiple (5 months); summer-autumn | They successfully identified weeds (during leaf-off conditions). They concluded that phenological stages are tightly related to detection accuracy, and highlighted the importance of monitoring across a wide temporal range to capture those phenological differences. |
Perroy; 2017 [46] | Assessing the impacts of canopy openness and flight parameters on detecting a sub-canopy tropical invasive plant using a small unmanned aerial system | Assess the use of UAS to detect an understorey invasive tree; yes | United States; forest; vegetation, geoposition; quadcopter | RGB; 13.7–53.1; 0.8 | once; summer | The authors successfully detected understorey trees at flight altitudes between 30 and 40 m, but not higher. Canopy openness >40% allowed detection of all plants, while plants under openness <10% were undetectable. They found that the use of oblique photos increased detection rates. |
Van Auken; 2017 [47] | Using a drone (UAV) to determine the Acer grandidentatum (bigtooth maple) density in a relic, isolated community | Count numbers of trees (understorey and overstorey) subject to heavy grazing by white-tailed deer (Odocoileus virginianus); yes | United States; forest; vegetation; quadcopter | RGB; 40.64; 1520 ha UAS, 0.56 ha traditionally sampled | once; autumn to spring | They successfully identified Acer grandidentatum woodland communities by monitoring phenological changes and flying the UAS when the foliage was visually distinct from the surrounding vegetation. |
Vepakomma; 2017 [48] | Potential of multi-temporal UAV-borne LiDAR in assessing effectiveness of silvicultural treatments | Use of LiDAR to detect changes in forest treatments and laser reaching the ground through autumn foliage; partial | Canada; forest; geoposition; helicopter | laser; ND; 10.5 | twice; summer, autumn | They suggest that understorey vegetation can be monitored through LiDAR by removing the overstorey information from the point cloud (see the height-stratification sketch after this table), and that their method can be used to assess successional stages. |
Weil; 2017 [49] | Optimizing the timing of unmanned aerial vehicle image acquisition for applied mapping of woody vegetation species using feature selection | Assess UAS to identify species, with herbaceous patches treated as a single class (no species identified); partial | Israel; forest; vegetation; self-produced fixed-wing | multispectral (RGB + NIR + RE); 200, on average; 10 | five; winter, spring, summer | They successfully identified different tree and shrub species as well as herbaceous patches by flying at different times that were phenologically relevant for the different species in their area. They conclude that flying at multiple relevant times can substitute for acquiring data across a wider spectral range. |
Zahawi; 2015 [19] | Using lightweight unmanned aerial vehicles to monitor tropical forest recovery | The use of SfM to assess structural complexity of restoration sites; partial | Costa Rica; forest; vegetation; hexacopter | RGB; ND; 13 | once; summer | They concluded that SfM methods couple spectral and structural information that can be used to assess habitats and forest dynamics, as well as to obtain biomass metrics. |
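Several of the reviewed studies separate understorey returns from overstorey returns in LiDAR or SfM point clouds (e.g., Chisholm 2013 [38]; Dandois 2013 [18]; Vepakomma 2017 [48]). None of them publish code, so the following is only a minimal sketch of the height-stratification idea, assuming a point cloud already exported as an N × 3 array of x, y, z coordinates in metres; the 1 m grid cell, the 0.1 m ground cut-off, and the 2 m understorey ceiling are illustrative values, not thresholds taken from the reviewed papers.

```python
import numpy as np

def stratify_understorey(points, cell=1.0, ground_cut=0.1, canopy_cut=2.0):
    """Split an (N, 3) array of x, y, z coordinates (metres) into ground,
    understorey, and overstorey strata using a per-cell minimum-z ground model."""
    xy = points[:, :2]
    z = points[:, 2]

    # Index every point into a coarse grid; the lowest return in each cell is
    # used as a crude local ground estimate (a proper DTM filter would refine this).
    ij = np.floor((xy - xy.min(axis=0)) / cell).astype(int)
    keys = ij[:, 0] * (ij[:, 1].max() + 1) + ij[:, 1]
    ground_z = np.full(keys.max() + 1, np.inf)
    np.minimum.at(ground_z, keys, z)

    # Height of every point above its local ground estimate.
    hag = z - ground_z[keys]

    return {
        "ground": points[hag < ground_cut],
        "understorey": points[(hag >= ground_cut) & (hag <= canopy_cut)],
        "overstorey": points[hag > canopy_cut],
    }

# Synthetic example: 10,000 random points over a 50 m x 50 m plot, 0-15 m tall.
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(0, 50, 10_000),
                       rng.uniform(0, 50, 10_000),
                       rng.uniform(0, 15, 10_000)])
layers = stratify_understorey(pts)
print({name: layer.shape[0] for name, layer in layers.items()})
```

In an operational workflow a dedicated ground-filtering step (for example a cloth-simulation or progressive-TIN filter) would replace the per-cell minimum used here, but the stratification logic is the same: classify each point by its height above the local ground surface and retain the understorey band.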
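Rows such as Ahmed 2017 [36], Mafanya 2017 [42], and Mitchell 2012 [44] compare pixel-based, object-based, supervised, and unsupervised classification of UAS imagery. As an illustration only, and not a reproduction of any of those workflows, the sketch below runs a pixel-based supervised classification with a random forest in scikit-learn; the band layout, the synthetic reflectance stack, and the label raster are assumptions standing in for a real orthomosaic and field-digitised training polygons.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical inputs: a (rows, cols, bands) reflectance stack from a UAS
# orthomosaic and a (rows, cols) label raster digitised from field plots
# (0 = unlabelled, 1 = grass, 2 = shrub, 3 = overstorey canopy).
rows, cols, bands = 200, 200, 5          # e.g. R, G, B, red edge, NIR
rng = np.random.default_rng(1)
stack = rng.random((rows, cols, bands))
labels = rng.integers(0, 4, (rows, cols))

# Flatten to (pixels, bands) and keep only labelled pixels for training.
X = stack.reshape(-1, bands)
y = labels.reshape(-1)
X_train, X_test, y_train, y_test = train_test_split(
    X[y > 0], y[y > 0], test_size=0.3, random_state=42)

# Pixel-based supervised classification.
clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))

# Predict a full cover map for the orthomosaic.
cover_map = clf.predict(X).reshape(rows, cols)
```

An object-based variant would first segment the orthomosaic and classify per-segment statistics (e.g., mean reflectance and texture) rather than individual pixels, which is the distinction examined by Mafanya 2017 [42].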
Data Characteristic | Small Size of Understorey Species | Similar Spectral Characteristics of Understorey Species | Spatial Overlap within the Understorey | Canopy Penetration |
---|---|---|---|---|
Spatial resolution | High spatial resolution imagery is required to detect small individual plants (see the GSD sketch after this table) | | High-resolution imagery might help with the identification of species within small gaps in the understorey | The use of high-resolution (small pixel size) imagery reduces the number of mixed pixels between the canopy and the understorey, helping to identify understorey species |
Spectral resolution | | Higher spectral resolution will help in the discrimination of subtle differences in vegetation reflectance (e.g., the use of multispectral and hyperspectral sensors) | | |
Temporal frequency | | Targeted surveys can allow discrimination based on phenological changes such as senescence and flowering | | Surveys can be targeted to coincide with leaf-off periods in deciduous forests |
Spatial extent | | Greater potential to avoid spectral signature alterations due to shadows (sun angle and clouds) | | |
Platform type | | | | The use of multi-rotors with collision avoidance might allow sub-canopy surveys in the near future |
SfM | | | | SfM uses trigonometry to improve ground coverage and penetration below the canopy to the understorey |
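The spatial-resolution row above turns on ground sampling distance (GSD), which is set by the camera's pixel pitch, its focal length, and the flight altitude. The sketch below applies the standard GSD relation (GSD = pixel pitch × altitude / focal length); the 3.3 µm pixel pitch and 8.8 mm focal length are illustrative camera values, not parameters from any of the reviewed studies.

```python
def gsd_m(pixel_pitch_um, altitude_m, focal_length_mm):
    """Ground sampling distance (m per pixel) for a nadir-pointing camera."""
    return (pixel_pitch_um * 1e-6) * altitude_m / (focal_length_mm * 1e-3)

def max_altitude_m(target_gsd_m, pixel_pitch_um, focal_length_mm):
    """Highest flight altitude that still achieves the target GSD."""
    return target_gsd_m * (focal_length_mm * 1e-3) / (pixel_pitch_um * 1e-6)

# Illustrative camera: 3.3 um pixel pitch, 8.8 mm focal length (assumed values).
print(f"GSD at 40 m: {gsd_m(3.3, 40, 8.8) * 1000:.1f} mm/pixel")      # ~15 mm
# To resolve a ~20 mm grass tussock with a few pixels across it,
# aim for a GSD of roughly 5 mm:
print(f"Max altitude for 5 mm GSD: {max_altitude_m(0.005, 3.3, 8.8):.0f} m")  # ~13 m
```

By this relation, resolving features as small as the 20-mm grass tussocks reported by Cunliffe 2016 [22] requires flying at only a few tens of metres with a typical small-UAS camera, which in turn limits the area covered per flight; this is the trade-off between spatial resolution and spatial extent discussed in Sections 4.1 and 4.3.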
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Hernandez-Santin, L.; Rudge, M.L.; Bartolo, R.E.; Erskine, P.D. Identifying Species and Monitoring Understorey from UAS-Derived Data: A Literature Review and Future Directions. Drones 2019, 3, 9. https://doi.org/10.3390/drones3010009