Review

Identifying Species and Monitoring Understorey from UAS-Derived Data: A Literature Review and Future Directions

by Lorna Hernandez-Santin 1,*, Mitchel L. Rudge 1,2, Renee E. Bartolo 2 and Peter D. Erskine 1

1 Centre for Mined Land Rehabilitation, The University of Queensland, Brisbane, QLD 4072, Australia
2 Environmental Research Institute of the Supervising Scientist, Department of Environment and Energy, Darwin, NT 0820, Australia
* Author to whom correspondence should be addressed.
Submission received: 13 December 2018 / Revised: 6 January 2019 / Accepted: 7 January 2019 / Published: 8 January 2019

Abstract

Understorey vegetation plays an important role in many ecosystems, yet identifying and monitoring understorey vegetation through remote sensing has proved a challenge for researchers and land managers because understorey plants tend to be small, spatially and spectrally similar, and are often blocked by the overstorey. The emergence of Unmanned Aerial Systems (UAS) is revolutionising how vegetation is measured, and may allow us to measure understorey species where traditional remote sensing previously could not. The goal of this paper was to review current literature and assess the current capability of UAS to identify and monitor understorey vegetation. From the literature, we focused on the technical attributes that limit the ability to monitor understorey vegetation—specifically (1) spatial resolution, (2) spectral sensitivity, (3) spatial extent, and (4) temporal frequency at which a sensor acquires data. We found that UAS have provided improved levels of spatial resolution, with authors reporting successful classifications of understorey vegetation at resolutions of between 3 mm and 200 mm. Species discrimination can be achieved by targeting flights to correspond with phenological events to allow the detection of species-specific differences. We provide recommendations as to how UAS attributes can be tailored to help identify and monitor understorey species.

1. Introduction

The interaction of plant species occurrence, diversity, distribution, and abundance results in particular vegetation assemblages. Disturbances to these vegetation assemblages can affect vegetation condition, including cover and structure, which reflects the environmental health of most ecosystems [1,2]. Vegetation structure is associated with spatial distribution and is composed of the horizontal and vertical attributes of species within each habitat [3,4]: it refers both to the distribution of species in horizontal space and to the vertical subdivision of multiple strata, with the number of strata dependent upon the complexity of the vegetation. In woody ecosystems, the overstorey is the uppermost layer of vegetation, often represented by the canopies of the tallest trees. All vegetation that falls under the overstorey represents classes of the understorey, and includes non-emergent individuals of overstorey tree species, smaller stature trees, climbers, shrubs, forbs, and grasses. Most woody biomass is stored in overstorey species, while the understorey generally has higher biodiversity and plays a critical role in forest nutrient cycles, soil carbon accumulation, and stand development [5,6,7]. As such, understorey vegetation is a vital component of most terrestrial woodland and forest ecosystems, although monitoring its dynamics remains challenging for researchers and land managers globally.
Methods to monitor vegetation depend on the specific goals of the research or management activities. The traditional, and most prevalent, methods involve hands-on fieldwork [2]. However, the spatial extent at which fieldwork can be conducted is usually limited by its high logistic cost in terms of time, money, and site access. Therefore, field-based methods have been considered ‘impractical’ for the monitoring of large areas [2,8,9]. The spatial extent problem has been overcome to a certain degree by the emergence of remote sensing methods since the 1970s [2]. For example, Breckenridge et al. [1] reported that their remote sensing method required 22% of the time that would have been needed to sample their sites using the point-frame field method. Remote sensing is based on the use of sensors to detect different properties of the environment from a distance, and the resulting analyses are often trained or verified with comparatively minimal fieldwork. The identification of plant species and their patterns can be achieved through remote sensing due to the inherent differences between species in terms of distribution, extent, structure, colour, and texture [2,10].
Remote sensing of vegetation cover has been mostly limited to monitoring the overstorey, while the understorey and its contribution to the environment have been greatly overlooked due to the challenges involved. Individual understorey species tend to be smaller than canopy trees, requiring finer spatial resolution for successful detection and identification through remote sensing technologies. Many understorey species can also have short lifecycles, making the timing of data collection crucial for detection. Traditional remote sensing systems have generally failed to detect and monitor understorey species due to limitations of the platforms and sensors used for data acquisition. Platform and sensor limitations are evident when considering cloud cover or the angle of the sun at the time of data acquisition: shadows cast by clouds and by the overstorey can alter the spectral signatures of understorey species or obscure them altogether [11,12]. Although sensors are designed to account for shadows to some extent [10] and there are desktop methods to correct the radiance of shaded areas (e.g., Yamazaki et al. [11]), prevention of shadows is preferred in most cases.
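To illustrate what a desktop shadow correction can look like, the following is a minimal sketch of a brightness-threshold shadow mask with a naive gain correction. It is not the method of Yamazaki et al. [11]; the threshold and gain values are assumptions for illustration only and would need tuning for each scene.

```python
import numpy as np

def shadow_mask(rgb, threshold=0.25):
    """Flag likely shadow pixels by low overall brightness.

    rgb: float array of shape (rows, cols, 3) scaled to [0, 1].
    threshold: illustrative brightness cut-off; tune per scene.
    """
    brightness = rgb.mean(axis=2)   # simple per-pixel brightness
    return brightness < threshold   # True where the pixel is likely shaded

def gain_correct_shadows(rgb, mask, gain=1.8):
    """Naively brighten shadowed pixels by a constant gain (illustrative only)."""
    corrected = rgb.copy()
    corrected[mask] = np.clip(corrected[mask] * gain, 0.0, 1.0)
    return corrected
```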

1.1. Sensors

Remote sensors can acquire data passively or actively, with passive sensors detecting the reflection of the sun’s radiation [13,14], and active sensors sending a signal and measuring the return to detect different properties [2,14], most commonly through laser or microwave signals [15]. The most common form of passive sensing involves image acquisition and analysis, where vegetation is classified or identified based on colour, texture, and, to a limited extent, structure, with each pixel of the image capturing the light reflectance of the species closest to the sensor [2,16]. Differences in vegetation colour are driven by slight variations in the proportion of pigments within cells, such as chlorophyll, carotene, and anthocyanin [2,16]. Structural differences include the distribution of cellular components, such as the spaces between organelles or between cells, which may hold different proportions of air and water, resulting in different light reflectance properties [2]. Differences in vegetation colour and structure lead to species-specific combinations of light reflectance across the spectrum, known as spectral signatures [2,16]. However, healthy vegetation most commonly reflects light in the green and near infrared portions of the spectrum, leading to overlapping spectral signatures for species with similar characteristics.
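The degree of overlap between two spectral signatures can be quantified with the spectral angle, a standard similarity measure rather than the method of any paper reviewed here; the sketch below compares two hypothetical four-band reflectance spectra.

```python
import numpy as np

def spectral_angle(sig_a, sig_b):
    """Angle (radians) between two reflectance spectra; smaller = more similar."""
    cos_theta = np.dot(sig_a, sig_b) / (np.linalg.norm(sig_a) * np.linalg.norm(sig_b))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

# Hypothetical reflectance values for two grasses at [blue, green, red, NIR]:
species_a = np.array([0.04, 0.10, 0.05, 0.45])
species_b = np.array([0.05, 0.11, 0.06, 0.48])
print(spectral_angle(species_a, species_b))  # small angle -> high spectral overlap
```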
The most prevalent active sensors currently used are Light Detection and Ranging (LiDAR) sensors, which send many pulses per second that bounce back (return) after striking a surface and use the travel time to determine the distance and angle of objects, resulting in a three-dimensional point cloud [4,13,17]. The number of returns acquired can be discrete (usually four returns) or continuous (full-waveform) [17]. LiDAR signals can penetrate lower strata [7]. However, the proportion of pulses that reach lower strata and return to the sensor is lower; for example, Hamraz et al. [6] found that 90% of pulses reached the forest overstorey whilst only 60% reached the understorey. The issue with sensors and detection of vegetation is most apparent when considering that passive sensors capture multiple wavelengths of the layer closest to the sensor, generally represented by overstorey species, while active LiDAR sensors reflect single-wavelength structural features, many of which are not species-specific.
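To illustrate how returns are commonly stratified once heights have been normalised to ground level, the following minimal sketch splits a point cloud into ground, understorey, and overstorey classes; the 0.5 m and 5 m cut-offs and the synthetic height distribution are assumptions for illustration only.

```python
import numpy as np

def stratify_returns(heights_agl, understorey_max=5.0):
    """Split LiDAR returns into strata by height above ground (metres).

    heights_agl: 1D array of return heights above ground level.
    understorey_max: illustrative cut-off separating understorey from overstorey.
    """
    ground = heights_agl < 0.5
    understorey = (heights_agl >= 0.5) & (heights_agl < understorey_max)
    overstorey = heights_agl >= understorey_max
    return ground, understorey, overstorey

# Hypothetical normalised point cloud: most returns come from the upper canopy.
heights = np.random.default_rng(0).gamma(shape=6.0, scale=2.5, size=10_000)
g, u, o = stratify_returns(heights)
print(f"understorey share of returns: {u.mean():.0%}")
```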

1.2. Platforms

Along with the sensor type, the remote sensing platform plays an important role in the ability to detect understorey vegetation, with trade-offs between cost, spatial extent, and spatial and temporal resolution. Traditional systems are delivered through satellite, aerial, or terrestrial technologies. Satellites have allowed the surface of the whole world to be monitored (actively and passively), and many provide freely available data, but they have coarse spatial resolutions and acquire data on set schedules that do not consider the needs of individual projects. Aircraft systems offer increased spatial resolution, but the cost is prohibitive for most projects, often exceeding ~USD$20,000 per flight [18,19]. Terrestrial delivery methods, such as hand-held laser scanning devices or cameras mounted on tripods, offer high spatial resolution, but are extremely limited in spatial extent and site accessibility. Satellites and aircraft typically capture images at distances of >400 m from target vegetation, where the spatial resolution is generally too low to discriminate understorey plants. Conversely, small Unmanned Aerial Systems (sUAS, referred to as UAS hereafter) can fly just above the canopy, which increases the spatial resolution considerably, possibly allowing the discrimination of small understorey plants.
Also known as drones or Unmanned Aerial Vehicles (UAVs), UAS can produce high resolution imagery at relatively low cost (starting from ~USD$300), at a moderate spatial scale (1–1000 hectares), and with user-defined collection timing [19,20]. The development of UAS has been accompanied by the development of small, lightweight, active and passive sensors (e.g., LiDAR, hyperspectral, and multispectral) that were historically restricted to aerial and satellite remote sensing. One of the biggest contributions of UAS technology to remote sensing has been the concomitant creation of 3D point clouds based on passive sensors, using Structure from Motion (SfM) algorithms. Although the combination of spectral and structural information from traditional remote sensing systems was possible, it was infrequently done, as it required matching data from different sources, with different extents, alignments, and even scales [18,21]. In contrast, SfM algorithms combine spectral information acquired from image sensors with structural information gathered from the movement and positioning of the UAS itself [4,18]. The structural portion of SfM requires the acquisition of overlapping images, covering as much of the surface as possible from different angles [22], which are then matched through ‘image feature detector’ algorithms [21].
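The feature-matching step that SfM builds on can be sketched with OpenCV’s ORB detector. The snippet below covers only tie-point matching between two synthetic overlapping frames; full SfM software adds camera pose estimation, triangulation, and bundle adjustment on top of this.

```python
import cv2
import numpy as np

# Synthetic stand-in for two overlapping UAS frames: a textured scene and a
# shifted crop of it (a real pipeline would load consecutive photos instead).
rng = np.random.default_rng(0)
scene = (rng.random((600, 800)) * 255).astype(np.uint8)
img1, img2 = scene[:, :600], scene[:, 200:]   # ~66% side overlap

orb = cv2.ORB_create(nfeatures=2000)          # ORB feature detector/descriptor
kp1, des1 = orb.detectAndCompute(img1, None)  # keypoints + binary descriptors
kp2, des2 = orb.detectAndCompute(img2, None)

# Brute-force matching with cross-checking keeps only mutual best matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(f"{len(matches)} candidate tie points between the two frames")
```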
UAS exist with many shapes and flight attributes; however, they are mainly subdivided into two groups: fixed-wing (the UAS-equivalent of airplanes) and multi-rotor (the equivalent of helicopters) platforms. In order to fly, fixed-wing systems must keep moving forward and have horizontal take-off and landing requirements (with some exceptions), while multi-rotors can hover and have vertical take-off and landing capabilities. The main difference among multi-rotors is the number of rotors, which can vary between one and sixteen. Most commonly, multi-rotors have four (known as quadcopters), six (hexacopters), or eight (octocopters) rotors. Fixed-wing systems fly more efficiently and faster (speeds can exceed 80 km/h), can cover larger areas, and can carry heavier payloads, allowing bigger sensors and more stored energy (batteries/fuel). However, multi-rotor systems have many advantages, including the ability to take off and land in confined spaces, hover, and manoeuvre through tight spaces. These characteristics can represent advantages or limitations for each UAS type, depending on the requirements of the project. The industry has recognised the strengths and weaknesses of the different platforms, such that a new wave of vertical take-off and landing (VTOL) UAS is currently attempting to bridge the gap between efficient flight and manoeuvrability (e.g., [23,24]). However, any of these UAS can be flown at user-defined intervals to correspond with phenological cues such as flowering [25] and senescence. As a result, UAS have the potential to overcome many of the difficulties faced when attempting to monitor understorey using traditional remote sensing techniques [26].

1.3. Vegetation Classification

Through remote sensing, vegetation is commonly classified using unsupervised or supervised methods, based on pixels or objects [27], that rely on different algorithms such as maximum likelihood, nearest neighbour, or machine learning (e.g., support vector machines, neural networks) [10,28,29,30]. With unsupervised classification, the user has no input beyond a user-defined number of classes, which the software detects throughout the image [10]. Conversely, with supervised classification, the user provides information on the signature of the different classes [10]. This information is usually provided as a geographic area (polygon), from which the spectral signature or feature is extracted and extrapolated (trained) to detect similar characteristics across the image. Pixel-based classification considers each pixel of an image as a separate entity, each with the same probability of being assigned to any of the available classes [27,31]. Object-based classification uses information from particular objects (clusters of pixels with shared characteristics, known as segments) and their neighbours to assign them to one of the available classes, incorporating information on shape and texture [27,28,31].
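To make the supervised/unsupervised distinction concrete, the following is a minimal pixel-based sketch in scikit-learn, with random numbers standing in for real reflectance values and field-assigned training labels; it illustrates the two workflows rather than reproducing any reviewed study.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

# pixels: (n_pixels, n_bands) reflectance table flattened from an image.
rng = np.random.default_rng(0)
pixels = rng.random((5000, 4))  # stand-in for real 4-band (RGB + NIR) data

# Unsupervised: the user supplies only the number of classes to find.
unsupervised_labels = KMeans(n_clusters=5, n_init=10).fit_predict(pixels)

# Supervised: the user supplies labelled training samples (here faked),
# normally extracted from field-verified polygons.
train_idx = rng.choice(len(pixels), size=500, replace=False)
train_labels = rng.integers(0, 5, size=500)  # stand-in for field-assigned classes
clf = RandomForestClassifier(n_estimators=200).fit(pixels[train_idx], train_labels)
supervised_labels = clf.predict(pixels)
```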
Both supervised and unsupervised methods can be used for spectral (e.g., multispectral) and structural (e.g., LiDAR-derived) data. When spectral data are used, the classification methods are based on reflectance values or the relationships among different bands; ratios and indices that enhance spectral differences, such as the widely used Normalized Difference Vegetation Index (NDVI), can be calculated [10,32,33]. Having a greater number of bands increases the probability of finding species-specific spectral signatures, which becomes more important when trying to identify species with high spectral overlap. Therefore, hyperspectral data can be expected to outperform multispectral data, and more than three bands can be expected to outperform the traditional red, green, and blue (RGB) data. When structural data are available, the classification methods are based on shapes and other structural features such as vegetation height and percent canopy cover [34]. SfM methods have been shown to deliver 3D point clouds that are comparable to those acquired through LiDAR [4], and they can carry spectral and structural information comparable to that acquired by complementing LiDAR with fieldwork [18]. Therefore, SfM methods allow a reliable integration of spectral information with structural attributes.
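As a worked example of such an index, NDVI is computed per pixel as (NIR − Red)/(NIR + Red), yielding values near 1 for healthy vegetation and near 0 (or below) for soil and water; the sketch below applies the formula to a hypothetical 2 × 2 patch.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)  # eps guards against divide-by-zero

# Hypothetical 2x2 patch: top row healthy leaves, bottom row bare soil.
nir = np.array([[0.45, 0.50], [0.20, 0.22]])
red = np.array([[0.05, 0.06], [0.15, 0.16]])
print(ndvi(nir, red))  # high values (~0.8) flag photosynthetic vegetation
```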

1.4. Objective

The goal of this paper was to review the current literature and assess the current capability of UAS to monitor understorey vegetation. We focused on the technical attributes that limit our ability to identify and monitor understorey vegetation, as identified by Sanders [16]: (1) spatial resolution, (2) spectral sensitivity, (3) spatial extent, and (4) the temporal frequency at which a sensor acquires data. Spatial resolution limits our ability to differentiate species or features, with coarser resolutions resulting in a higher proportion of mixed features (i.e., mixed pixels). Spectral sensitivity relates to the proportion of the light spectrum that is measured, which plays a vital role in discriminating species with similar spectral signatures. Spatial extent refers to the area covered, and temporal frequency refers to the number of times, and the times of the year, a particular area is surveyed and monitored. Through this process, we hope to highlight the successes and limitations of various studies, to help guide those attempting to use UAS to conduct species composition surveys and monitor understorey vegetation.

2. Materials and Methods

We used the systematic quantitative literature review method developed by Pickering and Byrne [35], which involves a methodical search of previous research. Between 13 and 16 November 2017, we searched databases including Web of Science, Science Direct, and Google Scholar, using the following keywords: “UAV understorey”, “UAV understory”, “UAS understorey”, “UAS understory”, “drone understory”, “drone understorey”, “UAV subcanopy”, “UAV sub-canopy”, “drone subcanopy”, and “drone sub-canopy”. Using the same keywords, we searched again on 15 October 2018, this time refining the search to include only results from 2017 and 2018 to capture publications that appeared after the original search (i.e., publications made on or after November 2017).
We only considered original-research papers published in peer-reviewed journals available online, and eliminated articles that did not include any information on UAS. With ten articles displayed per results page, we stopped searching after five consecutive pages without relevant information; by this point, only one of the search words was detected in the results. Articles were re-checked to assess their relevance in the context of monitoring understorey vegetation using UAS. We excluded articles that were not related to the subject or that used the term ‘understorey’ only as part of the site description. We extracted information on year of publication, objective, whether the understorey was a main part of the study (understorey assessment), geographic location and ecosystem, platform and spectral range, whether there was any measure of validation in the field and its type (field validation), flight frequency and season, extent covered (ha), spatial resolution (mm), and findings/conclusions.

3. Results

Our search keywords returned between zero and 656 results. In the original search (November 2017), Web of Science had the fewest results, with zero hits for four of the ten keyword combinations; Science Direct had between one and 20 results; and Google Scholar had the most, ranging between 77 and 656. Google Scholar included results with different spellings of the keywords, as well as results containing only one keyword rather than both. The second search (October 2018) showed similar patterns per database, but with more zero-result keywords, attributable to the temporal filter applied to the search.
In total, we downloaded 131 articles that were then re-checked to assess their relevance. Ultimately, we found only 18 original-research articles that both assessed understorey vegetation and used high resolution remote sensing (Table 1; [1,18,19,22,36,37,38,39,40,41,42,43,44,45,46,47,48,49]), all of which we reviewed. Of the 18 original-research articles, six did not have understorey species as their main objective, but were kept due to a partial inclusion of understorey in their objectives or analyses. The article by Lopatin et al. [41] used a terrestrial delivery mechanism (2.5 m scaffold) rather than a UAS, but was retained because its purpose was to simulate UAS imagery. Half of the articles conducted fieldwork (n = 10), and eight reported taking ground control measurements. Ahmed et al. [36] and Leduc and Knudby [40] reported the use of colour controls to calibrate imagery. The article by Weil et al. [49] grouped most understorey species as ‘herbaceous patches’, but was kept because it identified several shrub species.
Geographically, 17 of the 18 articles reviewed were conducted in the northern hemisphere and one in the southern hemisphere. The majority of UAS-understorey research has appeared in the literature in recent years, with nine papers published in 2017, three in 2012, two each in 2013 and 2016, and one each in 2015 and 2018. Most studies were conducted in forested environments, with 10 articles in natural forests and one in a riparian area. Other ecosystems surveyed included semi-arid environments (n = 4 papers), grassland (botanical garden, n = 1), floodplain (n = 1), and roadside (n = 1). Most studies acquired data only once (n = 12), while others collected data twice (n = 2), three times (n = 1), five times (n = 1), or six times (n = 1), and one did not specify. Based on the traditional four seasons, the majority of research was conducted during summer (n = 11 papers), followed by autumn (n = 6), spring (n = 5), and winter (n = 5). However, we note that seasonal variations closer to the equator and closer to sea level are less abrupt and, therefore, less relevant.
Of the 18 articles, 15 used passive sensors (spectral devices) and three used active sensors (laser scanning devices). Of the studies that used spectral sensors, ten reported analyses based on the Red, Green, Blue (RGB) wavelengths, three reported multispectral acquisition (RGB + NIR), and two reported on hyperspectral data. The spatial extent covered ranged between 0.8 ha and 1520 ha, the latter acquired over several plots. Only seven papers specifically stated the spatial resolution of the final product, which ranged between 3 mm and 70 mm per pixel.
Measures of success in terms of the detection of understorey species depend greatly on the objectives of each project, as well as the methods selected. With a sensor capturing RGB data at a resolution of 40 mm per pixel, Van Auken and Taylor [47] were able to identify and count the number of overstorey and understorey woody species in their area (18 ha). Müllerová et al. [45] used four bands (RGB + NIR) with 50 mm pixel resolution to successfully identify two invasive weeds (each located in a different area) during phenologically distinctive times (different for each species), with ideal times recognised after comparing data from six temporally different UAS flights (three flights per area). Mafanya et al. [42] used RGB data with a spatial resolution of 37 mm per pixel to identify an invasive species occurring in open areas (i.e., ignoring individuals that could have occurred under the canopy) during a phenologically relevant time, and compared classification methods; they found that unsupervised classification methods were less accurate than supervised classifications, a difference they attributed to low spectral resolution (i.e., using RGB bands only). Lopatin et al. [41] worked with a spatial resolution of 3 mm obtained from a hyperspectral camera mounted on a scaffold to simulate UAS data, successfully classifying species and their cover in areas with low canopy cover and low structural complexity; however, the authors noted that classification would be hindered for species with high intra-species variability, with similar structural signatures, or occurring under the canopy. Cunliffe et al. [22] used RGB data with a pixel size of 10 mm to successfully assess vegetation structure through SfM methods (90% accuracy based on canopy height) and quantify biomass in heterogeneous semi-arid rangelands, and stated that they were able to overcome the detection limitations for smaller vegetation associated with the SfM precision levels identified in more complex habitats (e.g., forests). Leduc and Knudby [40] reported 76% accuracy when identifying a phenologically distinctive understorey ‘spring ephemeral species’, wild leek (Allium tricoccum), based on RGB imagery with a pixel size of 50 mm. Getzin et al. [39] used RGB data with 70 mm resolution, correlated with field work, to successfully assess understorey diversity within canopy gaps of deciduous and deciduous/coniferous forests of Germany. Conversely, Perroy et al. [46] used RGB data to identify an invasive understorey tree in a tropical forest of Hawaii, acquiring data at different heights (pixel resolutions between 14 and 53 mm), camera angles, and degrees of canopy cover; they found that their finest resolution (flying at 30 m) detected only 41% of the tree stands, that oblique angles increased detection rates, but also that their methods failed to detect individuals under thick overstorey (<10% openness).
Research that used active sensors to account for understorey species was sparse (n = 2). Mandlburger et al. [43] used LiDAR sensors and found that their point clouds were similar during leaf-on and leaf-off conditions, except in areas with a dense shrub layer, which had lower point cloud density. Chisholm et al. [38] also used LiDAR sensors to monitor vegetation on the side of a road, and successfully detected 73% of trees with a diameter at breast height (DBH) above 200 mm, but were unable to reliably detect tree stems below 200 mm DBH, which would include the majority of juvenile trees. Although the success rates of these studies seem limited in their ability to detect understorey species, technological advances are likely to allow this in the near future. However, the challenge of signal penetration might continue to impede the full capabilities of LiDAR extraction of structural information in areas with dense understorey. For example, in a dense forest of Canada, Vepakomma and Cormier [48] reported return-signal acquisition rates of 10% in 2016 and only 2% in 2015, explaining the lower rate by the thick foliage of summer (2015) compared to autumn (2016).

4. Discussion

4.1. Spatial Resolution

Understorey species tend to be smaller and less spatially distinct than overstorey species; therefore, the detection and identification of understorey plants requires greater (i.e., finer) spatial resolutions. Traditional platforms tend to have resolutions that are too coarse to identify understorey species, with common, freely available, satellite-based resolutions historically ranging from 1 km to 30 m [26]; these have been improved by the recent emergence of satellites that deliver 3000 mm (3 m) resolutions [50] and by aircraft-based imagery, which, at best, delivers 100 mm resolution. Consequently, imagery-based remote sensing of understorey species pre-UAS was restricted to large forest gaps or habitats without overstorey cover, with limited species identification capabilities due to mixed pixels (i.e., multiple objects contributing spectral information to a single pixel).
UAS have made important contributions to improving spatial resolution, enabling the acquisition of sub-metre and even sub-centimetre resolutions and thereby allowing the identification of individuals in the understorey. Successful classifications of understorey vegetation analysed during this literature review reported resolutions ranging between 3 mm and 200 mm. A caveat is that the ultra-high resolutions acquired by UAS lead to hyper-differentiation of plant parts such as leaves, stems, and trunks, which can complicate identification at the species level due to a wider range of single-individual textures and spectral signatures [4]. As such, object-based image analysis is preferred over traditional pixel classification for UAS imagery.
It is important to note that, due to their high resolution, UAS-derived data are very sensitive to discrepancies in geographic positioning. Therefore, ground-truthing is an essential part of working with UAS-derived data. Spatial ground-truthing refers to the geopositioning of spectral signatures, generated by marking different species and several control points using high precision positioning systems, such as differential GPS (dGPS) or other GNSS receivers.
In terms of three-dimensional point clouds, spatial resolution has also increased considerably. For example, comparing platforms in a forested environment, McClelland et al. [9] found LiDAR point cloud densities between 30 and 70 pts/m² from manned aircraft, while those obtained from UAS ranged between 500 and 1500 pts/m², depending upon flight elevation. Moreover, SfM methods have also been demonstrated in different environments (mixed forest, riparian, floodplain), with Dandois and Ellis [18] finding that SfM point clouds obtained by UAS (20 to 67 pts/m²) had a higher density than those derived from manned-aircraft LiDAR data (1.7 to 45 pts/m²). However, the incorporation of SfM methods with traditional imagery analysis has greater potential. Based on UAS imagery of a tropical forest of Costa Rica, Zahawi et al. [19] used SfM algorithms to combine structural and spectral attributes and successfully estimated habitat and biomass characteristics of their restoration sites. In a forest of Tasmania, Australia, Wallace et al. [4] used UAS to compare laser scanning and SfM methods and found canopy cover estimates of 59% through fieldwork, 63% through LiDAR, and 50% through SfM. The under-estimation of canopy cover from SfM methods was interpreted as a result of ‘visual occlusion’ that prevented proper image overlap, especially at the edges of the plot [4]. Regardless, Wallace et al. [4] concluded that both LiDAR and SfM methods are comparable and result in an accurate representation of the environment.
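Point densities such as those quoted above are straightforward to compute by binning returns onto a grid; the following minimal sketch uses a synthetic cloud in place of real coordinates.

```python
import numpy as np

def point_density(x, y, cell=1.0):
    """Points per square metre on a regular grid (cell size in metres)."""
    x_edges = np.arange(x.min(), x.max() + cell, cell)
    y_edges = np.arange(y.min(), y.max() + cell, cell)
    counts, _, _ = np.histogram2d(x, y, bins=(x_edges, y_edges))
    return counts / (cell * cell)

# Hypothetical UAS point cloud over a 50 m x 50 m plot.
rng = np.random.default_rng(1)
x, y = rng.uniform(0, 50, 500_000), rng.uniform(0, 50, 500_000)
density = point_density(x, y)
print(f"mean density: {density.mean():.0f} pts/m^2")  # ~200 pts/m^2 here
```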

4.2. Spectral Sensitivity

Healthy vegetation strongly reflects green and near infrared wavelengths, resulting in many species sharing reasonably similar spectral signatures [2,16]. This means that finer differentiation of wavelengths (e.g., hyperspectral imagery) will be required to successfully discern spectrally similar species. For this reason, there is an added need for spectral ground-truthing, which includes the calibration of spectral signatures and is especially important for hyperspectral sensors. The miniaturisation of sophisticated sensors has allowed their incorporation into UAS, enabling the acquisition of fine spectral detail. An example of the improved species discrimination achievable through greater spectral sensitivity is provided by Ahmed et al. [36], who compared UAS-mounted RGB-only and multispectral sensors, combined with SfM algorithms, to classify land cover and vegetation at three hierarchical levels. Vegetation classifications based on RGB-only imagery resulted in lower accuracies than those based on multispectral sensors, with the latter achieving accuracies of 82% for the detection of forest overstorey species, 89% for the identification of three crop classes, and 95% for the classification of broad land cover classes [36].
Nevertheless, sensors capturing only the RGB bands are sufficient in some instances. For example, Leduc and Knudby [40] reported successful monitoring of understorey species based on RGB data when easily distinguishable phenological characteristics were present, with an accuracy of 76%. Other successful studies resulted from spectral information that was complemented by other information, such as structural attributes derived from SfM. For example, Cunliffe et al. [22] successfully delineated individual plants to account for biomass rather than species diversity, as species identification was not part of their aims. The limited use of hyperspectral sensors on UAS has left a knowledge gap that needs to be addressed. Although hyperspectral data provide more opportunities to find spectral signatures with less overlap among species, the only two papers reviewed that used this type of data reported only partial successes. Lopatin et al. [41] attributed their shortcomings to a relatively coarse spatial resolution, while Mitchell et al. [44] attributed theirs to ‘complications’ in data acquisition and suggested better results could be attained by using a wider spectral range and monitoring at different phenological times.

4.3. Spatial Extent

Compared to traditional remote sensing platforms, the greatest limitation of widely available UAS technology is the relatively small spatial extent they can cover. In contrast to the worldwide coverage of satellite imagery, research analysed during this literature review reported data acquisition over extents between 0.8 ha and 1520 ha. Despite this limitation relative to traditional platforms, UAS exceed the coverage capabilities of field-based methods. For example, through intensive fieldwork conducted between November 2015 and June 2016, Van Auken and Taylor [47] monitored vegetation using traditional field-based methods over 0.56 ha, whereas they were able to monitor 18 ha per UAS flight, for a total of 1520 ha surveyed by UAS in November 2014.
The spatial extent that widely available UAS can cover is severely limited by intrinsic and extrinsic factors. Intrinsic factors are related to the choice of platform and flight settings, as well as to battery life, which typically allows flights of 10 to 18 min depending on the system and the weight of the mounted sensors. Extrinsic factors are related to geographic and environmental conditions, but may also include regional laws and permits. Geographic and environmental conditions might become a limitation when monitoring larger areas, due to slight changes in the position of the sun or the limited ability to maintain ideal environmental conditions (such as those related to cloud cover, rain, and wind) over the course of longer surveys. Nonetheless, relative to traditional platforms, monitoring with UAS allows greater control over unsuitable environmental conditions. Legal regulations for commercial purposes (including research) in Australia and most international jurisdictions dictate that, unless a specific approval has been provided, UAS can only be flown by a qualified pilot with an operator’s certificate, away from airports and air traffic, at a maximum altitude of 120 m whilst maintaining visual line of sight with the aircraft, and not in close proximity to buildings or people [51].
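The combined effect of battery life and flight settings on achievable coverage can be approximated with a back-of-envelope calculation; all parameter values below are illustrative assumptions, and the turn-overhead factor in particular varies by platform.

```python
def coverage_ha(speed_ms, flight_min, line_spacing_m, turn_overhead=0.8):
    """Rough area (ha) one battery can map with a lawnmower flight pattern.

    speed_ms: ground speed (m/s); flight_min: usable flight time (minutes);
    line_spacing_m: distance between adjacent flight lines;
    turn_overhead: assumed fraction of time actually spent on survey lines.
    """
    survey_distance_m = speed_ms * flight_min * 60 * turn_overhead
    return survey_distance_m * line_spacing_m / 10_000  # m^2 -> ha

# A multi-rotor at 8 m/s, 15 min usable battery, 30 m line spacing:
print(f"{coverage_ha(8, 15, 30):.0f} ha per flight")  # ~17 ha
```

With these assumed settings, the result (~17 ha per battery) is consistent with the per-flight coverage reported by Van Auken and Taylor [47].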

4.4. Temporal Frequency

The temporal frequency at which data can be acquired using UAS technology is more flexible and cost effective than with traditional remote sensing platforms [45]. Satellites capture data on defined paths and frequencies regardless of the environmental conditions on Earth; for example, the Landsat satellite covers the world every 16 days [52]. Manned aircraft can have user-defined temporal frequencies, but are restricted by high economic and logistic costs [45]. On the other hand, UAS can be flown multiple times at flexible temporal frequencies [45], defined by the project’s needs. For example, Dandois and Ellis [18] used LiDAR data collected by manned aircraft in summer 2005 and autumn 2011, as well as spectral UAS imagery collected over a 19-month period covering all seasons, from August 2010 to February 2012. However, given that UAS cover a limited spatial extent and require on-site personnel, data are currently collected only on a needs basis, which restricts many possibilities of a posteriori data analyses.
Highly flexible temporal frequencies can help overcome three of the inherent difficulties in understorey monitoring: canopy penetration (including the shadow effect), the cloud shade effect, and understorey species discrimination. Canopy penetration can be addressed by acquiring data during leaf-off conditions in deciduous forests, when the target species or species assemblages are perennial (Figure 1). For example, Vepakomma and Cormier [48] monitored differences in forest thinning with LiDAR and found a decrease in signal penetrability during summer, attributed to thicker foliage. The shadow effect of the overstorey can be addressed by flying the UAS at, or close to, solar noon. The effect of cloud shadows can be prevented by flying in fully clear or uniformly overcast conditions, both of which provide more homogeneous illumination of targets.
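Since solar noon rarely coincides with 12:00 clock time, it is worth computing when scheduling shadow-minimising flights; the sketch below uses a standard equation-of-time approximation (accurate to within a few minutes, which is ample for this purpose), with the example location chosen purely for illustration.

```python
import math

def solar_noon_local(day_of_year, longitude_deg, utc_offset_hours):
    """Approximate local clock time of solar noon (decimal hours).

    Uses a common equation-of-time approximation; longitude is degrees
    east (negative for west). Accuracy of a few minutes is assumed to be
    sufficient for flight scheduling.
    """
    b = math.radians(360.0 / 365.0 * (day_of_year - 81))
    eot_min = 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)
    lstm = 15.0 * utc_offset_hours  # local standard time meridian (degrees)
    time_correction_min = 4.0 * (longitude_deg - lstm) + eot_min
    return 12.0 - time_correction_min / 60.0

# Darwin, Australia (~130.8 E, UTC+9.5) on day-of-year 180:
print(f"solar noon ~ {solar_noon_local(180, 130.8, 9.5):.2f} h local time")
```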
Species discrimination can be addressed by targeting UAS flights during particular events that allow phenological differences among species to be captured (i.e., by choosing the best season to acquire UAS data). For example, Müllerová et al. [45] monitored invasive species between May and November, and favoured classifications made during summer for giant hogweed (Heracleum mantegazzianum), when they were able to identify hogweed by its ‘large white inflorescences’, and during autumn for knotweeds (Fallopia spp.), when they detected a ‘reddish-brown colouring of decaying plants’. This also highlights that, given species-specific phenological cycles, multi-temporal images may need to be acquired to identify different species. Moreover, in strongly seasonal environments or those subject to seasonal fires, the whole array of understorey species may not be detectable at certain times of the year. For example, in the fire-prone open woodlands of northern Australia, which also have strong variations between the dry and wet seasons, identification of species and monitoring of understorey vegetation is most appropriate during the mid-wet season, when species are at their peak biomass but before fire risk increases towards the end of the wet and early dry season.

4.5. Recommendations

The most important issues relevant to the assessment of understorey species are the lack of penetration of spectral sensors and the partial penetration of active sensors (due to visual obstruction and signal thinning by the overstorey, respectively), together with the intrinsic characteristics of species in the understorey, issues that have been reported by several authors [1,6,38,46,48,53]. In relation to penetration, it is unsurprising that the main success stories of UAS-based understorey monitoring occur in open environments, where ‘above canopy’ images can be acquired without concealing understorey species. In fact, the best conditions have been identified by Lopatin et al. [41] as those with ‘low structural complexity and low canopy overlaps’. Thus, using current technologies, the environments most suitable for monitoring understorey species likely include savannas, desert grasslands, and shrublands, as well as open forests and even deciduous forests during leaf-off conditions. However, based on our findings, most attempts to monitor understorey species have been in unsuitable environments (i.e., closed forests), where overstorey species block observations of the understorey. Although there is an important knowledge gap regarding the use of this technology in open environments, some understorey species identification has already been successfully conducted in grasslands or during leaf-off conditions (e.g., Lopatin et al. [41] and Müllerová et al. [45]). The importance of using UAS technology, and of understanding its capabilities and limitations for monitoring understorey species in open environments, becomes more evident when considering that most of the terrestrial ecosystems on Earth are open environments such as savannas, desert grasslands, shrublands, and open forests. Thus, in this section, we provide suggestions and considerations for UAS-based monitoring of understorey species in open environments based on current technologies.
The characteristics that make understorey species measurement difficult are associated with their relatively smaller size (compared to overstorey species), the similarity of spectral signatures among species, and the spatial overlap of individuals and species (Table 2). The issue of remotely detecting smaller species could be overcome by managing the technological attributes associated with spatial resolution. UAS can attain ultra-high resolutions determined by sensor capabilities (e.g., the megapixel resolution of the camera) and by flight elevation, with finer resolutions attained closer to the ground. Testing flight elevations between 30 and 120 m, Perroy et al. [46] concluded that data acquired at 30 m, resulting in a pixel resolution of 14 mm in their case, was best for detecting tree stands in a tropical forest. Although elevation parameters have not been tested in open environments, which intrinsically have less overlap among vegetation (and therefore higher detectability than similar species in closed forests), similar resolutions (<14 mm) may allow the identification of vegetation species smaller than a tree stand, while coarser resolutions should be adequate to detect open forest tree stands.
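The link between flight elevation and pixel resolution follows the standard ground sample distance (GSD) relation, GSD = flying height × pixel pitch / focal length. The sketch below plans a flying height for a target resolution using hypothetical camera parameters, which is why the 14 mm GSD of Perroy et al. [46] maps to a different altitude here than in their study.

```python
def gsd_mm(altitude_m, focal_length_mm, pixel_pitch_um):
    """Ground sample distance (mm/pixel) from the pinhole camera relation:
    GSD = altitude * pixel_pitch / focal_length."""
    return altitude_m * 1000.0 * (pixel_pitch_um / 1000.0) / focal_length_mm

def altitude_for_gsd(target_gsd_mm, focal_length_mm, pixel_pitch_um):
    """Invert the relation to plan the flying height for a target resolution."""
    return target_gsd_mm * focal_length_mm / (pixel_pitch_um / 1000.0) / 1000.0

# Hypothetical camera: 10 mm lens, 3.0 um pixel pitch.
print(gsd_mm(30, 10, 3.0))            # ~9 mm/pixel when flying at 30 m
print(altitude_for_gsd(14, 10, 3.0))  # ~47 m would give a 14 mm GSD here
```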
The issue of spectral signature similarity can be addressed by managing the spectral sensitivity and temporal frequency attributes. The spectral sensitivity attribute can help differentiate species by capturing a greater number of bands in relevant portions of the spectrum, along the green, red, and near infrared wavelengths, for example, choosing multispectral or hyperspectral sensors over traditional three-band sensors (RGB). The temporal frequency attribute can be managed by choosing the time of the year where a particular species or set of species display phenological attributes that distinguish them from others in their environment (e.g., presence of flowers or fruits, leaf coloration or discoloration, or perennial compared to deciduous species, among others).
The issue of spatial overlap among species is perhaps the most challenging because, as with the canopy cover issue, species that occur beneath others in the understorey will go undetected. Partial overlap can be managed through the spectral sensitivity or temporal frequency attributes when spectral signatures differ, which may result from phenologically distinct stages, or through the selection of sensors/methods that allow monitoring of vegetation structure. For example, LiDAR- and SfM-derived 3D point clouds might allow species to be differentiated by shape, so they can be used as an additional step for species identification. However, as a standalone method, the lack of spectral information from LiDAR sensors makes them unsuitable for assessing the whole suite of understorey species in an environment, even if shape and location might allow the proper identification of some species.
Along with the choice of methods and the manipulation of the technological attributes of remote sensing, it is important to note that the choice of UAS will also contribute to the degree of success in monitoring understorey species. The greater manoeuvrability of multi-rotor systems, along with their ability to be used in tight spaces and the advancement of obstacle avoidance capabilities, will be essential for monitoring understorey vegetation. In fact, most of the articles reviewed here used multi-rotor systems to monitor understorey vegetation (Table 1). Manoeuvrability is especially important in closed environments, where recent flights in foliage (forested) environments were made possible using a quadcopter equipped with a 2D laser range finder and a Kalman filter, in combination with a real-time path planning algorithm to assess and avoid obstacles [54,55]. Cui et al. [54] reported that their real-time obstacle avoidance was possible through very slow flight speeds (0.5 m/s). Thus, it is likely that monitoring understorey vegetation in the future could also include closed environments.
Other features that can assist with species identification, beyond the tuning of adjustable attributes, include creating a library of spectral signatures and/or shapes of individual species. This can be done by using hand-held spectrometers, by conducting additional flights in which the sensors are positioned directly above identified species, and by conducting on-ground vegetation surveys in which different species are directly geo-referenced through high precision GPS. It is important to note that ground-truthing, in terms of georeferencing and spectral calibration, will be essential for the appropriate analysis of UAS-derived data.
Even when following these recommendations, it is likely that only species with a combination of the following qualities will be regularly detected: they are common, have a defining shape, and display distinct phenological attributes and/or spectral characteristics. These are the species that are most likely identifiable utilising high-resolution data captured with UAS.

5. Conclusions

We reviewed original research that used UAS as platforms to assess understorey vegetation, focusing on the four technical attributes outlined by Sanders [16] as key to successful understorey monitoring. We found that UAS-related technology, along with the emergence of lightweight sensors, has been used to assess understorey vegetation with varying degrees of success in terms of species identification. The level of success can be related to the choice of methods or sensors, given the interaction of the different attributes. It is important to note that the attributes discussed here are interrelated; for example, the platform can be flown at a higher altitude to cover a larger area at the expense of monitoring with a coarser resolution [16].
Remote sensing using UAS platforms can measure and monitor understorey vegetation at a local scale. Even though many of the limitations of the passive and active sensors of traditional platforms (i.e., manned airborne and satellite) still apply, UAS can be successfully flown close to the vegetation and even through the foliage (although this is currently not routine), providing very high spatial resolutions. High spatial resolutions will allow understorey vegetation to be measured and monitored in open environments such as savannas and forest gaps, with the potential to acquire data under the overstorey canopy as under-canopy UAS flight becomes feasible in the near future. Furthermore, with the technological advancements that have allowed the development of lightweight sensors, spectral sensitivity can be user defined, ranging from RGB to hyperspectral sensors. The temporal frequency at which data can be acquired is very flexible and defined by the project’s needs, but consistent multi-temporal datasets, in a rapidly changing technology environment [56], will continue to be a challenge to obtain.

Author Contributions

Conceptualization, P.D.E. and R.E.B.; Methodology, L.H.-S. and P.D.E.; Formal Analysis, L.H.-S.; Data Curation, L.H.-S. and M.L.R.; Writing–Original Draft Preparation, L.H.-S.; Writing–Review & Editing, L.H.-S., M.L.R., P.D.E., and R.E.B.; Visualization, P.D.E. and R.E.B.; Supervision, P.D.E. and R.E.B.; Project Administration, P.D.E. and R.E.B.; Funding Acquisition, P.D.E. and R.E.B.

Funding

This review received no external funding.

Conflicts of Interest

We declare there are neither conflicts of interest nor direct financial benefits from this paper.

References

  1. Breckenridge, R.P.; Dakins, M.; Bunting, S.; Harbour, J.L.; Lee, R.D. Using unmanned helicopters to assess vegetation cover in sagebrush steppe ecosystems. Rangel. Ecol. Manag. 2012, 65, 362–370. [Google Scholar] [CrossRef]
  2. Tehrany, M.S.; Kumar, L.; Drielsma, M.J. Review of native vegetation condition assessment concepts, methods and future trends. J. Nat. Conserv. 2017, 40, 12–23. [Google Scholar] [CrossRef]
  3. Morsdorf, F.; Mårell, A.; Koetz, B.; Cassagne, N.; Pimont, F.; Rigolot, E.; Allgöwer, B. Discrimination of vegetation strata in a multi-layered Mediterranean forest ecosystem using height and intensity information derived from airborne laser scanning. Remote Sens. Environ. 2010, 114, 1403–1415. [Google Scholar] [CrossRef] [Green Version]
  4. Wallace, L.; Lucieer, A.; Malenovský, Z.; Turner, D.; Vopěnka, P. Assessment of forest structure using two UAV techniques: A comparison of airborne laser scanning and Structure from Motion (SfM) point clouds. Forests 2016, 7, 62. [Google Scholar] [CrossRef]
  5. Zhang, Y.; Chen, H.Y.H.; Taylor, A.R. Aboveground biomass of understorey vegetation has a negligible or negative association with overstorey tree species diversity in natural forests. Glob. Ecol. Biogeogr. 2016, 25, 141–150. [Google Scholar] [CrossRef]
  6. Hamraz, H.; Contreras, M.A.; Zhang, J. Forest understory trees can be segmented accurately within sufficiently dense airborne laser scanning point clouds. Sci. Rep. 2017, 7, 6770. [Google Scholar] [CrossRef] [PubMed]
  7. Hamraz, H.; Contreras, M.A.; Zhang, J. Vertical stratification of forest canopy for segmentation of understory trees within small-footprint airborne LiDAR point clouds. ISPRS J. Photogramm. Remote Sens. 2017, 130, 385–392. [Google Scholar] [CrossRef] [Green Version]
  8. Xie, Y.; Sha, Z.; Yu, M. Remote sensing imagery in vegetation mapping: A review. J. Plant Ecol. 2008, 1, 9–23. [Google Scholar] [CrossRef]
  9. McClelland, M.P.; Hale, D.S.; van Aardt, J. A comparison of manned and unmanned aerial Lidar systems in the context of sustainable forest management. In Proceedings of the SPIE Commercial + Scientific Sensing and Imaging, Orlando, FL, USA, 15–19 April 2018; p. 9. [Google Scholar]
  10. Richards, J.A. Remote Sensing Digital Image Analysis: An Introduction, 5th ed.; Springer: Berlin/Heidelberg, Germany, 2013. [Google Scholar]
  11. Yamazaki, F.; Liu, W.; Takasaki, M. Characteristics of shadow and removal of its effects for remote sensing imagery. In Proceedings of the 2009 IEEE International Geoscience and Remote Sensing Symposium, Cape Town, South Africa, 12–17 July 2009; pp. IV-426–IV-429. [Google Scholar]
  12. Milas, A.S.; Arend, K.; Mayer, C.; Simonson, M.A.; Mackey, S. Different colours of shadows: Classification of UAV images. Int. J. Remote Sens. 2017, 38, 3084–3100. [Google Scholar] [CrossRef]
  13. Pádua, L.; Vanko, J.; Hruška, J.; Adão, T.; Sousa, J.J.; Peres, E.; Morais, R. UAS, sensors, and data processing in agroforestry: A review towards practical applications. Int. J. Remote Sens. 2017, 38, 2349–2391. [Google Scholar] [CrossRef]
  14. Chakraborty, A.; Sachdeva, K.; Joshi, P.K. Chapter 4—A reflection on image classifications for forest ecology management: Towards landscape mapping and monitoring. In Handbook of Neural Computation; Academic Press: Cambridge, MA, USA, 2017. [Google Scholar]
  15. He, K.S.; Bradley, B.A.; Cord, A.F.; Rocchini, D.; Tuanmu, M.-N.; Schmidtlein, S.; Turner, W.; Wegmann, M.; Pettorelli, N. Will remote sensing shape the next generation of species distribution models? Remote Sens. Ecol. Conserv. 2015, 1, 4–18. [Google Scholar] [CrossRef]
  16. Sanders, A. Mapping the distribution of understorey Rhododendron ponticum using low-tech multispectral UAV derived imagery. In The Roles of Remote Sensing in Nature Conservation: A Practical Guide and Case Studies; Díaz-Delgado, R., Lucas, R., Hurford, C., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 167–181. [Google Scholar]
  17. Eitel, J.U.H.; Höfle, B.; Vierling, L.A.; Abellán, A.; Asner, G.P.; Deems, J.S.; Glennie, C.L.; Joerg, P.C.; LeWinter, A.L.; Magney, T.S.; et al. Beyond 3-D: The new spectrum of LiDAR applications for earth and ecological sciences. Remote Sens. Environ. 2016, 186, 372–392. [Google Scholar] [CrossRef]
  18. Dandois, J.P.; Ellis, E.C. High spatial resolution three-dimensional mapping of vegetation spectral dynamics using computer vision. Remote Sens. Environ. 2013, 136, 259–276. [Google Scholar] [CrossRef] [Green Version]
  19. Zahawi, R.A.; Dandois, J.P.; Holl, K.D.; Nadwodny, D.; Reid, J.L.; Ellis, E.C. Using lightweight unmanned aerial vehicles to monitor tropical forest recovery. Biol. Conserv. 2015, 186, 287–295. [Google Scholar] [CrossRef] [Green Version]
  20. Aasen, H.; Burkart, A.; Bolten, A.; Bareth, G. Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance. ISPRS J. Photogramm. Remote Sens. 2015, 108, 245–259. [Google Scholar] [CrossRef]
  21. Dandois, J.; Baker, M.; Olano, M.; Parker, G.; Ellis, E. What is the point? evaluating the structure, color, and semantic traits of computer vision point clouds of vegetation. Remote Sens. 2017, 9, 355. [Google Scholar] [CrossRef]
  22. Cunliffe, A.M.; Brazier, R.E.; Anderson, K. Ultra-fine grain landscape-scale quantification of dryland vegetation structure with drone-acquired structure-from-motion photogrammetry. Remote Sens. Environ. 2016, 183, 129–143. [Google Scholar] [CrossRef] [Green Version]
  23. Vuruskan, A.; Yuksek, B.; Ozdemir, U.; Yukselen, A.; Inalhan, G. Dynamic modeling of a fixed-wing VTOL UAV. In Proceedings of the 2014 International Conference on Unmanned Aircraft Systems (ICUAS), Orlando, FL, USA, 27–30 May 2014; pp. 483–491. [Google Scholar]
  24. Yuksek, B.; Vuruskan, A.; Ozdemir, U.; Yukselen, M.A.; Inalhan, G. Transition flight modeling of a fixed-wing VTOL UAV. J. Intell. Robot. Syst. 2016, 84, 83–105. [Google Scholar] [CrossRef]
  25. Fletcher, A.T.; Erskine, P.D. Mapping of a rare plant species (Boronia deanei) using hyper-resolution remote sensing and concurrent ground observation. Ecol. Manag. Restor. 2012, 13, 195–198. [Google Scholar] [CrossRef]
  26. Anderson, K.; Gaston, K.J. Lightweight unmanned aerial vehicles will revolutionize spatial ecology. Front. Ecol. Environ. 2013, 11, 138–146. [Google Scholar] [CrossRef] [Green Version]
  27. Tansey, K.; Chambers, I.; Anstee, A.; Denniss, A.; Lamb, A. Object-oriented classification of very high resolution airborne imagery for the extraction of hedgerows and field margin cover in agricultural areas. Appl. Geogr. 2009, 29, 145–157. [Google Scholar] [CrossRef]
  28. Platt, R.V.; Rapoza, L. An evaluation of an object-oriented paradigm for land use/land cover classification. Prof. Geogr. 2008, 60, 87–100. [Google Scholar] [CrossRef]
  29. Tian, J.; Chen, D.M. Optimization in multi-scale segmentation of high-resolution satellite images for artificial feature recognition. Int. J. Remote Sens. 2007, 28, 4625–4644. [Google Scholar] [CrossRef]
  30. Tuia, D.; Volpi, M.; Copa, L.; Kanevski, M.; Munoz-Mari, J. A survey of active learning algorithms for supervised remote sensing image classification. IEEE J. Sel. Top. Signal Process. 2011, 5, 606–617. [Google Scholar] [CrossRef]
  31. Laliberte, A.S.; Rango, A.; Havstad, K.M.; Paris, J.F.; Beck, R.F.; McNeely, R.; Gonzalez, A.L. Object-oriented image analysis for mapping shrub encroachment from 1937 to 2003 in southern New Mexico. Remote Sens. Environ. 2004, 93, 198–210. [Google Scholar] [CrossRef]
  32. Lopez-Granados, F.; Jurado-Exposito, M.; Pena-Barragan, J.M.; Garcia-Torres, L. Using Remote Sensing for Identification of Late-Season Grass Weed Patches in Wheat. Weed Sci. 2006, 54, 346–353. [Google Scholar] [CrossRef]
  33. Teillet, P.M.; Staenz, K.; William, D.J. Effects of spectral, spatial, and radiometric characteristics on remote sensing vegetation indices of forested regions. Remote Sens. Environ. 1997, 61, 139–149. [Google Scholar] [CrossRef]
  34. Antonarakis, A.S.; Richards, K.S.; Brasington, J. Object-based land cover classification using airborne LiDAR. Remote Sens. Environ. 2008, 112, 2988–2998. [Google Scholar] [CrossRef]
  35. Pickering, C.; Byrne, J. The benefits of publishing systematic quantitative literature reviews for PhD candidates and other early-career researchers. High. Educ. Res. Dev. 2014, 33, 534–548. [Google Scholar] [CrossRef]
  36. Ahmed, O.S.; Shemrock, A.; Chabot, D.; Dillon, C.; Williams, G.; Wasson, R.; Franklin, S.E. Hierarchical land cover and vegetation classification using multispectral data acquired from an unmanned aerial vehicle. Int. J. Remote Sens. 2017, 38, 2037–2052. [Google Scholar] [CrossRef]
  37. Bedell, E.; Leslie, M.; Fankhauser, K.; Burnett, J.; Wing, M.G.; Thomas, E.A. Unmanned aerial vehicle-based structure from motion biomass inventory estimates. J. Appl. Remote Sens. 2017, 11, 026026. [Google Scholar] [CrossRef]
  38. Chisholm, R.A.; Cui, J.; Lum, S.K.Y.; Chen, B.M. UAV LiDAR for below-canopy forest surveys. J. Unmanned Veh. Syst. 2013, 1, 61–68. [CrossRef]
  39. Getzin, S.; Wiegand, K.; Schöning, I. Assessing biodiversity in forests using very high-resolution images and unmanned aerial vehicles. Methods Ecol. Evol. 2012, 3, 397–404. [Google Scholar] [CrossRef]
  40. Leduc, M.-B.; Knudby, A. Mapping wild leek through the forest canopy using a UAV. Remote Sens. 2018, 10, 70. [Google Scholar] [CrossRef]
  41. Lopatin, J.; Fassnacht, F.E.; Kattenborn, T.; Schmidtlein, S. Mapping plant species in mixed grassland communities using close range imaging spectroscopy. Remote Sens. Environ. 2017, 201, 12–23. [Google Scholar] [CrossRef]
  42. Mafanya, M.; Tsele, P.; Botai, J.; Manyama, P.; Swart, B.; Monate, T. Evaluating pixel and object based image classification techniques for mapping plant invasions from UAV derived aerial imagery: Harrisia pomanensis as a case study. ISPRS J. Photogramm. Remote Sens. 2017, 129, 1–11. [Google Scholar] [CrossRef]
43. Mandlburger, G.; Wieser, M.; Hollaus, M.; Pfennigbauer, M.; Riegl, U. Multi-temporal UAV-borne LiDAR point clouds for vegetation analysis-a case study. In Proceedings of the EGU General Assembly Conference Abstracts, Vienna, Austria, 17–22 April 2016; p. 7036. [Google Scholar]
  44. Mitchell, J.J.; Glenn, N.F.; Anderson, M.O.; Hruska, R.C.; Halford, A.; Baun, C.; Nydegger, N. Unmanned aerial vehicle (UAV) hyperspectral remote sensing for dryland vegetation monitoring. In Proceedings of the 2012 4th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS), Shanghai, China, 4–7 June 2012; pp. 1–10. [Google Scholar]
  45. Müllerová, J.; Brůna, J.; Bartaloš, T.; Dvořák, P.; Vítková, M.; Pyšek, P. Timing is important: Unmanned aircraft vs. satellite imagery in plant invasion monitoring. Front. Plant Sci. 2017, 8, 887. [Google Scholar] [CrossRef]
  46. Perroy, R.L.; Sullivan, T.; Stephenson, N. Assessing the impacts of canopy openness and flight parameters on detecting a sub-canopy tropical invasive plant using a small unmanned aerial system. ISPRS J. Photogramm. Remote Sens. 2017, 125, 174–183. [Google Scholar] [CrossRef]
  47. Van Auken, O.W.; Taylor, D.L. Using a drone (UAV) to determine the Acer grandidentatum (bigtooth maple) density in a relic, isolated community. Phytologia 2017, 99, 208–220. [Google Scholar]
  48. Vepakomma, U.; Cormier, D. Potential of multi-temporal UAV-borne lidar in assessing effectiveness of silvicultural treatments. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 393–397. [Google Scholar] [CrossRef]
  49. Weil, G.; Lensky, I.; Resheff, Y.; Levin, N. Optimizing the timing of unmanned aerial vehicle image acquisition for applied mapping of woody vegetation species using feature selection. Remote Sens. 2017, 9, 1130. [Google Scholar] [CrossRef]
50. Planet Team. Planet Application Program Interface: In Space for Life on Earth; Planet Team: San Francisco, CA, USA, 2017; Available online: https://api.planet.com (accessed on 21 August 2017).
  51. Civil Aviation Safety Authority. Unmanned Aircraft and Rocket Operations; CASR Part 101; CASR: Canberra, ACT, Australia, 2003. [Google Scholar]
  52. Marx, A.; McFarlane, D.; Alzahrani, A. UAV data for multi-temporal Landsat analysis of historic reforestation: A case study in Costa Rica. Int. J. Remote Sens. 2017, 38, 2331–2348. [Google Scholar] [CrossRef]
  53. Gwenzi, D. LiDAR remote sensing of savanna biophysical attributes: Opportunities, progress, and challenges. Int. J. Remote Sens. 2017, 38, 235–257. [Google Scholar] [CrossRef]
  54. Cui, J.Q.; Lai, S.; Dong, X.; Chen, B.M. Autonomous navigation of UAV in foliage environment. J. Intell. Robot. Syst. 2016, 84, 259–276. [Google Scholar] [CrossRef]
  55. Cui, J.Q.; Lai, S.; Dong, X.; Liu, P.; Chen, B.M.; Lee, T.H. Autonomous navigation of UAV in forest. In Proceedings of the 2014 International Conference on Unmanned Aircraft Systems (ICUAS), Orlando, FL, USA, 27–30 May 2014; pp. 726–733. [Google Scholar]
  56. Johansen, K.; Erskine, P.D.; McCabe, M.F. Using unmanned aerial vehicles to assess the rehabilitation performance of open cut coal mines. J. Clean. Prod. 2019, 209, 819–833. [Google Scholar] [CrossRef]
Figure 1. Identification and monitoring of understorey vegetation: challenges and suggestions to overcome them. (1) Illustrates the key challenges involved in identifying and monitoring understorey vegetation, which can be subdivided into (a) the overstorey and its shadow blocking the view of understorey species and (b) intrinsic challenges of understorey identification related to scale (smaller species) and the spatial and spectral overlap of understorey species. (2) Shows how flight parameters and technical specifications can be manipulated to help overcome the challenges of understorey monitoring. To overcome obscuration by the overstorey, users can (c) reduce the line spacing to increase side overlap and (d) reduce speed to increase forward overlap. To help overcome the spectral overlap of understorey species, (e) the sensor's spectral range can be increased, for example with multispectral and hyperspectral sensors. To assist with the detection of small understorey plants, operators can (f) fly lower and (g) change the camera specifications by increasing sensor resolution and the focal length of the lens. (3) Shows how UAS flights can be timed to overcome overstorey obscuration and spectral overlap in the understorey. Overstorey obscuration can be overcome by (g) targeting "leaf off" periods when working in deciduous environments. Spectral overlap can be overcome by targeting understorey phenological events such as (h) senescence and (i) flowering.
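The flight-parameter levers in Figure 1 can be made concrete with simple camera geometry. The sketch below (Python; the camera parameters are hypothetical values for a 1-inch, 20 MP sensor, and the function names are ours, not from any reviewed study) computes ground sample distance (GSD) and image overlap from altitude, focal length, line spacing, and flight speed, illustrating why flying lower or using a longer lens shrinks the pixel footprint (points f and g) and why tighter line spacing and slower flight increase overlap (points c and d).

```python
# Hypothetical camera: 1-inch sensor (13.2 mm x 8.8 mm), 2.4 um pixel pitch,
# 8.8 mm lens; none of these values come from the reviewed studies.

def gsd_mm(altitude_m, focal_length_mm, pixel_pitch_um):
    """Ground sample distance (mm per pixel): flying lower or using a longer
    focal length shrinks the ground footprint of each pixel (Figure 1, f-g)."""
    return altitude_m * 1000.0 * (pixel_pitch_um / 1000.0) / focal_length_mm

def footprint_m(altitude_m, focal_length_mm, sensor_dim_mm):
    """Ground footprint (m) of one image along a given sensor dimension."""
    return altitude_m * sensor_dim_mm / focal_length_mm

def side_overlap(line_spacing_m, footprint_across_m):
    """Side overlap fraction: reducing line spacing raises it (Figure 1, c)."""
    return 1.0 - line_spacing_m / footprint_across_m

def forward_overlap(speed_m_s, trigger_s, footprint_along_m):
    """Forward overlap fraction: flying slower raises it (Figure 1, d)."""
    return 1.0 - speed_m_s * trigger_s / footprint_along_m

print(f"GSD at 40 m: {gsd_mm(40, 8.8, 2.4):.1f} mm/px")
print(f"GSD at 20 m: {gsd_mm(20, 8.8, 2.4):.1f} mm/px")  # lower flight, finer GSD
across = footprint_m(40, 8.8, 13.2)   # ~60 m across track at 40 m altitude
along = footprint_m(40, 8.8, 8.8)     # ~40 m along track at 40 m altitude
print(f"Side overlap, 20 m line spacing: {side_overlap(20, across):.0%}")
print(f"Forward overlap, 5 m/s with 2 s trigger: {forward_overlap(5, 2, along):.0%}")
```

With these assumed parameters, halving the altitude from 40 m to 20 m halves the GSD from roughly 11 mm to 5.5 mm per pixel, which sits within the 3 mm to 200 mm range reported across the studies in Table 1.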
Table 1. Journal articles reviewed and relevant characteristics observed [1,18,19,22,36,37,38,39,40,41,42,43,44,45,46,47,48,49]. Objective (Short) refers to a condensed version of their main goal. ND = not determined.
Columns: First Author; Year [Reference] | Title | Objective (Short); Understorey Assessment | Geographic Location; Ecosystem; Field Validation; Platform | Spectral Range; Spatial Resolution (mm for spectral, points per m² for laser); Extent Covered (ha) | Flight Frequency; Season | Findings/Conclusions
Ahmed;
2017
[36]
Hierarchical land cover and vegetation classification using multispectral data acquired from an unmanned aerial vehicle
Different classification methods to assess land cover with UAS;
partial
Canada;
forest, agricultural; spectral calibration; fixed-wing
RGB, RGB+NIR+RE; ND;
1450
once;
summer
They successfully classified land cover using spectral information along with texture and structure, achieving 95% accuracy at the broadest level (i.e., forest, shrub, herbaceous), 82% when identifying overstorey species, tall or short shrubs, and grasses or crops, and 89% when identifying shrub and tree species and crop types.
Bedell;
2017
[37]
Unmanned aerial vehicle-based structure from motion biomass inventory estimates
UAS-based imagery and SfM algorithms to estimate over- and understorey biomass; partial
United States; riparian;
vegetation, geoposition; quadcopter
RGB;
ND;
0.8
ND;
ND
They were able to count stems at a more spatially representative scale than fieldwork alone. Their use of Structure from Motion (SfM) resulted in a 3-D point cloud comparable to those from LiDAR and field-based methods.
Breckenridge; 2012
[1]
Using unmanned helicopters to assess vegetation cover in sagebrush steppe ecosystems
Assess the use of UAS to collect vegetation cover data;
yes
United States;
semi-arid;
vegetation;
helicopter
RGB;
ND;
0.0084 UAS, 0.00045 traditionally sampled
once;
summer
Comparing UAS imagery and fieldwork, they found similar cover estimates for grass, litter, bare ground, and dead shrubs. However, their UAS method overestimated shrub cover by misclassifying forbs. They concluded that UAS are a cost-effective means of assessing vegetation cover.
Chisholm;
2013
[38]
UAV LiDAR for below-canopy forest surveys
Use UAS LiDAR for understorey;
yes
Singapore;
roadside;
vegetation; quadcopter
laser;
ND;
0.04
ND;
ND
They reliably detected and measured trees with a DBH >200 mm. They had issues with GPS reception among understorey vegetation, and suggest that monitoring understorey vegetation will work best in 'forests on flat terrain with an open understorey and large regular-shaped trees'.
Cunliffe;
2016
[22]
Ultra-fine grain landscape-scale quantification of dryland vegetation structure with drone-acquired structure-from-motion photogrammetry
SfM photogrammetry (point cloud) to quantify biomass in semi-arid rangelands;
yes
United States;
semi-arid;
no;
hexacopter
RGB;
10;
10
once;
autumn
Their use of SfM allowed the structural differentiation of individuals from 20-mm grass tussocks to trees.
Dandois;
2013
[18]
High spatial resolution three-dimensional mapping of vegetation spectral dynamics using computer vision
The use of UAS to develop SfM point clouds;
partial
United States;
forest, floodplain;
no;
hexacopter
RGB, laser;
20 to 67 SfM, 1.7 to 45 laser;
18.75
multiple (16 months);
LiDAR under leaf-off conditions; UAS flown in all seasons over a 16-month period
They successfully used SfM to create 3-D point clouds comparable to LiDAR, coupled with spectral information to identify and monitor vegetation based on structural and spectral attributes, at a temporal frequency that allowed the assessment of phenological variation.
Getzin;
2012
[39]
Assessing biodiversity in forests using very high-resolution images and unmanned aerial vehicles
The use of UAS to monitor understorey biodiversity in forests; yes
Germany;
forest;
vegetation;
fixed-wing
RGB;
70;
20
twice;
summer
They found that forest gaps, assessed from high-resolution imagery, can be used to evaluate understorey biodiversity, with gap characteristics correlated with vegetation diversity.
Leduc;
2018
[40]
Mapping wild leek through the forest canopy using a UAV
Assess if UAS imagery can be used to find and map wild leek;
yes
Canada;
forest;
geoposition, colour calibration; quadcopter
RGB;
50;
8.6
once;
spring
They were able to identify wild leek with 76% accuracy, but suggest this would not be possible in areas where understorey species have similar spectral signatures and phenology. They recommend becoming familiar with temporal variation in the phenology of different species so that flight times can be chosen to best discriminate them.
Lopatin;
2017
[41]
Mapping plant species in mixed grassland communities using close range imaging spectroscopy
Assess use of UAS to identify grassland species; yes
Germany;
botanical garden; vegetation, geoposition; simulated UAS (scaffold)
hyperspectral (61 bands 398 to 957 nm);
3;
6.87 × 10−5
once;
summer
They were only successful in areas with low structural complexity and low canopy overlap. They had trouble identifying species or individuals with high spectral variation due to mixed signals, and suggested that higher spatial resolution could help resolve this issue.
Mafanya;
2017
[42]
Evaluating pixel and object based image classification techniques for mapping plant invasions from UAV derived aerial imagery: Harrisia pomanensis as a case study
Compare pixel vs. object based classification for an invasive species;
yes
South Africa;
semi-arid;
geoposition; unspecified UAS
RGB;
36.5;
872
once;
winter
Through object-based classification, the authors successfully identified invasive species based on their phenological characteristics. They noted that their classification was only possible in areas without overstorey.
Mandlburger;
2016
[43]
Multi-temporal UAV-borne LiDAR point clouds for vegetation analysis-a case study
Assess temporal change in point cloud density (leaf on vs. leaf off);
yes
Austria;
forest;
no;
octocopter
laser;
267 to 517 on ground,
348 to 757 on canopy;
ND
twice;
winter, spring, autumn
The method successfully collected data with similar point cloud densities under leaf-on and leaf-off conditions.
Mitchell;
2012
[44]
Unmanned aerial vehicle (UAV) hyperspectral remote sensing for dryland vegetation monitoring
Compare classification methods of vegetation, including shrubs, based on UAS hyperspectral data; partial
United States;
semi-arid;
vegetation, geoposition;
fixed-wing
hyperspectral;
ND;
0.006
once;
spring
They were able to acquire composite images suitable for classification from hyperspectral sensors, albeit with 'complications' during data acquisition. For monitoring shrub cover, unsupervised classification performed better than supervised methods. They recommended acquiring ground-truthing data, and suggest acquiring time series with a wide spectral range to 'effectively' identify understorey species.
Müllerová; 2017
[45]
Timing is important: Unmanned aircraft vs. satellite imagery in plant invasion monitoring
Assess the temporal timing and camera resolution needed to detect invasive species based on phenology;
yes
Czech Republic; river floodplain, grassland?; geoposition;
fixed-wing
multispectral (RGB + modified NIR); 50;
225
multiple (5 months); summer–autumn
They successfully identified weeds (during leaf-off conditions). They concluded that phenological stage is tightly related to detection accuracy, and highlighted the importance of monitoring across a wide temporal range to capture those phenological differences.
Perroy;
2017
[46]
Assessing the impacts of canopy openness and flight parameters on detecting a sub-canopy tropical invasive plant using a small unmanned aerial system
Assess the use of UAS to detect an understorey invasive tree; yes
United States;
forest;
vegetation, geoposition; quadcopter
RGB;
13.7–53.1;
0.8
once;
summer
The authors successfully detected understorey trees at flight altitudes between 30 and 40 m, but not higher. Canopy openness >40% allowed detection of all plants, while those at <10% were undetectable. They found that the use of oblique photos increased detection rates.
Van Auken;
2017
[47]
Using a drone (UAV) to determine the Acer grandidentatum (bigtooth maple) density in a relic, isolated community
Count trees (understorey and overstorey) subject to heavy grazing by white-tailed deer (Odocoileus virginianus);
yes
United States;
forest;
vegetation; quadcopter
RGB;
40.64;
1520 ha UAS, 0.56 ha traditionally sampled
once;
autumn to spring
They successfully identified Acer grandidentatum woodland communities by monitoring phenological changes and timing UAS flights to periods when leaf phenology differed among species.
Vepakomma;
2017
[48]
Potential of multi-temporal UAV-borne LiDAR in assessing effectiveness of silvicultural treatments
Use of LiDAR to detect changes in forest treatments and laser reaching the ground through autumn foliage; partial
Canada;
forest;
geoposition; helicopter
laser;
ND;
10.5
twice;
summer, autumn
They suggest that understorey vegetation can be monitored with LiDAR by removing overstorey returns from the point cloud, that monitoring understorey structure is possible, and that their method can be used to assess successional stages (a minimal sketch of this kind of point-cloud stratification follows Table 1).
Weil;
2017
[49]
Optimizing the timing of unmanned aerial vehicle image acquisition for applied mapping of woody vegetation species using feature selection
Assess UAS to identify species, where herbaceous patches were treated as a single class (no species identified); partial
Israel;
forest;
vegetation;
self-produced fixed-wing
multispectral (RGB + NIR + RE);
200, on average;
10
five;
winter, spring, summer
They successfully identified different tree and shrub species, as well as herbaceous patches, by flying at different times that were phenologically relevant for the species in their area. They conclude that flying at multiple relevant times can substitute for acquiring data over a wider spectral range.
Zahawi;
2015
[19]
Using lightweight unmanned aerial vehicles to monitor tropical forest recovery
The use of SfM to assess the structural complexity of restoration sites; partial
Costa Rica;
forest;
vegetation;
hexacopter
RGB;
ND;
13
once;
summer
They concluded that SfM methods couple spectral and structural information that can be used to assess habitats and forest dynamics, as well as to obtain biomass metrics.
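Several LiDAR studies in Table 1 (e.g., Chisholm et al. [38]; Vepakomma and Cormier [48]) rest on separating understorey returns from overstorey returns within the point cloud. The sketch below is a minimal illustration of that idea, not the workflow of any reviewed study: it height-normalises points against a crude grid-minimum ground model and splits returns into ground, understorey, and overstorey strata using height thresholds. The 0.2 m and 5 m cut-offs are assumptions for illustration; operational workflows would use a dedicated ground-classification filter.

```python
import numpy as np

def normalise_heights(points, cell=1.0):
    """Crude height normalisation: per grid cell, subtract the lowest return
    (used here as the ground estimate) from every point's z value."""
    ij = np.floor(points[:, :2] / cell).astype(np.int64)
    keys = ij[:, 0] * 1_000_000 + ij[:, 1]     # flatten the 2-D cell index
    ground = {}
    for k, z in zip(keys, points[:, 2]):
        ground[k] = min(ground.get(k, np.inf), z)
    return points[:, 2] - np.array([ground[k] for k in keys])

def split_strata(points, understorey_max=5.0, cell=1.0):
    """Split returns into ground, understorey, and overstorey strata by
    height above ground; the 0.2 m and 5 m thresholds are assumptions."""
    h = normalise_heights(points, cell)
    return {
        "ground": points[h < 0.2],
        "understorey": points[(h >= 0.2) & (h < understorey_max)],
        "overstorey": points[h >= understorey_max],
    }

# Demonstration on synthetic points in a 50 m x 50 m plot, 0-20 m tall:
pts = np.random.rand(10_000, 3) * np.array([50.0, 50.0, 20.0])
strata = split_strata(pts)
print({name: len(p) for name, p in strata.items()})
```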
Table 2. Summary of challenges for the identification and monitoring of understorey species using UAS. Spatial resolution depends on flight parameters and sensor specifications. Spectral resolution depends on choice of sensor (RGB, multispectral, hyperspectral).
Challenges (columns): small size of understorey species; similar spectral characteristics of understorey species; spatial overlap within the understorey; canopy penetration.
Spatial resolution. Small size: high spatial resolution imagery is required to detect small individual plants. Spatial overlap: high-resolution imagery might help identify species within small gaps in the understorey. Canopy penetration: high-resolution (small pixel size) imagery reduces the number of mixed pixels between canopy and understorey, helping to identify understorey species.
Spectral resolution. Similar spectral characteristics: higher spectral resolution will help discriminate subtle differences in vegetation reflectance (e.g., through multispectral and hyperspectral sensors).
Temporal frequency. Similar spectral characteristics: targeted surveys can allow discrimination based on phenological changes such as senescence and flowering (a toy example of selecting flight dates this way follows the table). Canopy penetration: surveys can be targeted to coincide with leaf-off periods in deciduous forests.
Spatial extent. Greater potential to avoid spectral signature alteration caused by shading (sun angle and clouds).
Platform type. Canopy penetration: multi-rotors with collision avoidance might allow sub-canopy surveys in the near future.
SfM. Canopy penetration: SfM uses trigonometry (multi-view geometry) to improve ground coverage and penetrability below the canopy to the understorey.
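The temporal frequency row of Table 2, together with the findings of Müllerová et al. [45] and Weil et al. [49], suggests treating flight timing as a feature-selection problem: fly when the target species are most spectrally distinct. The toy example below (Python, with made-up reflectance values; the function names are ours) ranks candidate dates by the minimum pairwise spectral angle between species' mean spectra, so the selected date separates even the most similar pair of species.

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral angle (radians) between two mean reflectance vectors;
    larger angles mean more separable spectra."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def best_flight_date(spectra):
    """Rank candidate dates by the minimum pairwise spectral angle among
    species, so the chosen date separates even the most similar pair."""
    scores = {}
    for date, by_species in spectra.items():
        names = list(by_species)
        angles = [spectral_angle(by_species[a], by_species[b])
                  for i, a in enumerate(names) for b in names[i + 1:]]
        scores[date] = min(angles)
    return max(scores, key=scores.get), scores

# Made-up band means (blue, green, red, NIR) for two understorey species:
# spectrally similar in summer, divergent at autumn senescence.
spectra = {
    "summer": {"grass": np.array([0.05, 0.08, 0.06, 0.45]),
               "forb":  np.array([0.05, 0.09, 0.06, 0.44])},
    "autumn": {"grass": np.array([0.10, 0.12, 0.18, 0.30]),
               "forb":  np.array([0.06, 0.10, 0.07, 0.42])},
}
date, scores = best_flight_date(spectra)
print("Best date:", date, {d: round(s, 3) for d, s in scores.items()})
```

In practice the candidate spectra could come from pilot flights or a field spectrometer, and the separability measure could be swapped for alternatives such as the Jeffries-Matusita distance.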