Article

Characterization of Hazelnut Trees in Open Field Through High-Resolution UAV-Based Imagery and Vegetation Indices

1 Department of Control and Computer Engineering (DAUIN), Politecnico di Torino, Corso Duca degli Abruzzi, 24, 10129 Torino, Italy
2 Institute for Sustainable Plant Protection, National Research Council (IPSP-CNR), Strada delle Cacce, 73, 10135 Torino, Italy
* Author to whom correspondence should be addressed.
Submission received: 27 November 2024 / Revised: 15 December 2024 / Accepted: 24 December 2024 / Published: 6 January 2025
(This article belongs to the Section Smart Agriculture)

Abstract

The increasing demand for hazelnut kernels is favoring an upsurge in hazelnut cultivation worldwide, but ongoing climate change threatens this crop, causing yield decreases and exposing plants to uncontrolled pathogen and parasite attacks. Technical advances in precision agriculture are expected to help farmers control the physio-pathological status of crops more efficiently. Here, we report a straightforward approach to monitoring hazelnut trees in an open field, using aerial multispectral pictures taken by drones. A dataset of 4112 images, each with a 2-Mpixel resolution per tree and covering RGB, red edge, and near-infrared frequencies, was obtained from 185 hazelnut trees located in two different orchards of the Piedmont region (northern Italy). To increase accuracy, and especially to reduce false negatives, the image of each tree was divided into nine quadrants. For each quadrant, nine different vegetation indices (VIs) were computed, and in parallel, each tree quadrant was tagged as “healthy/unhealthy” by visual inspection. Three supervised binary classification algorithms were used to build models capable of predicting the status of a tree quadrant, using the VIs as predictors. Out of the nine VIs considered, only five (GNDVI, GCI, NDREI, NRI, and GI) were good predictors, while NDVI, SAVI, RECI, and TCARI were not. Using these five indices, a model accuracy of about 65%, with 13% false negatives, was reached in a way that was rather independent of the algorithm, demonstrating that some VIs allow inferring the physio-pathological condition of these trees. These achievements support the use of drone-captured images for performing a rapid, non-destructive physiological characterization of hazelnut trees. This approach offers a sustainable strategy for supporting farmers in their decision-making during agricultural practices.

1. Introduction

According to the International Society for Precision Agriculture (ISPA), precision agriculture is a method that “gathers, processes, and analyzes temporal, spatial, and individual data” to support more accurate management decisions during agricultural practices. Using different tools, such as GPS, sensors, drones or unmanned aerial vehicles (UAVs), satellite imagery, ground stations, and data analytics, farmers can monitor crop conditions, plant and soil health, and environmental variables at a highly detailed level. This enables more precise interventions, improving crop yield, sustainability, and management efficiency, while minimizing waste, environmental impact, and production costs. At present, farmers primarily rely on on-site inspections of plants by qualified personnel, but more recently, high-resolution imaging conducted with UAVs or drones has started to provide new opportunities in various precision agriculture applications, particularly viticulture [1,2]. Compared to other remote sensing platforms, UAVs offer greater flexibility, adaptability, and accuracy, with the advantages of easy displacement and operation [3,4]. However, multi-rotor UAVs suffer from limited battery duration, although this constraint also keeps the devices small-to-medium-sized and easy to handle. Moreover, like other sensing tools, UAVs are influenced by weather conditions (e.g., rain, snowfall, clouds, wind, and fog), limiting their applicability.
In agriculture, image analysis is commonly performed using the visible and IR spectra. Several approaches to analyzing the health status of plants have been adopted, based on neural networks (NNs) or vegetation indices (VIs), with promising results. Most VIs are proposed as a numerical synthesis of vegetation features and are calculated from the spectral characteristics associated with an image pixel [5]. In simple terms, VIs result from the combination of surface reflectance at two or more wavelengths, chosen to highlight a particular feature of the plant canopy. Since VIs mainly emphasize the photosynthetic activity of a crop, they are ubiquitously implemented in remote sensing agricultural applications, providing immediate indications of plant fitness in a specific environmental context. The majority of VIs rely on the inverse relationship between the red and near-infrared (NIR) band reflectance associated with green vegetation.
Aerial imaging has been applied (i) to survey forests, which harbor different types of plants but often consist of a prevalent botanical species, such as conifers [6]; and (ii) to monitor cultivated fields, where only one type of plant is grown [7,8], with the intent of developing precision agriculture. Forest monitoring generally aims to identify biotic (pests, diseases) or abiotic (pollution, ice, snow, fire, drought) stressors [9], facing the additional problem of considering different types of trees with diverse canopies and crowns. In this case, individual trees are not the target of analysis, contrary to orchard monitoring. Indeed, VIs combined with a random forest model successfully identified the discoloration of pine tree needles as the first stress response signal [10]. Moreover, VIs predicted the sanitary status of forest trees, successfully identifying dead plants [11]; other studies focused on tree height and volume analysis [12,13]. As for orchard monitoring, UAVs were exploited to identify the plant species cultivated in a specific area, for yield monitoring purposes [14,15].
Images can be acquired by different types of sensors, including radar, RGB, or infrared (IR) cameras, all of which lead to problems of image segmentation or of object identification and classification; these require different machine learning (ML) algorithms and, notably, deep learning algorithms based on NNs, which have been applied in particular to hazelnut fields [16,17,18]. At a smaller scale, the identification of single trees in an orchard was attempted as a semantic segmentation problem [15,19,20]. If the distribution and position of individual trees in an orchard are known, specific parameters, i.e., canopy surface, tree architecture, and plant volume, acting as indirect indicators of plant health and growth rate, can be evaluated through image segmentation and 3D reconstruction; these issues have already been considered for different orchard types, namely peach [21], olive [22,23,24], almond [25], apple [26], mango [27], cherry [28], pine [29], hazelnut [30,31,32], and grapevine [33,34].
Remote sensing techniques to assess the health status of individual trees in orchards typically consider water stress conditions, attacks by pests, or disease symptomatology [35]. The identification of citrus trees infected by huanglongbing and bacterial canker was achieved by applying different VIs, both in laboratory [36] and field conditions [37]. Other studies investigated the possibility of identifying verticillium wilt (VW) in olive trees, finding that crown-temperature-based indices, such as the crop water stress index (CWSI), chlorophyll fluorescence and carotenoid indices, the carotenoid reflectance index 2 (CRI2), and the normalized difference vegetation index (NDVI), were good predictors of early and advanced (e.g., water stress) symptoms caused by VW infection [38,39]. Moreover, apple scab could be detected in apple trees based on leaf wetness [40], while fire blight disease development appeared slightly correlated with normalized difference spectral indices (NDSIs) generated from visible–NIR reflectance spectra (e.g., green normalized difference vegetation index, GNDVI, NDVI, and normalized difference red edge, NDRE) [41]. Additionally, different VIs, including NDVI, GNDVI, NDRE, and REGNDVI, were applied to monitor the vigor and health status of peach trees [42]. Lastly, recent studies involving convolutional neural networks (CNNs) attempted to identify Halyomorpha halys bugs in orchards [43,44]. Overall, although several VIs have been used for pest and disease identification or for the analysis of the water status of plants, a general ready-to-use framework to transfer the results obtained in a single case—and specifically in one plant—to other individuals is not yet available. Both the complexity of disease diagnosis and the diversity of crop species hamper a direct application of the results to other methods of detecting diseases or abiotic stresses of fruit trees [30].
Extreme weather events, mainly heat waves and drought driven by ongoing climate change, are inevitably challenging the agricultural sector, causing serious losses in crop productivity and biodiversity [45,46]. Climate alteration favors the spread and diversity of pathogens affecting crops [47]. This, combined with the increased severity of diseases, further entails a massive use of chemical treatments in most European countries [48], with the inevitable onset of critical environmental issues [49,50]. These conditions have deeply modified agricultural practices, prompting the search for novel and more sustainable crop protection strategies. In this scenario, timely and steady monitoring of the physiological and pathological conditions of crops, including hazelnut, is fundamental for supporting farmers in preserving future harvests and safeguarding food security.
European hazelnut (Corylus avellana L.) is a major species of interest for its nutritional value, and its kernels are employed worldwide in the chocolate, confectionery, and bakery industries. This has pushed hazelnut cultivation outside its native areas: both the total surface planted and the tonnage harvested have grown continuously worldwide from 1961 to 2021, reaching an overall production of 1.07 million metric tons in 2023, with Turkey being the first producer and Italy the second (765,000 and 98,670 tons, respectively) [51]. European hazelnut trees are particularly sensitive to water deficiency, and under specific climate conditions and in areas with poor precipitation, supplemental irrigation is the only way to ensure plant productivity [52,53]. Additionally, hazelnut suffers from attacks by insects (such as leaf beetles, stink bugs, hazelnut weevils, scale insects, moths, aphids, etc.), mites (eriophyids), fungi (powdery mildew), bacteria (Xanthomonas and Pseudomonas spp.), viruses (e.g., Apple mosaic virus) [54,55,56,57], or other environmental stresses, such as intense UV exposure and temperature shifts [58]. Therefore, introducing regular and standardized monitoring systems to evaluate the physiological and sanitary conditions of hazelnut plants would allow a rational and timely management of irrigation and protection practices. Moreover, implementing innovative technologies to provide steady and real-time monitoring would support preventive strategies to promptly recognize these issues, guiding agronomic interventions in a more timely and effective manner.
The goal of this work was to build models capable of predicting the sanitary status of a plant from VI scores, using a binary classification approach. Since, to our knowledge, no studies applying VIs to C. avellana plants are available, we compared the performances of different VIs to select those effectively suitable for predicting the sanitary status of these plants in open-field conditions. Images of hazelnut orchards across various frequency spectra were collected with drones, while simultaneously characterizing the physio-pathological status of individual plants by visual inspection. Due to the erratic distribution of disease or stress symptoms, possibly linked to the bushy structure of hazelnut trees, images were partitioned and each sub-image was treated as a different datapoint. These data were used to calculate VIs and to develop and calibrate ML algorithms useful for an accurate characterization of each plant partition.
This approach allowed us to identify the most suitable VIs and ML tools capable of precisely classifying the target plants. A model accuracy of about 65%, with 13% false negatives, was achieved in a way that was rather independent of the algorithms, demonstrating that the selected VIs can be used to successfully infer the physio-pathological condition of hazelnut trees.
In summary, the novelty of the work presented in this paper lies in the following:
  • Applying VIs as predictors of health status for hazelnut trees, whose specific characteristics set them apart from the fruit trees where the VI approach has already been applied;
  • Considering portions of trees (and not whole trees) in the analysis, to achieve better classification accuracy and, in particular, fewer false negatives;
  • Identifying a subset of VIs (GNDVI, GCI, NDREI, NRI, and GI) as the best predictors, while excluding others (NDVI, SAVI, RECI, and TCARI), in a literature context where the best VI predictors vary with the tree species considered.

2. Materials and Methods

2.1. Site Description

This research was conducted during the summer season of 2022 in hazelnut fields located in the Cuneo province in the Piedmont region, Northern Italy. Images were collected from two different orchards. The first field was in Carrù (44°27′40.8″ N 7°49′59.5″ E) with 127 plants of 2 m height, planted at 4 m × 4 m spacing within and between rows, in parallel and contiguous rows on a flat terrain. The second field located in Farigliano (Dogliani, 44°32′37.8″ N 7°54′52.5″ E) hosted 58 plants of 3 m height, planted at the above spacing on a flat surface (Figure 1).

2.2. Image Collection

Remote-sensed images for crop telemonitoring were collected using a P4 Multispectral drone (DJI, Shenzhen, China). The drone is factory-equipped with a multispectral camera covering the RGB, red edge (RE) (730 nm), and near-infrared (NIR) (840 nm) bands, providing a 2-Mpixel resolution per image. For both fields, three flights were conducted at monthly intervals during the plant vegetative season (May–August).
To maximize image resolution, the drone was piloted to capture one image per plant from a 10 m height. Camera settings were defined to obtain a ground sample distance of 1 cm/pixel. Based on detailed maps of the field, a precise drone path was defined, with stops directly above each plant. The same path was repeated during each shooting. The UAV flight was automated: it was designed and executed with the DJI GS Pro app, using a polygon grid flight plan with a 10 m altitude, a velocity of 1.5 m/s, and the “Hover & Capture at Point” capture mode.

2.3. Image Processing

2.3.1. Plant Recognition

Each photograph was taken from the zenith above each tree. When images did not perfectly correspond to an individual plant, a pre-processing step was performed. For large plants with overlapping canopies (Figure 2a), a geometric cropping approach was adopted, based on the exact position of the trunk and the distance between plants. For instance, based on the 4 × 4 m distance of planting between and within rows, images were cropped to include a 4 × 4 m2 area, positioning the trunk in the center of the image.
For young plants with no overlapping canopy (Figure 2b), pixels corresponding to the plant were isolated from background pixels representing soil, using the normalized difference vegetation index (NDVI) (see Section 2.5 for its definition). Since soil typically has an NDVI value much lower than the plant canopy [4], pixels with NDVI values < 0.2 were excluded from subsequent analyses. From an RGB plant image (Figure 2c), the corresponding NDVI values are shown, with soil pixels below the threshold highlighted in red. After applying a hierarchical contouring algorithm and removing smaller contours contained within larger ones, plant contours were finally identified (green areas in Figure 2d).
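The NDVI-threshold step described above can be sketched as follows. This is a minimal illustration with synthetic reflectance values; the array names and per-pixel layout are assumptions, and only the NDVI < 0.2 cut-off comes from the text (the subsequent contouring step is omitted).

```python
import numpy as np

def vegetation_mask(nir, red, threshold=0.2):
    """Boolean mask of vegetation pixels: True where NDVI >= threshold.

    nir, red: per-pixel reflectance arrays (synthetic here).
    The 0.2 cut-off follows the text; soil pixels fall below it.
    """
    ndvi = (nir - red) / (nir + red + 1e-9)  # epsilon avoids division by zero
    return ndvi >= threshold

# Toy example: two vegetation pixels followed by two soil pixels
nir = np.array([0.60, 0.55, 0.30, 0.25])
red = np.array([0.10, 0.15, 0.28, 0.24])
mask = vegetation_mask(nir, red)  # [True, True, False, False]
```

In practice, the mask would be applied to a full image so that only vegetation pixels enter the VI computation.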

2.3.2. Image Slicing

Initially, we intended to classify each whole plant as healthy/unhealthy. However, symptoms of abiotic stress or pathogen attack (e.g., downward curling of the leaf lamina, leaf yellowing or reddening, leaf scorch, wilting of some portions of the canopy, and/or branch collapse) were visible only in a portion of the plant in around half of the collected images. Additionally, since VIs are computed per pixel and averaged to obtain a single value per image, whole-plant classification would likely produce a high number of false negatives. Therefore, images were geometrically partitioned into nine individual pictures (Figure 3), each representing a surface slightly larger than 1 m2, so that VIs were associated with each segmented image.
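The nine-quadrant partitioning can be sketched as a simple 3 × 3 grid split. This is an illustrative implementation; the paper does not specify how non-divisible image dimensions are handled, so here edge tiles absorb the remainder.

```python
import numpy as np

def slice_into_quadrants(img, n=3):
    """Split an image array (H, W, C) into an n x n grid of sub-images.

    When H or W is not divisible by n, tile boundaries are rounded,
    so some tiles absorb the remainder (an assumption of this sketch).
    """
    h, w = img.shape[:2]
    ys = np.linspace(0, h, n + 1, dtype=int)  # row boundaries
    xs = np.linspace(0, w, n + 1, dtype=int)  # column boundaries
    return [img[ys[i]:ys[i + 1], xs[j]:xs[j + 1]]
            for i in range(n) for j in range(n)]

# A 90 x 120 image with 5 bands yields nine 30 x 40 sub-images
tiles = slice_into_quadrants(np.zeros((90, 120, 5)))
```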

2.4. Tree Tagging

Each sliced image was visually inspected and classified in binary terms, i.e., healthy vs. unhealthy. To increase accuracy, the classification was made independently by three different persons with more than 10 years of botanic experience. In case of disagreement, the team met and discussed until an agreement was reached.
Binary classification in two classes only (healthy, not healthy) is limited, and ideally each image could be broadly characterized specifying the level of pathological status, with multiple possible values. However, due to the nearly limitless physio-pathological conditions of a plant and considering the reduced number of images collected, a binary classification was adopted.

2.5. Vegetation Indices (VIs)

Vegetation indices are numerical indicators computed from the different spectral ranges of an image, with the intent of characterizing vegetation properties [5]. VIs are computed per pixel, so they must be aggregated to provide one VI per image: the average over all pixels of an image is taken as the VI of that image. Due to the image slicing strategy adopted, VIs are associated with each sub-image.
Nine different VIs were tested, based on (i) vigor indices, used to differentiate plants from their surrounding (ground, soil, roads, people, etc.) and (ii) chlorophyll indices, suitable for evaluating the plant physiological conditions (e.g., the intensity of the green color of leaves as a proxy of the chlorophyll content, nitrogen reflectance, etc.). The VIs used in this work are listed below.
(1) NDVI, one of the most used VIs in remote sensing measurements, quantifies vegetation by measuring the normalized difference between the NIR bands, strongly reflected by vegetation, and the red (RED) bands. NDVI is calculated pixel by pixel as follows:
NDVI = (NIR − RED) / (NIR + RED)
NDVI can range from −1 to +1, with positive values attesting to the presence of vegetation, strictly negative values representing water (or clouds, for satellite imagery), and values around 0 indicating soil (dirt, rock, sand, etc.). Values > 0.3 characterize green areas such as crops or forests. Consequently, an NDVI below a certain threshold, due to low reflectance in the NIR bands, can be indicative of plants undergoing stress events.
(2) GNDVI, a proxy of the plant photosynthetic activity, provides information on the nitrogen and water content of plants. It employs the NIR and green bands (GREEN) but, contrary to NDVI, it is more sensitive to changes in chlorophyll content. For this, GNDVI is expected to more accurately determine the physiological status of plants and has a higher saturation point. Compared to NDVI, this chlorophyll-based index is normally applied at later stages of plant growth, as it saturates later than NDVI. Therefore, while NDVI is more suitable to estimate crop vigor during the early growing stages, GNDVI can be used in crops with dense canopies or in more advanced stages of development. Like NDVI, GNDVI can range from −1 to +1 and is computed as follows:
GNDVI = (NIR − GREEN) / (NIR + GREEN)
(3) The green chlorophyll vegetation index (GCI) is used to assess the sanitary condition of a crop, based on its chlorophyll content. GCI is derived from the NIR and GREEN bands and is calculated using the following formula, assuming values from −1 to +infinity:
GCI = NIR / GREEN − 1
(4) NDREI is commonly adopted to determine the concentration of chlorophyll in plants, mainly in the mid-to-late growing season when the plant fruits start to be mature. NDREI is based on the red-edge (RE) band measurement and is calculated using the following formula:
NDREI = (NIR − RED_EDGE) / (NIR + RED_EDGE)
(5) The red-edge chlorophyll index (RECI) assesses the chlorophyll concentration in leaves and, as NDREI, is determined using the ratio of reflectivity in the NIR and RE bands:
RECI = NIR / RED_EDGE − 1
(6) The nitrogen reflectance index (NRI) aims at determining the nitrogen content of plants, an indispensable macronutrient present in proteins, enzymes, and chlorophyll, whose deficiency causes stunted growth, small leaves with pale green or yellowish color, and lower chlorophyll concentration. Since low NRI values are commonly associated with higher reflectance, healthy plants are expected to exhibit lower NRI values.
NRI = (GREEN − RED) / (GREEN + RED)
(7) The greenness index (GI) directly correlates with the chlorophyll content and therefore the plant’s health status; as with NRI, low GI values characterize healthy crops:
GI = GREEN / RED
(8) The transformed chlorophyll absorption and reflectance index (TCARI) allows the identification of chlorotic areas in a field, due to nutritional deficiencies or plant diseases.
TCARI = 3 × [(RED_EDGE − RED) − 0.2 × (RED_EDGE − GREEN) × (RED_EDGE / RED)]
(9) The soil-adjusted vegetation index (SAVI) applies a correction to the NDVI to reduce the influence of soil brightness in areas where vegetation coverage is scarce. It is calculated with the following formula, where L indicates the correction factor.
SAVI = (1 + L) × (NIR − RED) / (NIR + RED + L)
L can range from 0 to 1; it is set to 1 for areas with scarce vegetation, but as it can assume different values depending on the environment, it is frequently set to 0.5. Notably, for L = 0, SAVI reduces to NDVI.
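For reference, the nine indices above can be collected into a single function operating on per-pixel reflectance arrays, with the per-image score then taken as the average of the per-pixel values. This is a sketch; the small epsilon guard against division by zero is our addition, not part of the original definitions.

```python
import numpy as np

EPS = 1e-9  # our addition: guards against division by zero

def vegetation_indices(nir, red, green, red_edge, L=0.5):
    """Per-pixel VIs from reflectance arrays, following the formulas above."""
    return {
        "NDVI":  (nir - red) / (nir + red + EPS),
        "GNDVI": (nir - green) / (nir + green + EPS),
        "GCI":   nir / (green + EPS) - 1,
        "NDREI": (nir - red_edge) / (nir + red_edge + EPS),
        "RECI":  nir / (red_edge + EPS) - 1,
        "NRI":   (green - red) / (green + red + EPS),
        "GI":    green / (red + EPS),
        "TCARI": 3 * ((red_edge - red)
                      - 0.2 * (red_edge - green) * (red_edge / (red + EPS))),
        "SAVI":  (1 + L) * (nir - red) / (nir + red + L),
    }

# Per-image score: the average of the per-pixel values
bands = {name: np.full((8, 8), value) for name, value in
         [("nir", 0.6), ("red", 0.1), ("green", 0.2), ("red_edge", 0.3)]}
scores = {name: float(vi.mean())
          for name, vi in vegetation_indices(**bands).items()}
```

With the sliced dataset, this function would be applied to each sub-image, yielding one VI vector per tree quadrant.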

2.6. Machine Learning Protocols

Three classical supervised ML algorithms, i.e., random forest, K-nearest neighbors (KNN), and logistic regression (LR), were applied and compared for classification [59]. In all cases, models were built using Python 3.7, with the libraries TensorFlow 2.11, scikit-learn 1.2, and PyTorch 1.13, adopting random search to optimize the hyperparameters and selecting k-fold = 5.
For random forest, the selected hyperparameters are number of estimators = 100, criterion = Gini, and no maximum depth. For KNN, the selected hyperparameters are number of neighbors = 5, uniform weights, and ball-tree distance computation with leaf size = 30. For LR, the selected hyperparameters are the lbfgs solver with l2 penalty.
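Under these stated hyperparameters, the three classifiers could be instantiated with scikit-learn roughly as follows. This is a sketch on synthetic data; the actual table of VI predictors is not reproduced here.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

# Hyperparameters as reported in the text
models = {
    "random_forest": RandomForestClassifier(
        n_estimators=100, criterion="gini", max_depth=None),
    "knn": KNeighborsClassifier(
        n_neighbors=5, weights="uniform", algorithm="ball_tree", leaf_size=30),
    "logistic_regression": LogisticRegression(solver="lbfgs", penalty="l2"),
}

# Toy data standing in for the five selected VI predictors per sub-image
X, y = make_classification(n_samples=200, n_features=5, n_informative=3,
                           random_state=0)
for model in models.values():
    model.fit(X, y)
```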

3. Results

3.1. Image Processing and Binary Classification of Plants

After image segmentation, a total of 4995 pictures were obtained during the whole season (from May to July). Image portions containing mostly ground soil were excluded, leading to a final dataset consisting of 4112 images. The class distribution of the whole dataset after visual inspection and binary classification across the whole acquisition period is shown in Figure 4. A clear increase in the number of images classified as “unhealthy” occurred along the season, with a ratio of healthy vs. unhealthy of 2.28, 0.92, and 0.39 in the three shooting time points, respectively. Considering the whole dataset, the two classes were evenly distributed.

3.2. Computation and Selection of VIs

The mean VIs were calculated for all pixels of an image, and this value was taken as the VI of the whole image. Figure 5 shows the boxplots for the VIs computed over the whole dataset of images, tagged as ‘healthy’ and ‘unhealthy’ classes.
The class distributions of “healthy/unhealthy” plants largely overlapped for the SAVI (which basically includes NDVI), RECI, and TCARI VIs. Indeed, SAVI mainly discriminates vegetation canopies from the ground. Both RECI and TCARI include the red edge band, suggesting that the content of this information is not highly meaningful for this study. Therefore, SAVI, RECI, and TCARI were excluded from further analyses.
Conversely, the GCI, GNDVI, and NDREI metrics showed a general trend of higher and lower values for healthy and unhealthy plants, respectively. Additionally, higher NRI and GI values were calculated for unhealthy plants vs. healthy plants, as expected.

3.3. VIs as Predictors of Health Status

The dataset was divided into a training set and a testing set in the proportions of 80–20%, using stratified random sampling. The best combinations of hyperparameters were achieved through parameter optimization methods, such as random search and grid search. Moreover, for each model, a k-fold cross-validation process was used and the evaluation was made showing confusion matrices and measures of accuracy, precision, recall, and F1-score.
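The evaluation pipeline described above (80–20% stratified split, 5-fold cross-validation, confusion matrix) can be sketched as follows; the synthetic dataset and the LR settings stand in for the real VI feature table.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_score, train_test_split

# Synthetic stand-in for the table of five VI scores per sub-image
X, y = make_classification(n_samples=1000, n_features=5, n_informative=3,
                           random_state=0)

# 80-20% stratified split, as in the text
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

clf = LogisticRegression(solver="lbfgs", penalty="l2")
cv_scores = cross_val_score(clf, X_tr, y_tr, cv=5)  # k-fold with k = 5

clf.fit(X_tr, y_tr)
cm = confusion_matrix(y_te, clf.predict(X_te))  # rows: true, cols: predicted
```

Accuracy, precision, recall, and F1-score can then be derived from `cm` or reported with `sklearn.metrics.classification_report`.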
The results of the computation of the above selected VIs, i.e., GNDVI, GCI, NDREI, NRI, GI, and their analyses with different ML algorithms are shown in Figure 6 and Table 1. Overall, the three different ML approaches adopted produced similar results. The LR method performed best with an accuracy of 0.66 and F1-scores of 0.62 and 0.69 for “healthy” and “unhealthy” plants, respectively. Random forest scored second in terms of accuracy, with 0.65, and obtained F1-scores of 0.61 and 0.69, while the KNN method achieved a slightly lower level of accuracy of 0.64 with F1-scores of 0.61 and 0.67.
In agriculture, the most important issue is recognizing unhealthy plants, and false negatives represent the most dangerous failure of a diagnostic technique. In this regard, KNN produced 132 false negatives (out of 823 data points, 16%), while only 112 and 113 false negatives were produced by the random forest and LR methods, respectively, corresponding to about 13% of cases.
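The false negative rates quoted above can be read directly off the confusion matrix. In the sketch below, only the 132 false negatives and the 823 total come from the text; the other three cells are hypothetical placeholders.

```python
import numpy as np

def false_negative_rate(cm):
    """Share of all data points that are false negatives.

    cm: 2x2 confusion matrix, rows = true labels, cols = predictions,
    class order (healthy, unhealthy). cm[1, 0] counts unhealthy plants
    predicted as healthy -- the dangerous misses.
    """
    return cm[1, 0] / cm.sum()

# 132 false negatives out of 823 test points, as reported for KNN;
# the remaining three cells are hypothetical placeholders.
cm_knn = np.array([[300,  91],
                   [132, 300]])
fnr = false_negative_rate(cm_knn)  # about 0.16
```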

4. Discussion

In this work, we used images acquired with drones, which have a resolution up to 16 times higher than that of non-military satellites such as Landsat 8, which produces images with a resolution of 15–100 m per pixel (depending on the band considered), unsuitable for individual tree monitoring.
To evaluate whether VIs could successfully predict the sanitary condition of a hazelnut grove, multispectral and hyperspectral sensors in the visible and IR range were considered. Among the nine VIs considered, SAVI, RECI, and TCARI could not successfully discriminate between healthy and unhealthy plants. Since SAVI is based on NDVI, we assumed that NDVI was also unsuitable for such discrimination analyses. On the contrary, GNDVI, GCI, NDREI, NRI, and GI were used to build models capable of correctly classifying plants, achieving an average accuracy of 65%. The significance of these results did not statistically differ when different methods were applied, such as random forest, LR, or KNN. Although a similar approach was already adopted for other crops [10,11,36,37,38,39,41,42], to our knowledge, no reports concerning hazelnut trees are available. This possibly results from the phenotypic traits of these plants, having a bushy pattern with large leaves and dense foliage, making it difficult to treat them as fruit trees. Such features could also explain the reduced performances of NDVI obtained in this work.
A wide range of accuracy is reported in the literature for the identification of the sanitary status of orchard trees using different VIs. This variability is linked to different factors, including the kind of orchard, the size and age of the trees, the stress condition considered, and the time of detection. For example, the WI (water index), ARI, and TCARI1 VIs achieved 92% accuracy in the identification of canker disease in the fruits of citrus trees [36], but only when the orchard was evaluated at late stages of infection (i.e., with clearly visible symptoms), while the accuracy dropped to around 46% at early stages of disease development. In another work, NIR_R scored as the best predictor for citrus tree and canker disease classification, while NDVI, GNDVI, and SAVI were inefficient [37]. In this case, a classification accuracy ranging between 67 and 85% was reported, but with false negative rates varying between 7 and 32%, possibly due to the low image resolution (4.5 cm per pixel) or to the fact that trees were classified as “with/without disease” as a whole, without considering the possible erratic or localized distribution of symptoms. In addition, the performances of a number of VIs (including NDVI and RDVI) were also tested to discriminate the level of severity of the disease, but in this case, the classification results were not reported [38]. In a study conducted on apple trees affected by the fire blight disease, NDVI, GNDVI, and NDRE showed classification accuracies between 74 and 90%, with false negatives in the range of 3–11% [41], but the number of plants considered was very limited. Moreover, the NDVI, GNDVI, NDRE, and REGNDVI values of peach trees were computed, assuming that higher values identify the most vigorous trees [42]; however, in this case, plant vigor was not correlated with any patho-physiological analysis of the trees.
The accuracy results of this study are in line with the studies reported above, but not better. A direct comparison is difficult because, while the usage of VIs is similar, the object of study (hazelnut) is different. Differences in accuracy could depend either on the intrinsic characteristics of hazelnut (foliage, bushy structure) or on the specificity of the research goal (quite broad in this study, in terms of healthy/not healthy, but more specific in other studies, targeting, for instance, only canker disease).
In other works, trees were generally considered as a whole and classification was made in terms of “healthy/unhealthy plant”. The image splitting strategy adopted herein prompted us to individually tag each sub-image and compute the selected VIs as the average over all pixels in the sub-image. This approach provided higher precision in the characterization of the phenotypic traits of plants, likely associated with the effects of abiotic and/or biotic stressors. This strategy was particularly suitable considering the typical hazelnut tree architecture, a bush composed of different independent branches possibly displaying different and erratic symptoms.
Moreover, this splitting strategy allowed us to minimize the risk of false negatives. Indeed, false negative predictions are a dangerous issue when early monitoring of biotic or abiotic stress conditions is pursued. In fact, in 10% of the trees considered, only one of the nine sub-images was classified as “unhealthy”, thus contributing to limiting the number of false negatives and simultaneously providing a more correct classification of the tree. However, despite this approach, up to 13% of the unhealthy plants were still predicted as “healthy” and were therefore incorrectly classified; nonetheless, this value is similar to or below the false negative rates reported previously (e.g., 7–32% in the study by [37] and 3–11% in [41]).
Overall, these results lay the foundations to develop a UAV-based monitoring service suitable to flag potentially stressed or diseased plants on a whole field surface, thereby enabling farmers to rapidly identify the plants requiring prompt agronomical intervention. In this scenario, a drone-based monitoring service capable of surveying large fields, at high frequency (e.g., at least once a week) with repeated inspections of flagged plants, can be envisaged. This approach would help farmers to save time and money, thanks to the more frequent and deeper controls of flagged individuals in a timely manner.
The accuracy of plant classification using the proposed approach would benefit from larger datasets, from other computational techniques of image analysis such as neural networks, or from a wider range of spectral frequencies. In addition, a more detailed definition of the plant parameters, beyond the binary classification adopted here, would strongly increase the value of this approach. However, given the almost infinite variety of stresses that can affect plants, even simultaneously, a pragmatic approach should be adopted, focusing on a finite number of patho-physiological conditions specific to the crop of interest, e.g., "water stress", "pest attack", "pathogen infection", or "no symptoms".
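The classification step discussed above can be sketched as follows; the feature matrix is synthetic, and scikit-learn's RandomForestClassifier stands in for the three supervised algorithms compared in the paper (all data, sizes, and hyperparameters here are illustrative, not the study's actual pipeline):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, confusion_matrix, f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
# Synthetic stand-in for the five selected VIs (GNDVI, GCI, NDREI, NRI, GI)
X = rng.normal(size=(4112, 5))
# Synthetic labels: 0 = healthy, 1 = unhealthy (visual-inspection tags in the study)
y = (X[:, 0] + 0.5 * rng.normal(size=4112) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)

# False negatives (unhealthy predicted as healthy) are the critical error type here
tn, fp, fn, tp = confusion_matrix(y_test, pred).ravel()
print(f"accuracy={accuracy_score(y_test, pred):.2f}  "
      f"F1={f1_score(y_test, pred):.2f}  "
      f"false-negative rate={fn / (fn + tp):.2f}")
```

Reporting the false-negative rate alongside accuracy and F1, as done above, mirrors the evaluation emphasis of this work.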

5. Conclusions

In this work, we tested the suitability of different VIs for precisely classifying the sanitary status of hazelnut plants, adopting in parallel a binary classification of trees based on visual inspection. The novelty of this work lies in the application of VIs as predictors of sanitary conditions in this crop, which is characterized by specific plant architecture and canopy morphology traits. Out of the nine VIs tested, GNDVI, GCI, NDREI, NRI, and GI were selected as the best predictors, unlike NDVI, SAVI, RECI, and TCARI, indicating that the suitability of VIs changes according to the crop considered. Moreover, the specific features of hazelnut trees called for the adoption of an image segmentation strategy to increase classification accuracy and reduce false negatives. Overall, a model accuracy of about 65%, with 13% false negatives, was achieved in a way rather independent of the algorithm used, demonstrating that the aforementioned VIs can be used to successfully infer the physio-pathological condition of hazelnut trees.
In conclusion, we gathered and analyzed data on the physio-pathological status of cultivated hazelnut plants with the aim of preserving the qualitative and quantitative traits of this crop, including product yield, and ultimately reducing agricultural management costs (e.g., irrigation and pest control measures). Computing VIs from images collected with a small commercial drone coupled with a multispectral camera allowed the prompt identification of unhealthy hazelnut plants at an early stage, providing farmers with an accurate tool to identify trees with specific physio-pathological issues. Frequent, continuous, large-scale, and real-time monitoring of the physio-pathological condition of plants, which currently relies mostly on human intervention, can optimize agronomic practices in the medium to long term, resulting in cost reductions and improvements in productivity and product quality. We believe that the proposed approach can be extended to the entire fruit sector.

Author Contributions

Conceptualization, M.M., L.A. and E.N.; methodology and analysis, S.P., C.P., A.M. and J.D.; software, S.P. and J.D.; validation, S.P., J.D. and L.A.; formal analysis and investigation, M.M., E.N., C.P., A.M., S.P. and J.D.; data curation, S.P., J.D., M.M. and L.A.; writing—original draft preparation, M.M.; writing—review and editing, E.N. and C.P.; supervision, M.M., E.N. and L.A.; project administration, M.M.; funding acquisition, M.M. and E.N. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the project DRONUTS—DROni per il monitoraggio Noccioli sUl Territorio, P.O.R. FESR 2014/2020—Asse I—Azione I.1b.1.2—Bando PRISM-E.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors upon request.

Acknowledgments

The authors are grateful to Michele Carelli, Carlo Ferrero, and Eleonora Caronia of Linear Systems s.r.l., and Maria Corte of Agrion for their technical support.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Primicerio, J.; Caruso, G.; Comba, L.; Crisci, A.; Gay, P.; Guidoni, S.; Genesio, L.; Aimonino, D.R.; Vaccari, F.P. Individual Plant Definition and Missing Plant Characterization in vineyards from High-Resolution UAV Imagery. Eur. J. Remote Sens. 2017, 50, 179–186. [Google Scholar] [CrossRef]
  2. Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Barge, P.; Tortia, C.; Gay, P. Enhanced vineyard descriptors combining UAV 2D and 3D crop models. In Proceedings of the AgEng 2018—New Engineering Concepts for Valued Agriculture, Wageningen, The Netherlands, 8–10 July 2018; pp. 49–56. Available online: https://hdl.handle.net/2318/1670903 (accessed on 20 August 2022).
  3. Velusamy, P.; Rajendran, S.; Mahendran, R.K.; Naseer, S.; Shafiq, M.; Choi, J.-G. Unmanned Aerial Vehicles (UAV) in Precision Agriculture: Applications and Challenges. Energies 2022, 15, 217. [Google Scholar] [CrossRef]
  4. Liu, J.; Xiang, J.; Jin, Y.; Liu, R.; Yan, J.; Wang, L. Boost Precision Agriculture with Unmanned Aerial Vehicle Remote Sensing and Edge Intelligence: A Survey. Remote Sens. 2021, 13, 4387. [Google Scholar] [CrossRef]
  5. Xue, J.; Su, B. Significant Remote Sensing Vegetation Indices: A Review of Developments and Applications. J. Sensors 2017, 2017, 353691. [Google Scholar] [CrossRef]
  6. Karthigesu, J.; Owari, T.; Tsuyuki, S.; Hiroshima, T. Improving the Estimation of Structural Parameters of a Mixed Conifer–Broadleaf Forest Using Structural, Textural, and Spectral Metrics Derived from Unmanned Aerial Vehicle Red Green Blue (RGB) Imagery. Remote Sens. 2024, 16, 1783. [Google Scholar] [CrossRef]
  7. Soussi, A.; Zero, E.; Sacile, R.; Trinchero, D.; Fossa, M. Smart Sensors and Smart Data for Precision Agriculture: A Review. Sensors 2024, 24, 2647. [Google Scholar] [CrossRef]
  8. Sassu, A.; Gambella, F.; Ghiani, L.; Mercenaro, L.; Caria, M.; Pazzona, A.L. Advances in Unmanned Aerial System Remote Sensing for Precision Viticulture. Sensors 2021, 21, 956. [Google Scholar] [CrossRef]
  9. Ecke, S.; Dempewolf, J.; Frey, J.; Schwaller, A.; Endres, E.; Klemmt, H.-J.; Tiede, D.; Seifert, T. UAV-Based Forest Health Monitoring: A Systematic Review. Remote Sens. 2022, 14, 3205. [Google Scholar] [CrossRef]
  10. Dash, J.P.; Pearse, G.D.; Watt, M.S. UAV Multispectral Imagery Can Complement Satellite Data for Monitoring Forest Health. Remote Sens. 2018, 10, 1216. [Google Scholar] [CrossRef]
  11. Saarinen, N.; Vastaranta, M.; Näsi, R.; Rosnell, T.; Hakala, T.; Honkavaara, E.; Wulder, M.A.; Luoma, V.; Tommaselli, A.M.G.; Imai, N.N.; et al. Assessing Biodiversity in Boreal Forests with UAV-Based Photogrammetric Point Clouds and Hyperspectral Imaging. Remote Sens. 2018, 10, 338. [Google Scholar] [CrossRef]
  12. Gatziolis, D.; Lienard, J.F.; Vogs, A.; Strigul, N.S. 3D Tree Dimensionality Assessment Using Photogrammetry and Small Unmanned Aerial Vehicles. PLoS ONE 2015, 10, e0137765. [Google Scholar] [CrossRef]
  13. Han, X.; Thomasson, J.A.; Bagnall, G.C.; Pugh, N.A.; Horne, D.W.; Rooney, W.L.; Jung, J.; Chang, A.; Malambo, L.; Popescu, S.C.; et al. Measurement and Calibration of Plant-Height from Fixed-Wing UAV Images. Sensors 2018, 18, 4092. [Google Scholar] [CrossRef] [PubMed]
  14. Ozdarici-Ok, A.; Ozgun-Ok, A. Using remote sensing to identify individual tree species in orchards: A review. Sci. Hortic. 2023, 321, 112333. [Google Scholar] [CrossRef]
  15. Lobo Torres, D.; Queiroz Feitosa, R.; Nigri Happ, P.; Elena Cué La Rosa, L.; Marcato Junior, J.; Martins, J.; Olã Bressan, P.; Gonçalves, W.N.; Liesenberg, V. Applying Fully Convolutional Architectures for Semantic Segmentation of a Single Tree Species in Urban Environment on High Resolution UAV Optical Imagery. Sensors 2020, 20, 563. [Google Scholar] [CrossRef] [PubMed]
  16. Tumer, I.N.; Sengul, G.S.; Sertel, E.; Ustaoglu, B. Object-Based Detection of Hazelnut Orchards Using Very High Resolution Aerial Photographs. In Proceedings of the 12th International Conference on Agro-Geoinformatics (Agro-Geoinformatics), Novi Sad, Serbia, 15–18 July 2024; pp. 1–5. [Google Scholar] [CrossRef]
  17. Lodato, F.; Pennazza, G.; Santonico, M.; Vollero, L.; Grasso, S.; Pollino, M. In-Depth Analysis and Characterization of a Hazelnut Agro-Industrial Context through the Integration of Multi-Source Satellite Data: A Case Study in the Province of Viterbo, Italy. Remote Sens. 2024, 16, 1227. [Google Scholar] [CrossRef]
  18. Sasso, D.; Lodato, F.; Sabatini, A.; Pennazza, G.; Vollero, L.; Santonico, M.; Merone, M. Hazelnut mapping detection system using optical and radar remote sensing: Benchmarking machine learning algorithms. Artif. Intell. Agric. 2024, 12, 97–108. [Google Scholar] [CrossRef]
  19. Cheng, Z.; Qi, L.; Cheng, Y.; Wu, Y.; Zhang, H. Interlacing Orchard Canopy Separation and Assessment using UAV Images. Remote Sens. 2020, 12, 767. [Google Scholar] [CrossRef]
  20. Dong, X.; Zhang, Z.; Yu, R.; Tian, Q.; Zhu, X. Extraction of Information about Individual Trees from High-Spatial-Resolution UAV-Acquired Images of an Orchard. Remote Sens. 2020, 12, 133. [Google Scholar] [CrossRef]
  21. Mu, Y.; Fujii, Y.; Takata, D.; Zheng, B.; Noshita, K.; Honda, K.; Ninomiya, S.; Guo, W. Characterization of peach tree crown by using high-resolution images from an unmanned aerial vehicle. Hortic. Res. 2018, 5, 74. [Google Scholar] [CrossRef] [PubMed]
  22. Caruso, G.; Palai, G.; D’Onofrio, C.; Marra, F.; Gucci, R.; Caruso, T. Detecting biophysical and geometrical characteristics of the canopy of three olive cultivars in hedgerow planting systems using an UAV and VIS-NIR cameras. Acta Hortic. 2021, 1314, 269–274. [Google Scholar] [CrossRef]
  23. Stateras, D.; Kalivas, D. Assessment of Olive Tree Canopy Characteristics and Yield Forecast Model Using High Resolution UAV Imagery. Agriculture 2020, 10, 385. [Google Scholar] [CrossRef]
  24. Sarabia, R.; Aquino, A.; Ponce, J.M.; López, G.; Andújar, J.M. Automated Identification of Crop Tree Crowns from UAV Multispectral Imagery by Means of Morphological Image Analysis. Remote Sens. 2020, 12, 748. [Google Scholar] [CrossRef]
  25. Torres-Sánchez, J.; de Castro, A.; Peña, J.; Jiménez-Brenes, F.; Arquero, O.; Lovera, M.; López-Granados, F. Mapping the 3D structure of almond trees using UAV acquired photogrammetric point clouds and object-based image analysis. Biosyst. Eng. 2018, 176, 172–184. [Google Scholar] [CrossRef]
  26. Hobart, M.; Pflanz, M.; Weltzien, C.; Schirrmann, M. Growth Height Determination of Tree Walls for Precise Monitoring in Apple Fruit Production Using UAV Photogrammetry. Remote Sens. 2020, 12, 1656. [Google Scholar] [CrossRef]
  27. Ishida, T.; Kurihara, J.; Viray, F.A.; Namuco, S.B.; Paringit, E.C.; Perez, G.J.; Takahashi, Y.; Marciano, J.J. A novel approach for vegetation classification using UAV-based hyperspectral imaging. Comput. Electron. Agric. 2018, 144, 80–85. [Google Scholar] [CrossRef]
  28. Blanco, V.; Blaya-Ros, P.J.; Castillo, C.; Soto-Vallés, F.; Torres-Sánchez, R.; Domingo, R. Potential of UAS-Based Remote Sensing for Estimating Tree Water Status and Yield in Sweet Cherry Trees. Remote Sens. 2020, 12, 2359. [Google Scholar] [CrossRef]
  29. Gallardo-Salazar, J.L.; Pompa-García, M. Detecting Individual Tree Attributes and Multispectral Indices Using Unmanned Aerial Vehicles: Applications in a Pine Clonal Orchard. Remote Sens. 2020, 12, 4144. [Google Scholar] [CrossRef]
  30. Vinci, A.; Brigante, R.; Traini, C.; Farinelli, D. Geometrical Characterization of Hazelnut Trees in an Intensive Orchard by an Unmanned Aerial Vehicle (UAV) for Precision Agriculture Applications. Remote Sens. 2023, 15, 541. [Google Scholar] [CrossRef]
  31. Altieri, G.; Maffia, A.; Pastore, V.; Amato, M.; Celano, G. Use of High-Resolution Multispectral UAVs to Calculate Projected Ground Area in Corylus avellana L. Tree Orchard. Sensors 2022, 22, 7103. [Google Scholar] [CrossRef]
  32. Martelli, R.; Civitarese, V.; Barbanti, L.; Ali, A.; Sperandio, G.; Acampora, A.; Misturini, D.; Assirelli, A. Multi-Parametric Approach to Management Zone Delineation in a Hazelnut Grove in Italy. Sustainability 2023, 15, 10106. [Google Scholar] [CrossRef]
  33. Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Gay, P. Unsupervised detection of vineyards by 3D point-cloud UAV photogrammetry for precision agriculture. Comput. Electron. Agric. 2018, 155, 84–95. [Google Scholar] [CrossRef]
  34. Pagliai, A.; Ammoniaci, M.; Sarri, D.; Lisci, R.; Perria, R.; Vieri, M.; D’Arcangelo, M.E.M.; Storchi, P.; Kartsiotis, S.-P. Comparison of Aerial and Ground 3D Point Clouds for Canopy Size Assessment in Precision Viticulture. Remote Sens. 2022, 14, 1145. [Google Scholar] [CrossRef]
  35. Zhang, C.; Valente, J.; Kooistra, L.; Guo, L.; Wang, W. Orchard management with small unmanned aerial vehicles: A survey of sensing and analysis approaches. Precision Agric. 2021, 22, 2007–2052. [Google Scholar] [CrossRef]
  36. Abdulridha, J.; Batuman, O.; Ampatzidis, Y. UAV-Based Remote Sensing Technique to Detect Citrus Canker Disease Utilizing Hyperspectral Imaging and Machine Learning. Remote Sens. 2019, 11, 1373. [Google Scholar] [CrossRef]
  37. Garcia-Ruiz, F.; Sankaran, S.; Mari Maja, J.; Suk Lee, W.; Rasmussen, J.; Ehsani, R. Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees. Comput. Electron. Agric. 2013, 91, 106–115. [Google Scholar] [CrossRef]
  38. Calderón, R.; Navas-Cortés, J.A.; Lucena, C.; Zarco-Tejada, P.J. High-resolution airborne hyperspectral and thermal imagery for early detection of Verticillium wilt of olive using fluorescence, temperature and narrow-band spectral indices. Remote Sens. Environ. 2013, 139, 231–245. [Google Scholar] [CrossRef]
  39. Iatrou, G.; Mourelatos, S.; Zartaloudis, Z.; Iatrou, M.; Gewehr, S.; Kalaitzopoulou, S. Remote Sensing for the Management of Verticillium Wilt of Olive. Fresenius Environ. Bull. 2016, 25, 3622–3628. [Google Scholar]
  40. Stella, A.; Caliendo, G.; Melgani, F.; Goller, R.; Barazzuol, M.; La Porta, N. Leaf Wetness Evaluation Using Artificial Neural Network for Improving Apple Scab Fight. Environments 2017, 4, 42. [Google Scholar] [CrossRef]
  41. Jarolmasjed, S.; Sankaran, S.; Marzougui, A.; Kostick, S.; Si, Y.; Quirós Vargas, J.J.; Evans, K. High-Throughput Phenotyping of Fire Blight Disease Symptoms Using Sensing Techniques in Apple. Front. Plant Sci. 2019, 10, 576. [Google Scholar] [CrossRef]
  42. Cunha, J.; Gaspar, P.D.; Assunção, E.; Mesquita, R. Prediction of the Vigor and Health of Peach Tree Orchard. In Proceedings of the Computational Science and Its Applications—ICCSA 2021, Cagliari, Italy, 13–16 September 2021; Gervasi, O., Murgante, M., Misra, S., Garau, C., Blečić, I., Taniar, D., Apduhan, B., Rocha, A., Tarantino, E., Torre, C., Eds.; ICCSA 2021. Lecture Notes in Computer Science (LNTCS). Springer: Cham, Switzerland, 2021; Volume 12951, pp. 541–551. [Google Scholar] [CrossRef]
  43. Betti Sorbelli, F.; Corò, F.; Das, S.K.; Di Bella, E.; Maistrello, L.; Palazzetti, L.; Pinotti, C.M. A Drone-based Application for Scouting Halyomorpha halys Bugs in Orchards with Multifunctional Nets. In Proceedings of the IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops), Pisa, Italy, 21–25 March 2022; pp. 127–129. [Google Scholar] [CrossRef]
  44. Ichim, L.; Ciciu, R.; Popescu, D. Using Drones and Deep Neural Networks to Detect Halyomorpha halys in Ecological Orchards. In Proceedings of the IGARSS 2022—2022 IEEE International Geoscience and Remote Sensing Symposium, Kuala Lumpur, Malaysia, 17–22 July 2022; pp. 437–440. [Google Scholar] [CrossRef]
  45. Ceccarelli, S.; Grando, S. Evolutionary Plant Breeding as a Response to the Complexity of Climate Change. IScience 2020, 23, 101815. [Google Scholar] [CrossRef]
  46. Lesk, C.; Rowhani, P.; Ramankutty, N. Influence of extreme weather disasters on global crop production. Nature 2016, 529, 84–87. [Google Scholar] [CrossRef]
  47. Corredor-Moreno, P.; Saunders, D.G.O. Expecting the unexpected: Factors influencing the emergence of fungal and oomycete plant pathogens. New Phytol. 2020, 225, 118–125. [Google Scholar] [CrossRef] [PubMed]
  48. Eurostat. Available online: https://ec.europa.eu/eurostat/cache/metadata/en/aei_fm_salpest09_esms.htm (accessed on 14 December 2023).
  49. Kaur, H.; Garg, H. Pesticides: Environmental Impacts and Management Strategies. In Pesticides—Toxic Aspects; Larramendy, M.L., Soloneski, S., Eds.; IntechOpen Limited: London, UK, 2014; pp. 187–230. [Google Scholar] [CrossRef]
  50. Provost, C.; Pedneault, K. The organic vineyard as a balanced ecosystem: Improved organic grape management and impacts on wine quality. Sci. Hortic. 2016, 208, 57–77. [Google Scholar] [CrossRef]
  51. FAO. Available online: https://www.fao.org/faostat/en/#search/hazelnut%20production (accessed on 27 December 2023).
  52. Girona, J.; Cohen, M.; Mata, M.; Marsal, J.; Miravete, C. Physiological, growth and yield responses of hazelnut (Corylus avellana L.) to different irrigation regimes. Acta Hortic. 1994, 351, 463–472. [Google Scholar] [CrossRef]
  53. Cristofori, V.; Muleo, R.; Bignami, C.; Rugini, E. Long term evaluation of hazelnut response to drip irrigation. Acta Hortic. 2014, 1052, 179–185. [Google Scholar] [CrossRef]
  54. AliNiazee, M.T. Ecology and management of hazelnut pests. Annu. Rev. Entomol. 1998, 43, 395–419. [Google Scholar] [CrossRef] [PubMed]
  55. Mezzalama, M.; Guarnaccia, V.; Martano, G.; Spadaro, D. Presence of Powdery Mildew Caused by Erysiphe corylacearum on Hazelnut (Corylus avellana) in Italy. Plant Dis. 2021, 105, 1565. [Google Scholar] [CrossRef] [PubMed]
  56. Lamichhane, J.R.; Fabi, A.; Ridolfi, R.; Varvaro, L. Epidemiological study of hazelnut bacterial blight in central Italy by using laboratory analysis and geostatistics. PLoS ONE 2013, 8, e56298. [Google Scholar] [CrossRef] [PubMed]
  57. Matić, S.; Caruso, A.G.; D’Errico, C.; Botto, C.S.; Noris, E.; Trkulja, V.; Panno, S.; Davino, S.; Moizio, M. Powdery mildew caused by Erysiphe corylacearum: An emerging problem on hazelnut in Italy. PLoS ONE 2024, 19, e0301941. [Google Scholar] [CrossRef]
  58. An, N.; Turp, M.T.; Türkeş, M.; Kurnaz, M.L. Mid-Term Impact of Climate Change on Hazelnut Yield. Agriculture 2020, 10, 159. [Google Scholar] [CrossRef]
  59. James, G.; Witten, D.; Hastie, T.; Tibshirani, R. An Introduction to Statistical Learning: With Applications in R; Springer: New York, NY, USA, 2021; pp. 1–607. [Google Scholar] [CrossRef]
Figure 1. Aerial images of the two hazelnut fields used in this study: (a) Farigliano and (b) Carrù fields. (c) Representative image of a single hazelnut plant in the Carrù field.
Figure 2. Pre-processing of images for plant recognition and identification of the hazelnut tree canopy. (a) Image collected in the Carrù field showing multiple and overlapping canopies; (b) RGB image collected in the Farigliano field showing well-separated trees; (c) example of application of the normalized difference vegetation index (NDVI) to define the plant contours of the image shown in (b) (NDVI values are shown on the key colors); (d) the same image of (b,c) obtained after excluding pixels with NDVI values < 0.2, with plant contours defined and shown in green.
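The NDVI-based canopy masking described in Figure 2 (excluding pixels with NDVI < 0.2) can be sketched as follows; only the 0.2 threshold comes from the paper, while the band arrays and values are illustrative:

```python
import numpy as np

def canopy_mask(red: np.ndarray, nir: np.ndarray,
                threshold: float = 0.2) -> np.ndarray:
    """Boolean mask of canopy pixels: NDVI = (NIR - Red)/(NIR + Red) >= threshold."""
    ndvi = (nir - red) / (nir + red + 1e-9)  # epsilon avoids division by zero
    return ndvi >= threshold

# Soil/background pixels (low NDVI) are excluded before computing per-tree VIs
red = np.array([[0.4, 0.1], [0.3, 0.05]])
nir = np.array([[0.5, 0.6], [0.35, 0.7]])
mask = canopy_mask(red, nir)
# → [[False, True], [False, True]]
```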
Figure 3. Example of plant image slicing. The inset represents a magnification of the slice bordered in red. Each slice was visually inspected, and binary classified as healthy/unhealthy.
Figure 4. Distribution of the whole dataset of images collected from hazelnut plants following binary classification in terms of “healthy/unhealthy”, across the whole acquisition period from May to July (three shooting time points).
Figure 5. Boxplots of the vegetation indices calculated on the image dataset.
Figure 6. Performances of the different supervised machine learning algorithms applied to the selected vegetation indices GNDVI, GCI, NDREI, NRI, and GI. The performance is expressed in terms of accuracy and F1-score.
Table 1. Performance of the different machine learning classification algorithms used in this work, calculated on the test sets and expressed as figures of merit (precision, recall, F1-score).
Tested Model          Binary Classification   Precision   Recall   F1-Score
Random forest         0                       0.67        0.56     0.61
                      1                       0.64        0.74     0.69
                      Accuracy                                     0.65
                      Macro average           0.66        0.65     0.65
                      Weighted average        0.66        0.65     0.65
Logistic regression   0                       0.67        0.57     0.62
                      1                       0.65        0.73     0.69
                      Accuracy                                     0.66
                      Macro average           0.66        0.65     0.65
                      Weighted average        0.66        0.66     0.65
KNN                   0                       0.64        0.58     0.61
                      1                       0.64        0.70     0.67
                      Accuracy                                     0.64
                      Macro average           0.64        0.64     0.64
                      Weighted average        0.64        0.64     0.64
Note: The test set included 823 images, of which 397 were classified as “Healthy” and 426 as “Unhealthy”.

Share and Cite

MDPI and ACS Style

Morisio, M.; Noris, E.; Pagliarani, C.; Pavone, S.; Moine, A.; Doumet, J.; Ardito, L. Characterization of Hazelnut Trees in Open Field Through High-Resolution UAV-Based Imagery and Vegetation Indices. Sensors 2025, 25, 288. https://doi.org/10.3390/s25010288
