1. Introduction
Precision agriculture is focused on the use of various technologies to monitor the spatial and temporal variability related to agricultural management, yield maximization, and economic and environmental benefits [
1]. In precision viticulture, the technologies of precision agriculture are applied to improve yield forecasting, harvest management, water management, and grape quality, which in turn influences the quality of the bottled product, namely, wine. In recent decades, the development of uncrewed aerial systems (UAS) technology and satellite technology for remote sensing has experienced an increase in spatial resolution, temporal availability, and the capability to describe the physiological, geometrical, and morphological parameters of plants [
2].
Vegetation indices, calculated based on multispectral sensor data from UAS or multispectral satellite data, are used to monitor plant status, stress levels, and vineyard variability. The index values are correlated to chlorophyll content within leaves, nitrogen concentration, and plant water status for vine variability and vigor monitoring [
3]. Besides vineyard variability monitoring, the estimation of vine row area and leaf area, disease detection, and vineyard vigor mapping, yield estimation is another important topic [
Yield estimation is used for wine cellar management and to plan grape purchases from external sources if the forecast yield falls below the winery's needs, or to intensify grape sales if the yield exceeds expectations. It is therefore of great importance to monitor the yield of a vineyard in order to work profitably and accurately. However, optimizing yield estimation requires current and georeferenced vineyard information on parameters including crop height, canopy architecture, and fruit weight [
4]. From all collectable datasets within a vineyard, data related to yield estimation stand out for their economic relevance and for being of great importance for optimizing plant growth and fruit quality [
5].
The optimization of vineyard management techniques often requires efficient and automated or semi-automated methods in order to identify vine-specific morphological and geometric parameters like canopy architecture, vine heights, vine row geometry, and vine location within rows. UAS-based imagery offers the capability of modeling the plant-specific morphological parameters using photogrammetric processing [
6]. Photogrammetry can provide a non-contact 3D reconstruction of objects that can be beneficial if the objects are spread over a large area or are difficult to reach. Structure from motion (SfM) processing combines photogrammetric data acquisition techniques with computer vision in order to reconstruct 3D surfaces [
7]. The algorithm uses a technique of comparing recurring points in image sets to reconstruct the UAS camera positions. The process results in a sparse point cloud, containing the triangulated locations of points that were matched across the images [
8]. The generated output requires imagery from different angles and viewpoints to reliably reconstruct the geometry of the area under study. The image acquisition and the photogrammetric processing are followed by the output creation. The outputs of the digital image processing are a dense 3D point cloud, a textured 3D model, a digital surface model (DSM), and an orthomosaic [
9].
UAS are among the most important sensor platforms in modern precision agriculture. They show great potential in precision viticulture thanks to their very high spatial, spectral, and temporal resolution, with a ground sampling distance (GSD) as fine as 1 cm; the possibility of applying multispectral sensors; and a high repetition rate. Sensors onboard UAS can collect a variety of different datasets that are used to calculate vegetation indices for monitoring the vitality and the geometric and morphological characteristics of the objects under study. Müllerová et al. [
10] presented a comprehensive framework identifying common rules in the domain of UAS-based monitoring of ecosystems. By analyzing previous studies, the researchers found similarities in workflows according to the character of the vegetative properties under investigation. Properties concerning biodiversity, ecosystem structure, plant status, and dynamics were defined. Successful UAS surveys accounted for the choice of sensor and platform, knowledge about the phenomenon under study, and the selection of appropriate resolution and analytical methods. In the study presented herein, the ecosystem structure of vines is monitored. Parameters like leaf surface area (LSA), leaf area index (LAI), and vine height are derived from UAS-collected multispectral and RGB datasets. According to Müllerová et al. [
10], vineyard rows appear as distinct objects that are highly dissimilar to the surrounding objects. In terms of UAS data acquisition, the unique structural appearance of planted rows calls for a high spatial resolution, while a low spectral resolution suffices. The preferred classification method is an object-based image analysis (OBIA) framework. To derive the targeted vegetation heterogeneity components, a passive sensor should be used with a high image overlap, followed by SfM processing. Using the resulting 3D point cloud and the respective analytical methods, the heterogeneity parameters of vine height, LSA, and LAI can be derived [
10].
Matese et al. [
11] demonstrated the effectiveness of low-cost UAS systems for vigor mapping in vineyards. UAS-collected data can serve as the baseline for photogrammetric processing workflows that are then used to create a variety of 3D datasets. RGB data are useful for monitoring spatial variability within a vineyard when accurately segmented and properly analyzed. However, manually segmenting and digitizing individual vines is a time-consuming and error-prone task. Poblete et al. [
12] provided a framework for vine canopy recognition using different automatic classification methods like K-means, artificial neural networks (ANN), random forest, and spectral indices. The results showed that the spectral index method provided the best results for vine canopy detection, since this method is automatic and does not require specific software to calculate the indices. The 2G_RBi index is obtained by subtracting the red and blue bands from twice the green band of the original RGB orthomosaic.
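Assuming the excess-green formulation 2G_RBi = 2·G − R − B (the exact band arithmetic and the segmentation threshold below are assumptions for illustration, not values taken verbatim from the cited study), the index and a simple canopy mask can be sketched as:

```python
import numpy as np

def two_g_rbi(rgb):
    """Greenness index assuming the excess-green form 2*G - R - B,
    computed on band values scaled to [0, 1]."""
    rgb = rgb.astype(float) / 255.0
    return 2.0 * rgb[..., 1] - rgb[..., 0] - rgb[..., 2]

def canopy_mask(rgb, threshold=0.1):
    """Binary canopy mask; the threshold is a placeholder value."""
    return two_g_rbi(rgb) > threshold

# Synthetic 1 x 2 image: one green canopy pixel, one grey soil pixel.
img = np.array([[[40, 180, 30], [120, 120, 120]]], dtype=np.uint8)
mask = canopy_mask(img)
```

For grey soil pixels the index is close to zero, while green canopy pixels score well above the threshold, which is what makes the index usable without dedicated software.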
The very high resolution of UAS images can also be a challenge in classification due to the high spectral variability between different vegetation classes. In the study of de Castro et al. [
13], the researchers proposed a method for the automated detection of grapevine 3D canopy structures based on high-resolution UAS imagery. An algorithm was developed where the 3D morphology of vines was investigated, a height estimation was conducted, and missing plants within a row were detected. The algorithm needs no training and is self-adaptive to different crop field conditions. The researchers used a random forest (RF) classification environment where the RF randomly selected a training set, gathered the optimal feature values, and classified weeds and crop rows [
13]. The research carried out by Mesas-Carrascosa et al. [
14] shows an approach to plant variability monitoring that uses RGB point clouds for the recognition and classification of plants, investigating the geometric structure of vines by using the soil points as a reference and calculating the differences to the canopy tops. To obtain information about the tree canopy, a series of 3D point clouds was generated using the SfM technique on images acquired with an RGB sensor on board a UAS. Together with the geometry, each point of the cloud stored the information from the RGB color space. The researchers used this color information to distinguish vegetation from bare soil and to automatically classify the point clouds, subtracting the soil points from the vine points and thereby enabling a vegetation index-based automatic calculation of canopy heights within the research areas.
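The idea of separating soil from canopy in a colored point cloud and referencing canopy heights to the soil can be sketched as follows; the excess-green rule and its threshold are placeholders, not the classifier used in the cited study.

```python
import numpy as np

def canopy_heights(points, colors, green_thresh=0.1):
    """Split an RGB-colored point cloud into soil and vegetation with a
    simple excess-green rule, then express vegetation heights relative to
    the mean elevation of the soil points (illustrative sketch)."""
    rgb = colors.astype(float) / 255.0
    exg = 2.0 * rgb[:, 1] - rgb[:, 0] - rgb[:, 2]
    veg = exg > green_thresh
    soil_z = points[~veg, 2].mean()      # soil reference elevation
    heights = points[veg, 2] - soil_z    # canopy height above the soil
    return veg, heights

# Three synthetic points: two grey soil points and one green canopy point.
pts = np.array([[0.0, 0.0, 0.00],
                [1.0, 0.0, 0.10],
                [0.5, 0.0, 2.00]])
cols = np.array([[120, 120, 120],
                 [100, 100, 100],
                 [40, 180, 30]])
veg, heights = canopy_heights(pts, cols)
```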
The monitoring of the vine’s ecosystem structure is a labor-intensive task and is mostly performed manually. Manual measurements are often inconsistent, leading to errors in the metrics, and only small regions can be sampled [
15]. The ecosystem structures of vines correlate with plant growth, health status, and potential yield [
16]. UAS-based remote sensing methods for data acquisition have the benefit of delivering sub-centimeter spatial resolution imagery in a fast and efficient way in comparison to proximal sensing. Satellite-based imagery provides valuable insights at a landscape scale, but the spatial resolution is often too coarse for precision viticulture [
17]. In the review paper of Moreno and Andújar [
15], the researchers state that future research should investigate the optimal flight angles of UAS-mounted sensors in order to collect valuable data for the geometric characterization of vines. In the current study, 3D point cloud analysis combined with object-based image analysis is used to detect and segment single vines. The derivation of single plants in the research area enables the calculation of ecosystem structures at the single-plant level. The study’s contribution to precision viticulture involves accurately identifying individual plant locations and extracting the geometric parameters of vines associated with the identified individual plants using high-resolution, UAS-based 3D point clouds.
3. Validation
The estimation of vine parameters is pivotal in viticulture for informed decision making. Vine-related parameters can serve as indicators of vine health, growth, and productivity, offering valuable insights for optimizing vineyard management practices. Validation is a critical step in ensuring the robustness and applicability of the derived values to real-world vineyard conditions. It involves a comprehensive assessment against ground truth measurements. Through this validation process, the aim is to assess the accuracy, precision, and overall performance of the employed techniques, ultimately enhancing the reliability of the findings. Here, a systematic approach to validation, encompassing field data collection, statistical analyses, and comparative studies, is presented. Specific challenges encountered during the validation process are highlighted, and potential sources of error that may influence the accuracy of the derived parameters are discussed. Through this validation framework, a solid foundation is established for the subsequent interpretation and application of the derived ecosystem structures of vines.
In Figure 11, the reference vine points are displayed together with the vine points derived using the 3D point cloud approach, where the maximum z coordinate of the mesh-to-cloud distance was used to report the respective X and Y coordinates, resulting in the X, Y, Z location of the top of each vine trunk.
The average distance from the reference points to the observed vine points measured 10.7 cm (Table 1). The vine detection method offers an automated solution, making it highly practical for applications in vineyards. It enables the efficient counting of single vines, precise vine localization for yield and vigor assessments, and the tracking of missing plants over successive years. This addresses a task that proves challenging with manual approaches due to the intertwining growth of plants, which makes the clear delineation of single plants difficult. For the derivation of vine locations, a root mean square error (RMSE) was calculated as follows:

RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(d_i - \mathrm{mean}(d)\right)^2}

where n is the number of distances observed, d_i is the distance between a reference point and the derived vine location for each vine, and mean(d) is the mean overall distance. For the vine location calculation, an RMSE of 1.707 was calculated, which represents the average discrepancy between the derived vine points and the reference measurements.
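Using the definitions above, the RMSE of the vine-location distances can be computed directly; the distance values in the example are hypothetical, not the study's measurements.

```python
import numpy as np

def location_rmse(distances):
    """RMSE of the reference-to-derived vine distances:
    sqrt(mean((d_i - mean(d))^2))."""
    d = np.asarray(distances, dtype=float)
    return float(np.sqrt(np.mean((d - d.mean()) ** 2)))

# Hypothetical distances between reference points and derived vine points.
rmse = location_rmse([10.0, 12.0, 8.0, 14.0])
```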
The validation of the automatically derived vine heights, obtained using an OBIA framework that comprised several analysis steps within a ruleset built in eCognition Developer 10.3 software, was performed by comparing the derived vine heights to manual measurements across the whole vineyard. The measurement locations were surveyed with a Leica Viva GS16 GNSS rover with an accuracy of 10 mm, and the reference measurements were compared to the derived vine heights at the same locations.
Ground-level measurements were conducted at 23 different points across the vineyard, as shown in
Figure 12, to assess the accuracy of the derived vine heights. This evaluation also aimed to gauge the effectiveness of the OBIA approach in segmenting the vine vegetation. Field measurements were specifically taken at locations with active vine growth. In areas without vine growth, the nDSM yielded a value of 0. A vine height reference point registering a value of 0 may signify potential bias in the segmentation process, warranting further scrutiny and refinement.
Linear regression analysis comparing manually measured vine heights to heights automatically derived from the nDSM was performed on a dataset comprising 23 data points (Figure 13). The coefficient of determination (R²) was 0.83, indicating a strong positive correlation between the two sets of measurements. This suggests that the derived vine heights are highly predictive of the manually measured values. The regression equation provides a mathematical representation of this relationship: for every 1 cm increase in the automatically derived vine height, the manually measured vine height increased by 0.96 cm, with an additional constant offset of 1.37 cm [30]. These findings affirm the validity and accuracy of the vine height estimation approach and underscore its potential utility in vineyard management and related applications.
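A minimal sketch of this validation step (not the software actually used in the study) fits in a few lines; the height values below are synthetic and constructed to follow the reported regression relationship.

```python
import numpy as np

def validate_heights(derived, measured):
    """Least-squares fit of manually measured vine heights against
    automatically derived ones; returns slope, intercept, and R^2."""
    slope, intercept = np.polyfit(derived, measured, deg=1)
    r2 = float(np.corrcoef(derived, measured)[0, 1] ** 2)
    return slope, intercept, r2

# Synthetic example constructed so that measured = 0.96 * derived + 1.37 (cm).
derived = np.array([80.0, 100.0, 120.0, 140.0, 160.0])
measured = 0.96 * derived + 1.37
slope, intercept, r2 = validate_heights(derived, measured)
```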
The validation of the leaf area index (LAI) in the vineyard will be deferred to a subsequent stage due to the unavailability of an LAI measurement device, such as a ceptometer or LAI-2200 Plant Canopy Analyzer. Similarly, the validation of the leaf area will also be postponed. This is attributed to the labor-intensive nature of obtaining analog measures of one-sided leaf area, which will be addressed in subsequent phases of the study. It is important to note that these measures, while not yet validated, are based on established methodologies and are technically feasible within the scope of this research.
4. Discussion
This study has delved into various aspects of vineyard ecosystem analysis, encompassing the derivation of the location of single vines, vine heights, computation of the LAI, and determination of the LSA. Each of these steps presented its own set of challenges and considerations, from the segmentation processes to the technical nuances of data acquisition. One objective is to discern any patterns or relationships among these parameters that could potentially illuminate the dynamics of the vineyard environment. A correlation matrix analysis holds promise in providing valuable insights into the interplay of these variables.
The detection of individual grape vines was based on the 3D point cloud of the study vineyard generated in April 2023. In early spring, the vines did not yet carry leaves, making it possible to capture the structure of the woody parts of the plants. Based on the assumption that every vine trunk belongs to a single vine, a surface was generated and applied at the base of the trunks. Computing the distance from this base surface to the top of the trunks made it possible to extract the z maximum for every vine, along with the respective X, Y, Z coordinates. The validation showed a mean distance of 10.7 cm from the reference points to the observed locations. This value demonstrates that individual vines can realistically be detected using UAS techniques. Because of computing power constraints, the algorithm was tested on one vineyard row containing 21 vines. In future work, it would be of great interest to test this approach in a different vineyard to check the robustness of the method. One methodological issue is that vines do not grow perfectly vertically. A height-based threshold of 25 cm above the surface was introduced to ensure that the z maximum value would be derived from the trunk and not from a branch of the vine. However, if a stem grows at an extreme angle, the distance to the reference point, which is taken at ground level of the stem, increases.
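The per-trunk extraction step described above, including the 25 cm height threshold, can be sketched as follows; the point coordinates are hypothetical and the function is a simplification of the described workflow, not the authors' implementation.

```python
import numpy as np

def trunk_top(points, ground_z, min_height=0.25):
    """For the points of a single vine trunk, keep only those at least
    `min_height` m above the local ground surface and return the X, Y, Z
    of the highest remaining point (the z maximum of the trunk)."""
    above = points[points[:, 2] - ground_z >= min_height]
    return above[np.argmax(above[:, 2])]

# Hypothetical trunk points (metres), ground level at z = 0.
trunk = np.array([[1.0, 2.0, 0.10],   # below the 25 cm threshold, discarded
                  [1.0, 2.0, 0.40],
                  [1.1, 2.0, 0.75]])  # trunk top
top = trunk_top(trunk, ground_z=0.0)
```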
The derivation of vine heights was based on the computation of the NDVI and an OBIA approach for the classification of vines. The surface model was established using a DTM together with a DSM. Usually, to compute a DTM, a LiDAR device is used, since the laser has the ability to penetrate through the tree cover and record the bare soil [
31]. For this project, no UAS LiDAR was available, which made it necessary to compute the nDSM based on photogrammetric UAS datasets. The use of the cloth simulation filtering algorithm in CloudCompare v2.12.4, together with an interpolation method to close holes, made it possible to generate a DTM from the photogrammetry data. This shows that a task typically performed with a LiDAR device can be accomplished with lower-cost equipment. After the use of the OBIA framework to distinguish the vine morphology from all other objects, it was possible to use the image object outlines to mask the nDSM and obtain height values only for the vine vegetation across the whole vineyard. The R² value of the linear regression on the reference measurements and the observed vine heights was 0.83, indicating a strong correlation of the automatically derived vine heights with the real-world scenario. The computation of the LAI and the LSA could not be validated over the course of this study. A follow-up study on LAI and LSA is currently in progress. This procedure can only be carried out post-harvest, since the removal of all leaves could impede grape development. As shown in Figure 14, a correlation matrix was used to display possible relationships among the collected parameters in the vineyard. The matrix was computed using RStudio and the R scripting language. Upon examination of the correlation matrix, some associations emerged. The LAI had a correlation coefficient of 0.22 with the observed LSA. The correlation was not strong but could still imply that the LAI changes with changing leaf cover and pruning techniques. The overshadowed area, i.e., the areal extent of the vine plants from the nadir view (the polygonal area of the OBIA segmentation), and the leaf area demonstrated a notably strong positive correlation of 0.79. This relationship suggests a significant interdependence between these two variables: as the overshadowed area increases, so does the leaf area. Further, a correlation coefficient of 0.58 was observed between vine height and leaf area, indicating a moderate positive correlation. This suggests that, as vine height increases, leaf area also tends to increase, albeit with a moderate strength of association.
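The matrix itself was computed in R, but the same Pearson coefficients can be reproduced with NumPy; the per-vine values below are hypothetical and serve only to illustrate the calculation, not the study's data.

```python
import numpy as np

# Hypothetical per-vine parameters (units omitted); only the Pearson
# correlation calculation itself is illustrated here.
lai       = np.array([1.2, 1.5, 1.1, 1.8, 1.4])
leaf_area = np.array([2.1, 2.6, 2.0, 3.0, 2.4])
height    = np.array([1.6, 1.9, 1.5, 2.1, 1.8])

# 3 x 3 symmetric correlation matrix with ones on the diagonal.
corr = np.corrcoef(np.vstack([lai, leaf_area, height]))
```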
The initial correlation coefficients extracted from the correlation matrix serve as a foundation for a more in-depth exploration of relationships within the vineyard ecosystem. This study lays the groundwork for the development of a yield prediction model, leveraging the observed correlations between key variables. By employing these coefficients as a starting point, a future aim is to construct a predictive framework that can forecast harvest weight based on factors such as leaf area, vine height, and other relevant metrics. This paper demonstrates the technical feasibility of the study's approaches, showing the potential for advanced analytics to inform precision viticultural practices and enhance decision making in vineyard management.