Review

Recent Methods for Evaluating Crop Water Stress Using AI Techniques: A Review

1. Department of Biosystems Engineering, College of Agricultural and Life Sciences, Gyeongsang National University, 501, Jinju-daero, Jinju-si 52828, Gyeongsangnam-do, Republic of Korea
2. Division of Crop Production and Physiology, National Institute of Crop Science, Rural Development Administration, 100, Nongsaengmyeong-ro, Iseo-myeon, Wanju-gun 55365, Jeonbuk-do, Republic of Korea
3. Department of Biosystems Engineering, College of Agriculture, Life and Environment Sciences, Chungbuk National University, 1 Chungdae-ro, Seowon-gu, Cheongju-si 28644, Chungbuk-do, Republic of Korea
4. Department of Smart Agro-Industry, College of Life Science, Gyeongsang National University, Dongjin-ro 33, Jinju-si 52725, Gyeongsangnam-do, Republic of Korea
5. Department of Biosystems Machinery Engineering, Chungnam National University, Daejeon 34134, Republic of Korea
6. Environmental Microbial and Food Safety Laboratory, Agricultural Research Service, Department of Agriculture, Powder Mill Road, BARC-East, Bldg 303, Beltsville, MD 20705, USA
7. Institute of Agriculture and Life Sciences, Gyeongsang National University, 501, Jinju-daero, Jinju-si 52828, Gyeongsangnam-do, Republic of Korea
* Authors to whom correspondence should be addressed.
Submission received: 16 August 2024 / Revised: 26 September 2024 / Accepted: 26 September 2024 / Published: 29 September 2024
(This article belongs to the Special Issue Feature Papers in Smart Agriculture 2024)

Abstract

This study systematically reviews the integration of artificial intelligence (AI) and remote sensing technologies to address the issue of crop water stress caused by rising global temperatures and climate change; in particular, it evaluates the effectiveness of various non-destructive remote sensing platforms (RGB, thermal imaging, and hyperspectral imaging) and AI techniques (machine learning, deep learning, ensemble methods, GAN, and XAI) in monitoring and predicting crop water stress. The analysis focuses on variability in precipitation due to climate change and explores how these technologies can be strategically combined under data-limited conditions to enhance agricultural productivity. Furthermore, this study is expected to contribute to improving sustainable agricultural practices and mitigating the negative impacts of climate change on crop yield and quality.

1. Introduction

Despite international efforts to reduce greenhouse gas emissions over the past several decades, the global surface temperature has increased by approximately 2.45 °C compared to the 19th century. This global warming has triggered sudden natural disasters worldwide, including heat waves, cold waves, heavy rain, droughts, and floods [1]. Such changing climate conditions can also significantly affect agricultural ecosystems [2], and the impact of climate change has consequently contributed to declines in the yield and quality of agricultural products [3]. Crop damage caused by climate change has been reported in several countries, including Thailand, India, China, and the United States; hence, various studies have been conducted in recent years in an effort to reduce these impacts [4,5]. These impacts have been highlighted not only in popular media, such as the news, but also in a substantial body of relevant research [5,6,7,8].
Currently, the agricultural industry consumes 80–90% of the world’s freshwater resources [9], and the increasing variability in the intensity and frequency of precipitation caused by climate change is expected to raise the demand for irrigation water. Under these circumstances, one of the key elements of sustainable agriculture is monitoring crop water stress levels [10,11]. Water stress evaluation can help to control the amount of water used and prevent excessive water consumption, which significantly impacts crop yield and quality.
Traditional methods for evaluating crop water stress involve measuring the soil moisture content and analyzing meteorological variables and various physiological parameters, such as water potential and stomatal conductance [12,13]. Although these methods provide direct information about the crop water stress level, they are time-consuming, laborious, and destructive [14]. These are significant drawbacks for the evaluation of water stress levels in field crops; moreover, traditional methods make it difficult to cover large areas [15,16].
Non-destructive and rapid crop water stress monitoring technologies have been developed to overcome these limitations. Remote sensing coupled with various optical sensors (RGB, thermal, multispectral, and hyperspectral imagery) is used, and these platforms are also applied to satellites, aircraft, drones, and handheld devices for the rapid collection of digital data. These digital data can be analyzed using artificial intelligence (AI) techniques to assess the chemical and physical properties of crops [17,18,19]. Remote sensing technologies have been applied for crop classification [20], yield prediction [21], the detection and management of diseases and pests [22], and crop water stress detection [23]. These technologies generate large datasets, including open-source satellite data from platforms such as Google Earth Engine [24,25]. To transform the collected data into meaningful information, various preprocessing steps are required. However, manually analyzing such vast amounts of data is time-consuming. Hence, AI technology is employed to address this challenge.
The use of artificial intelligence (AI) for crop water stress analysis began in earnest in the mid-1970s. For instance, Millard et al. [26] conducted a study in April 1976, measuring crop temperatures using infrared scanners and IR photography from both aircraft and ground platforms in wheat fields subjected to various water stress levels. Since then, AI technologies have played important roles in predicting optimal irrigation timing and quantity, reducing water waste, and increasing crop yields [27]. These technologies have improved the efficiency of irrigation systems and water management. In particular, AI-based drones and satellite systems enable precise irrigation monitoring for crop health, soil moisture levels, and water usage across large areas [28].
To date, AI has been widely used in crop identification [16,29], disease detection [30], and yield prediction [31]. Specifically, for crop water stress assessment, machine learning and deep learning algorithms are primarily employed. Several techniques, such as Random Forest (RF), Support Vector Machine (SVM), and Convolutional Neural Network (CNN) architectures, are intensively used, and generative AI models such as Generative Adversarial Network (GAN) models have recently been applied [32].
Accordingly, this study comprehensively reviewed the trends of various sensing platforms utilizing machine learning and deep learning techniques for crop water stress analysis over the past decade. The primary objective of this study was to present the AI techniques used in crop water stress analysis and to evaluate the related technical methods used. Specifically, we focused on collecting and analyzing AI techniques utilized in remote sensing technologies. While most previous studies have concentrated on thermal imaging, multispectral/hyperspectral sensors [13,33], or water stress studies on specific crops [34,35,36], excluding RGB, this study analyzed not only these sensing technologies but also AI techniques for evaluating water stress in a variety of field crops.
This study provides an in-depth analysis of how AI technologies can contribute to managing crop water stress and promoting efficient water resource utilization; it also aims to offer solutions for agricultural challenges caused by extreme climate conditions. In addition, optimal approaches to maximize the applicability and effectiveness of AI-based technologies in agriculture are suggested. The findings from this analysis are expected to contribute to strategies ensuring the sustainability of agriculture in the future and to further advance the application of AI technologies in the agricultural sector.

2. Materials and Methods

To investigate the latest trends in remote sensing and artificial intelligence techniques for evaluating crop water stress in the context of climate change, a systematic process of literature collection and screening was conducted. The primary databases selected for this purpose were Google Scholar, Scopus, Web of Science, and ScienceDirect. Google Scholar provides a broad range of resources across various academic fields, while Scopus offers citation analysis, allowing for the evaluation of research’s impact and quality. Web of Science is particularly strong in citation counts and impact factor analysis, while ScienceDirect offers access to the latest research in science, technology, and medicine. These databases are well-suited for a comprehensive review of studies related to agriculture, remote sensing, and artificial intelligence.
For the literature search, keywords such as “Machine learning”, “Deep learning”, “Water stress”, “Crop”, “Remote sensing”, and “Climate change” were used to target journal and conference papers published over the past decade (2013–2024). The search terms were restricted to appear in the “title”, “abstract”, or “keywords” sections of the papers. The search scope was limited to “journals”, and the document types included “research articles”, “reviews”, and “articles in press”. Additionally, the search was restricted to papers published in English.
Through this process, approximately 130 papers were collected, and additional literature was gathered using supplementary keywords such as “RGB”, “Thermal imaging”, “CWSI”, and “Hyperspectral”. Given the practicality of hyperspectral techniques in evaluating crop water stress, multispectral technology was also investigated. Table 1 provides a summary of the search strings used for each database.
During the literature collection process, basic keywords such as “Machine learning”, “Deep learning”, “Water stress”, “Crop”, “Remote sensing”, and “Climate change” were initially used. However, many irrelevant materials were retrieved, and some necessary studies were omitted. To address this issue, the search was refined in over 20 iterations by adding or removing keywords to identify the optimal search terms. Additional key terms, including “RGB”, “Thermal”, and “Hyperspectral”, were also employed. As a result, approximately 95 papers were collected, and the literature was categorized according to remote sensing techniques in order to avoid duplication and clarify the application cases of each technology.
  • Examples of crop water stress assessment using RGB imaging;
  • Examples of crop water stress assessment using thermal imaging;
  • Examples of crop water stress assessment using CWSI;
  • Examples of crop water stress assessment using hyperspectral imaging.
The selected articles were re-filtered based on their relevance to crop water stress assessment, using the following exclusion criteria to strengthen the focus and relevance of the research: First, review papers were excluded. Although review papers provide comprehensive analyses of existing studies, they do not present specific experimental data, which were necessary for this research. Second, studies employing destructive methods were also excluded. Destructive methods are not suitable for maintaining crops in actual agricultural environments, making non-destructive techniques more favorable. Third, studies that did not utilize AI-based approaches were excluded. AI methods are critical for enhancing the precision of crop water stress assessments, and they reflect the latest advancements in technology. Lastly, studies that focused on crops that were not subjected to water stress were excluded, as the primary goal of this research was to evaluate crop water stress, placing such studies outside our scope.
  • Review articles;
  • Assessment of crop water stress with destructive methods;
  • No AI learning;
  • Crops not under water stress.
In the end, 46 articles were selected. The selected articles were published by Elsevier, Springer, MDPI, IEEE, etc., publishers that are commonly represented in literature reviews. The keywords and additional filtering choices for the literature selection are shown in Figure 1. The flow of the paper is shown in Figure 2.

3. Remote Sensing

Methods for measuring water stress in crops are based on the interaction between the crop and soil. In general, measuring crops’ water status and soil moisture content directly in the field is laborious, time-consuming, and destructive [37,38,39], so there is a need for a time-saving, accurate, easy, and non-destructive method to detect crops’ water status in order to curb yield and economic losses early [40]. Recent research to assess crop water stress has been conducted by using remote sensing data as an alternative to traditional measurement methods. The advantage of using remote sensing is the provision of information on the spatial and temporal variability of crops, allowing for more comprehensive analysis and forecasting [41,42,43,44,45,46,47]. A brief description of the remote sensing techniques used to assess water stress in crops is provided below (Figure 3).

3.1. RGB Imaging

Using the literature review methodology described above, a total of five papers were selected in which RGB imaging technology was used for crop water stress assessment. RGB imaging is the simplest remote sensing technique for crop detection, based on silicon sensors that are sensitive to the visible-light band (400–750 nm) and capable of two-dimensional imaging. Typically, the raw data of an image are represented as a matrix of intensity values corresponding to photon fluxes in the red (~600 nm), green (~550 nm), and blue (~450 nm) spectral bands of visible light. RGB images are widely used in crop science because of their low cost and ease of operation and maintenance [48]. Therefore, a variety of deep learning and machine learning techniques have been used to assess crop water stress [49,50,51]. The following Table 2 summarizes the use of RGB imaging to assess moisture stress in crops; it is organized by crop type, best-performing model, methodologies used, paper objectives, author, publisher, country, and year.
Monitoring crops’ water content using RGB imaging requires preset lighting conditions as well as specific leaf orientation for the camera. This limits the applicability of RGB imaging for assessing moisture content in the field [48]. Therefore, the use of RGB imaging for crop water stress assessment is considered to be limited and is mainly used in conjunction with thermal imaging techniques [57,58].

3.2. Thermal Imaging

Through the literature review methodology described above, a total of 13 papers were selected in which thermal imaging techniques were used for crop water stress assessment. High-resolution thermal imaging cameras have a spectral range of 3–14 μm, with the most commonly used wavelengths being 3–5 μm or 7–14 μm [48]. Thermal imaging cameras are relatively more expensive than RGB cameras, which are simple to operate but offer limited capability for crop water assessment. However, thermal images outperform RGB images in analyzing crop moisture stress, because they are more reliable and more sensitive to changes in crop moisture content owing to the higher penetration of thermal wavelengths compared to the visible band. Therefore, thermal imaging is a more suitable imaging technology for crop moisture stress analysis than RGB imaging [59,60]. The following Table 3 summarizes the use of thermal imaging to assess water stress in crops; it is organized by crop type, best-performing model, methodologies used, paper objectives, author, publisher, country, and year of publication.

3.3. CWSI

Using the literature review methodology described above, a total of 10 articles were selected in which CWSI technology was used to assess crop moisture stress. Moisture stress is one of the most influential factors contributing to crop yield losses. Water deficit during critical stages of growth, such as during vegetative growth, flowering, or fruit development, can lead to significant yield losses [72,73]. Previous studies have used canopy temperature as an efficient way to rapidly and non-destructively monitor crops’ responses to water stress [74,75]; these studies revealed that canopy temperature provides important clues to changes in the water status and yield of crops under stress and non-stress conditions during drought periods [76,77]. The Crop Water Stress Index (CWSI), based on the canopy–air temperature difference and vapor pressure deficit (VPD), has been developed and is a promising tool for assessing water stress in crops [74]. The expression for the CWSI is as follows:

CWSI = (T_l − T_wet) / (T_dry − T_wet)

where T_l is the temperature of the leaf; T_wet is the lower bound of the canopy temperature, corresponding to a well-watered leaf with fully open stomata; and T_dry is the upper bound of the canopy temperature, corresponding to a leaf with fully closed stomata, i.e., a non-transpiring leaf [78]. Previous research has shown that the CWSI takes less time to detect water stress at the farm level because it can measure water stress remotely; hence, this method has been the most commonly used indicator for assessing water stress in crops [79,80,81,82]. However, the CWSI has not been adopted in several applications for the following reasons [78]: (i) temperatures from the crop canopy, the general leaf population, and the soil background become mixed when measured by handheld or high-altitude airborne radiometers; and (ii) normalizing the CWSI is much more complex when atmospheric conditions change than when using VPD alone [83,84,85]. Nevertheless, we believe that widespread use of the CWSI could be a viable option if high-resolution canopy temperature can be accurately monitored [79]. The following Table 4 summarizes the use of the CWSI to assess water stress in crops; it is organized by crop type, best-performing model, methodologies used, paper objectives, author, publisher, country, and year.
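The expression above can be sketched as a small function; a minimal illustration of the CWSI formula, with the example temperatures chosen for illustration only.

```python
def cwsi(t_leaf, t_wet, t_dry):
    """Crop Water Stress Index: 0 = well-watered, 1 = fully stressed.

    t_leaf: measured leaf/canopy temperature (degrees C)
    t_wet:  lower baseline, well-watered leaf with fully open stomata
    t_dry:  upper baseline, non-transpiring leaf with closed stomata
    """
    if t_dry == t_wet:
        raise ValueError("Baselines must differ (t_dry > t_wet).")
    return (t_leaf - t_wet) / (t_dry - t_wet)

# Example: canopy at 30 C between baselines of 26 C (wet) and 36 C (dry)
print(cwsi(30.0, 26.0, 36.0))  # → 0.4
```

A canopy temperature at the wet baseline yields 0, and at the dry baseline yields 1, matching the bounds described in the text.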

3.4. Hyperspectral Imaging

Through the literature review methodology described above, a total of 18 articles were selected that used multispectral and hyperspectral imaging techniques for crop water stress assessment. The application of imaging spectroscopy for crop phenotyping originated from studies on the remote sensing of vegetation [48]. Imaging spectroscopy is a technique for detecting and classifying objects by measuring the light reflectance of finely divided wavelengths in the optical part of the electromagnetic spectrum [96]. However, multispectral satellite remote sensing cannot effectively detect early signs of stress in crops (e.g., nutrient deficiencies, crop diseases) in a timely manner, as the accuracy of the retrieved variables is often limited due to limitations in spectral resolution [97]. This has led to the need for remote sensing instruments and sensors with high spectral and spatial resolution [98]. The use and development of hyperspectral imaging have been crucial to eliminating those problems, providing hundreds of bands from which to obtain a more detailed spectral response of the target feature than multispectral imaging [20]. The following Table 5 summarizes the use of hyperspectral imaging to assess water stress in crops, and it is organized by crop type, best-performing model, the methodologies used, paper objectives, author, publisher, country, and year.

4. Artificial Intelligence

4.1. Machine Learning

Machine learning is an evolving field of computational algorithms that are designed to imitate human intelligence by learning from their surroundings. This field has a major role to play in the new era of big data [117]. Machine learning uses algorithms that learn from data, allowing computers to teach themselves information from data to solve problems. These learning algorithms are used in many fields, including image processing, prediction, analytics, and more [118]; they are broadly divided into supervised learning, unsupervised learning, and reinforcement learning.
  • Supervised learning is a method of using pairs of input data and corresponding output values to learn a function that allows the system to predict the output for new inputs [119].
  • Unsupervised learning is a method of classifying patterns among data by uncovering the hidden structure of input data without an output [120].
  • Reinforcement learning is a subfield of machine learning in which software agents learn behaviors that maximize their cumulative reward in the environment [121].
Reinforcement learning (RL) offers distinct advantages for real-time decision-making and automation in agriculture; its capacity to continuously learn and adapt through interactions with the environment makes it especially effective for dynamic and changing agricultural conditions. Although the use of RL in crop water stress research is currently limited [122], its potential to greatly enhance adaptive management and optimize irrigation strategies suggests that further exploration and experimentation are worthwhile [123,124,125].
The selection of the algorithm approach depends on the type of problem to be solved, the number of variables involved, and the type of model that best fits the data, among other factors [118]. In particular, SVM and PLS algorithms have been effectively used to analyze remote sensing data as models for crop water stress assessment. The following Table 6 is a summary of the use of machine learning for crop water stress analysis in the above research cases.

4.1.1. Support Vector Machines

Support Vector Machines were introduced by Vapnik in 1995 and are classification models based on statistical learning theory that can be applied to both classification and regression problems [126]. Although SVMs were developed in the late 1970s, they started to gain popularity in the field of remote sensing in 2003 [127]. SVMs primarily aim to find the hyperplane that maximizes the margin between two classes [128]. When data are linearly separable, SVMs separate the two classes using the hyperplane that achieves the widest margin. However, in cases where the data are not linearly separable, SVMs use kernel functions to map the data into a higher-dimensional feature space, where an optimal hyperplane is found. SVMs can utilize various kernel functions (e.g., linear kernel, polynomial kernel, and Gaussian kernel) to map nonlinear data into higher dimensions, and choosing an appropriate kernel function significantly affects their classification performance [129].
In this process, support vectors are the most critical data points that contribute to defining the hyperplane separating the two classes. The remaining data points do not influence the position of the hyperplane, which is one reason SVMs can achieve high classification accuracy even with a small amount of data [130]. Additionally, SVMs are known to be robust against overfitting, as they strike a balance between performance on the training data and the ability to generalize to new data [127]. The following Figure 4 shows the structure of the SVM.
Such SVMs are utilized in various fields, including medicine [131,132], statistics [133,134], and text analysis [135,136]. In the agricultural sector, SVMs are particularly applied in crop prediction and classification [137,138], as well as in yield forecasting [139,140].
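A minimal scikit-learn sketch of the SVM idea described above, applied to a hypothetical stressed-vs-well-watered classification; the two features (canopy temperature and NDVI) and all values are illustrative assumptions, not data from the reviewed studies.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical training data: each row is [canopy temperature (C), NDVI];
# label 0 = well-watered, 1 = water-stressed. Values are illustrative only.
X = np.array([
    [26.0, 0.80], [27.0, 0.78], [25.5, 0.82], [26.5, 0.79],  # well-watered
    [33.0, 0.55], [34.0, 0.52], [32.5, 0.58], [33.5, 0.54],  # stressed
])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

# An RBF kernel maps the features into a higher-dimensional space in which
# a maximum-margin hyperplane separates the two classes.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X, y)

print(clf.predict([[26.2, 0.81], [33.2, 0.53]]))  # → [0 1]
print(len(clf.support_vectors_))  # only these points define the hyperplane
```

Note that only the support vectors influence the decision boundary, which is why SVMs can perform well on small datasets such as this one.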

4.1.2. Partial Least Squares Regression

Partial Least Squares Regression was introduced by H. Wold in 1975 [141]. Developed to handle large datasets, PLS combines path analysis, principal component analysis, and regression analysis [142,143], integrating dimensionality reduction with parameter estimation. PLS iteratively applies simple bivariate regression (least squares) between columns or rows of matrices in order to estimate covariates for each model. The process begins by generating latent factor variables from the independent variable data (X matrix), which are then used to model the relationship with the dependent variable (Y). In a PLS model, the contribution of each variable is evaluated through standardized model coefficients, which represent the relationships between variables. A larger positive coefficient indicates that the independent variable has a stronger positive influence on the dependent variable [143,144]. This method is particularly useful for modeling complex relationships involving multiple variables, making it suitable for exploratory research or studies where complex causal relationships are not yet fully understood [145,146].
PLS is often confused with principal component analysis (PCA). PCA transforms the original set of variables into principal components (PCs), where the first principal component explains most of the data’s variance, and subsequent components account for progressively less variance. Unlike PLS, which models the relationship between independent variables (X) and dependent variables (Y), PCA focuses on explaining the variance within the independent variables (X) alone, without considering their relationship to the dependent variable (Y) [147,148]. The following Figure 5 shows the structure of the PLS.
PLS is applied in various fields, including technology adoption analysis [149], leisure studies [150], linguistics and education [151], and marketing [152]. In the agricultural sector, PLS is primarily used for analyzing the behavior of agricultural practitioners [153,154,155].

4.2. Deep Learning

Deep learning is an extension of classical machine learning, using a variety of functions to add more “depth” to models and represent data in a hierarchical way with multiple levels of abstraction [156,157]. One of the benefits of deep learning is feature learning; it automatically extracts features from raw data, with higher-level features in a hierarchy formed by combinations of lower-level features [158]. Deep learning also uses more complex models than machine learning, allowing for massively parallel processing. These deep learning models excel at classification and prediction due to their hierarchical structure and large learning capacity, and they are flexible enough to adapt to diverse and complex data analyses [159]. Deep learning has been applied to a wide range of fields, including automatic speech recognition, image recognition, natural language processing, drug discovery, and bioinformatics [160]. Deep learning is a relatively new technology, especially in agriculture; however, many researchers have tried to implement it in several applications, such as disease detection and identification, fruit and object classification, and many more [161]. In particular, ANN, CNN, and RNN algorithms have been effectively used to analyze remote sensing data as models for crop water stress assessment. The following Table 7 is a summary of the use of deep learning for crop water stress analysis in the above case.

4.2.1. Artificial Neural Networks

The concept of an ANN, introduced by W.S. McCulloch and W. Pitts, is a mathematical representation of the neurons in the human brain, designed to simulate the way in which the brain processes information [162]. ANNs began to be widely used in research with the introduction of the backpropagation (BP) training algorithm for feedforward neural networks in 1986 [163]. ANNs are biologically inspired computational models composed of hundreds of single-unit artificial neurons that are trained to adjust their parameters in order to produce outputs similar to those of known datasets [164]. By learning from historical data, once sufficiently trained, ANNs can adapt to recurring changes and detect patterns in complex data [165,166]. One of the most prominent types of ANN is the Multilayer Perceptron (MLP) neural network, which consists of an input layer, an output layer, and one or more hidden layers in between, with each layer containing multiple artificial neurons. These neurons receive input signals, apply weights to calculate a weighted average, and generate outputs through an activation function [167]. The following Figure 6 shows the structure of the ANN.
ANNs can perform regression analysis on highly nonlinear problems and are applied to find nonlinear relationships between input and output datasets [168]; they are mainly utilized for classification and recognition using multispectral information [169]. In agriculture, ANN models have been used in crop development modeling [170], crop yield prediction [171,172], evapotranspiration estimation [173], and crop water stress assessment [174,175,176].

4.2.2. Convolutional Neural Networks

In 1980, K. Fukushima proposed the neocognitron, which can be considered to be the predecessor of CNNs [176]. In 1990, LeCun et al. [177] published a seminal paper that established the modern framework for CNNs, and since the early 2000s, ConvNets have had great success in detecting, segmenting, and recognizing objects and regions in images [158]. The basic building blocks of a CNN consist of three types: convolutional, pooled, and fully connected layers [178]. The convolutional layer detects local connections between features from previous layers, and the pooling layer merges semantically similar features into one [158]. The following Figure 7 shows the structure of the CNN.
CNNs can automatically learn important features from images and find hidden patterns. When learning more data, this system can be more advanced at finding deep features in images [179]. In agriculture, CNN models are being used for crop mapping [180], crop disease diagnosis [181,182,183,184], weed and crop recognition [185,186], yield prediction [187], and crop water stress detection [57].
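The two building blocks named above (convolution and pooling) can be shown directly in NumPy; the toy "leaf image" and edge-detecting kernel are assumptions for illustration, not part of any reviewed model.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation): the convolutional layer's
    core operation, detecting local patterns in the input."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(feature_map, size=2):
    """Max pooling merges neighbouring responses, giving a small amount of
    translation tolerance."""
    h, w = feature_map.shape
    h, w = h - h % size, w - w % size
    return (feature_map[:h, :w]
            .reshape(h // size, size, w // size, size)
            .max(axis=(1, 3)))

# Toy 6x6 "image" with a vertical dark-to-bright edge; an edge-detecting
# kernel responds strongly along that boundary (values illustrative only).
img = np.zeros((6, 6))
img[:, 3:] = 1.0
edge_kernel = np.array([[-1.0, 1.0], [-1.0, 1.0]])
features = np.maximum(conv2d(img, edge_kernel), 0)  # ReLU activation
pooled = max_pool(features)
print(pooled.shape)  # → (2, 2)
```

In a real CNN the kernel weights are learned from data rather than hand-set, and many such kernel/pool stages are stacked before the fully connected layers.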

4.3. Ensemble Learning

One of the earliest examples of ensemble learning is the work of Dasarathy and Sheela in 1979, which introduced the idea of partitioning the feature space using multiple classifiers [188]. Since then, there has been an explosion of research on ensemble learning, with the main methods being bagging, boosting, and stacking [189,190]. Ensemble learning is a method that combines multiple base learners to make predictions on new inputs. Base learners can be built from a variety of machine learning algorithms, such as decision trees, neural networks, and linear regression, which take labeled data as inputs and create a predictive model, allowing predictions on new, unlabeled data [191]. Ensemble learning can reduce the risk of overfitting owing to its variety of base models, and by combining the results of different classification algorithms, it can reduce generalization error without increasing the variance of the model [192]. Traditional ensemble learning has been applied to a variety of fields by incorporating basic machine learning models [193,194], and in recent years there have also been many attempts to combine deep learning with ensemble learning [195,196]. Ensemble learning has a wide range of applications, including fake news detection [197], web-based attack detection [198], battery health estimation [199], dissolved oxygen prediction [200], and short-term electricity load prediction. In the agricultural field, ensemble learning has been used for growth diagnostics [201], yield prediction [202], pest classification [203], disease recognition [204], and crop classification [205]. XGBoost and RF algorithms have been effectively used to analyze remote sensing data to assess crop water stress. The following Table 8 presents examples of how ensemble learning algorithms have been used to analyze crop water stress in the above cases.

4.3.1. Extreme Gradient Boosting

Extreme Gradient Boosting (XGBoost) is an algorithm based on the boosting tree model, introduced by Tianqi Chen and Carlos Guestrin in 2014 and optimized for decision and regression trees [206]. The gradient boosting algorithm offers very high predictive power; however, it requires considerable training time because trees must be built one at a time, each minimizing the errors of the trees before it. XGBoost was created to overcome this drawback [207]. XGBoost primarily utilizes gradient-boosted decision trees, emphasizing speed and performance. Boosting is an ensemble method that adds new models to correct the errors of existing ones: XGBoost fits a new model to the residuals (errors) left by the previous models and adds it to the ensemble to improve the final prediction, using gradient descent to minimize errors as models are added [208]. Figure 8 illustrates the structure of the boosting algorithm, in which multiple decision trees are trained sequentially.
XGBoost is very effective at reducing computation time and making optimal use of memory resources [209]. In the agricultural sector, XGBoost is being used in various fields, such as yield prediction [210,211,212], evapotranspiration prediction [213], crop forecasting [214], etc.
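The residual-fitting loop at the heart of gradient boosting can be sketched in plain Python. This is a toy illustration with depth-one regression stumps, a fixed learning rate, and made-up data; it is not the actual XGBoost implementation, which adds regularization, second-order gradients, and extensive systems optimizations:

```python
def fit_stump(x, residuals):
    """Find the 1-D split that best reduces squared error on the residuals."""
    best = None
    for threshold in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= threshold]
        right = [r for xi, r in zip(x, residuals) if xi > threshold]
        if not left or not right:
            continue
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, threshold, lmean, rmean)
    _, t, l, r = best
    return lambda xi: l if xi <= t else r

def gradient_boost(x, y, n_rounds=20, lr=0.3):
    """Sequentially fit stumps to the residuals of the current ensemble."""
    prediction = [sum(y) / len(y)] * len(x)  # start from the mean
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, prediction)]
        stump = fit_stump(x, residuals)      # new model predicts the errors
        prediction = [pi + lr * stump(xi) for pi, xi in zip(prediction, x)]
    return prediction

x = [1, 2, 3, 4, 5, 6]
y = [1.0, 1.1, 0.9, 3.0, 3.2, 2.9]
print([round(p, 2) for p in gradient_boost(x, y)])  # approaches y round by round
```

Each round shrinks the remaining residuals, which is exactly the sequential tree-building that Figure 8 depicts.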

4.3.2. Random Forest

Random Forest was introduced by L. Breiman in 2001 [215]. RF is an ensemble machine learning algorithm that uses subsets of features and bootstrap samples to build regression trees [216]. The essential components of this ensemble are tree-structured predictors, and because each tree is generated by injecting randomness into the process, the procedure is referred to as a “Random Forest” [217]. RF generates multiple decision trees through randomization and trains them on bootstrap samples; for classification tasks, the final prediction is obtained by voting, while for regression tasks, the predictions are averaged. Each tree is built by randomly selecting predictors and determining the optimal splits, and “out-of-bag” (OOB) data are used to evaluate the model’s performance [218]. Figure 9 illustrates the bagging algorithm, in which multiple bootstrap samples are drawn and a decision tree is trained on each; the trees’ predictions are then aggregated to produce the final prediction.
The RF algorithm performs efficiently on large databases and can handle thousands of input variables without overfitting, achieving fast and high prediction accuracy [215]. In agriculture, RF is used in a variety of applications, including crop classification [219,220], yield prediction [221], and crop forecasting [222].
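The bootstrap/out-of-bag bookkeeping behind RF can be shown with a short stand-alone sketch (plain Python, hypothetical sample counts). On average roughly 36.8% of the samples are left out of each bootstrap draw and become that tree's OOB evaluation set:

```python
import random

def bootstrap_with_oob(n_samples, n_trees, seed=0):
    """Draw one bootstrap sample per tree and record the out-of-bag indices."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_trees):
        # Sample n indices with replacement: some repeat, some never appear
        in_bag = [rng.randrange(n_samples) for _ in range(n_samples)]
        oob = sorted(set(range(n_samples)) - set(in_bag))  # unused → OOB set
        samples.append((in_bag, oob))
    return samples

for in_bag, oob in bootstrap_with_oob(8, 3):
    print("in-bag:", in_bag, "out-of-bag:", oob)
```

In a full RF, each tree is trained on its in-bag indices and scored on its OOB indices, giving a built-in validation estimate without a separate hold-out set.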

5. Case Analysis

In this section, we systematically organize the collected cases according to their respective remote sensing and AI techniques. Subsequently, we present the results of specific studies that evaluate and compare the performance of each technique. Through this analysis, we aim to provide a clearer understanding of the strengths and limitations of each technique. The cases have been organized based on the most frequently used AI techniques and analyzed in detail, considering aspects such as study area, study period, data acquisition methods, dataset size, and accuracy. The specific applications are as follows:

5.1. SVM

Azimi et al. [52] proposed a method for identifying water stress in chickpeas based on RGB images. The SIFT and HOG feature extraction techniques were employed. For analysis, KNN, decision trees, Naive Bayes, and SVM were used, with SVM achieving the highest accuracy of 73%.
Mohite et al. [107] proposed a method for detecting water stress in maize crops using drone-based hyperspectral imagery. The hyperspectral data from the influential wavelength bands were utilized for water stress detection. SVM and RF were used for analysis, with SVM demonstrating the highest performance in the 670–780 nm wavelength range.
Sankararao et al. [108] proposed a method for detecting water stress in pearl millet crops using drone-based hyperspectral imagery. Hyperspectral data were processed with various machine-learning-based feature selection techniques to extract wavelength bands sensitive to water stress. SVM and RF were used for analysis, with SVM achieving an accuracy of 95.38%. For early stress detection, the method achieved an accuracy of 80.76%.
Zhuang et al. [113] proposed a method combining continuous wavelet transform and machine learning techniques to predict the water status of winter wheat. The wavelet transform decomposed data into frequency components at various scales, enabling simultaneous analysis of time and frequency information. The resulting multi-wavelength features were used for prediction. SVM and RF were employed for analysis, with SVM achieving an accuracy of 93% in predicting plants’ water content.
SVM has a strong advantage in handling nonlinear data and is widely used in agricultural research for its ability to manage complex patterns. Data collected from RGB or hyperspectral images are high-dimensional and intricate, making SVM highly effective for classifying them. SVM is particularly well suited to capturing subtle variations across wavelength bands and, when combined with feature selection techniques, can produce more precise results. When data are analyzed at multiple scales, as with wavelet transforms, SVM also excels at extracting core patterns from combined time and frequency information. Table 9 provides a detailed analysis of the collected cases that utilized the SVM model.
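SVM's handling of nonlinear spectral data rests on kernel functions; the commonly used Gaussian (RBF) kernel can be written in a few lines. The band reflectance values below are purely illustrative, not from any of the cited studies:

```python
import math

def rbf_kernel(u, v, gamma=0.5):
    """Gaussian (RBF) kernel: the similarity measure that lets an SVM
    fit nonlinear decision boundaries in the induced feature space."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(u, v))
    return math.exp(-gamma * sq_dist)

stressed = [0.42, 0.31, 0.18]   # hypothetical reflectances in three bands
healthy  = [0.55, 0.47, 0.36]
print(rbf_kernel(stressed, stressed))  # → 1.0 (identical spectra)
print(rbf_kernel(stressed, healthy))   # similarity < 1, decays with spectral distance
```

The SVM decision function is a weighted sum of such kernel evaluations against the support vectors, which is why the classifier copes well with high-dimensional spectral inputs.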

5.2. PLS

Wan-Gyu et al. [54] proposed a method for detecting drought stress in soybeans using RGB images. Various vegetation indices were extracted from the RGB images to analyze changes in the leaf color and canopy cover of the soybean crops. PLS-DA was used as the analysis technique. The results indicated that leaf color changes were more sensitive to drought stress than canopy cover changes.
Sobejano-Paz et al. [103] proposed a method for assessing water stress in soybean and maize crops by combining hyperspectral and thermal imagery. Key variables included stomatal conductance, transpiration rate, and photosynthesis, with additional parameters such as plant temperature and canopy height included alongside hyperspectral data. PLS-R was used for the analysis, demonstrating accurate predictions for both soybeans and maize. For soybeans, temperature-related variables were identified as the primary factors, while for maize, canopy height was found to be the most significant variable.
Kang et al. [112] proposed a method for evaluating water stress in grapevines using hyperspectral imagery. Water stress was assessed by predicting leaf water potential, stomatal conductance, and soil moisture content. Spectral data were collected under diffuse lighting conditions. PLS-R was employed for analysis, demonstrating very high accuracy in predicting leaf water potential and soil moisture content.
The Partial Least Squares (PLS) algorithm effectively predicts key physiological variables such as stomatal conductance, transpiration rate, and photosynthesis. In particular, the PLS-R model analyzes many wavelength bands to capture drought-sensitive spectra, enabling more precise predictions by exploiting bands that conventional vegetation indices do not use. PLS also excels at reducing variability within spectral data, making it particularly strong at generating accurate predictions. Table 10 provides a detailed analysis of the collected cases that utilized the PLS model.

5.3. ANNs

Chandel et al. [57] proposed a method for detecting water stress in winter wheat crops using high-resolution thermal–RGB imagery combined with advanced AI techniques. The method integrated weather and soil data. LSTM was employed as the analysis technique, achieving a prediction accuracy of 96.7%.
Mazare et al. [61] proposed a thermal imaging analysis system for real-time detection of plant water stress. The system utilized a FLIR thermal camera to analyze plant temperature distribution and a deep learning algorithm was employed to learn and recognize early signs of water stress. An ANN was used as the analysis technique, achieving an accuracy of 97.8%.
Elsherbiny et al. [65] proposed a method for predicting water stress in rice crops using visible light and thermal imagery. Color and texture features were extracted from RGB images, while temperature indices were derived from thermal data. An ANN was employed as the analysis technique, utilizing a total of 21 key features as input variables. The model demonstrated a high accuracy of 99.4%.
Carrasco-Benavides et al. [66] proposed a method for predicting water stress in cherry trees using thermal imagery. Canopy temperature and relative humidity data were extracted from infrared thermal images. An ANN was used as the analysis technique to predict stem water potential and stomatal conductance, achieving accuracies of 83% and 75%, respectively. The model based on canopy temperature and stomatal conductance demonstrated an overall prediction accuracy of 81%.
King et al. [86] proposed a data-driven model for predicting the Crop Water Stress Index (CWSI) using canopy temperature in sugar beet and grape crops. A neural network model was employed for analysis, achieving a Nash–Sutcliffe efficiency greater than 0.88 in predicting the lower limit temperature (TLL) and an RMSE of less than 1.1 °C, indicating high accuracy.
Cherie et al. [87] presented a comparative study of AI techniques for calculating the Crop Water Stress Index (CWSI) in rice crops. Key meteorological variables such as relative humidity, air temperature, and canopy temperature were used to calculate the CWSI. FF-BP-ANN and SOM were employed as the analysis techniques, with FF-BP-ANN achieving the highest accuracy, at 97%.
Kumar et al. [91] proposed an AI-based method for predicting the Crop Water Stress Index (CWSI) in rice and Indian mustard crops. The model was compared with experimentally calculated CWSI values based on data collected under various irrigation levels. ANN, SVR, and ANFIS models were employed as the analysis techniques, with ANN5 (featuring five hidden neurons) achieving the highest accuracy, at 99%.
Muni et al. [95] presented a comparative study of AI techniques for predicting the Crop Water Stress Index (CWSI) in wheat crops. The CWSI values were derived from experimental data. MLP, SMOreg, M5P, RF, IBk, RT, bagging, and Kstar were used as the analysis techniques, with MLP showing the highest predictive accuracy, achieving an MAE value of 0.013.
Osco et al. [100] proposed a method for evaluating water stress in lettuce crops using hyperspectral imagery. Hyperspectral data were collected over 14 days from lettuce plants under induced stress conditions. An ANN was employed as the analysis technique, achieving an accuracy of 80% at the beginning of the experiment and 93% by the end.
Artificial Neural Networks (ANNs) excel at learning nonlinear relationships and handling multiple variables simultaneously, making them highly effective for accurately predicting complex quantities such as crop water stress. Owing to their layered structure, ANNs can automatically learn key patterns in data, allowing for more refined predictions than other models. ANNs are particularly well suited to managing multiple physiological indicators, supporting precise water management and efficient irrigation decision-making. Table 11 provides a detailed analysis of the collected cases that utilized the ANN model.
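Several of the studies above predict the CWSI from canopy temperature. In the wet/dry reference formulation, the index itself is a simple normalization (the temperatures below are illustrative values, not measurements from the cited work):

```python
def cwsi(canopy_temp_c, t_wet_c, t_dry_c):
    """Crop Water Stress Index from canopy temperature and wet/dry
    reference temperatures: 0 ≈ well watered, 1 ≈ maximally stressed."""
    return (canopy_temp_c - t_wet_c) / (t_dry_c - t_wet_c)

# Hypothetical readings: canopy 29 °C, wet reference 25 °C, dry reference 35 °C
print(cwsi(29.0, 25.0, 35.0))  # → 0.4
```

The AI models in this section effectively learn to estimate the hard-to-measure inputs of this ratio (e.g., the lower-limit temperature in [86]) from routinely available weather and canopy data.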

5.4. CNNs

Zhuang et al. [53] proposed a method for detecting water stress in maize crops using phenotypic images of leaves. A total of 18,040 images reflecting three different water stress conditions were utilized. A CNN was employed as the analysis technique to extract feature maps, which were then used by an SVM classifier to categorize the water stress levels. The method achieved an accuracy of 88.41%.
Chandel et al. [56] proposed a system for real-time detection of crop water stress using an AI-based mobile device. GoogLeNet was employed to collect images in real time and classify water stress levels. The system achieved high accuracy rates of 97.9% for maize and 92.9% for wheat crops; additionally, it demonstrated fast processing speeds, with results generated within 200 milliseconds after image input.
Chandel et al. [57] proposed a method for detecting water stress in winter wheat crops using high-resolution thermal–RGB imagery combined with advanced AI techniques. The collected images were analyzed using the ResNet50 model, achieving high accuracy rates of 98.4% for thermal images and 96.9% for RGB images.
Melo et al. [67] proposed a method for detecting water stress in sugarcane crops using thermal imagery. Thermal images of sugarcane were collected with a thermal camera and analyzed using the Inception-ResNet-v2 model, which achieved 23% higher accuracy compared to manual evaluation. Specifically, the model attained accuracy rates of 83%, 90%, and 98% when predicting available water capacity (AWC) levels of 25%, 50%, and 100%, respectively, in sugarcane.
Aversano et al. [68] proposed a method for detecting water stress in tomato crops using thermal and optical imagery captured by drones. The VGG-19 model was employed for analysis, achieving an accuracy of 80.5% with the thermal images.
Jin et al. [70] proposed a method for detecting water stress in cotton crops under film-mulching drip irrigation using thermal imagery. The MobileNetV3 model was employed for analysis, achieving high accuracy, with an F1 score of 0.9990 and a processing speed of 44.85 ms.
Nasir et al. [102] proposed a method for estimating plants’ leaf water content using hyperspectral imagery. A CNN was employed for analysis, achieving a high accuracy of 98.4% and an RMSE of 4.183.
Sankararao et al. [104] proposed a method for detecting water stress in chickpeas using UAV-based hyperspectral imagery. The analysis was conducted using a 3D-2D CNN model, achieving a high accuracy of 95.44%.
CNN-based deep learning models demonstrate excellent performance in detecting water stress. CNNs can learn the phenotypic features of crops, allowing for non-invasive monitoring of water status and providing quantitative assessments of the degree of water stress. Models such as GoogLeNet and ResNet50 offer higher accuracy than other techniques, with prediction performance improving significantly through the integration of thermal imagery and multiple variables. Models such as DL-LSTM combine meteorological and soil variables to aid real-time water management decisions, while transfer learning helps address data scarcity. MobileNetV3, with its fast processing speed and low computational complexity, is well suited to agricultural applications, and 3D-2D CNN models accurately capture subtle stress variations by utilizing multiple bands. Table 12 provides a detailed analysis of the collected cases that utilized the CNN model.
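The feature-map extraction these CNN architectures rely on reduces, at its core, to discrete convolution. The toy sketch below (plain Python, a hypothetical 4×4 "thermal" patch) shows a single vertical-edge kernel producing one feature map, the elementary operation that deep networks stack and learn thousands of times over:

```python
def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation, as used in CNN layers)
    producing a single feature map."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A vertical-edge kernel applied to a toy patch with a 30 °C / 34 °C boundary
patch = [[30, 30, 34, 34],
         [30, 30, 34, 34],
         [30, 30, 34, 34],
         [30, 30, 34, 34]]
edge = [[-1, 0, 1],
        [-1, 0, 1],
        [-1, 0, 1]]
print(conv2d(patch, edge))  # → [[12, 12], [12, 12]]
```

In a trained CNN the kernel weights are learned rather than hand-set, so the network discovers which spatial temperature or color patterns signal water stress.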

5.5. Ensemble

Das et al. [63] proposed a method for detecting water status in vineyards using mobile thermal imaging and machine learning techniques. Random Forest (RF) was employed for analysis, achieving an R2 of 0.61 in cross-validation and 0.65 in predictions.
Yang et al. [64] proposed a method for detecting water stress in Chinese cabbage by predicting canopy temperature. Random Forest (RF) was used as the analysis technique, achieving high accuracy, with R2 values of 0.90 in the first experiment and 0.91 in the second experiment.
Wu et al. [69] proposed a method for estimating water stress in rice crops using multi-temporal temperature indices and machine learning techniques. Random Forest was employed for analysis, achieving R2 values of 0.78 for PWC, 0.77 for CWC, and 0.64 for CEWT, demonstrating high accuracy.
Wang et al. [71] proposed a method for detecting water stress in winter wheat crops using UAV-based multispectral and thermal remote sensing. The Gradient-Boosting Decision Tree (GBDT) technique was employed for analysis, achieving high accuracy, with R2 values of 0.88 for NGS prediction and 0.90 for EWC prediction.
Katimbo et al. [89] proposed an AI-based model for predicting evapotranspiration (ET) and crop water stress. CatBoost and Stacked Regression were employed as the analysis techniques, achieving high accuracy, with an RMSE of 0.06–0.09 for CWSI prediction and 0.27–0.72 mm/day for ETc prediction.
Pei et al. [90] proposed a method for detecting water stress in cotton crops using UAV-based multispectral imagery and texture information. XGBoost was employed for analysis, achieving high accuracy, with an R2 of 0.90 and an RMSE of 0.05 for CWSI prediction.
Chen et al. [92] proposed a method for detecting water stress in sorghum and maize crops using UAV remote sensing and a multidimensional drought index. Random Forest Regression (RFR) was employed for analysis, achieving high accuracy, with R2 = 0.76 and RMSE = 1.15% for sorghum and maize.
Kapari et al. [94] proposed a method for detecting water stress in maize crops using multispectral and thermal images collected by UAVs, along with machine learning algorithms. Random Forest was employed for analysis, achieving high accuracy, with an R2 of 0.85 and an RMSE of 0.05 for CWSI prediction.
Loggenberg et al. [99] proposed a method for detecting water stress in grape crops using hyperspectral imaging and machine learning. Random Forest and XGBoost were employed as the analysis techniques, achieving accuracies of 83.3% and 80.0%, respectively.
Martin et al. [105] proposed a method for detecting water stress in potato crops using hyperspectral imaging and machine learning algorithms. Random Forest and XGBoost were employed for analysis, with XGBoost showing the highest performance across all growth stages, ultimately achieving an accuracy of 99.7%.
Niu et al. [106] proposed a method for predicting water stress in maize crops using UAV-based multispectral imagery. Random Forest, Artificial Neural Networks (ANNs), and Multivariate Linear Regression (MLR) were employed as analysis techniques, with the Random Forest model achieving the highest accuracy, recording an R2 of 0.89 and an RMSE of 0.066.
Thapa et al. [109] proposed a method for detecting water stress in grape crops using hyperspectral imagery and machine learning. Random Forest and Artificial Neural Networks (ANNs) were employed as analysis techniques, achieving 73% and 70% accuracy, respectively.
Mertens et al. [111] proposed a method for detecting water stress in maize using near-range thermal and hyperspectral imagery on an indoor automated plant phenotyping platform. Random Forest and LASSO were employed as analysis techniques, with the LASSO model achieving the highest accuracy, recording an R2 of 0.63 and an RMSE of 0.47.
Mao et al. [114] proposed a method for predicting yield loss due to water stress in wheat crops using hyperspectral imagery. Various analysis techniques were employed, including Random Forest Regression (RFR), Partial Least Squares Regression (PLS-R), and multiple random ensembles. Among these, the multiple random ensemble model based on PLS-R demonstrated the highest accuracy.
Zhang et al. [116] proposed a method for predicting leaf water content (LWC) in rice using hyperspectral remote sensing combined with machine learning. The Gradient-Boosting Decision Tree (GBDT) technique was employed for analysis, achieving high accuracy, with R2 = 0.86 and RMSE = 0.01.
RF and GBDT have demonstrated strong performance in predicting the CWSI and crop water status, significantly improving accuracy by integrating meteorological and thermal data. CatBoost and XGBoost excel particularly in combining multispectral indices and texture information, making them crucial tools for real-time monitoring and irrigation management. LASSO has shown high accuracy in predicting evapotranspiration rates, and models utilizing spectral bands offer promising potential as efficient tools for water stress detection and management. Table 13 provides a detailed analysis of the collected cases that utilized ensemble models.

5.6. Others

Elsherbiny et al. [55] proposed a hybrid deep learning network for diagnosing the water status of wheat crops using IoT-based multimodal data. The analysis employed a hybrid model combining a CNN and LSTM, achieving an accuracy of 100%.
Das et al. [63] proposed a method for detecting water stress in wheat crops using UAV-based thermal imaging and machine learning. The analysis employed the Classification and Regression Tree (CRT) technique. The model achieved high accuracy, with an R2 of 0.86 and an RMSE of 41.3 g/m² for biomass prediction, and an R2 of 0.78 and an RMSE of 16.7 g/m² for grain yield prediction.
Ismail et al. [32] proposed a method for smart agriculture that involves generating reconstructed thermal images from visible-light images. The analysis employed Generative Adversarial Networks (GANs) as the technique. The reconstructed thermal images demonstrated high accuracy and achieved visual quality comparable to actual thermal images.
Pradawet et al. [88] proposed a method for detecting water stress in maize crops using thermal imaging and machine learning models. They introduced an enhanced Crop Water Stress Index (CWSI), which demonstrated a high correlation with leaf stomatal conductance, achieving an R2 value of 0.90.
Bounoua et al. [93] proposed a method for predicting crop water stress using satellite-based remote sensing data. They employed CNN-LSTM and ConvLSTM models for analysis, with the CNN-LSTM model achieving the highest accuracy, with an RMSE of 0.119.
Asaari et al. [101] proposed a Convolutional Neural Network (CNN)-based regression method for detecting drought stress in maize crops and analyzing the recovery process using hyperspectral imaging. They combined Support Vector Machine (SVM) with K-Means Clustering to remove nonlinear effects and measure the spectral similarity of plants. The SVM classification achieved an accuracy of over 96%.
Adduru et al. [110] proposed a method for the early detection of water stress in peanut crops using UAV-based hyperspectral imaging and machine learning techniques. The analysis employed SVM, Random Forest (RF), and XGBoost. The SVM model achieved the highest accuracy, at 96.46%.
Malounas et al. [115] proposed a method for detecting drought stress in broccoli crops using hyperspectral imaging and AutoML (Automated Machine Learning). The analysis utilized PyCaret, AutoML, and PLS-DA (Partial Least Squares Discriminant Analysis). PyCaret achieved the highest accuracy, with an F1 score of 1.00.
In crop water stress research, which often involves extended study durations, the amount of available data can be limited. Generative Adversarial Networks (GANs) therefore hold significant potential for enhancing the accuracy of water stress detection across diverse environmental conditions through data augmentation, positioning them as an important tool for advancing precision agriculture. Table 14 provides a detailed analysis of the collected cases that utilized other models.

6. Latest AI Technologies

6.1. Generative Adversarial Networks

To address the lack of data, researchers have used several techniques (Figure 10). The first is to augment the data by applying various geometric and color transformations, angular rotations, mirroring, etc., to the images [223,224,225,226,227,228,229]. The second is to use transfer learning to improve performance [230,231,232,233,234,235,236]. These data augmentation techniques have led to improved model performance. More recently, GANs [237] have gained traction as a third way to address data sparsity. GANs are a generative modeling framework proposed by Goodfellow et al. that aims to generate data with the same characteristics as the training instances by synthesizing new data that are visually similar to the training set [237]. Compared to traditional data augmentation techniques, GANs induce more variation and enrich the original dataset through representation learning, and they consistently improve model performance when combined with traditional augmentation methods and original data [238,239,240]. Various GAN architectures have been developed for image synthesis, including AutoGAN [241], BIGGAN [242], and DCGAN [243]. However, GAN models run the risk of producing low-quality images due to training instability, and their performance is sensitive to hyperparameter settings [237,244,245]. Effective use of GANs therefore requires careful hyperparameter tuning, network structure engineering, and various training tricks [246]. A GAN operates by having the generator take random inputs to create fake images, while the discriminator compares the generated fake images with real images to distinguish between them. During training, the discriminator learns to better differentiate real from fake images, and the generator improves its ability to create increasingly realistic fakes.
Through this adversarial learning, the generator eventually produces refined images that can deceive the discriminator, making the generated images almost indistinguishable from real ones [247].
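The adversarial loop can be made concrete with a deliberately tiny sketch: a one-parameter "generator" that shifts Gaussian noise toward the real distribution, and a logistic "discriminator", each updated with hand-coded gradient steps. This is a pedagogical toy under strong simplifying assumptions (1-D data, linear models, made-up learning rates), not a practical image GAN:

```python
import math, random

random.seed(0)
sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))

theta = 0.0          # generator parameter: fake sample = noise + theta
a, b = 0.0, 0.0      # discriminator: d(x) = sigmoid(a*x + b)
lr_d, lr_g, batch = 0.2, 0.05, 16

for step in range(3000):
    reals = [random.gauss(4.0, 1.0) for _ in range(batch)]          # real data ~ N(4, 1)
    fakes = [random.gauss(0.0, 1.0) + theta for _ in range(batch)]  # generator output

    # Discriminator step: ascend log d(real) + log(1 - d(fake))
    ga = gb = 0.0
    for x in reals:
        d = sigmoid(a * x + b)
        ga += (1.0 - d) * x; gb += (1.0 - d)
    for x in fakes:
        d = sigmoid(a * x + b)
        ga -= d * x; gb -= d
    a += lr_d * ga / (2 * batch); b += lr_d * gb / (2 * batch)

    # Generator step: ascend log d(fake), i.e. try to fool the discriminator
    gt = sum((1.0 - sigmoid(a * x + b)) * a for x in fakes)
    theta += lr_g * gt / batch

print(round(theta, 2))   # drifts from 0 toward the real mean (≈ 4)
```

Even in this toy, the characteristic GAN dynamics appear: the generator parameter oscillates around the target as the two objectives push against each other, which hints at why full-scale GAN training is unstable and hyperparameter-sensitive.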
Among the collected studies, research on crop water stress assessment using GANs is limited to a method that generates thermal images from visible images [32]. Although prior work on GAN-based crop water stress assessment is scarce, related applications have been explored, including data augmentation [248,249,250], disease detection [251], weed control [252], fruit detection [253], crop phenotyping [254], and quality assessment [255]. Considering these studies, GANs have great development potential in crop water stress assessment. Given the manual effort they require, such as hyperparameter tuning, we believe there is significant potential for further research on GAN-based crop water stress assessment.

6.2. Explainable AI

Explainable AI (XAI) is a technology that enables end users to understand the learning models and decision-making processes of AI systems, helping them to trust those systems [256]. The main goals of XAI are to build user trust, increase transparency and accountability, and improve model performance by making it possible to understand how AI models work and how they reach decisions [257,258]. In recent years, XAI has rapidly gained popularity, with new interpretable machine learning methods being proposed, reviewed, and applied across scientific fields [259,260,261,262]. XAI enhances model transparency, helps users trust the system, supports decision-making, and makes models easier to debug and improve [263]. However, there are notable limitations, including the difficulty of interpreting complex models, the instability of post hoc explanation methods, data bias, and the risk of misreading correlation as causality. To overcome these limitations, hybrid models appear to be an effective way to simplify the model while balancing performance [264] (Figure 11).
Crop water stress assessments rely on various parameters (soil moisture content, canopy temperature, photosynthesis rate, chlorophyll content, etc.). XAI can help analyze and interpret these data and clarify how each parameter contributes to the assessment of water stress. Although published research using XAI to assess water stress in crops is still scarce, XAI is already being used in a variety of agricultural applications, including crop recommendations [258], agricultural data analytics [265], and yield prediction [266], and it has proven beneficial for detecting crop diseases [267,268,269]. Since water stress manifests through symptoms similar to those of disease, XAI appears highly applicable to crop water stress assessment, and given its strength in interpreting complex models, research on XAI-based crop water stress assessment is expected to be actively conducted in the future.
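One widely used model-agnostic XAI technique, permutation importance, is simple enough to sketch directly: shuffle one input variable and measure how much the model's accuracy degrades. The threshold "model" and sensor readings below are hypothetical, chosen only to show how the technique attributes influence to each parameter:

```python
import random

def accuracy(model, X, y):
    return sum(model(row) == yi for row, yi in zip(X, y)) / len(y)

def permutation_importance(model, X, y, feature, n_repeats=10, seed=0):
    """Model-agnostic importance: mean accuracy drop when one feature is shuffled."""
    rng = random.Random(seed)
    base = accuracy(model, X, y)
    drops = []
    for _ in range(n_repeats):
        Xp = [row[:] for row in X]
        col = [row[feature] for row in Xp]
        rng.shuffle(col)                      # break the feature-target link
        for row, v in zip(Xp, col):
            row[feature] = v
        drops.append(base - accuracy(model, Xp, y))
    return sum(drops) / n_repeats

# Toy "model": stress if canopy temperature (feature 0) exceeds 32 °C;
# feature 1 is an irrelevant variable the model ignores.
model = lambda row: row[0] > 32
X = [[30, 0.1], [35, 0.9], [31, 0.4], [36, 0.2], [29, 0.7], [34, 0.5]]
y = [False, True, False, True, False, True]
print(permutation_importance(model, X, y, feature=0))  # large drop: temperature matters
print(permutation_importance(model, X, y, feature=1))  # → 0.0: ignored feature
```

Applied to a water stress model, the same procedure would rank soil moisture, canopy temperature, and the other parameters by their actual contribution to the prediction, which is precisely the transparency XAI aims for.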

7. Conclusions

The assessment of crop water stress is a crucial process for agricultural productivity and resource management. To conduct this assessment effectively, utilizing remote sensing technologies is essential. This study provides a comprehensive analysis of remote sensing and artificial intelligence (AI) techniques for the non-destructive and efficient evaluation of crop water stress induced by climate change. The main conclusions are as follows:
  • The use of remote sensing technologies has demonstrated the potential for non-destructive and precise evaluation of crop water stress. In particular, the use of thermal imaging data has proven effective, and CWSI-based thermal analysis holds significant potential for rapid and accurate water stress assessment.
  • Data analysis utilizing machine learning and deep learning models shows high potential for predicting crop water stress. Notably, CNN-based models are expected to achieve excellent performance through RGB and thermal imaging data.
  • Ensemble learning techniques combining various models have shown superior prediction performance compared to single models. Ensemble models such as RF and XGBoost can effectively learn complex data patterns and contribute to improved prediction accuracy.
  • The research on generating thermal images based on visible-light images using GANs has a high potential for addressing data scarcity issues. Reconstructed thermal images are expected to effectively assess water stress conditions.
  • Explainable AI (XAI) contributes to increasing user trust by explaining the decision-making processes of AI models. XAI is useful in interpreting the impact of various variables in water stress assessment and holds promise for future applications.
Overall, the studies reviewed here have produced high-accuracy models. However, developing technologies that can adapt to accelerating climate change and variable production conditions remains a crucial challenge. The application and development of various AI techniques are necessary to overcome the limitations of existing models. In particular, crop research typically takes about one year per cycle, and any management error can prevent achieving the desired results within that year. Upon analyzing the collected cases, most studies either built models with a small number of images or increased data through automatic capture every five to ten minutes. However, this only increases the quantity of data, not its quality. Generative algorithms such as GANs are expected to make a significant contribution to addressing this data scarcity issue. Additionally, while XAI algorithms have not yet been directly applied to water stress evaluation, they could help account for complex variables by improving model transparency. This study is anticipated to contribute to solving data scarcity issues and enhancing the accuracy and efficiency of decision-making in future crop water stress assessments.

Funding

This work was carried out with the support of “Cooperative Research Program for Agriculture Science and Technology Development (Project No. RS-2023-00218387)” Rural Development Administration, Republic of Korea.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Houghton, J.T. The IPCC Report 2001. In Proceedings of the 1st Solar and Space Weather Euroconference: The Solar Cycle and Terrestrial Climate, Santa Cruz de Tenerife, Spain, 25–29 September 2000; p. 255.
  2. Kurukulasuriya, P.; Mendelsohn, R.O. How will climate change shift agro-ecological zones and impact African agriculture? World Bank Policy Res. Work. Pap. 2008, 4717.
  3. Knox, J.W.; Díaz, J.A.R.; Nixon, D.J.; Mkhwanazi, M. A preliminary assessment of climate change impacts on sugarcane in Swaziland. Agric. Syst. 2010, 103, 63–72.
  4. Lafferty, D.C.; Sriver, R.L.; Haqiqi, I.; Hertel, T.W.; Keller, K.; Nicholas, R.E. Statistically bias-corrected and downscaled climate models underestimate the adverse effects of extreme heat on US maize yields. Commun. Earth Environ. 2021, 2, 196.
  5. Arunrat, N.; Pumijumnong, N.; Sereenonchai, S.; Chareonwong, U.; Wang, C. Assessment of climate change impact on rice yield and water footprint of large-scale and individual farming in Thailand. Sci. Total Environ. 2020, 726, 137864.
  6. Zare, M.; Azam, S.; Sauchyn, D. Simulation of Climate Change Impacts on Crop Yield in the Saskatchewan Grain Belt Using an Improved SWAT Model. Agriculture 2023, 13, 2102.
  7. Arunrat, N.; Sereenonchai, S.; Wang, C. Carbon footprint and predicting the impact of climate change on carbon sequestration ecosystem services of organic rice farming and conventional rice farming: A case study in Phichit province, Thailand. J. Environ. Manag. 2021, 289, 112458.
  8. Guerriero, V.; Scorzini, A.R.; Di Lena, B.; Iulianella, S.; Di Bacco, M.; Tallini, M. Impact of Climate Change on Crop Yields: Insights from the Abruzzo Region, Central Italy. Sustainability 2023, 15, 14235.
  9. Fereres, E.; Evans, R.G. Irrigation of fruit trees and vines: An introduction. Irrig. Sci. 2006, 24, 55–57.
  10. Gago, J.; Douthe, C.; Coopman, R.E.; Gallego, P.P.; Ribas-Carbo, M.; Flexas, J.; Escalona, J.; Medrano, H. UAVs challenge to assess water stress for sustainable agriculture. Agric. Water Manag. 2015, 153, 9–19.
  11. Khan, N.; Ray, R.L.; Sargani, G.R.; Ihtisham, M.; Khayyam, M.; Ismail, S. Current progress and future prospects of agriculture technology: Gateway to sustainable agriculture. Sustainability 2021, 13, 4883. [Google Scholar] [CrossRef]
  12. González-Dugo, M.P.; Moran, M.S.; Mateos, L.; Bryant, R. Canopy temperature variability as an indicator of crop water stress severity. Irrig. Sci. 2006, 24, 233–240. [Google Scholar] [CrossRef]
  13. Zhou, Z.; Majeed, Y.; Naranjo, G.D.; Gambacorta, E.M.T. Assessment for crop water stress with infrared thermal imagery in precision agriculture: A review and future prospects for deep learning applications. Comput. Electron. Agric. 2021, 182, 106019. [Google Scholar] [CrossRef]
  14. Katsoulas, N.; Elvanidi, A.; Ferentinos, K.P.; Kacira, M.; Bartzanas, T.; Kittas, C. Crop reflectance monitoring as a tool for water stress detection in greenhouses: A review. Biosyst. Eng. 2016, 151, 374–398. [Google Scholar] [CrossRef]
  15. Zhong, L.; Hu, L.; Zhou, H. Deep learning based multi-temporal crop classification. Remote Sens. Environ. 2019, 221, 430–443. [Google Scholar] [CrossRef]
  16. Campbell, J.B.; Wynne, R.H. Introduction to Remote Sensing, 5th ed.; Guilford Publications: New York, NY, USA, 2011; ISBN 9781609181772. [Google Scholar]
  17. Galieni, A.; D’Ascenzo, N.; Stagnari, F.; Pagnani, G.; Xie, Q.; Pisante, M. Past and future of plant stress detection: An overview from remote sensing to positron emission tomography. Front. Plant Sci. 2021, 11, 609155. [Google Scholar] [CrossRef]
  18. Bregaglio, S.; Frasso, N.; Pagani, V.; Stella, T.; Francone, C.; Cappelli, G.; Acutis, M.; Balaghi, R.; Ouabbou, H.; Paleari, L. New multi-model approach gives good estimations of wheat yield under semi-arid climate in Morocco. Agron. Sustain. Dev. 2015, 35, 157–167. [Google Scholar] [CrossRef]
  19. Fischer, W.A.; Hemphill, W.R.; Kover, A. Progress in remote sensing (1972–1976). Photogrammetria 1976, 32, 33–72. [Google Scholar] [CrossRef]
  20. Lu, B.; Dao, P.D.; Liu, J.; He, Y.; Shang, J. Recent Advances of Hyperspectral Imaging Technology and Applications in Agriculture. Remote Sens. 2020, 12, 2659. [Google Scholar] [CrossRef]
  21. Holzman, M.E.; Carmona, F.; Rivas, R.; Niclòs, R. Early assessment of crop yield from remotely sensed water stress and solar radiation data. ISPRS J. Photogramm. Remote Sens. 2018, 145, 297–308. [Google Scholar] [CrossRef]
  22. Yuan, L.; Bao, Z.; Zhang, H.; Zhang, Y.; Liang, X. Habitat monitoring to evaluate crop disease and pest distributions based on multi-source satellite remote sensing imagery. Optik 2017, 145, 66–73. [Google Scholar] [CrossRef]
  23. Cetin, M.; Alsenjar, O.; Aksu, H.; Golpinar, M.S.; Akgul, M.A. Estimation of crop water stress index and leaf area index based on remote sensing data. Water Supply 2023, 23, 1390–1404. [Google Scholar] [CrossRef]
  24. Yao, J.; Wu, J.; Xiao, C.; Zhang, Z.; Li, J. The classification method study of crops remote sensing with deep learning, machine learning, and Google Earth engine. Remote Sens. 2022, 14, 2758. [Google Scholar] [CrossRef]
  25. Das Suchi, S.; Menon, A.; Malik, A.; Hu, J.; Gao, J. Crop Identification Based on Remote Sensing Data Using Machine Learning Approaches for Fresno County, California. In Proceedings of the 2021 IEEE Seventh International Conference on Big Data Computing Service and Applications (BigDataService), Oxford, UK, 23–26 August 2021; pp. 115–124. [Google Scholar]
  26. Castaldi, F.; Castrignanò, A.; Casa, R. A data fusion and spatial data analysis approach for the estimation of wheat grain nitrogen uptake from satellite data. Int. J. Remote Sens. 2016, 37, 4317–4336. [Google Scholar] [CrossRef]
  27. Chakraborty, S.K.; Dubey, K. Embedded System for Automatic Real Time Weight Based Grading of Fruits. In Proceedings of the 2017 International Conference on Recent Innovations in Signal processing and Embedded Systems (RISE), Bhopal, India, 27–29 October 2017; pp. 512–515. [Google Scholar]
  28. Chollet, F. Xception: Deep Learning with Depthwise Separable Convolutions. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 1251–1258. [Google Scholar]
  29. Millard, J.P.; Jackson, R.D.; Goettelman, R.C.; Reginato, R.J.; Idso, S.B.; LaPado, R.L. Airborne Monitoring of Crop Canopy Temperatures for Irrigation Scheduling and Yield Prediction. In Proceedings of the International Symposium on Remote Sensing of Environment, Ann Arbor, MI, USA, 25–29 April 1977; pp. 1453–1461. [Google Scholar]
  30. Jha, K.; Doshi, A.; Patel, P.; Shah, M. A comprehensive review on automation in agriculture using artificial intelligence. Artif. Intell. Agric. 2019, 2, 1–12. [Google Scholar] [CrossRef]
  31. Javaid, M.; Haleem, A.; Khan, I.H.; Suman, R. Understanding the potential applications of Artificial Intelligence in Agriculture Sector. Adv. Agrochem. 2023, 2, 15–30. [Google Scholar] [CrossRef]
  32. Ismail; Budiman, D.; Asri, E.; Aidha, Z.R. The Smart Agriculture based on Reconstructed Thermal Image. In Proceedings of the 2022 2nd International Conference on Intelligent Technologies (CONIT), Hubli, India, 24–26 June 2022; pp. 1–6. [Google Scholar]
  33. Gerhards, M.; Schlerf, M.; Mallick, K.; Udelhoven, T. Challenges and Future Perspectives of Multi-/Hyperspectral Thermal Infrared Remote Sensing for Crop Water-Stress Detection: A Review. Remote Sens. 2019, 11, 1240. [Google Scholar] [CrossRef]
  34. Assefa, Y.; Staggenborg, S.A.; Prasad, V.P. V Grain sorghum water requirement and responses to drought stress: A review. Crop Manag. 2010, 9, 1–11. [Google Scholar] [CrossRef]
  35. Crespo, N.; Pádua, L.; Santos, J.A.; Fraga, H. Satellite Remote Sensing Tools for Drought Assessment in Vineyards and Olive Orchards: A Systematic Review. Remote Sens. 2024, 16, 2040. [Google Scholar] [CrossRef]
  36. El Bey, N.; Maazoun, A.; Nahdi, O.; Krima, N.B.; Aounallah, M.K. Water stress indicators in citrus, olive and apple trees: A review. J. Appl. Hortic. 2024, 26, 3–9. [Google Scholar] [CrossRef]
  37. Shi, X.; Han, W.; Zhao, T.; Tang, J. Decision support system for variable rate irrigation based on UAV multispectral remote sensing. Sensors 2019, 19, 2880. [Google Scholar] [CrossRef]
  38. Martínez-Fernández, J.; González-Zamora, A.; Sánchez, N.; Gumuzzio, A. A soil water based index as a suitable agricultural drought indicator. J. Hydrol. 2015, 522, 265–273. [Google Scholar] [CrossRef]
  39. Lahoz, I.; Pérez-de-Castro, A.; Valcárcel, M.; Macua, J.I.; Beltrán, J.; Roselló, S.; Cebolla-Cornejo, J. Effect of water deficit on the agronomical performance and quality of processing tomato. Sci. Hortic. 2016, 200, 55–65. [Google Scholar] [CrossRef]
  40. Alordzinu, K.E.; Li, J.; Lan, Y.; Appiah, S.A.; Al Aasmi, A.; Wang, H. Rapid estimation of crop water stress index on tomato growth. Sensors 2021, 21, 5142. [Google Scholar] [CrossRef] [PubMed]
  41. Zhao, T.; Stark, B.; Chen, Y.; Ray, A.L.; Doll, D. A detailed field study of direct correlations between ground truth crop water stress and normalized difference vegetation index (NDVI) from small unmanned aerial system (sUAS). In Proceedings of the 2015 International Conference on Unmanned Aircraft Systems (ICUAS), Denver, CO, USA, 9–12 June 2015; pp. 520–525. [Google Scholar]
  42. Zarco-Tejada, P.J.; González-Dugo, V.; Williams, L.E.; Suarez, L.; Berni, J.A.J.; Goldhamer, D.; Fereres, E. A PRI-based water stress index combining structural and chlorophyll effects: Assessment using diurnal narrow-band airborne imagery and the CWSI thermal index. Remote Sens. Environ. 2013, 138, 38–50. [Google Scholar] [CrossRef]
  43. Suárez, L.; Zarco-Tejada, P.J.; Berni, J.A.J.; González-Dugo, V.; Fereres, E. Orchard Water Stress detection using high-resolution imagery. In Proceedings of the XXVIII International Horticultural Congress on Science and Horticulture for People (IHC2010), International Symposium, Lisbon, Portugal, 22–27 August 2010; pp. 35–39. [Google Scholar]
  44. Rossini, M.; Fava, F.; Cogliati, S.; Meroni, M.; Marchesi, A.; Panigada, C.; Giardino, C.; Busetto, L.; Migliavacca, M.; Amaducci, S. Assessing canopy PRI from airborne imagery to map water stress in maize. ISPRS J. Photogramm. Remote Sens. 2013, 86, 168–177. [Google Scholar] [CrossRef]
  45. Panigada, C.; Rossini, M.; Meroni, M.; Cilia, C.; Busetto, L.; Amaducci, S.; Boschetti, M.; Cogliati, S.; Picchi, V.; Pinto, F. Fluorescence, PRI and canopy temperature for water stress detection in cereal crops. Int. J. Appl. Earth Obs. Geoinf. 2014, 30, 167–178. [Google Scholar] [CrossRef]
  46. Leroux, L.; Baron, C.; Zoungrana, B.; Traoré, S.B.; Seen, D.L.; Bégué, A. Crop monitoring using vegetation and thermal indices for yield estimates: Case study of a rainfed cereal in semi-arid West Africa. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2015, 9, 347–362. [Google Scholar] [CrossRef]
  47. Dangwal, N.; Patel, N.R.; Kumari, M.; Saha, S.K. Monitoring of water stress in wheat using multispectral indices derived from Landsat-TM. Geocarto. Int. 2016, 31, 682–693. [Google Scholar] [CrossRef]
  48. Li, L.; Zhang, Q.; Huang, D. A review of imaging techniques for plant phenotyping. Sensors 2014, 14, 20078–20111. [Google Scholar] [CrossRef]
  49. Goldstein, A.; Fink, L.; Meitin, A.; Bohadana, S.; Lutenberg, O.; Ravid, G. Applying machine learning on sensor data for irrigation recommendations: Revealing the agronomist’s tacit knowledge. Precis. Agric. 2018, 19, 421–444. [Google Scholar] [CrossRef]
  50. Chandel, N.S.; Chakraborty, S.K.; Rajwade, Y.A.; Dubey, K.; Tiwari, M.K.; Jat, D. Identifying crop water stress using deep learning models. Neural Comput. Appl. 2021, 33, 5353–5367. [Google Scholar] [CrossRef]
  51. Singh, A.K.; Ganapathysubramanian, B.; Sarkar, S.; Singh, A. Deep Learning for Plant Stress Phenotyping: Trends and Future Perspectives. Trends Plant Sci. 2018, 23, 883–898. [Google Scholar] [CrossRef] [PubMed]
  52. Azimi, S.; Gandhi, T.K. Water Stress Identification in Chickpea Images using Machine Learning. In Proceedings of the 2020 IEEE 8th R10 Humanitarian Technology Conference (R10-HTC), Kuching, Malaysia, 1–3 December 2020; pp. 1–6. [Google Scholar]
  53. Zhuang, S.; Wang, P.; Jiang, B.; Li, M. Learned features of leaf phenotype to monitor maize water status in the fields. Comput. Electron. Agric. 2020, 172, 105347. [Google Scholar] [CrossRef]
  54. Wan-Gyu, S.; Jun-Hwan, K.; Jae-Kyeong, B.; Dongwon, K.; Ho-Young, B.; Jung-Il, C.; Myung-Chul, S. Detection of Drought Stress in Soybean Plants using RGB-based Vegetation Indices. Korean J. Agric. For. Meteorol. 2021, 23, 340–348. [Google Scholar] [CrossRef]
  55. Elsherbiny, O.; Zhou, L.; He, Y.; Qiu, Z. A novel hybrid deep network for diagnosing water status in wheat crop using IoT-based multimodal data. Comput. Electron. Agric. 2022, 203, 107453. [Google Scholar] [CrossRef]
  56. Chandel, N.S.; Chakraborty, S.K.; Chandel, A.K.; Dubey, K.; Subeesh, A.; Jat, D.; Rajwade, Y.A. State-of-the-art AI-enabled mobile device for real-time water stress detection of field crops. Eng. Appl. Artif. Intell. 2024, 131, 107863. [Google Scholar] [CrossRef]
  57. Chandel, N.S.; Rajwade, Y.A.; Dubey, K.; Chandel, A.K.; Subeesh, A.; Tiwari, M.K. Water Stress Identification of Winter Wheat Crop with State-of-the-Art AI Techniques and High-Resolution Thermal-RGB Imagery. Plants 2022, 11, 3344. [Google Scholar] [CrossRef]
  58. Zhang, L.; Niu, Y.; Zhang, H.; Han, W.; Li, G.; Tang, J.; Peng, X. Maize Canopy Temperature Extracted From UAV Thermal and RGB Imagery and Its Application in Water Stress Monitoring. Front. Plant Sci. 2019, 10, 1270. [Google Scholar] [CrossRef]
  59. Chandel, A.K.; Khot, L.R.; Molaei, B.; Peters, R.T.; Stöckle, C.O.; Jacoby, P.W. High-Resolution Spatiotemporal Water Use Mapping of Surface and Direct-Root-Zone Drip-Irrigated Grapevines Using UAS-Based Thermal and Multispectral Remote Sensing. Remote Sens. 2021, 13, 954. [Google Scholar] [CrossRef]
  60. Chandel, A.K.; Khot, L.R.; Osroosh, Y.; Peters, T.R. Thermal-RGB imager derived in-field apple surface temperature estimates for sunburn management. Agric. For. Meteorol. 2018, 253–254, 132–140. [Google Scholar] [CrossRef]
  61. Mazare, A.G.; Ionescu, L.M.; Visan, D.; Lita, A.I.; Serban, G. Embedded System for Real Time Analysis of Thermal Images for Prevention of Water Stress on Plants. In Proceedings of the 2018 41st International Spring Seminar on Electronics Technology (ISSE), Zlatibor, Serbia, 16–20 May 2018; pp. 1–6. [Google Scholar]
  62. Gutiérrez, S.; Diago, M.P.; Fernández-Novales, J.; Tardaguila, J. Vineyard water status assessment using on-the-go thermal imaging and machine learning. PLoS ONE 2018, 13, e0192037. [Google Scholar] [CrossRef] [PubMed]
  63. Das, S.; Christopher, J.; Apan, A.; Choudhury, M.R.; Chapman, S.; Menzies, N.W.; Dang, Y.P. Evaluation of water status of wheat genotypes to aid prediction of yield on sodic soils using UAV-thermal imaging and machine learning. Agric. For. Meteorol. 2021, 307, 108477. [Google Scholar] [CrossRef]
  64. Yang, M.; Gao, P.; Zhou, P.; Xie, J.; Sun, D.; Han, X.; Wang, W. Simulating Canopy Temperature Using a Random Forest Model to Calculate the Crop Water Stress Index of Chinese Brassica. Agronomy 2021, 11, 2244. [Google Scholar] [CrossRef]
  65. Elsherbiny, O.; Zhou, L.; Feng, L.; Qiu, Z. Integration of Visible and Thermal Imagery with an Artificial Neural Network Approach for Robust Forecasting of Canopy Water Content in Rice. Remote Sens. 2021, 13, 1785. [Google Scholar] [CrossRef]
  66. Carrasco-Benavides, M.; Gonzalez Viejo, C.; Tongson, E.; Baffico-Hernández, A.; Ávila-Sánchez, C.; Mora, M.; Fuentes, S. Water status estimation of cherry trees using infrared thermal imagery coupled with supervised machine learning modeling. Comput. Electron. Agric. 2022, 200, 107256. [Google Scholar] [CrossRef]
  67. de Melo, L.L.; de Melo, V.G.M.L.; Marques, P.A.A.; Frizzone, J.A.; Coelho, R.D.; Romero, R.A.F.; Barros, T.H.d.S. Deep learning for identification of water deficits in sugarcane based on thermal images. Agric. Water Manag. 2022, 272, 107820. [Google Scholar] [CrossRef]
  68. Aversano, L.; Bernardi, M.L.; Cimitile, M. Water stress classification using Convolutional Deep Neural Networks. JUCS J. Univers. Comput. Sci. 2022, 28, 311–328. [Google Scholar] [CrossRef]
  69. Wu, Y.; Jiang, J.; Zhang, X.; Zhang, J.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; Liu, X. Combining machine learning algorithm and multi-temporal temperature indices to estimate the water status of rice. Agric. Water Manag. 2023, 289, 108521. [Google Scholar] [CrossRef]
  70. Jin, K.; Zhang, J.; Wang, Z.; Zhang, J.; Liu, N.; Li, M.; Ma, Z. Application of deep learning based on thermal images to identify the water stress in cotton under film-mulched drip irrigation. Agric. Water Manag. 2024, 299, 108901. [Google Scholar] [CrossRef]
  71. Wang, J.; Lou, Y.; Wang, W.; Liu, S.; Zhang, H.; Hui, X.; Wang, Y.; Yan, H.; Maes, W.H. A robust model for diagnosing water stress of winter wheat by combining UAV multispectral and thermal remote sensing. Agric. Water Manag. 2024, 291, 108616. [Google Scholar] [CrossRef]
  72. Sezen, S.M.; Yazar, A.; Eker, S. Effect of drip irrigation regimes on yield and quality of field grown bell pepper. Agric. Water Manag. 2006, 81, 115–131. [Google Scholar] [CrossRef]
  73. Kırnak, H.; Kaya, C.; Değirmenci, V. Growth and Yield Parameters of Bell Peppers With Surface and Subsurface Drip Irrigation Systems Under Different Irrigation Levels/Toprak Üstü ve Toprak Altı Damla Sulama Sistemlerinde Farklı Sulama Düzeylerinin Biber Bitkisinin Gelişim ve Verim Özellikl. Atatürk Üniv. Ziraat Fak. Derg. 2002, 33, 383–389. [Google Scholar]
  74. Jackson, R.D.; Idso, S.B.; Reginato, R.J.; Pinter, P.J., Jr. Canopy temperature as a crop water stress indicator. Water Resour. Res. 1981, 17, 1133–1138. [Google Scholar] [CrossRef]
  75. Idso, S.B.; Jackson, R.D.; Pinter, P.J., Jr.; Reginato, R.J.; Hatfield, J.L. Normalizing the stress-degree-day parameter for environmental variability. Agric. Meteorol. 1981, 24, 45–55. [Google Scholar] [CrossRef]
  76. Pradhan, A.; Aher, L.; Hegde, V.; Jangid, K.K.; Rane, J. Cooler canopy leverages sorghum adaptation to drought and heat stress. Sci. Rep. 2022, 12, 4603. [Google Scholar] [CrossRef]
  77. Pinter, P.J.; Zipoli, G.; Reginato, R.J.; Jackson, R.D.; Idso, S.B.; Hohman, J.P. Canopy temperature as an indicator of differential water use and yield performance among wheat cultivars. Agric. Water Manag. 1990, 18, 35–48. [Google Scholar] [CrossRef]
  78. Cohen, Y.; Alchanatis, V.; Meron, M.; Saranga, Y.; Tsipris, J. Estimation of leaf water potential by thermal imagery and spatial analysis. J. Exp. Bot. 2005, 56, 1843–1852. [Google Scholar] [CrossRef]
  79. Berni, J.A.J.; Zarco-Tejada, P.J.; Sepulcre-Cantó, G.; Fereres, E.; Villalobos, F. Mapping canopy conductance and CWSI in olive orchards using high resolution thermal remote sensing imagery. Remote Sens. Environ. 2009, 113, 2380–2388. [Google Scholar] [CrossRef]
  80. Garrot, D.J., Jr.; Kilby, M.W.; Stedman, S.W.; Fangmeier, D.D.; Ottman, M.J.; Harper, J.M.; Husman, S.H.; Ray, D.T. Irrigation Scheduling Using the Crop Water Stress Index in Arizona; ASAE Publication: Washington, DC, USA, 1990; pp. 281–286. [Google Scholar]
  81. Alderfasi, A.A.; Nielsen, D.C. Use of crop water stress index for monitoring water status and scheduling irrigation in wheat. Agric. Water Manag. 2001, 47, 69–75. [Google Scholar] [CrossRef]
  82. Barnes, E.M.; Pinter, P.J., Jr.; Kimball, B.A.; Hunsaker, D.J.; Wall, G.W.; LaMorte, R.L. Precision irrigation management using modeling and remote sensing approaches. In Proceedings of the Fourth Decennial Symposium, National Irrigation Symposium, Phoenix, AZ, USA, 14–16 November 2000; pp. 332–337. [Google Scholar]
  83. Jones, H.G. Use of infrared thermometry for estimation of stomatal conductance as a possible aid to irrigation scheduling. Agric. For. Meteorol. 1999, 95, 139–149. [Google Scholar] [CrossRef]
  84. Jackson, R.D.; Kustas, W.P.; Choudhury, B.J. A reexamination of the crop water stress index. Irrig. Sci. 1988, 9, 309–317. [Google Scholar] [CrossRef]
  85. Hipps, L.E.; Asrar, G.; Kanemasu, E.T. A theoretically-based normalization of environmental effects on foliage temperature. Agric. For. Meteorol. 1985, 35, 113–122. [Google Scholar] [CrossRef]
  86. King, B.A.; Shellie, K.C.; Tarkalson, D.D.; Levin, A.D.; Sharma, V.; Bjorneberg, D.L. Data-Driven Models for Canopy Temperature-Based Irrigation Scheduling. Trans. ASABE 2020, 63, 1579–1592. [Google Scholar] [CrossRef]
  87. Cherie Workneh, A.; Hari Prasad, K.S.; Ojha, C.S. Elucidating the prediction capability of neural network model for estimation of crop water stress index of rice. ISH J. Hydraul. Eng. 2023, 29, 92–103. [Google Scholar] [CrossRef]
  88. Pradawet, C.; Khongdee, N.; Pansak, W.; Spreer, W.; Hilger, T.; Cadisch, G. Thermal imaging for assessment of maize water stress and yield prediction under drought conditions. J. Agron. Crop Sci. 2023, 209, 56–70. [Google Scholar] [CrossRef]
  89. Katimbo, A.; Rudnick, D.R.; Zhang, J.; Ge, Y.; DeJonge, K.C.; Franz, T.E.; Shi, Y.; Liang, W.; Qiao, X.; Heeren, D.M.; et al. Evaluation of artificial intelligence algorithms with sensor data assimilation in estimating crop evapotranspiration and crop water stress index for irrigation water management. Smart Agric. Technol. 2023, 4, 100176. [Google Scholar] [CrossRef]
  90. Pei, S.; Dai, Y.; Bai, Z.; Li, Z.; Zhang, F.; Yin, F.; Fan, J. Improved estimation of canopy water status in cotton using vegetation indices along with textural information from UAV-based multispectral images. Comput. Electron. Agric. 2024, 224, 109176. [Google Scholar] [CrossRef]
  91. Kumar, N.; Shankar, V. Application of artificial intelligence-based modelling for the prediction of crop water stress index. Res. Sq. 2024. [Google Scholar] [CrossRef]
  92. Chen, H.; Chen, H.; Zhang, S.; Chen, S.; Cen, F.; Zhao, Q.; Huang, X.; He, T.; Gao, Z. Comparison of CWSI and Ts-Ta-VIs in moisture monitoring of dryland crops (sorghum and maize) based on UAV remote sensing. J. Integr. Agric. 2024, 23, 2458–2475. [Google Scholar] [CrossRef]
  93. Bounoua, I.; Saidi, Y.; Yaagoubi, R.; Bouziani, M. Deep Learning Approaches for Water Stress Forecasting in Arboriculture Using Time Series of Remote Sensing Images: Comparative Study between ConvLSTM and CNN-LSTM Models. Technologies 2024, 12, 77. [Google Scholar] [CrossRef]
  94. Kapari, M.; Sibanda, M.; Magidi, J.; Mabhaudhi, T.; Nhamo, L.; Mpandeli, S. Comparing Machine Learning Algorithms for Estimating the Maize Crop Water Stress Index (CWSI) Using UAV-Acquired Remotely Sensed Data in Smallholder Croplands. Drones 2024, 8, 61. [Google Scholar] [CrossRef]
  95. Muni, N.L.; Aditi, Y.; Hitesh, U.; Gopal, D.S. Prediction of Crop Water Stress Index (CWSI) Using Machine Learning Algorithms. World Environ. Water Resour. Congr. 2024, 2024, 969–980. [Google Scholar] [CrossRef]
  96. Green, R.O.; Eastwood, M.L.; Sarture, C.M.; Chrien, T.G.; Aronsson, M.; Chippendale, B.J.; Faust, J.A.; Pavri, B.E.; Chovit, C.J.; Solis, M.; et al. Imaging Spectroscopy and the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS). Remote Sens. Environ. 1998, 65, 227–248. [Google Scholar] [CrossRef]
  97. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J.J. Hyperspectral Imaging: A Review on UAV-Based Sensors, Data Processing and Applications for Agriculture and Forestry. Remote Sens. 2017, 9, 1110. [Google Scholar] [CrossRef]
  98. Krishna, G.; Sahoo, R.N.; Singh, P.; Bajpai, V.; Patra, H.; Kumar, S.; Dandapani, R.; Gupta, V.K.; Viswanathan, C.; Ahmad, T.; et al. Comparison of various modelling approaches for water deficit stress monitoring in rice crop through hyperspectral remote sensing. Agric. Water Manag. 2019, 213, 231–244. [Google Scholar] [CrossRef]
  99. Loggenberg, K.; Strever, A.; Greyling, B.; Poona, N. Modelling water stress in a Shiraz vineyard using hyperspectral imaging and machine learning. Remote Sens. 2018, 10, 202. [Google Scholar] [CrossRef]
  100. Osco, L.P.; Ramos, A.P.; Moriya, É.A.; Bavaresco, L.G.; Lima, B.C.; Estrabis, N.; Pereira, D.R.; Creste, J.E.; Júnior, J.M.; Gonçalves, W.N.; et al. Modeling Hyperspectral Response of Water-Stress Induced Lettuce Plants Using Artificial Neural Networks. Remote Sens. 2019, 11, 2797. [Google Scholar] [CrossRef]
  101. Asaari, M.S.M.; Mertens, S.; Dhondt, S.; Inzé, D.; Wuyts, N.; Scheunders, P. Analysis of hyperspectral images for detection of drought stress and recovery in maize plants in a high-throughput phenotyping platform. Comput. Electron. Agric. 2019, 162, 749–758. [Google Scholar] [CrossRef]
  102. Nasir, R.; Khan, M.J.; Arshad, M.; Khurshid, K. Convolutional Neural Network based Regression for Leaf Water Content Estimation. In Proceedings of the 2019 Second International Conference on Latest trends in Electrical Engineering and Computing Technologies (INTELLECT), Karachi, Pakistan, 13–14 November 2019; pp. 1–5. [Google Scholar]
  103. Sobejano-Paz, V.; Mikkelsen, T.N.; Baum, A.; Mo, X.; Liu, S.; Köppl, C.J.; Johnson, M.S.; Gulyas, L.; García, M. Hyperspectral and Thermal Sensing of Stomatal Conductance, Transpiration, and Photosynthesis for Soybean and Maize under Drought. Remote Sens. 2020, 12, 3182. [Google Scholar] [CrossRef]
  104. Sankararao, A.U.G.; Priyanka, G.; Rajalakshmi, P.; Choudhary, S. CNN Based Water Stress Detection in Chickpea Using UAV Based Hyperspectral Imaging. In Proceedings of the 2021 IEEE International India Geoscience and Remote Sensing Symposium (InGARSS), Ahmedabad, India, 6–10 December 2021; pp. 145–148. [Google Scholar]
  105. Duarte-Carvajalino, J.M.; Silva-Arero, E.A.; Góez-Vinasco, G.A.; Torres-Delgado, L.M.; Ocampo-Paez, O.D.; Castaño-Marín, A.M. Estimation of water stress in potato plants using hyperspectral imagery and machine learning algorithms. Horticulturae 2021, 7, 176. [Google Scholar] [CrossRef]
  106. Niu, Y.; Han, W.; Zhang, H.; Zhang, L.; Chen, H. Estimating fractional vegetation cover of maize under water stress from UAV multispectral imagery using machine learning algorithms. Comput. Electron. Agric. 2021, 189, 106414. [Google Scholar] [CrossRef]
  107. Mohite, J.; Sawant, S.; Agarwal, R.; Pandit, A.; Pappula, S. Detection of Crop Water Stress in Maize Using Drone Based Hyperspectral Imaging. In Proceedings of the IGARSS 2022—2022 IEEE International Geoscience and Remote Sensing Symposium, Kuala Lumpur, Malaysia, 17–22 July 2022; pp. 5957–5960. [Google Scholar]
  108. Sankararao, A.U.G.; Rajalakshmi, P.; Kaliamoorthy, S.; Choudhary, S. Water Stress Detection in Pearl Millet Canopy with Selected Wavebands using UAV Based Hyperspectral Imaging and Machine Learning. In Proceedings of the 2022 IEEE Sensors Applications Symposium (SAS), Sundsvall, Sweden, 1–3 August 2022; pp. 1–6. [Google Scholar]
  109. Thapa, S.; Kang, C.; Diverres, G.; Karkee, M.; Zhang, Q.; Keller, M. Assessment of Water Stress in Vineyards Using On-The-Go Hyperspectral Imaging and Machine Learning Algorithms. J. ASABE 2022, 65, 949–962. [Google Scholar] [CrossRef]
  110. Sankararao, A.U.G.; Rajalakshmi, P.; Choudhary, S. Machine Learning-Based Ensemble Band Selection for Early Water Stress Identification in Groundnut Canopy Using UAV-Based Hyperspectral Imaging. IEEE Geosci. Remote Sens. Lett. 2023, 20, 1–5. [Google Scholar] [CrossRef]
  111. Mertens, S.; Verbraeken, L.; Sprenger, H.; De Meyer, S.; Demuynck, K.; Cannoot, B.; Merchie, J.; De Block, J.; Vogel, J.T.; Bruce, W.; et al. Monitoring of drought stress and transpiration rate using proximal thermal and hyperspectral imaging in an indoor automated plant phenotyping platform. Plant Methods 2023, 19, 132. [Google Scholar] [CrossRef] [PubMed]
  112. Kang, C.; Diverres, G.; Achyut, P.; Karkee, M.; Zhang, Q.; Keller, M. Estimating soil and grapevine water status using ground based hyperspectral imaging under diffused lighting conditions: Addressing the effect of lighting variability in vineyards. Comput. Electron. Agric. 2023, 212, 108175. [Google Scholar] [CrossRef]
  113. Zhuang, T.; Zhang, Y.; Li, D.; Schmidhalter, U.; Ata-UI-Karim, S.T.; Cheng, T.; Liu, X.; Tian, Y.; Zhu, Y.; Cao, W.; et al. Coupling continuous wavelet transform with machine learning to improve water status prediction in winter wheat. Precis. Agric. 2023, 24, 2171–2199. [Google Scholar] [CrossRef]
  114. Mao, B.; Cheng, Q.; Chen, L.; Duan, F.; Sun, X.; Li, Y.; Li, Z.; Zhai, W.; Ding, F.; Li, H.; et al. Multi-random ensemble on Partial Least Squares regression to predict wheat yield and its losses across water and nitrogen stress with hyperspectral remote sensing. Comput. Electron. Agric. 2024, 222, 109046. [Google Scholar] [CrossRef]
  115. Malounas, I.; Paliouras, G.; Nikolopoulos, D.; Liakopoulos, G.; Bresta, P.; Londra, P.; Katsileros, A.; Fountas, S. Early detection of broccoli drought acclimation/stress in agricultural environments utilizing proximal hyperspectral imaging and AutoML. Smart Agric. Technol. 2024, 8, 100463. [Google Scholar] [CrossRef]
  116. Zhang, X.; Xu, H.; She, Y.; Hu, C.; Zhu, T.; Wang, L.; Wu, L.; You, C.; Ke, J.; Zhang, Q.; et al. Improving the prediction performance of leaf water content by coupling multi-source data with machine learning in rice (Oryza sativa L.). Plant Methods 2024, 20, 48. [Google Scholar] [CrossRef]
  117. El Naqa, I.; Murphy, M.J. What Is Machine Learning? Springer: Berlin/Heidelberg, Germany, 2015; ISBN 3319183044. [Google Scholar]
  118. Jordan, M.I.; Mitchell, T.M. Machine learning: Trends, perspectives, and prospects. Science 2015, 349, 255–260. [Google Scholar] [CrossRef]
  119. Cunningham, P.; Cord, M.; Delany, S.J. Supervised Learning. In Machine Learning Techniques for Multimedia: Case Studies on Organization and Retrieval; Springer: Berlin/Heidelberg, Germany, 2008; pp. 21–49. [Google Scholar]
  120. Ghahramani, Z. Unsupervised Learning. In Summer School on Machine Learning; Springer: Berlin/Heidelberg, Germany, 2003; pp. 72–112. [Google Scholar]
  121. Wiering, M.A.; Van Otterlo, M. Reinforcement learning. Adapt. Learn. Optim. 2012, 12, 729. [Google Scholar]
  122. Chen, Y.; Yu, Z.; Han, Z.; Sun, W.; He, L. A Decision-Making System for Cotton Irrigation Based on Reinforcement Learning Strategy. Agronomy 2024, 14, 1–17. [Google Scholar] [CrossRef]
  123. Homod, R.Z.; Mohammed, H.I.; Abderrahmane, A.; Alawi, O.A.; Khalaf, O.I.; Mahdi, J.M.; Guedri, K.; Dhaidan, N.S.; Albahri, A.S.; Sadeq, A.M.; et al. Deep clustering of Lagrangian trajectory for multi-task learning to energy saving in intelligent buildings using cooperative multi-agent. Appl. Energy 2023, 351, 121843. [Google Scholar] [CrossRef]
  124. Homod, R.Z.; Munahi, B.S.; Mohammed, H.I.; Albadr, M.A.A.; Abderrahmane, A.; Mahdi, J.M.; Ben Hamida, M.B.; Alhasnawi, B.N.; Albahri, A.S.; Togun, H.; et al. Deep clustering of reinforcement learning based on the bang-bang principle to optimize the energy in multi-boiler for intelligent buildings. Appl. Energy 2024, 356, 122357. [Google Scholar] [CrossRef]
  125. Homod, R.Z.; Mohammed, H.I.; Ben Hamida, M.B.; Albahri, A.S.; Alhasnawi, B.N.; Albahri, O.S.; Alamoodi, A.H.; Mahdi, J.M.; Albadr, M.A.A.; Yaseen, Z.M. Optimal shifting of peak load in smart buildings using multiagent deep clustering reinforcement learning in multi-tank chilled water systems. J. Energy Storage 2024, 92, 112140. [Google Scholar] [CrossRef]
  126. Bray, M.; Han, D. Identification of support vector machines for runoff modelling. J. Hydroinformatics 2004, 6, 265–280. [Google Scholar] [CrossRef]
  127. Mountrakis, G.; Im, J.; Ogole, C. Support vector machines in remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2011, 66, 247–259. [Google Scholar] [CrossRef]
  128. Belousov, A.I.; Verzakov, S.A.; von Frese, J. A flexible classification approach with optimal generalisation performance: Support vector machines. Chemom. Intell. Lab. Syst. 2002, 64, 15–25. [Google Scholar] [CrossRef]
  129. Kang, J.; Zhang, H.; Yang, H.; Zhang, L. Support Vector Machine Classification of Crop Lands Using Sentinel-2 Imagery. In Proceedings of the 2018 7th International Conference on Agro-geoinformatics (Agro-geoinformatics), Hangzhou, China, 6–9 August 2018; pp. 1–6. [Google Scholar]
  130. Zheng, B.; Myint, S.W.; Thenkabail, P.S.; Aggarwal, R.M. A support vector machine to identify irrigated crop types using time-series Landsat NDVI data. Int. J. Appl. Earth Obs. Geoinf. 2015, 34, 103–112. [Google Scholar] [CrossRef]
  131. Soumaya, Z.; Drissi Taoufiq, B.; Benayad, N.; Yunus, K.; Abdelkrim, A. The detection of Parkinson disease using the genetic algorithm and SVM classifier. Appl. Acoust. 2021, 171, 107528. [Google Scholar] [CrossRef]
  132. Usha Kumari, C.; Sampath Dakshina Murthy, A.; Lakshmi Prasanna, B.; Pala Prasad Reddy, M.; Kumar Panigrahy, A. An automated detection of heart arrhythmias using machine learning technique: SVM. Mater. Today Proc. 2021, 45, 1393–1398. [Google Scholar] [CrossRef]
133. Fan, M.; Sharma, A. Design and implementation of construction cost prediction model based on SVM and LSSVM in Industry 4.0. Int. J. Intell. Comput. Cybern. 2021, 14, 145–157. [Google Scholar] [CrossRef]
  134. Tellez Gaytan, J.C.; Ateeq, K.; Rafiuddin, A.; Alzoubi, H.M.; Ghazal, T.M.; Ahanger, T.A.; Chaudhary, S.; Viju, G.K. AI-Based Prediction of Capital Structure: Performance Comparison of ANN SVM and LR Models. Comput. Intell. Neurosci. 2022, 2022, 8334927. [Google Scholar] [CrossRef] [PubMed]
  135. Hussain, M.G.; Hasan, M.R.; Rahman, M.; Protim, J.; Hasan, S.A. Detection of Bangla Fake News using MNB and SVM Classifier. In Proceedings of the 2020 International Conference on Computing, Electronics & Communications Engineering (iCCECE), Southend, UK, 17–18 August 2020; pp. 81–85. [Google Scholar]
136. Tripathi, M. Sentiment analysis of Nepali COVID-19 tweets using NB, SVM and LSTM. J. Artif. Intell. 2021, 3, 151–168. [Google Scholar]
  137. Awad, M. Google Earth Engine (GEE) Cloud Computing Based Crop Classification Using Radar, Optical Images and Support Vector Machine Algorithm (SVM). In Proceedings of the 2021 IEEE 3rd International Multidisciplinary Conference on Engineering Technology (IMCET), Beirut, Lebanon, 8–10 December 2021; pp. 71–76. [Google Scholar]
  138. Dash, R.; Dash, D.K.; Biswal, G.C. Classification of crop based on macronutrients and weather data using machine learning techniques. Results Eng. 2021, 9, 100203. [Google Scholar] [CrossRef]
139. Fegade, T.K.; Pawar, B.V. Crop Prediction Using Artificial Neural Network and Support Vector Machine. In Data Management, Analytics and Innovation; Sharma, N., Chakrabarti, A., Balas, V.E., Eds.; Springer: Singapore, 2020; pp. 311–324. [Google Scholar]
  140. Teja, M.S.; Preetham, T.S.; Sujihelen, L.; Christy; Jancy, S.; Selvan, M.P. Crop Recommendation and Yield Production using SVM Algorithm. In Proceedings of the 2022 6th International Conference on Intelligent Computing and Control Systems (ICICCS), Madurai, India, 25–27 May 2022; pp. 1768–1771. [Google Scholar]
141. Wold, H.O.A. Soft Modelling: The Basic Design and Some Extensions. In Systems under Indirect Observation: Part II; North-Holland: Amsterdam, The Netherlands, 1982; pp. 36–37. [Google Scholar]
  142. Porker, K.; Coventry, S.; Fettell, N.A.; Cozzolino, D.; Eglinton, J. Using a novel PLS approach for envirotyping of barley phenology and adaptation. F. Crop. Res. 2020, 246, 107697. [Google Scholar] [CrossRef]
  143. Gefen, D.; Straub, D.; Boudreau, M.-C. Structural equation modeling and regression: Guidelines for research practice. Commun. Assoc. Inf. Syst. 2000, 4, 7. [Google Scholar] [CrossRef]
  144. Wold, S.; Sjöström, M.; Eriksson, L. PLS-regression: A basic tool of chemometrics. Chemom. Intell. Lab. Syst. 2001, 58, 109–130. [Google Scholar] [CrossRef]
145. Albersmeier, F.; Spiller, A. Die Reputation der Fleischwirtschaft: Eine Kausalanalyse [The reputation of the meat industry: A causal analysis]. Ger. J. Agric. Econ. 2010, 59, 258–270. [Google Scholar]
  146. Weiber, R.; Mühlhaus, D. Strukturgleichungsmodellierung; Springer: Berlin/Heidelberg, Germany, 2014; ISBN 3642350127. [Google Scholar]
  147. Lasalvia, M.; Capozzi, V.; Perna, G. A Comparison of PCA-LDA and PLS-DA Techniques for Classification of Vibrational Spectra. Appl. Sci. 2022, 12, 5345. [Google Scholar] [CrossRef]
  148. Maćkiewicz, A.; Ratajczak, W. Principal components analysis (PCA). Comput. Geosci. 1993, 19, 303–342. [Google Scholar] [CrossRef]
  149. AlHamad, M.; Akour, I.; Alshurideh, M.; Al-Hamad, A.; Kurdi, B.; Alzoubi, H. Predicting the intention to use google glass: A comparative approach using machine learning models and PLS-SEM. Int. J. Data Netw. Sci. 2021, 5, 311–320. [Google Scholar] [CrossRef]
  150. Kono, S.; Sato, M. The potentials of partial least squares structural equation modeling (PLS-SEM) in leisure research. J. Leis. Res. 2023, 54, 309–329. [Google Scholar] [CrossRef]
  151. Hair, J.; Alamer, A. Partial Least Squares Structural Equation Modeling (PLS-SEM) in second language and education research: Guidelines using an applied example. Res. Methods Appl. Linguist. 2022, 1, 100027. [Google Scholar] [CrossRef]
  152. Sarstedt, M.; Hair, J.F.; Pick, M.; Liengaard, B.D.; Radomir, L.; Ringle, C.M. Progress in partial least squares structural equation modeling use in marketing research in the last decade. Psychol. Mark. 2022, 39, 1035–1064. [Google Scholar] [CrossRef]
  153. Prihtanti, T.M.; Zebua, N.T. Agricultural extension workers’ perception, usage, and satisfaction in use of internet in the Islands region of South Nias Regency, Indonesia (An Analysis using SEM-PLS Model). World J. Adv. Res. Rev. 2023, 19, 346–362. [Google Scholar] [CrossRef]
  154. Rübcke von Veltheim, F.; Theuvsen, L.; Heise, H. German farmers’ intention to use autonomous field robots: A PLS-analysis. Precis. Agric. 2022, 23, 670–697. [Google Scholar] [CrossRef]
  155. Tama, R.A.; Hoque, M.M.; Liu, Y.; Alam, M.J.; Yu, M. An Application of Partial Least Squares Structural Equation Modeling (PLS-SEM) to Examining Farmers’ Behavioral Attitude and Intention towards Conservation Agriculture in Bangladesh. Agriculture 2023, 13, 503. [Google Scholar] [CrossRef]
  156. Schmidhuber, J. Deep learning in neural networks: An overview. Neural Netw. 2015, 61, 85–117. [Google Scholar] [CrossRef] [PubMed]
  157. LeCun, Y.; Bengio, Y. Convolutional networks for images, speech, and time series. Handb. Brain Theory Neural Netw. 1995, 3361, 1995. [Google Scholar]
  158. LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef] [PubMed]
  159. Pan, S.J.; Yang, Q. A survey on transfer learning. IEEE Trans. Knowl. Data Eng. 2009, 22, 1345–1359. [Google Scholar] [CrossRef]
  160. Cao, C.; Liu, F.; Tan, H.; Song, D.; Shu, W.; Li, W.; Zhou, Y.; Bo, X.; Xie, Z. Deep learning and its applications in biomedicine. Genom. Proteom. Bioinforma. 2018, 16, 17–32. [Google Scholar] [CrossRef] [PubMed]
  161. Santos, L.; Santos, F.N.; Oliveira, P.M.; Shinde, P. Deep Learning Applications in Agriculture: A short review. In Robot 2019: Fourth Iberian Robotics Conference, Proceedings of ROBOT 2019—the Fourth Iberian Robotics Conference, Porto, Portugal, 20–22 November 2019; Springer: Cham, Switzerland, 2020; pp. 139–151. [Google Scholar]
  162. McCulloch, W.S.; Pitts, W. A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biophys. 1943, 5, 115–133. [Google Scholar] [CrossRef]
  163. Rumelhart, D.E.; Hinton, G.E.; Williams, R.J. Learning Internal Representations by Error Propagation. In Parallel Distributed Processing, Explorations in the Microstructure of Cognition; Rumelhart, D.E., Mcclelland, J.L., Eds.; MIT Press: Cambridge, MA, USA, 1986; Volume 1. [Google Scholar]
  164. Zurada, J. Introduction to Artificial Neural Systems; West Publishing Co.: Eagan, MN, USA, 1992; ISBN 0314933913. [Google Scholar]
  165. Dai, X.; Huo, Z.; Wang, H. Simulation for response of crop yield to soil moisture and salinity with artificial neural network. F. Crop. Res. 2011, 121, 441–449. [Google Scholar] [CrossRef]
  166. Agatonovic-Kustrin, S.; Beresford, R. Basic concepts of artificial neural network (ANN) modeling and its application in pharmaceutical research. J. Pharm. Biomed. Anal. 2000, 22, 717–727. [Google Scholar] [CrossRef]
  167. Palani, S.; Liong, S.-Y.; Tkalich, P. An ANN application for water quality forecasting. Mar. Pollut. Bull. 2008, 56, 1586–1597. [Google Scholar] [CrossRef]
  168. Hsu, K.; Gupta, H.V.; Sorooshian, S. Artificial neural network modeling of the rainfall-runoff process. Water Resour. Res. 1995, 31, 2517–2530. [Google Scholar] [CrossRef]
  169. Zhang, J.-Q.; Zhang, L.-X.; Zhang, M.-H.; Watson, C. Prediction of soybean growth and development using artificial neural network and statistical models. Acta Agron. Sin. 2009, 35, 341–347. [Google Scholar] [CrossRef]
  170. Khazaei, J.; Naghavi, M.R.; Jahansouz, M.R.; Salimi-Khorshidi, G. Yield estimation and clustering of chickpea genotypes using soft computing techniques. Agron. J. 2008, 100, 1077–1087. [Google Scholar] [CrossRef]
  171. Liu, X.; Kang, S.; Li, F. Simulation of artificial neural network model for trunk sap flow of Pyrus pyrifolia and its comparison with multiple-linear regression. Agric. Water Manag. 2009, 96, 939–945. [Google Scholar] [CrossRef]
  172. Jena, P.R.; Majhi, B.; Kalli, R.; Majhi, R. Prediction of crop yield using climate variables in the south-western province of India: A functional artificial neural network modeling (FLANN) approach. Environ. Dev. Sustain. 2023, 25, 11033–11056. [Google Scholar] [CrossRef]
  173. Poblete, T.; Ortega-Farías, S.; Moreno, M.A.; Bardeen, M. Artificial Neural Network to Predict Vine Water Status Spatial Variability Using Multispectral Information Obtained from an Unmanned Aerial Vehicle (UAV). Sensors 2017, 17, 2488. [Google Scholar] [CrossRef] [PubMed]
  174. Martí, P.; Gasque, M.; González-Altozano, P. An artificial neural network approach to the estimation of stem water potential from frequency domain reflectometry soil moisture measurements and meteorological data. Comput. Electron. Agric. 2013, 91, 75–86. [Google Scholar] [CrossRef]
  175. Romero, M.; Luo, Y.; Su, B.; Fuentes, S. Vineyard water status estimation using multispectral imagery from an UAV platform and machine learning algorithms for irrigation scheduling management. Comput. Electron. Agric. 2018, 147, 109–117. [Google Scholar] [CrossRef]
  176. Fukushima, K. Neocognitron: A self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position. Biol. Cybern. 1980, 36, 193–202. [Google Scholar] [CrossRef] [PubMed]
  177. Le Cun, Y.; Boser, B.; Denker, J.S.; Henderson, D.; Howard, R.E.; Hubbard, W.; Jackel, L.D. Handwritten Digit Recognition with a Back Propagation Network. In Advances in Neural Information Processing Systems 2; AT&T Bell Laboratories: Holmdel, NJ, USA, 1995; pp. 53–60. [Google Scholar]
  178. Gu, J.; Wang, Z.; Kuen, J.; Ma, L.; Shahroudy, A.; Shuai, B.; Liu, T.; Wang, X.; Wang, G.; Cai, J.; et al. Recent advances in convolutional neural networks. Pattern Recognit. 2018, 77, 354–377. [Google Scholar] [CrossRef]
  179. Yu, F.; Zhang, Q.; Xiao, J.; Ma, Y.; Wang, M.; Luan, R.; Liu, X.; Ping, Y.; Nie, Y.; Tao, Z.; et al. Progress in the Application of CNN-Based Image Classification and Recognition in Whole Crop Growth Cycles. Remote Sens. 2023, 15, 2988. [Google Scholar] [CrossRef]
  180. Wang, Y.; Zhang, Z.; Feng, L.; Ma, Y.; Du, Q. A new attention-based CNN approach for crop mapping using time series Sentinel-2 images. Comput. Electron. Agric. 2021, 184, 106090. [Google Scholar] [CrossRef]
  181. Agarwal, M.; Gupta, S.K.; Biswas, K.K. Development of Efficient CNN model for Tomato crop disease identification. Sustain. Comput. Inform. Syst. 2020, 28, 100407. [Google Scholar] [CrossRef]
  182. Thakur, P.S.; Sheorey, T.; Ojha, A. VGG-ICNN: A Lightweight CNN model for crop disease identification. Multimed. Tools Appl. 2023, 82, 497–520. [Google Scholar] [CrossRef]
  183. Peyal, H.I.; Nahiduzzaman, M.; Pramanik, M.A.H.; Syfullah, M.K.; Shahriar, S.M.; Sultana, A.; Ahsan, M.; Haider, J.; Khandakar, A.; Chowdhury, M.E.H. Plant Disease Classifier: Detection of Dual-Crop Diseases Using Lightweight 2D CNN Architecture. IEEE Access 2023, 11, 110627–110643. [Google Scholar] [CrossRef]
  184. Mehta, S.; Kukreja, V.; Vats, S. Improving Crop Health Management: Federated Learning CNN for Spinach Leaf Disease Detection. In Proceedings of the 2023 3rd International Conference on Intelligent Technologies (CONIT), Hubli, India, 23–25 June 2023; pp. 1–6. [Google Scholar]
  185. Jiang, H.; Zhang, C.; Qiao, Y.; Zhang, Z.; Zhang, W.; Song, C. CNN feature based graph convolutional network for weed and crop recognition in smart farming. Comput. Electron. Agric. 2020, 174, 105450. [Google Scholar] [CrossRef]
  186. Mao, M.; Zhao, H.; Tang, G.; Ren, J. In-Season Crop Type Detection by Combing Sentinel-1A and Sentinel-2 Imagery Based on the CNN Model. Agronomy 2023, 13, 1723. [Google Scholar] [CrossRef]
187. Nevavuori, P.; Narra, N.; Linna, P.; Lipping, T. Assessment of Crop Yield Prediction Capabilities of CNN Using Multisource Data. In New Developments and Environmental Applications of Drones; Lipping, T., Linna, P., Narra, N., Eds.; Springer International Publishing: Cham, Switzerland, 2022; pp. 173–186. [Google Scholar]
  188. Dasarathy, B.V.; Sheela, B.V. A composite classifier system design: Concepts and methodology. Proc. IEEE 1979, 67, 708–713. [Google Scholar] [CrossRef]
189. Polikar, R. Ensemble Learning. In Ensemble Machine Learning: Methods and Applications; Zhang, C., Ma, Y., Eds.; Springer: New York, NY, USA, 2012; pp. 1–34. ISBN 978-1-4419-9326-7. [Google Scholar]
  190. Huang, F.; Xie, G.; Xiao, R. Research on Ensemble Learning. In Proceedings of the 2009 International Conference on Artificial Intelligence and Computational Intelligence, Shanghai, China, 7–8 November 2009; Volume 3, pp. 249–252. [Google Scholar]
  191. Sagi, O.; Rokach, L. Ensemble learning: A survey. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2018, 8, e1249. [Google Scholar] [CrossRef]
  192. Mohammed, A.; Kora, R. A comprehensive review on ensemble deep learning: Opportunities and challenges. J. King Saud Univ. Comput. Inf. Sci. 2023, 35, 757–774. [Google Scholar] [CrossRef]
  193. Baradaran Rezaei, H.; Amjadian, A.; Sebt, M.V.; Askari, R.; Gharaei, A. An ensemble method of the machine learning to prognosticate the gastric cancer. Ann. Oper. Res. 2023, 328, 151–192. [Google Scholar] [CrossRef]
  194. Gaikwad, D.P.; Thool, R.C. Intrusion Detection System Using Bagging Ensemble Method of Machine Learning. In Proceedings of the 2015 International Conference on Computing Communication Control and Automation, Pune, India, 26–27 February 2015; pp. 291–295. [Google Scholar]
195. Mohammed, A.; Kora, R. An effective ensemble deep learning framework for text classification. J. King Saud Univ. Comput. Inf. Sci. 2022, 34, 8825–8837. [Google Scholar] [CrossRef]
  196. Xiao, Y.; Wu, J.; Lin, Z.; Zhao, X. A deep learning-based multi-model ensemble method for cancer prediction. Comput. Methods Programs Biomed. 2018, 153, 1–9. [Google Scholar] [CrossRef]
  197. Ahmad, I.; Yousaf, M.; Yousaf, S.; Ahmad, M.O. Fake news detection using machine learning ensemble methods. Complexity 2020, 2020, 8885861. [Google Scholar] [CrossRef]
198. Chakir, O.; Rehaimi, A.; Sadqi, Y.; Krichen, M.; Gaba, G.S.; Gurtov, A. An empirical assessment of ensemble methods and traditional machine learning techniques for web-based attack detection in industry 5.0. J. King Saud Univ. Comput. Inf. Sci. 2023, 35, 103–119. [Google Scholar] [CrossRef]
  199. Lin, C.; Xu, J.; Hou, J.; Liang, Y.; Mei, X. Ensemble method with heterogeneous models for battery state-of-health estimation. IEEE Trans. Ind. Inform. 2023, 19, 10160–10169. [Google Scholar] [CrossRef]
  200. Kisi, O.; Alizamir, M.; Docheshmeh Gorgij, A. Dissolved oxygen prediction using a new ensemble method. Environ. Sci. Pollut. Res. 2020, 27, 9589–9603. [Google Scholar] [CrossRef] [PubMed]
  201. Talukder, M.S.H.; Sarkar, A.K. Nutrients deficiency diagnosis of rice crop by weighted average ensemble learning. Smart Agric. Technol. 2023, 4, 100155. [Google Scholar] [CrossRef]
  202. Iniyan, S.; Jebakumar, R. Mutual Information Feature Selection (MIFS) Based Crop Yield Prediction on Corn and Soybean Crops Using Multilayer Stacked Ensemble Regression (MSER). Wirel. Pers. Commun. 2022, 126, 1935–1964. [Google Scholar] [CrossRef]
  203. Kasinathan, T.; Uyyala, S.R. Machine learning ensemble with image processing for pest identification and classification in field crops. Neural Comput. Appl. 2021, 33, 7491–7504. [Google Scholar] [CrossRef]
  204. Chaudhary, A.; Thakur, R.; Kolhe, S.; Kamal, R. A particle swarm optimization based ensemble for vegetable crop disease recognition. Comput. Electron. Agric. 2020, 178, 105747. [Google Scholar] [CrossRef]
  205. Saini, R.; Ghosh, S.K. Crop classification in a heterogeneous agricultural environment using ensemble classifiers and single-date Sentinel-2A imagery. Geocarto Int. 2021, 36, 2141–2159. [Google Scholar] [CrossRef]
  206. Chen, T.; Guestrin, C. Xgboost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13–17 August 2016; pp. 785–794. [Google Scholar]
  207. Ramraj, S.; Uzir, N.; Sunil, R.; Banerjee, S. Experimenting XGBoost algorithm for prediction and classification of different datasets. Int. J. Control Theory Appl. 2016, 9, 651–662. [Google Scholar]
  208. Ogunleye, A.; Wang, Q.-G. XGBoost Model for Chronic Kidney Disease Diagnosis. IEEE/ACM Trans. Comput. Biol. Bioinforma. 2020, 17, 2131–2140. [Google Scholar] [CrossRef] [PubMed]
  209. Dhaliwal, S.S.; Nahid, A.-A.; Abbas, R. Effective Intrusion Detection System Using XGBoost. Information 2018, 9, 149. [Google Scholar] [CrossRef]
  210. Li, Y.; Zeng, H.; Zhang, M.; Wu, B.; Zhao, Y.; Yao, X.; Cheng, T.; Qin, X.; Wu, F. A county-level soybean yield prediction framework coupled with XGBoost and multidimensional feature engineering. Int. J. Appl. Earth Obs. Geoinf. 2023, 118, 103269. [Google Scholar] [CrossRef]
211. Mallikarjuna Rao, G.S.; Dangeti, S.; Amiripalli, S.S. An Efficient Modeling Based on XGBoost and SVM Algorithms to Predict Crop Yield. In Advances in Data Science and Management; Borah, S., Mishra, S.K., Mishra, B.K., Balas, V.E., Polkowski, Z., Eds.; Springer Nature Singapore: Singapore, 2022; pp. 565–574. [Google Scholar]
  212. Mariadass, D.A.-L.; Moung, E.G.; Sufian, M.M.; Farzamnia, A. Extreme Gradient Boosting (XGBoost) Regressor and Shapley Additive Explanation for Crop Yield Prediction in Agriculture. In Proceedings of the 2022 12th International Conference on Computer and Knowledge Engineering (ICCKE), Mashhad, Iran, 17–18 November 2022; pp. 219–224. [Google Scholar]
  213. Ge, J.; Zhao, L.; Yu, Z.; Liu, H.; Zhang, L.; Gong, X.; Sun, H. Prediction of Greenhouse Tomato Crop Evapotranspiration Using XGBoost Machine Learning Model. Plants 2022, 11, 1923. [Google Scholar] [CrossRef]
  214. Nagaraju, A.; Reddy, M.A.K.; Reddy, C.V.; Mohandas, R. Multifactor Analysis to Predict Best Crop using Xg-Boost Algorithm. In Proceedings of the 2021 5th International Conference on Trends in Electronics and Informatics (ICOEI), Tirunelveli, India, 3–5 June 2021; pp. 155–163. [Google Scholar]
  215. Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
  216. Shi, T.; Horvath, S. Unsupervised Learning With Random Forest Predictors. J. Comput. Graph. Stat. 2006, 15, 118–138. [Google Scholar] [CrossRef]
  217. Biau, G. Analysis of a random forests model. J. Mach. Learn. Res. 2012, 13, 1063–1095. [Google Scholar]
  218. Rigatti, S.J. Random forest. J. Insur. Med. 2017, 47, 31–39. [Google Scholar] [CrossRef]
  219. Ok, A.O.; Akar, O.; Gungor, O. Evaluation of random forest method for agricultural crop classification. Eur. J. Remote Sens. 2012, 45, 421–432. [Google Scholar] [CrossRef]
  220. Tatsumi, K.; Yamashiki, Y.; Canales Torres, M.A.; Taipe, C.L.R. Crop classification of upland fields using Random forest of time-series Landsat 7 ETM+ data. Comput. Electron. Agric. 2015, 115, 171–179. [Google Scholar] [CrossRef]
  221. Prasad, N.R.; Patel, N.R.; Danodia, A. Crop yield prediction in cotton for regional level using random forest approach. Spat. Inf. Res. 2021, 29, 195–206. [Google Scholar] [CrossRef]
  222. Geetha, V.; Punitha, A.; Abarna, M.; Akshaya, M.; Illakiya, S.; Janani, A.P. An Effective Crop Prediction Using Random Forest Algorithm. In Proceedings of the 2020 International Conference on System, Computation, Automation and Networking (ICSCAN), Pondicherry, India, 3–4 July 2020; pp. 1–5. [Google Scholar]
  223. Mohanty, S.P.; Hughes, D.P.; Salathé, M. Using deep learning for image-based plant disease detection. Front. Plant Sci. 2016, 7, 1419. [Google Scholar] [CrossRef] [PubMed]
  224. Liu, B.; Zhang, Y.; He, D.; Li, Y. Identification of apple leaf diseases based on deep convolutional neural networks. Symmetry 2017, 10, 11. [Google Scholar] [CrossRef]
  225. Fuentes, A.; Yoon, S.; Kim, S.C.; Park, D.S. A robust deep-learning-based detector for real-time tomato plant diseases and pests recognition. Sensors 2017, 17, 2022. [Google Scholar] [CrossRef] [PubMed]
  226. Ferentinos, K.P. Deep learning models for plant disease detection and diagnosis. Comput. Electron. Agric. 2018, 145, 311–318. [Google Scholar] [CrossRef]
  227. Dyrmann, M.; Karstoft, H.; Midtiby, H.S. Plant species classification using deep convolutional neural network. Biosyst. Eng. 2016, 151, 72–80. [Google Scholar] [CrossRef]
  228. Cruz, A.C.; Luvisi, A.; De Bellis, L.; Ampatzidis, Y. X-FIDO: An effective application for detecting olive quick decline syndrome with deep learning and data fusion. Front. Plant Sci. 2017, 8, 1741. [Google Scholar] [CrossRef] [PubMed]
  229. Barbedo, J.G.A. Plant disease identification from individual lesions and spots using deep learning. Biosyst. Eng. 2019, 180, 96–107. [Google Scholar] [CrossRef]
  230. Zheng, Y.-Y.; Kong, J.-L.; Jin, X.-B.; Wang, X.-Y.; Su, T.-L.; Zuo, M. CropDeep: The crop vision dataset for deep-learning-based classification and detection in precision agriculture. Sensors 2019, 19, 1058. [Google Scholar] [CrossRef]
  231. Wang, G.; Sun, Y.; Wang, J. Automatic image-based plant disease severity estimation using deep learning. Comput. Intell. Neurosci. 2017, 2017, 2917536. [Google Scholar] [CrossRef]
  232. Too, E.C.; Yujian, L.; Njuki, S.; Yingchun, L. A comparative study of fine-tuning deep learning models for plant disease identification. Comput. Electron. Agric. 2019, 161, 272–279. [Google Scholar] [CrossRef]
  233. Suh, H.K.; Ijsselmuiden, J.; Hofstee, J.W.; van Henten, E.J. Transfer learning for the classification of sugar beet and volunteer potato under field conditions. Biosyst. Eng. 2018, 174, 50–65. [Google Scholar] [CrossRef]
  234. Olsen, A.; Konovalov, D.A.; Philippa, B.; Ridd, P.; Wood, J.C.; Johns, J.; Banks, W.; Girgenti, B.; Kenny, O.; Whinney, J. DeepWeeds: A multiclass weed species image dataset for deep learning. Sci. Rep. 2019, 9, 2058. [Google Scholar] [CrossRef]
  235. Ghazi, M.M.; Yanikoglu, B.; Aptoula, E. Plant identification using deep neural networks via optimization of transfer learning parameters. Neurocomputing 2017, 235, 228–235. [Google Scholar] [CrossRef]
  236. Espejo-Garcia, B.; Mylonas, N.; Athanasakos, L.; Fountas, S.; Vasilakoglou, I. Towards weeds identification assistance through transfer learning. Comput. Electron. Agric. 2020, 171, 105306. [Google Scholar] [CrossRef]
  237. Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative adversarial nets. arXiv 2014, arXiv:1406.2661. [Google Scholar]
  238. Frid-Adar, M.; Diamant, I.; Klang, E.; Amitai, M.; Goldberger, J.; Greenspan, H. GAN-based synthetic medical image augmentation for increased CNN performance in liver lesion classification. Neurocomputing 2018, 321, 321–331. [Google Scholar] [CrossRef]
239. Bowles, C.; Chen, L.; Guerrero, R.; Bentley, P.; Gunn, R.; Hammers, A.; Dickie, D.A.; Hernández, M.V.; Wardlaw, J.; Rueckert, D. GAN augmentation: Augmenting training data using generative adversarial networks. arXiv 2018, arXiv:1810.10863. [Google Scholar]
  240. Perez, L.; Wang, J. The effectiveness of data augmentation in image classification using deep learning. arXiv 2017, arXiv:1712.04621. [Google Scholar]
  241. Gong, X.; Chang, S.; Jiang, Y.; Wang, Z. Autogan: Neural Architecture Search for Generative Adversarial Networks. In Proceedings of the 2019 IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Republic of Korea, 27 October–2 November 2019; pp. 3224–3234. [Google Scholar]
  242. Brock, A.; Donahue, J.; Simonyan, K. Large scale GAN training for high fidelity natural image synthesis. arXiv 2018, arXiv:1809.11096. [Google Scholar]
  243. Radford, A.; Metz, L.; Chintala, S. Unsupervised representation learning with deep convolutional generative adversarial networks. arXiv 2015, arXiv:1511.06434. [Google Scholar]
  244. Arjovsky, M.; Bottou, L. Towards principled methods for training generative adversarial networks. arXiv 2017, arXiv:1701.04862. [Google Scholar]
  245. Kurach, K.; Lučić, M.; Zhai, X.; Michalski, M.; Gelly, S. A large-Scale Study on Regularization and Normalization in GANs. In Proceedings of the 36th International Conference on Machine Learning, Long Beach, CA, USA, 10–15 June 2019; pp. 3581–3590. [Google Scholar]
246. Lucic, M.; Kurach, K.; Michalski, M.; Gelly, S.; Bousquet, O. Are GANs created equal? A large-scale study. arXiv 2018, arXiv:1711.10337. [Google Scholar]
  247. Aggarwal, A.; Mittal, M.; Battineni, G. Generative adversarial network: An overview of theory and applications. Int. J. Inf. Manag. Data Insights 2021, 1, 100004. [Google Scholar] [CrossRef]
  248. Shumilo, L.; Okhrimenko, A.; Kussul, N.; Drozd, S.; Shkalikov, O. Generative adversarial network augmentation for solving the training data imbalance problem in crop classification. Remote Sens. Lett. 2023, 14, 1129–1138. [Google Scholar] [CrossRef]
  249. Huang, Y.; Chen, Z.; Liu, J. Limited agricultural spectral dataset expansion based on generative adversarial networks. Comput. Electron. Agric. 2023, 215, 108385. [Google Scholar] [CrossRef]
  250. Akkem, Y.; Biswas, S.K.; Varanasi, A. A comprehensive review of synthetic data generation in smart farming by using variational autoencoder and generative adversarial network. Eng. Appl. Artif. Intell. 2024, 131, 107881. [Google Scholar] [CrossRef]
  251. Douarre, C.; Crispim-Junior, C.F.; Gelibert, A.; Tougne, L.; Rousseau, D. Novel data augmentation strategies to boost supervised segmentation of plant disease. Comput. Electron. Agric. 2019, 165, 104967. [Google Scholar] [CrossRef]
  252. Fawakherji, M.; Potena, C.; Pretto, A.; Bloisi, D.D.; Nardi, D. Multi-spectral image synthesis for crop/weed segmentation in precision farming. Rob. Auton. Syst. 2021, 146, 103861. [Google Scholar] [CrossRef]
  253. Luo, Z.; Yu, H.; Zhang, Y. Pine cone detection using boundary equilibrium generative adversarial networks and improved YOLOv3 model. Sensors 2020, 20, 4430. [Google Scholar] [CrossRef]
254. Valerio Giuffrida, M.; Scharr, H.; Tsaftaris, S.A. ARIGAN: Synthetic Arabidopsis Plants Using Generative Adversarial Network. In Proceedings of the 2017 IEEE International Conference on Computer Vision Workshops, Venice, Italy, 22–29 October 2017; pp. 2064–2071. [Google Scholar]
  255. Chou, Y.-C.; Kuo, C.-J.; Chen, T.-T.; Horng, G.-J.; Pai, M.-Y.; Wu, M.-E.; Lin, Y.-C.; Hung, M.-H.; Su, W.-T.; Chen, Y.-C. Deep-learning-based defective bean inspection with GAN-structured automated labeled data augmentation in coffee industry. Appl. Sci. 2019, 9, 4166. [Google Scholar] [CrossRef]
  256. Gunning, D.; Aha, D. DARPA’s explainable artificial intelligence (XAI) program. AI Mag. 2019, 40, 44–58. [Google Scholar]
  257. Carvalho, D.V.; Pereira, E.M.; Cardoso, J.S. Machine learning interpretability: A survey on methods and metrics. Electronics 2019, 8, 832. [Google Scholar] [CrossRef]
  258. Shams, M.Y.; Gamel, S.A.; Talaat, F.M. Enhancing crop recommendation systems with explainable artificial intelligence: A study on agricultural decision-making. Neural Comput. Appl. 2024, 36, 5695–5714. [Google Scholar] [CrossRef]
  259. Ryo, M.; Angelov, B.; Mammola, S.; Kass, J.M.; Benito, B.M.; Hartig, F. Explainable artificial intelligence enhances the ecological interpretability of black-box species distribution models. Ecography 2021, 44, 199–205. [Google Scholar] [CrossRef]
  260. Murdoch, W.J.; Singh, C.; Kumbier, K.; Abbasi-Asl, R.; Yu, B. Definitions, methods, and applications in interpretable machine learning. Proc. Natl. Acad. Sci. USA 2019, 116, 22071–22080. [Google Scholar] [CrossRef]
  261. Molnar, C. Interpretable Machine Learning; Lulu Press: Morrisville, NC, USA, 2020; ISBN 0244768528. [Google Scholar]
  262. Boehmke, B.; Greenwell, B.M. Hands-On Machine Learning with R.; Chapman and Hall/CRC: Boca Raton, FL, USA, 2019; ISBN 0367816377. [Google Scholar]
  263. Duval, A. Explainable Artificial Intelligence (XAI). Master’s Thesis, The University of Warwick, Coventry, UK, 2019. [Google Scholar]
  264. Speith, T.; Langer, M. A New Perspective on Evaluation Methods for Explainable Artificial Intelligence (XAI). In Proceedings of the 2023 IEEE 31st International Requirements Engineering Conference Workshops (REW), Hannover, Germany, 4–5 September 2023; pp. 325–331. [Google Scholar]
  265. Ryo, M. Explainable artificial intelligence and interpretable machine learning for agricultural data analysis. Artif. Intell. Agric. 2022, 6, 257–265. [Google Scholar] [CrossRef]
  266. Hu, T.; Zhang, X.; Bohrer, G.; Liu, Y.; Zhou, Y.; Martin, J.; Li, Y.; Zhao, K. Crop yield prediction via explainable AI and interpretable machine learning: Dangers of black box models for evaluating climate change impacts on crop yield. Agric. For. Meteorol. 2023, 336, 109458. [Google Scholar] [CrossRef]
  267. Mehedi, M.H.K.; Hosain, A.K.M.S.; Ahmed, S.; Promita, S.T.; Muna, R.K.; Hasan, M.; Reza, M.T. Plant Leaf Disease Detection using Transfer Learning and Explainable AI. In Proceedings of the 2022 IEEE 13th Annual Information Technology, Electronics and Mobile Communication Conference (IEMCON), Vancouver, BC, Canada, 12–15 October 2022; pp. 166–170. [Google Scholar]
268. Sowmiyan, L.; Vaidya, S.; Karpagam, G.R. An Explainable AI (XAI)-Based Framework for Detecting Diseases in Paddy Crops. In Data Science and Applications; Nanda, S.J., Yadav, R.P., Gandomi, A.H., Saraswat, M., Eds.; Springer Nature Singapore: Singapore, 2024; pp. 411–430. [Google Scholar]
  269. Rakesh, S.; Indiramma, M. Explainable AI for Crop Disease Detection. In Proceedings of the 2022 4th International Conference on Advances in Computing, Communication Control and Networking (ICAC3N), Greater Noida, India, 16–17 December 2022; pp. 1601–1608. [Google Scholar]
Figure 1. Paper selection criteria.
Figure 2. Paper flowchart.
Figure 3. RGB, thermal, and multispectral remote sensing. RGB images capture the color and growth status of crops through visible light, while thermal imaging sensors detect temperature changes in crops to identify water stress or diseases. Multispectral imaging utilizes multiple wavelengths of light to analyze the physiological responses and health conditions of crops in greater detail.
Figure 4. The structure of the SVM is shown, with X1 and X2 as input features. The yellow circle highlights the support vectors, which help to define the margin and the hyperplane separating Group 1 and Group 2.
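The margin-maximizing hyperplane sketched in Figure 4 can be made concrete with a toy implementation. The snippet below is an illustrative sketch only, not code from any reviewed study: a linear SVM fitted by sub-gradient descent on the hinge loss, with invented 2-D points standing in for Group 1 and Group 2.

```python
# Toy linear SVM via sub-gradient descent on the regularized hinge loss.
# Learning rate, regularization strength, and data are made up for illustration.

def train_linear_svm(points, labels, lam=0.01, lr=0.1, epochs=200):
    """points: list of (x1, x2); labels: +1 (Group 1) or -1 (Group 2)."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            margin = y * (w[0] * x1 + w[1] * x2 + b)
            if margin < 1:  # sample inside the margin: hinge loss is active
                w[0] += lr * (y * x1 - lam * w[0])
                w[1] += lr * (y * x2 - lam * w[1])
                b += lr * y
            else:           # correctly classified outside the margin: only shrink w
                w[0] -= lr * lam * w[0]
                w[1] -= lr * lam * w[1]
    return w, b

def predict(w, b, point):
    """Sign of the decision function picks the side of the hyperplane."""
    s = w[0] * point[0] + w[1] * point[1] + b
    return 1 if s >= 0 else -1
```

Samples whose margin is less than 1 are exactly the support vectors highlighted in the figure; only they push the hyperplane during training.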
Figure 5. X1, X2, and X3 are the original features. PC1 and PC2 are principal components from PCA, while P1 and O1 are components from PLS.
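The projection of original features onto principal components shown in Figure 5 can be illustrated with a closed-form 2-D PCA. This is a didactic sketch with invented data, not an implementation from the reviewed papers; real pipelines typically use a library routine over many spectral bands.

```python
import math

def pca_2d(data):
    """Return the first principal component (unit vector) and its
    explained-variance ratio for 2-D data, via a closed-form 2x2 eigen-solve."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    # sample covariance matrix [[sxx, sxy], [sxy, syy]]
    sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in data) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    l1 = tr / 2 + math.sqrt(tr * tr / 4 - det)   # largest eigenvalue
    if abs(sxy) > 1e-12:
        vx, vy = l1 - syy, sxy                   # eigenvector for l1
    else:
        vx, vy = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    norm = math.hypot(vx, vy)
    return (vx / norm, vy / norm), l1 / tr
```

For strongly correlated features, PC1 captures almost all of the variance, which is why a few components can replace many raw bands.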
Figure 6. Artificial Neural Network structure. This figure depicts the structure of an Artificial Neural Network (ANN) composed of an input layer, hidden layers, and an output layer; it shows how each node is interconnected to process data and derive output values.
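The layer-by-layer flow in Figure 6 amounts to repeated weighted sums passed through an activation function. The sketch below shows a single forward pass with one hidden layer; the weights and inputs are arbitrary placeholders, not values from any study.

```python
import math

def sigmoid(z):
    """Logistic activation squashing any real value into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w_hidden, b_hidden, w_out, b_out):
    """One forward pass: input layer -> hidden layer -> single output node."""
    hidden = [sigmoid(sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(w_hidden, b_hidden)]
    return sigmoid(sum(wo * h for wo, h in zip(w_out, hidden)) + b_out)
```

Training would adjust the weights by backpropagating the prediction error; only the inference step is sketched here.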
Figure 7. The input image (224 × 224 × 3) passes through convolution, max-pooling layers for feature extraction, and fully connected layers for classification. Black squares indicate focused regions at each layer.
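The feature-extraction stages in Figure 7 reduce spatial size in a predictable way: with a 224 × 224 input, a 3 × 3 "valid" convolution yields a 222 × 222 map, and 2 × 2 max-pooling halves that to 111 × 111. The minimal sketch below demonstrates both operations on a small grid; it is illustrative only, not a usable CNN.

```python
def conv2d_valid(image, kernel):
    """'Valid' 2-D convolution: no padding, stride 1, square kernel."""
    k = len(kernel)
    out_size = len(image) - k + 1
    return [[sum(image[i + a][j + b] * kernel[a][b]
                 for a in range(k) for b in range(k))
             for j in range(out_size)]
            for i in range(out_size)]

def max_pool(fmap, size=2):
    """Non-overlapping max pooling: keeps the strongest activation per window."""
    out = len(fmap) // size
    return [[max(fmap[i * size + a][j * size + b]
                 for a in range(size) for b in range(size))
             for j in range(out)]
            for i in range(out)]
```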
Figure 8. Boosting algorithm structure (red circles: incorrect prediction and blue circles: correct prediction).
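The red (incorrect) circles in Figure 8 gain influence in the next round: boosting re-weights samples so later learners concentrate on earlier mistakes. The sketch below shows one AdaBoost-style re-weighting step, as an illustration of the mechanism rather than a full boosting implementation.

```python
import math

def adaboost_reweight(weights, correct):
    """One AdaBoost round: compute the weak learner's weighted error and its
    vote weight alpha, then up-weight mistakes and down-weight hits."""
    err = sum(w for w, c in zip(weights, correct) if not c)
    alpha = 0.5 * math.log((1 - err) / err)
    new = [w * math.exp(-alpha if c else alpha)
           for w, c in zip(weights, correct)]
    z = sum(new)                      # renormalize to a distribution
    return [w / z for w in new], alpha
```

A weak learner with error 0.2 gets a positive vote alpha, and the single misclassified sample ends up carrying half of the total weight.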
Figure 9. Bagging algorithm structure (red circles: incorrect prediction and blue circles: correct prediction).
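Bagging, in contrast to boosting, trains each base learner on an independent bootstrap resample and then aggregates by majority vote, as Figure 9 depicts. The two helpers below sketch those steps with hypothetical class labels; they are not tied to any reviewed study.

```python
import random
from collections import Counter

def bootstrap_sample(data, rng):
    """Sample with replacement, same size as the original training set."""
    return [rng.choice(data) for _ in data]

def majority_vote(predictions):
    """Aggregate the base learners' outputs by majority vote."""
    return Counter(predictions).most_common(1)[0][0]
```

Random Forest, the most frequently used ensemble in the surveyed papers, is bagging over decision trees with an extra layer of random feature selection at each split.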
Figure 10. The basic structure of GAN. The generator creates fake images, and the discriminator learns to distinguish them from real ones, leading to increasingly realistic images over time.
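The adversarial objective behind Figure 10 can be written down compactly: the discriminator minimizes binary cross-entropy with real images labelled 1 and fakes labelled 0, while the generator minimizes the cross-entropy of its fakes against the label 1. The sketch below shows only these loss terms on scalar discriminator outputs; the networks and update loop are omitted.

```python
import math

def bce(p, label):
    """Binary cross-entropy for one predicted probability p and a 0/1 label."""
    eps = 1e-12  # guard against log(0)
    return -(label * math.log(p + eps) + (1 - label) * math.log(1 - p + eps))

def discriminator_loss(d_real, d_fake):
    """D wants real images scored 1 and generated images scored 0."""
    return bce(d_real, 1) + bce(d_fake, 0)

def generator_loss(d_fake):
    """G wants D to score its fakes as real (label 1)."""
    return bce(d_fake, 1)
```

Training alternates the two updates, and the generated images become harder to distinguish from real ones over time.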
Figure 11. Comparison between general Machine Learning and Explainable AI (XAI) Approaches. The top section depicts the standard machine learning process, where models make decisions without explanations. In contrast, the bottom section shows the XAI approach, which generates explainable models and provides transparency through an explanation interface.
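One simple, model-agnostic way to supply the explanation interface shown in Figure 11 is permutation importance: shuffle one feature and measure how much the model's accuracy drops. The sketch below illustrates the idea on a hypothetical two-feature classifier; it is one of many XAI techniques (alongside, e.g., SHAP and LIME) and is not drawn from a specific reviewed study.

```python
import random

def permutation_importance(model, rows, labels, feature_idx, rng):
    """Post-hoc explanation: accuracy drop after shuffling one feature column."""
    def accuracy(data):
        return sum(model(r) == y for r, y in zip(data, labels)) / len(labels)
    base = accuracy(rows)
    col = [r[feature_idx] for r in rows]
    rng.shuffle(col)  # break the feature's relationship with the labels
    shuffled = [r[:feature_idx] + [v] + r[feature_idx + 1:]
                for r, v in zip(rows, col)]
    return base - accuracy(shuffled)
```

Features the model actually relies on show a large drop; features it ignores show none, which is the kind of transparency the bottom panel of the figure calls for.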
Table 1. Search term strings per database. A comprehensive review of the literature was conducted to analyze the application of machine learning and deep learning techniques in evaluating crop water stress using different data modalities. The total number of collected research papers for each modality was as follows: for Crop Water Stress Index (CWSI), 21 papers were identified; for RGB images, 11 papers were collected; for thermal images, 32 papers were gathered; and for hyperspectral images, 31 papers were found.
| Keyword | Search Terms and Criteria | Number of Papers |
|---|---|---|
| CWSI | Machine learning, Deep learning, Water stress, Crop, CWSI | 21 |
| RGB | Machine learning, Deep learning, Water stress, Crop, RGB | 11 |
| Thermal | Machine learning, Deep learning, Water stress, Crop, Thermal | 32 |
| Hyperspectral imagery | Machine learning, Deep learning, Water stress, Crop, Hyperspectral | 31 |
Table 2. RGB imaging for crop water stress evaluation.
| Crop | Best Model | Methodologies | Objective | Authors | Publisher | Nation | Year |
|---|---|---|---|---|---|---|---|
| Chickpea | SVM | SVM, K-Nearest Neighbors, DT, Naive Bayes (NB), Discriminant Analysis (DA) | Using images of chickpea shoots to identify crop water stress due to low soil moisture | [52] | IEEE | India | 2020 |
| Maize | Convolutional Neural Network (CNN) | CNN | Recognizing and quantifying water stress in maize using digital imagery | [53] | Elsevier | China | 2020 |
| Soybean | Partial Least Squares Discriminant Analysis (PLS-DA) | PLS-DA | Applicability and limitations of RGB image-based crop vigor indices in determining chilling stress in soybeans | [54] | Korean Society of Agrometeorology | Korea | 2021 |
| Wheat | CNN-LSTM-CNN | CNN, Long Short-Term Memory (LSTM), CNN-CNN, LSTM-LSTM, CNN-LSTM-CNN | Identification and automatic detection of water stress in wheat crops | [55] | Elsevier | China | 2022 |
| Wheat and maize | GoogLeNet | AlexNet, GoogLeNet, Inception V3, MobileNet V2, ResNet-50 | Development of a device for real-time assessment of water stress in wheat and maize crops | [56] | Elsevier | India | 2024 |
Table 3. Thermal imaging for crop water stress evaluation.
| Crop | Best Model | Methodologies | Objective | Authors | Publisher | Nation | Year |
|---|---|---|---|---|---|---|---|
| - | ANN | ANN | Implementing a system for monitoring water stress in crops | [61] | IEEE | Romania | 2018 |
| Grapes | - | Rotation Forests (ROF), DT | Thermal-image-based estimation and field assessment of water stress in grapes | [62] | PLOS | Spain | 2018 |
| Wheat | Classification and Regression Tree (CRT) | CRT algorithm | Thermal-image-based biomass and grain yield prediction of wheat grown under moisture stress in sodic soil environments | [63] | Elsevier | Australia | 2021 |
| Brassica | Random Forest (RF) | RF | Assessing crop moisture status with simulated baseline canopy temperature and predicted CWSI for brassica in China | [64] | MDPI | China | 2021 |
| Rice | ANN | ANN | Canopy moisture content prediction based on thermal–RGB imaging in rice | [65] | MDPI | China | 2021 |
| Cherry | ANN | ANN | Thermal-image-based assessment of cherry moisture status | [66] | Elsevier | Chile | 2022 |
| Sugarcane | Inception-ResNet-v2 | Inception-ResNet-v2 | Predicting water stress in sugarcane crops based on thermal imagery | [67] | Elsevier | Brazil | 2022 |
| Tomato | VGG-19 | VGG-19 | Water stress classification in tomato crops based on thermal and optical aerial imagery | [68] | J.UCS | Italy | 2022 |
| Wheat | ResNet50 | ANN, K-Nearest Neighbors (KNN), Logistic Regression (LO), SVM, LSTM | Water stress assessment using thermal–RGB imaging in winter wheat | [57] | MDPI | India | 2022 |
| Rice | Generative Adversarial Network (GAN) | GAN | Monitoring moisture stress with reconstructed thermal images | [32] | IEEE | Indonesia | 2022 |
| Rice | RF | RF | Moisture-parameter-based moisture status estimation in rice using thermal imagery | [69] | Elsevier | China | 2023 |
| Cotton | MobileNetV3 | VGG16, ResNet-18, MobileNetV3, DenseNet-201, CSPDarknet53 | Predicting water stress in cotton crops based on thermal imagery | [70] | Elsevier | China | 2024 |
| Wheat | Gradient-Boosting Decision Tree (GBDT) | GBDT, PLS, SVM | Diagnosing water stress in wheat growth | [71] | Elsevier | China | 2024 |
Table 4. CWSI for crop water stress evaluation.
| Crop | Best Model | Methodologies | Objective | Authors | Publisher | Nation | Year |
|---|---|---|---|---|---|---|---|
| Sugar beet, wine grape | Nash–Sutcliffe | Nash–Sutcliffe, linear model | Estimating baseline canopy temperature for CWSI calculations | [86] | ASABE | USA | 2020 |
| Rice | FF-BP-ANN | Self-Organizing Maps (SOM), Feedforward Backpropagation Artificial Neural Network (FF-BP-ANN) | Using machine learning techniques to determine optimal CWSI values for rice | [87] | Taylor & Francis | India | 2023 |
| Maize | Linear regression (LR) | LR | Development of a thermal-imaging-based CWSI approach for the assessment of water stress and yield prediction in maize | [88] | Wiley | Thailand | 2023 |
| Maize | CatBoost | ANN, LSTM, RF, CatBoost, SVM, KNN, Multiple Linear Regression (MLR), Stacked-RF, Stacked Regression, Weighted Ensemble | CWSI prediction for corn crops | [89] | Elsevier | USA | 2023 |
| Cotton | Extreme Gradient Boosting (XGBoost) | SVM, XGBoost, Backpropagation Neural Network (BPNN) | Evaluation of CWSI estimation during the cotton growing season based on UAV multispectral imagery | [90] | Elsevier | China | 2024 |
| Wheat, mustard | ANN5 (ANN with five hidden neurons) | SVM, ANN, Adaptive Neuro-Fuzzy Inference System | CWSI prediction using relative humidity, air temperature, and canopy temperature | [91] | Research Square | India | 2024 |
| Sorghum, maize | RF | RF, SVM, PLS | Comparing the applicability of the CWSI to the Three-Dimensional Drought Index (TDDI), which consists of temperature, air temperature, and five vegetation indices | [92] | Elsevier | China | 2024 |
| Citrus | Long sequences: CNN-LSTM; short sequences: ConvLSTM | ConvLSTM, CNN-LSTM | CWSI-based water stress prediction | [93] | MDPI | Morocco | 2024 |
| Maize | RF | PLS, SVM, RF | Determining water stress indices for monitoring and mapping crop water stress variability | [94] | MDPI | South Africa | 2024 |
| Wheat | MLP | MLP, SMOreg, M5P, RF, IBK, Random Trees (RT), bagging, Kstar | CWSI prediction for wheat crops | [95] | ASCE | India | 2024 |
Table 5. Multispectral and hyperspectral imaging for crop water stress evaluation.
| Crop | Best Model | Methodologies | Objective | Authors | Publisher | Nation | Year |
|---|---|---|---|---|---|---|---|
| Grapes | RF | XGBoost, RF | Hyperspectral-data-based water stress assessment in grapes | [99] | MDPI | South Africa | 2018 |
| Lettuce | ANN | ANN | Hyperspectral-data-based water stress assessment in lettuce | [100] | MDPI | Brazil | 2019 |
| Maize | SVM and K-Means Clustering Algorithm | SVM and K-Means Clustering Algorithm | Hyperspectral-data-based analysis for water stress assessment and recovery in maize | [101] | Elsevier | Belgium | 2019 |
| A variety of leaves | CNN | CNN | Estimating leaf water content to quantify water stress | [102] | IEEE | Pakistan | 2019 |
| Soybeans, maize | PLSR | PLSR | Assessing plants' physiological water stress responses | [103] | MDPI | Denmark | 2020 |
| Chickpeas | 3D to 2D CNN | 3D to 2D CNN | Assessing water stress in chickpeas based on hyperspectral data acquired by UAVs | [104] | IEEE | India | 2021 |
| Potatoes | RF, XGBoost | RF, MLP, CNN, SVM, XGBoost, AdaBoost | Hyperspectral-data-based water stress assessment in potatoes | [105] | MDPI | Colombia | 2021 |
| Maize | RF | RF, ANN, MLR | Managing water stress in maize crops and estimating crop traits | [106] | Elsevier | China | 2021 |
| Maize | SVM | RF, SVM | Moisture stress detection and optimal wavelength region selection based on hyperspectral data during corn's grain-filling stage | [107] | IEEE | India | 2022 |
| Pearl millet | RFE-SVM | SelectFromModel RF (SFM-RF), SelectFromModel SVM (SFM-SVM), SelectFromEnsemble RF (SFE-RF), Recursive Feature Elimination SVM (RFE-SVM), Chi2 | Identifying canopy moisture stress in pearl millet crops | [108] | IEEE | India | 2022 |
| Grapes | RFC | Optimized RF Classifier (RFC), ANN | Hyperspectral-data-based water stress assessment in grapes | [109] | ASABE | USA | 2022 |
| Peanuts | - | SelectFromModel RF (SFM-RF), SelectFromModel SVM (SFM-SVM), SelectFromEnsemble RF (SFE-RF), Recursive Feature Elimination SVM (RFE-SVM) | Canopy water stress assessment based on hyperspectral data in peanuts | [110] | IEEE | India | 2023 |
| Maize | RF | LASSO, PLSR, RF | Monitoring plant water stress for plant transpiration rates | [111] | SpringerLink | Belgium | 2023 |
| Grapes | PLS | PLS | Soil moisture and grape water stress detection based on hyperspectral data under diffuse illumination | [112] | Elsevier | USA | 2023 |
| Wheat | SVM | Wavelet Index Model, MLR, RF, SVM | Monitoring moisture status in winter wheat | [113] | SpringerLink | China | 2023 |
| Wheat | MRE-PLSR (multi-random ensemble on PLSR) | RFR (RF Regression), PLSR, MRE-PLSR | Predicting yield at different growth stages of wheat crops under moisture stress conditions | [114] | Elsevier | China | 2024 |
| Broccoli | PyCaret | PyCaret, PLS-DA | Assessment of water stress in broccoli based on AutoML and hyperspectral data | [115] | Elsevier | Greece | 2024 |
| Rice | GBDT | GBDT | Integrating leaf moisture data from multiple rice varieties to create a model to estimate crop moisture status | [116] | SpringerLink | China | 2024 |
Table 6. Examples of using machine learning to analyze crop water stress.
| Algorithms Used | Number of Uses | Percentage (%) |
|---|---|---|
| SVM(R) | 6 | 23.6 |
| PLS(DA) | 3 | 11.5 |
| KNN | 2 | 7.7 |
| DT | 2 | 7.7 |
| SVM-based models | 2 | 7.7 |
| DA | 1 | 3.8 |
| NB | 1 | 3.8 |
| ROF | 1 | 3.8 |
| K-Means | 1 | 3.8 |
| CRT | 1 | 3.8 |
| LO | 1 | 3.8 |
| RFC | 1 | 3.8 |
| PyCaret | 1 | 3.8 |
| SOM | 1 | 3.8 |
| LR | 1 | 3.8 |
| ANFIS | 1 | 3.8 |
| Total | 26 | 100 |
Table 7. Examples of using deep learning to analyze crop water stress.
| Algorithms Used | Number of Uses | Percentage (%) |
|---|---|---|
| CNN-based models | 9 | 47.3 |
| ANN-based models | 6 | 31.6 |
| BPNN | 2 | 10.5 |
| MLP | 1 | 5.3 |
| LSTM | 1 | 5.3 |
| Total | 19 | 100 |
Table 8. Examples of using ensemble learning to analyze crop water stress.
| Algorithms Used | Number of Uses | Percentage (%) |
|---|---|---|
| RF | 5 | 38.4 |
| XGBoost | 3 | 23.1 |
| RF-based models | 2 | 15.4 |
| SVM and K-Means Clustering Algorithm | 1 | 7.7 |
| Inception-ResNet-v2 | 1 | 7.7 |
| AdaBoost | 1 | 7.7 |
| Total | 13 | 100 |
Table 9. Case analysis with SVM.
| Authors | Study Area | Study Period | Data Acquisition Methods | Number of Data | Accuracy |
|---|---|---|---|---|---|
| [52] | National Institute of Plant Genome Research (NIPGR) | 5 months | Images were taken indoors using a Canon camera in auto mode | A total of 8000 images, with 240 images per plant | 73% |
| [107] | Institute of Agricultural Research, Chinese Academy of Agricultural Sciences | October 2020 to May 2022 | Measurements were taken directly in the field using a handheld spectrometer | A total of 246 sample data points | 93% |
| [108] | Xinxiang City in Henan Province and Xingtai City in Hebei Province, China | October 2022 to June 2023 | Hyperspectral data were collected during flight using a drone | A total of 900 samples from 12 treatments, with 30 samples per treatment in Xinxiang and 45 in Xingtai | 95.38% |
| [113] | Henan Province, China | 2020 to 2022 | - | - | 93% |
Table 10. Case analysis with PLS.
| Authors | Study Area | Study Period | Data Acquisition Methods | Number of Data | Accuracy |
|---|---|---|---|---|---|
| [54] | Central Institute of Agricultural Engineering, Bhopal, India | July to October 2020 | RGB images were captured from the top of the rainout shelter using a commercial digital camera | RGB images captured a total of 26 times | - |
| [103] | Risø Environmental Risk Assessment Facility (RERAF), Roskilde, Denmark | March to June 2018 | Data were collected directly using a FLIR Tau2 324 camera (thermal imaging) and a Cubert UHD 185 camera | 144 soybean samples, 126 maize samples | 92% |
| [112] | A Vitis vinifera L. cv. Riesling vineyard in Prosser, Washington, USA | 2021 | Spectral data were collected using a ground-based hyperspectral camera | A total of 179 leaf samples and 62 soil moisture samples | 89% |
Table 11. Case analysis with ANN.
| Authors | Study Area | Study Period | Data Acquisition Methods | Number of Data | Accuracy |
|---|---|---|---|---|---|
| [57] | Research farm at the Central Institute of Agricultural Engineering (CIAE), Bhopal, India | 2019 to 2021 | Images were collected from a distance of 1 m using an integrated thermal–RGB imaging system based on Raspberry Pi | A total of 3200 images (1600 RGB and 1600 thermal) | 96.7% |
| [61] | University of Pitesti and Polytechnic of Bucharest, Romania | 2018 | Images were automatically captured every 1 to 10 min using a FLIR thermal camera (300 × 128 resolution) | A total of 50,000 images | 97.8% |
| [65] | Zhejiang University, Hangzhou, Zhejiang, China | 10 July to 21 September 2018 | Captured directly using a Canon PowerShot SX720 HS camera and a FLIR Tau2-640 thermal imaging camera | A total of 400 images captured, of which 360 were used | 99.4% |
| [66] | A 13.2-hectare cherry orchard in the Curicó region of Chile | 2017–2018 and 2018–2019 | Captured at a distance of 3.5 m using a FLIR thermal imaging camera (TIS60, Fluke Corporation) | Physical indicators and thermal imaging data from a total of 24 trees | 83% |
| [86] | Idaho, Wyoming, and Oregon, USA | Over a period of five years | Collected from directly measured canopy temperature and surrounding environmental data | - | 88% |
| [87] | Irrigation Laboratory, Indian Institute of Technology, Roorkee | During the rice growing season | Laboratory measurements of meteorological variables: relative humidity, air temperature, and canopy temperature | - | 97% |
| [91] | Agricultural research station, National Institute of Technology, Hamirpur, India | 2017 to 2019 | Humidity and air temperature were recorded every 10 min, and canopy temperature was measured with a portable infrared thermometer | Indian mustard: 1260 for development, 1350 for validation; wheat: 1530 for development, 1458 for validation | 99% |
| [95] | - | December 2022 to April 2023 | Data were collected directly using a portable infrared radiometer and a weather observation station | - | - |
| [100] | Growth chamber (Phytotron) under controlled conditions, Federal University of Mato Grosso do Sul, Brazil | - | Spectrum data were collected in the range of 325–1075 nm using a FieldSpec HandHeld ASD spectroradiometer | 360 spectral signatures, with 90 collected over four days of the experiment | 93% |
Table 12. Case analysis with CNN.
| Authors | Study Area | Study Period | Data Acquisition Methods | Number of Data | Accuracy |
|---|---|---|---|---|---|
| [53] | Shaanxi Province, China | From 18 June 2014, throughout the plant growth period | Images were collected from a height of 4.5 m using a CCD camera mounted on a fixed platform | A total of 18,040 digital images | 88.41% |
| [56] | Central region of India | November 2021 to June 2022 | Image data were collected in the field using a Raspberry Pi device and various RGB cameras, including a Canon PowerShot SX740, a Raspberry Pi camera, and a smartphone | A total of 3200 RGB images | 97.9% for maize, 92.9% for wheat |
| [57] | Research farm at the Central Institute of Agricultural Engineering (CIAE), Bhopal, India | 2019 to 2021 | Images were collected from a distance of 1 m using an integrated thermal–RGB imaging system based on Raspberry Pi | A total of 3200 images (1600 RGB and 1600 thermal) | 98.4% |
| [67] | University of São Paulo (USP/ESALQ), São Paulo, Brazil | From 11 August 2019, for a duration of 120 days | Images were collected using a FLIR ONE Pro LT thermal imaging camera connected to a smartphone | A total of 4050 thermal images | - |
| [68] | A tomato farm near Benevento, southern Italy | The entire growth cycle of tomato crops | Thermal and optical images were collected using a drone (UAV) | 6600 thermal and 6600 optical images | 80.5% |
| [70] | Shihezi University, Xinjiang, China | May 2023 to August 2023, a total of 150 days | A FLIR ONE Pro thermal imaging camera connected to a smartphone | A total of 1300 thermal images | - |
| [102] | Institute of Space Technology, Islamabad, Pakistan | - | Reflection spectra of plant leaves were collected in the laboratory using a spectroradiometer | 402 image data sets for 11 plant species, each image containing 3457 spectral bands | 98.4% |
| [104] | International Crops Research Institute for the Semi-Arid Tropics (ICRISAT), Hyderabad, India | - | A Pika-L Hyperspectral Imaging (HSI) camera mounted on a drone (UAV) | 208 data lines, with 26 genomes per treatment and 8 repetitions | 95.44% |
Table 13. Case analysis with Ensemble.
| Authors | Study Area | Study Period | Data Acquisition Methods | Number of Data | Accuracy |
|---|---|---|---|---|---|
| [63] | Two locations in the Goondiwindi region of Australia: medium-salinity soil (MS) and high-salinity soil (HS) | May to November 2018 | Thermal images were collected with a FLIR Tau 2 camera on a DJI Matrice 600 Pro drone, along with a MicaSense RedEdge-M multispectral camera | - | - |
| [64] | South China Agricultural University, Guangzhou, China | 27 November to 31 December 2020, and 25 May to 20 June 2021 | Canopy temperature was measured at 0.3 m every 10 min with an infrared radiometer, while weather sensors recorded air temperature, humidity, wind speed, and photosynthetically active radiation | - | 0.91% |
| [69] | Luogao Experimental Base, Jiangsu Province, China | 2019 to 2020 | Thermal images were collected using a FLIR SC620 thermal camera from a height of 1 m, at 2-hour intervals between 8 AM and 4 PM | A total of 205 data points | 78% |
| [71] | China Agricultural University Experimental Station, Zhuozhou, Hebei Province, China | March to June, 2021 and 2022 | UAV multispectral and thermal remote sensing | 14 vegetation indices and 2 thermal indices measured over 6 key growth stages | 90% |
| [89] | West Central Research, Extension, and Education Center, University of Nebraska-Lincoln, Nebraska, USA | 2020 and 2021 | Sensor data assimilation (weather sensors, soil moisture sensors, infrared thermometers, etc.) with real-time climate and soil moisture monitoring | 540 total data points (30 days × 3 months × 2 years) | - |
| [90] | Yuli County, Xinjiang, China, in the alluvial plain downstream of the Tarim and Peacock Rivers | Cotton sown on 4 April 2021 and harvested on 20 September 2021 | UAV-based multispectral and thermal imaging using a MicaSense Altum camera attached to a DJI M200 V2 UAV | 2946 valid images collected over five field measurement dates | 90% |
| [92] | Jichangbuyi Miao Township, Anshun City, Guizhou Province, China | Crops sown in May 2023, with data collected until maturity in August | A DJI Matrice 300 RTK UAV equipped with MS600Pro multispectral and Zenmuse H20T thermal-infrared sensors | Ground-based VMC data from 155 samples (108 for training and 47 for validation) | 76% |
| [94] | Smallholder farm in the Swayimana rural area, uMshwathi Local Municipality, KwaZulu-Natal Province, South Africa | 8 February 2021 to 26 May 2021 | A DJI Matrice 300 UAV equipped with a MicaSense Altum sensor and a handheld infrared thermometer | 3576 images | 85% |
| [99] | Welgevallen experimental farm, Stellenbosch, Western Cape, South Africa | - | Terrestrial hyperspectral imaging using the SIMERA HX MkII hyperspectral sensor | A total of 60 leaf spectra samples | 83.3% |
| [105] | Tibaitatá Research Center, Corporación Colombiana de Investigación Agropecuaria (AGROSAVIA), Cundinamarca, Colombia | 2021 | Hyperspectral imagery (400–1000 nm) using 128 spectral bands from a Surface Optics Corporation 710-VP camera | A total of 116 images | 99.7% |
| [106] | A 1.13-hectare maize field in Zhaojun Town, Inner Mongolia, China | 2018 and 2019 growing seasons | UAV imagery collected using a self-developed hexacopter with a MicaSense RedEdge camera (multispectral) and a DJI Phantom 4 Pro (RGB) | 165 multispectral and 161 RGB images in 2018; 135 multispectral and 134 RGB images in 2019 | 89% |
| [109] | An experimental vineyard in arid southeastern Washington, USA | Two growing seasons | Hyperspectral images acquired from a ground-based utility vehicle | - | 73% |
| [111] | PHENOVISION automated phenotyping platform, a semi-controlled greenhouse, Belgium | - | Proximal thermal and hyperspectral imaging using a high-throughput plant phenotyping platform | 14,744 images and 288 additional physiological trait images | 63% |
| [114] | Comprehensive Experimental Base of the Chinese Academy of Agricultural Sciences, Xinxiang City, Henan Province, and Yanli Experimental Base, Xingtai City, Hebei Province, China | 2022 to 2023 | UAV-based hyperspectral data acquisition using a DJM600 Pro UAV equipped with a Resonon Pika L nano-hyperspectral scanner | - | - |
| [116] | Hefei and Fuyang, Anhui Province, China | 2021 to 2022 | Hyperspectral remote sensing using the ASD FieldSpec 4 device for spectral data collection | A total of 91 sample data points | 86% |
Table 14. Case analysis using different models.
| Authors | Study Area | Study Period | Data Acquisition Methods | Number of Data | Accuracy |
|---|---|---|---|---|---|
| [55] | Zhejiang University, Hangzhou, China | October 2019 to January 2020 | IoT-based multimodal data acquisition, including RGB images, soil moisture sensors, air temperature, relative humidity, and wind speed | 876 images expanded to 5256 with augmentation | 100% |
| [63] | Goondiwindi, northeastern grains growing region, Australia | 2018 wheat growing season (May to November) | UAV thermal remote sensing with a FLIR Tau 2 camera on a DJI Matrice 600 Pro | - | - |
| Ismail, Daddy Budiman, Ervan Asri, Zass Ressy Aidha [32] | - | - | Visible images were used to generate thermal images using a deep-learning-based GAN model | - | - |
| [88] | Phitsanulok Province, Thailand (Thapo sub-district and Wang Thong district) | 2018 to 2020 | Data were collected using a FLIR C2 camera for thermal imaging positioned above the crop canopy, along with soil moisture sensors and weather data | - | 90% |
| [93] | Ourgha Farm, Khnichet rural commune, Sidi Kacem Province, Morocco | 2015 to 2023 | Remote sensing data from Landsat 8 satellite images processed through Google Earth Engine, focusing on the Crop Water Stress Index (CWSI) | 50 Landsat 8 satellite images | - |
| [101] | PHENOVISION high-throughput phenotyping platform, VIB, Ghent, Belgium | 50 days of plant growth monitored, beginning from the V2 growth stage | Hyperspectral imaging using a VNIR-HS line-scan push-broom camera (ImSpector V10E), capturing images across 194 spectral bands (400–1000 nm) | 1900 hyperspectral images from six drought treatment groups | 96% |
| [110] | International Crops Research Institute for the Semi-Arid Tropics (ICRISAT), Hyderabad, India | November 2021 to February 2022 | Hyperspectral imaging using a Resonon Pika-L camera on a DJI Matrice-600 Pro UAV, capturing 300 bands (385–1020 nm) | 16,000 samples, with 1000 per genotype per class | 96.46% |
| [115] | Agricultural University of Athens, Athens, Greece | 2024 | Hyperspectral imaging with a Snapscan VNIR camera on a three-wheel platform using natural sunlight | 120 images captured, reduced to 90 after outlier removal (42 from drought onset, 48 from acclimation) | - |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Cho, S.B.; Soleh, H.M.; Choi, J.W.; Hwang, W.-H.; Lee, H.; Cho, Y.-S.; Cho, B.-K.; Kim, M.S.; Baek, I.; Kim, G. Recent Methods for Evaluating Crop Water Stress Using AI Techniques: A Review. Sensors 2024, 24, 6313. https://doi.org/10.3390/s24196313

