Article

GNSS/LiDAR/IMU Fusion Odometry Based on Tightly-Coupled Nonlinear Observer in Orchard

1 College of Engineering and Technology, Southwest University, Chongqing 400715, China
2 Intelligent Equipment Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
3 College of Intelligent Science and Engineering, Beijing University of Agriculture, Beijing 102206, China
4 College of Agricultural Engineering, Shanxi Agricultural University, Jinzhong 030801, China
5 Xinjiang Academy of Agricultural and Reclamation Science, Shihezi 832000, China
* Author to whom correspondence should be addressed.
Submission received: 21 June 2024 / Revised: 24 July 2024 / Accepted: 6 August 2024 / Published: 8 August 2024

Abstract: High-repetitive features in unstructured environments and frequent signal loss of the Global Navigation Satellite System (GNSS) severely limit the development of autonomous robot localization in orchard settings. To address this issue, we propose GLIO, a LiDAR-based odometry pipeline inspired by KISS-ICP and DLIO. GLIO is built on a nonlinear observer with strong global convergence that effectively fuses sensor data from GNSS, IMU, and LiDAR. This design tolerates relative and absolute measurements that may be noisy or temporarily unavailable, ensuring accurate and robust 6-degree-of-freedom motion estimation in orchard environments. In this framework, GNSS measurements are treated as absolute observation constraints and are tightly coupled in both the prior optimization and the scan-to-map stage. During the scan-to-map stage, a novel point-to-point ICP registration requiring no parameter adjustment is introduced to improve point cloud alignment accuracy and the robustness of the nonlinear observer. Furthermore, a GNSS health check mechanism based on the robot's moving distance filters out unreliable GNSS measurements to prevent the odometry from collapsing under sensor failure. Extensive experiments on multiple public benchmarks and self-collected datasets demonstrate that our approach is comparable to state-of-the-art algorithms and exhibits superior localization capability in unstructured environments, achieving an absolute translation error of 0.068 m and an absolute rotation error of 0.856°.

1. Introduction

Continuous and reliable localization is crucial for agricultural robots in environment perception, path planning, decision-making, and motion control [1,2]. In the past decade, a multitude of robot localization solutions utilizing single-type sensors, such as Inertial Navigation Systems (INS), Global Navigation Satellite Systems (GNSS), and LiDAR/camera-based map matching, have been developed. However, these methods are inherently limited in their applicability to specific agricultural environments, such as manually cultivated orchards. Generally speaking, the Inertial Measurement Unit (IMU) may initially exhibit high accuracy, but inherent measurement bias and noise result in cumulative error over prolonged periods. GNSS is capable of achieving centimeter-level accuracy in Real-Time Kinematic (RTK) mode. However, its accuracy is significantly compromised when operating under canopy cover from trees or buildings, degrading to the decimeter or even meter range. The associated decrease in signal strength renders GNSS data temporarily unreliable and unstable, especially in regions with dense vegetation or urban landscapes where signal obstructions are prevalent. LiDAR/camera-based odometry methods rely on the number and distribution of structured features. The absence of well-defined point, line, and surface features in agricultural settings, such as orchards, substantially undermines their robustness. Furthermore, camera-based methods are vulnerable to fluctuations in lighting conditions, which further limits their applicability in agricultural settings. Given the constraints of single-sensor systems, a multi-source sensor fusion framework has become the primary approach for addressing localization challenges in orchards.
Early research predominantly concentrated on mitigating the temporary decline in GNSS performance by integrating an IMU. Researchers have developed methodologies for real-time estimation of optimal state variables (e.g., position, velocity, and attitude) by integrating measurement data, noise characteristics, and dynamic models from both IMU and GNSS. This was accomplished through filtering algorithms such as Kalman filtering [3,4], complementary filtering [5], and Mahony filtering [6]. Other researchers have developed semi-globally exponentially stable observers for IMUs and global positioning sensors such as GNSS and UWB [7,8,9]. This approach not only addresses data reliability issues arising from GNSS signal loss or interference, but also effectively compensates for the cumulative errors associated with long-term IMU pre-integration. As a result, it achieves uninterrupted, precise, and dependable navigation. However, robots may encounter extended periods of GNSS instability when traversing rows of fruit trees. Relying exclusively on IMU integration in such situations could compromise the robot's pose estimation accuracy. Hence, it is imperative to find a solution that handles these transitions seamlessly in order to maintain effective robot localization in challenging environments.
The advancement of perceptual sensor technology, including LiDAR and cameras, has propelled robot odometry to a higher level. These sensors enable the perception of the external environment, providing a rich array of environmental features such as color, texture, and structure to estimate self-motion. Compared to cameras, LiDAR has the capability to directly capture three-dimensional structural information of the environment and is less susceptible to variations in lighting and weather conditions. Its long-range capabilities and high precision have attracted significant attention from researchers, making it a valuable tool for scientific investigation. LiDAR is commonly combined with IMU and GNSS to achieve precise global state estimation and localization free from drift. According to previous literature [1,10], odometry based on multi-sensor fusion techniques with LiDAR can be classified as loosely coupled and tightly coupled, depending on the sensor coupling method.
Loosely Coupled LiDAR-Based Odometry: This method operates on the principle that data from LiDAR and other sources are processed independently and then fused together. LOAM [11] and LeGO-LOAM [12] serve as typical examples. The pose derived from IMU measurements is utilized to correct distortions in the point cloud scan, while also providing initial motion estimates for subsequent scan registration. Extended Kalman Filtering (EKF) is a widely employed technique for loosely coupled LiDAR-based odometry. For example, adaptive Kalman filtering has been utilized to integrate GPS/INS/LiDAR data for robot pose estimation, thereby enhancing localization robustness [13]. Cartographer integrates GNSS, wheel odometry, LiDAR, and IMU through graph optimization [14]. Meanwhile, LOCUS [15] and LOCUS 2.0 [16] rely on LiDAR-based odometry and employ a straightforward rate check of the sensors as a health-metric-driven dynamic switching mechanism to achieve robust positioning under challenging perception conditions. Loosely coupled sensor integration, which fuses the individual measurements produced by multiple sensors, effectively addresses the limitations of single-sensor odometry by leveraging the unique strengths of each sensor. Such systems, however, do not fully exploit the potential synergy between sensors, leaving room for further optimization.
Tightly Coupled LiDAR-Based Odometry: In contrast to loosely coupled strategies, tightly coupled LiDAR-based odometry processes data from multiple sensors jointly within a unified framework. This approach improves the precision and robustness of robot pose estimation by exploiting the relationships between different sensors to optimize data fusion [17]. The strategy can be further divided into filter-based and optimization-based methods. Iterated extended Kalman filters (IEKF) are commonly employed in filter-based methods, as demonstrated in FAST-LIO [18] and FAST-LIO2 [19], to mitigate estimation errors. LIO-EKF [20], which utilizes point-to-point registration and the classical extended Kalman filter scheme, presents an adaptive data association method that requires no parameter adjustment. However, the fundamental principle of the EKF entails estimating the state of a nonlinear system through linearization, which introduces linear approximation errors and presents challenges for further development. Graph optimization methods aim to achieve global optimality by constructing and refining an error graph, offering global optimization, high precision, and strong flexibility. LIO-SAM [21] establishes a tightly coupled LiDAR-IMU framework with exceptional precision and feasibility. LIO-Vehicle [22] further enhances the capabilities of LIO by incorporating a vehicle dynamics model, expanding its applicability to vehicular scenarios. Graph-optimization-based methods, however, incur significant computational overhead and demonstrate suboptimal real-time performance.
The aforementioned methods are implemented in discrete time, potentially compromising accuracy. Consequently, there has been growing interest in continuous-time methods. Elastic LiDAR Fusion [23] optimizes continuous linear trajectories to correct scan distortions, while Wildcat [24,25] iteratively fits cubic B-splines to eliminate surface map distortions. Recently, CT-ICP [26] and ElasticLiDAR++ [27] have introduced continuous-time trajectories using pure LiDAR methods, parameterized by two poses per scan, which enables flexible scan registration during the optimization process. KISS-ICP [28], inspired by CT-ICP, relies on point-to-point ICP combined with adaptive thresholding for correspondence matching, a robust kernel, a simple but widely applicable motion compensation approach, and a point cloud subsampling strategy. These features give KISS-ICP considerable resilience in challenging environments. DLIO [29] introduced a novel coarse-to-fine approach for constructing continuous-time trajectories to achieve precise motion correction. The essence of DLIO lies in formulating a set of analytical equations parameterized solely by time, enabling fast and parallelizable point-wise deskewing. These methods heavily rely on the accuracy of scan-to-scan or scan-to-map matching. In unstructured scenarios, sparse features can cause point cloud misalignment, leading to the collapse of the odometry system.
Consequently, numerous studies incorporate GNSS data as an absolute observational constraint in the odometry system to enhance robustness and precision. Gao, Y. et al. [30] incorporated high-frequency pose estimates from GNSS/INS as prior information for scan matching in the LiDAR module. To further improve system stability and robustness, Chiang, K.W. et al. [31] proposed a semi-tightly coupled integration of GNSS, IMU, and grid-based LiDAR that fuses the position and attitude information from each sensor. To fully exploit the benefits of GNSS, some studies have integrated raw GNSS data with LiDAR-based odometry [32]. For instance, Li, S. et al. [33] employed the EKF to directly fuse GNSS phase and pseudorange measurements, MEMS-IMU data, sparse visual landmarks, and extracted LiDAR edge/planar features. LIO-Fusion [34] employed factor graph optimization to fuse LiDAR, IMUs, encoders, GNSS, and a priori information to achieve real-time robust odometry and prevent sensor failures.
All of the above methods are primarily applied in indoor or road-like structured scenes. In contrast, orchards are typically unstructured agricultural scenes characterized by sparse geometric features. These features exhibit high similarity and repetitive occurrence. Consequently, this poses a significant challenge for LiDAR-based odometry which heavily relies on point cloud registration. Moreover, agricultural environments introduce additional obstacles such as uneven terrain and potential interference from surrounding vegetation. These factors contribute to the intricacy of robust odometry measurements, necessitating sophisticated algorithms and sensor fusion technology for precise localization in such settings. Furthermore, the presence of dynamic elements, such as moving livestock or machinery, introduces an additional layer of complexity that must be carefully considered to achieve reliable odometry measurements in agricultural settings. Numerous scholars have conducted extensive research and obtained commendable outcomes. Nevertheless, when implemented in the context of this study, they may encounter setbacks due to the distinctive characteristics of the aforementioned environment [35,36,37].
In view of these challenges, we propose a tightly coupled LiDAR-based odometry (GLIO) framework that integrates multi-sensor information from GNSS, IMU, and LiDAR. It provides precise positioning and sparse mapping capabilities in agricultural environments. Building on DLIO [29] and KISS-ICP [28], we utilize GNSS as an absolute observational constraint to mitigate point cloud registration errors and the drift of the IMU-based nonlinear trajectory prediction. Furthermore, we introduce an adaptive threshold to improve the accuracy of point cloud registration. In summary, the contributions of this paper are as follows:
(1)
We employ a health check mechanism to filter GNSS data and integrate it with the IMU, thereby facilitating the robot's pose estimation in orchards characterized by sparse geometric features and bumpy terrain;
(2)
We propose a novel approach to enhance alignment stability in orchards by integrating IMU and GNSS information into scan-to-map registration using the point-to-point ICP algorithm, complemented with adaptive thresholding;
(3)
We use GNSS as absolute observational constraints to improve the robustness of LiDAR-based odometry, thereby compensating for the influence of initial values on the nonlinear estimator;
(4)
Extensive experiments conducted on various public datasets and our self-made orchard dataset demonstrate the robustness of our approach in maintaining precise and consistent odometry, even in challenging agricultural environments.

2. System Description

The spatial coordinates in our unified system are defined by incorporating data from multiple sensors and are based on the following elements:
Relative Sensor Coordinate: The measurements from each sensor are captured in its respective coordinate system, i.e., the relative sensor coordinates correspond to the frames in which each individual sensor is positioned. The relative sensor coordinates in our framework include the LiDAR coordinate $(\cdot)^L$, the IMU coordinate $(\cdot)^B$, and the robot body coordinate $(\cdot)^R$. The LiDAR and IMU coordinates coincide with the frames of the respective sensors. The robot body coordinate is positioned at its center of mass, where the x-axis points forward, the y-axis points to the left, and the z-axis points upward. To ensure accurate data integration, the coordinates of the IMU, LiDAR, and robot must be aligned using an extrinsic parameter matrix. This alignment guarantees that measurements from different sensors are correctly integrated within the unified system, allowing for precise and reliable navigation and localization.
Absolute Sensor Coordinate: The GNSS serves as the sole absolute sensor in our system, thereby rendering the GNSS coordinate the definitive reference for absolute position. The latitude and longitude obtained from GNSS are based on a global coordinate of the Earth. To ensure consistency with the other coordinates in our system, we employ the Gauss-Krüger projection to convert global coordinates into the ENU (East-North-Up) coordinate $(\cdot)^G$. In this coordinate, the x-axis is oriented towards the east, the y-axis points northward, and the z-axis points upward. This conversion facilitates seamless integration of GNSS data with the relative sensor coordinates, thereby establishing a unified framework for precise navigation and localization.
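For illustration, the following sketch converts GNSS latitude, longitude, and altitude into a local ENU frame anchored at a reference fix. It uses the standard WGS84-to-ECEF-to-ENU route rather than the Gauss-Krüger projection adopted in our pipeline, and all names are illustrative rather than taken from our implementation.

```python
import numpy as np

# WGS84 ellipsoid constants
_A = 6378137.0          # semi-major axis [m]
_E2 = 6.69437999014e-3  # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert a geodetic fix to Earth-Centered Earth-Fixed (ECEF) coordinates."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    n = _A / np.sqrt(1.0 - _E2 * np.sin(lat) ** 2)  # prime vertical radius of curvature
    x = (n + alt_m) * np.cos(lat) * np.cos(lon)
    y = (n + alt_m) * np.cos(lat) * np.sin(lon)
    z = (n * (1.0 - _E2) + alt_m) * np.sin(lat)
    return np.array([x, y, z])

def geodetic_to_enu(lat_deg, lon_deg, alt_m, ref_lat_deg, ref_lon_deg, ref_alt_m):
    """Express a GNSS fix as [east, north, up] relative to a reference fix."""
    dx = geodetic_to_ecef(lat_deg, lon_deg, alt_m) - geodetic_to_ecef(ref_lat_deg, ref_lon_deg, ref_alt_m)
    lat0, lon0 = np.radians(ref_lat_deg), np.radians(ref_lon_deg)
    # Rotation from ECEF to the local tangent (ENU) frame at the reference point
    r = np.array([
        [-np.sin(lon0),                 np.cos(lon0),                0.0],
        [-np.sin(lat0) * np.cos(lon0), -np.sin(lat0) * np.sin(lon0), np.cos(lat0)],
        [ np.cos(lat0) * np.cos(lon0),  np.cos(lat0) * np.sin(lon0), np.sin(lat0)],
    ])
    return r @ dx
```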
Local World Coordinate: The LiDAR-inertial system typically operates within the local world coordinate framework. In our system, the origin of the local world coordinate $(\cdot)^W$ can be arbitrarily fixed, with the z-axis commonly aligned with gravity. In this paper, we set the robot's initial position as the origin of the local world coordinate $(\cdot)^W$. The directional axes of this local coordinate are aligned with the robot body coordinate $(\cdot)^R$ to ensure consistency and facilitate sensor data integration. This alignment enables the local world coordinate to serve as a stable reference frame for the multi-sensor-fusion-based odometry, thereby enhancing the accuracy and reliability of robot localization.
The state of the robot at time k can be defined within our framework as the following:
$$\mathbf{X}_k = \left[\, \mathbf{p}_k^W,\ \mathbf{q}_k^W,\ \mathbf{v}_k^W,\ \mathbf{b}_k^a,\ \mathbf{b}_k^\omega,\ \mathbf{T}_W^G \,\right],$$
where $\mathbf{p}^W \in \mathbb{R}^3$ is the robot's position, $\mathbf{q}^W$ is the orientation quaternion represented in Hamilton notation, $\mathbf{v}^W \in \mathbb{R}^3$ is the robot's velocity, $\mathbf{T}_W^G$ denotes the transformation matrix from the GNSS coordinate $(\cdot)^G$ to the local world coordinate $(\cdot)^W$, $\mathbf{b}^a \in \mathbb{R}^3$ represents the accelerometer bias of the IMU, and $\mathbf{b}^\omega \in \mathbb{R}^3$ represents the gyroscope bias of the IMU.
Based on DLIO [29] and KISS-ICP [28], we leverage GNSS as an absolute observation to constrain the estimation bias of the nonlinear observer, thereby providing an initial transformation for scan-to-map registration. The framework consists of five parts: initialization, a GNSS data health check mechanism, point cloud deskewing and prior optimization, scan-to-map registration, and a nonlinear observer that estimates the robot's pose and constructs a 3D sparse map. The overall pipeline is illustrated in Figure 1. The entire process is implemented within the Robot Operating System (ROS), as summarized in pseudocode in Algorithm 1. Within ROS, GLIO comprises two nodes, odometry and map, represented by the hollow circle and solid circle in Figure 1. The map node primarily constructs a global map by utilizing keyframes and the corresponding pose information obtained from the odometry node. The odometry node is responsible for processing and integrating the LiDAR, IMU, and GNSS data to estimate the robot pose. The process is as follows: (i) GLIO employs three threads, one each for LiDAR, IMU, and GNSS, to handle the data obtained from the sensors and initialize them according to Algorithm 1. (ii) Upon receiving a new point cloud $\tilde{P}_k^R$, we utilize the IMU velocity integration $\hat{v}_k^W$ to estimate the relative pose transformation between points for motion distortion correction. Subsequently, by combining GNSS and IMU data, we provide the optimization prior $\hat{T}_k^W$ for registering the current cloud against the submap. (iii) The submap $\hat{S}_k^W$, generated by the keyframe detection $\hat{M}_k^W$ and submap generation module, is utilized for point cloud registration. During the scan-to-map phase, we compare the registered pose $T_k^W$ with the GNSS relative pose $T_k^G$ in the local world coordinate to assess the quality of the point cloud registration, and then feed the result into the nonlinear observer for pose estimation $\hat{X}_k^W$.
Algorithm 1. GNSS-LiDAR-Inertial Odometry.
Input: $\hat{X}_{k-1}^W$, $P_k^L$, $a_k^B$, $\omega_k^B$, $g_k^G$, $\sigma_g$
Output: $\hat{X}_i^W$, $\hat{M}_k^W$
// LiDAR callback thread
 1: while $P_k^L$ do
 2:   $\tilde{P}_k^R \leftarrow$ initializePointCloud($P_k^L$); (Section 2.1)
      // motion distortion correction
 3:   for $a_i^B$, $\omega_i^B$ between $t_{k-1}$ and $t_k$ do
 4:     $\hat{p}_i, \hat{v}_i, \hat{q}_i \leftarrow$ deskewing($\hat{X}_{k-1}^W$, $\hat{a}_{i-1}^B$, $\hat{\omega}_{i-1}^R$, $\hat{G}_i^W$); (Equation (6))
 5:     $\hat{T}_i^W = [\hat{R}(\hat{q}_i) \mid \hat{p}_i]$;
 6:   end for
 7:   for $p_k^n \in \tilde{P}_k^R$ do
 8:     $\hat{T}_n^{W*} \leftarrow$ continuousInt($\hat{T}_i^{W*}$, $t_n$); (Equation (7))
 9:     $\hat{p}_k^n = \hat{T}_n^{W*} p_k^n$; $\hat{P}_k^W$.append($\hat{p}_k^n$);
10:   end for
      // scan-to-map registration
11:   $\hat{S}_k^W \leftarrow$ generateSubmap($\hat{M}_k^W$);
12:   $\hat{T}_k^W \leftarrow$ point2pointRegistration($\hat{P}_k^W$, $\hat{S}_k^W$, $\hat{G}_i^W$); (Section 2.4)
      // state update
13:   $\hat{X}_k^W \leftarrow$ update($\hat{T}_k^W$, $\Delta t_k^+$, $\hat{G}_i^W$); (Section 2.5)
      // update keyframe map
14:   if $\hat{P}_k^W$ is a keyframe then $\hat{M}_k^W \leftarrow \hat{M}_{k-1}^W \cup \hat{P}_k^W$;
15:   return $\hat{X}_k^W$, $\hat{M}_k^W$
16: end while
// IMU callback thread
17: while $a_i^B$ and $\omega_i^B$ do
18:   $\hat{a}_i^R, \hat{\omega}_i^R \leftarrow$ initializeImu($a_i^B$, $\omega_i^B$); (Section 2.1)
      // state propagation
19:   $\hat{X}_i^W \leftarrow$ propagate($\hat{X}_k^W$, $\hat{a}_i^R$, $\hat{\omega}_i^R$, $\Delta t_i^+$, $\hat{G}_i^W$); (Section 2.5)
20:   return $\hat{X}_i^W$
21: end while
// GNSS callback thread
22: while $g_i^G$ do
23:   $\hat{g}_i^W \leftarrow$ initializeGNSS($g_i^G$); (Section 2.1)
24:   $\hat{G}_i^W \leftarrow$ filterHealthyGNSS($\hat{g}_i^W$, $\Delta t_i^+$, $\sigma_g$); (Section 2.2)
25:   return $\hat{G}_i^W$
26: end while

2.1. Initialization

The inputs of our pipeline consist of three main components: 3D point cloud, IMU, and GNSS data, as depicted in Figure 1. Note that all sensors should undergo extrinsic calibration to ensure accurate and reliable localization performance within the robot body coordinate $(\cdot)^R$. Initialization plays a crucial role in preparing the data for input into the odometry module. The subsequent steps outline the preprocessing procedures:
(1)
Point Cloud Preprocessing. The dense 3D point cloud is collected by a mechanical LiDAR, such as an RS-LiDAR or Velodyne (10 Hz~20 Hz). To minimize the loss of information, we employ a 1 m³ box filter at the origin of the point cloud to remove points that potentially originate from the robot itself.
(2)
IMU initial processing: The IMU captures the robot's linear acceleration and angular velocity at a frequency ranging from 100 Hz to 500 Hz. The mathematical model of the measurements $\hat{a}$ and $\hat{\omega}$ obtained from the IMU [29] is as follows:
$$\hat{a}_i = (a_i - g) + b_i^a + n_i^a$$
$$\hat{\omega}_i = \omega_i + b_i^\omega + n_i^\omega$$
where the index $i = 1, \dots, M$ denotes the M measurements within the time stamps $t_{k-1}$ and $t_k$, $b_i$ denotes the sensor bias, $n_i$ represents the sensor white noise, and $g$ is the rotational gravity vector. In a stationary state, the IMU continues to generate output values, which significantly impact the accuracy of the sensor and the GLIO system. Hence, online estimation and compensation of IMU bias errors are pivotal preprocessing steps. Online estimation of the IMU bias essentially entails averaging the data obtained by the sensor during a 3 s static period and subsequently subtracting this average from the subsequent IMU data to rectify the sensor data (a minimal sketch is given after this list).
In accordance with the principles outlined in the literature [29,38], it is imperative to align the IMU coordinate with the robot body, ensuring coinciding origins and consistent directional alignment. If the sensor does not align with the center of mass, the displacement effect on the linear acceleration measurements of the rigid body must be considered. Therefore, we account for all contributions to the linear acceleration at the robot body coordinate $(\cdot)^R$ by considering the product of the angular velocity and the IMU offset.
(3)
GNSS Status Check and Online Transformation Correction. When the number of received satellites is less than four, the sensor state is unreliable, and such data are discarded. The GNSS position $p_{k-1}^G$ at time k − 1 is represented as a vector comprising longitude, latitude, and altitude measurements. The orientation $R_{k-1}^G$ of the GNSS is determined by the IMU. Thus, the GNSS transform matrix can be defined as the following:
$$T_{k-1}^G = \left[\, R_{k-1}^G \mid t_{k-1}^G \,\right]$$
where $t_{k-1}^G = p_{k-1}^G - p_0^G$ is the translation vector between the origin of the local world coordinate $(\cdot)^W$ and the GNSS coordinate $(\cdot)^G$, and $p_0^G$ is the GNSS point aligned with the local world origin. The initial transformation between the GNSS and the local coordinate is denoted $T_{W_0}^G$. Subsequently, the transformation matrix at time k − 1 is as follows:
$$T_{W_{k-1}}^G = T_{W_0}^G\, T_{k-1}^G$$
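As referenced in step (2) above, the sketch below averages the IMU readings collected during the 3 s stationary window and subtracts the result from subsequent measurements. The function names, the gravity constant, and the assumption of a level, z-up mounting are illustrative and not taken from the released code.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, 9.80665])  # assumed gravity in the body frame when static and level

def estimate_static_bias(accel_samples, gyro_samples, remove_gravity=True):
    """Average (N, 3) accelerometer and gyroscope samples from the 3 s static window.

    Whether gravity must be removed from the accelerometer average depends on the
    sign convention of the measurement model; a level, z-up mounting is assumed here.
    """
    accel_bias = accel_samples.mean(axis=0)
    if remove_gravity:
        accel_bias = accel_bias - GRAVITY
    gyro_bias = gyro_samples.mean(axis=0)
    return accel_bias, gyro_bias

def correct_imu(accel, gyro, accel_bias, gyro_bias):
    """Subtract the estimated biases from subsequent raw IMU measurements."""
    return accel - accel_bias, gyro - gyro_bias
```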

2.2. GNSS Health Check Mechanism

Traditional LiDAR-inertial odometry (LIO) relies solely on scan-to-scan registration to correct the accumulated IMU error. However, the point cloud matching error is relatively large in orchards, which affects the final robot localization results. To mitigate this effect, the GLIO system incorporates GNSS measurements. Owing to canopy interference, however, GNSS measurements may be erroneous even when the receiver reports a valid signal, so signal noise alone cannot serve as the criterion for the health metric. To address this issue, we propose a distance-based health check mechanism.
The robot's motion can be regarded as continuous, and its positional changes are differentiable. Therefore, the predicted position $\hat{p}_{R_k}^W$ at timestamp k is used to evaluate the GNSS position $p_k^G$ as follows:
$$(\sigma_k^G)^2 = \left\| T_{W_{k-1}}^G\, p_k^G - \hat{p}_{R_k}^W \right\|^2, \qquad 0 < (\sigma_k^G)^2 < (d^G)^2$$
where, the distance confidence threshold d G is set to 5 m, aligning with the distance used for selecting point cloud keyframes, to optimize computational resources. By setting the distance confidence threshold, we can ensure that only positions within this range are deemed reliable, providing a practical measure for assessing the accuracy of GNSS data. This approach effectively harnesses the robot’s capabilities while maintaining a high level of confidence in its positioning information.
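A minimal sketch of this distance-based check is given below (identifiers are illustrative): a GNSS fix, transformed into the local world frame, is accepted only if it lies within the 5 m confidence radius of the position predicted by the observer.

```python
import numpy as np

D_G = 5.0  # distance confidence threshold [m], matching the keyframe selection distance

def gnss_is_healthy(T_w_gnss, p_gnss, p_predicted, d_threshold=D_G):
    """Accept a GNSS fix only if it agrees with the predicted robot position.

    T_w_gnss    : 4x4 homogeneous transform from the GNSS (ENU) frame to the local world frame
    p_gnss      : GNSS position in the ENU frame, shape (3,)
    p_predicted : observer-predicted robot position in the world frame, shape (3,)
    """
    p_gnss_w = (T_w_gnss @ np.append(p_gnss, 1.0))[:3]  # express the fix in the world frame
    sigma_sq = float(np.sum((p_gnss_w - p_predicted) ** 2))
    return 0.0 < sigma_sq < d_threshold ** 2
```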

2.3. Point Cloud De-Skewing and Optimization Prior

The orchard scene exhibits sparsely distributed geometric features characterized by high repetitiveness. In such scenarios, successful registration of the point clouds requires an accurate initial value; otherwise, failure is likely to occur, affecting the outcome of the nonlinear observer (as discussed in Section 2.5). Furthermore, the continuous movement of the robot during data acquisition introduces motion distortion into the collected point clouds, which degrades the registration result. To tackle this concern, we leverage IMU data to compute intra-frame point-wise continuous pose information.
Assume that $t_k$ is the time at which the point cloud $P_k^R$ is received and that N points have been accumulated during this period. Let $t_k + t_k^n$ be the timestamp of a specific point $p_k^n$ in the cloud. To estimate the position of each point in the local world coordinate W, we numerically integrate the IMU measurements between $t_{k-1}$ and $t_k + t_k^N$ as follows:
$$\hat{p}_i = \hat{p}_{i-1} + \hat{v}_{i-1}\Delta t_i + \tfrac{1}{2}\hat{R}(\hat{q}_{i-1})\hat{a}_{i-1}\Delta t_i^2 + \tfrac{1}{6}\hat{j}_i\Delta t_i^3,$$
$$\hat{v}_i = \hat{v}_{i-1} + \hat{R}(\hat{q}_{i-1})\hat{a}_{i-1}\Delta t_i,$$
$$\hat{q}_i = \hat{q}_{i-1} + \tfrac{1}{2}(\hat{q}_{i-1} \otimes \hat{\omega}_{i-1})\Delta t_i + \tfrac{1}{4}(\hat{q}_{i-1} \otimes \hat{\alpha}_i)\Delta t_i^2,$$
where the index $i = 1, \dots, M$ denotes the M IMU measurements between the timestamps of two consecutive scan frames, and $\hat{j}_i = \tfrac{1}{\Delta t_i}\left(\hat{R}(\hat{q}_i)\hat{a}_i - \hat{R}(\hat{q}_{i-1})\hat{a}_{i-1}\right)$ and $\hat{\alpha}_i = \tfrac{1}{\Delta t_i}\left(\hat{\omega}_i - \hat{\omega}_{i-1}\right)$ are the rate of change of the linear acceleration and the angular acceleration of the IMU, respectively.
Furthermore, we add GNSS absolute state constraints to correct the long-term accumulated error of the IMU. Thus, Equation (6) can be reformulated as follows:
$$\hat{p}_i = \hat{p}_{i-1} + \hat{v}_{i-1}\Delta t_i + \tfrac{1}{2}\hat{R}(\hat{q}_{i-1})\hat{a}_{i-1}\Delta t_i^2 + \tfrac{1}{6}\hat{j}_i\Delta t_i^3 + \kappa\,(p_i^G - \hat{p}_i),$$
$$\hat{v}_i = \hat{v}_{i-1} + \hat{R}(\hat{q}_{i-1})\hat{a}_{i-1}\Delta t_i + \kappa\,(v_i^G - \hat{v}_i),$$
$$\hat{q}_i = \hat{q}_{i-1} + \tfrac{1}{2}(\hat{q}_{i-1} \otimes \hat{\omega}_{i-1})\Delta t_i + \tfrac{1}{4}(\hat{q}_{i-1} \otimes \hat{\alpha}_i)\Delta t_i^2,$$
where $\kappa$ is a constant: if the timestamps of the GNSS and IMU measurements are synchronized and the GNSS data are deemed healthy, $\kappa$ equals 1; otherwise it equals 0. $p_i^G$ and $v_i^G$ are the position and velocity derived from the GNSS measurements in the local coordinate W, respectively. The vehicle's movement speed $v_i^G$ is calculated as follows:
$$v_i^G = \frac{p_i^G - p_{i-1}^G}{\Delta t_i}$$
Therefore, the set of homogeneous transformations $\hat{T}_i^W$ that correspond to $\hat{p}_i$ and $\hat{q}_i$ subsequently defines the coarse discrete-time trajectory during a sweep, as shown in Figure 2. Consequently, this process yields a motion-corrected point cloud that is also approximately aligned with the map in W, which inherently incorporates the optimization prior utilized for scan-to-map registration.
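The sketch below shows one step of the numerical integration in Equation (7). Quaternions are in (w, x, y, z) Hamilton order, the GNSS correction is applied only when a synchronized, healthy fix is available (κ = 1), and all names are illustrative assumptions rather than our released implementation.

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_to_rot(q):
    """Rotation matrix of a unit quaternion (w, x, y, z)."""
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([[1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
                     [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
                     [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def propagate_step(p, v, q, a_prev, w_prev, jerk, alpha, dt,
                   p_gnss=None, v_gnss=None, kappa=0.0):
    """One integration step of Equation (7): IMU prediction plus optional GNSS pull-in."""
    R = quat_to_rot(q)
    p_new = p + v * dt + 0.5 * (R @ a_prev) * dt**2 + (1.0 / 6.0) * jerk * dt**3
    v_new = v + (R @ a_prev) * dt
    # attitude update using angular velocity and angular acceleration as pure quaternions
    q_new = (q + 0.5 * quat_mult(q, np.concatenate(([0.0], w_prev))) * dt
               + 0.25 * quat_mult(q, np.concatenate(([0.0], alpha))) * dt**2)
    q_new /= np.linalg.norm(q_new)
    if kappa and p_gnss is not None and v_gnss is not None:
        # a healthy, time-synchronized GNSS fix pulls the prediction towards the absolute measurement
        p_new = p_new + kappa * (p_gnss - p_new)
        v_new = v_new + kappa * (v_gnss - v_new)
    return p_new, v_new, q_new
```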

2.4. Scan-to-Map Registration

Scan-to-map registration is the key step in the GLIO framework, as its outcome significantly affects the accuracy of robot state estimation. Generally, point clouds are matched through feature association, wherein features are extracted from the point cloud using a feature descriptor. While this approach has proved effective in urban environments, challenges arise when it is applied in orchards with sparse and highly repetitive features. To avoid a complicated parameter tuning process, we employ a direct point cloud registration algorithm to align the current scan with the submap.
By simultaneously correcting for motion distortion and incorporating the point-to-point registration optimization prior into the point cloud, GLIO can directly perform scan-to-map registration and bypass the scan-to-scan procedure required in previous methods. When receiving a new scan, we first employ the IMU-predicted state to deskew the point cloud and transform it into the local world coordinate as $\hat{P}_k^W$. Then, we adopt the subsampling strategy proposed in KISS-ICP to reduce the point density within the scan and generate a local map $\hat{S}_k^W$ using a voxel grid. The correspondences between the current point cloud $\hat{P}_k^W$ and the local map $\hat{S}_k^W$ are computed by employing voxel hashing for the nearest neighbor search, resulting in a set of correspondences as follows:
$$\mathcal{C} = \left\{ (i, j) : \min_{i,j} \left\| p_i - s_j \right\|_2 < \tau \right\}$$
where $p_i$ is the i-th point in the down-sampled point cloud, $s_j$ is the j-th point in the submap, and $\tau$ is the maximum distance allowed between the corresponding points $p_i$ and $s_j$. Therefore, we can obtain the transformation $\Delta\hat{T}_k$ between the current scan $\hat{P}_k^W$ and the submap $\hat{S}_k^W$ through Equation (10):
$$\Delta\hat{T}_k = \underset{\Delta T_k}{\arg\min}\ \varepsilon\left(\Delta T_k\, \hat{P}_k^W,\ \hat{S}_k^W\right)$$
where the global transformation $\hat{T}_k^W = \Delta\hat{T}_k\, \hat{T}_M^W$ can then be acquired to construct the global map and serves as the update signal for the nonlinear geometric observer. Here, $\hat{T}_M^W$ is the pose obtained by integrating the IMU and GNSS measurements up to the last point, and the index M refers to the sparse local map from the previous iteration.
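For clarity, a minimal point-to-point alignment step is sketched below. A KD-tree search stands in for the voxel-hashing correspondence lookup used in practice, and the identifiers are illustrative: correspondences closer than τ are kept (Equation (9)) and the rigid transform of Equation (10) is solved in closed form via SVD.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_step(scan_pts, submap_pts, tau):
    """One point-to-point ICP iteration: correspondences within tau, then a closed-form rigid fit."""
    tree = cKDTree(submap_pts)
    dists, idx = tree.query(scan_pts)  # nearest submap point for every scan point
    mask = dists < tau                 # reject correspondences beyond the adaptive threshold
    src, dst = scan_pts[mask], submap_pts[idx[mask]]

    # Closed-form least-squares rigid transform (Kabsch/Umeyama)
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t, int(mask.sum())
```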
The effect of outlier correspondences in Equation (9) is mitigated by the maximum correspondence distance threshold $\tau$. In KISS-ICP, the threshold $\tau$ is determined dynamically from the relative pose uncertainty and the sensor noise. In our pipeline, the robot's pose change between the current scan and the submap is calculated through accumulated IMU/GNSS integration. Consequently, we can bound the potential mismatch by analyzing the angle by which the robot may deviate from the motion prediction over time. The point-to-point distance $\delta_p$ resulting from the robot's relative motion can be formulated as the following:
$$\delta_p = d(R, t) = 2\, r_{\max} \sin\!\left(\tfrac{1}{2}\Theta(R)\right) + \left\| \Delta t \right\|_2$$
where $\Theta(R)$ represents the angle corresponding to the rotation matrix derived from the IMU, and $r_{\max}$ denotes the maximum range of the LiDAR.
Furthermore, we also account for sensor noise in our analysis. Given that the calculation between two scan frames already incorporates the absolute error from the GNSS, we focus solely on the impact of noise on the LiDAR range accuracy, denoted as $\delta_{range}^2$. Assuming the two terms are independent, the threshold $\tau$ can be expressed as the following:
$$\tau = \sqrt{\delta_p^2 + \delta_{range}^2}$$
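A compact sketch of this adaptive threshold follows; the rotation angle is extracted from the IMU/GNSS-predicted relative rotation, and the range-noise constant is an assumed placeholder value, not a calibrated figure from our sensors.

```python
import numpy as np

DELTA_RANGE = 0.05  # assumed LiDAR range noise [m]; sensor dependent

def adaptive_threshold(R_rel, t_rel, r_max, delta_range=DELTA_RANGE):
    """Maximum correspondence distance tau from the predicted relative motion.

    R_rel : predicted relative rotation between scan and submap, 3x3
    t_rel : predicted relative translation, shape (3,)
    r_max : maximum LiDAR range [m]
    """
    # rotation angle of R_rel recovered from its trace
    cos_theta = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    # point displacement induced by the relative motion
    delta_p = 2.0 * r_max * np.sin(0.5 * theta) + np.linalg.norm(t_rel)
    # combine motion and range-noise terms assuming independence
    return np.sqrt(delta_p**2 + delta_range**2)
```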

2.5. Nonlinear Observer

The scan-to-map registration transformation $\hat{T}_k^W$, as described in DLIO, is integrated with IMU measurements to generate the global state estimate $\hat{X}_k$ through a hierarchical nonlinear geometric observer. It has been proven by construction that the robot state estimate $\hat{X}$ globally converges to X in the deterministic setting with minimal computation. Following the IMU integration principle, the nonlinear observer adopts a hierarchical structure to ensure convergence of the estimates towards their true values; it therefore comprises two branches, translation and orientation. It is well known from the literature [29,38] that the nonlinear observer is an open-loop system whose input values directly affect its accuracy. Therefore, accurate inputs to the observer are imperative when operating in orchards. In the GLIO system, we use the GNSS measurements and the scan-to-map alignment result as the input of the nonlinear observer, as shown in Figure 1.
Let $\gamma_l$, $l \in \{1, \dots, 5\}$, be positive constants and $\Delta t_k^+$ be the time between consecutive scan-to-map registration poses. If $q_e := (q_e^o, \vec{q}_e) = \hat{q}_i^{-1} \otimes \hat{q}_k$ and $p_e = \hat{p}_k - \hat{p}_i$ are the errors between the propagated and measured poses, then the state correction takes the following form:
$$\hat{q}_i \leftarrow \hat{q}_i + \Delta t_k^+ \gamma_1\, \hat{q}_i \otimes \begin{bmatrix} 1 - |q_e^o| \\ \operatorname{sgn}(q_e^o)\, \vec{q}_e \end{bmatrix},$$
$$\hat{b}_i^\omega \leftarrow \hat{b}_i^\omega - \Delta t_k^+ \gamma_2\, q_e^o\, \vec{q}_e,$$
$$\hat{p}_i \leftarrow \hat{p}_i + \Delta t_k^+ \gamma_3\, p_e,$$
$$\hat{v}_i \leftarrow \hat{v}_i + \Delta t_k^+ \gamma_4\, p_e,$$
$$\hat{b}_i^a \leftarrow \hat{b}_i^a - \Delta t_k^+ \gamma_5\, \hat{R}(\hat{q}_i)^T p_e,$$
where we note that Equation (13) is hierarchical, as the attitude update (first two equations) is completely decoupled from the translation update (last three equations). Meanwhile, we use the GNSS position difference during the time period $\Delta t_k^+$ to evaluate the point cloud matching accuracy. If the translation distance of the scan-to-map registration result is within a threshold $\rho$ of the translation distance computed from GNSS, it is used as the input to the nonlinear observer; otherwise, we input the translation vector obtained from the GNSS relative distance together with the rotation matrix of the registration result into the nonlinear estimator module. The threshold $\rho$ in our pipeline is set to 0.5. At the same time, Equation (13) is a fully nonlinear update, which guarantees that the state estimates are accurate enough to directly perform scan-to-map registration with a GNSS/IMU prior, without the need for a scan-to-scan step.
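A sketch of the hierarchical correction in Equation (13) is given below (quaternions in (w, x, y, z) Hamilton order; the helper functions mirror those in the integration sketch of Section 2.3, and the names and update ordering are illustrative). The attitude branch corrects the orientation and gyroscope bias, and the translation branch corrects the position, velocity, and accelerometer bias.

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_to_rot(q):
    """Rotation matrix of a unit quaternion (w, x, y, z)."""
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([[1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
                     [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
                     [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def observer_update(q, p, v, b_w, b_a, q_meas, p_meas, dt_plus, gammas):
    """Hierarchical state correction of Equation (13); gammas are the five positive gains."""
    g1, g2, g3, g4, g5 = gammas
    q_conj = np.array([q[0], -q[1], -q[2], -q[3]])
    q_e = quat_mult(q_conj, q_meas)  # error quaternion between propagated and measured attitude
    qeo, qev = q_e[0], q_e[1:]
    p_e = p_meas - p

    # attitude branch: orientation and gyroscope bias
    corr = np.concatenate(([1.0 - abs(qeo)], np.sign(qeo) * qev))
    q_new = q + dt_plus * g1 * quat_mult(q, corr)
    q_new /= np.linalg.norm(q_new)
    b_w_new = b_w - dt_plus * g2 * qeo * qev

    # translation branch: position, velocity, accelerometer bias
    p_new = p + dt_plus * g3 * p_e
    v_new = v + dt_plus * g4 * p_e
    b_a_new = b_a - dt_plus * g5 * quat_to_rot(q).T @ p_e
    return q_new, p_new, v_new, b_w_new, b_a_new
```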

3. Experimental Results

This work proposes the GLIO system, which fuses GNSS/IMU/LiDAR measurements to enable high-frequency and accurate pose estimation in various environments without parameter adjustment. In this section, we present real-world experimental evaluations that support our main contributions, namely that the GLIO system (1) is comparable to state-of-the-art LO in terms of estimation accuracy, and (2) can provide accurate pose estimation in outdoor environments characterized by frequently recurring features.

3.1. Experiment Setup

We utilized extensive datasets and common evaluation methods to assess the performance of our approach [39]. Since this work focuses on robot localization in unstructured agricultural environments, it is worth noting that existing public datasets primarily cover structured scenarios, particularly urban environments. Consequently, there is a dearth of datasets specifically designed for unstructured scenes. To address this gap, we developed OrchardSet, a self-made orchard dataset, to evaluate our framework against state-of-the-art LO methods. Additionally, we performed evaluations on public datasets, namely urbanNav [40], M2DGR [41], and NCLT [42], to assess the performance of our approach across diverse scenarios.
OrchardSet: We utilized the robotic platform depicted in Figure 3a to collect data within a manually managed orchard. The platform was equipped with a LiDAR (RS Ruby Plus), an INS (INS-D_E1), and an IMU (N100). Among them, the INS integrates GNSS and IMU information to estimate the robot position and attitude; we therefore used the GNSS measurements from the INS as the absolute constraints in GLIO. The extrinsic calibration parameters between these sensors are presented in Table 1. The robot adopted a straight-line cruising strategy to collect data in the orchard, as depicted by the yellow line in Figure 3b. The data collected by the INS served as ground truth for subsequent comparisons on the orchard dataset, owing to potential errors in the GNSS data caused by obstructions from the canopy.
NCLT [42]: A large-scale dataset collected with a wheeled robot on the north campus of the University of Michigan. It contains various challenging elements, such as dynamic obstacles (e.g., pedestrians, bicycles, and cars), changing viewpoints, and varying weather conditions. This dataset serves as a valuable resource for investigating long-term navigation, localization, and mapping in demanding environments. In this paper, we selected the 2012-06-15 sequence for testing.
urbanNav [40]: An open-source multi-sensor dataset specifically designed for benchmarking localization algorithms in urban environments. It primarily focuses on the challenges of localizing in urban canyons, where dense high-rise buildings cause significant degradation of the GNSS signal. In this paper, we selected data recorded in the Hong Kong area for testing.
M2DGR [41]: A multi-scenario dataset specifically designed for ground robotic SLAM research. It introduces significant challenges to existing localization solutions by capturing the robot's long-term trajectories in diverse indoor and outdoor environments. For our study, we primarily utilized the outdoor scenes of this dataset because they include GNSS data. The trajectories of the public datasets described above are illustrated in Figure 4.

3.2. Performance on OrchardSet

In this section, we present the evaluation results of various odometry systems on our self-made orchard dataset. Owing to the limited availability of open-source GNSS/LiDAR/IMU fusion odometry implementations, a direct comparison with such methods is not possible; therefore, LO and LIO systems were selected as the benchmark methods in our experiments. However, considering the artificial planting pattern of the orchard, where rows exhibit significant similarity, LO or LIO systems relying on scan-to-scan registration may introduce notable errors when the robot transitions between rows. To ensure fairness, we carried out two sets of comparative experiments:
(1)
We first evaluated the performance on individual fruit rows. We selected the fruit rows containing the start and end points of the trajectory: the row containing the start point was labeled 01, while the row containing the end point was labeled 02;
(2)
We also assessed the odometry performance for complete rows of orchards, specifically focusing on trajectory 03.
The KITTI metric [43], computed by evo [39], was employed as the evaluation criterion in these experiments. The absolute trajectory error (ATE) and relative pose error (RPE) were utilized to evaluate odometry performance on the various datasets and can be calculated using the following equations:
$$ATE = \frac{1}{N}\sum_{t=1}^{N}\left\| p_t^{est} - p_t^{gt} \right\|_2$$
$$RPE_{trans}(t, \Delta t) = \left\| p_{t+\Delta t}^{est} - p_{t+\Delta t}^{gt} \right\|_2$$
$$RPE_{rot}(t, \Delta t) = \arccos\!\left(\frac{\operatorname{trace}\!\left(R_{t+\Delta t}^{gt}\, (R_{t+\Delta t}^{est})^{T}\right) - 1}{2}\right)$$
where $p^{est}$ and $p^{gt}$ represent the estimated position and the ground truth, respectively, and $R^{est}$ and $R^{gt}$ represent the estimated and ground-truth rotation matrices. The index t represents the timestamp, and $t + \Delta t$ represents the time interval between adjacent frames.
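For reference, the translation and rotation metrics defined above can be computed directly from time-aligned trajectories, as in the short sketch below (arrays are assumed to be already associated by timestamp; names are illustrative rather than taken from evo):

```python
import numpy as np

def ate_translation(p_est, p_gt):
    """Mean Euclidean distance between estimated and ground-truth positions, shapes (N, 3)."""
    return float(np.mean(np.linalg.norm(p_est - p_gt, axis=1)))

def rpe_rotation(R_est, R_gt):
    """Per-frame rotational error (radians) from the trace identity, inputs shaped (N, 3, 3)."""
    errors = []
    for Re, Rg in zip(R_est, R_gt):
        cos_angle = np.clip((np.trace(Rg @ Re.T) - 1.0) / 2.0, -1.0, 1.0)
        errors.append(np.arccos(cos_angle))
    return np.array(errors)
```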
The experimental results are shown in Table 2 and Figure 5, while the global map of the orchard is presented in Figure 6. The results show that DLIO and KISS-ICP exhibited similar performance on our self-made dataset. Their trajectories drift far from the ground truth due to scan-to-scan alignment errors accumulated over time; the maximum error can reach one fruit-row spacing (3.5 m), as depicted in Figure 5c. GLIO corrected these point cloud registration errors by integrating GNSS measurements, thereby improving localization accuracy. However, since GNSS only provides position, the quaternion input of the nonlinear observer still relies on the scan-to-map registration results. Consequently, the system remains susceptible to potential point cloud alignment errors: the rotational metrics evaluated in Table 2 are comparable with those of the other methods, but the robot motion trajectories still show deviations from the ground truth.

3.3. Comparison to State-of-the-Art Systems on Public Datasets

The performance of the proposed method was further analyzed across multiple datasets, scenarios, and devices. Given the reliance on GNSS information in our approach, we employed the urbanNav, the NCLT, and M2DGR datasets for performance evaluation. In this experiment, we compared our method with the state-of-the-art odometry systems such as DLO, DLIO, KISS-ICP, and LIO-EKF. It should be noted that unless explicitly stated otherwise, all experiments utilized the default parameters of each algorithm as established during implementation, with the exception of the extrinsic parameters between sensors. The results and trajectory are presented in Table 3, Figure 7 and Figure 8.
For the urbanNav dataset, we selected the sequence named 2020-03-14-16-45-35.bag, which contains two loops. Since KISS-ICP and LIO-EKF employ submaps for scan-to-map registration, the transformation adjusts dynamically with the robot motion. Consequently, when the system operates over an extended period of time, significant errors in the height trajectory are observed due to cumulative errors in point cloud registration (Figure 5a). Despite utilizing a tight coupling strategy between the IMU and LiDAR, LIO-EKF fails to eliminate this error. To mitigate the altitude deviation (z-axis) in our system, we incorporated GNSS measurements. Although our method still exhibits attitude errors caused by cumulative point cloud alignment errors, it outperforms the alternative approaches.
We used the 2012-06-15 sequence of the NCLT dataset to evaluate the various odometry methods. It can be observed from Table 3 that these methods exhibit similar absolute rotation errors while displaying significant disparities in absolute translation errors. Moreover, because at least 50% of the features in the scene where the sequence was recorded are unstructured, the initialization stage of the point cloud encountered mismatching issues. Therefore, we incorporated GNSS information on top of DLIO to improve its stability in both the point cloud alignment and the initialization stage, thereby yielding superior outcomes compared to DLIO.

3.4. Computation Efficiency

The computational speed of our method and the SOTA approaches on the various datasets is analyzed in this section. The average processing time per scan served as the evaluation metric; it refers to the interval from the reception of a new scan to the completion of the entire point cloud processing. In GLIO, the point cloud processing consists of sensor initialization, point cloud de-skewing, scan-to-map registration, and the state update. We compared the average scan processing time with other SOTA methods, including DLO, DLIO, KISS-ICP, and LIO-EKF. All methods were run on a NUC machine with an Intel i7-10700 CPU at 2.90 GHz and 32 GB of RAM. The results are shown in Table 4.
We observed significant variations in the results obtained from different datasets, as well as between different methods on the same dataset. The lack of IMU-based point cloud deskewing in DLO led to lower computational efficiency across all datasets (except for the orchard dataset). KISS-ICP is a purely LiDAR-based odometry method that assumes uniform sensor motion for point cloud deskewing, so less time is required for point cloud distortion correction and the overall processing duration is reduced. On the other hand, LIO-EKF and DLIO utilize IMU measurements for point cloud deskewing. LIO-EKF is faster owing to its use of a traditional EKF without multiple iterations, but this sacrifices accuracy. Both DLIO and our approach adopt a coarse-to-fine estimation of the point state, which provides higher computational efficiency than the others. We synchronized the data from the three sensors within the point cloud processing time by incorporating the GNSS information. Additionally, owing to the unstructured nature of the orchard environment and the M2DGR dataset, odometry on these data required relatively more computation time. It should be noted that DLIO's average processing time on the M2DGR dataset was poor due to an initialization failure; we report this result solely for completeness.

4. Discussion

The GLIO system, which integrates the strengths of diverse sensors while compensating for their respective limitations, is proposed to facilitate subsequent robot localization and navigation in agricultural scenarios. After thorough evaluation, we conducted a comparative analysis of GLIO’s performance in orchard environments and other public datasets against state-of-the-art methods. The findings indicate that GLIO exhibits promising feasibility in orchard settings. However, its effectiveness diminishes when applied to the urbanNav dataset containing loops. Furthermore, our evaluation solely focused on assessing the translational aspect of scan-to-map registration results while neglecting the rotational component. Consequently, operational failures may arise if there are any issues with the rotation matrix. In the future, we intend to undertake comprehensive research on those directions.

5. Conclusions

In this paper, we proposed a GNSS/LiDAR/IMU fusion framework based on DLIO and KISS-ICP, which enables resource-constrained mobile robots to achieve accurate real-time state estimation and generate sparse maps, thereby enhancing their robustness in unstructured environments. The proposed framework comprises initialization, point cloud de-skewing and prior optimization, scan-to-map registration, and a nonlinear observer. The nonlinear observer, being an open-loop system reliant on the accuracy of its input data, is the key component of the GLIO system; it is directly affected by the alignment precision of the scan-to-map stage, which consequently impacts the localization accuracy. To address this issue, we improved the scan-to-map registration pipeline and proposed an adaptive data association method. Meanwhile, considering that moving dynamic obstacles can make a stationary robot appear to move, we fused GNSS information to improve the robot's state update. In addition, we added a GNSS health check mechanism to improve the accuracy of the whole system. Finally, we designed experiments to evaluate our system's ability to localize accurately in complex agricultural settings. The experiments conducted on both our own orchard dataset and the public datasets demonstrated that the proposed GLIO method exhibits performance comparable to other methods. On the orchard dataset, GLIO demonstrated global consistency with an average relative translation error of 0.037 m and an absolute translation error of 0.068 m.

Author Contributions

Conceptualization, N.S., Q.Q. and C.Z.; methodology, N.S., Q.Q., T.L. and C.Z.; software, N.S. and M.R.; validation, N.S., Q.Q. and C.Z.; formal analysis, Q.Q.; investigation, N.S., Q.Q. and C.Z.; resources, Q.Q., Q.F. and C.Z.; writing—original draft preparation, N.S.; writing—review and editing, N.S., M.R., Q.Q., T.L., C.J., Q.F. and C.Z.; visualization, N.S.; supervision, Q.Q., T.L. and C.Z.; project administration, T.L., C.J., Q.F. and C.Z.; funding acquisition Q.F., T.L., C.J. and C.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Science and Technology Cooperation Project of Xinjiang Production and Construction Crops, grant number 2022BC007, Key Laboratory of Modern Agricultural Intelligent Equipment in South China, grant number HNZJ202206, Youth Research Foundation of Beijing Academy of Agriculture and Forestry Sciences, grant number QNJJ202318, Beijing Nova Program, grant number 20220484023, and BAAFS Innovation Capacity Building Project, grant number KJCX20210414.

Data Availability Statement

The data that support the findings of this study are available upon request from the corresponding author. The data are not publicly available due to privacy or ethical restrictions.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zhu, J.; Li, H.; Zhang, T. Camera, LiDAR, and IMU Based Multi-Sensor Fusion SLAM: A Survey. Tsinghua Sci. Technol. 2024, 29, 415–429. [Google Scholar] [CrossRef]
  2. Yin, H.; Xu, X.; Lu, S.; Chen, X.; Xiong, R.; Shen, S.; Stachniss, C.; Wang, Y. A Survey on Global LiDAR Localization: Challenges, Advances and Open Problems. Int. J. Comput. Vis. 2023, 132, 3139–3171. [Google Scholar] [CrossRef]
  3. Nilchan, N.; Supnithi, P.; Phakphisut, W. Improvement of Kalman Filter for GNSS/IMU Data Fusion with Measurement Bias Compensation. In Proceedings of the 2020 35th International Technical Conference on Circuits/Systems, Computers and Communications (ITC-CSCC), Nagoya, Japan, 3–6 July 2020. [Google Scholar]
  4. Hidayatullah, F.H.; Abdurohman, M.; Putrada, A.G. Accident Detection System for Bicycle Athletes Using GPS/IMU Integration and Kalman Filtered AHRS Method. In Proceedings of the 2021 International Conference Advancement in Data Science, E-learning and Information Systems (ICADEIS), Bali, Indonesia, 13 October 2021; pp. 1–6. [Google Scholar]
  5. De Miguel, G.; Goya, J.; Uranga, J.; Alvarado, U.; Adin, I.; Mendizabal, J. GNSS Complementary Positioning System Performance in Railway Domain. In Proceedings of the 2017 15th International Conference on ITS Telecommunications (ITST), Warsaw, Poland, 29–31 May 2017; pp. 1–7. [Google Scholar]
  6. Jouybari, A.; Ardalan, A.A.; Rezvani, M.-H. Experimental Comparison between Mahoney and Complementary Sensor Fusion Algorithm for Attitude Determination by Raw Sensor Data of Xsens IMU on Buoy. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2017, XLII-4/W4, 497–502. [Google Scholar] [CrossRef]
  7. Berkane, S.; Tayebi, A.; De Marco, S. A Nonlinear Navigation Observer Using IMU and Generic Position Information. Automatica 2021, 127, 109513. [Google Scholar] [CrossRef]
  8. Hashim, H.A.; Eltoukhy, A.E.E.; Vamvoudakis, K.G.; Abouheaf, M.I. Nonlinear Deterministic Observer for Inertial Navigation Using Ultra-Wideband and IMU Sensor Fusion. In Proceedings of the 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Detroit, MI, USA, 1 October 2023; pp. 3085–3090. [Google Scholar]
  9. Suzuki, T. Attitude-Estimation-Free GNSS and IMU Integration. IEEE Robot. Autom. Lett. 2024, 9, 1090–1097. [Google Scholar] [CrossRef]
  10. Cadena, C.; Carlone, L.; Carrillo, H.; Latif, Y.; Scaramuzza, D.; Neira, J.; Reid, I.; Leonard, J.J. Past, Present, and Future of Simultaneous Localization and Mapping: Toward the Robust-Perception Age. IEEE Trans. Robot. 2016, 32, 1309–1332. [Google Scholar] [CrossRef]
  11. Zhang, J.; Singh, S. LOAM: Lidar Odometry and Mapping in Real-Time. In Proceedings of the Robotics: Science and Systems X; Robotics: Science and Systems Foundation, Los Angeles, CA, USA, 12 July 2014. [Google Scholar]
  12. Shan, T.; Englot, B. LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 4758–4765. [Google Scholar]
  13. Gao, L.; Xia, X.; Zheng, Z.; Ma, J. GNSS/IMU/LiDAR Fusion for Vehicle Localization in Urban Driving Environments within a Consensus Framework. Mech. Syst. Signal Process. 2023, 205, 110862. [Google Scholar] [CrossRef]
  14. Hess, W.; Kohler, D.; Rapp, H.; Andor, D. Real-Time Loop Closure in 2D LIDAR SLAM. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 1271–1278. [Google Scholar]
  15. Palieri, M.; Morrell, B.; Thakur, A.; Ebadi, K.; Nash, J.; Chatterjee, A.; Kanellakis, C.; Carlone, L.; Guaragnella, C.; Agha-mohammadi, A. LOCUS: A Multi-Sensor Lidar-Centric Solution for High-Precision Odometry and 3D Mapping in Real-Time. IEEE Robot. Autom. Lett. 2021, 6, 421–428. [Google Scholar] [CrossRef]
  16. Reinke, A.; Palieri, M.; Morrell, B.; Chang, Y.; Ebadi, K.; Carlone, L.; Agha-Mohammadi, A.-A. LOCUS 2.0: Robust and Computationally Efficient Lidar Odometry for Real-Time 3D Mapping. IEEE Robot. Autom. Lett. 2022, 7, 9043–9050. [Google Scholar] [CrossRef]
  17. Chen, K.; Lopez, B.T.; Agha-Mohammadi, A.-A.; Mehta, A. Direct LiDAR Odometry: Fast Localization with Dense Point Clouds. IEEE Robot. Autom. Lett. 2022, 7, 2000–2007. [Google Scholar] [CrossRef]
  18. Xu, W.; Zhang, F. FAST-LIO: A Fast, Robust LiDAR-Inertial Odometry Package by Tightly-Coupled Iterated Kalman Filter. IEEE Robot. Autom. Lett. 2021, 6, 3317–3324. [Google Scholar] [CrossRef]
  19. Bai, C.; Xiao, T.; Chen, Y.; Wang, H.; Zhang, F.; Gao, X. Faster-LIO: Lightweight Tightly Coupled Lidar-Inertial Odometry Using Parallel Sparse Incremental Voxels. IEEE Robot. Autom. Lett. 2022, 7, 4861–4868. [Google Scholar] [CrossRef]
  20. Wu, Y.; Guadagnino, T.; Wiesmann, L.; Klingbeil, L.; Stachniss, C.; Kuhlmann, H. LIO-EKF: High Frequency LiDAR-Inertial Odometry Using Extended Kalman Filters. arXiv 2024, arXiv:2311.09887. [Google Scholar]
  21. Shan, T.; Englot, B.; Meyers, D.; Wang, W.; Ratti, C.; Rus, D. LIO-SAM: Tightly-Coupled Lidar Inertial Odometry via Smoothing and Mapping. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020; pp. 5135–5142. [Google Scholar]
  22. Xiao, H.; Han, Y.; Zhao, J.; Cui, J.; Xiong, L.; Yu, Z. LIO-Vehicle: A Tightly-Coupled Vehicle Dynamics Extension of LiDAR Inertial Odometry. IEEE Robot. Autom. Lett. 2022, 7, 446–453. [Google Scholar] [CrossRef]
  23. Park, C.; Moghadam, P.; Kim, S.; Elfes, A.; Fookes, C.; Sridharan, S. Elastic LiDAR Fusion: Dense Map-Centric Continuous-Time SLAM. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 1206–1213. [Google Scholar]
  24. Ramezani, M.; Khosoussi, K.; Catt, G.; Moghadam, P.; Williams, J.; Kottege, N. Wildcat: Online Continuous-Time 3D Lidar-Inertial SLAM. arXiv 2022, arXiv:2205.12595. [Google Scholar]
  25. Knights, J.; Vidanapathirana, K.; Ramezani, M.; Sridharan, S.; Fookes, C.; Moghadam, P. Wild-Places: A Large-Scale Dataset for Lidar Place Recognition in Unstructured Natural Environments. In Proceedings of the 2023 IEEE International Conference on Robotics and Automation (ICRA), London, UK, 29 May–2 June 2023. [Google Scholar]
  26. Dellenbach, P.; Deschaud, J.-E.; Jacquet, B.; Goulette, F. CT-ICP: Real-Time Elastic LiDAR Odometry with Loop Closure. In Proceedings of the 2022 International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA, 23–27 May 2022. [Google Scholar]
  27. Park, C.; Moghadam, P.; Williams, J.; Kim, S.; Sridharan, S.; Fookes, C. Elasticity Meets Continuous-Time: Map-Centric Dense 3D LiDAR SLAM. IEEE Trans. Robot. 2022, 38, 978–997. [Google Scholar] [CrossRef]
  28. Vizzo, I.; Guadagnino, T.; Mersch, B.; Wiesmann, L.; Behley, J.; Stachniss, C. KISS-ICP: In Defense of Point-to-Point ICP—Simple, Accurate, and Robust Registration If Done the Right Way. IEEE Robot. Autom. Lett. 2023, 8, 1029–1036. [Google Scholar] [CrossRef]
  29. Chen, K.; Nemiroff, R.; Lopez, B.T. Direct LiDAR-Inertial Odometry: Lightweight LIO with Continuous-Time Motion Correction. In Proceedings of the 2023 IEEE International Conference on Robotics and Automation (ICRA), London, UK, 29 May–2 June 2023; pp. 3983–3989. [Google Scholar]
  30. Gao, Y.; Liu, S.; Atia, M.; Noureldin, A. INS/GPS/LiDAR Integrated Navigation System for Urban and Indoor Environments Using Hybrid Scan Matching Algorithm. Sensors 2015, 15, 23286–23302. [Google Scholar] [CrossRef]
  31. Chiang, K.W.; Tsai, G.J.; Chang, H.W.; Joly, C.; El-Sheimy, N. Seamless Navigation and Mapping Using an INS/GNSS/Grid-Based SLAM Semi-Tightly Coupled Integration Scheme. Inf. Fusion 2019, 50, 181–196. [Google Scholar] [CrossRef]
  32. Liu, X.; Wen, W.; Hsu, L.-T. GLIO: Tightly-Coupled GNSS/LiDAR/IMU Integration for Continuous and Drift-Free State Estimation of Intelligent Vehicles in Urban Areas. IEEE Trans. Intell. Veh. 2023, 9, 1412–1422. [Google Scholar] [CrossRef]
  33. Li, S.; Li, X.; Wang, H.; Zhou, Y.; Shen, Z. Multi-GNSS PPP/INS/Vision/LiDAR Tightly Integrated System for Precise Navigation in Urban Environments. Inf. Fusion 2023, 90, 218–232. [Google Scholar] [CrossRef]
  34. Wu, W.; Zhong, X.; Wu, D.; Chen, B.; Zhong, X.; Liu, Q. LIO-Fusion: Reinforced LiDAR Inertial Odometry by Effective Fusion With GNSS/Relocalization and Wheel Odometry. IEEE Robot. Autom. Lett. 2023, 8, 1571–1578. [Google Scholar] [CrossRef]
  35. Tan, H.; Zhao, X.; Zhai, C. Design and experiments with a SLAM system for low-density canopy environments in greenhouses based on an improved Cartographer framework. Front. Plant Sci. 2024, 15, 1276799. [Google Scholar] [CrossRef]
  36. Tang, B.; Guo, Z.; Huang, C. A fruit-tree mapping system for semi-structured orchards based on multi-sensor-fusion SLAM. IEEE Access 2024. [Google Scholar] [CrossRef]
  37. Zhao, Z.; Zhang, Y.; Shi, J. Efficient and adaptive LiDAR-visual-inertial odometry for agricultural unmanned ground vehicle. Int. J. Adv. Robot. Syst. 2022, 19, 2. [Google Scholar] [CrossRef]
  38. Lopez, B.T. A Contracting Hierarchical Observer for Pose-Inertial Fusion. arXiv 2023, arXiv:2303.02777. [Google Scholar]
  39. Grupp, M. Evo: Python Package for the Evaluation of Odometry and SLAM. 2017. Available online: https://github.com/MichaelGrupp/evo (accessed on 1 June 2017).
  40. Hsu, L.-T.; Huang, F.; Ng, H.-F.; Zhang, G.; Zhong, Y.; Bai, X.; Wen, W. Hong Kong UrbanNav: An Open-Source Multisensory Dataset for Benchmarking Urban Navigation Algorithms. Navigation 2023, 70, navi.602. [Google Scholar] [CrossRef]
  41. Yin, J.; Li, A.; Li, T.; Yu, W.; Zou, D. M2DGR: A Multi-Sensor and Multi-Scenario SLAM Dataset for Ground Robots. IEEE Robot. Autom. Lett. 2022, 7, 2266–2273. [Google Scholar] [CrossRef]
  42. Carlevaris-Bianco, N.; Ushani, A.K.; Eustice, R.M. University of Michigan North Campus Long-Term Vision and Lidar Dataset. Int. J. Robot. Res. 2016, 35, 1023–1035. [Google Scholar] [CrossRef]
  43. Geiger, A.; Lenz, P.; Urtasun, R. Are We Ready for Autonomous Driving? The KITTI Vision Benchmark Suite. In Proceedings of the Conference on Computer Vision and Pattern Recognition (CVPR), Providence, RI, USA, 16–21 June 2012. [Google Scholar]
Figure 1. GLIO system architecture.
Figure 2. Continuous-time motion correction and robot initial state estimate.
Figure 3. Self-made orchard dataset. (a) The robotic platform used to collect the environment data. (b) Robot motion trajectories in OrchardSet; the yellow line represents the robot's travel path.
Figure 4. Trajectories of the public datasets adopted in our evaluation. (a) The Hongkong 20200314 sequence of the UrbanNav dataset. (b) The NCLT dataset; sequence 20120615 was selected for evaluation. (c) The M2DGR dataset; gate01 was chosen as the test sequence.
Figure 5. Performance of various LiDAR-based odometry methods on our custom dataset, OrchardSet. (a) Results of experiment I-01. (b) Results of experiment I-02. (c) Results of experiment II.
Figure 6. Local map generated by our method on the self-made dataset.
Figure 7. Performance of different LiDAR-based odometry methods on the public datasets. (a) UrbanNav dataset. (b) NCLT dataset. (c) M2DGR dataset.
Figure 8. Sparse maps generated for the different datasets using our method. (a) UrbanNav dataset. (b) NCLT dataset. (c) M2DGR dataset.
Table 1. Extrinsic calibrated parameters between the sensors in OrchardSet.

Sensor | Transform      | X (m) | Y (m) | Z (m) | Roll   | Pitch | Yaw
LiDAR  | X_{body,LiDAR} | 0.0   | 0.0   | 0.0   | 0.0°   | 0.0°  | 0.0°
IMU    | X_{body,IMU}   | 0.280 | 0.0   | 0.623 | 180.0° | 0.0°  | 180.0°
INS    | X_{body,INS}   | 0.154 | 0.020 | 0.623 | 180.0° | 0.0°  | 180.0°
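These extrinsics are applied as rigid-body transforms between sensor frames. As a minimal illustration (not the GLIO implementation), the sketch below builds a 4 × 4 homogeneous transform from one row of Table 1, the body-to-IMU entry; it assumes a roll-pitch-yaw (x-y-z) Euler convention, which the table does not state, and the function and variable names are ours.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def extrinsic_to_matrix(x, y, z, roll, pitch, yaw):
    """Build a 4x4 homogeneous transform from a translation (m) and RPY angles (deg)."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_euler("xyz", [roll, pitch, yaw], degrees=True).as_matrix()
    T[:3, 3] = [x, y, z]
    return T

# Body-to-IMU extrinsic taken from Table 1 (X_{body,IMU}); values are illustrative use of the table.
T_body_imu = extrinsic_to_matrix(0.280, 0.0, 0.623, 180.0, 0.0, 180.0)

# A point measured in the IMU frame can then be expressed in the body frame.
p_imu = np.array([1.0, 0.0, 0.0, 1.0])   # homogeneous coordinates
p_body = T_body_imu @ p_imu
```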
Table 2. Comparison on the OrchardSet dataset.

Sequence | Method   | Avg.tra. | Avg.rot. | ATE.tra. | ATE.rot.
Experiment I
01       | DLIO     | 0.066    | 0.041    | 0.070    | 0.302
         | KISS-ICP | 0.083    | 0.041    | 0.088    | 0.263
         | ours     | 0.030    | 0.041    | 0.027    | 0.218
02       | DLIO     | 0.074    | 0.041    | 0.099    | 0.213
         | KISS-ICP | 0.087    | 0.040    | 0.095    | 0.171
         | ours     | 0.031    | 0.046    | 0.033    | 0.201
Experiment II
03       | DLIO     | 0.074    | 0.501    | 0.213    | 0.834
         | KISS-ICP | 0.092    | 0.502    | 0.266    | 0.834
         | ours     | 0.037    | 0.050    | 0.068    | 0.856
Avg.tra. and Avg.rot. denote the relative translation error [m] and relative rotation error [deg/m] using KITTI [43] metrics, respectively. ATE.tra. and ATE.rot. denote the absolute translation error [m] and absolute rotation error [deg], respectively. Bold represents the best results.
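The ATE values in Table 2 were obtained with the evo toolkit [39]. As a rough illustration only, the sketch below shows how absolute translation and rotation errors could be computed from time-aligned pose lists; it assumes the estimated trajectory has already been aligned to the ground truth and that the reported statistic is an RMSE, and all names are illustrative rather than taken from our code.

```python
import numpy as np

def absolute_errors(R_gt, t_gt, R_est, t_est):
    """ATE from lists of time-aligned rotation matrices (3x3) and translations (3,)."""
    trans_err, rot_err = [], []
    for Rg, tg, Re, te in zip(R_gt, t_gt, R_est, t_est):
        trans_err.append(np.linalg.norm(tg - te))             # translation error [m]
        dR = Rg.T @ Re                                         # residual rotation
        angle = np.arccos(np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0))
        rot_err.append(np.degrees(angle))                      # rotation error [deg]
    # Aggregate as RMSE over the whole trajectory.
    return (np.sqrt(np.mean(np.square(trans_err))),
            np.sqrt(np.mean(np.square(rot_err))))
```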
Table 3. Odometry performance comparison on the public datasets.

              |      |        UrbanNav         |         NCLT        |        M2DGR
Method        | Type | ATE.tra. * | ATE.rot. * | ATE.tra. | ATE.rot. | ATE.tra. | ATE.rot.
DLO [17]      | LO   | 0.873      | 2.023      | 2.680    | 2.828    | 0.192    | 2.134
DLIO [29]     | LIO  | 1.788      | 2.000      | 6.983    | 2.800    | 122.973  | 2.426
KISS-ICP [28] | LO   | 7.936      | 2.469      | 3.717    | 2.828    | 6.983    | 2.155
LIO-EKF [20]  | LIO  | 5.179      | 2.014      | 10.561   | 2.827    | 0.565    | 2.116
Ours          | GLIO | 2.202      | 1.967      | 1.978    | 2.828    | 0.160    | 2.123
* ATE.tra. and ATE.rot. denote the absolute translation error [m] and absolute rotation error [deg], respectively, evaluated with the evo toolkit [39]. Bold represents the best results.
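Comparable absolute-error numbers can typically be reproduced with evo's command-line tools [39]. The commands below are only a usage sketch: they assume the ground-truth and estimated trajectories were exported in TUM format, and the file names are hypothetical.

```
evo_ape tum groundtruth.txt glio_estimate.txt -a --pose_relation trans_part
evo_ape tum groundtruth.txt glio_estimate.txt -a --pose_relation angle_deg
```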
Table 4. Comparison of the average processing time (ms).

              | Average Computation Processing Time [ms]
Method        | Orchard | UrbanNav | NCLT  | M2DGR
DLO [17]      | -       | 29.87    | 43.93 | 40.87
DLIO [29]     | 40.50   | 23.47    | 42.05 | 8.35
KISS-ICP [28] | 25.16   | 20.93    | 37.44 | 83.82
LIO-EKF [20]  | -       | 23.77    | 14.25 | 20.46
ours          | 45.5    | 26.56    | 40.06 | 50.88
Bold represents the best results.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
