Agricultural Robot Navigation System Integrating Drone Remote Sensing and BeiDou Positioning

In recent years, the integration of robot technology in agriculture has gained significant attention due to the growing demand for precision farming and labor shortages. Autonomous agricultural robots are increasingly deployed for tasks such as crop monitoring, spraying, and harvesting. However, achieving high-precision navigation in complex farmland environments remains a challenge. Traditional navigation systems, including those based solely on Global Navigation Satellite Systems (GNSS), often suffer from limitations such as low accuracy, susceptibility to environmental interference, and inadequate real-time performance. To address these issues, we propose a novel navigation system that combines drone-based remote sensing with BeiDou Real-Time Kinematic (RTK) positioning. This approach leverages the strengths of both technologies to enhance the autonomy and precision of agricultural robots.

The core of our system involves using an unmanned aerial vehicle (UAV) equipped with imaging sensors to capture high-resolution farmland images. These images are processed to generate two-dimensional orthophotos, which serve as a detailed map for path planning. Concurrently, the agricultural robot utilizes BeiDou RTK positioning, dual-antenna orientation, and an inertial measurement unit (IMU) for real-time localization and attitude estimation. By fusing these data sources with a compensated Kalman filter algorithm, we achieve robust navigation control, enabling the robot to follow predefined paths with minimal deviation. This integration of robot technology not only improves operational efficiency but also ensures adaptability to varying terrain conditions.

Our agricultural robot platform is designed with a four-wheel differential steering mechanism, measuring 1050 mm in length, 640 mm in width, and 750 mm in height. It has a maximum payload of 50 kg and a top speed of 1.17 m/s. The navigation system comprises several key components: a UAV for image acquisition, a BeiDou RTK module for centimeter-level positioning, a dual-antenna system for heading determination, an IMU for attitude sensing, and an industrial computer running the Robot Operating System (ROS) for path planning and control. Wireless data transmission modules facilitate communication between the robot and the base station, ensuring seamless data flow. This setup exemplifies the advanced application of robot technology in dynamic environments.

The farmland map construction begins with the UAV capturing overlapping images at a flight height of 25 meters, with a forward overlap of 60% and side overlap of 80%. These images are processed using DJI Terra software to create a 2D orthophoto with a precision of 1.1 cm. The map is georeferenced in the WGS-84 coordinate system, providing an accurate basis for path planning. This step is crucial because it replaces manual point collection, reducing labor intensity and improving map accuracy. The orthophoto allows crop rows and obstacles to be identified, enabling precise definition of navigation paths.
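The overlap percentages above determine how densely the UAV must trigger the camera and how closely flight lines must be spaced. A minimal sketch of that geometry, assuming an illustrative image footprint of roughly 25 m × 33 m at the 25 m flight height (the camera model and footprint are not stated in the article):

```python
def flight_spacing(footprint_along_m, footprint_across_m,
                   forward_overlap, side_overlap):
    """Spacing between consecutive shots along a flight line, and
    between adjacent flight lines, from footprint size and overlap
    fractions: spacing = footprint * (1 - overlap)."""
    shot_spacing = footprint_along_m * (1.0 - forward_overlap)
    line_spacing = footprint_across_m * (1.0 - side_overlap)
    return shot_spacing, line_spacing

# Assumed footprint; the article specifies only altitude and overlaps.
shot, line = flight_spacing(25.0, 33.0,
                            forward_overlap=0.60, side_overlap=0.80)
# With 60% forward and 80% side overlap, shots every 10.0 m
# along a line and flight lines 6.6 m apart.
```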

For positioning and orientation, the BeiDou RTK system provides real-time location data with a planar accuracy of 0.8 cm + 1 ppm. The dual-antenna setup, with a baseline length of 0.3 m, offers a heading accuracy of 0.2° per meter of baseline. The RTK data is transmitted in NMEA-0183 format at 20 Hz, and the heading is calculated from the phase difference between the two antennas. To mitigate issues such as data latency from the dual-antenna system and cumulative drift from the IMU, we employ a compensated Kalman filter. This algorithm fuses the heading angle from the dual-antenna system with the angular velocity from the gyroscope, yielding a more accurate and stable heading estimate. The state-space model for the Kalman filter is defined as follows:

$$ x_k = A x_{k-1} + B u_k + w_k $$

$$ z_k = H x_k + v_k $$

where \( x_k \) is the state vector (e.g., heading angle and angular rate), \( A \) is the state transition matrix, \( B \) is the control input matrix, \( u_k \) is the control vector, \( w_k \) is the process noise, \( z_k \) is the measurement vector, \( H \) is the observation matrix, and \( v_k \) is the measurement noise. The compensation step adjusts the gyroscope data when dual-antenna updates are available, ensuring continuous heading correction. This fusion is a key innovation in robot technology for agricultural applications.
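The compensation idea can be illustrated with a one-state sketch: the heading is propagated from the gyroscope at a high rate, and each dual-antenna fix corrects the accumulated drift. The noise values below are illustrative placeholders, not the tuned parameters of the system:

```python
import math

class HeadingFuser:
    """One-state Kalman filter sketch: predict heading from the gyro
    angular rate, correct it whenever a dual-antenna heading arrives."""
    def __init__(self, q=1e-4, r=1e-2):
        self.x = 0.0   # heading estimate (rad)
        self.p = 1.0   # error covariance
        self.q = q     # process noise (gyro drift, assumed)
        self.r = r     # measurement noise (antenna heading, assumed)

    def predict(self, gyro_rate, dt):
        self.x += gyro_rate * dt      # integrate angular velocity
        self.p += self.q              # uncertainty grows between fixes

    def update(self, antenna_heading):
        k = self.p / (self.p + self.r)          # Kalman gain (H = 1)
        self.x += k * (antenna_heading - self.x)
        self.p *= (1.0 - k)

f = HeadingFuser()
for _ in range(100):          # 100 Hz gyro, constant 0.1 rad/s turn for 1 s
    f.predict(0.1, 0.01)
f.update(math.radians(5.7))   # 20 Hz antenna fix pulls the estimate back
```

Between antenna fixes the estimate relies entirely on the gyro; each fix shrinks the covariance, which is the "compensation" step described above.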

Coordinate transformation is essential for aligning the GNSS-based WGS-84 coordinates with the robot’s local frame. We use the Gauss-Krüger projection to convert latitude and longitude (\( L, B \)) to plane coordinates (\( X, Y \)). The transformation equations are:

$$ X = X_0 + \frac{l^2}{2} N \sin B \cos B + \frac{l^4}{24} N \sin B \cos^3 B (5 - t^2 + 9 \eta^2 + 4 \eta^4) + \frac{l^6}{720} N \sin B \cos^5 B (61 - 58 t^2 + t^4 + 270 \eta^2 - 330 \eta^2 t^2) $$

$$ Y = l N \cos B + \frac{l^3}{6} N \cos^3 B (1 - t^2 + \eta^2) + \frac{l^5}{120} N \cos^5 B (5 - 18 t^2 + t^4 + 14 \eta^2 - 58 \eta^2 t^2) $$

where \( l = L - L_0 \) (with \( L_0 \) the central meridian), \( t = \tan B \), \( \eta = e' \cos B \) (with \( e' \) the second eccentricity of the reference ellipsoid), \( N = a / \sqrt{1 - e^2 \sin^2 B} \) is the radius of curvature in the prime vertical (with \( e \) the first eccentricity), and \( X_0 \) is the meridian arc length from the equator to latitude \( B \). This transformation ensures that path planning is conducted in a Cartesian coordinate system aligned with the robot’s movements.
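The forward projection can be sketched directly from these series. The meridian arc \( X_0 \) is computed with the standard expansion for the WGS-84 ellipsoid; this is a minimal illustration, not the production implementation:

```python
import math

# WGS-84 ellipsoid constants
A_WGS84 = 6378137.0
F_WGS84 = 1.0 / 298.257223563
E2 = F_WGS84 * (2.0 - F_WGS84)   # first eccentricity squared
EP2 = E2 / (1.0 - E2)            # second eccentricity squared

def meridian_arc(B):
    """Meridian arc length from the equator to latitude B (radians),
    via the standard series expansion."""
    e2, e4, e6 = E2, E2**2, E2**3
    return A_WGS84 * (
        (1 - e2/4 - 3*e4/64 - 5*e6/256) * B
        - (3*e2/8 + 3*e4/32 + 45*e6/1024) * math.sin(2*B)
        + (15*e4/256 + 45*e6/1024) * math.sin(4*B)
        - (35*e6/3072) * math.sin(6*B))

def gauss_kruger(B, L, L0):
    """Forward Gauss-Krueger projection (all angles in radians),
    following the series given in the text."""
    l = L - L0
    t = math.tan(B)
    eta2 = EP2 * math.cos(B)**2          # eta^2 = e'^2 cos^2 B
    N = A_WGS84 / math.sqrt(1 - E2 * math.sin(B)**2)
    cB, sB = math.cos(B), math.sin(B)
    X = (meridian_arc(B)
         + l**2/2 * N * sB * cB
         + l**4/24 * N * sB * cB**3 * (5 - t**2 + 9*eta2 + 4*eta2**2)
         + l**6/720 * N * sB * cB**5 * (61 - 58*t**2 + t**4
                                        + 270*eta2 - 330*eta2*t**2))
    Y = (l * N * cB
         + l**3/6 * N * cB**3 * (1 - t**2 + eta2)
         + l**5/120 * N * cB**5 * (5 - 18*t**2 + t**4
                                   + 14*eta2 - 58*eta2*t**2))
    return X, Y
```

For a point at 30° N and 0.05° east of the central meridian, the northing is close to the known meridian distance of about 3,320 km and the easting is a few kilometers, as expected for a small longitude offset.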

Path planning and tracking involve defining a series of target points along the crop rows, spaced approximately 1 meter apart. The robot navigates by following straight-line segments between consecutive points. The lateral deviation \( d \) and heading deviation \( \theta \) are computed in real-time to control the robot’s motion. For a line segment between points \( A(x_n, y_n) \) and \( B(x_{n+1}, y_{n+1}) \), the line equation is \( ax + by + c = 0 \), where \( a = y_{n+1} - y_n \), \( b = x_n - x_{n+1} \), and \( c = x_{n+1} y_n - x_n y_{n+1} \). The lateral deviation from the robot’s current position \( C(x_m, y_m) \) is given by:

$$ d = \frac{|a x_m + b y_m + c|}{\sqrt{a^2 + b^2}} $$

The heading deviation \( \theta \) is the angle between the robot’s current orientation and the direction of the target line. A dual-loop motion controller adjusts the wheel speeds based on these deviations, using a differential drive model. The outer loop processes position and heading feedback, while the inner loop controls motor speeds via encoders. This closed-loop system ensures accurate path following, a critical aspect of robot technology in unstructured environments.
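The deviation computation that feeds the outer control loop can be sketched as follows. The wrap of the heading error into \( (-\pi, \pi] \) is an added implementation detail, assumed rather than taken from the article:

```python
import math

def line_coeffs(xa, ya, xb, yb):
    """Coefficients of a x + b y + c = 0 through A(xa, ya), B(xb, yb)."""
    return yb - ya, xa - xb, xb * ya - xa * yb

def deviations(xa, ya, xb, yb, xm, ym, robot_heading):
    """Lateral deviation d (same units as the coordinates) and heading
    deviation theta (rad) of a robot at (xm, ym) relative to A->B."""
    a, b, c = line_coeffs(xa, ya, xb, yb)
    d = abs(a * xm + b * ym + c) / math.hypot(a, b)
    line_heading = math.atan2(yb - ya, xb - xa)
    # wrap the error into (-pi, pi] so the controller turns the short way
    theta = (robot_heading - line_heading + math.pi) % (2 * math.pi) - math.pi
    return d, theta

# Robot 2 m off a 10 m east-pointing segment, heading 0.2 rad off-line:
d, theta = deviations(0, 0, 10, 0, 3, 2, robot_heading=0.2)
# d = 2.0, theta = 0.2
```

These two errors are exactly the quantities the dual-loop controller consumes: the outer loop maps \( (d, \theta) \) to a commanded turn rate, and the inner loop realizes it as differential wheel speeds.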

To validate our navigation system, we conducted experiments in a farmland setting with predefined paths. The robot was tested at speeds of 0.3, 0.5, and 0.7 m/s, with multiple runs for each speed. We measured the lateral deviation from the target path and the heading deviation at target points. The results, summarized in Table 1, demonstrate that the average lateral deviation increased with speed but remained within acceptable limits for agricultural tasks. For instance, at 0.7 m/s, the average lateral deviation was 7.6 cm with a standard deviation of 3.1 cm. Similarly, the heading deviation averaged 9.7° with a standard deviation of 6.9°. These findings highlight the robustness of our robot technology in maintaining precision under varying operational conditions.

Table 1: Navigation Performance at Different Speeds
Speed (m/s) | Avg. lateral deviation (cm) | SD of lateral deviation (cm) | Avg. heading deviation (°) | SD of heading deviation (°)
0.3 | 4.1 | 2.1 | 6.8 | 4.0
0.5 | 6.2 | 2.5 | 8.2 | 6.1
0.7 | 7.6 | 3.1 | 9.7 | 6.9

In addition to path tracking, we evaluated the system’s performance at specific target points. The robot paused at each target point for 3 minutes to record deviations. As shown in Table 2, the maximum lateral deviation at 0.7 m/s was 12.9 cm, and the maximum heading deviation was 25.4°. The slight increase in errors at higher speeds is attributed to the controller’s response time and larger displacement between corrections. However, the overall accuracy meets the requirements for typical agricultural operations, such as crop monitoring and spraying. This underscores the effectiveness of integrating drone-based mapping with BeiDou RTK in robot technology.

Table 2: Target Point Navigation Deviations
Speed (m/s) | Max lateral deviation (cm) | Avg. lateral deviation at targets (cm) | SD at targets (cm) | Max heading deviation (°) | Avg. heading deviation at targets (°) | SD at targets (°)
0.3 | 8.3 | 4.3 | 2.1 | 14.6 | 6.8 | 4.0
0.5 | 10.3 | 6.1 | 2.4 | 18.6 | 8.2 | 6.1
0.7 | 12.9 | 7.0 | 2.8 | 25.4 | 9.7 | 6.9

The compensated Kalman filter plays a vital role in enhancing the navigation accuracy. By fusing the dual-antenna heading (with updates at 20 Hz) and the IMU data (at a higher rate), the filter reduces the impact of noise and delays. The prediction and update steps are as follows:

Prediction:
$$ \hat{x}_{k|k-1} = F_k \hat{x}_{k-1|k-1} + B_k u_k $$
$$ P_{k|k-1} = F_k P_{k-1|k-1} F_k^T + Q_k $$

Update:
$$ y_k = z_k - H_k \hat{x}_{k|k-1} $$
$$ S_k = H_k P_{k|k-1} H_k^T + R_k $$
$$ K_k = P_{k|k-1} H_k^T S_k^{-1} $$
$$ \hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k y_k $$
$$ P_{k|k} = (I - K_k H_k) P_{k|k-1} $$

where \( F_k \) is the state transition matrix, \( B_k \) is the control matrix, \( u_k \) is the control input, \( P \) is the error covariance, \( Q_k \) is the process noise covariance, \( H_k \) is the observation matrix, \( R_k \) is the measurement noise covariance, and \( K_k \) is the Kalman gain. This algorithm ensures that the heading estimate remains stable even during temporary signal loss, a common issue in robot technology deployed in rural areas.
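These predict/update equations translate directly into matrix code. Below is a self-contained sketch for a two-state model (heading and angular rate) driven by 20 Hz dual-antenna headings; the noise covariances and the simulated measurements are illustrative assumptions, not the system's tuned values:

```python
import numpy as np

def kf_predict(x, P, F, B, u, Q):
    """Prediction: x_{k|k-1} = F x + B u,  P_{k|k-1} = F P F^T + Q."""
    x = F @ x + B @ u
    P = F @ P @ F.T + Q
    return x, P

def kf_update(x, P, z, H, R):
    """Update: innovation y, covariance S, gain K, then correct x and P."""
    y = z - H @ x                       # y_k = z_k - H x_{k|k-1}
    S = H @ P @ H.T + R                 # S_k = H P H^T + R
    K = P @ H.T @ np.linalg.inv(S)      # K_k = P H^T S^{-1}
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

dt = 0.05                               # 20 Hz dual-antenna updates
F = np.array([[1.0, dt], [0.0, 1.0]])   # heading integrates angular rate
B = np.zeros((2, 1)); u = np.zeros(1)   # no control input in this sketch
Q = np.diag([1e-5, 1e-4])               # assumed process noise
H = np.array([[1.0, 0.0]])              # antenna observes heading only
R = np.array([[1e-3]])                  # assumed measurement noise

x = np.zeros(2); P = np.eye(2)
for z in [0.10, 0.11, 0.12]:            # simulated antenna headings (rad)
    x, P = kf_predict(x, P, F, B, u, Q)
    x, P = kf_update(x, P, np.array([z]), H, R)
```

During a temporary loss of the antenna signal, only `kf_predict` runs, so the estimate coasts on the rate state while its covariance grows until the next fix.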

In discussion, our system compares favorably with existing approaches. For example, methods relying solely on GNSS or inertial navigation often exhibit larger errors or cumulative drift. The integration of drone-based remote sensing provides a high-resolution map that can be updated frequently, reducing reliance on pre-surveyed points. This is particularly beneficial in dynamic farmland environments where crop growth can alter the terrain. BeiDou RTK offers superior accuracy compared to standard GPS, especially in regions with strong satellite coverage. The robot technology demonstrated here achieves centimeter-level precision, which is essential for tasks such as inter-row navigation and targeted interventions.

Future work could explore the integration of additional sensors, such as LiDAR or multispectral cameras, to further enhance perception and adaptability. Moreover, machine learning algorithms could be incorporated to improve path planning in real-time based on environmental changes. The scalability of this robot technology makes it suitable for large-scale farms, contributing to sustainable agriculture practices.

In conclusion, we have developed and tested a robust navigation system for agricultural robots that combines drone remote sensing with BeiDou RTK positioning. The system achieves high accuracy in path following, with average lateral deviations below 8 cm even at higher speeds. The compensated Kalman filter effectively fuses multiple data sources to provide reliable heading estimates. This work advances the field of robot technology by demonstrating a practical solution for autonomous navigation in complex farmland. The integration of these technologies not only improves operational efficiency but also reduces the need for manual labor, paving the way for smarter agricultural practices.
