Our team has developed a hexapod agricultural bionic robot based on an STM32 microcontroller. This project integrates mechanical design, electronic circuit design, and embedded programming to create a functional prototype. The robot’s core components include a six-legged mechanical structure, an STM32-based control unit, and a suite of environmental sensors. The mechanical design, inspired by insect locomotion, provides significant stability and reliability, enabling operation across complex and uneven terrain. Simulation and experimental results demonstrate that the STM32 controller, utilizing multi-sensor data fusion techniques, facilitates autonomous navigation, environmental adaptation, and data collection, thereby enhancing the overall intelligence of the robotic system for agricultural applications.
The advancement of robotics technology has steadily permeated the agricultural sector, offering substantial improvements in production efficiency and management quality. Conventional agricultural machinery, predominantly reliant on wheeled or tracked systems, often struggles with mobility and stability on rough, unstructured terrain typical of farm fields. This limitation has spurred research into alternative locomotion methods, with legged robots, particularly those mimicking insect morphology, emerging as a promising solution. Our work focuses on the design and implementation of a hexapod bionic robot, leveraging the STM32 platform to create a versatile agent capable of performing monitoring and light intervention tasks in agricultural environments. This bionic robot offers superior adaptability to ground irregularities compared to traditional platforms.
Research and development in the field of legged agricultural robots are active globally. Internationally, entities in the United States, Japan, and Europe have pioneered sophisticated platforms. For instance, Boston Dynamics’ “Spot” exemplifies a highly mobile quadruped robot with applications spanning inspection and data collection, concepts readily transferable to agriculture. Japanese researchers have developed hexapod platforms like “Mantis” for operations in flooded paddy fields, demonstrating the viability of legged systems in specific crop environments. Domestically, increased focus on agricultural modernization has accelerated related research. Institutions have developed hexapod robots for field operations such as planting, fertilization, and spraying. These developments underscore a clear trend towards leveraging bionic robot designs to overcome terrain challenges in agriculture. Our project contributes to this trajectory by presenting an integrated system design centered on a widely accessible STM32 controller.
System Architecture and Requirements
The overarching goal was to create a robust, autonomous platform for agricultural scouting and environmental monitoring. The system requirements were defined across several domains:
Mechanical Structure: The robot must feature a hexapod (six-legged) configuration capable of stable and flexible movement on uneven soil. Key parameters such as size, weight, and payload capacity were optimized for typical field operations.
Electronic Control: An STM32 microcontroller serves as the computational core, requiring real-time and stable control software to manage locomotion and process sensor data.
Sensory Suite: The robot must be equipped with sensors to perceive its state and environment, including inertial measurement for posture, and environmental sensors for field data acquisition.
Power System: A high-capacity battery pack must provide sufficient, stable power for extended operation, with weight and capacity balanced against mobility.
Operational Capability: The design must facilitate the integration of agricultural tools (e.g., spray nozzles, grippers) for potential interactive tasks, with precise control enabled by the mechanical and electronic systems.
Based on these requirements, our team designed the system architecture. The STM32 central controller manages all core functions: it processes data from environmental sensors, executes gait algorithms to control the limb actuators via motor drivers, handles communication with a remote IoT platform, and can manage peripheral devices like a display or video module. The integrated sensory system allows this bionic robot to perceive its surroundings actively.
| Subsystem | Key Components | Primary Function / Specification |
|---|---|---|
| Control Core | STM32F4 Series MCU | High-performance ARM Cortex-M4, real-time control, multiple communication interfaces (UART, I2C, SPI, CAN). |
| Locomotion | 18x Servo Motors (6 legs x 3 DOF) | Provides three degrees of freedom per leg for omnidirectional movement and terrain adaptation. |
| Environmental Sensing | Digital Temperature/Humidity Sensor, NDIR CO₂ Sensor, Photoresistor/Photodiode | Measures air temperature, humidity, CO₂ concentration, and light intensity for crop environment monitoring. |
| State Estimation | 6-Axis IMU (Gyroscope + Accelerometer) | Provides robot attitude (roll, pitch, yaw) and acceleration data for balance and gait stabilization. |
| Communication | Wi-Fi/4G Module | Enables IoT connectivity for data transmission to cloud platforms and remote monitoring/control via mobile App. |
| Power | Lithium Polymer (LiPo) Battery Pack, Voltage Regulators | Provides high-current capacity for servos and stable voltage rails for electronics. |
Mechanical and Hardware Design
The mechanical design of this bionic robot is critical for its mobility. Each leg possesses three rotational joints, corresponding to the coxa, femur, and tibia segments found in insects. This 3-DOF configuration allows the leg to move forward/backward, up/down, and extend/retract, enabling the robot to traverse obstacles and maintain stability on slopes. The body and leg segments were designed using CAD software and fabricated from lightweight yet sturdy materials like aluminum alloy and carbon fiber to maximize the payload-to-weight ratio.

The hardware architecture is built around the STM32 microcontroller. The motion control subsystem interfaces the MCU with servo motor drivers. The STM32 generates precise Pulse Width Modulation (PWM) signals to control the angular position of each servo. The power distribution network ensures that the high-current demands of the 18 servos are met without causing voltage drops that could reset the sensitive digital electronics. The sensor subsystem connects various transducers to the MCU via appropriate communication buses (e.g., I2C for the IMU and temperature/humidity sensor, analog or digital IO for light sensing).
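As an illustration of the PWM interface, the sketch below maps a joint angle to the timer compare value for one servo channel. The 500–2500 µs pulse range and the 1 MHz timer tick are assumed values and must be matched to the actual servo datasheet and STM32 timer prescaler configuration:

```c
#include <stdint.h>

/* Hypothetical servo parameters -- adjust to the actual servo datasheet.
 * Standard RC servos expect a 50 Hz frame with a 500-2500 us pulse. */
#define SERVO_MIN_PULSE_US  500u     /* pulse width at -90 degrees */
#define SERVO_MAX_PULSE_US  2500u    /* pulse width at +90 degrees */
#define TIMER_TICK_US       1u       /* timer prescaled to 1 MHz (1 tick = 1 us) */

/* Convert a joint angle in degrees (-90..+90) to the compare value loaded
 * into the STM32 timer's CCR register for one PWM channel. */
uint32_t servo_angle_to_ccr(float angle_deg)
{
    /* Clamp to the mechanical range of the servo. */
    if (angle_deg < -90.0f) angle_deg = -90.0f;
    if (angle_deg >  90.0f) angle_deg =  90.0f;

    float span_us  = (float)(SERVO_MAX_PULSE_US - SERVO_MIN_PULSE_US);
    float pulse_us = SERVO_MIN_PULSE_US + (angle_deg + 90.0f) / 180.0f * span_us;
    return (uint32_t)(pulse_us / TIMER_TICK_US + 0.5f);   /* round to nearest tick */
}
```

In the actual firmware the returned value would be written to the channel's CCR register (e.g. via the HAL timer API); that wiring is omitted here to keep the sketch self-contained.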
Kinematic Model and Gait Planning
The movement of a hexapod bionic robot is governed by forward and inverse kinematics. Each leg is treated as a serial manipulator with a coordinate frame assigned to each joint. In a simplified planar model (all three joint axes parallel, motion confined to the leg plane), the forward kinematics gives the foot endpoint position \((x_f, y_f, z_f)\) relative to the body from the three joint angles \((\theta_1, \theta_2, \theta_3)\):
$$
\begin{aligned}
x_f &= L_1 \cos(\theta_1) + L_2 \cos(\theta_1 + \theta_2) + L_3 \cos(\theta_1 + \theta_2 + \theta_3) \\
y_f &= L_1 \sin(\theta_1) + L_2 \sin(\theta_1 + \theta_2) + L_3 \sin(\theta_1 + \theta_2 + \theta_3) \\
z_f &= 0 \quad \text{(in the leg plane, simplified model)}
\end{aligned}
$$
where \(L_1, L_2, L_3\) are the lengths of the coxa, femur, and tibia links, respectively. For locomotion, inverse kinematics is more critical: given a desired foot trajectory in body coordinates, the required joint angles are computed. The coxa yaw \(\theta_1\) decouples the problem: it orients the vertical leg plane toward the foot, leaving a two-link (femur-tibia) planar subproblem. Let \((x_b, y_b)\) be the horizontal body-frame foot coordinates relative to the coxa joint, and \((x, y)\) the foot position within the leg plane relative to the femur joint, with \(r = \sqrt{x^2 + y^2}\) the femur-to-foot distance:
$$
\begin{aligned}
\theta_1 &= \arctan2(y_b, x_b) \\
\theta_3 &= \pi - \arccos\left(\frac{L_2^2 + L_3^2 - r^2}{2 L_2 L_3}\right) \quad \text{(for a knee-forward configuration)} \\
\theta_2 &= \arctan2(y, x) - \arctan2(L_3 \sin(\theta_3), L_2 + L_3 \cos(\theta_3))
\end{aligned}
$$
Gait generation defines the sequence and timing of leg movements. A stable and efficient gait for a hexapod bionic robot is the alternating tripod gait. In this gait, the six legs are divided into two groups of three (a front and back leg on one side with the middle leg on the opposite side). These two groups move in alternation: one group is in the stance phase (supporting the body and propelling it forward) while the other is in the swing phase (lifting and moving forward). This gait provides static stability at all times. The trajectory of each foot during the swing phase is typically a cycloid or a simple parabolic lift to avoid dragging.
| Parameter | Symbol | Typical Value | Description |
|---|---|---|---|
| Stride Length | \(S\) | 80 mm | Forward distance covered by body per gait cycle. |
| Swing Phase Ratio | \(\beta\) | 0.4 | Fraction of cycle time a leg is in swing (lifting/moving). |
| Stance Phase Ratio | \(1-\beta\) | 0.6 | Fraction of cycle time a leg is on the ground pushing. |
| Leg Lift Height | \(H_{swing}\) | 30 mm | Maximum height of foot during swing phase. |
| Duty Factor | \(\delta = 1-\beta\) | 0.6 | Fraction of cycle time each leg spends in stance. An ideal tripod gait uses \(\delta = 0.5\); our \(\delta = 0.6\) adds a stability margin by briefly grounding both tripods between phases. |
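A minimal tripod-gait scheduler matching the parameters in the table above might look like the sketch below. The leg indexing convention (even indices forming one tripod, odd indices the other) and the parabolic swing profile are illustrative choices, not the only valid ones:

```c
/* Tripod-gait sketch: legs are indexed 0..5 such that even indices form
 * tripod A and odd indices form tripod B, offset by half a cycle.
 * The numeric values mirror the gait-parameter table. */
#define STRIDE_MM    80.0f
#define LIFT_MM      30.0f
#define SWING_RATIO  0.4f   /* fraction of the cycle a leg spends in swing */

typedef struct { float x_mm; float z_mm; int in_stance; } FootTarget;

/* phase: normalized gait time in [0,1). leg: 0..5.
 * Returns the foot's fore-aft offset and lift height for this instant. */
FootTarget tripod_foot_target(float phase, int leg)
{
    FootTarget ft;
    /* Tripod B runs half a cycle behind tripod A. */
    float p = phase + ((leg % 2) ? 0.5f : 0.0f);
    if (p >= 1.0f) p -= 1.0f;

    if (p < SWING_RATIO) {                    /* swing: lift and move forward */
        float s = p / SWING_RATIO;            /* 0..1 within the swing */
        ft.in_stance = 0;
        ft.x_mm = (s - 0.5f) * STRIDE_MM;     /* back -> front */
        ft.z_mm = LIFT_MM * 4.0f * s * (1.0f - s);   /* parabolic lift */
    } else {                                  /* stance: push the body forward */
        float s = (p - SWING_RATIO) / (1.0f - SWING_RATIO);
        ft.in_stance = 1;
        ft.x_mm = (0.5f - s) * STRIDE_MM;     /* front -> back */
        ft.z_mm = 0.0f;                       /* foot stays on the ground */
    }
    return ft;
}
```

Because the swing ratio (0.4) is below half the cycle, there are short intervals where both tripods are grounded, which is what gives the gait its static-stability margin.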
Sensor Fusion and Control Software Design
The intelligence of our bionic robot stems from its software architecture and sensor data processing. The control program, written in C/C++ using the STM32 HAL/LL libraries, is structured around a real-time loop with interrupt service routines for time-critical tasks.
Main Control Loop: The primary loop manages gait state machines, reads sensor data at fixed intervals, runs data fusion algorithms, and executes high-level decisions (e.g., changing direction based on a planned path or sensor input).
Sensor Data Fusion: Data from the IMU is crucial for stabilization. A complementary filter or a Kalman filter is implemented to fuse accelerometer and gyroscope data, providing a robust estimate of the robot’s orientation (roll and pitch angles). This attitude estimate can be used to adapt the body posture on slopes. The environmental sensor data (temperature \(T\), humidity \(RH\), CO₂ concentration \(C\), light intensity \(L\)) are processed and packaged for transmission. A simple fusion rule for an environmental “comfort index” \(I\) for a hypothetical crop could be:
$$
I = w_1 \cdot \frac{|T – T_{opt}|}{\Delta T} + w_2 \cdot \frac{|RH – RH_{opt}|}{\Delta RH} + w_3 \cdot \frac{|C – C_{opt}|}{\Delta C} + w_4 \cdot \frac{|L – L_{opt}|}{\Delta L}
$$
where \(T_{opt}, RH_{opt}, C_{opt}, L_{opt}\) are optimal values, \(\Delta\) terms are acceptable ranges, and \(w_i\) are weighting factors summing to 1. A lower \(I\) indicates a more favorable environment.
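A direct transcription of this index into C could look as follows; the `EnvTerm` struct and all optimal values and ranges used in testing are hypothetical placeholders for a real crop profile:

```c
/* Weighted environmental comfort index from the formula above.
 * Lower values indicate a more favorable environment. */
typedef struct {
    float opt;    /* optimal value (T_opt, RH_opt, ...)   */
    float range;  /* acceptable deviation (Delta term)    */
    float weight; /* w_i; all weights should sum to 1     */
} EnvTerm;

/* meas[i] holds the i-th measured quantity; terms[i] its profile. */
float comfort_index(const float meas[], const EnvTerm terms[], int n)
{
    float idx = 0.0f;
    for (int i = 0; i < n; ++i) {
        float dev = meas[i] - terms[i].opt;
        if (dev < 0.0f) dev = -dev;                   /* |measured - optimal| */
        idx += terms[i].weight * dev / terms[i].range;
    }
    return idx;
}
```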
IoT Integration: The software includes drivers for the communication module. Sensor data and system status are formatted into JSON packets and transmitted periodically via MQTT or HTTP to a cloud server. This allows remote monitoring through a web or mobile application. The system can also receive basic commands (stop, start, return home) from the remote platform.
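A minimal packet formatter along these lines is sketched below. The field names are illustrative rather than a fixed schema, and the actual transport (MQTT publish or HTTP POST) is handled separately by the communication module:

```c
#include <stdio.h>

/* Format one telemetry sample as a JSON packet for upload.
 * Returns the number of bytes written, or -1 if the buffer is too small. */
int format_telemetry(char *buf, int len,
                     float temp_c, float rh_pct, int co2_ppm, float lux)
{
    int n = snprintf(buf, (size_t)len,
        "{\"temp_c\":%.1f,\"rh_pct\":%.1f,\"co2_ppm\":%d,\"lux\":%.0f}",
        temp_c, rh_pct, co2_ppm, lux);
    /* snprintf returns the length it *would* have written; treat
     * truncation (n >= len) or an encoding error (n < 0) as failure. */
    return (n < 0 || n >= len) ? -1 : n;
}
```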
| Module | Key Functions | Implementation Details |
|---|---|---|
| Gait Engine | Generates coordinated PWM signals for 18 servos based on selected gait and speed. | Uses timer interrupts for precise pulse generation. Implements trajectory interpolation for smooth motion. |
| IMU Fusion | Estimates robot attitude from raw accelerometer and gyro data. | Embedded Kalman filter or complementary filter. Runs at a ~100 Hz update rate. |
| Environmental Monitor | Reads and filters sensor data, calculates derived indices. | Uses I2C and ADC polling. Data is averaged over several samples to reduce noise. |
| Communication Handler | Manages data transmission/reception with cloud/IoT platform. | State machine for network connection. Implements protocol layers (TCP/IP, MQTT). |
| System Manager | Coordinates modules, manages power states, handles error conditions. | Main super-loop with finite state machine for overall robot behavior (IDLE, WALKING, DATA_COLLECTING, etc.). |
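As a sketch of the IMU fusion module's complementary filter (single axis), assuming the ~100 Hz update rate from the table and a typical, untuned blend factor:

```c
/* Complementary filter: the gyro integrates quickly but drifts, while the
 * accelerometer-derived angle is noisy but drift-free; ALPHA blends them.
 * The values here are typical starting points, not tuned constants. */
#define ALPHA 0.98f
#define DT    0.01f   /* 100 Hz update rate */

/* One filter step for a single axis (e.g. pitch).
 * prev_deg:  previous fused angle estimate (degrees)
 * gyro_dps:  angular rate from the gyroscope (degrees/second)
 * accel_deg: angle computed from the accelerometer gravity vector */
float comp_filter_step(float prev_deg, float gyro_dps, float accel_deg)
{
    return ALPHA * (prev_deg + gyro_dps * DT) + (1.0f - ALPHA) * accel_deg;
}
```

Over many iterations the output tracks the gyro on short timescales while slowly converging to the accelerometer angle, which removes gyro drift without the accelerometer's high-frequency noise.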
System Integration and Performance Evaluation
The integration of mechanical, electronic, and software components is paramount for the successful operation of the bionic robot. After assembly, the system underwent rigorous testing. Calibration procedures were established for the servo motors to ensure neutral positions were correctly aligned, and for the sensors to verify their readings against reference instruments.
Performance was evaluated in both laboratory and simulated field conditions:
Locomotion Tests: The robot demonstrated stable walking on flat surfaces, carpet, gravel, and gentle slopes using the alternating tripod gait. Speed and power consumption were measured. The average power draw \(P_{avg}\) during steady walking can be modeled as:
$$
P_{avg} = N_{stance} \cdot P_{stance} + N_{swing} \cdot P_{swing} + P_{electronics}
$$
where \(N_{stance}\) and \(N_{swing}\) are the number of motors in stance and swing phases respectively, \(P_{stance}\) and \(P_{swing}\) are the average power per motor in each phase, and \(P_{electronics}\) is the power for the MCU and sensors. \(P_{swing}\) is typically higher as motors work against gravity.
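For reference, the model can be evaluated directly; the per-motor power figures below are placeholders to be replaced with measured values, and with the tripod gait roughly nine servos are in stance and nine in swing at any instant:

```c
/* Average walking power from the model above.
 * n_stance/n_swing: motors currently in each phase (tripod gait: 9 and 9).
 * p_stance_w/p_swing_w: measured average power per motor in each phase.
 * p_elec_w: MCU, sensor, and radio overhead. */
float avg_power_w(int n_stance, int n_swing,
                  float p_stance_w, float p_swing_w, float p_elec_w)
{
    return n_stance * p_stance_w + n_swing * p_swing_w + p_elec_w;
}
```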
Sensor Data Accuracy: The environmental sensor readings were compared with commercial-grade handheld meters, showing acceptable agreement for agricultural monitoring purposes.
Communication Reliability: The range and stability of the IoT link were tested, ensuring data could be reliably sent from inside a greenhouse or across an open field to a base station.
Autonomous Operation: Basic autonomous behaviors were tested, such as following a pre-programmed path and stopping when an obstacle (detected via an added ultrasonic sensor) was within a threshold distance \(d_{min}\).
| Test Category | Metric | Result | Condition / Notes |
|---|---|---|---|
| Locomotion | Maximum Speed | 0.12 m/s | On flat concrete, tripod gait. |
| Locomotion | Slope Traversal | Up to 15° incline | Stable ascent and descent. |
| Endurance | Continuous Operation Time | ~45 minutes | With mixed walking and stationary sensing. |
| Endurance | Standby Time (Sensing Only) | ~8 hours | Periodic sensor data transmission. |
| Sensing | Temperature Accuracy | ±0.5 °C | Compared to calibrated thermometer. |
| Sensing | Humidity Accuracy | ±3% RH | Compared to calibrated hygrometer. |
| Communication | Data Transmission Success Rate | > 98% | Within 50 m of Wi-Fi access point. |
| Stability | Attitude Control Error | < ±2° (roll/pitch) | While walking on uneven terrain. |
Conclusion and Future Work
The developed hexapod agricultural bionic robot, centered on the STM32 microcontroller, proves to be a viable platform for autonomous field scouting and environmental monitoring. Its bio-inspired leg configuration provides the necessary mobility for challenging terrain where conventional wheeled robots fail. The integration of a multi-sensor system with IoT capabilities transforms the platform into a mobile data acquisition node, contributing to precision agriculture frameworks. The project successfully demonstrates key aspects of mechatronic system design, including kinematic modeling, real-time embedded control, sensor fusion, and wireless communication.
Future enhancements for this bionic robot are manifold. Firstly, the mechanical design could be optimized for weight reduction and increased payload capacity using topology optimization and advanced composites. Secondly, the sensory suite can be expanded to include multispectral or thermal cameras for plant health assessment, and more sophisticated navigation sensors like LiDAR or stereo vision for SLAM (Simultaneous Localization and Mapping). This would enable true autonomous navigation in complex, GPS-denied crop rows. Thirdly, the control algorithms can be advanced by implementing more adaptive gaits that dynamically adjust to sensed terrain, potentially using machine learning models trained on proprioceptive data. Finally, the development of a modular tool attachment system would allow the same bionic robot platform to perform physical interventions such as targeted spraying, micro-weeding, or even delicate fruit harvesting, greatly expanding its utility in the agricultural ecosystem.
