The integration of robotic systems into challenging operational environments necessitates platforms with exceptional adaptability and reliability. Traditional wheeled or tracked robots often fail in unstructured or highly complex terrain, creating a significant technological gap for critical applications such as exploration, search and rescue, and environmental monitoring. This gap has driven substantial interest in biologically inspired designs that mimic the locomotion of animals adept in such environments. This paper details the research and development of a novel aerial-terrestrial amphibious bionic robot. This bionic robot synthesizes the capabilities of an unmanned aerial vehicle (UAV) and a multi-legged ground robot into a single, cohesive platform. The core innovation lies in its ability to seamlessly transition between aerial flight for rapid deployment and long-range reconnaissance, and stable, adaptive ground locomotion for detailed inspection and manipulation in complex, confined, or hazardous areas where flight may be impractical or unsafe. The design philosophy of this bionic robot is deeply rooted in principles of biomimetics, control theory, and machine vision, aiming to create a robust and versatile tool for intelligent tasks in demanding scenarios.

The overarching system architecture is meticulously partitioned into three primary functional units: the Control Unit, the Power Unit, and the Propulsion & Actuation Unit. This modular approach ensures clear responsibility separation and system robustness. A summary of the core hardware components and their roles is presented in the table below.
| System Unit | Sub-System | Core Components | Primary Function |
|---|---|---|---|
| Control Unit | Terrestrial Control | Raspberry Pi, Robot HAT Expansion Board | High-level perception processing, gait generation, web server hosting, and servo control. |
| Control Unit | Aerial Control | Custom Flight Controller (e.g., LingXiao IMU), Futaba Receiver | Sensor fusion, attitude estimation, flight stabilization, and motor PWM output. |
| Power Unit | Dual Power Supply | 18650 Battery Pack (Terrestrial), Lithium Polymer (LiPo) Battery (Aerial) | Provides stable, isolated voltage rails for the aerial and terrestrial systems independently. |
| Propulsion & Actuation Unit | Aerial Propulsion | Brushless DC Motors, Electronic Speed Controllers (ESCs) | Generates lift and control forces for quadrotor flight. |
| Propulsion & Actuation Unit | Terrestrial Actuation | Digital Servo Motors (e.g., GDW-DS041MG) | Provides precise joint angle control for legged locomotion and posture adjustment. |
The terrestrial locomotion of this amphibious bionic robot is achieved through a quadrupedal configuration, directly inspired by the stability and adaptability of four-legged animals. Each of the four legs is endowed with multiple degrees of freedom (DOF), enabling complex movements for navigating obstacles, climbing uneven surfaces, and maintaining balance. The kinematic model of a single leg is fundamental for coordinated motion planning and control. Using the Denavit-Hartenberg (D-H) convention, a coordinate frame is attached to each joint. For a simplified model of a leg with three main links (coxa, femur, tibia) and corresponding joint angles, the forward kinematics defining the position $(P_x, P_y, P_z)$ of the foot relative to the body can be derived. Assuming the joint angles are $\theta_1$ (hip abduction/adduction), $\theta_2$ (hip flexion/extension), and $\theta_3$ (knee flexion/extension), and the link lengths are $L_1$, $L_2$, $L_3$, the position is given by:
$$
\begin{aligned}
P_x &= L_1 \cos(\theta_1) + L_2 \cos(\theta_1)\cos(\theta_2) + L_3 \cos(\theta_1)\cos(\theta_2 + \theta_3) \\
P_y &= L_1 \sin(\theta_1) + L_2 \sin(\theta_1)\cos(\theta_2) + L_3 \sin(\theta_1)\cos(\theta_2 + \theta_3) \\
P_z &= -L_2 \sin(\theta_2) - L_3 \sin(\theta_2 + \theta_3)
\end{aligned}
$$
These equations allow the control system to calculate where the foot will be placed given a set of joint commands. Conversely, inverse kinematics solves for the required joint angles $(\theta_1, \theta_2, \theta_3)$ to achieve a desired foot position $(P_x, P_y, P_z)$, which is crucial for implementing stable walking gaits and foothold planning on uneven terrain. The inverse solution often involves geometric or algebraic methods and may have multiple valid configurations.
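As a concrete illustration, the forward map above and one geometric inverse solution can be sketched in Python. The link lengths used in the example and the choice of the knee-bent-one-way branch are illustrative assumptions, not parameters of the actual robot:

```python
import math

def forward_kinematics(theta1, theta2, theta3, L1, L2, L3):
    """Foot position (Px, Py, Pz) from joint angles in radians, per the model above."""
    r = L1 + L2 * math.cos(theta2) + L3 * math.cos(theta2 + theta3)
    px = r * math.cos(theta1)
    py = r * math.sin(theta1)
    pz = -L2 * math.sin(theta2) - L3 * math.sin(theta2 + theta3)
    return px, py, pz

def inverse_kinematics(px, py, pz, L1, L2, L3):
    """Geometric inverse solution; picks one of the two valid knee configurations."""
    theta1 = math.atan2(py, px)
    r = math.hypot(px, py) - L1   # radial reach in the leg's sagittal plane
    zp = -pz                      # the model measures Pz downward as negative
    d2 = r * r + zp * zp
    c3 = (d2 - L2 * L2 - L3 * L3) / (2 * L2 * L3)
    if abs(c3) > 1.0:
        raise ValueError("target foot position is out of reach")
    theta3 = math.atan2(math.sqrt(1.0 - c3 * c3), c3)   # positive-knee branch
    theta2 = math.atan2(zp, r) - math.atan2(
        L3 * math.sin(theta3), L2 + L3 * math.cos(theta3))
    return theta1, theta2, theta3
```

A quick round-trip check (forward, then inverse) is a useful sanity test when porting such equations onto the controller.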
A cornerstone feature of the terrestrial mode is its self-balancing capability, essential for stability on slopes or when interacting with the environment. This is achieved through a closed-loop control system centered on an Inertial Measurement Unit (IMU), such as an MPU6050, which provides real-time orientation data (roll $\phi$, pitch $\theta$). A cascaded Proportional-Integral-Derivative (PID) control architecture is typically employed. The innermost loop is the PD “upright” or “balancing” loop, which calculates a corrective body movement based on the current tilt angles and angular rates to keep the torso level:
$$
\text{Output}_{\text{upright}} = K_{P,\phi} \cdot \phi + K_{D,\phi} \cdot \dot{\phi} + K_{P,\theta} \cdot \theta + K_{D,\theta} \cdot \dot{\theta}
$$
This output is then integrated into the gait generator, which adjusts the leg joint angles—effectively changing the height and position of the feet—to counteract the tilt. For instance, if the robot tilts forward, the algorithm will command the rear legs to extend and/or the front legs to retract slightly, shifting the center of gravity backward to restore equilibrium. This active posture control significantly enhances the robot’s robustness compared to statically stable platforms, allowing it to traverse more demanding inclines and surfaces.
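The upright loop and its coupling to foot heights can be sketched as follows. The gains, the scale factor, and the sign convention (positive pitch correction meaning the nose is tilted down, so front feet retract and rear feet extend) are placeholder assumptions for illustration, not tuned values from the robot:

```python
class BalanceController:
    """PD 'upright' loop from the equation above; gains are illustrative, untuned."""
    def __init__(self, kp_phi=2.0, kd_phi=0.1, kp_theta=2.0, kd_theta=0.1):
        self.kp_phi, self.kd_phi = kp_phi, kd_phi
        self.kp_theta, self.kd_theta = kp_theta, kd_theta

    def update(self, phi, phi_dot, theta, theta_dot):
        """Corrective output from roll/pitch angles (rad) and angular rates (rad/s)."""
        return (self.kp_phi * phi + self.kd_phi * phi_dot
                + self.kp_theta * theta + self.kd_theta * theta_dot)

def height_offsets(pitch_corr, roll_corr, scale=0.01):
    """Map corrections to per-leg foot-height offsets (m).
    Assumed convention: positive pitch_corr = nose down, so front feet are
    raised (negative offset) and rear feet lowered (positive offset)."""
    return {
        "front_left":  scale * (-pitch_corr - roll_corr),
        "front_right": scale * (-pitch_corr + roll_corr),
        "rear_left":   scale * (pitch_corr - roll_corr),
        "rear_right":  scale * (pitch_corr + roll_corr),
    }
```

On hardware this loop would run at a fixed rate (e.g., the IMU sample rate), feeding the offsets into the gait generator each cycle.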
The second major operational mode of this amphibious system is aerial flight, configured as a quadrotor. The dynamics of a quadrotor are well-established. The total thrust $T$ generated by the four rotors is the sum of individual thrusts $T_i$, and is oriented along the body’s z-axis. The thrust is proportional to the square of the rotational speed $\omega_i$ of each motor: $T_i = k_T \omega_i^2$, where $k_T$ is a thrust coefficient. The moments (roll $M_\phi$, pitch $M_\theta$, and yaw $M_\psi$) are generated by differential thrust between opposite motors. For a plus (+) configuration, with one rotor mounted on each body axis, the mapping from motor speeds to forces and moments can be represented as:
$$
\begin{bmatrix}
T \\ M_\phi \\ M_\theta \\ M_\psi
\end{bmatrix}
=
\begin{bmatrix}
k_T & k_T & k_T & k_T \\
0 & -k_T \cdot l & 0 & k_T \cdot l \\
-k_T \cdot l & 0 & k_T \cdot l & 0 \\
k_Q & -k_Q & k_Q & -k_Q
\end{bmatrix}
\begin{bmatrix}
\omega_1^2 \\ \omega_2^2 \\ \omega_3^2 \\ \omega_4^2
\end{bmatrix}
$$
Here, $l$ is the arm length from the center of mass to a motor, and $k_Q$ is a torque coefficient related to rotor drag. The flight controller uses data from its IMU and other sensors to solve the inverse of this allocation problem, determining the required $\omega_i$ for each motor to achieve a desired attitude and thrust. This aerial agility allows the bionic robot to overcome large terrain obstacles, survey areas from above, and deploy rapidly to a target zone before transitioning to its ground mode for detailed work.
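The allocation can be inverted numerically at each control step. A NumPy sketch follows; the coefficient values $k_T$, $k_Q$, and $l$ are illustrative placeholders, not measurements from this platform:

```python
import numpy as np

# Illustrative coefficients (SI units), not measured from the robot.
kT, kQ, l = 1.2e-5, 2.0e-7, 0.12

# Allocation matrix A mapping squared motor speeds to [T, M_phi, M_theta, M_psi].
A = np.array([
    [kT,      kT,     kT,     kT],
    [0.0,    -kT * l, 0.0,    kT * l],
    [-kT * l, 0.0,    kT * l, 0.0],
    [kQ,     -kQ,     kQ,    -kQ],
])

def motor_speeds(T, M_phi, M_theta, M_psi):
    """Solve A @ w^2 = [T, M_phi, M_theta, M_psi] for the motor speeds w_i."""
    w2 = np.linalg.solve(A, np.array([T, M_phi, M_theta, M_psi]))
    # Clip guards against infeasible demands that would require negative w^2.
    return np.sqrt(np.clip(w2, 0.0, None))
```

For a pure-hover command (moments zero), the solution degenerates to four equal speeds of $\sqrt{T/(4k_T)}$, which is a handy consistency check on the matrix.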
To enable intelligent interaction with its environment, the robot is equipped with a sophisticated vision system. The primary sensor is a camera module connected to the terrestrial control’s Raspberry Pi. The software stack is built around the OpenCV library, enabling real-time image processing. A key functionality is dynamic color-based object tracking. This process relies on converting the captured image from the default RGB (Red, Green, Blue) color space to the HSV (Hue, Saturation, Value) color space. HSV is more intuitive for color detection as it separates the color type (Hue) from its intensity (Value) and purity (Saturation), making detection more robust under varying lighting conditions.
The transformation and detection pipeline is as follows: Let $I_{rgb}(x,y)$ be the input RGB image. The conversion to HSV, $I_{hsv}(x,y) = \text{RGBtoHSV}(I_{rgb}(x,y))$, is performed. A binary mask $M(x,y)$ is then created by thresholding the Hue ($H$), Saturation ($S$), and Value ($V$) channels to isolate a specific color range, defined by lower and upper bounds $(H_{low}, S_{low}, V_{low})$ and $(H_{high}, S_{high}, V_{high})$:
$$
M(x,y) =
\begin{cases}
1, & \text{if } H_{low} \leq H(x,y) \leq H_{high},\ S_{low} \leq S(x,y) \leq S_{high},\ \text{and } V_{low} \leq V(x,y) \leq V_{high} \\
0, & \text{otherwise}
\end{cases}
$$
Morphological operations (erosion and dilation) are applied to $M(x,y)$ to reduce noise. Contours are then extracted from the cleaned mask. The centroid $(c_x, c_y)$ of the largest contour is computed, providing the target’s pixel coordinates in the image frame. This coordinate can be used for visual servoing, allowing the bionic robot to autonomously follow or approach an object of interest. The following table provides example HSV ranges for common colors within the OpenCV 8-bit representation (H: 0-179, S/V: 0-255).
| Color | Hmin | Hmax | Smin | Smax | Vmin | Vmax |
|---|---|---|---|---|---|---|
| Red | 0 | 10 | 100 | 255 | 50 | 255 |
| Red (wrap) | 170 | 179 | 100 | 255 | 50 | 255 |
| Green | 35 | 85 | 50 | 255 | 50 | 255 |
| Blue | 100 | 130 | 50 | 255 | 50 | 255 |
| Yellow | 20 | 35 | 100 | 255 | 50 | 255 |
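The masking and centroid steps can be sketched without OpenCV. This NumPy version mirrors what `cv2.inRange` and image moments compute on an 8-bit HSV image; for simplicity it takes the centroid of the whole mask rather than of the largest contour, and the red ranges come from the table above:

```python
import numpy as np

def color_mask(hsv, lo, hi):
    """Binary mask M(x,y): 1 where each HSV channel lies within [lo, hi]."""
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    m = ((lo[0] <= h) & (h <= hi[0])
         & (lo[1] <= s) & (s <= hi[1])
         & (lo[2] <= v) & (v <= hi[2]))
    return m.astype(np.uint8)

def red_mask(hsv):
    """Red spans the hue wrap-around, so OR the two ranges from the table."""
    return (color_mask(hsv, (0, 100, 50), (10, 255, 255))
            | color_mask(hsv, (170, 100, 50), (179, 255, 255)))

def centroid(mask):
    """Centroid (c_x, c_y) of the mask, i.e. (m10/m00, m01/m00) in moment terms."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None   # no pixels matched the color range
    return xs.mean(), ys.mean()
```

In the deployed pipeline, erosion/dilation and contour extraction (e.g., `cv2.erode`, `cv2.dilate`, `cv2.findContours`) would be applied between the masking and centroid steps.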
Furthermore, the system implements a real-time video streaming server using a lightweight web framework like Flask. This allows an operator to view the robot’s camera feed remotely through a web browser by connecting to the Raspberry Pi’s IP address, facilitating teleoperation and situational awareness. The control interface for the terrestrial mode can also be hosted as a web application, enabling command inputs (e.g., directional movement, gait selection, posture adjustment) from the same browser window. The aerial mode is typically controlled via a dedicated radio transmitter (e.g., TX16s) for low-latency, reliable piloting. This dual-control paradigm (web-based for ground, RC for air) provides operational flexibility and redundancy, ensuring that the bionic robot remains controllable even if one communication channel is compromised.
The integration of aerial and terrestrial systems presents unique challenges, particularly in power management, weight distribution, and control mode switching. The dual power supply system is critical, ensuring that high-current demands of the aerial motors do not cause voltage sags that could reset the sensitive terrestrial electronics. The mechanical design must carefully balance the weight of the flight system (motors, ESCs, flight controller, large battery) against the need for a lightweight yet strong leg structure. The transition protocol—from flight to ground mode—is a key research area. It involves a controlled landing, potential locking mechanisms for the legs or rotors, and a software handover from the flight controller to the terrestrial controller. This seamless transition is what ultimately defines the operational utility of this amphibious bionic robot.
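One way to structure the software handover is an explicit mode state machine that only releases control to the terrestrial controller once touchdown is confirmed. The states, the descent-rate threshold, and the method names below are hypothetical illustrations of such a protocol, not the robot's actual implementation:

```python
from enum import Enum, auto

class Mode(Enum):
    FLIGHT = auto()
    LANDING = auto()
    GROUND = auto()

class ModeManager:
    """Hypothetical flight-to-ground handover sequencer."""
    def __init__(self):
        self.mode = Mode.FLIGHT

    def request_landing(self):
        """Operator or planner initiates a controlled landing."""
        if self.mode is Mode.FLIGHT:
            self.mode = Mode.LANDING

    def confirm_touchdown(self, vertical_speed, legs_in_contact):
        """Hand over only once descent has stopped and leg contact is confirmed.

        vertical_speed: estimated descent rate in m/s (threshold is illustrative).
        legs_in_contact: boolean ground-contact signal from the leg sensors.
        """
        if (self.mode is Mode.LANDING
                and abs(vertical_speed) < 0.05
                and legs_in_contact):
            self.mode = Mode.GROUND   # lock rotors, enable terrestrial controller
            return True
        return False
```

Gating the transition on both the descent rate and a contact signal avoids handing over mid-bounce, when the flight controller still needs authority to stabilize.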
In conclusion, the development of an aerial-terrestrial amphibious bionic robot represents a significant step forward in creating versatile robotic platforms for complex environments. By fusing the strategic mobility of aerial drones with the tactical stability and endurance of legged robots, this hybrid bionic robot overcomes the limitations inherent in single-domain systems. Its design incorporates biomimetic principles for locomotion, advanced cascaded PID control for self-balancing, and computer vision for environmental perception and interaction. While challenges in optimal weight distribution, seamless mode transition, and integrated autonomous navigation remain active areas of development, the proposed system demonstrates a feasible and highly promising architecture. Such amphibious bionic robots have the potential to revolutionize applications in disaster response, where they can fly over rubble to locate survivors and then walk through debris to deliver aid; in industrial inspection of complex infrastructure like pipelines or wind turbines; and in scientific exploration of remote and rugged natural environments. The continuous refinement of these integrated systems will undoubtedly unlock new frontiers in robotics and autonomous operation.
