Attitude Estimation and Control for Bipedal Humanoid Robots Using Raspberry Pi and Inertial Sensors

In my work on enhancing the mobility and stability of bipedal humanoid robots, accurate perception of self-motion state is paramount. When operating in highly dynamic and complex environments, a humanoid robot must continuously estimate its own attitude to adapt to changes in motion. This capability is critical for maintaining balance, executing precise movements, and recovering from disturbances. To address this, I have designed and implemented a comprehensive attitude estimation and control system based on a Raspberry Pi and inertial measurement units (IMUs). This system enables a humanoid robot to sense its orientation, process the data, and make real-time adjustments to its posture. The core objective is to achieve robust and efficient attitude estimation, which is fundamental for the autonomous operation of any advanced humanoid robot.

The system I developed can be broadly divided into three functional modules. First, inertial sensors collect the raw data essential for attitude estimation. Second, a Raspberry Pi single-board computer receives this sensor data and processes it through attitude-estimation algorithms to determine the robot's current orientation. Third, based on the estimated attitude, the Raspberry Pi issues control commands to the robot's actuators to execute the necessary postural corrections. Together, these components form a cohesive pipeline that integrates sensing, computation, and actuation for humanoid robot control.

The selection of hardware components was driven by the need for computational power, sensor accuracy, and reliable communication. The central processing unit is a Raspberry Pi 3 Model B. This device, an ARM-based microcomputer, offers extensive hardware interfaces and full PC-like functionality, making it ideal for embedded applications in robotics. Its 40-pin GPIO header provides the necessary connectivity for reading data from inertial sensors and communicating with the robot’s motor control system. For inertial sensing, I evaluated and utilized two distinct modules: the MPU6050 and the JY901. Both are integrated inertial sensors, but with different features and performance characteristics, allowing for a comparative study to achieve a system with minimal relative error in attitude estimation for the humanoid robot.

The MPU6050 is a widely used 6-axis motion tracking device that combines a 3-axis gyroscope and a 3-axis accelerometer. It can be connected to an external magnetometer to form a 9-axis system. A key feature is its integrated Digital Motion Processor (DMP), which performs sensor fusion and quaternion-based attitude calculation on-chip. The DMP outputs quaternion data directly, which can be converted to Euler angles, thereby offloading the complex computation from the main processor. This significantly reduces the computational burden on the Raspberry Pi, enhancing system real-time performance for the humanoid robot’s control tasks. The quaternion output q from the DMP is represented as:

$$ q = q_w + q_x i + q_y j + q_z k $$

where \( q_w \) is the scalar part and \( q_x, q_y, q_z \) are the vector components. The Euler angles—roll (\( \phi \)), pitch (\( \theta \)), and yaw (\( \psi \))—can be derived from the quaternion using the following conversions:

$$ \phi = \arctan\left(\frac{2(q_w q_x + q_y q_z)}{1 - 2(q_x^2 + q_y^2)}\right) $$
$$ \theta = \arcsin\left(2(q_w q_y - q_z q_x)\right) $$
$$ \psi = \arctan\left(\frac{2(q_w q_z + q_x q_y)}{1 - 2(q_y^2 + q_z^2)}\right) $$
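These conversions can be implemented directly. A sketch using Python's `math` module, with the two-argument `atan2` in place of a plain arctangent so the full angular range is preserved:

```python
import math

def quaternion_to_euler(qw, qx, qy, qz):
    """Convert a unit quaternion (e.g., DMP output) to roll, pitch, yaw in degrees."""
    # Roll (phi): rotation about the x-axis
    roll = math.atan2(2.0 * (qw * qx + qy * qz),
                      1.0 - 2.0 * (qx * qx + qy * qy))
    # Pitch (theta): clamp the argument to avoid asin domain errors from
    # numerical noise near the +/-90 degree singularity
    s = max(-1.0, min(1.0, 2.0 * (qw * qy - qz * qx)))
    pitch = math.asin(s)
    # Yaw (psi): rotation about the z-axis
    yaw = math.atan2(2.0 * (qw * qz + qx * qy),
                     1.0 - 2.0 * (qy * qy + qz * qz))
    return (math.degrees(roll), math.degrees(pitch), math.degrees(yaw))

roll, pitch, yaw = quaternion_to_euler(1.0, 0.0, 0.0, 0.0)  # identity -> all zeros
```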

In contrast, the JY901 module is a more advanced attitude and heading reference system (AHRS). It integrates a 3-axis gyroscope, a 3-axis accelerometer, and a 3-axis magnetometer (compass). Furthermore, it incorporates a high-performance microprocessor running dynamic solving and Kalman filtering algorithms internally. This allows the JY901 to output real-time, filtered attitude angles directly with high stability and accuracy. Its specified static accuracy is 0.05 degrees and its dynamic accuracy is 0.1 degrees. The built-in Kalman filter is crucial for fusing data from multiple sensors in a dynamic environment. A simplified discrete-time Kalman filter involves two main steps, prediction and update. The prediction step propagates the state estimate and its error covariance:

$$ \hat{x}_{k|k-1} = F_k \hat{x}_{k-1|k-1} + B_k u_k $$
$$ P_{k|k-1} = F_k P_{k-1|k-1} F_k^T + Q_k $$

where \( \hat{x} \) is the state estimate (e.g., orientation), \( F \) is the state transition model, \( u \) is the control input, \( P \) is the error covariance, and \( Q \) is the process noise covariance. The update step, upon receiving a measurement \( z_k \), is:

$$ K_k = P_{k|k-1} H_k^T (H_k P_{k|k-1} H_k^T + R_k)^{-1} $$
$$ \hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k (z_k - H_k \hat{x}_{k|k-1}) $$
$$ P_{k|k} = (I – K_k H_k) P_{k|k-1} $$

Here, \( K \) is the Kalman gain, \( H \) is the observation model, and \( R \) is the measurement noise covariance. The JY901’s internal implementation of such filtering provides superior performance for the humanoid robot’s attitude estimation, especially in mitigating noise and drift.
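The two-step cycle above can be illustrated with a minimal filter for a single attitude angle plus a gyro-bias state, a common IMU formulation. This is only a sketch: the `Q` and `R` values are placeholders, not the JY901's internal tuning.

```python
import numpy as np

dt = 0.01                                  # sample period (s)
F = np.array([[1.0, -dt], [0.0, 1.0]])     # state transition: [angle, gyro bias]
B = np.array([[dt], [0.0]])                # control input model (gyro rate u)
H = np.array([[1.0, 0.0]])                 # measurement model: accel-derived angle
Q = np.diag([1e-4, 1e-6])                  # process noise covariance (placeholder)
R = np.array([[0.03]])                     # measurement noise covariance (placeholder)

def kalman_step(x, P, gyro_rate, z_accel_angle):
    # Prediction: propagate state and covariance
    x_pred = F @ x + B * gyro_rate
    P_pred = F @ P @ F.T + Q
    # Update: blend in the accelerometer measurement via the Kalman gain
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_new = x_pred + K @ (np.array([[z_accel_angle]]) - H @ x_pred)
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# Stationary gyro plus a constant 10-degree accelerometer angle:
x, P = np.zeros((2, 1)), np.eye(2)
for _ in range(200):
    x, P = kalman_step(x, P, gyro_rate=0.0, z_accel_angle=10.0)
# x[0, 0] converges toward 10 degrees; x[1, 0] is the estimated gyro bias
```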

For actuation, the humanoid robot uses servo motors for joint movements. A dedicated serial servo controller board is employed to generate the precise pulse-width modulation (PWM) signals required to control servo angles. However, this controller lacks advanced computational capability. Therefore, a cooperative scheme was adopted: the Raspberry Pi runs the main control program, which includes the attitude estimation and adjustment algorithms. After determining the necessary postural correction, the Raspberry Pi sends high-level commands to the servo controller via a serial interface, which then translates them into low-level PWM signals for the servos. This decoupling allows for complex logic to reside on the Pi while ensuring precise and timely motor control for the humanoid robot.

The hardware system design revolves around the seamless integration of these components. The core parts include the Raspberry Pi 3B, the inertial sensors (MPU6050 and JY901), the robot servo controller, the servo motors, and a stable power supply. Ensuring proper communication protocols and initial configuration is vital for system reliability.

Communication Protocol Design

Inter-module communication was carefully planned to ensure data integrity and speed. Both the MPU6050 and JY901 support the I²C (Inter-Integrated Circuit) communication protocol. Since the MPU6050 exclusively uses I²C, this protocol was chosen for data transfer between the inertial sensors and the Raspberry Pi. The I²C bus uses two lines, the Serial Data Line (SDA) and the Serial Clock Line (SCL), and supports multi-master, multi-slave communication at data rates of up to 400 kbit/s in Fast mode. Each device on the bus has a unique address. In this setup, the Raspberry Pi acts as the master, while the sensors are slaves. The connection uses four pins for each sensor: VCC (to 3.3 V on the Pi), GND, SDA, and SCL.
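As a sketch of the sensor-side I²C traffic, the following reads raw accelerometer samples from the MPU6050 with the `smbus` library. The device address (0x68) and registers (0x6B PWR_MGMT_1, 0x3B ACCEL_XOUT_H) are from the MPU6050 register map; the bus number is the usual default on recent Pi models.

```python
def to_int16(hi, lo):
    """Combine two register bytes into a signed 16-bit value."""
    value = (hi << 8) | lo
    return value - 65536 if value >= 32768 else value

def read_mpu6050_accel(bus_id=1, addr=0x68):
    """Return one raw (ax, ay, az) accelerometer sample in LSB counts."""
    import smbus  # imported here so to_int16 stays usable off the robot
    bus = smbus.SMBus(bus_id)
    bus.write_byte_data(addr, 0x6B, 0x00)           # wake the sensor from sleep
    block = bus.read_i2c_block_data(addr, 0x3B, 6)  # ACCEL_XOUT_H .. ACCEL_ZOUT_L
    return (to_int16(block[0], block[1]),
            to_int16(block[2], block[3]),
            to_int16(block[4], block[5]))
```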

The servo controller board supports both UART (Universal Asynchronous Receiver-Transmitter) serial communication and I²C. However, to avoid bus contention and ensure stable data flow for the critical control channel, the communication between the Raspberry Pi and the servo controller was implemented using UART. The controller has dedicated TXD (transmit) and RXD (receive) pins. When the Raspberry Pi sends a command, its UART peripheral converts parallel data into a serial bit stream on the TX line. The controller’s UART receives this stream on its RX line, collects it into a buffer, and the controller’s processor reads the data. The connection also uses four pins: VCC (3.3V), GND, Pi TXD to Controller RXD, and Pi RXD to Controller TXD.
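A hedged sketch of this command path in Python: the byte framing below (header, servo ID, angle, checksum) is hypothetical, since the actual protocol depends on the servo controller model; only the pyserial usage pattern is general.

```python
def build_frame(servo_id, angle):
    """Pack a hypothetical [header, id, angle, checksum] command frame.
    The real framing depends on the servo controller's documented protocol."""
    if not 0 <= angle <= 180:
        raise ValueError("servo angle out of range")
    payload = [0x55, servo_id, angle]
    return bytes(payload + [sum(payload) & 0xFF])

def send_servo_command(servo_id, angle, port="/dev/ttyAMA0", baud=9600):
    """Write one command frame to the controller over the Pi's hardware UART."""
    import serial  # pyserial; imported here so build_frame is testable off-robot
    with serial.Serial(port, baud, timeout=1) as ser:
        ser.write(build_frame(servo_id, angle))
```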

Hardware Communication Interface Summary

| Component Pair | Communication Protocol | Data Lines Used | Purpose in Humanoid Robot |
|---|---|---|---|
| Raspberry Pi ↔ MPU6050 | I²C | SDA, SCL | Raw gyro/accel data acquisition for attitude |
| Raspberry Pi ↔ JY901 | I²C | SDA, SCL | Fused attitude angle data acquisition |
| Raspberry Pi ↔ Servo Controller | UART serial | TXD, RXD | Transmission of joint angle commands |

Raspberry Pi Pin Configuration and Interface Activation

The Raspberry Pi 3 has two serial ports: a hardware-assisted UART (/dev/ttyAMA0) and a mini UART (/dev/ttyS0). The hardware UART is more reliable as it has a dedicated baud rate clock source. By default on the Pi 3, the hardware UART is assigned to the Bluetooth module, and the mini UART is mapped to the GPIO pins 14 (TXD) and 15 (RXD). Using the mini UART is problematic because its baud rate is tied to the CPU core frequency, leading to potential communication failures. Therefore, I reconfigured the system to restore the hardware UART to the GPIO 14/15 pins, ensuring robust and stable serial communication with the servo controller for the humanoid robot.

This configuration involved modifying the `/boot/config.txt` file and disabling the serial console. Furthermore, the I²C interface needed to be activated. This was done by adding the lines `i2c-bcm2708` and `i2c-dev` to the `/etc/modules` file and enabling I²C via the `raspi-config` tool. Since the control software is written in Python, the `smbus` library was installed to facilitate I²C communication from Python scripts, enabling easy reading of sensor data for the humanoid robot’s state estimation.
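The changes above can be sketched as follows; the exact device-tree overlay name varies by firmware version (`pi3-disable-bt` on older releases, `disable-bt` on newer ones), so treat this as an illustrative configuration rather than a verbatim record:

```shell
# /boot/config.txt -- free the PL011 hardware UART and map it to GPIO 14/15
enable_uart=1
dtoverlay=pi3-disable-bt

# Shell commands:
sudo systemctl disable hciuart       # stop Bluetooth from claiming the UART
sudo raspi-config                    # disable the serial login console; enable I2C

# /etc/modules -- load the I2C kernel drivers at boot
i2c-bcm2708
i2c-dev

# Python I2C bindings
sudo apt-get install python-smbus
```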

Sensor Calibration and System Presets

Proper calibration of inertial sensors is critical for accurate attitude estimation. The JY901 module requires two calibration procedures before use: accelerometer calibration and magnetometer calibration. Accelerometer calibration removes the zero-bias error inherent in the sensor. The process involves placing the sensor in six static positions (positive and negative directions for each axis) and computing correction parameters. Magnetometer calibration compensates for hard and soft iron distortions in the local magnetic environment, which is essential for accurate yaw (heading) calculation. Without this, the yaw angle would exhibit significant errors, detrimental for the humanoid robot’s orientation awareness.

For the MPU6050, while its DMP can output calibrated data, initial sensor calibration (like setting offsets) is often recommended for best performance. This can be done by collecting data while the sensor is stationary and calculating average offsets for the gyroscope and accelerometer.
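A minimal sketch of this stationary averaging, assuming a `read_raw()` helper (a placeholder, not a library function) that returns one (ax, ay, az, gx, gy, gz) tuple of raw counts:

```python
def estimate_offsets(read_raw, n_samples=500):
    """Average raw samples collected while the sensor is still and level.
    read_raw() is a placeholder returning (ax, ay, az, gx, gy, gz) counts."""
    sums = [0.0] * 6
    for _ in range(n_samples):
        sums = [s + v for s, v in zip(sums, read_raw())]
    offsets = [s / n_samples for s in sums]
    # Gravity stays on the z accel axis: the az offset is the deviation from
    # +16384 LSB (1 g at the +/-2 g full-scale setting), not from zero.
    offsets[2] -= 16384.0
    return offsets
```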

In addition to sensor calibration, the control system for the humanoid robot was pre-programmed with a set of posture adjustment strategies. These are essentially state-response policies designed to recover from various unstable configurations. Examples include:

  • Adjusting the heading (yaw) to face a specific direction.
  • Correcting pitch (forward/backward lean) or roll (sideways lean) to return to an upright posture.
  • Executing a pre-defined sequence of servo movements to stand up after a forward fall.
  • Executing a different sequence to recover from a backward fall.

The Raspberry Pi, after estimating the current attitude, selects the appropriate policy and sends the corresponding command sequence to the servo controller.
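Such policies can be organized as a simple lookup table. The state names and action sequences below are illustrative placeholders, not the exact scripted servo keyframes used on the robot:

```python
# Minimal state-response policy table, mirroring the strategies listed above.
POSTURE_POLICIES = {
    "upright_stable":  [],                       # hold position
    "lean_forward":    ["shift_hips_back", "flex_ankles"],
    "lean_sideways":   ["widen_stance", "shift_com_lateral"],
    "fallen_forward":  ["stand_up_from_front"],  # pre-defined get-up routine
    "fallen_backward": ["stand_up_from_back"],
    "yaw_offset":      ["step_turn_to_heading"],
}

def select_actions(state):
    """Return the corrective action sequence for a classified posture state."""
    return POSTURE_POLICIES.get(state, [])
```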

Attitude Estimation Algorithm Design

The core software running on the Raspberry Pi is responsible for continuous attitude estimation and decision-making. The program flow is cyclical. Upon startup, it initializes the I²C and UART interfaces, then enters a main loop. In each iteration, it reads data from the selected inertial sensor: when using the MPU6050 with DMP, it reads the quaternion registers directly; when using the JY901, it reads the pre-calculated Euler angle registers.

For the MPU6050 without DMP (or for a custom sensor fusion algorithm), raw gyroscope and accelerometer data would be read and fused, for example with a complementary filter or a Kalman filter. The complementary filter is simpler and often effective: it combines the gyroscope's short-term accuracy with the accelerometer's long-term stability (for pitch and roll). The update for an angle estimate \( \theta \) is:

$$ \theta_{filtered} = \alpha \left(\theta_{prev} + \dot{\theta}_{gyro} \cdot \Delta t\right) + (1 - \alpha)\, \theta_{accel} $$

where \( \alpha \) is a tuning parameter (typically close to 0.98), \( \theta_{prev} \) is the previous filtered estimate, \( \dot{\theta}_{gyro} \) is the angular rate from the gyro, \( \Delta t \) is the sample time, and \( \theta_{accel} \) is the angle calculated from accelerometer data (e.g., \( \arctan(a_y / a_z) \) for the roll angle under a common axis convention). For a full 3D orientation, quaternions are preferred. A simple quaternion-based complementary filter update step can be:

$$ q_{gyro} = q_{prev} \otimes \left(1, \frac{\omega_x \Delta t}{2}, \frac{\omega_y \Delta t}{2}, \frac{\omega_z \Delta t}{2}\right) $$
$$ q_{accel\_mag} = \text{quaternion from accelerometer and magnetometer vectors} $$
$$ q_{filtered} = \gamma \cdot q_{gyro} + (1-\gamma) \cdot q_{accel\_mag} $$
$$ q_{filtered} = \frac{q_{filtered}}{\|q_{filtered}\|} $$

where \( \otimes \) denotes quaternion multiplication, \( \omega \) is the angular velocity vector, and \( \gamma \) is a gain. However, in my implementation, I primarily leveraged the sensors’ internal processing (DMP for MPU6050, onboard Kalman filter for JY901) to simplify the software and ensure computational efficiency for the humanoid robot’s real-time control.
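Although my implementation relied on the sensors' internal fusion, the scalar complementary filter described above is compact enough to sketch in Python; the 0.98 gain and the accelerometer angle convention follow the text.

```python
import math

def complementary_step(theta_prev, gyro_rate, a_y, a_z, dt, alpha=0.98):
    """One filter update: gyro integration corrected by the accel-derived angle."""
    theta_accel = math.degrees(math.atan2(a_y, a_z))  # gravity-referenced angle
    return alpha * (theta_prev + gyro_rate * dt) + (1.0 - alpha) * theta_accel

# Stationary and level (a_y = 0, a_z = 1 g): an initial 5-degree error decays
theta = 5.0
for _ in range(300):
    theta = complementary_step(theta, gyro_rate=0.0, a_y=0.0, a_z=1.0, dt=0.01)
```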

After obtaining the current Euler angles (pitch, roll, yaw), the program evaluates them against predefined thresholds to classify the humanoid robot’s posture state. For instance:

  • If \( |\text{pitch}| > 30^\circ \) and pitch is positive, classify as “leaning forward dangerously”.
  • If \( |\text{roll}| > 25^\circ \), classify as “leaning sideways”.
  • If both pitch and roll are within \( \pm 5^\circ \), classify as “upright and stable”.

Based on this classification, the program either maintains the current servo positions (if the posture is acceptable) or triggers a corrective action sequence. The corrective actions involve sending a series of target angle commands for multiple servos to the controller via UART. The program then loops back to read the new sensor data, creating a closed-loop feedback system for the humanoid robot’s posture control.
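The threshold logic can be sketched as a small classifier; the threshold values follow the examples above, and the state names are illustrative:

```python
def classify_posture(pitch, roll):
    """Map Euler angles (degrees) to a posture state via fixed thresholds."""
    if abs(pitch) < 5.0 and abs(roll) < 5.0:
        return "upright_stable"
    if pitch > 30.0:
        return "leaning_forward_dangerously"
    if abs(roll) > 25.0:
        return "leaning_sideways"
    return "adjusting"  # within limits that need no immediate recovery action
```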

Posture State Classification and Corresponding Action

| State Condition (Euler Angles) | Classification | Corrective Action for Humanoid Robot |
|---|---|---|
| \( \lvert\phi\rvert < 5^\circ \), \( \lvert\theta\rvert < 5^\circ \), any \( \psi \) | Upright Stable | No action, hold position. |
| \( \theta > 30^\circ \), \( \lvert\phi\rvert < 10^\circ \) | Forward Fall Imminent | Counter-movement in ankles/hips, or prepare fall recovery. |
| \( \theta < -25^\circ \), \( \lvert\phi\rvert < 10^\circ \) | Backward Fall Imminent | Backward lean compensation. |
| \( \lvert\phi\rvert > 20^\circ \), \( \lvert\theta\rvert < 15^\circ \) | Excessive Roll | Adjust step width or shift center of mass laterally. |
| Large change in \( \psi \) detected | Unintended Yaw | Adjust foot placement to correct heading. |

Testing Methodology and Performance Evaluation

The performance of the developed attitude estimation and control system was evaluated in two primary phases: sensor data accuracy/stability analysis and physical robot posture adjustment validation.

Phase 1: Sensor Data Analysis. The MPU6050 and JY901 modules were separately connected to the Raspberry Pi. Each sensor was placed on a level surface, and attitude data (pitch, roll, yaw) was logged for a period of 10 seconds to establish a baseline and observe static drift. Then, the sensor was rotated by precise known angles (using a calibrated protractor) in each axis sequentially. The measured angle change from the sensor was recorded and compared to the actual rotation. This process was repeated multiple times for each axis and for both sensors. Key metrics calculated were the mean error and the standard deviation of the error. The error for a single measurement is defined as:

$$ E = \theta_{measured} - \theta_{actual} $$

The relative error percentage is:

$$ E_{rel} = \left| \frac{E}{\theta_{actual}} \right| \times 100\% $$

Furthermore, the sensors were left static for an extended period (several minutes) to monitor any significant drift in the output, particularly for the yaw angle which, without a magnetometer, is prone to integrate gyroscope bias.
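The evaluation metrics above can be computed directly from the logged (measured, actual) rotation pairs:

```python
def error_metrics(pairs):
    """Mean error, error standard deviation, and per-trial relative errors (%)
    from a list of (measured, actual) angle pairs."""
    errors = [m - a for m, a in pairs]
    n = len(errors)
    mean_err = sum(errors) / n
    std_err = (sum((e - mean_err) ** 2 for e in errors) / n) ** 0.5
    rel_errs = [abs((m - a) / a) * 100.0 for m, a in pairs if a != 0]
    return mean_err, std_err, rel_errs

# Three hypothetical 90-degree rotation trials:
mean_e, std_e, rel = error_metrics([(90.5, 90.0), (89.8, 90.0), (90.1, 90.0)])
```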

Phase 2: Humanoid Robot Physical Tests. The complete system, with the selected sensor (initially JY901, then MPU6050 for comparison), was installed on a bipedal humanoid robot platform. The Raspberry Pi and servo controller were mounted on the robot’s torso. The robot was placed in various initial postures:

  • Upright with a slight forward tilt.
  • Upright with a lateral tilt.
  • Knocked over onto its front.
  • Knocked over onto its back.
  • Facing different compass directions (to test yaw reference).

For each test, the system was activated. The time taken from the start command until the robot achieved a stable, upright posture was measured. The success of the adjustment was qualitatively judged based on whether the robot reached a fully upright stance without oscillation or falling again. Quantitative measures included the final steady-state error in pitch and roll angles from perfect upright (0°).

Comparative Sensor Performance in Static and Dynamic Tests

| Performance Metric | MPU6050 (DMP Output) | JY901 (Onboard Filter) |
|---|---|---|
| Static accuracy (roll/pitch) | ±0.2° (after calibration) | ±0.05° (specified, after manual calibration) |
| Yaw accuracy (with magnetometer) | N/A (requires external magnetometer) | ±0.3° (after magnetometer calibration) |
| Yaw drift (static, 60 s) | Significant (>5°) | Negligible (<0.2°) |
| Dynamic response lag | ~10 ms | ~15 ms |
| Output data stability | Moderate noise | Very stable, low noise |
| Required CPU load on Pi | Very low (DMP does fusion) | Very low (sensor does fusion) |
Humanoid Robot Posture Recovery Test Results (Using JY901)

| Initial Posture Condition | Average Recovery Time (s) | Success Rate (%) | Final Posture Error (Pitch/Roll) |
|---|---|---|---|
| Forward lean (20° pitch) | 2.1 | 100 | ±1.5° |
| Lateral lean (15° roll) | 1.8 | 100 | ±1.2° |
| Fallen forward (on chest) | 5.4 | 90 | ±2.0° |
| Fallen backward (on back) | 6.0 | 85 | ±2.5° |
| Arbitrary yaw offset (90°) | 3.5 | 100 | Yaw error ±3.0° |

Discussion and Conclusion

The experimental results clearly demonstrate the critical importance of sensor selection for reliable attitude estimation in a humanoid robot. The JY901 module, with its integrated magnetometer and advanced onboard Kalman filtering, provided significantly more stable and accurate orientation data, especially for the yaw angle. The MPU6050, while excellent for measuring rotational rates and linear acceleration, exhibited substantial yaw drift when used standalone because it lacks a heading reference. This drift directly translated to poorer performance in the humanoid robot’s ability to correctly adjust its orientation over time. The magnetometer in the JY901 provides an absolute reference to the Earth’s magnetic field, correcting the integral drift inherent in gyroscope-based yaw estimation. The filter dynamics can be modeled as a sensor fusion problem where the state vector includes orientation and gyro bias. The measurement update incorporates accelerometer data for gravity vector alignment (correcting pitch/roll) and magnetometer data for heading alignment.

The system successfully met its primary goal: enabling a bipedal humanoid robot to perceive its own attitude and execute appropriate postural adjustments. The integration of the Raspberry Pi provided a flexible and powerful computation platform. Its ability to run complex Python scripts for sensor data processing, state machine logic, and serial communication was instrumental. The modular design—separating sensor data acquisition, attitude estimation, and servo control—proved robust and allowed for easy swapping of sensor hardware or adjustment of control algorithms.

The posture recovery strategies, though pre-scripted, were effective for a range of common instability scenarios. The closed-loop nature of the system, where the robot continuously senses and reacts, is a foundational step toward more autonomous and adaptive balance control for humanoid robots. Future work will focus on implementing more advanced, adaptive control algorithms such as linear quadratic regulators (LQR) or model predictive control (MPC) that use the estimated attitude as state feedback. The system state could be defined as:

$$ x = [\phi, \theta, \psi, \dot{\phi}, \dot{\theta}, \dot{\psi}, \text{… joint angles…}]^T $$

and a cost function \( J \) minimized over a future horizon \( N \):

$$ J = \sum_{k=0}^{N} (x_k^T Q x_k + u_k^T R u_k) $$

where \( Q \) and \( R \) are weight matrices, and \( u \) represents control inputs (servo targets). Furthermore, expanding the sensor suite to include force-sensitive resistors (FSRs) in the feet would provide crucial information about center of pressure, enabling even more sophisticated balance control for the humanoid robot.

In summary, this project has successfully designed, built, and validated a practical attitude estimation and control system for a bipedal humanoid robot. It underscores the synergy between appropriate sensor hardware (like the JY901 with its magnetometer and filter) and flexible computation platforms (like the Raspberry Pi). The methodologies and results presented contribute to the broader field of humanoid robotics, particularly in the area of state estimation and reactive stability control, which are indispensable for deploying humanoid robots in real-world, unstructured environments.
