In the field of robotics, companion robots have gained significant attention for their potential to provide social interaction and assistance in domestic environments. However, many existing companion robots lack articulated arms, limiting their ability to perform limb-based interactions such as gesturing, waving, or handshaking. This limitation reduces their effectiveness in engaging with humans through natural body language. To address this gap, we present the design, kinematic analysis, and implementation of humanoid upper limbs for a companion robot, enabling rich limb interaction capabilities. Our work focuses on creating a cost-effective and functional solution that integrates seamlessly with a companion robot platform, enhancing its social presence and utility.
The core motivation stems from the growing need for companion robots that can produce human-like movements to foster emotional connections, especially for isolated individuals such as the elderly or children. While advanced humanoid robots like ASIMO or iCub exist, they are often prohibitively expensive for widespread home use. Our aim is to develop an affordable yet capable upper limb system that allows a companion robot to perform a variety of gestures, thereby improving human-robot interaction through non-verbal communication. This paper details the mechanical design, forward kinematics using the Denavit-Hartenberg (D-H) method, gesture planning, and experimental validation. By incorporating these limbs, the companion robot can execute actions like greeting, waving, and shaking hands, making it more relatable and interactive.

The humanoid upper limbs for our companion robot consist of two symmetric arms, each with five degrees of freedom (DOFs), and simplified hands with one DOF for grasping. This design balances complexity and functionality, ensuring that the companion robot can perform essential gestures without excessive cost or control overhead. The arm structure mimics human anatomy: three DOFs at the shoulder (for flexion/extension, abduction/adduction, and internal/external rotation), one DOF at the elbow (for flexion/extension), and one DOF at the wrist (for rotation). The hand is designed as an end-effector capable of opening and closing via a single servo motor, enabling basic grip actions. Materials include aluminum alloy for the frame and plastic for the casing, keeping the weight low at approximately 2.0 kg per arm with a payload capacity of 0.5 kg. The dimensions are tailored to fit the companion robot’s body, which has a height of 1400 mm and a width of 400 mm, ensuring proportional and aesthetic integration.
To quantify the design parameters, we provide detailed specifications in Table 1. These parameters are critical for kinematic modeling and motion planning. The arm length totals 400 mm, divided into upper arm (100 mm), forearm (120 mm), and hand (80 mm) segments. Each joint is actuated by ZX20D serial bus servos, which offer precise angular control and daisy-chaining capability for simplified wiring. This modular approach allows the upper limbs to be easily attached to the companion robot’s torso, forming a cohesive system. The companion robot’s mobility is based on a differential-drive wheeled platform, enabling it to approach users and position its arms effectively during interactions.
| Parameter | Value | Description |
|---|---|---|
| Total Arm Length | 400.0 mm | Length from shoulder to hand tip |
| Upper Arm Length | 100.0 mm | Length from shoulder to elbow |
| Forearm Length | 120.0 mm | Length from elbow to wrist |
| Hand Length | 80.0 mm | Length from wrist to fingertip |
| Arm Weight | 2.0 kg | Weight per arm including actuators |
| Maximum Payload | 0.5 kg | Load capacity at the hand |
| Degrees of Freedom | 5 per arm | Joint count: shoulder (3), elbow (1), wrist (1) |
| Actuator Type | ZX20D Servo | Serial bus servo for each joint |
Forward kinematics analysis is essential for controlling the companion robot’s upper limbs, as it defines the relationship between joint angles and the hand’s position and orientation in space. We employ the Denavit-Hartenberg (D-H) convention to model the serial chain of links and joints. For each arm, we establish coordinate frames attached to each joint, starting from the base frame at the shoulder. The D-H parameters are derived based on the mechanical design, as shown in Table 2 for the left arm (the right arm is symmetric). Here, $a_{i-1}$ is the link length, $\alpha_{i-1}$ is the link twist, $d_i$ is the link offset, and $\theta_i$ is the joint angle. The initial joint angles are set to zero for simplicity, with allowable ranges specified to avoid collisions with the companion robot’s body.
| Joint $i$ | $a_{i-1}$ (mm) | $\alpha_{i-1}$ (°) | $d_i$ (mm) | $\theta_i$ (°) Initial | Joint Range (°) |
|---|---|---|---|---|---|
| 1 (Shoulder) | 0.0 | 0.0 | -270.0 | 0.0 | -40.0 to 180.0 |
| 2 (Shoulder) | 0.0 | 90.0 | 0.0 | -90.0 | -90.0 to 90.0 |
| 3 (Shoulder) | 0.0 | 90.0 | 125.0 | 0.0 | -135.0 to 60.0 |
| 4 (Elbow) | 0.0 | -90.0 | 0.0 | 0.0 | -135.0 to 0.0 |
| 5 (Wrist) | 0.0 | 90.0 | 160.0 | 0.0 | -90.0 to 120.0 |
The transformation matrix between consecutive frames, $^{i-1}T_i$, is given by the standard D-H formula:
$$^{i-1}T_i = \begin{bmatrix}
\cos\theta_i & -\sin\theta_i \cos\alpha_{i-1} & \sin\theta_i \sin\alpha_{i-1} & a_{i-1} \cos\theta_i \\
\sin\theta_i & \cos\theta_i \cos\alpha_{i-1} & -\cos\theta_i \sin\alpha_{i-1} & a_{i-1} \sin\theta_i \\
0 & \sin\alpha_{i-1} & \cos\alpha_{i-1} & d_i \\
0 & 0 & 0 & 1
\end{bmatrix}$$
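For reference, this per-joint transform can be written as a short NumPy helper. The following is a minimal sketch (the function and variable names are ours for illustration; lengths are in mm, angles in radians):

```python
import numpy as np

def dh_transform(a_prev: float, alpha_prev: float, d: float, theta: float) -> np.ndarray:
    """Homogeneous transform {i-1}T_i for one joint, per the D-H formula above.

    a_prev and d are in millimetres; alpha_prev and theta are in radians.
    """
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha_prev), np.sin(alpha_prev)
    return np.array([
        [ ct, -st * ca,  st * sa, a_prev * ct],
        [ st,  ct * ca, -ct * sa, a_prev * st],
        [0.0,       sa,       ca,           d],
        [0.0,      0.0,      0.0,         1.0],
    ])
```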
For the companion robot’s upper limb, the overall transformation from the base frame (at the shoulder, $O_0$) to the hand frame (at the wrist, $O_5$) is computed by multiplying individual matrices:
$${}^{0}T_5 = {}^{0}T_1 \cdot {}^{1}T_2 \cdot {}^{2}T_3 \cdot {}^{3}T_4 \cdot {}^{4}T_5$$
Substituting the D-H parameters from Table 2, we derive the position and orientation of the hand. The position vector $\mathbf{p} = [p_x, p_y, p_z]^T$ in the base frame is expressed as:
$$p_x = \cos\theta_1 \cos\theta_2 \cos\theta_3 \sin\theta_4 L_3 + \sin\theta_1 \sin\theta_3 \sin\theta_4 L_3 + \cos\theta_1 \sin\theta_2 \cos\theta_4 L_3 + \cos\theta_1 \sin\theta_2 L_2$$
$$p_y = \sin\theta_1 \cos\theta_2 \cos\theta_3 \sin\theta_4 L_3 - \cos\theta_1 \sin\theta_3 \sin\theta_4 L_3 + \sin\theta_1 \sin\theta_2 \cos\theta_4 L_3 + \sin\theta_1 \sin\theta_2 L_2$$
$$p_z = \sin\theta_2 \cos\theta_3 \sin\theta_4 L_3 - \cos\theta_2 \cos\theta_4 L_3 - \cos\theta_2 L_2 - L_1$$
where $L_1 = 270.0$ mm, $L_2 = 125.0$ mm, and $L_3 = 160.0$ mm are constants from the D-H table. The orientation matrix $[\mathbf{n}, \mathbf{o}, \mathbf{a}]$ can be similarly derived, but for limb interaction, position control is often sufficient. This kinematic model allows us to compute the hand’s pose for any set of joint angles, enabling precise motion planning for the companion robot. Additionally, we define the transformation between the robot’s body frame and the arm base frames to coordinate movements in the global workspace. For both arms, the base frame relative to the body center is:
$$^{body}T_{arm} = \begin{bmatrix}
0 & -1 & 0 & 0 \\
0 & 0 & -1 & 0 \\
1 & 0 & 0 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}$$
Multiplying $^{body}T_{arm}$ with $^{0}T_5$ gives the hand pose in the companion robot’s body coordinate system, facilitating integrated control with the mobile base.
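As an illustrative sketch of how this chain can be evaluated numerically, the snippet below plugs the Table 2 parameters into the `dh_transform()` helper defined earlier and maps the result into the body frame. We assume here that the initial joint offsets listed in Table 2 are already folded into the commanded angles; names such as `forward_kinematics` are ours.

```python
import numpy as np

# (a_{i-1} [mm], alpha_{i-1} [rad], d_i [mm]) for joints 1..5, taken from Table 2.
DH_PARAMS = [
    (0.0, np.deg2rad(0.0),   -270.0),
    (0.0, np.deg2rad(90.0),     0.0),
    (0.0, np.deg2rad(90.0),   125.0),
    (0.0, np.deg2rad(-90.0),    0.0),
    (0.0, np.deg2rad(90.0),   160.0),
]

# {body}T_{arm}, as given above.
BODY_T_ARM = np.array([
    [0.0, -1.0,  0.0, 0.0],
    [0.0,  0.0, -1.0, 0.0],
    [1.0,  0.0,  0.0, 0.0],
    [0.0,  0.0,  0.0, 1.0],
])

def forward_kinematics(joint_angles_deg, in_body_frame=False):
    """Return {0}T_5 (or {body}T_5) for five joint angles given in degrees."""
    T = np.eye(4)
    for (a_prev, alpha_prev, d), theta_deg in zip(DH_PARAMS, joint_angles_deg):
        T = T @ dh_transform(a_prev, alpha_prev, d, np.deg2rad(theta_deg))
    return BODY_T_ARM @ T if in_body_frame else T

# Example: hand position p = [p_x, p_y, p_z] (in mm) for one joint configuration.
print(forward_kinematics([0.0, -90.0, 0.0, 0.0, 0.0])[:3, 3])
```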
To enable natural limb interaction, we designed a set of gestures inspired by human body language, such as greeting, waving, handshaking, and expressing emotions like apology or encouragement. Rather than relying solely on joint-space planning, which may produce unnatural motions, we adopted a data-driven approach. We recorded human subjects performing these gestures and measured the corresponding joint angles using motion capture or protractor tools. For each gesture, we averaged multiple trials to obtain typical joint angles, then adjusted them within the mechanical limits of the companion robot’s upper limbs to avoid self-collisions. Table 3 summarizes the joint angles for a common greeting gesture using the left arm, which involves raising the arm and slightly waving. The companion robot can execute this by sequentially moving joints to these angles, with the hand opening to show a palm.
| Joint | Measured Angle (°) Trial 1 | Measured Angle (°) Trial 2 | Measured Angle (°) Trial 3 | Measured Angle (°) Trial 4 | Measured Angle (°) Trial 5 | Typical Angle (°) for Companion Robot |
|---|---|---|---|---|---|---|
| $\theta_1$ (Shoulder) | 48.5 | 46.8 | 44.1 | 47.3 | 43.9 | 45.0 |
| $\theta_2$ (Shoulder) | -33.2 | -30.5 | -29.6 | -31.8 | -32.3 | -30.0 |
| $\theta_3$ (Shoulder) | -115.3 | -113.4 | -114.8 | -113.2 | -114.5 | -114.0 |
| $\theta_4$ (Elbow) | -92.5 | -88.3 | -91.8 | -89.6 | -90.2 | -90.0 |
| $\theta_5$ (Wrist) | 91.4 | 90.6 | 92.3 | 91.1 | 91.3 | 90.0 |
We extended this methodology to design 15 distinct gestures for the companion robot, including waving goodbye, shaking hands, expressing frustration, and showing excitement. Each gesture is defined by a sequence of joint angle targets and timing profiles. For instance, waving involves oscillating the elbow joint ($\theta_4$) periodically while holding the other joints fixed. To ensure smooth execution, we implemented trajectory interpolation between keyframes using cubic splines, minimizing jerky motions that could detract from the companion robot’s social appeal. The gesture library is stored in the robot’s control system and triggered via voice commands or visual cues, allowing the companion robot to respond contextually during interactions.
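As a concrete illustration of the keyframe interpolation, the sketch below uses SciPy's `CubicSpline` with clamped end conditions so that each gesture starts and ends at rest; the keyframe times and angles are placeholders rather than measured data.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Keyframes for the elbow joint during a waving gesture: (time [s], theta_4 [deg]).
key_times = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
key_angles = np.array([0.0, -90.0, -60.0, -90.0, 0.0])

# Clamped boundary conditions impose zero velocity at the first and last keyframe,
# avoiding jerky starts and stops.
spline = CubicSpline(key_times, key_angles, bc_type="clamped")

# Sample the trajectory at the servo update rate (e.g. 50 Hz) before streaming
# the targets to the serial bus servo controller.
t = np.arange(0.0, 2.0 + 1e-9, 0.02)
theta4_trajectory = spline(t)
```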
Integration of the humanoid upper limbs with the companion robot platform involved both hardware and software components. The arms are connected to a serial servo controller that communicates with the main robot computer via USB. The companion robot operates on the Robot Operating System (ROS), so we developed a ROS package to manage limb control. This package includes nodes for kinematics computation, gesture playback, and collision avoidance, interfacing with other modules like speech recognition and facial expression display. Calibration was crucial to account for assembly tolerances; we performed a calibration routine by commanding known joint angles and measuring the resulting hand position with a ruler, then adjusting the kinematic parameters iteratively until the error converged below 0.5 mm. Table 4 shows calibration results for the left arm, confirming accuracy suitable for limb interaction tasks.
| Test Case | Commanded Joint Angles (°) $(\theta_1, \theta_2, \theta_3, \theta_4, \theta_5)$ | Calculated Position (mm) $(p_x, p_y, p_z)$ | Measured Position (mm) $(p_x, p_y, p_z)$ | Error (mm) $(\Delta p_x, \Delta p_y, \Delta p_z)$ |
|---|---|---|---|---|
| 1 | (-30.0, -90.0, 0.0, 0.0, 0.0) | (-246.8, 142.5, -270.0) | (-246.7, 142.4, -270.0) | (0.1, 0.1, 0.0) |
| 2 | (0.0, -90.0, 0.0, 0.0, 0.0) | (-285.0, 0.0, -270.0) | (-285.0, 0.0, -270.0) | (0.0, 0.0, 0.0) |
| 3 | (0.0, -30.0, 30.0, -30.0, 30.0) | (-191.8, 40.0, -292.9) | (-191.6, 40.2, -292.6) | (0.2, -0.2, 0.3) |
| 4 | (30.0, 0.0, 60.0, -60.0, 60.0) | (-120.0, -138.6, -350.0) | (-120.3, -138.4, -350.2) | (-0.3, 0.2, -0.2) |
| 5 | (90.0, 60.0, -30.0, -90.0, -30.0) | (80.0, 39.0, -452.5) | (80.3, 38.8, -452.3) | (0.3, 0.2, 0.2) |
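To make the software side concrete, the following is a minimal rospy sketch of the gesture-playback node described above. The topic name, the gesture library format, and the `send_joint_targets()` servo interface are illustrative assumptions rather than the exact implementation; the greeting keyframe reuses the typical angles from Table 3.

```python
#!/usr/bin/env python
import rospy
from std_msgs.msg import String

# Hypothetical gesture library: name -> list of (duration [s], five joint angles [deg]).
GESTURES = {
    "greet": [(1.0, [45.0, -30.0, -114.0, -90.0, 90.0]),  # raise arm (Table 3 typical angles)
              (1.0, [0.0, -90.0, 0.0, 0.0, 0.0])],        # return to the Table 2 initial pose
}

def send_joint_targets(angles_deg, duration_s):
    """Placeholder for the serial bus servo interface (e.g. over USB)."""
    rospy.loginfo("joint targets %s over %.1f s", angles_deg, duration_s)

def on_gesture(msg):
    for duration_s, angles_deg in GESTURES.get(msg.data, []):
        send_joint_targets(angles_deg, duration_s)
        rospy.sleep(duration_s)

if __name__ == "__main__":
    rospy.init_node("gesture_playback")
    rospy.Subscriber("/gesture_command", String, on_gesture)
    rospy.spin()
```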
Experimental testing validated the functionality of the companion robot’s upper limbs. We deployed the integrated system in a home-like environment and conducted interaction sessions with users. The companion robot successfully performed all designed gestures, such as greeting when approached, waving goodbye, and shaking hands upon request. Users reported that the limb movements enhanced the robot’s expressiveness and made it feel more engaging. For example, during a greeting, the companion robot would raise its arm, open its hand, and subtly sway the forearm, accompanied by a friendly voice output and smiling eye LEDs. The motions were fluid and human-like, thanks to the kinematic model and gesture sequencing. We also tested payload handling by having the companion robot lift small objects like a cup, demonstrating potential for assistive tasks. The companion robot’s mobility allowed it to position itself optimally for each interaction, showcasing the synergy between the wheeled base and the articulated arms.
To further analyze performance, we evaluated the repeatability and accuracy of the limb movements. Using a motion capture system, we tracked the hand trajectory during multiple repetitions of a waving gesture. The standard deviation in position was less than 1.0 mm, indicating high repeatability. The companion robot’s response time for gesture execution averaged 0.5 seconds, which is acceptable for social interactions. Energy consumption was also measured; each arm drew approximately 0.8 A during motion, which is sustainable for the companion robot’s battery system. These metrics confirm that the design is practical for long-term use in domestic settings.
The forward kinematics model proved essential for motion planning. We derived the Jacobian matrix to analyze the singularities and dexterity of the companion robot’s upper limbs. The Jacobian, $J$, relates joint velocities to end-effector velocities:
$$\dot{\mathbf{p}} = J \dot{\boldsymbol{\theta}}$$
where $\dot{\mathbf{p}} = [\dot{p}_x, \dot{p}_y, \dot{p}_z]^T$ and $\dot{\boldsymbol{\theta}} = [\dot{\theta}_1, \dot{\theta}_2, \dot{\theta}_3, \dot{\theta}_4, \dot{\theta}_5]^T$. For the companion robot, we computed $J$ analytically from the kinematic equations. For instance, the element $J_{11}$ is:
$$J_{11} = \frac{\partial p_x}{\partial \theta_1} = -\sin\theta_1 \cos\theta_2 \cos\theta_3 \sin\theta_4 L_3 + \cos\theta_1 \sin\theta_3 \sin\theta_4 L_3 - \sin\theta_1 \sin\theta_2 \cos\theta_4 L_3 - \sin\theta_1 \sin\theta_2 L_2$$
This allows us to assess manipulability and avoid configurations where the companion robot’s arm loses mobility. In practice, we constrained joint angles to stay away from singularities, ensuring smooth gesture execution.
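Because the full analytic Jacobian is lengthy, a numerical cross-check is convenient in practice. The sketch below estimates the 3x5 position Jacobian by central finite differences over the `forward_kinematics()` helper defined earlier and computes a Yoshikawa-style manipulability measure; the step size and the way the measure is thresholded are illustrative choices.

```python
import numpy as np

def position_jacobian(joint_angles_deg, eps_deg=0.01):
    """3x5 Jacobian dp/dtheta (mm per radian), estimated by central differences."""
    J = np.zeros((3, 5))
    for j in range(5):
        plus = np.array(joint_angles_deg, dtype=float)
        minus = plus.copy()
        plus[j] += eps_deg
        minus[j] -= eps_deg
        dp = forward_kinematics(plus)[:3, 3] - forward_kinematics(minus)[:3, 3]
        J[:, j] = dp / np.deg2rad(2.0 * eps_deg)
    return J

def manipulability(joint_angles_deg):
    """Yoshikawa measure sqrt(det(J J^T)); small values flag near-singular poses."""
    J = position_jacobian(joint_angles_deg)
    return float(np.sqrt(max(np.linalg.det(J @ J.T), 0.0)))
```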
In addition to gestures, we explored force control for physical interaction. By equipping the hands with pressure sensors, the companion robot can adjust grip strength during handshakes to mimic human touch. The force $F$ applied by the hand is modeled as:
$$F = k \cdot \Delta x$$
where $k$ is the servo stiffness and $\Delta x$ is the displacement from the open position. This simple model allows the companion robot to perform gentle grips, enhancing safety and user comfort. Future work will integrate torque sensors at joints for more sophisticated force feedback.
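A minimal sketch of this grip model, assuming an effective stiffness value and a displacement limit chosen here for illustration only, converts a desired grip force into a commanded hand displacement:

```python
SERVO_STIFFNESS_N_PER_MM = 0.5   # assumed effective stiffness k of the hand servo
MAX_CLOSE_MM = 40.0              # assumed travel from fully open to fully closed

def grip_displacement_mm(desired_force_n: float) -> float:
    """Displacement from the open position needed to apply the desired force (F = k * dx)."""
    dx = desired_force_n / SERVO_STIFFNESS_N_PER_MM
    return max(0.0, min(dx, MAX_CLOSE_MM))
```

In use, the pressure sensor reading during a handshake can be compared with the desired force and the commanded displacement nudged up or down accordingly.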
Comparison with existing companion robots highlights the advantages of our design. Most commercial companion robots, such as educational bots for children, lack arms or have only rudimentary limbs. Our humanoid upper limbs provide five DOFs per arm, offering greater expressiveness at a low cost. The use of off-the-shelf servos and 3D-printed parts keeps the system affordable, aligning with the goal of making companion robots accessible. Moreover, the ROS-based software architecture ensures modularity and ease of extension, allowing developers to add new gestures or integrate with other sensors.
Challenges encountered during development included mechanical backlash in the joints and limited wrist mobility. We mitigated backlash by using precision servos and adding rubber damping, while the single-DOF wrist proved sufficient for basic gestures. Another issue was synchronization between the arms during bilateral gestures, such as clapping. We addressed this by implementing a master-slave control scheme in which one arm mirrors the other’s trajectory with an offset. The companion robot’s processor handled this computation in real time without lag.
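For completeness, the mirroring step of the master-slave scheme can be sketched as below; which joints require a sign flip depends on how the servos are mounted, so the sign convention here is an assumption.

```python
MIRROR_SIGNS = [-1.0, 1.0, -1.0, 1.0, -1.0]  # assumed per-joint sign convention

def mirror_targets(master_angles_deg):
    """Map the master arm's joint targets onto the slave arm for bilateral gestures."""
    return [s * q for s, q in zip(MIRROR_SIGNS, master_angles_deg)]
```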
Looking ahead, we plan to enhance the companion robot’s upper limbs with more degrees of freedom, particularly in the wrist and fingers, to enable complex manipulation tasks like object sorting or tool use. Machine learning algorithms could be employed to generate adaptive gestures based on user emotions detected via cameras. Additionally, we aim to conduct long-term user studies to evaluate the impact of limb interaction on social bonding with the companion robot. Economic analysis suggests that mass production could reduce the cost per arm by 30%, making such systems viable for widespread adoption.
In conclusion, the design and implementation of humanoid upper limbs for companion robots significantly advance their ability to engage in limb-based interactions. Through detailed mechanical design, forward kinematics analysis, and gesture planning, we have created a functional system that allows a companion robot to perform natural gestures like greeting and waving. Integration with a mobile companion robot platform and testing in realistic scenarios validated the approach. This work lays the foundation for future developments in social robotics, where companion robots can use body language to communicate effectively, fostering deeper human-robot relationships. The companion robot, equipped with these limbs, becomes not just a tool but a relatable entity capable of expressing itself through motion.
