Design of an Autonomous Family Companion Robot Based on ROS

In recent years, with rapid economic development and social change, an increasing number of young people have left their hometowns to work in distant cities, leading to a growing population of left-behind children, empty-nest elderly, and individuals with limited mobility. These groups often suffer from loneliness due to a lack of companionship, which can have severe effects on their physical and mental health, sometimes resulting in self-neglect or even suicidal tendencies. While various companion robots are available on the market, most are designed for children, such as the early-education robots Alpha Egg by iFlytek, Wukong by Ubtech, and Luka by Wuling. Others, like Honda's ASIMO and SoftBank's Pepper, are advanced humanoid robots but are often expensive or lack features specific to elderly companionship. To address these gaps, we have designed an autonomous family companion robot based on the Robot Operating System (ROS) that provides comprehensive companionship through natural language interaction, physical communication, and user-friendly interfaces.

The primary goal of this companion robot is to offer emotional and practical support through communication, which includes both verbal and non-verbal interactions. Verbal communication is achieved via natural language processing for activities like chatting, storytelling, and singing, while non-verbal communication involves eye expressions and humanoid upper limbs for gestures such as waving, greeting, and object manipulation. Additionally, the companion robot utilizes a user interface (UI) combined with machine vision and networking capabilities to enable further human-robot interactions. This design ensures that the companion robot can cater to diverse family needs, making it a versatile tool for enhancing quality of life.

To realize these functions, the family companion robot is built on the ROS architecture, which provides a modular, distributed framework for robot software development. The overall design is a humanoid structure 140 cm tall and 30 kg in weight, comprising a head, upper limbs, a torso, lower limbs, and a mobile base. The head, torso, and lower limbs are made from molded plastic shells with overall dimensions of 240 mm × 460 mm × 1200 mm. The head integrates an eye-expression screen, cameras, and a voice interaction module, while the torso houses an embedded computer system, a touchscreen, and a depth-sensing camera. The upper limbs are designed with human-like articulation, featuring five degrees of freedom (DOF) per arm: three at the shoulder, one at the elbow, and one at the wrist, all driven by DC servo motors. The hands use additional motors to perform grasping motions. The mobile base employs a differential drive wheeled structure with two driving wheels and two caster wheels for stability, measuring 400 mm × 200 mm (height) with a ground clearance of 35 mm and a payload capacity of 30 kg. This physical design ensures that the companion robot can move autonomously and interact effectively in home environments.

The hardware system of the companion robot is centered around an embedded computer system, supported by various modules for interaction and control. Below is a summary of the key hardware components:

| Module | Components | Specifications |
| --- | --- | --- |
| Embedded computer | NVIDIA Jetson TX1 | Maxwell GPU, >1 TFLOPS, 16 GB eMMC, extended with a 120 GB SSD |
| Voice interaction | Mini speaker, omnidirectional microphone, external sound card | USB-connected for audio input/output |
| Human-robot interaction | 5 MP camera, Intel RealSense ZR300, 5-inch LCD, 10.1-inch touchscreen | Facial recognition, SLAM, 15 eye expressions, 2K HDR display |
| Upper limb control | ZX20D serial bus servos (6 per arm), serial servo controller | 20 kg·cm torque, addressable, angle feedback |
| Mobile control | Arduino MEGA2560, DC motors, encoders, ultrasonic sensors, RPLIDAR-A2 | PID control, 500-line encoders, 6 ultrasonic sensors, laser scanning |
| Power supply | 12 V 10,000 mAh Li-ion battery, DC/DC converters | Powers all modules via voltage regulation |

The embedded computer system uses the Jetson TX1 for its high computational power, which is essential for AI tasks such as machine vision and natural language processing. The voice interaction module enables the companion robot to engage in conversation, while the human-robot interaction module combines visual and tactile elements for immersive experiences. The upper limb control module allows precise human-like motions, and the mobile control module ensures accurate navigation and obstacle avoidance. The power system is designed to support prolonged operation, making the companion robot suitable for daily use in homes.

On the software side, the companion robot runs on Ubuntu 16.04 with ROS Kinetic, leveraging ROS nodes and packages for functionality. The software design is layered into planning and interaction layers, focusing on four core features: voice interaction, physical interaction, autonomous mobility, and UI-based interaction. Each feature is implemented as a set of ROS nodes that communicate via topics and services.
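To make this layering concrete, the following minimal sketch shows how one interaction-layer feature can be exposed as a ROS node listening on a topic. The node name, topic name, and message type are illustrative assumptions, not the robot's actual interface.

```python
#!/usr/bin/env python
# Illustrative sketch only: node name, topic name, and message type are assumptions.
import rospy
from std_msgs.msg import String

def handle_expression(msg):
    # The interaction layer would map the command (e.g. "smile") to one of
    # the eye-expression animations shown on the head's LCD.
    rospy.loginfo("eye expression requested: %s", msg.data)

if __name__ == "__main__":
    rospy.init_node("eye_expression_node")
    rospy.Subscriber("/companion/eye_expression", String, handle_expression)
    rospy.spin()
```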

Voice interaction is achieved through a pipeline involving speech recognition, semantic understanding, and speech synthesis. We use iFlytek's online API for Chinese speech recognition and the Turing Robot API for AI-based semantic analysis. The process can be modeled as a sequence: $$ \text{Speech Input} \rightarrow \text{Recognition} \rightarrow \text{Text} \rightarrow \text{Semantic Analysis} \rightarrow \text{Response Text} \rightarrow \text{Synthesis} \rightarrow \text{Speech Output} $$. During this process, the companion robot synchronously changes its eye expressions or triggers limb actions. Voice interaction achieved a high success rate in real-scenario tests, making the companion robot reliable for daily communication.
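The pipeline above can be read as a single dialogue turn. The sketch below outlines that turn in Python; the four helper functions are placeholders for the iFlytek recognition/synthesis calls, the Turing Robot query, and the expression trigger, since the real SDK calls are not reproduced from the source.

```python
# Sketch of one turn of the voice-interaction pipeline; all helpers are placeholders.

def recognize_speech(audio_bytes):
    """Placeholder: online Chinese speech recognition -> text."""
    raise NotImplementedError

def query_chatbot(text):
    """Placeholder: semantic analysis / dialogue -> response text."""
    raise NotImplementedError

def synthesize_and_play(text):
    """Placeholder: speech synthesis of the response text."""
    raise NotImplementedError

def trigger_expression(name):
    """Placeholder: publish an eye-expression or gesture command."""
    raise NotImplementedError

def voice_interaction_turn(audio_bytes):
    text = recognize_speech(audio_bytes)   # Speech Input -> Text
    reply = query_chatbot(text)            # Text -> Response Text
    trigger_expression("talking")          # synchronized non-verbal cue
    synthesize_and_play(reply)             # Response Text -> Speech Output
```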

Physical interaction relies on the upper limbs performing human-like gestures. We use ROS’s MoveIt! package for motion planning. The kinematics of the arm can be described using the Denavit-Hartenberg (DH) parameters, where each joint angle $\theta_i$ is controlled to achieve desired poses. For a joint $i$, the transformation matrix is given by: $$ T_i = \begin{bmatrix} \cos\theta_i & -\sin\theta_i \cos\alpha_i & \sin\theta_i \sin\alpha_i & a_i \cos\theta_i \\ \sin\theta_i & \cos\theta_i \cos\alpha_i & -\cos\theta_i \sin\alpha_i & a_i \sin\theta_i \\ 0 & \sin\alpha_i & \cos\alpha_i & d_i \\ 0 & 0 & 0 & 1 \end{bmatrix} $$. By solving inverse kinematics, we map desired end-effector positions to joint angles, which are sent to the servo controllers. The companion robot can perform actions like waving or grabbing, with feedback from encoders and depth cameras to avoid collisions.
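As a worked illustration of this matrix, the sketch below builds $T_i$ from the DH parameters and chains the per-joint transforms into a forward-kinematics pose. The arm's actual DH values are not listed in the text, so any numbers passed in would be illustrative only.

```python
import numpy as np

def dh_transform(theta, alpha, a, d):
    """Homogeneous transform T_i for one joint, matching the matrix above."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_params):
    """Chain the per-joint transforms to get the end-effector pose.
    dh_params is a list of (alpha, a, d) tuples; the arm's real DH values
    are not given in the text, so callers substitute their own."""
    T = np.eye(4)
    for theta, (alpha, a, d) in zip(joint_angles, dh_params):
        T = T.dot(dh_transform(theta, alpha, a, d))
    return T
```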

Autonomous mobility is implemented using SLAM and navigation stacks. The mobile control node publishes odometry data based on encoder readings, while the lidar and depth camera are used to build maps of the environment. The velocity control for differential drive robots is given by: $$ v = \frac{v_r + v_l}{2}, \quad \omega = \frac{v_r - v_l}{L} $$ where $v_r$ and $v_l$ are the right and left wheel velocities, $v$ is linear velocity, $\omega$ is angular velocity, and $L$ is the wheelbase. PID controllers adjust motor speeds to follow paths: $$ u(t) = K_p e(t) + K_i \int_0^t e(\tau) d\tau + K_d \frac{de(t)}{dt} $$ where $e(t)$ is the error in position or velocity. This allows the companion robot to navigate homes safely while avoiding obstacles.
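These two relations invert directly to per-wheel commands, and the PID law maps straightforwardly to code. The sketch below shows both; the gains and the wheelbase value are placeholders rather than the robot's tuned parameters.

```python
def wheel_speeds(v, omega, L):
    """Invert the differential-drive relations above: given commanded linear
    velocity v and angular velocity omega, return (v_r, v_l)."""
    v_r = v + omega * L / 2.0
    v_l = v - omega * L / 2.0
    return v_r, v_l

class PID(object):
    """Textbook PID controller, u = Kp*e + Ki*integral(e) + Kd*de/dt.
    Gains are placeholders; the robot's tuned values are not given."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```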

The UI provides an intuitive interface for users to access features like children’s education, entertainment, and identity verification via facial recognition. The main interface uses a card-based layout, and each function is wrapped as a ROS node for easy integration. This enhances the usability of the companion robot for all age groups.
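As one example of this wrapping, a UI card could call into a small ROS service node; the service name and the use of the standard Trigger service type below are assumptions made for illustration, not the robot's actual code.

```python
#!/usr/bin/env python
# Illustrative sketch: service name and use of std_srvs/Trigger are assumptions.
import rospy
from std_srvs.srv import Trigger, TriggerResponse

def handle_verify(req):
    # A real node would grab a camera frame and run facial recognition here.
    recognized = False  # placeholder result
    return TriggerResponse(success=recognized,
                           message="user recognized" if recognized else "no match")

if __name__ == "__main__":
    rospy.init_node("identity_verification_node")
    rospy.Service("/companion/verify_identity", Trigger, handle_verify)
    rospy.spin()
```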

During physical testing, we calibrated the companion robot to ensure accuracy. The mobile base was tuned so that the linear velocity error stayed within 2 mm/s and the rotational error below 2°. The upper limb joints were calibrated for range of motion, as summarized in the table below; a sketch of enforcing these ranges in software follows the table.

| Joint | Action | Max Angle (°) | Min Angle (°) |
| --- | --- | --- | --- |
| 1 | Flexion/extension | 180 | -40 |
| 2 | Abduction/adduction | 180 | 0 |
| 3 | External/internal rotation | 60 | -70 |
| 4 | Elbow flexion/extension | 135 | 0 |
| 5 | Pronation/supination | 80 | -80 |
| 6 | Hand open/close | 30 | 0 |
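One plausible use of these calibrated ranges is as software joint limits applied before commands reach the servo controller. The sketch below assumes such a check; it is an illustration built from the table values, not the robot's actual control code.

```python
# Calibrated ranges from the table above, in degrees, keyed by joint number.
JOINT_LIMITS_DEG = {
    1: (-40, 180),   # flexion/extension
    2: (0, 180),     # abduction/adduction
    3: (-70, 60),    # external/internal rotation
    4: (0, 135),     # elbow flexion/extension
    5: (-80, 80),    # pronation/supination
    6: (0, 30),      # hand open/close
}

def clamp_joint_angle(joint, angle_deg):
    """Clip a commanded angle to the calibrated range for that joint."""
    lo, hi = JOINT_LIMITS_DEG[joint]
    return max(lo, min(hi, angle_deg))
```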

We conducted multiple tests of human-robot interaction, with results showing high success rates. The companion robot performed well in voice wake-up, chatting, eye expressions, and physical interaction. For example, in 10 trials, voice chat succeeded 9 times, and all UI-based functions worked flawlessly. This demonstrates the robustness of the companion robot in real-world settings.

Looking ahead, there are areas for improvement. The variety of physical interactions and eye expressions could be expanded, and more educational content could be added to the UI. Future work may involve enhancing the dexterity of the hands for finer manipulation and integrating more AI capabilities for personalized companionship. Overall, this companion robot design proves effective for family companionship, and with continued development it can take on even more roles in domestic environments.

The integration of ROS has been instrumental in this project, allowing for modular development and easy scalability. By using open-source packages and custom nodes, we have created a companion robot that is both functional and affordable. The use of advanced hardware like the Jetson TX1 ensures that the companion robot can handle complex tasks in real time, making it a viable solution for addressing loneliness and providing assistance. As technology evolves, we believe that companion robots will become integral to family life, offering not just companionship but also safety and convenience.

In conclusion, the autonomous family companion robot based on ROS represents a significant step forward in robotics for domestic use. It combines natural language processing, physical interaction, autonomous navigation, and user-friendly interfaces into a cohesive system. Through rigorous testing, we have validated its design and functionality, confirming that it meets the needs of daily family companionship. The companion robot is not just a machine but a potential friend and helper, capable of improving the well-being of its users. As we refine its features, this companion robot will continue to evolve, paving the way for smarter and more empathetic robotic companions in homes worldwide.
