In recent years, the development of interactive robots has become a focal point in various fields, driven by advancements in remote control and real-time multimedia communication. These systems hold significant market value, particularly for applications such as home monitoring and pet companionship. As a researcher in embedded systems and robotics, I have dedicated efforts to designing a smart cloud video companion robot that enables Wi-Fi remote control and real-time video interaction. This companion robot aims to provide environmental surveillance and engage with pets through cloud-based video streaming, offering a novel solution for remote companionship. The core innovation lies in integrating advanced control algorithms, efficient hardware, and robust software to achieve low-latency, responsive operation. Throughout this article, I elaborate on the design process, covering the control algorithms, hardware circuitry, and software implementation, and highlight how each contributes to the user experience of the companion robot.
The companion robot system is structured into three main components: the robot body, the data transmission link, and the terminal devices. From my perspective, the robot body serves as the physical embodiment of the companion robot, equipped with mecanum wheels for omnidirectional movement, a pan-tilt platform for camera orientation, and a robotic arm for interactive modules. The data transmission link uses Wi-Fi routers and a streaming media server to relay control commands and video streams, ensuring seamless communication. Terminal devices, such as Qt or Android-based applications, allow users to monitor and control the companion robot remotely. To summarize the system architecture, I have compiled the key elements in Table 1.

| Component | Description | Function |
|---|---|---|
| Robot body | Includes mecanum wheels, pan-tilt, camera, and robotic arm | Physical interaction and data acquisition |
| Main Controller | NXP i.MX 8 processor with ARM Cortex-A53 cores | Multimedia processing and system coordination |
| Sub-Controller | STM32F103 microcontroller | Real-time control of motors and sensors |
| Data Transmission | Wi-Fi via Intel 9260 module, RTMP protocol | Streaming video and control data |
| Server | Nginx with RTMP module | Video streaming and command forwarding |
| Client terminal | Qt or Android application | User interface for remote operation |
To achieve precise motion control in this companion robot, I implemented advanced algorithms for motor regulation. The mecanum wheels require closed-loop speed control for omnidirectional movement, while the pan-tilt system relies on field-oriented control (FOC) for smooth camera rotation. From my design experience, these algorithms are critical for the responsiveness of the companion robot. Below, I detail the mathematical foundations and their application.
For the mecanum wheels, I employed an improved closed-loop PID speed algorithm. The PID controller adjusts motor speed based on the error between desired and actual velocities. The standard PID formula is given by:
$$ u(t) = K_p e(t) + K_i \int_0^t e(\tau) d\tau + K_d \frac{de(t)}{dt} $$
where \( K_p \), \( K_i \), and \( K_d \) are proportional, integral, and derivative gains, respectively, and \( e(t) \) is the error at time \( t \). In this companion robot, I enhanced this by using an incremental PID approach to reduce computational overhead and improve stability. The incremental form is derived as:
$$ \Delta u(t) = K_p [e(t) - e(t-1)] + K_i e(t) + K_d [e(t) - 2e(t-1) + e(t-2)] $$
This allows for smoother velocity adjustments of the companion robot’s wheels. To illustrate the parameter tuning process, I have summarized typical values in Table 2; a brief code sketch of the incremental update follows the table.
| Parameter | Value Range | Effect on Companion Robot Motion |
|---|---|---|
| \( K_p \) | 0.5–2.0 | Adjusts responsiveness to error; higher values reduce steady-state error but may cause oscillations. |
| \( K_i \) | 0.01–0.1 | Eliminates residual error over time; crucial for precise positioning in the companion robot. |
| \( K_d \) | 0.05–0.3 | Damps oscillations; improves stability during rapid movements of the companion robot. |
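To make the incremental update concrete, below is a minimal C sketch of how a sub-controller firmware might apply it once per sampling period. The struct layout, function name, and the PWM duty range are illustrative assumptions, not the exact firmware of this robot.

```c
/* Incremental PID step for one wheel (illustrative sketch, assumed names). */
typedef struct {
    float kp, ki, kd;   /* gains, e.g. chosen within the ranges in Table 2 */
    float e1, e2;       /* previous errors e(t-1) and e(t-2) */
    float u;            /* accumulated output, here a signed PWM duty in [-1, 1] */
} inc_pid_t;

static float inc_pid_step(inc_pid_t *pid, float target, float measured)
{
    float e = target - measured;

    /* Incremental form: only the change of the output is computed. */
    float du = pid->kp * (e - pid->e1)
             + pid->ki * e
             + pid->kd * (e - 2.0f * pid->e1 + pid->e2);

    pid->u += du;

    /* Clamp to the valid duty range; the sign selects rotation direction. */
    if (pid->u > 1.0f)  pid->u = 1.0f;
    if (pid->u < -1.0f) pid->u = -1.0f;

    pid->e2 = pid->e1;
    pid->e1 = e;
    return pid->u;
}
```

Each wheel keeps its own `inc_pid_t` state, and the returned duty would be written to the PWM input of the corresponding motor driver.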
For the pan-tilt system, I utilized FOC with space vector pulse width modulation (SVPWM) to control a brushless motor. FOC enables efficient torque control by aligning the motor’s magnetic field. The SVPWM algorithm generates PWM signals to approximate sinusoidal waveforms. Assume a DC bus voltage \( U_{dc} \) and three-phase voltages \( U_A \), \( U_B \), and \( U_C \) with 120° phase differences:
$$ U_A(t) = \sqrt{2} U_m \cos(2\pi f t) $$
$$ U_B(t) = \sqrt{2} U_m \cos\left(2\pi f t – \frac{2\pi}{3}\right) $$
$$ U_C(t) = \sqrt{2} U_m \cos\left(2\pi f t + \frac{2\pi}{3}\right) $$
where \( U_m \) is the phase voltage RMS and \( f \) is the frequency. The SVPWM technique synthesizes a reference voltage vector \( U_{ref} \) using adjacent active voltage vectors and zero vectors. The volt-second balance principle is applied:
$$ \int_0^T U_{ref} \, dt = \int_0^{T_x} U_x \, dt + \int_{T_x}^{T_x + T_y} U_y \, dt + \int_{T_x + T_y}^T U_0^* \, dt $$
In this companion robot, I implemented this in the sub-controller to achieve 360° horizontal rotation with minimal jerk. The algorithm involves determining the sector in which \( U_{ref} \) lies, calculating the dwell times \( T_x \) and \( T_y \) and the zero-vector time \( T_0 \), and assigning the PWM outputs. This ensures smooth camera movement, enhancing the video quality during the companion robot’s interactions.
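For concreteness, the following C sketch shows one common way to compute the sector and dwell times from the volt-second balance above, given the reference vector in the stationary α/β frame. The function and variable names are assumptions for illustration; the actual firmware would additionally map \( T_x \), \( T_y \), and \( T_0 \) onto the three PWM compare registers.

```c
/* SVPWM sector and dwell-time calculation (illustrative sketch, assumed names). */
#include <math.h>

#define PI_F 3.14159265f

/* u_alpha, u_beta: reference vector components [V]; u_dc: bus voltage [V];
 * t_pwm: switching period [s]. Outputs: sector 1..6 and dwell times tx, ty, t0. */
static void svpwm_dwell_times(float u_alpha, float u_beta, float u_dc, float t_pwm,
                              int *sector, float *tx, float *ty, float *t0)
{
    float mag   = sqrtf(u_alpha * u_alpha + u_beta * u_beta);
    float theta = atan2f(u_beta, u_alpha);       /* angle of U_ref, -pi..pi */
    if (theta < 0.0f) theta += 2.0f * PI_F;      /* wrap to 0..2*pi */

    int k = (int)(theta / (PI_F / 3.0f)) + 1;    /* 60-degree sector, 1..6 */
    if (k > 6) k = 6;

    /* Dwell times of the two adjacent active vectors from volt-second balance. */
    float scale = sqrtf(3.0f) * t_pwm * mag / u_dc;
    float t_x = scale * sinf(k * PI_F / 3.0f - theta);
    float t_y = scale * sinf(theta - (k - 1) * PI_F / 3.0f);

    /* Over-modulation guard: scale both dwell times back into one period. */
    if (t_x + t_y > t_pwm) {
        float s = t_pwm / (t_x + t_y);
        t_x *= s;
        t_y *= s;
    }

    *sector = k;
    *tx = t_x;
    *ty = t_y;
    *t0 = t_pwm - t_x - t_y;   /* remaining time is shared by the zero vectors */
}
```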
The hardware design of this companion robot focuses on reliability and efficiency. I selected components to support real-time processing and robust communication. Key circuits include voltage regulation, motor drives, and wireless interfaces. From my perspective, these elements are vital for the companion robot’s performance in diverse environments.
For power management, I designed step-down (buck) regulation circuits using TPS54531 switching regulators. These provide multiple voltage rails: 3.3 V for digital logic, 5 V for analog circuits, and 7.2 V for the pan-tilt servo. The TPS54531 offers over 90% efficiency, which is crucial for prolonging battery life in the companion robot. The circuit design emphasizes components with high current ratings, such as 5 A inductors and Schottky diodes, to handle peak loads. To summarize the power specifications, I present Table 3; a worked feedback-divider example follows the table.
| Voltage Rail | Current Rating | Purpose in Companion Robot |
|---|---|---|
| 3.3 V | Up to 5 A | Main controller and sensors |
| 5 V | Up to 5 A | Analog circuits and peripherals |
| 7.2 V | Up to 5 A | Pan-tilt servo and interactive modules |
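As a brief worked example of how a rail voltage is set, assume the regulator uses a feedback divider from the output to its FB pin and a nominal 0.8 V feedback reference (typical for this class of buck converter; the datasheet should be checked for the exact value). The output voltage is then

$$ V_{out} = V_{ref}\left(1 + \frac{R_{top}}{R_{bottom}}\right) $$

so the 3.3 V rail can be obtained with, for instance, \( R_{top} = 31.6\,\mathrm{k\Omega} \) and \( R_{bottom} = 10\,\mathrm{k\Omega} \), giving \( 0.8 \times (1 + 3.16) \approx 3.33\,\mathrm{V} \). These resistor values are illustrative, not necessarily the ones used on my board.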
The mecanum wheels are driven by brushed DC motors with A4950 driver chips. Each motor includes an incremental encoder for speed feedback, enabling the closed-loop PID control. The A4950 provides a full-bridge MOSFET drive, supporting bidirectional control. This setup allows the companion robot to achieve precise omnidirectional motion, essential for navigating home spaces. For wireless communication, I integrated an Intel Wireless 9260AC module via a Mini PCIe interface. This supports Wi-Fi and Bluetooth, ensuring the companion robot can connect to local networks and stream video with low latency. The PCIe interface requires careful differential pair routing to maintain signal integrity.
Software development for this companion robot involves multiple layers, from low-level motor control to high-level video streaming. I architected the software to run on the main controller with Linux OS and the sub-controller for real-time tasks. The goal is to ensure the companion robot responds promptly to user commands while maintaining stable video transmission.
On the main controller, I implemented video capture using V4L2 (Video for Linux 2), encoding via FFmpeg, and streaming through RTMP to an Nginx server. The video pipeline is optimized for low latency, a key requirement for real-time interaction with the companion robot. For control data, I used the TCP protocol for reliable transmission between the server and the robot. On the sub-controller, I developed firmware for the mecanum wheel and pan-tilt modules. The mecanum wheel controller parses speed vectors from the main controller and applies the incremental PID to adjust motor PWM duty cycles. The pan-tilt controller uses SVPWM-based FOC to position the brushless motor according to angle commands. To illustrate the software workflow, I have outlined the key steps in Table 4; a sketch of the speed-vector mixing on the sub-controller follows the table.
| Module | Platform | Key Functions |
|---|---|---|
| Video Streaming | Main Controller (Linux) | V4L2 capture, FFmpeg encoding, RTMP push to server |
| Control Communication | Main Controller | TCP client for command reception and transmission |
| Mecanum Wheel Control | Sub-Controller (STM32) | Closed-loop PID speed control, MODBUS protocol for data parsing |
| Pan-Tilt Control | Sub-Controller (STM32) | FOC with SVPWM, angle positioning for camera |
| Client Application | Qt or Android | Video decoding, user interface, command generation |
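To illustrate how the sub-controller might turn a received speed vector into the four wheel set-points that feed the incremental PID loops, here is a minimal C sketch of standard mecanum inverse kinematics. The geometry constants, sign conventions (which depend on roller orientation), and the wheel ordering are assumptions for illustration, not the robot's actual command protocol.

```c
/* Mecanum inverse kinematics: chassis velocity -> wheel angular speeds.
 * vx: forward velocity [m/s], vy: leftward velocity [m/s], wz: yaw rate [rad/s].
 * Results are written to w[4] in the assumed order FL, FR, RL, RR. */
#define WHEEL_RADIUS_M  0.03f   /* assumed wheel radius */
#define HALF_LENGTH_M   0.08f   /* assumed half wheelbase (front-to-back) */
#define HALF_WIDTH_M    0.07f   /* assumed half track width (left-to-right) */

static void mecanum_mix(float vx, float vy, float wz, float w[4])
{
    const float k = HALF_LENGTH_M + HALF_WIDTH_M;

    w[0] = (vx - vy - k * wz) / WHEEL_RADIUS_M;  /* front-left  */
    w[1] = (vx + vy + k * wz) / WHEEL_RADIUS_M;  /* front-right */
    w[2] = (vx + vy - k * wz) / WHEEL_RADIUS_M;  /* rear-left   */
    w[3] = (vx - vy + k * wz) / WHEEL_RADIUS_M;  /* rear-right  */
}
```

Each element of `w` then becomes the target of one incremental PID loop from the earlier sketch, closing the path from a user command on the client terminal down to the wheel PWM outputs.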
Testing and validation are crucial to ensure the companion robot meets performance expectations. I conducted extensive experiments to evaluate connectivity, latency, and responsiveness. From my tests, the companion robot demonstrates reliable operation within typical home environments.
First, I deployed the server on a cloud platform (simulated on Alibaba Cloud) and measured the video stream latency. Using FFmpeg tools, I pushed and pulled streams at 720p resolution, 20 fps, and a 2 Mbps bitrate. The results, averaged over multiple trials, show an average delay of about 150 ms, which is acceptable for real-time interaction with the companion robot. Detailed latency data is provided in Table 5.
| Trial Set | Average Latency (ms) | Notes on Companion Robot Performance |
|---|---|---|
| 1 | 150 | Stable streaming with minimal packet loss |
| 2 | 145 | Slight variations due to network conditions |
| 3 | 155 | Consistent within 10 m Wi-Fi range |
| 4 | 148 | Effective for real-time pet interaction |
| 5 | 152 | No noticeable lag in companion robot video feed |
Second, I tested control data transmission using network debugging tools. The companion robot successfully received commands from the server and executed movements with negligible delay. In the combined tests, the companion robot responded reliably within a 10 m radius of the Wi-Fi router, confirming robust wireless performance. The integration of the PID and FOC algorithms contributed to smooth motion, enhancing the user experience of controlling this companion robot.
In conclusion, this project represents a comprehensive approach to designing a cloud-based companion robot for remote pet companionship and environmental monitoring. I have detailed the system architecture, control algorithms, hardware circuits, and software implementation, all aimed at creating a responsive and reliable companion robot. The use of improved PID and FOC algorithms enables precise control, while efficient video streaming ensures low-latency interaction. From my perspective, the companion robot achieves its goals of providing real-time video feedback and remote operation, with potential applications in home security and pet care. Future work could focus on enhancing the robotic arm with more interactive modules and integrating AI for autonomous behavior, further advancing the capabilities of such companion robots. Throughout this article, I have emphasized how the integration of each component contributes to an effective companion robot and to innovative solutions for modern companionship needs.
