Design and Development of a Smart Family Companion Robot

In recent years, the rapid aging of the global population, particularly the rise of empty-nest elderly living alone, has highlighted an urgent need for innovative assistive technologies. As a researcher focused on embedded systems and robotics, I have been exploring practical solutions to enhance the quality of life for seniors through intelligent automation. This paper presents my comprehensive design and implementation of a family companion robot, leveraging the NI myRIO embedded platform and LabVIEW graphical programming. The primary goal is to create an affordable, flexible, and multifunctional companion robot that provides daily companionship, safety monitoring, and interactive engagement for elderly individuals living independently.

The core concept revolves around a mobile robot capable of autonomous following, voice interaction, obstacle avoidance, and remote communication. The choice of NI myRIO as the central processor is strategic: its rich I/O peripherals, real-time processing capabilities, and seamless integration with LabVIEW enable rapid prototyping and robust performance. This companion robot is designed not only as a functional aid but also as an empathetic presence, addressing social isolation and providing peace of mind for families. Throughout this work, the term “companion robot” is emphasized to underscore its role beyond mere automation—it is a proactive caregiver and friend.

The system architecture is modular, ensuring scalability and ease of maintenance. As illustrated in the conceptual diagram above, the companion robot integrates sensing, actuation, communication, and intelligence modules. The NI myRIO serves as the brain, coordinating data from cameras, ultrasonic sensors, infrared detectors, and voice modules. Motor drivers control movement, while WiFi enables remote monitoring via mobile devices. This holistic approach ensures that the companion robot can adapt to various home environments and user needs. The design philosophy prioritizes user-friendliness, with intuitive interfaces and fail-safe mechanisms to prevent accidents.

To quantify the system’s performance, several key parameters are defined. For instance, the companion robot maintains a “friendly distance” from the user, typically between 30 cm and 50 cm, to avoid intrusion while ensuring proximity. The following equations govern distance measurement and motor control:

Ultrasonic distance calculation: $$s = \frac{340 \times t}{2}$$ where \(s\) is the distance in meters, and \(t\) is the time of flight in seconds. In practice, since the sensor outputs time in microseconds (\(\mu s\)), the formula is adapted for centimeters: $$y = \frac{0.034 \times x}{2}$$ where \(y\) is the distance in cm, and \(x\) is the pulse width in \(\mu s\). This allows precise tracking for the companion robot.
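The microsecond-to-centimetre conversion above can be sketched in a few lines (function name and the example pulse width are illustrative):

```python
def pulse_to_cm(pulse_us: float) -> float:
    """Convert an HC-SR04 echo pulse width (microseconds) to distance in cm.

    Sound travels at ~340 m/s, i.e. 0.034 cm per microsecond; the result is
    halved because the pulse covers the round trip to the target and back.
    """
    return 0.034 * pulse_us / 2

# A ~2353 us echo corresponds to roughly 40 cm
distance_cm = pulse_to_cm(2353)
```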

For motor control, PWM signals drive the wheels, with speed adjusted based on distance error. A proportional-integral-derivative (PID) controller can be implemented, though in this initial design, a simpler threshold-based approach is used. The control law is: $$v = k_p \cdot (d_{target} - d_{actual})$$ where \(v\) is the motor velocity, \(k_p\) is a gain constant, \(d_{target}\) is the desired following distance (e.g., 40 cm), and \(d_{actual}\) is the measured distance. This ensures smooth movement of the companion robot.
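A minimal sketch of this proportional law follows. Note the error sign is flipped relative to the equation above so that positive \(v\) drives the robot forward toward a user who is too far away; the gain and the clamp to the PWM duty range are illustrative values, not the tuned constants used on the robot:

```python
def follow_speed(d_actual_cm: float, d_target_cm: float = 40.0,
                 k_p: float = 0.02) -> float:
    """Proportional following speed, clamped to the PWM duty range [-1, 1].

    Positive output drives forward (user too far), negative backs away.
    """
    v = k_p * (d_actual_cm - d_target_cm)
    return max(-1.0, min(1.0, v))
```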

Hardware Design and Integration

The hardware selection focuses on cost-effectiveness, reliability, and compatibility with NI myRIO. Each module is chosen to fulfill specific roles in the companion robot ecosystem. Below is a summary of the key components and their functions:

| Component | Model/Specification | Function in Companion Robot |
|---|---|---|
| Core Processor | NI myRIO-1900 | Central control, data processing, and I/O management |
| Vision Sensor | USB Webcam (e.g., 720p resolution) | Real-time image capture for person detection |
| Distance Sensor | HC-SR04 Ultrasonic Module | Measuring distance to objects and users |
| Person Detection | HC-SR501 PIR Sensor | Infrared-based human presence sensing |
| Motor Driver | L298N Dual H-Bridge | Controlling DC motors for movement |
| Voice Module | LD3320 Speech Recognition Chip | Voice command processing and interaction |
| Communication | WiFi via NI myRIO's built-in adapter | Remote data transmission and control |
| Power Supply | 12V Li-ion Battery with Regulators | Providing stable power to all modules |

The wiring between NI myRIO and peripherals is critical for signal integrity. I configured digital I/O pins for sensor triggers, analog inputs for audio, and PWM outputs for motor speed control. For example, the ultrasonic sensor’s trigger and echo pins connect to digital lines, while the motor driver’s enable pins use PWM outputs. This setup allows the companion robot to react in real-time to environmental changes. The table below details the pin connections:

| NI myRIO Pin | Connected Module | Signal Type | Purpose |
|---|---|---|---|
| A/DIO0 (Pin 11) | L298N IN1 | Digital Output | Motor direction control |
| A/PWM0 (Pin 27) | L298N ENA | PWM Output | Motor speed control |
| C/DIO7 | HC-SR04 Trig | Digital Output | Ultrasonic trigger pulse |
| C/DIO6 | HC-SR04 Echo | Digital Input | Echo pulse measurement |
| A/AI0 (Pin 3) | HC-SR501 OUT | Analog Input | PIR sensor output |
| A/TX (Pin 14) | LD3320 RXD | UART Transmit | Serial communication to voice module |
| A/RX (Pin 10) | LD3320 TXD | UART Receive | Serial data from voice module |

Power management is another essential aspect. The companion robot operates on a 12V battery, with voltage regulators stepping down to 5V for sensors and 3.3V for logic circuits. This ensures safe operation and prolongs battery life. The total power consumption is estimated using: $$P_{total} = \sum (V_i \times I_i)$$ where \(V_i\) and \(I_i\) are the voltage and current for each module. For instance, the motors draw up to 2A each under load, so heat sinks are added to the L298N driver. By optimizing hardware layout, I minimized interference and improved the reliability of this companion robot.
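The power-budget formula above amounts to a simple sum. The sketch below uses illustrative per-module figures (the 2 A motor draw is from the text; the other voltages and currents are assumptions for demonstration, not measured values):

```python
# Rough power budget: P_total = sum of V_i * I_i over modules.
# Values below are illustrative, not measured.
modules = {
    "myRIO":         (12.0, 0.45),  # (voltage V, current A)
    "motors (x2)":   (12.0, 4.00),  # up to 2 A each under load
    "sensors (5V)":  (5.0,  0.10),
    "logic (3.3V)":  (3.3,  0.15),
}

p_total_w = sum(v * i for v, i in modules.values())  # total power in watts
```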

Software Architecture and Implementation

The software is developed in LabVIEW, utilizing its graphical programming paradigm to create modular, readable code. The program flow follows a state-machine pattern, with key states such as Initialization, Person Search, Following, Interaction, and Alert. This structure allows the companion robot to switch tasks seamlessly based on sensor inputs and user commands. Below, I detail the three core software modules: autonomous following, voice interaction, and data communication.

Autonomous Following Module

This module enables the companion robot to detect and follow a person while avoiding obstacles. It combines computer vision, ultrasonic ranging, and motor control. First, video from the USB camera is processed using NI Vision Development Module. I trained a pattern-matching algorithm to recognize human silhouettes, particularly leg patterns, as they are stable features in home settings. The matching score \(M\) is computed as: $$M = \frac{\sum (I_{template} \cdot I_{scene})}{\sqrt{\sum I_{template}^2 \cdot \sum I_{scene}^2}}$$ where \(I\) represents pixel intensities. If \(M\) exceeds a threshold (e.g., 0.7), a person is detected.
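The normalized correlation above can be expressed in a few lines of NumPy. This is a hedged illustration of the score itself; the robot's actual detection uses the NI Vision Development Module's pattern matching, not hand-rolled code:

```python
import numpy as np

def match_score(template: np.ndarray, scene: np.ndarray) -> float:
    """Normalized cross-correlation M between a template and an
    equally-sized scene patch. M = 1 for identical (or scaled) patterns."""
    num = np.sum(template * scene)
    den = np.sqrt(np.sum(template**2) * np.sum(scene**2))
    return float(num / den)
```

Because the score is normalized, uniformly brighter or darker copies of the template still score 1.0, which is why a fixed threshold such as 0.7 works across lighting changes.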

Concurrently, the PIR sensor provides a binary output for human presence, reducing false positives. When both sensors affirm detection, the companion robot activates the ultrasonic sensor to measure distance. The HC-SR04 emits a 40 kHz pulse, and the echo time is captured via a counter on NI myRIO. Using the formula given earlier, the distance is calculated. If the distance is greater than 50 cm, the robot moves forward; if it is less than 30 cm, it stops; and if it is between 30 cm and 50 cm, it holds its position. This dead band prevents oscillatory behavior.
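The three-zone decision logic can be sketched as follows (the function name and string labels are illustrative; on the robot these map to motor commands in the LabVIEW state machine):

```python
def follow_action(distance_cm: float) -> str:
    """Three-zone rule around the 30-50 cm friendly distance.

    > 50 cm: move toward the user; < 30 cm: stop;
    30-50 cm: hold position (the dead band that suppresses oscillation).
    """
    if distance_cm > 50:
        return "forward"
    if distance_cm < 30:
        return "stop"
    return "hold"
```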

For obstacle avoidance, a servo motor (SG90) pans the ultrasonic sensor to scan five directions: -90°, -45°, 0°, 45°, and 90°. The distances are stored in an array \(D = [d_1, d_2, d_3, d_4, d_5]\). The robot then turns toward the direction with the maximum distance, i.e., $$\theta_{turn} = \arg\max(D)$$ This simple algorithm ensures safe navigation in cluttered spaces. The motor control signals are generated using PWM, with duty cycle adjusted proportionally to speed requirements. The overall following accuracy of the companion robot is within ±5 cm under normal lighting conditions.
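The scan-and-turn rule above reduces to an argmax over the five readings. A minimal sketch (function name hypothetical; angle order matches the scan sequence described):

```python
def pick_heading(scan_cm):
    """Choose the pan angle (degrees) with the largest free distance.

    scan_cm holds the five ultrasonic readings taken at pan angles
    -90, -45, 0, 45, 90 degrees, in that order.
    """
    angles = (-90, -45, 0, 45, 90)
    i_max = max(range(len(scan_cm)), key=lambda i: scan_cm[i])
    return angles[i_max]
```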

Voice Interaction Module

Voice interaction is crucial for making the companion robot user-friendly, especially for elderly individuals who may not be tech-savvy. The LD3320 module handles speech recognition locally, reducing latency and avoiding the privacy concerns of cloud-based recognition. It supports a vocabulary of up to 50 phrases, which I programmed with common commands like “Hello,” “Follow me,” “What time is it?” and “Help.” Recognition results are sent via UART to NI myRIO at 9600 baud.

On the software side, LabVIEW manages a voice library stored on the NI myRIO’s internal memory. Audio files are recorded as .dat files (uncompressed PCM format) at a sampling rate of 32 kS/s to balance quality and processing load. The audio playback involves reading these files, extracting amplitude data, and outputting to the myRIO’s audio-out port. The sound pressure level \(L_p\) in decibels can be approximated for feedback: $$L_p = 20 \log_{10}\left(\frac{P}{P_0}\right)$$ where \(P\) is the measured amplitude, and \(P_0\) is a reference pressure (conventionally 20 \(\mu\)Pa). This ensures audible yet non-intrusive responses from the companion robot.
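The decibel formula is a one-liner; the sketch below uses the conventional 20 μPa reference (an assumption about the calibration, since the text does not fix \(P_0\)):

```python
import math

def spl_db(p_pa: float, p0_pa: float = 20e-6) -> float:
    """Sound pressure level in dB relative to p0 (20 uPa by convention)."""
    return 20 * math.log10(p_pa / p0_pa)
```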

A finite-state machine handles dialogue flows. For example, if the user says “Remind me to take medicine,” the companion robot acknowledges, stores the task, and triggers an alarm at the scheduled time. The interaction loop runs at 10 Hz, ensuring prompt responses. Additionally, the robot can play preloaded music or news updates, enhancing its role as a companion. The table below summarizes key voice commands and actions:

| Voice Command | Action by Companion Robot | Response Audio |
|---|---|---|
| “Hello robot” | Greet user and activate following | “Hi, I’m here to accompany you.” |
| “Stop following” | Halt motors and enter standby | “Okay, I’ll wait here.” |
| “What is today’s date?” | Retrieve system time and speak | “Today is [date].” |
| “Emergency help” | Send alert message via WiFi | “Alert sent to your family.” |
| “Play music” | Stream audio from library | Plays selected melody |

Data Communication Module

Remote monitoring is essential for families to check on elderly relatives. The NI myRIO’s WiFi capability allows the companion robot to transmit real-time data to a cloud server or directly to mobile devices. I implemented a client-server model using LabVIEW’s Network Streams or shared variables. Data packets include sensor readings, battery status, and error flags, formatted as JSON for interoperability. The transmission interval is configurable, typically set to 5 seconds to balance network load and timeliness.
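An illustrative status packet is sketched below. The field names and units are assumptions for demonstration, not the exact schema used on the robot, and the actual transport on the robot is LabVIEW Network Streams or shared variables rather than a Python socket:

```python
import json
import time

def build_status_packet(distance_cm: float, battery_v: float,
                        error_flags: list) -> str:
    """Serialize one telemetry sample as JSON (field names illustrative)."""
    return json.dumps({
        "timestamp": int(time.time()),   # seconds since epoch
        "distance_cm": distance_cm,      # latest ultrasonic reading
        "battery_v": battery_v,          # pack voltage
        "errors": error_flags,           # e.g. ["motor_overcurrent"]
    })
```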

On the mobile end, I developed a simple dashboard using NI Data Dashboard for iPad. It displays live video feed, distance metrics, and system logs. Users can also send commands, such as forcing the companion robot to return to a charging dock. The communication protocol ensures encryption for privacy. The signal strength \(RSSI\) in dBm affects connectivity: $$RSSI = -10n \log_{10}(d) + C$$ where \(n\) is the path-loss exponent, \(d\) is distance from router, and \(C\) is a constant. In home environments, the companion robot maintains a stable connection within 20 meters.
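The log-distance path-loss model above can be evaluated directly. The exponent and constant below are illustrative indoor values (with \(C\) interpreted as the RSSI at 1 m from the router), not measurements from this system:

```python
import math

def rssi_dbm(d_m: float, n: float = 2.7, c_dbm: float = -40.0) -> float:
    """Log-distance path-loss model: RSSI = -10*n*log10(d) + C.

    n is the path-loss exponent; c_dbm is the RSSI at 1 m (illustrative).
    """
    return -10 * n * math.log10(d_m) + c_dbm
```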

For abnormal situations, such as a suspected fall (inferred from prolonged inactivity or sudden loud noises), the companion robot autonomously sends an SMS or email to predefined contacts. This feature uses IFTTT webhooks or direct SMTP from the myRIO. Alert accuracy is critical: $$\text{Alert rate} = \frac{\text{number of true alerts}}{\text{total events}} \times 100\%$$ aiming for above 95% to minimize false alarms. The integration of communication transforms the companion robot from a local assistant into a connected health monitor.

Simulation and Testing Results

Due to constraints, extensive physical testing was supplemented with simulation in LabVIEW and MATLAB. I modeled the companion robot’s dynamics as a differential drive system, with equations: $$\dot{x} = v \cos(\theta), \quad \dot{y} = v \sin(\theta), \quad \dot{\theta} = \frac{v_r - v_l}{L}$$ where \(x, y\) are positions, \(\theta\) is orientation, \(v = (v_r + v_l)/2\) is the forward speed with \(v_r\) and \(v_l\) the right and left wheel velocities, and \(L\) is the wheelbase. Simulations validated the control algorithms before deployment.
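A minimal Euler-integration step of these kinematics is sketched below (the wheelbase and time step are illustrative; the actual simulation was done in LabVIEW and MATLAB, not Python):

```python
import math

def step(x: float, y: float, theta: float,
         v_l: float, v_r: float,
         wheelbase: float = 0.2, dt: float = 0.01):
    """One Euler step of differential-drive kinematics.

    Forward speed v = (v_r + v_l)/2; heading rate = (v_r - v_l)/L.
    """
    v = 0.5 * (v_r + v_l)
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += (v_r - v_l) / wheelbase * dt
    return x, y, theta
```

With equal wheel speeds the robot drives straight; a speed difference rotates it about its center, which is how the turn command from the obstacle scan is executed.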

In functional tests, the companion robot successfully followed a person at speeds up to 0.5 m/s, with mean distance error of 3.2 cm. Obstacle avoidance worked for objects as small as 10 cm in diameter. Voice recognition accuracy was 88% in quiet environments and 75% with moderate background noise. The table below summarizes performance metrics:

| Metric | Target Value | Measured Value | Unit |
|---|---|---|---|
| Following Distance Error | < 5 | 3.2 ± 1.1 | cm |
| Maximum Speed | 0.5 | 0.48 | m/s |
| Battery Life | > 4 | 4.5 | hours |
| Voice Recognition Rate | > 85 | 88 | % |
| WiFi Range | 20 | 22 | m |
| Alert Latency | < 10 | 8.5 | s |

The system’s robustness was tested under various lighting and floor conditions. The companion robot performed well in typical living rooms but struggled on very dark surfaces due to infrared sensor limitations. Future iterations could include additional sensors like LiDAR for improved mapping. Overall, the design meets the core requirements of a family companion robot, providing reliable companionship and safety features.

Conclusion and Future Work

This project demonstrates the feasibility of building a low-cost, intelligent companion robot using off-the-shelf components and the NI myRIO platform. The integration of autonomous navigation, voice interaction, and wireless communication creates a holistic solution for elderly care. The companion robot not only assists with daily routines but also offers emotional support through interactive engagement. Key innovations include the hybrid sensing approach (vision + IR + ultrasonic) and the modular software architecture, which allow easy upgrades and customization.

Looking ahead, several enhancements are planned. First, machine learning algorithms could be incorporated for personalized behavior, such as learning the user’s daily patterns and adapting responses. For instance, a reinforcement learning model could optimize following paths: $$Q(s,a) \leftarrow Q(s,a) + \alpha \left[ r + \gamma \max_{a'} Q(s',a') - Q(s,a) \right]$$ where \(Q\) is the action-value function, \(s\) is state, \(a\) is action, \(r\) is reward, and \(\alpha, \gamma\) are the learning rate and discount factor. This would make the companion robot more intuitive.
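A tabular sketch of this update rule follows. It is illustrative only: how states and actions would be encoded for the robot is not specified in this design, so the dictionary-keyed table and default values are assumptions:

```python
from collections import defaultdict

def q_update(Q, s, a, r, s_next, actions, alpha=0.1, gamma=0.9):
    """One tabular Q-learning update.

    Q maps (state, action) -> value; actions lists the actions
    available in s_next for the max over a'.
    """
    best_next = max(Q[(s_next, a2)] for a2 in actions)
    Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
    return Q
```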

Second, additional health monitoring sensors (e.g., heart rate, temperature) could be integrated, turning the companion robot into a mobile health assistant. Data analytics could then predict potential issues using statistical models: $$P(\text{health event}) = f(\text{sensor data}, \text{history})$$ Third, swarm robotics concepts could enable multiple companion robots to collaborate in larger homes, communicating via mesh networks. The scalability of the NI myRIO platform supports such expansions.

In conclusion, this family companion robot represents a step toward sustainable elderly care through technology. By emphasizing affordability and user-centric design, it has the potential to improve the lives of millions. The open-source nature of the software and hardware schematics will encourage community development, fostering innovation in the field of companion robotics. As population aging accelerates globally, such intelligent systems will become indispensable, and this work aims to contribute to that future.
