Performance Evaluation of Preschool Companion Robots Using Automated Data Collection and Analysis

In recent decades, robotics has made significant strides, particularly in educational settings, where early childhood development plays a crucial role in shaping future outcomes. The preschool phase is foundational: learning experiences during this period directly influence individual growth. Traditional companion robots have achieved some success in preschool education but have also revealed limitations. They often rely on pre-programmed routines and fixed datasets for interaction, lacking the flexibility to provide personalized support for children's individual differences. Moreover, traditional approaches to data collection and analysis are constrained, unable to fully leverage automation for real-time acquisition, processing, and adaptation to dynamic learning environments.

In this study, we explore automated data collection and analysis methods for evaluating the performance of preschool companion robots. By integrating advanced data acquisition techniques, we capture children's behavioral data, emotional states, and learning progress in real time, gaining a comprehensive understanding of each child's characteristics. Our innovation lies in the extensive use of automation: real-time data collection and intelligent analysis allow the robot to better adapt and respond to children's needs, ultimately enhancing the effectiveness of preschool companionship. We hope this research provides a feasible technical solution for improving the quality and impact of preschool education, and offers insights for the development of future intelligent educational systems.

The core of our approach involves designing an intelligent preschool companion robot with a robust data collection and processing model. We developed a binocular stereoscopic vision measurement model that works synergistically with the companion robot, enriching children's learning and interactive experiences. High-resolution binocular cameras serve as the hardware foundation, capable of rapidly capturing high-quality images. By mounting these cameras on the robot's head or body, we ensure the robot can acquire depth information from the environment and accurately perceive object positions. This setup allows real-time collection of multimodal data during robot-child interactions, including speech, facial expressions, and movements. The companion robot's data acquisition system, including the measurement range of the binocular cameras, is illustrated in the accompanying figure.
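The depth recovery behind such a binocular setup can be sketched with the standard pinhole-stereo relation Z = f·B/d. This is a minimal illustration only; the focal length, baseline, and disparity values below are hypothetical and do not reflect the robot's actual calibration:

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth of a point from binocular disparity (pinhole model).

    Z = f * B / d, where f is the focal length in pixels,
    B the baseline between the two cameras, and d the disparity.
    """
    disparity_px = np.asarray(disparity_px, dtype=float)
    # Guard against zero disparity (point at infinity).
    with np.errstate(divide="ignore"):
        return np.where(disparity_px > 0,
                        focal_px * baseline_m / disparity_px,
                        np.inf)

# Hypothetical example: 700 px focal length, 6 cm baseline, 30 px disparity
z = depth_from_disparity(30.0, 700.0, 0.06)   # -> 1.4 m
```

Larger disparities map to nearer objects, which is why a wide measurement range (as in the figure) requires accurate disparity estimation at both short and long distances.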

In the data collection process for the preschool companion robot, we focus on parameters in four main classes. First, operational parameters include the robot's movement speed, which assesses its mobility efficiency in learning environments, and interaction pressure, which reflects the intensity of physical interactions with children. Second, on-site sensor information covers environmental perception organs such as cameras and microphones, along with the number and placement of touch sensors for detecting children's tactile interactions. Third, robot work information involves the location of speech output devices and the status of LED lights, which determine the direction of audio output and convey the robot's emotional or operational state. Finally, state information includes the robot's engagement depth in learning activities, evaluating its level of participation, and its interaction status, recording current activities such as dialogues or games. Together, these parameters enable the companion robot to participate in preschool education more intelligently and in a more personalized way, providing rich and beneficial learning experiences. The collected parameters are summarized in Table 1.

Table 1: Data Collection Parameters for Preschool Companion Robot
| Parameter Category | Specific Parameters | Description |
| --- | --- | --- |
| Operational Parameters | Movement Speed, Interaction Pressure | Assess robot mobility and interaction intensity. |
| Sensor Information | Camera Resolution, Microphone Sensitivity, Touch Sensor Count | Capture environmental and tactile data. |
| Work Information | Speech Output Location, LED Status | Determine audio direction and emotional cues. |
| State Information | Engagement Depth, Interaction Status | Evaluate learning participation and activity type. |
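For concreteness, the four parameter classes above might be represented as simple records in the robot's data collection layer. This is only a sketch; all field names, types, and default values are hypothetical, not the system's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class OperationalParams:
    movement_speed_mps: float = 0.0      # mobility efficiency
    interaction_pressure_n: float = 0.0  # intensity of physical contact

@dataclass
class SensorInfo:
    camera_resolution: tuple = (1920, 1080)
    microphone_sensitivity_db: float = -38.0
    touch_sensor_count: int = 8

@dataclass
class WorkInfo:
    speech_output_location: str = "head"
    led_status: str = "idle"             # emotional / operational cue

@dataclass
class StateInfo:
    engagement_depth: float = 0.0        # 0..1 participation level
    interaction_status: str = "idle"     # e.g. "dialogue", "game"

@dataclass
class RobotSnapshot:
    """One time-stamped sample of all four parameter categories."""
    operational: OperationalParams = field(default_factory=OperationalParams)
    sensors: SensorInfo = field(default_factory=SensorInfo)
    work: WorkInfo = field(default_factory=WorkInfo)
    state: StateInfo = field(default_factory=StateInfo)
```

Grouping the categories into one snapshot per sampling instant makes it easy to log them as a time series for the downstream analysis described next.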

The collected signals have a time length denoted L. In signal processing, system parameter information refers to the algorithms, and their parameters, used during noise reduction. We employ adaptive noise suppression algorithms, with parameters such as the noise suppression width and the adaptive step-size factor, to optimize touch signal quality. Similarly, for the visual and audio sensors we select different denoising algorithms and adjust parameters such as correlation peak thresholds and equalizer orders to improve the accurate perception of environmental images and audio signals. Process data, generated after stages such as denoising, demodulation, synchronization, and equalization, are intermediate data from the various sensors. In the preschool companion robot, these data can be separated by channel: touch sensors capture tactile signals, visual sensors acquire environmental images, and audio sensors record sound. After equalization, we obtain constellation diagrams for the touch, visual, and audio data, which help characterize the interaction scenarios between the companion robot and children in preschool settings.
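The adaptive noise suppression step can be illustrated with a normalized LMS (NLMS) canceller, in which the filter length plays the role of the noise suppression width and `mu` is the adaptive step-size factor mentioned above. This is a generic sketch of adaptive noise cancellation, not the robot's exact algorithm, and it assumes a reference input correlated with the noise only:

```python
import numpy as np

def nlms_denoise(noisy, reference, width=16, mu=0.1, eps=1e-8):
    """Normalized LMS adaptive noise canceller (illustrative sketch).

    `width`     -- filter length ("noise suppression width")
    `mu`        -- adaptive step-size factor
    `reference` -- signal correlated with the noise, not the signal
    """
    w = np.zeros(width)
    out = np.zeros(len(noisy), dtype=float)
    for n in range(len(noisy)):
        # Most recent `width` reference samples, newest first,
        # zero-padded at the start of the recording.
        x = reference[max(0, n - width + 1):n + 1][::-1]
        x = np.pad(x, (0, width - len(x)))
        y = w @ x                          # current noise estimate
        e = noisy[n] - y                   # error = cleaned sample
        w += mu * e * x / (eps + x @ x)    # NLMS weight update
        out[n] = e
    return out
```

The step size trades convergence speed against steady-state misadjustment, which is why it is exposed as a tunable parameter rather than fixed.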

To evaluate the performance of the companion robot, we implement a denoising algorithm-based assessment method. Currently, widely used parameters for evaluating denoising effects in signal processing include Normalized Cross-Correlation (NCC) and Signal-to-Noise Ratio (SNR). NCC measures similarity between signals, ranging from -1 to 1, where 1 indicates perfect similarity, 0 no correlation, and -1 perfect opposition. SNR assesses the relative strength of useful information versus noise, with higher values denoting better denoising performance. The SNR is calculated as:

$$ SNR = 10 \log_{10} \left( \frac{\sum_{n} s^2(n)}{\sum_{n} N^2(n)} \right) $$

where s(n) is the signal and N(n) the noise. SNR is an ideal metric for denoising performance; higher values indicate better results. Because the noise intensity is unknown in practice, we also examine the similarity between the denoised and original signals within the frequency-domain bandwidth. Denoting the pre-denoising spectrum by P0(f) and the post-denoising spectrum by P1(f), the similarity within the bandwidth is given by:

$$ NCC = \frac{\int_{f_c - B}^{f_c + B} P_0(f)\, P_1(f)\, df}{\sqrt{\int_{f_c - B}^{f_c + B} P_0^2(f)\, df \cdot \int_{f_c - B}^{f_c + B} P_1^2(f)\, df}} $$

Here, NCC ranges from 0 to 1: 0 indicates uncorrelated spectra (severe distortion from over-denoising), while 1 indicates identical spectra (the denoiser has altered neither signal nor noise). Values closer to 1 therefore indicate higher similarity and less distortion, which is crucial for avoiding over-denoising in companion robot applications.
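Under the definitions above, both metrics are straightforward to compute. The sketch below uses the power-ratio form of SNR and a discrete version of the band-limited NCC; the array names and the example spectrum are illustrative only:

```python
import numpy as np

def snr_db(signal, noise):
    """SNR = 10*log10(sum s^2 / sum N^2), in decibels."""
    return 10 * np.log10(np.sum(signal ** 2) / np.sum(noise ** 2))

def band_ncc(p0, p1, freqs, fc, bw):
    """Normalized cross-correlation of two spectra P0, P1 over
    [fc - B, fc + B] -- a discrete version of the integral formula.
    0 = uncorrelated spectra, 1 = identical spectral shape."""
    m = (freqs >= fc - bw) & (freqs <= fc + bw)
    num = np.sum(p0[m] * p1[m])
    den = np.sqrt(np.sum(p0[m] ** 2) * np.sum(p1[m] ** 2))
    return num / den

# Illustrative example: a Gaussian spectral bump centered at 12 Hz.
freqs = np.linspace(0, 50, 501)
p0 = np.exp(-((freqs - 12.0) ** 2))
print(band_ncc(p0, p0, freqs, fc=12.0, bw=4.0))   # identical spectra -> 1.0
```

Note that the NCC is scale-invariant (a uniformly attenuated spectrum still scores 1), so it measures distortion of spectral shape rather than overall gain.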

We further design the preschool companion robot system based on tangible interaction principles. The goal is to create an intelligent partner that aligns with children’s behavioral habits, especially in preschool education. We prioritize interaction modes matching children’s natural behaviors, using tangible and gesture-based interactions to foster engaging experiences. In preschool contexts, machine vision is vital; by emulating human visual systems, the companion robot intelligently recognizes and interprets children’s speech, tone, expressions, and body language. This enhances perceptual capabilities, allowing the robot to respond better to emotional needs and create intimate interactions. For situational awareness, we emphasize safe companionship in dynamic environments, leveraging advanced perception technologies for real-time adjustments. Lastly, we aim to provide native content conforming to children’s growth patterns, tailored to their developmental stages. The design process for the companion robot, centered on tangible interaction, is summarized in Table 2, detailing key phases from user behavior analysis to feedback mechanisms.

Table 2: Tangible Interaction Design Process for Companion Robot
| Design Phase | Activities | Outcomes |
| --- | --- | --- |
| User Behavior Analysis | Market research, user studies to understand children's habits. | Set of interaction actions for user-friendly engagement. |
| Physical Interface Setup | Design explicit physical entities for easy and accurate interaction. | Environment aligned with children's psychological expectations. |
| Digital Information Coupling | Software analysis and computation for system operation. | Closed-loop system integrating hardware and child interactions. |
| Tangible Feedback | Use light, sound, scent for tangible feedback. | Enhanced fun and emotional connection with companion robot. |

In data from the preschool companion robot, an increase in specific operational or perceptual signal components within the bandwidth can push the normalized median variance toward 1. This indicates greater signal variability during tasks or interactions, reflecting high signal strength and complexity, possibly due to multiple perceptual channels and diverse tasks. When similar noise is applied to systems with different carriers and transmission rates, a relationship emerges between normalized median variance and SNR: higher SNR typically correlates with lower normalized median variance, implying stable signals with minimal noise impact, while lower SNR leads to higher variance, indicating significant noise-driven fluctuations. We analyze this through signals representing pure noise and signals lacking specific components. For instance, signal ID 926 denotes pure noise in companion robot interactions and always exhibits a large normalized median variance, whereas signal ID 925, which lacks certain components, shows a smaller variance. A representative companion robot signal, with a duration of 270 s, a 12 Hz carrier, and a transmission rate of 4 b/s, is processed via Fourier transform with a frequency resolution of 0.1 Hz at a sampling rate of 1000 samples per second. The normalized amplitude reflects specific patterns in the robot's environmental perception and response.
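The stated analysis parameters fix the transform size: at 1000 samples per second, a 0.1 Hz frequency resolution requires a 10,000-point window. The sketch below runs that transform on a synthetic stand-in signal (a 12 Hz carrier plus background noise); the signal itself is illustrative, not recorded robot data:

```python
import numpy as np

fs = 1000            # sampling rate stated in the text (samples/s)
df = 0.1             # stated frequency resolution (Hz)
n = int(fs / df)     # 10,000-point window gives 0.1 Hz bins
t = np.arange(n) / fs

# Illustrative stand-in for a recorded interaction signal:
# a 12 Hz carrier (as in the text) plus background noise.
rng = np.random.default_rng(1)
x = np.cos(2 * np.pi * 12.0 * t) + 0.3 * rng.standard_normal(n)

spec = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(n, 1 / fs)
norm_amp = spec / spec.max()          # normalized amplitude spectrum

peak_hz = freqs[np.argmax(norm_amp)]  # expected near the 12 Hz carrier
```

Only one 10 s window is needed per spectrum, so a 270 s recording yields 27 non-overlapping spectra whose variability across windows can then be summarized by statistics such as the normalized median variance discussed above.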

Comparing denoising algorithms, we evaluate the Adaptive Wiener Filter (AWF), the Frequency Domain Median Filter (FDMF), and Noise Harmonic Suppression (NHS). Power spectrum results show that AWF and FDMF have similar denoising effects, but normalized median variance analysis suggests AWF performs better, albeit with lower spectral similarity, indicating potential over-denoising. NHS at times fails to remove noise effectively. Statistical results from the mean variance indicate that when background noise is mild (normalized median variance near 0.4), higher spectral similarity correlates with better denoising. Specifically, the FDMF algorithm achieves optimal performance at a frequency-domain median variance of 0.421, demonstrating the highest spectral similarity. This is summarized in Table 3, which compares the algorithms on the key metrics relevant to companion robot data processing.

Table 3: Comparison of Denoising Algorithms for Companion Robot Data
| Algorithm | Spectral Similarity (NCC) | Normalized Median Variance | Overall Performance |
| --- | --- | --- | --- |
| Adaptive Wiener Filter (AWF) | Moderate | Low | Good, but risk of over-denoising |
| Frequency Domain Median Filter (FDMF) | High | 0.421 (optimal) | Best for mild noise conditions |
| Noise Harmonic Suppression (NHS) | Low | Variable | Ineffective in some scenarios |
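The exact FDMF algorithm is not specified here, but one common frequency-domain median-filtering scheme estimates the noise floor with a running median of the magnitude spectrum, subtracts it, and reconstructs the signal with the original phase. The following is a sketch of that assumed scheme, not the evaluated implementation:

```python
import numpy as np

def fdmf_denoise(x, kernel=31, gain=1.0):
    """Frequency-domain median-filter denoising (assumed form).

    A running median over `kernel` frequency bins estimates the
    broadband noise floor; subtracting it keeps narrowband signal
    peaks (spectral subtraction) while the original phase is reused.
    """
    spec = np.fft.rfft(x)
    mag, phase = np.abs(spec), np.angle(spec)
    half = kernel // 2
    padded = np.pad(mag, half, mode="edge")
    floor = np.array([np.median(padded[i:i + kernel])
                      for i in range(len(mag))])
    cleaned = np.maximum(mag - gain * floor, 0.0)  # clip at zero
    return np.fft.irfft(cleaned * np.exp(1j * phase), n=len(x))
```

A median (rather than a mean) is used for the floor estimate because it is robust to the strong narrowband peaks the filter is meant to preserve, which matches FDMF's reported strength under mild noise.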

To comprehensively assess the preschool companion robot, we conduct evaluations involving experts and users. Learning tasks are divided into five stages: (1) emotional guidance and user familiarization, (2) basic knowledge and skill transfer, (3) interactive deepening, (4) creative thinking and problem-solving, and (5) comprehensive application and summary evaluation. Three experts score the companion robot across these stages. For educational effectiveness, scores range from 85 to 97, indicating strong performance. For personalized adaptation, scores range from 80 to 96, with a minimum of 80 in stage 1 due to the robot’s learning curve, but overall high satisfaction. These results are detailed in Table 4, showing expert ratings for each stage, highlighting the companion robot’s strengths in education and adaptability.

Table 4: Expert Scores for Companion Robot Across Learning Stages
| Learning Stage | Educational Effectiveness Score (Range) | Personalized Adaptation Score (Range) |
| --- | --- | --- |
| Stage 1: Emotional Guidance | 85-90 | 80-85 |
| Stage 2: Basic Skills | 88-93 | 85-90 |
| Stage 3: Interactive Deepening | 90-95 | 88-93 |
| Stage 4: Creative Thinking | 92-96 | 90-94 |
| Stage 5: Comprehensive Application | 94-97 | 92-96 |

Additionally, we gather feedback from 100 users who interacted with the preschool companion robot. Overall satisfaction scores are above 80, reflecting positive experiences. For interaction naturalness and emotional experience, most users rate above 80, though some indicate room for improvement in emotional responsiveness. In human-robot collaboration, interactive response time scores are high, benefiting from our binocular stereoscopic vision model, and privacy protection scores are also favorable, enhancing trust. User ratings are summarized in Table 5, providing insights into key aspects of the companion robot’s performance from a user perspective.

Table 5: User Satisfaction Scores for Companion Robot
| Evaluation Aspect | Average Score (out of 100) | Comments |
| --- | --- | --- |
| Interaction Naturalness | 85 | Generally smooth, but some delays noted. |
| Emotional Experience | 82 | Good, but emotional recognition could be enhanced. |
| Interactive Response Time | 90 | Excellent, due to efficient vision model. |
| Privacy Protection | 88 | High trust in data security measures. |
| Overall Satisfaction | 86 | Positive feedback for educational value. |

The results underscore the efficacy of automated data collection and analysis in optimizing preschool companion robots. The FDMF denoising algorithm proves best under specific noise conditions, with a frequency domain median variance of 0.421. Expert assessments validate the robot’s strong educational impact and adaptive capabilities, while user feedback highlights satisfaction, particularly in response times facilitated by our binocular stereoscopic vision model. This research demonstrates that integrating advanced data techniques can significantly enhance the performance of companion robots in preschool settings, paving the way for more personalized and effective educational tools. Future work should explore additional factors, such as long-term engagement and cultural adaptability, to further refine companion robot designs. Ultimately, this study contributes to the growing field of intelligent education, emphasizing the transformative potential of companion robots in early childhood development.

In conclusion, our investigation into preschool companion robots leverages automation to address existing limitations. By designing a robust data acquisition framework, implementing effective denoising methods, and incorporating tangible interaction principles, we create a system that not only performs well in controlled evaluations but also resonates with users. The companion robot’s ability to adapt and respond in real-time, supported by empirical data, marks a step forward in educational technology. As robotics continues to evolve, such approaches will be crucial for developing companion robots that are not only functional but also empathetic and engaging for young learners. We encourage further research to expand on these findings, exploring diverse contexts and integrating emerging technologies like artificial intelligence for even smarter companion robots in preschool education.
