In recent years, the aging population and the rising incidence of stroke have increased the demand for effective rehabilitation technologies. Stroke often leads to motor dysfunction that requires prolonged and costly rehabilitation. Traditional methods rely heavily on manual therapy, which is labor-intensive, difficult to standardize, and can delay optimal recovery. To address these challenges, we propose a human-robot interaction system that uses a six-axis force sensor to recognize movement intention, enabling precise control of rehabilitation robots. The approach detects the intended direction and velocity of motion from force data, offering a stable and simpler alternative to methods based on biological signals.
The core of our system is real-time data acquisition from a six-axis force sensor, which measures three orthogonal forces (Fx, Fy, Fz) and three moments (Mx, My, Mz). We developed a data reception program in C++ on the Visual Studio 2019 platform, using socket communication to transfer data between the sensor and the host computer. This setup provides continuous monitoring of the interaction forces and allows an immediate response to patient movements. Data processing is critical because of noise from hand tremors and environmental factors. We therefore applied a second-order low-pass filter to smooth the data, with the transfer function defined as:
$$G(s) = \frac{\omega_n^2}{s^2 + 2\xi\omega_n s + \omega_n^2}$$
where $\omega_n$ is the cutoff frequency and $\xi$ is the damping ratio. By fitting the collected data with MATLAB’s Curve Fitting Toolbox, we identified suitable parameters, $\omega_n = 2.90\, \text{rad/s}$ and $\xi = 0.7$, which give effective noise reduction with a latency of only 0.2 s.
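As a concrete illustration of this processing step, the sketch below applies the filter to each of the six channels in a simple reception loop. It is a minimal sketch: the 100 Hz sample period and the `readForceFrame()` stub (which returns simulated noisy samples in place of the actual socket `recv()` from the sensor stream) are assumptions, while $\omega_n = 2.90\,\text{rad/s}$ and $\xi = 0.7$ are the fitted values above; the filter integrates the state-space form of $G(s)$ with an explicit Euler step.

```cpp
#include <array>
#include <cmath>
#include <cstdio>

// Second-order low-pass filter G(s) = wn^2 / (s^2 + 2*xi*wn*s + wn^2),
// integrated in state-space form with a fixed sample period T (explicit Euler).
struct SecondOrderLowPass {
    double wn = 2.90;   // cutoff frequency [rad/s], fitted value from the text
    double xi = 0.7;    // damping ratio, fitted value from the text
    double T  = 0.01;   // assumed sample period [s]
    double y  = 0.0;    // filtered output (state x1)
    double dy = 0.0;    // output derivative (state x2)

    double update(double u) {
        const double ddy = wn * wn * (u - y) - 2.0 * xi * wn * dy;
        y  += T * dy;
        dy += T * ddy;
        return y;
    }
};

// Hypothetical stand-in for the socket reception step: in the real system this
// would recv() one frame (Fx, Fy, Fz, Mx, My, Mz) from the sensor's TCP stream.
std::array<double, 6> readForceFrame(int step) {
    const double noise = 0.5 * std::sin(0.8 * step);   // simulated tremor-like noise
    return {5.0 + noise, 0.0, -2.0, 0.0, 0.0, 0.0};    // simulated raw frame [N, N*m]
}

int main() {
    std::array<SecondOrderLowPass, 6> filters{};       // one filter per channel

    for (int step = 0; step < 500; ++step) {
        const std::array<double, 6> raw = readForceFrame(step);
        std::array<double, 6> filtered{};
        for (int i = 0; i < 6; ++i) filtered[i] = filters[i].update(raw[i]);

        if (step % 100 == 0)
            std::printf("t=%.2fs  Fx raw=%.2f  Fx filtered=%.2f\n",
                        step * filters[0].T, raw[0], filtered[0]);
    }
    return 0;
}
```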

Movement intention recognition is achieved by analyzing the relative change in force data. The algorithm computes the difference between real-time force ($F_{\text{act}}$) and initial force ($F_{\text{ini}}$):
$$k_{\text{act}} = F_{\text{act}} - F_{\text{ini}}$$
A tolerance threshold $\sigma$ (set to 3 N) accounts for natural hand tremors. If $-\sigma \leq k_{\text{act}} \leq +\sigma$, the system registers no movement intention; otherwise, it detects motion in the positive or negative direction along the corresponding coordinate axis. For active movement, the resultant force $F_{\text{act}}$ is taken as the magnitude of the vector sum of the forces in the x, y, and z directions, $F_{\text{act}} = \sqrt{F_x^2 + F_y^2 + F_z^2}$. If $F_{\text{act}}$ exceeds a patient-specific resistance threshold $F_{\text{res}}$ and remains below 20 N for safety, the robot executes the motion using a velocity control model. The force-velocity relationship is modeled to ensure smooth and responsive robot behavior, and collision detection provides an emergency stop.
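A minimal sketch of this decision logic is shown below. The dead band $\sigma = 3$ N and the 20 N safety cap are taken from the description above; the resistance threshold `F_RES`, the proportional force-to-velocity gain `K_VEL`, and the speed limit `V_MAX` are illustrative placeholders, since the exact patient-specific values and force-velocity law are not specified here.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

constexpr double SIGMA = 3.0;    // tremor dead band sigma [N] (from the text)
constexpr double F_MAX = 20.0;   // safety cap on the resultant force [N] (from the text)
constexpr double F_RES = 5.0;    // patient-specific resistance threshold [N] (placeholder)
constexpr double K_VEL = 0.02;   // illustrative force-to-velocity gain [(m/s)/N]
constexpr double V_MAX = 0.25;   // illustrative speed limit [m/s]

// Direction of intended motion along one axis: +1, -1, or 0 inside the dead band.
int axisIntention(double f_act, double f_ini) {
    const double k_act = f_act - f_ini;
    if (k_act >  SIGMA) return +1;
    if (k_act < -SIGMA) return -1;
    return 0;
}

// Map the current filtered force to a commanded Cartesian velocity.
// Returns false when no motion should be executed (no intention, below the
// resistance threshold, or above the safety limit).
bool intentionToVelocity(const Vec3& f_act, const Vec3& f_ini, Vec3& v_cmd) {
    const Vec3 k { f_act.x - f_ini.x, f_act.y - f_ini.y, f_act.z - f_ini.z };
    const bool hasIntention = axisIntention(f_act.x, f_ini.x) != 0 ||
                              axisIntention(f_act.y, f_ini.y) != 0 ||
                              axisIntention(f_act.z, f_ini.z) != 0;

    // Resultant force, checked against the resistance and safety thresholds.
    const double f_mag = std::sqrt(f_act.x * f_act.x + f_act.y * f_act.y + f_act.z * f_act.z);
    if (!hasIntention || f_mag <= F_RES || f_mag >= F_MAX)
        return false;

    // Illustrative proportional force-velocity mapping, clamped to a safe speed
    // and directed along the relative force change.
    const double k_mag = std::sqrt(k.x * k.x + k.y * k.y + k.z * k.z);
    const double speed = std::min(K_VEL * f_mag, V_MAX);
    v_cmd = { speed * k.x / k_mag, speed * k.y / k_mag, speed * k.z / k_mag };
    return true;
}

int main() {
    const Vec3 f_ini {0.5, -0.2, 0.3};   // forces recorded at rest
    const Vec3 f_act {6.5, -0.2, 0.3};   // current filtered forces
    Vec3 v_cmd {};
    if (intentionToVelocity(f_act, f_ini, v_cmd))
        std::printf("move with v = (%.3f, %.3f, %.3f) m/s\n", v_cmd.x, v_cmd.y, v_cmd.z);
    else
        std::printf("no movement intention detected\n");
    return 0;
}
```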
Experimental validation was carried out with a UR10 collaborative robot equipped with an OptoForce six-axis force sensor. Patients performed tasks such as reaching for a cup while the robot followed their movement intentions, with the force data enabling real-time adjustments. Fitting the collected data revealed multiple frequency components, summarized in Table 1.
Table 1. Frequency components obtained from data fitting

| Order | Frequency (rad/s) |
|---|---|
| 1 | 1.061 |
| 2 | 7.072 |
| 3 | 2.854 |
| 4 | 7.959 |
| 5 | 4.054 |
| 6 | 13.91 |
| 7 | 9.548 |
| 8 | 9.543 |
The second-order low-pass filter effectively suppressed this noise, as shown by the step response analysis. Comparing step responses for damping ratios of 0.5, 0.7, 0.8, and 1.0 showed that $\xi = 0.7$ provides the best balance between response time and stability. With this filter, the six-axis force sensor data was reliable enough for intention recognition, and force changes correctly triggered robot movements. In tests, the system guided patients through tasks such as fetching a cup, demonstrating that the sensor-based approach can interpret movement intentions in real time.
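The comparison can be reproduced numerically with a short simulation. The sketch below computes the unit-step response of the filter for each damping ratio and reports the percentage overshoot and settling time; the 2% settling band and the 1 ms simulation step are our own illustrative choices rather than values from the experiment.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Simulate the unit-step response of G(s) = wn^2 / (s^2 + 2*xi*wn*s + wn^2)
// and report the percentage overshoot and the 2% settling time for several
// damping ratios.
int main() {
    const double wn = 2.90;                 // fitted cutoff frequency [rad/s]
    const double T = 0.001;                 // simulation step [s]
    const double t_end = 10.0;              // simulation horizon [s]
    const double xis[] = {0.5, 0.7, 0.8, 1.0};

    for (double xi : xis) {
        double y = 0.0, dy = 0.0, peak = 0.0, settling = 0.0;
        const int steps = static_cast<int>(t_end / T);
        for (int i = 0; i < steps; ++i) {
            const double ddy = wn * wn * (1.0 - y) - 2.0 * xi * wn * dy;  // unit-step input
            y  += T * dy;
            dy += T * ddy;
            peak = std::max(peak, y);
            if (std::fabs(y - 1.0) > 0.02)  // last time outside the 2% band
                settling = (i + 1) * T;
        }
        std::printf("xi = %.1f  overshoot = %4.1f%%  settling (2%%) = %.2f s\n",
                    xi, 100.0 * std::max(0.0, peak - 1.0), settling);
    }
    return 0;
}
```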
Our approach leverages the six-axis force sensor’s high sensitivity and stability to overcome the limitations of biological signals such as EMG, which are prone to variability and interference. The velocity control model maps the measured force to the robot’s velocity so that motion is adjusted dynamically, enhancing safety and precision. For instance, the velocity $v$ in the `movel` command is derived from the force components and constrained to prevent excessive speeds. The algorithm continuously monitors the forces and moments and updates the target pose at each control step to ensure seamless interaction.
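To make this concrete, the sketch below formats one such velocity-limited `movel` call, assuming the standard URScript `movel(pose, a, v)` signature; the target pose, the acceleration, the 0.25 m/s clamp, and sending the resulting string over the controller's script socket are illustrative assumptions, not details given in the text.

```cpp
#include <algorithm>
#include <cstdio>
#include <string>

// UR tool pose: position [m] and axis-angle rotation vector [rad].
struct Pose { double x, y, z, rx, ry, rz; };

// Build a URScript movel command for the next target pose. The speed argument
// is clamped so that the force-derived velocity can never exceed v_max.
std::string makeMovelCommand(const Pose& target, double v,
                             double v_max = 0.25, double accel = 0.5) {
    const double v_safe = std::max(0.0, std::min(v, v_max));
    char buf[160];
    std::snprintf(buf, sizeof(buf),
                  "movel(p[%.4f, %.4f, %.4f, %.4f, %.4f, %.4f], a=%.3f, v=%.3f)\n",
                  target.x, target.y, target.z, target.rx, target.ry, target.rz,
                  accel, v_safe);
    return std::string(buf);
}

int main() {
    // Hypothetical next target pose, e.g. the current pose shifted along the
    // detected movement direction by one control step.
    const Pose target {0.40, -0.15, 0.30, 0.0, 3.14, 0.0};
    const double v_from_force = 0.31;   // would exceed the assumed 0.25 m/s limit
    const std::string cmd = makeMovelCommand(target, v_from_force);

    // In the real system the string would be sent to the UR controller over
    // its script socket interface; here it is only printed.
    std::printf("%s", cmd.c_str());
    return 0;
}
```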
In conclusion, the integration of a six-axis force sensor in rehabilitation robotics offers a robust solution for movement intention recognition. Our system reliably detects the direction and magnitude of applied forces and converts them into robot velocities, facilitating effective upper-limb rehabilitation. Future work will focus on optimizing trajectory planning and adapting the system to diverse patient-specific movements. The six-axis force sensor remains pivotal in advancing human-robot interaction, providing a foundation for more intelligent and responsive rehabilitation devices.
