Collision Point Detection Using Six-Axis Force Sensor

In the field of robotics, ensuring safety during human-robot interaction is a critical challenge. Traditional methods for detecting collision points on robot bodies often rely on skin-based force sensors, which suffer from limitations such as high cost, low durability, and sensitivity to environmental factors. To address these issues, we propose a novel approach utilizing a six-axis force sensor mounted at the robot base. The method constrains the solution using sensor data alone, eliminating the need for geometric information about the collision surface. By projecting spatial external-force vector lines onto an optimal coordinate plane and solving for the collision point with minimal error, our technique enhances accuracy and robustness. Additionally, we introduce a dynamic force compensation algorithm that keeps the sensor readings at zero during motion in the absence of external collisions, enabling threshold-based collision detection. This article details the methodology, simulation experiments, and results, demonstrating the effectiveness of the approach in simulated collision scenarios.

The integration of sensors into robotic systems is essential for environmental perception, with force information and collision point detection playing pivotal roles. While visual sensors aid in object tracking and capture, collision sensing is crucial for preventing accidents in shared human-robot environments. Skin force sensors, though inspired by biological systems, are impractical due to their complexity and vulnerability. In contrast, six-axis force sensors offer a reliable alternative by measuring three-dimensional force and torque components. Our research focuses on leveraging these sensors for collision point detection without relying on structured probes or surface models, making it applicable to irregular shapes and dynamic conditions.

Collision scenarios on robot bodies can be categorized into single-point external force collisions, mixed force-torque collisions, and multi-point collisions. Our method primarily addresses single-point external force collisions, as multi-point cases can be decomposed into combinations of single-point interactions. The six-axis force sensor is positioned at the robot base to capture force and moment data, which are processed to determine the collision point coordinates. The core equation governing this relationship is derived from the force and torque equilibrium:

$$ \mathbf{M} = \mathbf{P} \times \mathbf{F} $$

where $\mathbf{M}$ represents the moment vector, $\mathbf{F}$ is the force vector, and $\mathbf{P}$ denotes the position vector of the collision point relative to the sensor coordinate system. Expanding this into matrix form yields:

$$ \begin{bmatrix} 0 & F_z & -F_y \\ -F_z & 0 & F_x \\ F_y & -F_x & 0 \end{bmatrix} \begin{bmatrix} P_x \\ P_y \\ P_z \end{bmatrix} = \begin{bmatrix} M_x \\ M_y \\ M_z \end{bmatrix} $$

However, the skew-symmetric coefficient matrix is always singular (its rank is at most two), so a single frame of data does not determine $\mathbf{P}$ uniquely: every point on the force's line of action produces the same moment. To resolve this, we utilize multiple frames of sensor data to construct external force vector lines in space. These lines, denoted $L_c(p)$, are derived from the sensor readings and expressed as:

$$ \frac{x}{F_x} = \frac{y + \frac{M_z}{F_x}}{F_y} = \frac{z - \frac{M_y}{F_x}}{F_z} $$
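To make the construction concrete, the following minimal NumPy sketch recovers a point on the force's line of action from one frame of wrench data. It uses the closed form $p_0 = (\mathbf{F} \times \mathbf{M})/|\mathbf{F}|^2$ for the point nearest the sensor origin, which is consistent with the symmetric line equation above; function and variable names are illustrative, not part of the method.

```python
import numpy as np

def line_of_action(F, M):
    """Point on the external-force line and its unit direction.

    Assumes a single-point contact, so M = P x F and M is perpendicular
    to F. Any point p on the line satisfies p x F = M; the point nearest
    the sensor origin is (F x M) / |F|^2.
    """
    F = np.asarray(F, dtype=float)
    M = np.asarray(M, dtype=float)
    f2 = F @ F
    if f2 < 1e-12:                     # |F| ~ 0: the line is undefined
        raise ValueError("force magnitude too small")
    p0 = np.cross(F, M) / f2           # foot of perpendicular from origin
    return p0, F / np.sqrt(f2)         # line direction = force direction
```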

In ideal conditions, these lines intersect at the collision point $P_c$. In practice, sensor noise means they rarely intersect exactly. We therefore project the lines onto the best-suited coordinate plane (XOY, XOZ, or YOZ), chosen by a selection factor $\eta$ that minimizes the projection error. The plane is selected by comparing the angles between the projected lines:

$$ \theta_{XOY} = \left| \arctan\left(\frac{F_{y1}}{F_{x1}}\right) - \arctan\left(\frac{F_{y2}}{F_{x2}}\right) \right| $$
$$ \theta_{XOZ} = \left| \arctan\left(\frac{F_{z1}}{F_{x1}}\right) - \arctan\left(\frac{F_{z2}}{F_{x2}}\right) \right| $$
$$ \theta_{YOZ} = \left| \arctan\left(\frac{F_{y1}}{F_{z1}}\right) - \arctan\left(\frac{F_{y2}}{F_{z2}}\right) \right| $$

The plane with the maximum angle is selected for projection. For instance, if XOY is optimal, the projected lines $L'_{XOY,1}$ and $L'_{XOY,2}$ are solved to find the intersection point $P'(x', y')$. This point is then mapped back to the original force vector lines to obtain $P_1$ and $P_2$, and the collision point is initially estimated as their midpoint:

$$ P = \frac{1}{2} (P_1 + P_2) $$
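A sketch of the plane-selection and intersection step follows, reusing `line_of_action` from the previous block. It is our own minimal reading of the procedure described above: pick the coordinate plane with the largest projected angle, solve the 2-by-2 intersection system there, lift the line parameters back to 3D, and average.

```python
import numpy as np

def estimate_collision_point(F1, M1, F2, M2):
    """Midpoint estimate of the collision point from two wrench frames."""
    p1, d1 = line_of_action(F1, M1)
    p2, d2 = line_of_action(F2, M2)

    planes = [(0, 1), (0, 2), (2, 1)]     # XOY, XOZ, YOZ coordinate pairs

    def angle(ij):
        i, j = ij
        return abs(np.arctan2(d1[j], d1[i]) - np.arctan2(d2[j], d2[i]))

    i, j = max(planes, key=angle)          # most transverse projection

    # Solve p1 + t1*d1 = p2 + t2*d2 restricted to coordinates (i, j).
    A = np.array([[d1[i], -d2[i]], [d1[j], -d2[j]]])
    b = np.array([p2[i] - p1[i], p2[j] - p1[j]])
    t1, t2 = np.linalg.solve(A, b)

    P1 = p1 + t1 * d1                      # lift back onto each 3D line
    P2 = p2 + t2 * d2
    return 0.5 * (P1 + P2), P1, P2         # midpoint plus both endpoints
```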

To refine the solution, we introduce an error factor $\zeta = (\zeta_{M_x}, \zeta_{M_y}, \zeta_{M_z})$, evaluated at a candidate point $(x'', y'', z'')$, and minimize the absolute error:

$$ \zeta_{M_x} = F_z y'' - F_y z'' - M_x $$
$$ \zeta_{M_y} = F_x z'' - F_z x'' - M_y $$
$$ \zeta_{M_z} = F_y x'' - F_x y'' - M_z $$

The optimal collision point is the one that minimizes $|\zeta| = \sqrt{\zeta_{M_x}^2 + \zeta_{M_y}^2 + \zeta_{M_z}^2}$. This approach ensures high accuracy even with sensor imperfections.
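The error-factor evaluation is direct to implement. The text does not specify how candidate points are generated, so the commented usage below simply scans the segment between $P_1$ and $P_2$ from the previous sketch and keeps the minimizer; `zeta_norm` evaluates $|\zeta|$ against one frame of wrench data.

```python
import numpy as np

def zeta_norm(F, M, p):
    """|zeta| at a candidate collision point p = (x'', y'', z'')."""
    Fx, Fy, Fz = F
    Mx, My, Mz = M
    zeta = np.array([Fz * p[1] - Fy * p[2] - Mx,
                     Fx * p[2] - Fz * p[0] - My,
                     Fy * p[0] - Fx * p[1] - Mz])
    return np.linalg.norm(zeta)

# One possible search (P1, P2, F1, M1 as in the sketch above):
# candidates = [P1 + s * (P2 - P1) for s in np.linspace(0.0, 1.0, 101)]
# best = min(candidates, key=lambda p: zeta_norm(F1, M1, p))
```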

Prior to collision detection, dynamic force compensation is essential to account for the robot’s own weight and payload effects. The six-axis force sensor at the base must read zero during motion without external collisions. We employ the Denavit-Hartenberg (D-H) parameters to model the robot’s kinematics. The homogeneous transformation matrix between consecutive joints is given by:

$$ ^{i-1}_i T = \begin{bmatrix} \cos\theta_i & -\sin\theta_i & 0 & a_{i-1} \\ \cos\alpha_{i-1}\sin\theta_i & \cos\alpha_{i-1}\cos\theta_i & -\sin\alpha_{i-1} & -d_i \sin\alpha_{i-1} \\ \sin\alpha_{i-1}\sin\theta_i & \sin\alpha_{i-1}\cos\theta_i & \cos\alpha_{i-1} & d_i \cos\alpha_{i-1} \\ 0 & 0 & 0 & 1 \end{bmatrix} $$
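In code, this transform is a few lines per joint. The sketch below follows the modified D-H convention of the matrix above (twist $\alpha_{i-1}$, link length $a_{i-1}$, offset $d_i$, joint angle $\theta_i$); the function name is ours.

```python
import numpy as np

def dh_transform(alpha_prev, a_prev, d, theta):
    """Homogeneous transform from frame {i-1} to {i}, modified D-H."""
    ca, sa = np.cos(alpha_prev), np.sin(alpha_prev)
    ct, st = np.cos(theta), np.sin(theta)
    return np.array([
        [ct,       -st,      0.0,  a_prev],
        [st * ca,   ct * ca, -sa, -d * sa],
        [st * sa,   ct * sa,  ca,  d * ca],
        [0.0,       0.0,     0.0,  1.0],
    ])
```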

The position of each link's center of mass $c_i$ relative to joint coordinate system $\{i\}$ is denoted by $\mathbf{r}_i$. The transformation to the base frame is computed as:

$$ ^0_{c_i} T = ^0_i T \cdot ^i_{c_i} T $$

Dynamic parameters such as angular velocity $\omega_i$, angular acceleration $\dot{\omega}_i$, linear acceleration $\dot{v}_i$, and center of mass acceleration $\dot{v}_{c_i}$ are derived recursively:

$$ \omega_{i+1} = ^{i+1}_i R \omega_i + \dot{\theta}_{i+1} Z_{i+1} $$
$$ \dot{\omega}_{i+1} = ^{i+1}_i R \dot{\omega}_i + ^{i+1}_i R \omega_i \times \dot{\theta}_{i+1} Z_{i+1} + \ddot{\theta}_{i+1} Z_{i+1} $$
$$ \dot{v}_{i+1} = ^{i+1}_i R \left[ \dot{v}_i + \dot{\omega}_i \times ^i_{i+1} p + \omega_i \times (\omega_i \times ^i_{i+1} p) \right] $$
$$ \dot{v}_{c_{i+1}} = \dot{v}_{i+1} + \dot{\omega}_{i+1} \times r_{i+1} + \omega_{i+1} \times (\omega_{i+1} \times r_{i+1}) $$
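A single outward step of this recursion, for a revolute joint, can be sketched as follows. Frame bookkeeping (supplying the rotation $^{i+1}_i R$ and the origin offset $^i_{i+1}p$) is left to the caller, and the argument names are ours.

```python
import numpy as np

def outward_step(R, w_i, dw_i, dv_i, p, r_next, dtheta, ddtheta):
    """Propagate velocities/accelerations from frame {i} to {i+1}.

    R      : rotation {i+1} <- {i};  p : origin of {i+1} in {i}
    r_next : COM of link i+1 expressed in {i+1}
    """
    z = np.array([0.0, 0.0, 1.0])          # joint axis Z_{i+1} in {i+1}
    w = R @ w_i + dtheta * z
    dw = R @ dw_i + np.cross(R @ w_i, dtheta * z) + ddtheta * z
    dv = R @ (dv_i + np.cross(dw_i, p) + np.cross(w_i, np.cross(w_i, p)))
    dv_c = dv + np.cross(dw, r_next) + np.cross(w, np.cross(w, r_next))
    return w, dw, dv, dv_c
```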

The forces and moments between links are calculated using Newton-Euler equations:

$$ F^{i-1}_i = F^{i+1}_i + m_i \dot{v}_{c_i} - ^{c_i} G_i $$
$$ M^{i-1}_i = M^{i+1}_i - r_{i+1,c_i} \times F^{i+1}_i + r_{i,c_i} \times F^{i-1}_i + I_i \dot{\omega}_i + \omega_i \times I_i \omega_i $$

where $I_i$ is the inertia tensor of link $i$ and $^{c_i} G_i$ is the gravity force acting on its center of mass.
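One inward step then mirrors the two equations directly. As with the outward sketch, rotations between neighboring frames are omitted for brevity, so all vectors are assumed to be expressed in link $i$'s frame; names are illustrative.

```python
import numpy as np

def inward_step(F_next, M_next, m, dv_c, I, w, dw, r_ic, r_next_c, G):
    """Newton-Euler inward step for link i.

    F_next, M_next : force/moment from link i+1 acting on link i
    r_ic, r_next_c : vectors from joints i and i+1 to the COM of link i
    G              : gravity force on link i
    """
    F = F_next + m * dv_c - G
    M = (M_next - np.cross(r_next_c, F_next) + np.cross(r_ic, F)
         + I @ dw + np.cross(w, I @ w))
    return F, M
```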

The force and moment at the base, $^0 f$ and $^0 m$, are transformed to the sensor coordinate system $\{S\}$ using:

$$ \begin{bmatrix} ^S f \\ ^S m \end{bmatrix} = \begin{bmatrix} ^S_0 R & 0 \\ [^S p_0]_{\times} \, ^S_0 R & ^S_0 R \end{bmatrix} \begin{bmatrix} ^0 f \\ ^0 m \end{bmatrix} $$

where $[^S p_0]_{\times}$ is the skew-symmetric (cross-product) matrix of $^S p_0$, the origin of frame $\{0\}$ expressed in $\{S\}$. With $D_f$ and $D_M$ denoting the raw force and moment readings of the sensor, the compensated force and moment are:

$$ \begin{bmatrix} F \\ M \end{bmatrix} = \begin{bmatrix} D_f - ^S f \\ D_M - ^S m \end{bmatrix} $$
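The sensor-frame transformation and subtraction take only a few lines of NumPy; `skew` builds the cross-product matrix written $[^S p_0]_{\times}$ above, and the function names are ours.

```python
import numpy as np

def skew(v):
    """Cross-product (skew-symmetric) matrix of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def compensated_wrench(D_f, D_m, R_s0, p_s0, f0, m0):
    """Raw readings (D_f, D_m) minus the model-predicted base wrench.

    R_s0 : rotation from frame {0} to the sensor frame {S}
    p_s0 : origin of {0} expressed in {S}
    f0, m0 : Newton-Euler force/moment at the base, in frame {0}
    """
    f_s = R_s0 @ f0
    m_s = skew(p_s0) @ (R_s0 @ f0) + R_s0 @ m0
    return D_f - f_s, D_m - m_s
```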

A safety threshold $S$ is defined to detect collisions:

$$ S = \begin{cases} 0 & \text{if every component of } F \text{ lies in } [-\delta_f, \delta_f] \text{ and every component of } M \text{ lies in } [-\delta_m, \delta_m] \\ 1 & \text{otherwise} \end{cases} $$

where $\delta_f$ and $\delta_m$ are empirically set thresholds. For our experiments, $\delta_f = 0.4923 \, \text{N}$.
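Detection itself is then a componentwise threshold test. In the sketch below, $\delta_f$ defaults to the value quoted above, while $\delta_m$ is not reported in the text and must be supplied by the caller.

```python
import numpy as np

def collision_flag(F, M, delta_f=0.4923, delta_m=None):
    """Return S = 1 when any compensated component exceeds its threshold."""
    if np.any(np.abs(F) > delta_f):
        return 1
    if delta_m is not None and np.any(np.abs(M) > delta_m):
        return 1
    return 0
```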

To validate our method, we conducted simulation experiments using a 3-degree-of-freedom robot model. The robot links were symmetric, with material density $\rho = 2.7 \times 10^3 \, \text{kg/m}^3$. The structural parameters are summarized in Table 1.

Table 1: Link Parameters for the 3-DOF Robot

| Link Number | Mass (kg) | Center of Mass Relative Position (mm) |
|---|---|---|
| 0 | 1.598 | (0, 0, 60) |
| 1 | 1.686 | (0, 0, 70.024) |
| 2 | 2.489 | (84.685, 0, 0) |
| 3 | 3.201 | (104.684, 0, 0) |

The inertia tensors for links 2, 3, and 4 (the three moving links, counting the base as link 1; these correspond to links 1-3 in Table 1) are:

$$ I_2 = \begin{bmatrix} 390.122 & 0 & -0.164 \\ 0 & 2.137 & 0 \\ -0.164 & 0 & 391.366 \end{bmatrix} $$
$$ I_3 = \begin{bmatrix} 402.110 & 515.403 & -89.144 \\ 515.403 & 700.313 & -66.861 \\ -89.144 & -66.861 & 1077.722 \end{bmatrix} $$
$$ I_4 = \begin{bmatrix} 123.919 & -292.363 & -103.181 \\ -292.363 & 809.897 & 37.927 \\ -103.181 & 37.927 & 905.914 \end{bmatrix} $$

The robot was programmed to repeat a motion cycle 10 times, with a collision force of 100 N applied at different points at the 3-second mark. The joint angles were set as $\theta_1 = -\pi/6$, $\theta_2 = -\pi/6$, $\theta_3 = -\pi/4$, with accelerations $\ddot{\theta}_1 = 0.3491 \, \text{rad/s}^2$, $\ddot{\theta}_2 = 0.1746 \, \text{rad/s}^2$, $\ddot{\theta}_3 = 0.4363 \, \text{rad/s}^2$. The results are presented in Table 2.

Table 2: Experimental Results for Different Collision Points

| Test No. | Measured Force (N) | Force Components (N) | Collision Position (mm) | Computed Force Components (N) | Computed Position (mm) | Force Absolute Error (N) | Force Relative Error (%) | Position Absolute Error (mm) | Position Relative Error (%) |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 100 | (92.687, 10.349, 36.083) | (40.03, 0, 148.03) | (94.563, 9.675, 37.550) | (40.463, 0.044, 149.926) | (1.876, 0.674, 1.467) | 1.896 | (0.433, 0.044, 1.896) | 1.547 |
| 2 | 100 | (62.349, 73.948, 25.383) | (0, 38.49, 167.03) | (60.381, 72.750, 24.315) | (1.347, 37.643, 167.897) | (1.967, 1.198, 1.068) | 2.380 | (1.347, 0.847, 0.867) | 1.786 |
| 3 | 100 | (-82.944, 48.673, -27.411) | (46.83, 0, 184.16) | (-84.191, 49.607, -29.051) | (47.316, 2.379, 185.839) | (1.247, 0.935, 1.641) | 1.946 | (0.486, 2.379, 1.679) | 2.392 |
| 4 | 100 | (76.894, 14.688, -62.222) | (64.07, 36.83, 264.19) | (78.762, 16.420, -61.654) | (64.827, 39.843, 261.176) | (1.867, 1.732, 0.568) | 2.648 | (0.757, 3.013, 3.014) | 2.854 |
| 5 | 100 | (-16.483, 84.678, -50.576) | (82.64, 6.76, 306.82) | (-18.051, 85.662, -51.423) | (79.171, 9.408, 313.314) | (1.567, 0.983, 0.847) | 1.528 | (3.469, 2.648, 6.494) | 3.968 |
| 6 | 100 | (-45.017, -62.864, 63.416) | (116.79, 69.72, 298.71) | (-46.364, -62.230, 64.451) | (122.101, 76.489, 302.471) | (1.347, 0.635, 1.035) | 0.876 | (5.311, 6.769, 3.761) | 4.826 |
| 7 | 100 | (-86.035, 32.186, 39.523) | (190.46, 80.73, 422.67) | (-87.797, 33.052, 40.670) | (180.787, 92.224, 408.177) | (1.762, 0.865, 1.148) | 2.250 | (9.673, 11.494, 14.493) | 7.578 |
| 8 | 100 | (72.189, 30.161, -62.282) | (213.41, 124.08, 486.49) | (70.713, 29.669, -63.276) | (226.900, 135.715, 503.804) | (1.476, 0.492, 0.994) | 0.580 | (13.490, 11.635, 17.314) | 7.779 |
| 9 | 100 | (-80.793, 52.879, 26.004) | (219.46, 97.61, 519.73) | (-82.517, 54.340, 27.139) | (235.465, 116.015, 505.411) | (1.723, 1.460, 1.135) | 4.893 | (16.005, 18.494, 14.319) | 8.526 |
| 10 | 100 | (-68.494, -24.679, -68.553) | (234.18, 126.49, 536.18) | (-70.634, -23.639, -69.518) | (253.864, 139.958, 555.177) | (2.141, 1.040, 0.965) | 1.886 | (19.684, 13.468, 18.997) | 8.712 |

Error analysis was performed to evaluate the algorithm’s precision. The relative error for force components is defined as:

$$ \delta_F = \frac{\Delta F_n}{|F_C|} \times 100\% \quad (n = x, y, z) $$

where $\Delta F_n$ is the absolute error in each direction, and $|F_C|$ is the magnitude of the measured force vector. The overall force relative error is:

$$ \Delta F_D = \frac{\left| \sqrt{f_{Cx}^2 + f_{Cy}^2 + f_{Cz}^2} - \sqrt{f_{Dx}^2 + f_{Dy}^2 + f_{Dz}^2} \right|}{\sqrt{f_{Cx}^2 + f_{Cy}^2 + f_{Cz}^2}} \times 100\% $$

Similarly, for position errors:

$$ \delta_P = \frac{\Delta P_n}{|P_C|} \times 100\% \quad (n = x, y, z) $$
$$ \Delta P_D = \frac{\Delta P_x + \Delta P_y + \Delta P_z}{\sqrt{P_{Cx}^2 + P_{Cy}^2 + P_{Cz}^2}} \times 100\% $$
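These metrics are straightforward to compute. In the sketch below, the final line reproduces the 1.547% position relative error of test 1 in Table 2 from its measured and computed positions; the function names are ours.

```python
import numpy as np

def force_relative_error(F_meas, F_comp):
    """Delta F_D: relative difference of force magnitudes, in percent."""
    n_meas = np.linalg.norm(F_meas)
    return abs(n_meas - np.linalg.norm(F_comp)) / n_meas * 100.0

def position_relative_error(P_meas, P_comp):
    """Delta P_D: L1 sum of per-axis errors over |P_C|, in percent."""
    dP = np.abs(np.asarray(P_meas, float) - np.asarray(P_comp, float))
    return dP.sum() / np.linalg.norm(P_meas) * 100.0

# Test 1 of Table 2: prints ~1.547, matching the reported value.
print(position_relative_error((40.03, 0, 148.03),
                              (40.463, 0.044, 149.926)))
```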

The results indicate that the dynamic force compensation algorithm maintains consistent accuracy, with force relative errors below 5% across all tests; the maximum, 4.8925%, occurred at the ninth collision point. For collision point detection, the relative error grows with distance from the sensor: the highest position relative error was 8.7119% at the farthest point (598.61 mm from the sensor origin). This demonstrates that while the dynamic force compensation is robust, collision point accuracy degrades with distance because measurement errors are amplified through the longer moment arm.

In conclusion, our method utilizing a six-axis force sensor for collision point detection proves effective in simulation experiments that emulate real-world operating conditions. The dynamic force compensation algorithm keeps the baseline readings at zero during motion, enabling reliable collision identification through thresholding. The collision point detection algorithm, based on projection and error minimization, achieves satisfactory accuracy within practical ranges. Future work will focus on extending the algorithm to multi-point collisions and integrating machine learning for adaptive thresholding. The six-axis force sensor remains a versatile tool for enhancing robot safety in collaborative environments.
