Collision Detection on Robot Outer Surfaces Using Six-Axis Force Sensors

In recent years, human-robot collaboration has become a dominant paradigm in manufacturing, driven by initiatives such as Industry 4.0 and smart manufacturing strategies. Ensuring safety in these collaborative environments is paramount, as robots directly interact with human operators. A critical aspect of this safety is the accurate detection of collision points on a robot’s body, which not only prevents accidents but also facilitates post-collision analysis and repair. Traditional approaches to collision detection, such as vision sensors or infrared detectors, often suffer from limited reliability, environmental sensitivity, and high maintenance costs. In contrast, six-axis force sensors offer a robust solution by measuring multi-dimensional force and torque signals, enabling precise detection of external collisions. This paper presents a novel algorithm for determining collision points on articulated robots by integrating measurements from a six-axis force sensor mounted at the robot base with mathematical models of the robot’s outer surfaces. The approach derives the spatial force line from the sensor data and solves for its intersection points with the robot’s links, categorized into scenarios based on geometric constraints.

The foundation of our method lies in the generalized force and moment vectors detected by the six-axis force sensor. The sensor output is represented as:

$$ \mathcal{F} = \begin{pmatrix} \mathbf{F} \\ \mathbf{M} \end{pmatrix} $$

where $\mathbf{F} = (F_x, F_y, F_z)^T$ is the force vector and $\mathbf{M} = (M_x, M_y, M_z)^T$ is the moment vector. The resultant force magnitude $F_R$ is computed as:

$$ F_R = \sqrt{F_x^2 + F_y^2 + F_z^2} $$

The direction vector $\mathbf{u}$ and its direction cosines are given by:

$$ \mathbf{u} = (\cos \alpha, \cos \beta, \cos \gamma) $$

with:

$$ \cos \alpha = \frac{F_x}{F_R}, \quad \cos \beta = \frac{F_y}{F_R}, \quad \cos \gamma = \frac{F_z}{F_R} $$

A point $\mathbf{P} = (x_P, y_P, z_P)$ on the spatial force line is derived from the moment equation $\mathbf{M} = \mathbf{P} \times \mathbf{F}$. Choosing the point on the line of action closest to the origin, so that $\mathbf{F} \cdot \mathbf{P} = 0$, the identity $\mathbf{F} \times \mathbf{M} = \mathbf{F} \times (\mathbf{P} \times \mathbf{F}) = \mathbf{P} (\mathbf{F} \cdot \mathbf{F}) - \mathbf{F} (\mathbf{F} \cdot \mathbf{P})$ reduces to:

$$ \mathbf{P} = \frac{\mathbf{F} \times \mathbf{M}}{F_R^2} $$

Thus, the spatial force line in the sensor coordinate system is expressed as:

$$ \frac{x - x_P}{\cos \alpha} = \frac{y - y_P}{\cos \beta} = \frac{z - z_P}{\cos \gamma} $$

This can be parameterized as:

$$ x = x_i + t \cos \alpha_i, \quad y = y_i + t \cos \beta_i, \quad z = z_i + t \cos \gamma_i $$

where $(x_i, y_i, z_i)$ is a point on the line expressed in coordinate system $\{i\}$ (in the sensor frame, the point $\mathbf{P}$ above), and the parameter $t$ is the signed distance along the line to potential collision points.
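
As a minimal numeric sketch of these equations (assuming numpy and a reading already expressed in the sensor frame; the function name is illustrative), the helper below recovers the direction cosines and the reference point of the line of action:

```python
import numpy as np

def force_line_from_wrench(F, M, eps=1e-9):
    """Recover the spatial force line from a six-axis reading (F, M).

    Returns (u, P): the unit direction u = (cos a, cos b, cos g) and the
    point P = (F x M) / F_R^2 on the line of action closest to the origin.
    """
    F, M = np.asarray(F, dtype=float), np.asarray(M, dtype=float)
    F_R = np.linalg.norm(F)          # resultant force magnitude
    if F_R < eps:
        raise ValueError("no significant external force detected")
    u = F / F_R                      # direction cosines (cos a, cos b, cos g)
    P = np.cross(F, M) / F_R**2      # reference point on the line
    return u, P

# A 20 N force along +x applied at (0, 0, 0.5) m gives M = P x F = (0, 10, 0):
u, P = force_line_from_wrench([20.0, 0.0, 0.0], [0.0, 10.0, 0.0])
print(u, P)   # [1. 0. 0.] [0.  0.  0.5]
```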

To detect collisions on a specific link $i$, the force and moment vectors are transformed from the sensor frame $\{S\}$ to the link frame $\{i\}$ using:

$$ {}^i\mathcal{F} = {}^i_S\mathbf{T} \, {}^S\mathcal{F} $$

The transformation matrix ${}^i_S\mathbf{T}$ is defined as:

$$ {}^i_S\mathbf{T} = \begin{pmatrix} {}^i_S\mathbf{R} & \mathbf{0} \\ \mathbf{S}({}^i\mathbf{P}_S) \, {}^i_S\mathbf{R} & {}^i_S\mathbf{R} \end{pmatrix} $$

where ${}^i_S\mathbf{R}$ is the rotation matrix from $\{S\}$ to $\{i\}$ and $\mathbf{S}({}^i\mathbf{P}_S)$ is the skew-symmetric matrix of ${}^i\mathbf{P}_S$, the position of the sensor-frame origin expressed in $\{i\}$.
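
As a small sketch of this mapping (a direct transcription of the block matrix above into numpy; the function names are illustrative, and $R$ and ${}^i\mathbf{P}_S$ are assumed to come from the robot’s forward kinematics):

```python
import numpy as np

def skew(p):
    """Skew-symmetric matrix S(p), so that skew(p) @ v == np.cross(p, v)."""
    return np.array([[0.0,  -p[2],  p[1]],
                     [p[2],  0.0,  -p[0]],
                     [-p[1], p[0],  0.0]])

def wrench_to_link_frame(F_S, M_S, R_iS, p_iS):
    """Apply the block transformation: F_i = R @ F_S and
    M_i = S(p) @ R @ F_S + R @ M_S, with R the rotation {S}->{i}
    and p the origin of {S} expressed in {i}."""
    F_i = R_iS @ np.asarray(F_S, dtype=float)
    M_i = skew(p_iS) @ F_i + R_iS @ np.asarray(M_S, dtype=float)
    return F_i, M_i
```

The spatial force line in frame $\{i\}$ is then solved for intersections with the link’s outer surface equation. For instance, if the link has a cylindrical surface, the equation is: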

$$ y^2 + z^2 = r_i^2, \quad L_{i1} \leq x \leq L_{i2} $$

where the $x$-axis of frame $\{i\}$ lies along the link axis, $r_i$ is the link radius, and $L_{i1}$, $L_{i2}$ define the axial bounds. Substituting the parametric line equations into the surface equation yields a quadratic in $t$:

$$ (\cos^2 \beta_i + \cos^2 \gamma_i) t^2 + 2 (y_i \cos \beta_i + z_i \cos \gamma_i) t + (y_i^2 + z_i^2 - r_i^2) = 0 $$

The discriminant $\Delta$ determines the nature of solutions:

$$ \Delta = b^2 - 4ac $$

where $a = \cos^2 \beta_i + \cos^2 \gamma_i$, $b = 2(y_i \cos \beta_i + z_i \cos \gamma_i)$, and $c = y_i^2 + z_i^2 - r_i^2$. Depending on $\Delta$ and the $x$-coordinate constraints, collisions are categorized into nine cases, as summarized in the table below. Each case provides the number of collision points and their coordinates in frame $\{i\}$, which are then transformed to the base frame using homogeneous transformations.

Collision Cases for Cylindrical Links

| Case | $\Delta$ condition | $x$ constraints | Collision points | Coordinates in frame $\{i\}$ |
|------|--------------------|-----------------|------------------|------------------------------|
| 1 | $\Delta < 0$ | any | 0 | none |
| 2 | $\Delta = 0$ | $x_{s5} < L_{i1}$ or $x_{s5} > L_{i2}$ | 0 | none |
| 3 | $\Delta = 0$ | $L_{i1} \leq x_{s5} \leq L_{i2}$ | 1 | $\mathbf{p} = (x_{s5}, y_{s5}, z_{s5})^T$ |
| 4 | $\Delta > 0$ | $x_{s1} < L_{i1}$ and $x_{s2} < L_{i1}$, or $x_{s1} > L_{i2}$ and $x_{s2} > L_{i2}$ | 0 | none |
| 5 | $\Delta > 0$ | $x_{s1} = L_{i1}$ and $x_{s2} < L_{i1}$, or $x_{s1} = L_{i2}$ and $x_{s2} > L_{i2}$ | 1 | $\mathbf{p} = (L_{i1}, y_{s1}, z_{s1})^T$ or $(L_{i2}, y_{s1}, z_{s1})^T$ |
| 6 | $\Delta > 0$ | $L_{i1} < x_{s1} \leq L_{i2}$, with $x_{s2} < L_{i1}$ or $x_{s2} > L_{i2}$ | 2 | $\mathbf{p}_1 = (x_{s1}, y_{s1}, z_{s1})^T$; $\mathbf{p}_2 = (L_{i1}, y_{s3}, z_{s3})^T$ or $(L_{i2}, y_{s4}, z_{s4})^T$ |
| 7 | $\Delta > 0$ | $L_{i1} < x_{s1} \leq L_{i2}$ and $L_{i1} \leq x_{s2} < L_{i2}$ | 2 | $\mathbf{p}_1 = (x_{s1}, y_{s1}, z_{s1})^T$, $\mathbf{p}_2 = (x_{s2}, y_{s2}, z_{s2})^T$ |
| 8 | $\Delta > 0$ | $x_{s1} < L_{i1}$ and $x_{s2} > L_{i2}$ | 2 | $\mathbf{p}_1 = (L_{i1}, y_{s3}, z_{s3})^T$, $\mathbf{p}_2 = (L_{i2}, y_{s4}, z_{s4})^T$ |
| 9 | $\Delta > 0$ | $x_{s1} = L_{i1}$ and $x_{s2} = L_{i2}$ | 2 | $\mathbf{p}_1 = (L_{i1}, y_{s1}, z_{s1})^T$, $\mathbf{p}_2 = (L_{i2}, y_{s2}, z_{s2})^T$ |

Here $(x_{s1}, y_{s1}, z_{s1})$ and $(x_{s2}, y_{s2}, z_{s2})$ denote the two roots of the quadratic on the lateral surface, $(L_{i1}, y_{s3}, z_{s3})$ and $(L_{i2}, y_{s4}, z_{s4})$ the crossings of the line with the end planes, and $(x_{s5}, y_{s5}, z_{s5})$ the tangent point arising when $\Delta = 0$.
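
In code, the nine cases collapse to a few numeric checks. The sketch below is one possible implementation for a single capped cylindrical link, assuming the line (point `p0`, direction `u`) is already expressed in frame $\{i\}$: roots on the lateral surface are kept when their $x$ lies within $[L_{i1}, L_{i2}]$, and end-plane crossings stand in for roots pruned by the bounds, reproducing the point sets of the table rather than enumerating the nine cases explicitly.

```python
import numpy as np

def cylinder_collision_points(p0, u, r, L1, L2, eps=1e-9):
    """Candidate collision points of the line x(t) = p0 + t*u with the
    capped cylinder y^2 + z^2 = r^2, L1 <= x <= L2, in link frame {i}."""
    p0, u = np.asarray(p0, dtype=float), np.asarray(u, dtype=float)

    # Quadratic a*t^2 + b*t + c = 0 on the lateral surface.
    a = u[1]**2 + u[2]**2
    b = 2.0 * (p0[1]*u[1] + p0[2]*u[2])
    c = p0[1]**2 + p0[2]**2 - r**2

    pts = []
    if a > eps:
        disc = b*b - 4.0*a*c                       # the discriminant Delta
        if disc >= 0.0:
            for t in ((-b - np.sqrt(disc)) / (2.0*a),
                      (-b + np.sqrt(disc)) / (2.0*a)):
                q = p0 + t*u
                if L1 - eps <= q[0] <= L2 + eps:
                    pts.append(q)                  # hit on the lateral surface

    # End-plane crossings recover the points pruned by the x-bounds check.
    if abs(u[0]) > eps:
        for x_end in (L1, L2):
            t = (x_end - p0[0]) / u[0]
            q = p0 + t*u
            if q[1]**2 + q[2]**2 <= r**2 + eps:
                pts.append(q)                      # crossing inside an end cap

    # Merge near-duplicates (tangency with Delta = 0, cap-edge hits).
    unique = []
    for q in pts:
        if all(np.linalg.norm(q - w) > 1e-6 for w in unique):
            unique.append(q)
    return unique                                  # 0, 1 or 2 points
```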

The algorithm proceeds by iterating through each link, computing potential intersections, and verifying constraints. If a collision point is found, it is transformed to the base frame via:

$$ \begin{pmatrix} {}^0\mathbf{p} \\ 1 \end{pmatrix} = {}^0_i\mathbf{T} \begin{pmatrix} {}^i\mathbf{p} \\ 1 \end{pmatrix} $$

where ${}^0_i\mathbf{T}$ is the homogeneous transformation from frame $\{i\}$ to the base frame.
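
Putting the pieces together, the loop below sketches one detection pass, reusing `force_line_from_wrench`, `wrench_to_link_frame`, and `cylinder_collision_points` from the earlier sketches; the per-link dictionary fields are illustrative placeholders for the kinematic data, not names from the original method.

```python
import numpy as np

def detect_collision_points(F_S, M_S, links):
    """One detection pass over all links.

    Each entry of `links` carries the frame data for one link:
      R_iS, p_iS : rotation and translation taking wrenches from {S} to {i}
      r, L1, L2  : cylinder radius and axial bounds in frame {i}
      T_0i       : 4x4 homogeneous transform from {i} to the base frame {0}
    Returns all candidate collision points expressed in {0}.
    """
    hits = []
    for link in links:
        # Express the measured wrench in the link frame {i}.
        F_i, M_i = wrench_to_link_frame(F_S, M_S, link["R_iS"], link["p_iS"])
        # Recover the force line in {i} and intersect it with the surface.
        u, p0 = force_line_from_wrench(F_i, M_i)
        for q in cylinder_collision_points(p0, u, link["r"],
                                           link["L1"], link["L2"]):
            q0 = link["T_0i"] @ np.append(q, 1.0)  # homogeneous point, {i} -> {0}
            hits.append(q0[:3])
    return hits
```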

Experimental validation was conducted using a six-degree-of-freedom articulated robot with a six-axis force sensor installed at its base. The sensor, capable of resolving forces and moments with high precision, was connected to data acquisition systems to monitor interactions. The robot was programmed to assume specific joint configurations, and collisions were simulated by applying known forces at designated points on the links. For example, weights were suspended from various locations and released to impart controlled impulses, with the sensor data captured at the moment of release. The measured force and moment vectors were input into the collision detection algorithm to compute the collision points, which were then compared to the actual impact locations.

The results demonstrated that the algorithm accurately identified collision points across different scenarios. The error analysis considered the absolute and relative errors in each coordinate direction. The relative error $\delta P_n$ for position $n$ was calculated as:

$$ \delta P_n = \frac{\Delta p_x + \Delta p_y + \Delta p_z}{\sqrt{p_{Cx}^2 + p_{Cy}^2 + p_{Cz}^2}} \times 100\% $$

where $\Delta p_m = |p_{Jm} - p_{Cm}|$ for $m = x, y, z$, with $p_J$ being the computed position and $p_C$ the actual position. The tests showed that relative errors decreased with increasing collision force, remaining below 6% in all cases. Additionally, collisions closer to the six-axis force sensor resulted in higher accuracy, underscoring the sensor’s sensitivity to proximal events.
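
For completeness, this metric is a one-liner in numpy (the numbers in the usage line are invented for illustration):

```python
import numpy as np

def relative_position_error(p_computed, p_actual):
    """Relative error: sum of absolute per-axis deviations over the norm
    of the actual collision position, as a percentage."""
    p_J, p_C = np.asarray(p_computed, float), np.asarray(p_actual, float)
    return np.abs(p_J - p_C).sum() / np.linalg.norm(p_C) * 100.0

# A 5 mm total deviation at a point 0.5 m from the base gives 1.0 %:
print(relative_position_error([0.502, 0.001, 0.002], [0.5, 0.0, 0.0]))
```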

Our findings highlight the efficacy of using a six-axis force sensor for real-time collision detection in articulated robots. The algorithm’s ability to precisely locate impact points enhances safety in human-robot collaboration by enabling rapid response and analysis. Future work will focus on optimizing the computational efficiency for real-time applications and extending the method to robots with complex surface geometries. The integration of six-axis force sensors into collaborative robotics represents a significant step toward safer and more intelligent manufacturing environments.

In conclusion, the proposed method leverages the high-fidelity data from a six-axis force sensor to solve for collision points through geometric intersections. By categorizing collisions into distinct cases based on algebraic solutions, the algorithm ensures robust detection across various scenarios. The experimental results validate the theoretical framework, demonstrating that the six-axis force sensor is a critical component for advancing robot safety. As human-robot collaboration continues to evolve, the role of six-axis force sensors in enabling precise environmental interaction will only grow in importance.
