In modern manufacturing, industrial robots play a pivotal role in automating repetitive, hazardous, and precision-demanding tasks. However, traditional teaching methods, such as using teach pendants, require extensive operator expertise and time to achieve accurate path planning. Direct teaching, where an operator physically guides the robot to define trajectories, offers a more intuitive and efficient alternative. Despite its advantages, most commercial robots lack built-in support for direct teaching due to proprietary control systems. To address this, we propose a force guidance teaching method leveraging a six-axis force sensor. This approach monitors forces and torques in real time, compensates for gravitational effects, and converts the sensory data into motion commands for the robot. Our system enhances accessibility and reduces the learning curve for operators, making it suitable for educational and industrial settings. This paper details the hardware setup, data processing algorithms, experimental validation, and the broader implications of our method.
The core of our system is the integration of a six-axis force sensor, which provides high-fidelity measurements of forces and torques in three-dimensional space. This sensor is critical for capturing the operator’s intent during direct teaching. We designed a comprehensive platform comprising an industrial robot, the sensor, signal conditioning hardware, and a PC for data processing and control. The robot used in our experiments is a six-degree-of-freedom (DOF) model, similar to common industrial arms, with its kinematic parameters defined using the Denavit-Hartenberg (D-H) convention. The D-H parameters are summarized in Table 1 and facilitate the forward-kinematics computations essential for coordinate transformations.
| Joint | θ (rad) | a (mm) | d (mm) | α (rad) |
|---|---|---|---|---|
| 1 | θ₁ | 0 | 0 | -π/2 |
| 2 | θ₂ | 270 | 0 | 0 |
| 3 | θ₃ | 70 | 0 | -π/2 |
| 4 | θ₄ | 0 | 302 | π/2 |
| 5 | θ₅ | 0 | 0 | -π/2 |
| 6 | θ₆ + π | 0 | 72 | 0 |
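The D-H parameters above map directly to homogeneous transforms. The following sketch (Python with NumPy, assuming the standard distal D-H convention; function names are illustrative) builds the per-joint transform and chains them into the base-to-flange forward kinematics:

```python
import numpy as np

def dh_transform(theta, a, d, alpha):
    """Homogeneous transform for one joint under the standard D-H convention."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Table 1 parameters as (a [mm], d [mm], alpha [rad]); theta_6 carries a +pi offset.
DH = [
    (0.0,     0.0, -np.pi / 2),
    (270.0,   0.0,  0.0),
    (70.0,    0.0, -np.pi / 2),
    (0.0,   302.0,  np.pi / 2),
    (0.0,     0.0, -np.pi / 2),
    (0.0,    72.0,  0.0),
]

def forward_kinematics(joints):
    """Base-to-flange 4x4 transform for six joint angles in radians."""
    T = np.eye(4)
    for i, (q, (a, d, alpha)) in enumerate(zip(joints, DH)):
        theta = q + np.pi if i == 5 else q  # joint 6 offset from Table 1
        T = T @ dh_transform(theta, a, d, alpha)
    return T
```

The rotation block of the result supplies the \( (n, o, a) \) columns used later for gravity compensation.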
The hardware configuration includes the industrial robot, a six-axis force sensor mounted at the end-effector, an amplifier to boost sensor signals, a data acquisition card for analog-to-digital conversion, and a handheld tool attached to the sensor for operator interaction. The six-axis force sensor is aligned such that its z-axis coincides with the robot’s sixth joint axis, and its x and y axes are parallel to the tool coordinate system. This alignment ensures consistent force and torque measurements relative to the robot’s frame. The amplifier converts millivolt-level outputs from the sensor into volt-level signals, which are then sampled by the data acquisition card at high frequencies. The PC processes these signals in real-time using custom algorithms and communicates with the robot controller via Ethernet-based Socket communication for low-latency command transmission. The following figure illustrates the setup, showing the integration of the six-axis force sensor with the robot end-effector.

Data processing begins with converting raw sensor readings into meaningful force and torque values. The data acquisition card outputs digital codes representing analog voltages, which are transformed using the following equation:
$$ V = \frac{5000}{2^{16}} \times \text{ADbuffer}[n] - 2500 $$
Here, \( V \) is the analog voltage in millivolts, \( \text{ADbuffer}[n] \) is the raw digital value from channel \( n \) (where \( n = 0 \) to \( 5 \) for the six channels), and the constant 16 reflects the card’s 16-bit resolution. This conversion maps the 16-bit codes onto the card’s ±2500 mV input range. Next, the voltage signals are mapped to force and torque components using a calibration matrix provided by the sensor manufacturer. The transformation is given by:
$$ \mathbf{F} = \mathbf{C} \begin{bmatrix} V_0 & V_1 & V_2 & V_3 & V_4 & V_5 \end{bmatrix}^T $$
where \( \mathbf{F} = [F_x, F_y, F_z, T_x, T_y, T_z]^T \) represents the force and torque vector in the sensor coordinate system, and \( \mathbf{C} \) is a 6×6 constant calibration matrix. For our six-axis force sensor, this matrix is:
$$ \mathbf{C} = \begin{bmatrix}
0.03093 & 0.01615 & 0.00556 & -13.82205 & -0.45852 & 14.19659 \\
-0.26785 & 16.10155 & -0.05010 & -7.99819 & 0.21523 & -8.21116 \\
24.96838 & -1.09941 & 24.89981 & -1.21617 & 25.31232 & -1.04139 \\
-0.00232 & 0.19398 & -0.72411 & -0.06257 & 0.73102 & -0.12736 \\
0.82662 & -0.03814 & -0.41505 & 0.18887 & -0.42087 & -0.15352 \\
0.00893 & -0.42968 & 0.00145 & -0.43146 & 0.01053 & -0.44103
\end{bmatrix} $$
This step ensures accurate decoding of the physical forces and torques applied by the operator, which is fundamental for subsequent compensation and control algorithms.
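The two decoding steps can be sketched as follows (Python with NumPy; the function names and the constants' packaging are illustrative, while the transfer function and the matrix-vector mapping follow the equations above):

```python
import numpy as np

ADC_BITS = 16
FULL_SCALE_MV = 5000.0  # the card spans -2500..+2500 mV

def codes_to_millivolts(ad_buffer):
    """Map raw 16-bit ADC codes to millivolts per the card's transfer function."""
    codes = np.asarray(ad_buffer, dtype=float)
    return FULL_SCALE_MV / 2**ADC_BITS * codes - 2500.0

def voltages_to_wrench(volts, C):
    """Decode the six channel voltages into [Fx, Fy, Fz, Tx, Ty, Tz] using the
    manufacturer's 6x6 calibration matrix C."""
    return np.asarray(C) @ np.asarray(volts)
```

In practice the two functions run back to back on every acquisition cycle, producing the wrench in the sensor coordinate system.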
Gravity compensation is crucial to isolate the operator-applied forces from the weight of the handheld tool. The tool’s gravity vector in the base coordinate system is \( [0, 0, G]^T \), where \( G \) is the gravitational force magnitude. As the robot moves, the tool’s orientation changes, causing the gravity components in the sensor frame to vary. We compute the real-time gravity compensation values using the inverse of the transformation matrix from the base to the sensor coordinate system. The forward kinematics derived from the D-H parameters yield the transformation matrix:
$$ \text{Base}_{\text{Sensor}}\mathbf{T} = \mathbf{T}_1^0 \cdot \mathbf{T}_2^1 \cdot \mathbf{T}_3^2 \cdot \mathbf{T}_4^3 \cdot \mathbf{T}_5^4 \cdot \mathbf{T}_6^5 \cdot \mathbf{T}_{\text{Sensor}}^6 = \begin{bmatrix}
n_x & o_x & a_x & p_x \\
n_y & o_y & a_y & p_y \\
n_z & o_z & a_z & p_z \\
0 & 0 & 0 & 1
\end{bmatrix} $$
Here, \( \mathbf{T}_{n+1}^n \) denotes the transformation from joint \( n \) to \( n+1 \), and the resulting 4×4 matrix includes position \( (p_x, p_y, p_z) \) and orientation \( (n, o, a) \) data. The gravity components in the sensor frame are then:
$$ \begin{bmatrix} F_{x1} \\ F_{y1} \\ F_{z1} \end{bmatrix} = \begin{bmatrix} n_x & o_x & a_x \\ n_y & o_y & a_y \\ n_z & o_z & a_z \end{bmatrix}^{-1} \begin{bmatrix} 0 \\ 0 \\ G \end{bmatrix} $$
where \( [F_{x1}, F_{y1}, F_{z1}]^T \) are the compensation values subtracted from the measured forces to obtain the net operator-applied forces. Since the 3×3 matrix is a pure rotation, its inverse is simply its transpose, which keeps the computation cheap. This algorithm runs continuously on the PC to account for dynamic pose changes.
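A minimal sketch of this compensation step (Python with NumPy; function names are illustrative, and `R_base_sensor` is assumed to be the rotation block of the base-to-sensor transform):

```python
import numpy as np

def gravity_force_in_sensor_frame(R_base_sensor, G):
    """Project the tool's gravity vector [0, 0, G] from the base frame into
    the sensor frame; for a rotation matrix the inverse is the transpose."""
    g_base = np.array([0.0, 0.0, G])
    return R_base_sensor.T @ g_base

def compensate_forces(measured_forces, R_base_sensor, G):
    """Subtract the pose-dependent gravity components from the raw forces,
    leaving only the operator-applied forces."""
    return np.asarray(measured_forces) - gravity_force_in_sensor_frame(
        R_base_sensor, G)
```

As the robot's pose changes, `R_base_sensor` is recomputed from forward kinematics on every cycle, so the compensation tracks the tool's orientation.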
Gravitational torque compensation addresses the moments induced by the tool’s weight. The center of mass of the handheld tool relative to the sensor frame must be determined accurately. Initially, we measure the static torques caused by gravity and solve for the center of mass position vector \( [d_x, d_y, d_z]^T \) using:
$$ \begin{bmatrix} M_{x0} \\ M_{y0} \\ M_{z0} \end{bmatrix} = \begin{bmatrix}
0 & -F_{z0} & F_{y0} \\
F_{z0} & 0 & -F_{x0} \\
-F_{y0} & F_{x0} & 0
\end{bmatrix} \begin{bmatrix} d_x \\ d_y \\ d_z \end{bmatrix} $$
Here, \( [M_{x0}, M_{y0}, M_{z0}]^T \) are the initial gravitational torques, and \( [F_{x0}, F_{y0}, F_{z0}]^T \) are the corresponding gravity forces. Because this skew-symmetric coefficient matrix is singular, a single pose does not determine the position vector uniquely; measurements taken at multiple poses are combined and averaged to improve accuracy. During operation, the real-time gravitational torques are computed as:
$$ \begin{bmatrix} M_{xt} \\ M_{yt} \\ M_{zt} \end{bmatrix} = \begin{bmatrix}
0 & -F_{zt} & F_{yt} \\
F_{zt} & 0 & -F_{xt} \\
-F_{yt} & F_{xt} & 0
\end{bmatrix} \begin{bmatrix} d_x \\ d_y \\ d_z \end{bmatrix} $$
where \( [M_{xt}, M_{yt}, M_{zt}]^T \) are the compensation values subtracted from the measured torques. This ensures that only the operator-induced torques are used for motion control.
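One way to realize this step is to stack the torque equations from several static poses and solve for the center of mass by least squares, then apply the same skew-symmetric product online. The sketch below (Python with NumPy; the least-squares formulation is our illustrative reading of the multi-measurement averaging described above):

```python
import numpy as np

def skew(f):
    """Skew-symmetric coefficient matrix from a gravity force [Fx, Fy, Fz],
    matching the matrix in the torque equations above."""
    fx, fy, fz = f
    return np.array([[0.0, -fz,  fy],
                     [ fz, 0.0, -fx],
                     [-fy,  fx, 0.0]])

def estimate_center_of_mass(forces, torques):
    """Solve for [dx, dy, dz] from several static poses. A single pose gives
    a rank-deficient (singular) system, so at least two non-parallel gravity
    directions are stacked and solved by least squares."""
    A = np.vstack([skew(f) for f in forces])
    b = np.hstack(torques)
    d, *_ = np.linalg.lstsq(A, b, rcond=None)
    return d

def compensate_torques(measured_torques, gravity_forces, d):
    """Subtract the gravity-induced torques skew(F_t) @ d from the readings."""
    return np.asarray(measured_torques) - skew(gravity_forces) @ d
```

After calibration, `compensate_torques` runs every cycle with the current gravity force components from the force-compensation stage.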
The force and torque data, after compensation, are converted into displacement and rotation commands for the robot. We employ a force-position control algorithm where the translation step size in each direction is calculated as:
$$ \text{step}_i = \frac{F_i}{|F_i|} \times (|F_i| - 3) \times k_i $$
for \( i = x, y, z \). Here, \( F_i \) is the net force in direction \( i \), and \( k_i \) is a scaling factor (set to 0.015 in our experiments). A 3 N force threshold acts as a deadband: the step is zero whenever \( |F_i| \le 3 \) N, so minor disturbances do not actuate the robot. Similarly, the rotation step size is given by:
$$ \theta_{\text{step}_i} = \frac{M_i}{|M_i|} \times (|M_i| - 1) \times k_\theta $$
for \( i = x, y, z \), where \( M_i \) is the net torque, and \( k_\theta = 0.015 \) is the rotation scaling factor. A torque threshold of 1 N·m prevents unwanted rotations. The resulting displacement vector \( \Delta \mathbf{P} = [\Delta x, \Delta y, \Delta z, \Delta R_x, \Delta R_y, \Delta R_z] \) is transmitted to the robot every 10 ms via Socket communication. The data format for transmission is structured as a sequence of values representing the tool coordinate frame adjustments, ensuring smooth and responsive motion.
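The deadband-plus-proportional conversion can be sketched as follows (Python with NumPy; function and constant names are illustrative, while the thresholds and scaling factors are taken from the text):

```python
import numpy as np

FORCE_THRESHOLD = 3.0    # N
TORQUE_THRESHOLD = 1.0   # N*m
K_TRANS = 0.015          # translation scaling factor k_i
K_ROT = 0.015            # rotation scaling factor k_theta

def step_size(value, threshold, k):
    """Zero inside the deadband; otherwise sign(value)*(|value|-threshold)*k."""
    if abs(value) <= threshold:
        return 0.0
    return np.sign(value) * (abs(value) - threshold) * k

def motion_command(forces, torques):
    """Build [dx, dy, dz, dRx, dRy, dRz] from the net forces (N) and
    torques (N*m), ready for transmission every control cycle."""
    trans = [step_size(f, FORCE_THRESHOLD, K_TRANS) for f in forces]
    rot = [step_size(m, TORQUE_THRESHOLD, K_ROT) for m in torques]
    return trans + rot
```

Subtracting the threshold before scaling keeps the step continuous at the deadband boundary, avoiding a jump when the operator's force crosses 3 N.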
To validate our method, we conducted experiments focusing on gravity compensation accuracy and trajectory teaching. The six-axis force sensor was instrumental in capturing real-time data. Before motion, the system recorded the installation stresses and tool weight for compensation. During testing, we monitored the force and torque readings with and without compensation. For instance, uncompensated data showed significant gravity-related offsets, which were effectively nullified by our algorithms. Table 2 summarizes typical force values before and after compensation, demonstrating the efficacy of our approach.
| Parameter | Before Compensation (N or N·m) | After Compensation (N or N·m) |
|---|---|---|
| F_x | 5.2 | 0.1 |
| F_y | -3.8 | -0.2 |
| F_z | 12.5 | 0.3 |
| T_x | 0.9 | 0.0 |
| T_y | -1.2 | -0.1 |
| T_z | 0.5 | 0.0 |
In a trajectory teaching scenario, an operator guided the robot along a rectangular workpiece contour using the handheld tool. The six-axis force sensor detected the applied forces and torques, which were converted into motion commands. The system recorded the taught path points at regular intervals. For trajectory playback, the stored points were sent to the robot, which accurately replicated the path. This demonstrates the method’s capability for complex path following and reproducibility. The use of the six-axis force sensor ensured high precision and responsiveness, with the robot smoothly adhering to the operator’s guidance. Additional tests involved varying the scaling factors \( k_i \) and \( k_\theta \) to optimize for different tasks, such as high-speed tracing or fine adjustments.
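The record-and-playback loop can be sketched as below. This is a minimal illustration: the 10 ms period comes from the text, but `get_pose`, the host/port arguments, and the comma-separated line format are assumptions, not the actual transmission protocol.

```python
import socket
import time

def record_path(get_pose, period_s=0.01, duration_s=5.0):
    """Sample the taught tool pose at a fixed interval (10 ms in the paper).
    `get_pose` is a hypothetical callable returning the current pose."""
    path = []
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        path.append(get_pose())
        time.sleep(period_s)
    return path

def encode_pose(pose):
    """Serialize one pose as a comma-separated ASCII line (format assumed)."""
    return (",".join(f"{v:.3f}" for v in pose) + "\n").encode()

def play_back(path, host, port):
    """Stream the stored poses back to the robot controller over TCP."""
    with socket.create_connection((host, port)) as sock:
        for pose in path:
            sock.sendall(encode_pose(pose))
```

During playback, the controller interpolates between consecutive stored points, which is what makes the replayed contour smooth.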
Our force guidance teaching system based on the six-axis force sensor proves to be a robust and generalizable solution for direct robot teaching. It eliminates the need for robot hardware modifications and reduces dependency on operator expertise. The gravity and gravitational torque compensation algorithms effectively isolate human inputs, enabling precise control. The force-position conversion strategy with thresholds minimizes errors from environmental noise. Experimental results confirm that the method accurately captures and replays trajectories, making it suitable for applications like polishing, welding, or assembly. Future work will focus on enhancing the algorithm for curved surface tracking and integrating adaptive control to handle variable payloads. The six-axis force sensor remains central to these advancements, providing the necessary data fidelity for advanced robotic manipulation.