In the field of civil engineering, steel-concrete composite structures are widely used for their excellent load-bearing capacity, stiffness, and seismic performance. Over time, however, differences in material properties between steel and concrete, such as elastic modulus, Poisson’s ratio, and thermal expansion coefficient, can lead to debonding or void formation at their interface, compromising structural integrity. Traditional inspection of such hidden defects relies heavily on manual techniques, which are inefficient and prone to inaccuracy. To address this, we have developed a magnetic wall-climbing robot that integrates image and acoustic feature recognition for automated detection. The robot navigates vertical surfaces, captures real-time visual data, and analyzes acoustic signatures to identify interface voids. The design emphasizes robustness, efficiency, and precision, making the robot a practical tool for structural health monitoring. In this article, we detail the overall design, the hardware and software systems, and the integration of the image and acoustic modules, emphasizing how robot technology enhances inspection capabilities.
The wall-climbing robot has a flat profile to minimize wind resistance and improve grip, measuring approximately 50 cm in length, 20 cm in width, and 10 cm in height. The chassis is constructed from 3 mm thick square aluminum alloy, custom-cut by laser to accommodate components such as the battery and control systems. The body shell is 3D-printed in polylactic acid (PLA) thermoplastic, which reduces weight while providing sufficient strength and rigidity; the material also offers dust and water resistance, protecting the internal electronics.

The robot comprises five key modules: the power system, drive system, magnetic adsorption device, image transmission module, and acoustic recognition module. The power system uses a 4S Li-Po battery, composed of four 3.7 V cells in series, delivering up to 30 minutes of operation. The drive system couples Hall-effect DC motors directly to the four driven wheels, enabling movement on both horizontal and vertical surfaces at a maximum speed of 1 m/s with an error margin of ±0.05 m/s. Steering is achieved through skid-steering (differential drive): varying the speeds of the left and right wheels produces stable, responsive turns with a small turning radius, an approach favored in mobile robotics for its maneuverability and energy efficiency.

To ensure safety during inspections, the robot issues a low-battery warning at 5% charge. Because adsorption relies on permanent magnets, the adhesive force is unaffected by battery level, eliminating the risk of falling even during a power loss.
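The differential-drive mixing described above can be sketched as follows. This is a minimal illustration, not the robot’s firmware: the function name is ours, and we assume the wheel track equals the 20 cm body width and the 1 m/s speed limit stated above.

```python
def skid_steer_speeds(v, omega, track_width=0.20, v_max=1.0):
    """Convert a body linear velocity v (m/s) and yaw rate omega (rad/s)
    into left/right wheel speeds for a skid-steer (differential) drive.

    track_width (m) and v_max (m/s) are illustrative values taken from
    the robot's stated 20 cm width and 1 m/s maximum speed.
    """
    half = 0.5 * track_width
    v_left = v - omega * half
    v_right = v + omega * half
    # If either side exceeds the speed limit, scale both sides together
    # so the commanded turning radius is preserved.
    peak = max(abs(v_left), abs(v_right))
    if peak > v_max:
        scale = v_max / peak
        v_left *= scale
        v_right *= scale
    return v_left, v_right

# Straight ahead: both wheels run at the same speed.
print(skid_steer_speeds(0.8, 0.0))   # (0.8, 0.8)
# Turn in place: the wheels spin in opposite directions.
print(skid_steer_speeds(0.0, 2.0))   # (-0.2, 0.2)
```

Scaling both wheels by the same factor, rather than clamping each side independently, keeps the ratio of wheel speeds (and hence the turning radius) intact when a command saturates.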

The hardware control system forms the core of the robot’s operation, integrating components such as the CPU, wireless remote controller, recording device, drive mechanism, permanent magnets, and battery. These elements work in unison to ensure reliable performance in diverse environments. The system is powered by the battery, with an STM32F407 microcontroller serving as the central processing unit. It connects directly to drivers and motors via I/O ports, executing commands from the remote controller and sensors. The remote controller provides a user-friendly interface for real-time monitoring and adjustment, embodying the flexibility of modern robot technology. The acoustic excitation device, crucial for void detection, generates sounds by tapping the steel surface, which are then captured for analysis. The recording equipment, an Aigo S5 wireless microphone, pairs with mobile devices via Bluetooth to transmit acoustic data seamlessly. For magnetic adsorption, we selected neodymium-iron-boron (NdFeB) permanent magnets due to their high magnetic strength and reliability. Unlike electromagnetic alternatives, which are heavier and power-intensive, permanent magnets offer a lightweight solution with consistent adhesion, extending the robot’s operational time. The adsorption force must counteract gravity and enable movement; thus, we conducted a force analysis to determine the required magnetic strength.
When the robot is stationary on a vertical wall, the forces acting on it can be described as follows. Let $F_C$ be the counterforce due to magnetic adsorption, $G$ be the gravitational force, $F_{N1}$ and $F_{N2}$ be the normal forces at the front and rear wheels, and $F_1$ and $F_2$ be the frictional forces, with $\mu$ as the coefficient of friction. The equations are:
$$F_C = 2 (F_{N1} + F_{N2})$$
$$G = 2 (F_1 + F_2)$$
Since $F_1 = \mu \cdot F_{N1}$ and $F_2 = \mu \cdot F_{N2}$, we derive:
$$G = 2\mu (F_{N1} + F_{N2})$$
Combining with the first equation gives:
$$G = \mu F_C$$
Assuming a conservative friction coefficient $\mu = 0.5$ to account for surface impurities like rust or dust, the minimum adsorption force for safety is:
$$F_C \geq 2G$$
Given the robot’s weight $G = 1.9\, \text{kN}$, the theoretical minimum $F_C$ is $3.8\, \text{kN}$. To ensure stability under practical conditions, we added a further margin above this minimum: four NdFeB magnets, each measuring 28 mm × 10 mm × 50 mm, provide a total adsorption force exceeding $6\, \text{kN}$, comfortably above the theoretical requirement. This design decision highlights the integration of robot technology with mechanical principles to achieve reliable performance. The following table summarizes the key hardware parameters:
| Component | Specifications | Communication Method | Supply Voltage (V) |
|---|---|---|---|
| 4S Li-Po Battery | 14.8 V, 30 min runtime | Direct connection | 14.8 |
| Drive Motor | Hall-effect DC, 1 m/s max speed | Serial port | 12 |
| Remote Controller | Wireless, real-time control | Wireless | 1.5 |
| Recording Device | Aigo S5 wireless microphone | Bluetooth | 3.7 |
| Camera | JINJIEAN B19FPV HD | Analog signal | 5 |
| Acoustic Exciter | Electromagnetic, 1/2/5 Hz frequencies | Direct connection | 12 |
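The adhesion condition derived above is easy to check numerically. The sketch below simply recomputes the figures stated in the text (it is illustrative, not part of the robot’s software):

```python
MU = 0.5    # conservative friction coefficient (accounts for rust, dust)
G = 1.9     # robot weight in kN, as stated above

# Equilibrium on a vertical wall gives G = mu * F_C, so the minimum
# adsorption force that just holds the robot is G / mu.
F_C_min = G / MU
print(F_C_min)   # 3.8 kN, matching F_C >= 2G for mu = 0.5

# Designed total force from the four NdFeB magnets (kN, from the text):
F_design = 6.0
print(round(F_design / F_C_min, 2))   # margin over the theoretical minimum
```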
The software system of the robot comprises two main modules: image transmission and acoustic recognition. These modules leverage advanced robot technology to process and analyze data in real-time, facilitating automated inspection. The image transmission module is built on the Orange Pi platform, which runs a Linux operating system and connects to the robot via CAN bus. It employs COFDM (Coded Orthogonal Frequency Division Multiplexing) technology for robust video transmission. The front-mounted JINJIEAN B19FPV HD camera captures analog images, transmitting them over distances of up to 1 km in open environments, with a control range of 1.5 km. This system allows operators to remotely monitor the robot’s path and identify surface defects, such as cracks or corrosion, on steel-concrete structures. The interaction between the robot, camera, and server is illustrated in the data flow: the camera captures images, the Orange Pi processes them, and the server displays the feed for analysis. This integration exemplifies how robot technology enhances visual inspection capabilities in challenging environments.
The acoustic recognition module is structured into three layers: front-end, middle-end, and back-end, each playing a critical role in void detection. The front-end consists of the robot’s acoustic excitation and recording devices. The exciter uses electromagnetic mechanisms to tap the surface at frequencies of 1, 2, or 5 Hz, generating sound waves that vary with underlying voids. The wireless microphone records these acoustic responses and transmits them via Bluetooth to a mobile device. In preliminary tests, we calibrated the tapping force against different cladding thicknesses to optimize detection sensitivity. The middle-end is implemented as a WeChat mini-program, which receives the acoustic data, preprocesses it to remove noise, and extracts relevant features. This step involves signal processing algorithms to enhance data quality, storing the features in a database for further analysis. The back-end utilizes Tencent Cloud services to train and analyze the acoustic data using a deep learning model based on one-dimensional time-series waveforms. This model identifies patterns indicative of voids and returns the results to the mini-program. The overall architecture ensures efficient data handling and accurate recognition, showcasing the power of robot technology in combining hardware and software for structural assessment.
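The middle-end feature-extraction step can be illustrated with a small example. The mini-program’s actual pipeline differs; this sketch only shows one representative feature, the dominant frequency of a tap response, computed here from a synthetic damped sinusoid (all signal parameters below are assumptions for illustration):

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Return the dominant frequency (Hz) of a 1-D waveform via FFT,
    after removing the DC offset."""
    x = signal - np.mean(signal)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

# Synthetic tap response modeled as a damped sinusoid. A void under the
# steel cladding shifts this resonance, which is the kind of pattern the
# back-end time-series model learns to detect.
fs = 8000                            # sample rate in Hz (assumed)
t = np.arange(0, 0.1, 1.0 / fs)
tap = np.exp(-40 * t) * np.sin(2 * np.pi * 1200 * t)
print(dominant_frequency(tap, fs))   # close to the 1200 Hz resonance
```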
To elaborate on the acoustic analysis, the sound waves generated by tapping can be modeled using wave propagation theory. The fundamental frequency $f$ of the sound is related to the material properties and void presence. For a steel-concrete interface, the sound velocity $v$ depends on the elastic modulus $E$ and density $\rho$ as:
$$v = \sqrt{\frac{E}{\rho}}$$
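For instance, with textbook values for steel (assumed here, not measured: $E \approx 200\, \text{GPa}$, $\rho \approx 7850\, \text{kg/m}^3$), this formula gives the one-dimensional bar wave speed:

```python
import math

E = 200e9      # elastic modulus of steel (Pa), textbook value
rho = 7850.0   # density of steel (kg/m^3), textbook value

v = math.sqrt(E / rho)   # one-dimensional (bar) wave speed, m/s
print(round(v))          # roughly 5000 m/s
```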
When a void exists, the acoustic impedance changes, altering the reflected and transmitted waves. The reflection coefficient $R$ for a wave incident on an interface can be expressed as:
$$R = \frac{Z_2 - Z_1}{Z_2 + Z_1}$$
where $Z_1$ and $Z_2$ are the acoustic impedances of steel and concrete, respectively. A void acts as a low-impedance (air) layer, driving $R$ toward $-1$ and sharply increasing the magnitude of the reflection, which alters the sound signature. The mini-program computes these parameters in real time, enabling rapid void detection. This approach demonstrates how robot technology integrates physical principles with digital tools for advanced diagnostics.
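The effect of a void on $R$ can be illustrated with typical acoustic impedance values. The figures below are standard textbook approximations, not measurements from our specimens:

```python
def reflection_coefficient(z1, z2):
    """Pressure reflection coefficient for a wave in medium 1 incident
    on an interface with medium 2: R = (Z2 - Z1) / (Z2 + Z1)."""
    return (z2 - z1) / (z2 + z1)

# Typical acoustic impedances in Rayl (kg/m^2/s) -- assumed values.
Z_STEEL = 46e6      # ~7850 kg/m^3 * ~5900 m/s
Z_CONCRETE = 9e6    # ~2400 kg/m^3 * ~3700 m/s
Z_AIR = 415         # ~1.2 kg/m^3 * ~343 m/s

r_bonded = reflection_coefficient(Z_STEEL, Z_CONCRETE)  # partial reflection
r_void = reflection_coefficient(Z_STEEL, Z_AIR)         # near-total reflection
print(abs(r_bonded))   # ~0.67
print(abs(r_void))     # ~1.0
```

A well-bonded interface reflects only part of the incident energy, while an air-filled void reflects almost all of it with inverted phase, which is why the tap sound over a void is audibly different.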
In conclusion, our magnetic wall-climbing robot represents a significant advancement in automated inspection for steel-concrete structures. By integrating image and acoustic feature recognition, it provides a comprehensive solution for detecting interface voids. The hardware design, centered on permanent magnets and efficient drives, ensures stable adhesion and mobility. The software modules, including the Orange Pi-based image system and the three-layer acoustic recognition system, enable real-time data acquisition and analysis. This robot technology not only improves inspection accuracy but also reduces human effort and risks associated with manual methods. Future work will focus on enhancing battery life, data processing speed, and sensor precision to extend the robot’s applicability. Through continuous innovation, we aim to further embed robot technology into civil engineering practices, paving the way for smarter infrastructure maintenance.
The development of this robot underscores the transformative potential of robot technology in addressing complex engineering challenges. As structures age and require more frequent assessments, automated systems like ours will become indispensable. We envision expanding this technology to other composite materials and environments, leveraging insights from this design to drive future iterations. The fusion of mechanical engineering, electronics, and artificial intelligence in this robot exemplifies the interdisciplinary nature of modern robot technology, setting a benchmark for next-generation inspection tools.
