MOTION CONTROL METHOD FOR ADAPTIVE SELF-RECONFIGURABLE PIPELINE ROBOT BASED ON ENVIRONMENTAL PERCEPTION
20260072437 · 2026-03-12
Inventors
- Aiguo Song (Jiangsu, CN)
- Tianyuan MIAO (Jiangsu, CN)
- Qinjie JI (Jiangsu, CN)
- Shaohu WANG (Jiangsu, CN)
- Huijun Li (Jiangsu, CN)
Cpc classification
G06V10/26
PHYSICS
F16L55/26
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
G05D2111/52
PHYSICS
G06V20/56
PHYSICS
G05D1/242
PHYSICS
G05D1/435
PHYSICS
G05D2107/50
PHYSICS
B62D57/028
PERFORMING OPERATIONS; TRANSPORTING
International classification
G05D1/435
PHYSICS
B62D57/028
PERFORMING OPERATIONS; TRANSPORTING
F16L55/26
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
G05D1/242
PHYSICS
G06V10/26
PHYSICS
Abstract
A motion control method for an adaptive self-reconfigurable pipeline robot based on environmental perception includes: acquiring internal images of the pipeline for scene recognition, segmenting planar surfaces and curved surfaces in the images according to recognition results, and extracting boundary lines of the pipeline; calculating a straight-pipe width, a bent-pipe curvature, a slope angle, and a step height to analyze passability of the robot; designing a path planner and a swing-arm planner to generate a reference trajectory and a swing-arm angle sequence of the robot, performing smoothing, and inputting the reference trajectory and the swing-arm angle sequence into a model predictive control (MPC) motion controller; estimating a position and state of the robot through an Error State Kalman Filter (ESKF) algorithm, and inputting estimation results and collision warning signals into the MPC motion controller; and finally outputting a signal for control of a motor and a swing-arm motor.
Claims
1. A motion control method for an adaptive self-reconfigurable pipeline robot based on environmental perception, comprising the following steps: step 1: placing a pipeline robot in a pipeline, using a depth camera to acquire images of the pipeline in front, recognizing a scene of the pipeline through a neural network after denoising and preprocessing of the images, segmenting planar surfaces and curved surfaces in the images according to recognition results, projecting segmentation results of regions on 2D images into a 3D space through a camera projection model, and extracting boundary lines of the pipeline; step 2: calculating a straight-pipe width, a bent-pipe curvature, a slope angle, and a step height according to features of the boundary lines to analyze passability of the robot; step 3: designing a path planner and a swing-arm planner to generate a reference trajectory and a swing-arm angle sequence of the robot, performing smoothing, and inputting the reference trajectory and the swing-arm angle sequence into a model predictive control (MPC) motion controller; step 4: estimating a position and state of the robot through an Error State Kalman Filter (ESKF) algorithm according to data from the depth camera, an inertial measurement unit (IMU), and a wheel speed sensor, and inputting estimation results and collision warning signals from Time-of-Flight (TOF) modules into the MPC motion controller in real time; and step 5: constructing a quadratic objective function through a kinematic model of a dual-swing-arm crawler robot in combination with the reference trajectory and the swing-arm angle sequence in the MPC motion controller, solving through an iterative method, and outputting a signal for adaptive motion control of the robot; wherein a measured length, width, and height of the pipeline robot are denoted as L.sub.l, L.sub.w, and L.sub.h, a radius of a driving wheel is denoted as R, a radius of a driven wheel on a swing arm is denoted as r, a wheelbase of a front
swing arm is denoted as l.sub.F, a wheelbase of a rear swing arm is denoted as l.sub.R, and a wheelbase of a traveling main body is denoted as l.sub.B; wherein the step 2 comprises: after obtaining an equation of each boundary line, calculating a width of the pipeline using two straight lines on a bottom surface of the pipeline in a straight-pipe scene according to geometric characteristics of the pipeline, calculating a height of the pipeline using two straight lines on a side surface of the pipeline, and identifying passability of the straight-pipe scene in front according to results of comparison with outer envelope dimensions of the robot; and calculating an angle between side boundary lines as a slope angle θ.sub.s of a slope environment, wherein a coefficient of friction between the robot and a wall surface of the pipeline is denoted as μ, and when θ.sub.s<arctan(μ), the slope in front is deemed passable; calculating a distance between a boundary line on a step surface and a boundary line on the bottom surface of the pipeline in the step scene to obtain a step height H, and when
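As an illustrative sketch (outside the claim language), the step-2 passability checks might be implemented as follows; the safety margin, the function names, and the traction condition θ.sub.s < arctan(μ) are stated assumptions rather than part of the disclosure:

```python
import math

def straight_pipe_passable(pipe_width, pipe_height,
                           robot_width, robot_height, margin=0.02):
    """Straight-pipe scene: compare the pipe cross-section against the
    robot's outer envelope plus an assumed safety margin (metres)."""
    return (pipe_width > robot_width + margin and
            pipe_height > robot_height + margin)

def slope_passable(mu, theta_s):
    """Slope scene: passable when the slope angle theta_s (rad) is below
    the friction angle arctan(mu) of the robot-wall contact."""
    return theta_s < math.atan(mu)
```

For example, with μ = 1.0 the friction angle is 45°, so a 30° slope is deemed passable, while μ = 0.3 (friction angle ≈ 16.7°) is not.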
2. The motion control method for the adaptive self-reconfigurable pipeline robot based on environmental perception according to claim 1, wherein the pipeline robot is a dual-swing-arm crawler mechanism, and sensing equipment equipped thereon comprises the depth camera, the IMU, the wheel speed sensor, and three TOF modules for distance measurement, wherein the depth camera is installed just in front of the robot, and the three TOF modules are installed on left and right sides and a top of the robot.
3. The motion control method for the adaptive self-reconfigurable pipeline robot based on environmental perception according to claim 1, wherein the step 1 comprises: acquiring images of a rectangular pipeline through the depth camera, performing denoising and strong light suppression on the images through bilateral filtering and backlight compensation sequentially, completing preprocessing of the images, and using a lightweight ResNet18 neural network to perform scene recognition based on the processed images, wherein four scenes are classified according to recognition results: a straight-pipe scene, a bent-pipe scene, a slope scene, and a step scene; using a DeepLabv3 network to segment a 2D image into a plurality of planar regions P and curved surface regions S according to classification results; randomly selecting three points on each region along an x axis of a pixel coordinate system, and converting a selected point p.sub.uv in the pixel coordinate system to p.sub.C in a camera coordinate system according to an intrinsic matrix K of the depth camera and scale information Z from the depth camera:
4. (canceled)
5. The motion control method for the adaptive self-reconfigurable pipeline robot based on environmental perception according to claim 1, wherein the step 3 comprises: arranging only a traveling crawler for the motion control in the straight-pipe scene and the bent-pipe scene, designing the path planner to calculate equidistant lines of left and right side boundary lines as a reference trajectory for traveling of the robot, and performing sampling and smoothing; adjusting angles of front and rear swing arms simultaneously while controlling the traveling crawler in the slope scene and the step scene; and simplifying the kinematic model of the dual-swing-arm crawler pipeline robot into a contact boundary line model, wherein the contact boundary line model is composed of a front swing-arm contact boundary line l.sub.f, a rear swing-arm contact boundary line l.sub.r, and a chassis contact boundary line l.sub.b;
6. The motion control method for the adaptive self-reconfigurable pipeline robot based on environmental perception according to claim 5, wherein solving θ.sub.r and θ.sub.f in the step 3 is subjected to the following constraints: 1) at least two points on three simplified boundary lines are in contact with the terrain; 2) the three simplified boundary lines are free from interference with the terrain; and 3) a vertical line of the robot's center of gravity is between p.sub.br and p.sub.bf; in response to θ.sub.r and θ.sub.f having a plurality of solutions, taking a state of the robot with a lowest height of the center of gravity as an optimal state; after obtaining the swing-arm angle sequence in a discrete state during obstacle crossing, calculating a motion cost according to a swing-arm angle change rate, reducing a cumulative cost through a dynamic programming algorithm, obtaining an optimal discrete swing-arm angle sequence, and performing sampling and smoothing on the generated optimal discrete swing-arm state sequence through a Bezier curve; and sending a smoothed reference path and swing-arm angle to the MPC motion controller.
7. The motion control method for the adaptive self-reconfigurable pipeline robot based on environmental perception according to claim 1, wherein the step 4 comprises: deriving ESKF motion equations for a nominal state and an error state in discrete time using an IMU measurement model, then performing an ESKF prediction, comprising predictions of the nominal state and the error state, obtaining left and right crawler speeds through an encoder in an update stage, observing and measuring a distance between a pipeline joint and a vehicle body in order to mitigate cumulative errors, and completing the update of a covariance matrix and the error state; calculating the distance between the pipeline joint and the vehicle body according to boundary line features of the pipeline joint extracted by the depth camera in combination with rectangular constraints and the camera projection model; installing the TOF modules for distance measurement on the left and right sides and the top of the robot for collision warning; and inputting the real-time position and state of the robot and a collision warning signal into the MPC motion controller.
8. The motion control method for the adaptive self-reconfigurable pipeline robot based on environmental perception according to claim 1, wherein the step 5 comprises: setting a system state as x=[x.sub.r,y.sub.r,z.sub.r,θ.sub.yaw,θ.sub.pitch,θ.sub.F,θ.sub.R].sup.T, wherein x.sub.r,y.sub.r,z.sub.r,θ.sub.yaw,θ.sub.pitch represent 3D spatial coordinates, a yaw angle, and a pitch angle of the robot respectively, wherein a system input is denoted as u=[v.sub.L,v.sub.R,Δθ.sub.F,Δθ.sub.R].sup.T, wherein v.sub.L,v.sub.R,Δθ.sub.F,Δθ.sub.R represent an input velocity of a left crawler, an input velocity of a right crawler, an angular increment of the front swing arm, and an angular increment of the rear swing arm; a reference state is denoted as X.sub.ref={x.sub.0, x.sub.1, x.sub.2, . . . , x.sub.end}, wherein x.sub.0 signifies an initial state of the system, and x.sub.end represents a desired final state of the system; predicting states of the system in next N cycles at a moment i through the kinematic model f of the dual-swing-arm crawler robot according to the formula
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTIONS OF THE EMBODIMENTS
[0042] The present disclosure will be further described in detail with reference to the accompanying drawings and specific embodiments.
[0043] The present disclosure provides a motion control method for an adaptive self-reconfigurable pipeline robot based on environmental perception, and the method includes the following steps: [0044] step 1: place the pipeline robot in a pipeline, use the depth camera to acquire images of the pipeline in front, recognize a scene of the pipeline through a neural network after denoising and preprocessing of the images, segment planar surfaces and curved surfaces in the images according to recognition results, project segmentation results of regions on the 2D images into a 3D space through a camera projection model, and extract boundary lines of the pipeline; [0045] step 2: calculate a straight-pipe width, a bent-pipe curvature, a slope angle, and a step height according to features of the boundary lines to analyze passability of the robot; [0046] step 3: design a path planner and a swing-arm planner to generate a reference trajectory and a swing-arm angle sequence of the robot, perform smoothing, and input the reference trajectory and the swing-arm angle sequence into a model predictive control (MPC) motion controller; [0047] step 4: estimate a position and state of the robot through an Error State Kalman Filter (ESKF) algorithm according to data from the depth camera, the IMU, and the wheel speed sensor, and input estimation results and collision warning signals from the TOF modules into the MPC motion controller in real time; and [0048] step 5: construct a quadratic objective function through a kinematic model of the dual-swing-arm crawler robot in combination with the reference trajectory and the swing-arm angle sequence in the MPC motion controller, solve through an iterative method, and output a signal for adaptive motion control of the robot, where a framework of the motion control method for the robot is shown in
[0049] First, process the images of a rectangular pipeline acquired through the depth camera, perform denoising and strong light suppression on the images through bilateral filtering and backlight compensation sequentially, and complete preprocessing of the images; [0050] use a lightweight ResNet18 neural network to perform scene recognition based on the processed images, where four scenes are classified according to recognition results: a straight-pipe scene, a bent-pipe scene, a slope scene, and a step scene; [0051] use a DeepLabv3 network to segment a 2D image into a plurality of planar regions P and curved surface regions S according to classification results; [0052] randomly select three points on each region along the x axis of the pixel coordinate system, and convert a selected point p.sub.uv in the pixel coordinate system to p.sub.C in the camera coordinate system according to the intrinsic matrix K and scale information Z from the depth camera;
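The pixel-to-camera conversion in paragraph [0052] can be sketched as p.sub.C = Z·K.sup.−1·[u, v, 1].sup.T. The intrinsic parameter values below are hypothetical examples, not taken from the disclosure:

```python
import numpy as np

def pixel_to_camera(u, v, Z, K):
    """Back-project a pixel (u, v) with depth scale Z into the camera
    frame: p_C = Z * K^{-1} * [u, v, 1]^T."""
    p_uv = np.array([u, v, 1.0])
    return Z * np.linalg.inv(K) @ p_uv

# Hypothetical pinhole intrinsics: fx = fy = 600 px, principal point (320, 240)
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])

# A point at the principal point with 2 m depth lies on the optical axis:
p_C = pixel_to_camera(320.0, 240.0, 2.0, K)  # → [0.0, 0.0, 2.0]
```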
[0058]
[0059] Arrange only a traveling crawler for the motion control in the straight-pipe scene and the bent-pipe scene, design the path planner to calculate equidistant lines of left and right side boundary lines as a reference trajectory for traveling of the robot, and perform sampling and smoothing; [0060] adjust angles of front and rear swing arms simultaneously while controlling the traveling crawler in the slope scene and the step scene; and simplify the kinematic model of the dual-swing-arm crawler pipeline robot into a contact boundary line model, where as shown in
[0065] After obtaining the swing-arm angle sequence in a discrete state during obstacle crossing, calculate a motion cost according to a swing-arm angle change rate, reduce a cumulative cost through a dynamic programming algorithm, obtain an optimal discrete swing-arm angle sequence, and perform sampling and smoothing on the generated optimal discrete swing-arm state sequence through a Bezier curve; and send a smoothed reference path and swing-arm angle to the MPC motion controller.
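The dynamic-programming reduction of cumulative motion cost in paragraph [0065] can be sketched as follows, assuming a per-step list of feasible discrete swing-arm angles and an absolute angle-change cost (the Bezier sampling and smoothing stage is omitted; function and variable names are illustrative):

```python
def optimal_angle_sequence(candidates):
    """Dynamic programming over per-step candidate swing-arm angles:
    pick one angle per step so the cumulative |angle change| between
    consecutive steps is minimal.  `candidates` is a list of lists of
    feasible angles (rad), one inner list per discrete step."""
    n = len(candidates)
    cost = [dict() for _ in range(n)]   # cost[k][a]: best cost ending at angle a
    back = [dict() for _ in range(n)]   # back-pointers for path recovery
    for a in candidates[0]:
        cost[0][a] = 0.0
    for k in range(1, n):
        for a in candidates[k]:
            prev = min(cost[k - 1], key=lambda p: cost[k - 1][p] + abs(a - p))
            cost[k][a] = cost[k - 1][prev] + abs(a - prev)
            back[k][a] = prev
    # Backtrack from the cheapest final angle
    a = min(cost[-1], key=cost[-1].get)
    seq = [a]
    for k in range(n - 1, 0, -1):
        a = back[k][a]
        seq.append(a)
    return seq[::-1]
```

For instance, with candidates `[[0.0], [0.1, 0.5], [0.2]]` the middle angle 0.1 rad is chosen (total change 0.2 rad) rather than 0.5 rad (total change 0.8 rad).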
[0066] In order to obtain a real-time position and state of the robot, derive ESKF motion equations for a nominal state and an error state in discrete time using an IMU measurement model, and then perform an ESKF prediction, including predictions of the nominal state and the error state; [0067] obtain left and right crawler speeds through an encoder in an update stage, observe and measure a distance between a pipeline joint and a vehicle body in order to mitigate cumulative errors, and complete the update of a covariance matrix and the error state; calculate the distance between the pipeline joint and the vehicle body according to boundary line features of the pipeline joint extracted by the depth camera in combination with rectangular constraints and the camera projection model; and [0068] install the TOF modules for distance measurement on the left and right sides and the top of the robot for collision warning; and input the real-time position and state of the robot and a collision warning signal into the MPC motion controller.
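As a greatly simplified illustration of the predict-update cycle in paragraphs [0066]-[0067], the 1-D sketch below propagates a nominal position/velocity state from an IMU acceleration measurement and corrects it with a position-like observation (such as the distance to a pipeline joint). The state dimension, noise values, and names are assumptions, not the disclosure's full 7-state formulation:

```python
import numpy as np

def eskf_step(x_nom, P, a_meas, dt, z_pos, R_meas, Q):
    """One predict/update cycle of a simplified 1-D error-state KF.
    x_nom = [position, velocity]; z_pos is a position-like measurement."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    # Nominal-state prediction driven by the IMU acceleration
    x_nom = F @ x_nom + np.array([0.5 * a_meas * dt**2, a_meas * dt])
    # Error-state covariance prediction
    P = F @ P @ F.T + Q
    # Update with the position observation
    H = np.array([[1.0, 0.0]])
    S = H @ P @ H.T + R_meas            # innovation covariance
    K = P @ H.T / S                     # Kalman gain
    dx = (K * (z_pos - x_nom[0])).ravel()
    x_nom = x_nom + dx                  # inject error state into nominal state
    P = (np.eye(2) - K @ H) @ P
    return x_nom, P
```

With a large prior covariance and a precise measurement, the corrected position moves almost all the way to the observation, as expected of a Kalman update.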
[0069] Set a system state as x=[x.sub.r,y.sub.r,z.sub.r,θ.sub.yaw,θ.sub.pitch,θ.sub.F,θ.sub.R].sup.T, where x.sub.r,y.sub.r,z.sub.r,θ.sub.yaw,θ.sub.pitch represent 3D spatial coordinates, a yaw angle, and a pitch angle of the robot respectively; a system input is denoted as u=[v.sub.L,v.sub.R,Δθ.sub.F,Δθ.sub.R].sup.T, where v.sub.L,v.sub.R,Δθ.sub.F,Δθ.sub.R represent an input velocity of a left crawler, an input velocity of a right crawler, an angular increment of the front swing arm, and an angular increment of the rear swing arm respectively; a reference state is denoted as X.sub.ref={x.sub.0, x.sub.1, x.sub.2, . . . , x.sub.end}, where x.sub.0 signifies an initial state of the system, and x.sub.end represents a desired final state of the system; [0070] predict states of the system in next N cycles at a moment i through the kinematic model f of the dual-swing-arm crawler robot according to the formula
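The N-step prediction and quadratic objective of step 5 can be sketched with a simplified planar differential-drive (track) model standing in for the full dual-swing-arm kinematics; the track half-width, weight matrices, and function names are assumed values for illustration:

```python
import numpy as np

def rollout(x0, U, dt, b=0.15):
    """Forward-simulate a simplified track model: state [x, y, yaw],
    input [v_L, v_R] per cycle; b is an assumed half-track width (m)."""
    X = [np.asarray(x0, dtype=float)]
    for vL, vR in U:
        x, y, yaw = X[-1]
        v = 0.5 * (vL + vR)            # forward speed
        w = (vR - vL) / (2.0 * b)      # yaw rate
        X.append(np.array([x + v * np.cos(yaw) * dt,
                           y + v * np.sin(yaw) * dt,
                           yaw + w * dt]))
    return np.array(X[1:])

def mpc_cost(x0, U, X_ref, dt, Q=np.diag([1.0, 1.0, 0.1]), R=0.01):
    """Quadratic objective: sum of state-tracking error e_k^T Q e_k over
    the N-step horizon plus an input-effort penalty R * ||U||^2."""
    X = rollout(x0, np.asarray(U), dt)
    e = X - X_ref
    return float(np.sum((e @ Q) * e) + R * np.sum(np.square(U)))
```

An iterative solver (e.g. gradient-based or sampling-based) would then minimize `mpc_cost` over the input sequence `U` at each control cycle; for a trajectory already on the reference, only the input-effort term remains.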
[0072] The foregoing descriptions are merely preferred examples of the present disclosure, and are not intended to impose any formal restrictions on the present disclosure. Any modifications or equivalent variations made based on the technical essence of the present disclosure still fall within the protection scope of the present disclosure.