MOTION CONTROL METHOD FOR ADAPTIVE SELF-RECONFIGURABLE PIPELINE ROBOT BASED ON ENVIRONMENTAL PERCEPTION

20260072437 · 2026-03-12

Abstract

A motion control method for an adaptive self-reconfigurable pipeline robot based on environmental perception includes: acquiring internal images of the pipeline for scene recognition, segmenting planar surfaces and curved surfaces in the images according to recognition results, and extracting boundary lines of the pipeline; calculating a straight-pipe width, a bent-pipe curvature, a slope angle, and a step height to analyze passability of the robot; designing a path planner and a swing-arm planner to generate a reference trajectory and a swing-arm angle sequence of the robot, performing smoothing, and inputting the reference trajectory and the swing-arm angle sequence into a model predictive control (MPC) motion controller; estimating a position and state of the robot through an Error State Kalman Filter (ESKF) algorithm, and inputting estimation results and collision warning signals into the MPC motion controller; and finally outputting a signal for control of a motor and a swing-arm motor.

Claims

1. A motion control method for an adaptive self-reconfigurable pipeline robot based on environmental perception, comprising the following steps: step 1: placing a pipeline robot in a pipeline, using a depth camera to acquire images of the pipeline in front, recognizing a scene of the pipeline through a neural network after denoising and preprocessing on the images, segmenting planar surfaces and curved surfaces in the images according to recognition results, projecting segmentation results of regions on 2D images into a 3D space through a camera projection model, and extracting boundary lines of the pipeline; step 2: calculating a straight-pipe width, a bent-pipe curvature, a slope angle, and a step height according to features of the boundary lines to analyze passability of the robot; step 3: designing a path planner and a swing-arm planner to generate a reference trajectory and a swing-arm angle sequence of the robot, performing smoothing, and inputting the reference trajectory and the swing-arm angle sequence into a model predictive control (MPC) motion controller; step 4: estimating a position and state of the robot through an Error State Kalman Filter (ESKF) algorithm according to data from the depth camera, an inertial measurement unit (IMU), and a wheel speed sensor, and inputting estimation results and collision warning signals from Time-of-Flight (TOF) modules into the MPC motion controller in real time; and step 5: constructing a quadratic objective function through a kinematic model of a dual-swing-arm crawler robot in combination with the reference trajectory and the swing-arm angle sequence in the MPC motion controller, solving through an iterative method, and outputting a signal for adaptive motion control of the robot; wherein a measured length, width, and height of the pipeline robot are denoted as L.sub.l, L.sub.w, L.sub.h, a radius of a driving wheel is denoted as R, a radius of a driven wheel on a swing arm is denoted as r, a wheelbase of a front swing arm is denoted as l.sub.F, a wheelbase of a rear swing arm is denoted as l.sub.R, and a wheelbase of a traveling main body is denoted as l.sub.B; wherein the step 2 comprises: after obtaining an equation of each boundary line, calculating a width of the pipeline using two straight lines on a bottom surface of the pipeline in a straight-pipe scene according to geometric characteristics of the pipeline, calculating a height of the pipeline using two straight lines on a side surface of the pipeline, and identifying passability of the straight-pipe scene in front according to results of comparison with outer envelope dimensions of the robot; calculating an angle between side boundary lines as a slope angle θ.sub.s of a slope environment, wherein a coefficient of friction between the robot and a wall surface of the pipeline is denoted as μ, and when θ.sub.s<arctan(μ), the slope in front is deemed passable; calculating a distance between a boundary line on a step surface and a boundary line on the bottom surface of the pipeline in the step scene to obtain a step height H, and when (l.sub.R+r+l.sub.B)·sin(α/2)−R·cos α>H, where α∈(0, arccos(R/(2l.sub.B))), the step in front is deemed passable; and calculating an inner curvature κ.sub.in and an outer curvature κ.sub.out of a pipe bend using two curved lines on the bottom surface of the pipeline in the bent-pipe scene, and then identifying passability conditions of the bent pipe: L.sub.w<1/κ.sub.out−1/κ.sub.in and L.sub.l<2√((1/κ.sub.out)²−(1/κ.sub.in+L.sub.w)²).

2. The motion control method for the adaptive self-reconfigurable pipeline robot based on environmental perception according to claim 1, wherein the pipeline robot is a dual-swing-arm crawler mechanism, and sensing equipment equipped thereon comprises the depth camera, the IMU, the wheel speed sensor, and three TOF modules for distance measurement, wherein the depth camera is installed at the front of the robot, and the three TOF modules are installed on the left and right sides and the top of the robot.

3. The motion control method for the adaptive self-reconfigurable pipeline robot based on environmental perception according to claim 1, wherein the step 1 comprises: acquiring images of a rectangular pipeline through the depth camera, performing denoising and strong light suppression on the images through bilateral filtering and backlight compensation sequentially, completing preprocessing of the images, and using a lightweight ResNet18 neural network to perform scene recognition based on the processed images, wherein four scenes are classified according to recognition results: a straight-pipe scene, a bent-pipe scene, a slope scene, and a step scene; using a DeepLabv3 network to segment a 2D image into a plurality of planar regions P and curved surface regions S according to classification results; randomly selecting three points on each region along an x axis of a pixel coordinate system, and converting a selected point p.sub.uv in the pixel coordinate system to p.sub.C in a camera coordinate system according to an intrinsic matrix K of the depth camera and scale information Z from the depth camera: Zp.sub.uv=Kp.sub.C; fitting equations of each planar region P.sub.i and each curved surface region S.sub.j in space according to 3D coordinates of sampling points in each region in combination with geometric structural features of a pipeline environment; and calculating a spatial equation of each straight line or curved line according to a regional equation, wherein boundary lines of the pipeline formed by intersections of each region are straight lines or curved lines.

4. (canceled)

5. The motion control method for the adaptive self-reconfigurable pipeline robot based on environmental perception according to claim 1, wherein the step 3 comprises: arranging only a traveling crawler for the motion control in the straight-pipe scene and the bent-pipe scene, designing the path planner to calculate equidistant lines of left and right side boundary lines as a reference trajectory for traveling of the robot, and performing sampling and smoothing; adjusting angles of front and rear swing arms simultaneously while controlling the traveling crawler in the slope scene and the step scene; and simplifying the kinematic model of the dual-swing-arm crawler pipeline robot into a contact boundary line model, wherein the contact boundary line model is composed of a front swing-arm contact boundary line l.sub.f, a rear swing-arm contact boundary line l.sub.r, and a chassis contact boundary line l.sub.b; θ.sub.f=arccos((R−r)/l.sub.F)+θ.sub.F, θ.sub.r=arccos((R−r)/l.sub.R)+θ.sub.R, l.sub.f=√(l.sub.F²−(R−r)²)+R·arctan((π−θ.sub.f)/2), l.sub.r=√(l.sub.R²−(R−r)²)+R·arctan((π−θ.sub.r)/2), and l.sub.b=l.sub.B+R·arctan((π−θ.sub.f)/2)+R·arctan((π−θ.sub.r)/2); endpoints on two sides of l.sub.r are denoted as p.sub.r and p.sub.br, endpoints on two sides of l.sub.f are denoted as p.sub.f and p.sub.bf, endpoints on two sides of l.sub.b are denoted as p.sub.br and p.sub.bf, an included angle between l.sub.r and l.sub.b is denoted as θ.sub.r, and an included angle between l.sub.f and l.sub.b is denoted as θ.sub.f; and using the simplified model for terrain contact, calculating θ.sub.r and θ.sub.f during traveling, and mapping them to the swing-arm joint angles θ.sub.R and θ.sub.F to complete the control of the swing arms.

6. The motion control method for the adaptive self-reconfigurable pipeline robot based on environmental perception according to claim 5, wherein solving θ.sub.r and θ.sub.f in the step 3 is subject to the following constraints: 1) at least two points on the three simplified boundary lines are in contact with the terrain; 2) the three simplified boundary lines are free from interference with the terrain; and 3) a vertical line through the robot's center of gravity falls between p.sub.br and p.sub.bf; when θ.sub.r and θ.sub.f have a plurality of solutions, taking a state of the robot with a lowest height of the center of gravity as an optimal state; after obtaining the swing-arm angle sequence in a discrete state during obstacle crossing, calculating a motion cost according to a swing-arm angle change rate, reducing a cumulative cost through a dynamic programming algorithm, obtaining an optimal discrete swing-arm angle sequence, and performing sampling and smoothing on the generated optimal discrete swing-arm state sequence through a Bezier curve; and sending a smoothed reference path and swing-arm angle to the MPC motion controller.

7. The motion control method for the adaptive self-reconfigurable pipeline robot based on environmental perception according to claim 1, wherein the step 4 comprises: deriving ESKF motion equations for a nominal state and an error state in discrete time using an IMU measurement model, then performing an ESKF prediction comprising predictions of the nominal state and the error state, obtaining left and right crawler speeds through an encoder in an update stage, observing and measuring a distance between a pipeline joint and a vehicle body in order to mitigate cumulative errors, and completing the update of a covariance matrix and the error state; calculating the distance between the pipeline joint and the vehicle body according to boundary line features of the pipeline joint extracted by the depth camera in combination with rectangular constraints and the camera projection model; installing the TOF modules for distance measurement on the left and right sides and the top of the robot for collision warning; and inputting the real-time position and state of the robot and a collision warning signal into the MPC motion controller.

8. The motion control method for the adaptive self-reconfigurable pipeline robot based on environmental perception according to claim 1, wherein the step 5 comprises: setting a system state as x=[x.sub.r, y.sub.r, z.sub.r, θ.sub.yaw, θ.sub.pitch, θ.sub.F, θ.sub.R].sup.T, wherein x.sub.r, y.sub.r, z.sub.r, θ.sub.yaw, θ.sub.pitch represent 3D spatial coordinates, a yaw angle, and a pitch angle of the robot respectively, wherein a system input is denoted as u=[v.sub.L, v.sub.R, Δθ.sub.F, Δθ.sub.R].sup.T, wherein v.sub.L, v.sub.R, Δθ.sub.F, Δθ.sub.R represent an input velocity of a left crawler, an input velocity of a right crawler, an angular increment of the front swing arm, and an angular increment of the rear swing arm respectively; a reference state is denoted as X.sub.ref={x.sub.0, x.sub.1, x.sub.2, . . . , x.sub.end}, wherein x.sub.0 signifies an initial state of the system, and x.sub.end represents a desired final state of the system; predicting states of the system in next N cycles at a moment i through the kinematic model f of the dual-swing-arm crawler robot according to the formula X.sub.pre,i+1→i+N=f(u.sub.i→i+N−1, x.sub.i), constructing the quadratic objective function according to differences between predicted states and reference states, and iteratively optimizing to minimize the differences, so as to obtain an optimized system input u.sub.i→i+N−1 for N cycles; in the process of updating states of the system, using the position and state of the robot obtained in the step 4 to obtain more accurate 3D spatial coordinates of the robot; and optimizing the system input u through a sliding window in real time.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0038] FIG. 1 illustrates a framework diagram of a motion control method for an adaptive self-reconfigurable pipeline robot based on environmental perception provided by the present disclosure.

[0039] FIG. 2 illustrates trajectories and swing arm planning of a dual-swing-arm crawler pipeline robot in a straight-pipe scene, a bent-pipe scene, a slope scene, and a step scene, where (a) illustrates straight-pipe motion planning, (b) illustrates bent-pipe motion planning, (c) illustrates slope motion planning, and (d) illustrates step motion planning.

[0040] FIG. 3 illustrates a schematic diagram of simplifying a dual-swing-arm crawler pipeline robot model into a contact boundary line model.

[0041] FIG. 4 illustrates a schematic diagram of a simplified contact boundary line model for robots passing through a step in six stages.

DETAILED DESCRIPTIONS OF THE EMBODIMENTS

[0042] The present disclosure will be further described in detail with reference to the accompanying drawings and specific embodiments.

[0043] The present disclosure provides a motion control method for an adaptive self-reconfigurable pipeline robot based on environmental perception, and the method includes the following steps: [0044] step 1: place the pipeline robot in a pipeline, use the depth camera to acquire images of the pipeline in front, recognize a scene of the pipeline through a neural network after denoising and preprocessing on the images, segment planar surfaces and curved surfaces in the images according to recognition results, project segmentation results of regions on the 2D images into a 3D space through a camera projection model, and extract boundary lines of the pipeline; [0045] step 2: calculate a straight-pipe width, a bent-pipe curvature, a slope angle, and a step height according to features of the boundary lines to analyze passability of the robot; [0046] step 3: design a path planner and a swing-arm planner to generate a reference trajectory and a swing-arm angle sequence of the robot, perform smoothing, and input the reference trajectory and the swing-arm angle sequence into a model predictive control (MPC) motion controller; [0047] step 4: estimate a position and state of the robot through an Error State Kalman Filter (ESKF) algorithm according to data from the depth camera, the IMU, and the wheel speed sensor, and input estimation results and collision warning signals from the TOF modules into the MPC motion controller in real time; and [0048] step 5: construct a quadratic objective function through a kinematic model of the dual-swing-arm crawler robot in combination with the reference trajectory and the swing-arm angle sequence in the MPC motion controller, solve through an iterative method, and output a signal for adaptive motion control of the robot, where a framework of the motion control method for the robot is shown in FIG. 1.

[0049] First, process the images of a rectangular pipeline acquired through the depth camera, perform denoising and strong light suppression on the images through bilateral filtering and backlight compensation sequentially, and complete preprocessing of the images; [0050] use a lightweight ResNet18 neural network to perform scene recognition based on the processed images, where four scenes are classified according to recognition results: a straight-pipe scene, a bent-pipe scene, a slope scene, and a step scene; [0051] use a DeepLabv3 network to segment a 2D image into a plurality of planar regions P and curved surface regions S according to classification results; [0052] randomly select three points on each region along the x axis of the pixel coordinate system, and convert a selected point p.sub.uv in the pixel coordinate system to p.sub.C in the camera coordinate system according to the intrinsic matrix K and scale information Z from the depth camera;

[00007] Zp.sub.uv=Kp.sub.C; (1)

[0053] fit equations of each planar region P.sub.i and each curved surface region S.sub.j in space according to 3D coordinates of sampling points in each region in combination with geometric structural features of the pipeline environment; and calculate a spatial equation of each straight line or curved line according to a regional equation, where boundary lines of the pipeline formed by intersections of each region are straight lines or curved lines.

[0054] A measured length, width, and height of the pipeline robot are denoted as L.sub.l, L.sub.w, L.sub.h, a radius of a driving wheel is denoted as R, a radius of a driven wheel on a swing arm is denoted as r, a wheelbase of a front swing arm is denoted as l.sub.F, a wheelbase of a rear swing arm is denoted as l.sub.R, and a wheelbase of a traveling main body is denoted as l.sub.B. After obtaining an equation of each boundary line, calculate a width of the pipeline using two straight lines on a bottom surface of the pipeline in a straight-pipe scene according to geometric characteristics of the pipeline, calculate a height of the pipeline using two straight lines on a side surface of the pipeline, and identify the passability of the straight-pipe scene in front according to results of comparison with outer envelope dimensions of the robot;

[0055] calculate an angle between side boundary lines as a slope angle θ.sub.s of a slope environment, where a coefficient of friction between the robot and a wall surface of the pipeline is denoted as μ, and when θ.sub.s<arctan(μ), the slope in front is deemed passable;

[0056] calculate a distance between a boundary line on a step surface and a boundary line on the bottom surface of the pipeline in the step scene to obtain a step height H, and when

[00008] (l.sub.R+r+l.sub.B)·sin(α/2)−R·cos α>H, the step in front is deemed passable, where

[00009] α∈(0, arccos(R/(2l.sub.B)));

[0057] calculate an inner curvature κ.sub.in and an outer curvature κ.sub.out of a pipe bend using two curved lines on the bottom surface of the pipeline in the bent-pipe scene, and then identify passability conditions of the bent pipe:

[00010] L.sub.w<1/κ.sub.out−1/κ.sub.in and L.sub.l<2√((1/κ.sub.out)²−(1/κ.sub.in+L.sub.w)²). (2)
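For illustration only (this sketch is not part of the claimed method), the four passability tests above can be written out in code. The numeric robot parameters are placeholders, and the step and bend inequalities follow the reconstructed forms given in paragraphs [0055]-[0057]:

```python
import math

# Example robot envelope and geometry (placeholder values). Symbols follow
# the text: L_l, L_w, L_h = length/width/height of the robot, R = driving-wheel
# radius, r = swing-arm driven-wheel radius, l_R = rear swing-arm wheelbase,
# l_B = main-body wheelbase.
L_l, L_w, L_h = 0.40, 0.25, 0.20
R, r, l_B, l_R = 0.06, 0.03, 0.30, 0.12

def straight_pipe_passable(pipe_width, pipe_height):
    """Straight pipe: the cross-section must exceed the robot envelope."""
    return pipe_width > L_w and pipe_height > L_h

def slope_passable(theta_s, mu):
    """Slope: passable while the slope angle stays below the friction angle."""
    return theta_s < math.atan(mu)

def step_passable(H, alpha):
    """Step of height H, with pitch angle alpha in (0, arccos(R/(2*l_B)))."""
    assert 0 < alpha < math.acos(R / (2 * l_B))
    return (l_R + r + l_B) * math.sin(alpha / 2) - R * math.cos(alpha) > H

def bend_passable(kappa_in, kappa_out):
    """Bent pipe: the annulus between the inner radius 1/kappa_in and the
    outer radius 1/kappa_out must admit the robot's width, and a chord of
    the outer circle clearing the inner wall must exceed the robot length."""
    r_in, r_out = 1.0 / kappa_in, 1.0 / kappa_out
    if not L_w < r_out - r_in:
        return False
    return L_l < 2 * math.sqrt(r_out**2 - (r_in + L_w)**2)
```

The bend test is the standard annulus-chord condition; it matches the reconstructed inequality (2) term by term.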

[0058] FIG. 2 illustrates trajectories and swing arm planning of a dual-swing-arm crawler pipeline robot in a straight-pipe scene, a bent-pipe scene, a slope scene, and a step scene, where (a) illustrates straight-pipe motion planning, (b) illustrates bent-pipe motion planning, (c) illustrates slope motion planning, and (d) illustrates step motion planning.

[0059] Arrange only a traveling crawler for the motion control in the straight-pipe scene and the bent-pipe scene, design the path planner to calculate equidistant lines of left and right side boundary lines as a reference trajectory for traveling of the robot, and perform sampling and smoothing; [0060] adjust angles of the front and rear swing arms simultaneously while controlling the traveling crawler in the slope scene and the step scene; and simplify the kinematic model of the dual-swing-arm crawler pipeline robot into a contact boundary line model, where, as shown in FIG. 3, the contact boundary line model is mainly composed of a front swing-arm contact boundary line l.sub.f, a rear swing-arm contact boundary line l.sub.r, and a chassis contact boundary line l.sub.b;

[00011] θ.sub.f=arccos((R−r)/l.sub.F)+θ.sub.F, θ.sub.r=arccos((R−r)/l.sub.R)+θ.sub.R, l.sub.f=√(l.sub.F²−(R−r)²)+R·arctan((π−θ.sub.f)/2), l.sub.r=√(l.sub.R²−(R−r)²)+R·arctan((π−θ.sub.r)/2), and l.sub.b=l.sub.B+R·arctan((π−θ.sub.f)/2)+R·arctan((π−θ.sub.r)/2); (3) [0061] endpoints on two sides of l.sub.r are denoted as p.sub.r and p.sub.br, endpoints on two sides of l.sub.f are denoted as p.sub.f and p.sub.bf, endpoints on two sides of l.sub.b are denoted as p.sub.br and p.sub.bf, an included angle between l.sub.r and l.sub.b is denoted as θ.sub.r, and an included angle between l.sub.f and l.sub.b is denoted as θ.sub.f; [0062] use the simplified model for terrain contact, calculate θ.sub.r and θ.sub.f during traveling, and map them to the swing-arm joint angles θ.sub.R and θ.sub.F to complete the control of the swing arms; [0063] solving θ.sub.r and θ.sub.f is subject to the following constraints: 1) at least two points on the three simplified boundary lines are in contact with the terrain; 2) the three simplified boundary lines are free from interference with the terrain; and 3) a vertical line through the robot's center of gravity falls between p.sub.br and p.sub.bf; [0064] when θ.sub.r and θ.sub.f have a plurality of solutions, take the state of the robot with the lowest height of the center of gravity as the optimal state; FIG. 4 illustrates a schematic diagram of the simplified boundary line model for the robot passing through a step in six stages.
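A minimal numeric sketch of the contact boundary line computation of formula (3) follows. This is illustrative only: the wheel radii and wheelbase values are placeholders, and the exact form of the arctan compensation term is an assumption reconstructed from the garbled source equation:

```python
import math

# Mechanism parameters (placeholder values): driving-wheel radius R,
# swing-arm driven-wheel radius r, front/rear swing-arm wheelbases l_F, l_R,
# main-body wheelbase l_B.
R, r = 0.06, 0.03
l_F, l_R, l_B = 0.12, 0.12, 0.30

def contact_boundary_model(theta_F, theta_R):
    """Reduce the dual-swing-arm crawler to three contact boundary lines:
    front l_f, rear l_r, chassis l_b, with inclinations theta_f, theta_r,
    given the commanded swing-arm joint angles theta_F, theta_R."""
    theta_f = math.acos((R - r) / l_F) + theta_F
    theta_r = math.acos((R - r) / l_R) + theta_R
    # compensation term for the track wrapping the driving wheel (assumed form)
    comp = lambda th: R * math.atan((math.pi - th) / 2)
    l_f = math.sqrt(l_F**2 - (R - r)**2) + comp(theta_f)
    l_r = math.sqrt(l_R**2 - (R - r)**2) + comp(theta_r)
    l_b = l_B + comp(theta_f) + comp(theta_r)
    return theta_f, theta_r, l_f, l_r, l_b
```

With symmetric front and rear parameters, equal joint angles yield equal boundary-line lengths, which is a quick sanity check on the reconstruction.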

[0065] After obtaining the swing-arm angle sequence in a discrete state during obstacle crossing, calculate a motion cost according to a swing-arm angle change rate, reduce a cumulative cost through a dynamic programming algorithm, obtain an optimal discrete swing-arm angle sequence, and perform sampling and smoothing on the generated optimal discrete swing-arm state sequence through a Bezier curve; and send a smoothed reference path and swing-arm angle to the MPC motion controller.
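The dynamic-programming selection and Bezier smoothing described above might be sketched as follows; the per-stage candidate angle sets and the change-rate cost are illustrative stand-ins:

```python
import numpy as np

def optimal_arm_sequence(candidates, rate_cost=lambda a, b: abs(b - a)):
    """Dynamic programming over per-stage candidate swing-arm angles:
    minimise the cumulative angle-change cost across the crossing.
    candidates[k] is the list of feasible angles at stage k."""
    cost = {a: 0.0 for a in candidates[0]}
    back = [{}]
    for stage in candidates[1:]:
        new_cost, links = {}, {}
        for b in stage:
            prev = min(cost, key=lambda a: cost[a] + rate_cost(a, b))
            new_cost[b] = cost[prev] + rate_cost(prev, b)
            links[b] = prev
        cost, back = new_cost, back + [links]
    # backtrack from the cheapest final angle to recover the sequence
    seq = [min(cost, key=cost.get)]
    for links in reversed(back[1:]):
        seq.append(links[seq[-1]])
    return seq[::-1]

def bezier_smooth(points, samples=50):
    """Sample a Bezier curve over the discrete angle sequence (de Casteljau)."""
    pts = np.asarray(points, dtype=float)
    out = []
    for t in np.linspace(0.0, 1.0, samples):
        p = pts.copy()
        while len(p) > 1:
            p = (1 - t) * p[:-1] + t * p[1:]
        out.append(p[0])
    return out
```

The DP pass keeps one cheapest predecessor per candidate, so its cost is linear in the number of stages times the square of the candidate count; the Bezier pass then resamples the winning sequence into a smooth profile for the MPC controller.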

[0066] In order to obtain a real-time position and state of the robot, derive ESKF motion equations for a nominal state and an error state in discrete time using an IMU measurement model, and then perform an ESKF prediction, including predictions of the nominal state and the error state; [0067] obtain left and right crawler speeds through an encoder in an update stage, observe and measure a distance between a pipeline joint and a vehicle body in order to mitigate cumulative errors, and complete the update of a covariance matrix and the error state; calculate the distance between the pipeline joint and the vehicle body according to boundary line features of the pipeline joint extracted by the depth camera in combination with rectangular constraints and the camera projection model; and [0068] install the TOF modules for distance measurement on the left and right sides and the top of the robot for collision warning, and input the real-time position and state of the robot and a collision warning signal into the MPC motion controller.
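A much-reduced planar sketch of this predict/update cycle is given below. The full ESKF separates a nominal state from an error state over the complete 3D pose; here a single planar state, a constant-rate prediction, and a joint-distance correction illustrate only the structure, with placeholder noise values:

```python
import numpy as np

class MiniESKF:
    """Planar sketch: state (x, y, yaw) propagated by body-frame rates,
    corrected by a scalar distance-to-joint measurement (Kalman form).
    Noise magnitudes are illustrative placeholders."""
    def __init__(self):
        self.x = np.zeros(3)          # [x, y, yaw]
        self.P = np.eye(3) * 1e-3     # state covariance
        self.Q = np.eye(3) * 1e-4     # process noise

    def predict(self, v, omega, dt):
        # unicycle-style kinematic propagation of the nominal state
        self.x += dt * np.array([v * np.cos(self.x[2]),
                                 v * np.sin(self.x[2]), omega])
        F = np.eye(3)                 # Jacobian of the motion model
        F[0, 2] = -dt * v * np.sin(self.x[2])
        F[1, 2] = dt * v * np.cos(self.x[2])
        self.P = F @ self.P @ F.T + self.Q

    def update_joint_distance(self, d_meas, joint_x, sigma=0.02):
        # observe the forward distance to a pipeline joint to bound drift
        H = np.array([[-1.0, 0.0, 0.0]])       # d = joint_x - x
        y = d_meas - (joint_x - self.x[0])     # innovation
        S = H @ self.P @ H.T + sigma**2
        K = self.P @ H.T / S                   # Kalman gain
        self.x += (K * y).ravel()
        self.P = (np.eye(3) - K @ H) @ self.P
```

A shorter measured distance than predicted pulls the position estimate forward, and every update shrinks the covariance, which is the drift-bounding effect the joint observation provides.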

[0069] Set a system state as x=[x.sub.r, y.sub.r, z.sub.r, θ.sub.yaw, θ.sub.pitch, θ.sub.F, θ.sub.R].sup.T, where x.sub.r, y.sub.r, z.sub.r, θ.sub.yaw, θ.sub.pitch represent 3D spatial coordinates, a yaw angle, and a pitch angle of the robot respectively; a system input is denoted as u=[v.sub.L, v.sub.R, Δθ.sub.F, Δθ.sub.R].sup.T, where v.sub.L, v.sub.R, Δθ.sub.F, Δθ.sub.R represent an input velocity of a left crawler, an input velocity of a right crawler, an angular increment of the front swing arm, and an angular increment of the rear swing arm respectively; a reference state is denoted as X.sub.ref={x.sub.0, x.sub.1, x.sub.2, . . . , x.sub.end}, where x.sub.0 signifies an initial state of the system, and x.sub.end represents a desired final state of the system; [0070] predict states of the system in next N cycles at a moment i through the kinematic model f of the dual-swing-arm crawler robot according to the formula

[00012] X.sub.pre,i+1→i+N=f(u.sub.i→i+N−1, x.sub.i), construct the quadratic objective function according to differences between predicted states and reference states, and iteratively optimize to minimize the differences, so as to obtain an optimized system input u.sub.i→i+N−1 for N cycles; and [0071] in the process of updating states of the system, use the position and state of the robot obtained in the step 4 to obtain more accurate 3D spatial coordinates of the robot; and optimize the system input u through a sliding window in real time.
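The receding-horizon optimization of step 5 can be illustrated with a toy planar model. Everything below is a non-normative stand-in: a simplified track-drive kinematic model, placeholder cost weights, and a finite-difference gradient descent in place of the patented iterative solver:

```python
import numpy as np

DT, N, TRACK = 0.1, 5, 0.2   # control period, horizon length, track width

def step(x, u):
    """Simplified planar kinematics: state [x, y, yaw], input [v_L, v_R]."""
    v = 0.5 * (u[0] + u[1])              # forward speed
    w = (u[1] - u[0]) / TRACK            # yaw rate from track-speed difference
    return x + DT * np.array([v * np.cos(x[2]), v * np.sin(x[2]), w])

def mpc_control(x0, x_ref, iters=100, lr=10.0, eps=1e-4):
    """Minimise a quadratic tracking cost over an N-step horizon by
    finite-difference gradient descent; return the first optimised input
    (receding-horizon / sliding-window use)."""
    def cost(u_flat):
        u_seq = u_flat.reshape(N, 2)
        x, c = np.array(x0, dtype=float), 0.0
        for k in range(N):
            x = step(x, u_seq[k])
            # tracking error plus a small input-effort regulariser
            c += np.sum((x - x_ref[k]) ** 2) + 1e-3 * np.sum(u_seq[k] ** 2)
        return c
    u = np.zeros(2 * N)
    for _ in range(iters):
        base, g = cost(u), np.zeros_like(u)
        for i in range(u.size):
            probe = u.copy()
            probe[i] += eps
            g[i] = (cost(probe) - base) / eps   # forward-difference gradient
        u -= lr * g
    return u.reshape(N, 2)[0]

# Usage: track a straight reference along +x at 0.5 m/s
x_ref = [np.array([0.5 * DT * (k + 1), 0.0, 0.0]) for k in range(N)]
u0 = mpc_control([0.0, 0.0, 0.0], x_ref)
```

Only the first optimized input is applied before the horizon slides forward one cycle, which is the sliding-window behaviour paragraph [0071] describes; for a straight reference both track speeds converge to the same positive value.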

[0072] The foregoing descriptions are merely preferred examples of the present disclosure, and are not intended to impose any formal restrictions on the present disclosure. Any modifications or equivalent variations made based on the technical essence of the present disclosure still fall within the protection scope of the present disclosure.