Parallel platform tracking control apparatus using visual device as sensor and control method thereof
09971357 · 2018-05-15
Inventors
- Xianmin Zhang (Guangdong, CN)
- Jiasi Mo (Guangdong, CN)
- Zhicheng Qiu (Guangdong, CN)
- Junyang Wei (Guangdong, CN)
- Hai Li (Guangdong, CN)
CPC classification
G05B19/404
PHYSICS
B25J9/1623
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A parallel platform tracking control apparatus and method using a visual device as a sensor are disclosed. The apparatus comprises a parallel platform body, a CCD camera (11), a lens (6), a camera light source (9), a computer, a dSPACE semi-physical simulation controller, an ultrasonic motor driver and the like. The parallel platform comprises an ultrasonic linear motor (1), a linear grating encoder (5), a driven rod (2), a moving platform (3), a stationary platform (4) and the like. The CCD camera faces the moving platform perpendicularly from above and photographs it, and the photographed images are processed by an image processing algorithm to measure the positions of the corresponding markers on the moving platform. This is applicable not only to measurement, but also to real-time feedback of the terminal position of the moving platform, such that a fully closed-loop system is constituted. In this way, the parallel platform is precisely positioned.
Claims
1. A parallel platform tracking control apparatus using a visual device as a sensor, comprising a parallel platform mechanism, a CCD camera-based visual measurement system, and a position tracking feedback control apparatus; wherein the parallel platform mechanism comprises: a stationary platform, a moving platform arranged above the stationary platform, three ultrasonic linear motors, and an ultrasonic linear motor driver; the moving platform has three rotary shafts uniformly distributed in an equilateral triangle at an outer edge of the moving platform; the three ultrasonic linear motors are mounted in an equilateral triangle at an edge of the stationary platform; a shaft end of a rotation shaft installed on each ultrasonic linear motor is connected to one of the rotary shafts of the moving platform via a driven rod; and the ultrasonic linear motors are each provided with a linear grating encoder; one ultrasonic linear motor, one driven rod and one rotary shaft of the moving platform constitute a parallel branch; the three ultrasonic linear motors collaboratively drive the driven rods, such that the moving platform moves to a target position; the ultrasonic linear motors, during movement, drive the linear grating encoders to move, to detect actual positions of the ultrasonic linear motors; a first point light source marker and a second point light source marker that are linearly distributed and configured to focus blue laser are mounted on the moving platform as visual detection features; the CCD camera-based visual measurement system comprises: a CCD camera arranged above the moving platform, a lens mounted on an end portion of the CCD camera, a camera light source arranged on a side of the lens, a USB interface and a computer; wherein the CCD camera is connected to the computer via the USB interface; and the lens perpendicularly faces the moving platform, such that a central point of the lens coincides with an origin point of the moving platform;
the position tracking feedback control apparatus comprises a dSPACE semi-physical simulation controller having an incremental encoder interface module and a D/A interface; the D/A interface is connected to the ultrasonic linear motor driver, each of the linear grating encoders is connected to the incremental encoder interface module, the incremental encoder interface module is connected to the computer, and the computer is connected to the D/A interface.
2. The parallel platform tracking control apparatus using a visual device as a sensor according to claim 1, wherein the CCD camera further comprises a support mechanism, wherein the support mechanism comprises a bracket having a movable joint and a magnetic base configured to fix the bracket, and the CCD camera is fixed to an end portion of the bracket.
3. A control method of a parallel platform tracking control apparatus using a visual device as a sensor as defined in claim 2, comprising the following steps: (I) a CCD camera-based visual measurement step, comprising: acquiring an image signal of the moving platform via photographing by the CCD camera, and transmitting the image signal to the computer via the USB interface; upon acquiring the image signal, performing image processing by the computer, extracting features of the first point light source marker and the second point light source marker on the moving platform, and obtaining coordinates via calculation; and calculating parallel displacements x and y and a rotation angle θ of the moving platform according to the coordinates of the first point light source marker and the second point light source marker; wherein the two parallel displacements x and y and the rotation angle θ of the moving platform are calculated using the following formulae: assuming that the coordinate of the first point light source marker before movement is (x₁, y₁) and after movement is (x₁′, y₁′), the coordinate of the second point light source marker before movement is (x₂, y₂) and after movement is (x₂′, y₂′), and assuming that the first point light source marker and the second point light source marker before movement constitute a vector S = ((x₂ − x₁), (y₂ − y₁)), which changes to S′ = ((x₂′ − x₁′), (y₂′ − y₁′)) after movement; the displacements of the moving platform are x = [(x₁′ − x₁) + (x₂′ − x₂)]/2 and y = [(y₁′ − y₁) + (y₂′ − y₂)]/2, and the rotation angle is θ = arccos(S·S′/|S|²), wherein S·S′ = (x₂ − x₁)(x₂′ − x₁′) + (y₂ − y₁)(y₂′ − y₁′) and |S|² = (x₂ − x₁)² + (y₂ − y₁)².
4. The control method of a parallel platform tracking control apparatus using a visual device as a sensor according to claim 3, further comprising a linear grating encoder joint position feedback step and a CCD camera-based visual measurement termination feedback step; wherein the CCD camera-based visual measurement termination feedback step comprises: photographing by the CCD camera-based visual measurement system within each sampling cycle, and transmitting the image signal to the computer via the USB interface, whereupon the computer performs the corresponding image processing, acquires the coordinate information of the photographed first point light source marker and the photographed second point light source marker, and calculates the position and attitude angle error of the moving platform using the following formulae: assuming that the coordinate of the first point light source marker before movement is (x₁, y₁) and after movement is (x₁′, y₁′), the coordinate of the second point light source marker before movement is (x₂, y₂) and after movement is (x₂′, y₂′), and assuming that the first point light source marker and the second point light source marker before movement constitute the vector S = ((x₂ − x₁), (y₂ − y₁)), which changes to S′ = ((x₂′ − x₁′), (y₂′ − y₁′)) after movement; the displacements of the moving platform are x = [(x₁′ − x₁) + (x₂′ − x₂)]/2 and y = [(y₁′ − y₁) + (y₂′ − y₂)]/2, and the rotation angle is θ = arccos(S·S′/|S|²), wherein S·S′ = (x₂ − x₁)(x₂′ − x₁′) + (y₂ − y₁)(y₂′ − y₁′) and |S|² = (x₂ − x₁)² + (y₂ − y₁)².
5. A control method of a parallel platform tracking control apparatus using a visual device as a sensor as defined in claim 1, comprising the following steps: (I) a CCD camera-based visual measurement step, comprising: acquiring an image signal of the moving platform via photographing by the CCD camera, and transmitting the image signal to the computer via the USB interface; upon acquiring the image signal, performing image processing by the computer, extracting features of the first point light source marker and the second point light source marker on the moving platform, and obtaining coordinates via calculation; and calculating parallel displacements x and y and a rotation angle θ of the moving platform according to the coordinates of the first point light source marker and the second point light source marker; wherein the two parallel displacements x and y and the rotation angle θ of the moving platform are calculated using the following formulae: assuming that the coordinate of the first point light source marker before movement is (x₁, y₁) and after movement is (x₁′, y₁′), the coordinate of the second point light source marker before movement is (x₂, y₂) and after movement is (x₂′, y₂′), and assuming that the first point light source marker and the second point light source marker before movement constitute a vector S = ((x₂ − x₁), (y₂ − y₁)), which changes to S′ = ((x₂′ − x₁′), (y₂′ − y₁′)) after movement; the displacements of the moving platform are x = [(x₁′ − x₁) + (x₂′ − x₂)]/2 and y = [(y₁′ − y₁) + (y₂′ − y₂)]/2, and the rotation angle is θ = arccos(S·S′/|S|²), wherein S·S′ = (x₂ − x₁)(x₂′ − x₁′) + (y₂ − y₁)(y₂′ − y₁′) and |S|² = (x₂ − x₁)² + (y₂ − y₁)².
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
(6) The present invention is further described with reference to specific embodiments.
Embodiments
(7) As illustrated in the drawings, the parallel platform tracking control apparatus using a visual device as a sensor comprises a parallel platform mechanism, a CCD camera-based visual measurement system, and a position tracking feedback control apparatus, wherein:
(8) the parallel platform mechanism comprises: a stationary platform 4, a moving platform 3 arranged above the stationary platform 4, three ultrasonic linear motors 1, and an ultrasonic linear motor driver; wherein the moving platform 3 has three rotary shafts uniformly distributed in an equilateral triangle at an outer edge of the moving platform 3; the three ultrasonic linear motors 1 are mounted in an equilateral triangle at an edge of the stationary platform 4; shaft ends of the rotation shafts 10 installed on the ultrasonic linear motors 1 are each connected to one of the rotary shafts of the moving platform 3 via a driven rod 2; and the ultrasonic linear motors 1 are each provided with a linear grating encoder 5;
(9) one ultrasonic linear motor 1, one driven rod 2 and one rotary shaft of the moving platform 3 constitute a parallel branch; wherein totally three parallel branches are constituted;
(10) the three ultrasonic linear motors 1 collaboratively drive the driven rod 2, such that the moving platform 3 moves to a target position;
(11) the ultrasonic linear motors 1, during movement, drive the linear grating encoders 5 to move, so that the actual positions of the motors 1 are detected;
(12) a first point light source marker 8-1 and a second point light source marker 8-2 that are linearly distributed and configured to focus blue laser are mounted on the moving platform 3 as visual detection features;
(13) the CCD camera-based visual measurement system comprises: a CCD camera 11 arranged above the moving platform 3, a lens 6 mounted on an end portion of the CCD camera 11, a camera light source 9 arranged on a side of the lens 6, a USB interface and a computer; wherein the CCD camera 11 is connected to the computer via a USB 3.0 interface;
(14) the lens 6 perpendicularly faces the moving platform 3, such that a central point of the lens 6 coincides with the origin point of the moving platform 3; and
(15) the position tracking feedback control apparatus comprises a dSPACE semi-physical simulation controller having an incremental encoder interface module and a D/A interface; wherein the D/A interface is connected to the ultrasonic linear motor driver, each of the linear grating encoders 5 is connected to the incremental encoder interface module, the incremental encoder interface module is connected to the computer, and the computer is connected to the D/A interface; wherein the dSPACE semi-physical simulation controller is a dSPACE DS1103 semi-physical simulation controller.
(16) The CCD camera 11 further comprises a support mechanism, wherein the support mechanism comprises a bracket 7 having a movable joint and a magnetic base 12 configured to fix the bracket 7, and the CCD camera 11 is fixed to an end portion of the bracket 7.
(17) A control method of the parallel platform tracking control apparatus using a visual device as a sensor comprises the following steps:
(18) (I) A CCD Camera-Based Visual Measurement Step
(19) acquiring an image signal of a moving platform 3 via photographing by a CCD camera 11, and transmitting the image signal to a computer via a USB interface;
(20) upon acquiring the image signal, performing image processing by the computer, extracting features of a first point light source marker 8-1 and a second point light source marker 8-2 on the moving platform 3, and obtaining coordinates via calculation;
(21) calculating parallel displacements x and y and a rotation angle of the moving platform 3 according to the coordinates of the first point light source marker 8-1 and the second point light source marker 8-2;
(22) wherein the two parallel displacements x and y and the rotation angle θ of the moving platform 3 are calculated using the following formulae:
(23) assuming that the coordinate of the first point light source marker 8-1 before movement is (x₁, y₁) and after movement is (x₁′, y₁′), and the coordinate of the second point light source marker 8-2 before movement is (x₂, y₂) and after movement is (x₂′, y₂′), and assuming that the first point light source marker 8-1 and the second point light source marker 8-2 before movement constitute a vector S = ((x₂ − x₁), (y₂ − y₁)), which changes to S′ = ((x₂′ − x₁′), (y₂′ − y₁′)) after movement;
(24) the displacements of the moving platform 3 are
(25) x = [(x₁′ − x₁) + (x₂′ − x₂)]/2 and y = [(y₁′ − y₁) + (y₂′ − y₂)]/2,
wherein if x is greater than 0, the moving platform 3 moves towards the positive direction of the x-axis, and if x is less than 0, the moving platform 3 moves towards the negative direction of the x-axis; and if y is greater than 0, the moving platform 3 moves towards the positive direction of the y-axis, and if y is less than 0, the moving platform 3 moves towards the negative direction of the y-axis; and
(26) the rotation angle of the moving platform is
(27) θ = arccos(S·S′/|S|²),
wherein S·S′ = (x₂ − x₁)(x₂′ − x₁′) + (y₂ − y₁)(y₂′ − y₁′) and |S|² = (x₂ − x₁)² + (y₂ − y₁)²; if θ is greater than 0, the moving platform 3 rotates clockwise, and if θ is less than 0, the moving platform 3 rotates counterclockwise;
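For illustration, the pose calculation above can be sketched as a short Python function. This is a hedged reconstruction: the formula images are not reproduced in this text, so the mean-of-marker-shifts displacement and the cross-product sign convention used here are assumptions rather than the patent's exact formulae.

```python
import math

def platform_pose(p1, p1n, p2, p2n):
    """Estimate planar displacement (x, y) and rotation angle theta of the
    moving platform from the two marker coordinates before (p1, p2) and
    after (p1n, p2n) movement.

    Assumptions: displacement is the mean of the two marker shifts, and the
    sign of theta is taken from the cross product of the marker vectors."""
    x = ((p1n[0] - p1[0]) + (p2n[0] - p2[0])) / 2.0
    y = ((p1n[1] - p1[1]) + (p2n[1] - p2[1])) / 2.0
    # Vector S between the two markers before movement, S' after movement
    sx, sy = p2[0] - p1[0], p2[1] - p1[1]
    sxn, syn = p2n[0] - p1n[0], p2n[1] - p1n[1]
    dot = sx * sxn + sy * syn        # S . S'
    cross = sx * syn - sy * sxn      # sign encodes rotation direction
    norm2 = sx * sx + sy * sy        # |S|^2 (rigid body, so |S'| = |S|)
    cos_t = max(-1.0, min(1.0, dot / norm2))  # clamp against rounding
    theta = math.copysign(math.acos(cos_t), cross)
    return x, y, theta
```

For example, rotating the marker pair (0, 0)–(1, 0) to (0, 0)–(0, 1) yields theta = π/2 with the midpoint shifted by (−0.5, 0.5).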
(28) (II) A Parallel Platform Position Tracking Measurement Feedback Control Step
(29) step 1: upon receiving a drive signal transmitted by the ultrasonic linear motor driver, causing, by an ultrasonic linear motor 1, the moving platform 3 to move to a target position via a driven rod 2, wherein the moving platform 3 has a position and attitude angle error;
(30) step 2: driving, by the ultrasonic linear motor 1, a linear grating encoder 5 to detect the actual position of the joint movement, and feeding back the detected actual position to the computer via the incremental encoder interface; a subtraction is made between the actual position and the expected position to produce an error signal, a control signal is generated from the error signal by a corresponding position control algorithm, and the control signal is output as a direct-current control signal, via the D/A interface, to the ultrasonic linear motor driver (the driver generates a 160 kHz alternating-current high voltage), whereupon the driver drives the motor such that the joint precisely moves to the expected position;
(31) step 3: photographing by the CCD camera 11 within each sampling cycle, and transmitting the image signals to the computer via the USB interface, whereupon the computer performs the corresponding image processing, acquires the coordinate information of the photographed first point light source marker 8-1 and second point light source marker 8-2, and obtains the position and attitude angle error of the moving platform using the following formulae:
(32) assuming that the coordinate of the first point light source marker 8-1 before movement is (x₁, y₁) and after movement is (x₁′, y₁′), and the coordinate of the second point light source marker 8-2 before movement is (x₂, y₂) and after movement is (x₂′, y₂′), and assuming that the first point light source marker 8-1 and the second point light source marker 8-2 before movement constitute a vector S = ((x₂ − x₁), (y₂ − y₁)), which changes to S′ = ((x₂′ − x₁′), (y₂′ − y₁′)) after movement;
(33) the displacements of the moving platform 3 are
(34) x = [(x₁′ − x₁) + (x₂′ − x₂)]/2 and y = [(y₁′ − y₁) + (y₂′ − y₂)]/2,
wherein if x is greater than 0, the moving platform 3 moves towards the positive direction of the x-axis, and if x is less than 0, the moving platform 3 moves towards the negative direction of the x-axis; and if y is greater than 0, the moving platform 3 moves towards the positive direction of the y-axis, and if y is less than 0, the moving platform 3 moves towards the negative direction of the y-axis; and
(35) the rotation angle of the moving platform is
(36) θ = arccos(S·S′/|S|²),
wherein S·S′ = (x₂ − x₁)(x₂′ − x₁′) + (y₂ − y₁)(y₂′ − y₁′) and |S|² = (x₂ − x₁)² + (y₂ − y₁)²; if θ is greater than 0, the moving platform 3 rotates clockwise, and if θ is less than 0, the moving platform 3 rotates counterclockwise;
(37) step 4: after the computer obtains the position and attitude angle error of the moving platform 3 via calculation, correcting the joint driving amount, wherein the corrected joint control signal is output as a direct-current control signal, via the D/A interface, to the ultrasonic linear motor driver, whereupon the driver drives the motor such that the moving platform 3 precisely moves to the expected position;
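The outer, vision-based correction of step 4 can be illustrated with a minimal sketch. The proportional gain and the inverse-Jacobian mapping from the camera-measured platform pose error (Δx, Δy, Δθ) to the three joint commands are assumptions here; the patent does not specify the control law used for the correction.

```python
def visual_correction(joint_cmd, pose_err, inv_jacobian, gain=0.5):
    """Outer visual loop sketch: map the platform pose error
    (dx, dy, dtheta) measured by the CCD camera to corrections of the
    three joint commands through an assumed 3x3 inverse kinematic
    Jacobian, scaled by an assumed proportional gain."""
    return [cmd + gain * sum(j * e for j, e in zip(row, pose_err))
            for cmd, row in zip(joint_cmd, inv_jacobian)]
```

With an identity Jacobian and gain 0.5, a pose error of (1, 2, 3) shifts the three joint commands by (0.5, 1.0, 1.5), which the inner grating-encoder loop then tracks.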
(38) (III) A Moving Platform Position and Attitude Angle Error Calculation Step
(39) step 1: subjecting an image to an optimal threshold segmentation to obtain a binary image;
(40) step 2: subjecting the binary image to a connected region analysis, identifying the first point light source marker 8-1 and the second point light source marker 8-2 according to an area feature of the connected region, and excluding interference from a small-area connected region;
(41) step 3: obtaining the centroid coordinates of the connected regions where the first point light source marker and the second point light source marker are located, based on the moment method, using the following formulae:
(42) for a binary bounded image function f(x, y), the (j + k)-th order moment is
(43) m_jk = Σₓ Σᵧ xʲ yᵏ f(x, y);
since the mass distribution of the binary image is uniform, the centroid coincides with the center of mass; and if the pixel coordinates of an object in the image are (xᵢ, yᵢ), i = 1, 2, …, n, the centroid coordinates are
(44) x̄ = m₁₀/m₀₀ = (1/n) Σᵢ xᵢ and ȳ = m₀₁/m₀₀ = (1/n) Σᵢ yᵢ;
therefore, the corresponding centroid coordinates may be obtained by performing center-of-mass calculation for the connected regions where the two markers are located;
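The segmentation, connected-region analysis and moment-based centroid extraction of steps 1 to 3 can be sketched as follows. The flood-fill connected-region labeling and the `min_area` noise threshold are illustrative assumptions; the patent only states that small-area regions are excluded, without giving a threshold value.

```python
def marker_centroids(binary, min_area=20):
    """Locate marker centroids in a binary image (list of rows of 0/1):
    flood-fill each bright connected region (4-connectivity), reject
    regions smaller than min_area pixels as noise, and compute each
    centroid from the zeroth and first moments (m10/m00, m01/m00)."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for sy in range(h):
        for sx in range(w):
            if binary[sy][sx] and not seen[sy][sx]:
                seen[sy][sx] = True
                stack, pix = [(sy, sx)], []
                while stack:
                    y, x = stack.pop()
                    pix.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                m00 = len(pix)                    # zeroth moment = area
                if m00 >= min_area:               # exclude small-area noise
                    m10 = sum(x for _, x in pix)  # first moment in x
                    m01 = sum(y for y, _ in pix)
                    centroids.append((m10 / m00, m01 / m00))
    return centroids
```

For example, a 5 × 5 bright block spanning columns 20–24 and rows 10–14 yields the centroid (22.0, 12.0), while an isolated bright pixel is rejected by the area test.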
(45) step 4: after the coordinate information of the photographed first point light source marker 8-1 and the photographed second point light source marker 8-2 is acquired, calculating the position and attitude angle error of the moving platform using the following formulae:
(46) assuming that the coordinate of the first point light source marker 8-1 before movement is (x₁, y₁) and after movement is (x₁′, y₁′), and the coordinate of the second point light source marker 8-2 before movement is (x₂, y₂) and after movement is (x₂′, y₂′), and assuming that the first point light source marker 8-1 and the second point light source marker 8-2 before movement constitute the vector S = ((x₂ − x₁), (y₂ − y₁)), which changes to S′ = ((x₂′ − x₁′), (y₂′ − y₁′)) after movement;
(47) the displacements of the moving platform 3 are
(48) x = [(x₁′ − x₁) + (x₂′ − x₂)]/2 and y = [(y₁′ − y₁) + (y₂′ − y₂)]/2,
wherein if x is greater than 0, the moving platform 3 moves towards the positive direction of the x-axis, and if x is less than 0, the moving platform 3 moves towards the negative direction of the x-axis; and if y is greater than 0, the moving platform 3 moves towards the positive direction of the y-axis, and if y is less than 0, the moving platform 3 moves towards the negative direction of the y-axis; and
(49) the rotation angle of the moving platform is
(50) θ = arccos(S·S′/|S|²),
wherein S·S′ = (x₂ − x₁)(x₂′ − x₁′) + (y₂ − y₁)(y₂′ − y₁′) and |S|² = (x₂ − x₁)² + (y₂ − y₁)²; if θ is greater than 0, the moving platform 3 rotates clockwise, and if θ is less than 0, the moving platform 3 rotates counterclockwise;
(51) The control method further comprises a linear grating encoder joint position feedback step and a CCD camera-based visual measurement termination feedback step.
(52) CCD Camera-Based Visual Measurement Termination Feedback Step
(53) photographing by the CCD camera-based visual measurement system within each sampling cycle, and transmitting the image signals to the computer via the USB interface, whereupon the computer performs the corresponding image processing, acquires the coordinate information of the photographed first point light source marker 8-1 and second point light source marker 8-2, and calculates the position and attitude angle error of the moving platform using the following formulae:
(54) assuming that the coordinate of the first point light source marker 8-1 before movement is (x₁, y₁) and after movement is (x₁′, y₁′), and the coordinate of the second point light source marker 8-2 before movement is (x₂, y₂) and after movement is (x₂′, y₂′), and assuming that the first point light source marker 8-1 and the second point light source marker 8-2 before movement constitute a vector S = ((x₂ − x₁), (y₂ − y₁)), which changes to S′ = ((x₂′ − x₁′), (y₂′ − y₁′)) after movement;
(55) the displacements of the moving platform 3 are
(56) x = [(x₁′ − x₁) + (x₂′ − x₂)]/2 and y = [(y₁′ − y₁) + (y₂′ − y₂)]/2,
wherein if x is greater than 0, the moving platform 3 moves towards the positive direction of the x-axis, and if x is less than 0, the moving platform 3 moves towards the negative direction of the x-axis; and if y is greater than 0, the moving platform 3 moves towards the positive direction of the y-axis, and if y is less than 0, the moving platform 3 moves towards the negative direction of the y-axis; and
(57) the rotation angle of the moving platform is
(58) θ = arccos(S·S′/|S|²),
wherein S·S′ = (x₂ − x₁)(x₂′ − x₁′) + (y₂ − y₁)(y₂′ − y₁′) and |S|² = (x₂ − x₁)² + (y₂ − y₁)²; if θ is greater than 0, the moving platform 3 rotates clockwise, and if θ is less than 0, the moving platform 3 rotates counterclockwise;
(59) Linear Grating Encoder Joint Position Feedback Step
(60) driving the linear grating encoder by the ultrasonic linear motor, feeding back the measured actual position to the computer via the incremental encoder interface of the dSPACE semi-physical simulation controller, executing a joint positioning control algorithm by the computer to generate a joint control drive signal, and outputting a direct-current control signal via the D/A interface of the dSPACE semi-physical simulation controller to the driver of the ultrasonic linear motor, to drive the motor such that the joint moves to the expected position.
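The inner joint loop above can be sketched as a per-motor PI position controller fed by the grating-encoder reading each cycle. The PI structure, the gains, and the clamp to the ±10 V D/A output range are assumptions for illustration; the patent names only "a joint positioning control algorithm" without specifying it.

```python
class JointPI:
    """Sketch of the inner joint position loop: a PI controller per
    ultrasonic linear motor, fed by the linear grating encoder reading
    each sampling cycle. Gains and the +/-10 V D/A output clamp are
    assumed values, not taken from the patent."""
    def __init__(self, kp=2.0, ki=0.1, vmax=10.0):
        self.kp, self.ki, self.vmax = kp, ki, vmax
        self.acc = 0.0  # integral of the position error

    def step(self, expected_mm, measured_mm):
        err = expected_mm - measured_mm       # expected - actual position
        self.acc += err
        u = self.kp * err + self.ki * self.acc
        # Clamp to the D/A converter's output voltage range
        return max(-self.vmax, min(self.vmax, u))
```

Each cycle, `step()` takes the expected joint position and the grating-encoder measurement and returns the direct-current control voltage sent through the D/A interface to the ultrasonic motor driver.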
(61) In this embodiment, the radius of the circumscribed circle of the stationary platform 4 is less than or equal to 365 mm, so that the platform may be received in the chamber of a Zeiss EVO LS 15 scanning electron microscope; the moving platform 3 has a circular top and a triangular bottom, the circumscribed circle of the equilateral triangle has a radius of 28 mm, and the thickness is 10 mm; the driven rod 2 has dimensions of 95 mm × 20 mm × 30 mm; and all the members are made of aluminum alloy with no surface treatment.
(62) The CCD camera 11 is a CCD camera, model GS3-U3-41C6C/M-C, from Point Grey in Canada, which has a resolution of 2048 × 2048 and provides 10 μm pixel positioning precision in a 20 mm × 20 mm field of view. If subpixel positioning is also employed, 1 μm positioning precision may be achieved. The frame rate is 90 fps, the transport protocol is USB 3.0, and the transfer rate is 500 MB/s. Therefore, the rate requirement of real-time feedback is accommodated.
(63) The linear grating encoder 5 is composed of a grating scale, a reading head and a Ti interface, which are respectively: RELE grating scale, model RELE IN 20U 1A 0180A, 20 μm grating pitch, Inconel grating scale, length 180 mm, with a reference mark 20 mm from the top; reading head: model T161130M, TONIC series vacuum reading head, compatible with the RELE grating scale, all reference marks output, vacuum cable length 3 m; Ti interface: model Ti0400A01A, resolution 50 nm, line driver output, alarm, receiver clock 1 MHz, reference mark.
(64) The ultrasonic linear motor 1 employs the U-264.30 motor from PI (Physik Instrumente) in Germany, which has a stroke of 150 mm, an open-loop precision of 0.1 μm, an open-loop speed of 250 mm/s, a shutdown rigidity of 1.5 N/μm, a shutdown holding force of 8 N, a pulling force of 7 N (at 50 mm/s) and 2 N (at 250 mm/s), a resonant frequency of 158 kHz, a motor voltage of 200 Vpp, and an input impedance of from 40 to 80 ohms. The ultrasonic linear motor 1 is driven by the C-872.160 driver from PI, which supports PI linear ultrasonic motors, point-to-point movement, slow movement and precise positioning.
(65) A dSPACE semi-physical simulation controller from Germany is used, which provides a PCI interface connected to the computer, 16-bit A/D and D/A interfaces, a voltage range of from −10 V to +10 V, digital I/O, an incremental encoder interface, an RS232 connector, and SPI and I²C communication interfaces. Matlab/Simulink RTI is used in software development for real-time simulation, and the minimum sampling time is 0.01 ms.
(66) The computer has an Intel Core i7-4770 CPU and 8 GB of memory.
(67) Obviously, the above embodiment is merely an example for illustrating the present invention, and is not intended to limit the implementation of the present invention. Persons of ordinary skill in the art may derive other modifications and variations based on the above embodiment; all embodiments of the present invention are not exhaustively listed herein. Any modification, equivalent replacement, or improvement made without departing from the spirit and principle of the present invention shall fall within the protection scope of the present invention.