Method for extracting gear tooth profile edge based on engagement-pixel image edge tracking method
11371821 · 2022-06-28
CPC classification
B23F21/06
PERFORMING OPERATIONS; TRANSPORTING
Y10T409/101431
GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
G05B2219/35022
PHYSICS
B23F23/1218
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A method for extracting a gear tooth profile edge based on an engagement-pixel image edge tracking method includes defining a transmission ratio relationship between a cutter and an envelope tooth profile, setting a cutter profile step size and an envelope step size, acquiring instantaneous contact images at different engaging times, and performing a binarization processing on each curve envelope cluster image; sweeping a boundary of an envelope curve cluster, acquiring pixel points of the edge; preliminarily tracking a tooth profile edge, and then performing a secondary extraction and compensation on the pixel points; calibrating coordinates of a cutter profile; extracting a pixel coordinate of an instantaneous engaging point; converting the pixel points among different instantaneous engaging images; extracting a final tooth profile coordinate of the gear, and performing a tooth shape error analysis and a contact line error analysis.
Claims
1. A method for extracting a gear tooth profile edge based on an engagement-pixel image edge tracking method, comprising the following steps: step (1): defining a transmission ratio relationship between a cutter and an envelope tooth profile, setting a cutter profile step size and an envelope step size, acquiring instantaneous contact images at different engaging times, and performing a binarization processing on each curve envelope cluster image; step (2): sweeping a boundary of an envelope curve cluster, acquiring pixel points of the edge; wherein, in order to meet a stability requirement during a gear transmission process, a gear tooth profile is a smooth tooth profile with a regular curvature variation; preliminarily tracking a tooth profile edge by means of a traditional edge tracking method, and then performing a secondary extraction and compensation on the pixel points to improve an accuracy of the tooth profile in combination with two main features, namely, a feature of a step-shaped tooth profile pixel edge and a feature of a pixel absence in a position with a small curvature variation between adjacent tooth profiles; step (3): calibrating coordinates of a cutter profile; step (4): extracting a pixel coordinate of an instantaneous engaging point; step (5): converting the pixel points among different instantaneous engaging images; step (6): extracting a final tooth profile coordinate and the tooth profile edge of the gear according to a coordinate transformation process of steps (1) to (5), and performing a tooth shape error analysis and a contact line error analysis.
2. The method for extracting the gear tooth profile edge based on the engagement-pixel image edge tracking method according to claim 1, wherein in step (3), the step of calibrating the coordinates of the cutter profile includes: simulating a forming machining process of a gear conjugate surface to obtain an image of the envelope curve cluster; changing coordinate values between the image of the envelope curve cluster and an image obtained by a binarization; and converting a theoretical value of the cutter into a coordinate value of a pixel to meet a requirement of subsequent processing.
3. The method for extracting the gear tooth profile edge based on the engagement-pixel image edge tracking method according to claim 1, wherein in step (4), the step of extracting the pixel coordinate of the instantaneous engaging point includes: taking the pixel coordinate of the cutter as a basis; combining an engaging point between the cutter and the tooth profile during each instantaneous engaging process; and obtaining the pixel coordinate of the instantaneous engaging point.
4. The method for extracting the gear tooth profile edge based on the engagement-pixel image edge tracking method according to claim 1, wherein in step (5), the step of converting the pixel points among different instantaneous engaging images includes: performing a corresponding coordinate transformation on the pixel points among the different instantaneous engaging images, since each instant corresponds to the different instantaneous engaging pixel images.
5. The method for extracting the gear tooth profile edge based on the engagement-pixel image edge tracking method according to claim 1, wherein in step (6), the gear is a face gear.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE EMBODIMENTS
(10) The present invention takes the face gear, which has a complex conjugate curved surface, as an example for verification and analysis, and the specific steps are as follows.
(11) 1. According to the moving track of a hob cutter and the relevant machining parameters, the instantaneous engaging envelope images at different engaging times are obtained, and the binarization is performed, wherein the instantaneous tooth profile equation of the hob cutter is as follows:
(12)
(13) the instantaneous envelope equation of face gear is as follows:
(14)
(15) in the equations, [x_1, y_1, z_1] = [x_1′(ϕ_s), y_1′(ϕ_s), z_1′(ϕ_s)]; θ_1 is the engaging angle at different engaging moments; θ_2 = θ_1/m_gs, where m_gs is the transmission ratio of the face gear pair; ϕ_s is the envelope angle of the hob cutter, and ϕ_g = ϕ_s/m_gs; h and w are the x-coordinate and the y-coordinate of the cutter, respectively; and R is the radius of the pitch circle of the cylindrical gear.
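The image generation of this step can be sketched as follows; this is a minimal illustration which assumes the envelope points of equations (1) and (2) have already been evaluated into an N×2 array in pixel space (the function name and image size are hypothetical), and shows only the rasterization/binarization:

```python
import numpy as np

def binarize_envelope(points, width, height):
    """Rasterize envelope-curve points (already in pixel space) into a
    binary image: 1 where a curve passes through a pixel, 0 elsewhere."""
    img = np.zeros((height, width), dtype=np.uint8)
    # Round each point to its nearest pixel and clamp to the image bounds.
    cols = np.clip(np.round(points[:, 0]).astype(int), 0, width - 1)
    rows = np.clip(np.round(points[:, 1]).astype(int), 0, height - 1)
    img[rows, cols] = 1
    return img
```

One such binary image would be produced per engaging angle θ_1, giving the set of instantaneous engaging envelope images.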
(16) 2. Regarding the image edge tracking, the gear tooth profiles are all smooth tooth profiles with regular curvature variation. The edge tracking of this type of profile is relatively simple. Consequently, in the present invention, the pixels of the tooth profile edge are preliminarily extracted by the traditional edge tracking method, i.e., the crawler (reptile) method. The pixel points of the gear tooth profile edge have the following two features.
(17) ① Due to the feature of the step-shaped tooth profile pixel edge, the position of the theoretical tooth profile cannot be accurately defined, as shown in
(18) Assuming that the starting pixel point of the first pixel unit is the starting point, the coordinate of the starting point is set as P_11(x_11, y_11). The coordinate of the i-th pixel point of the n-th pixel unit is P_ni(x_ni, y_ni), wherein n = 1, 2, …; i = 1, 2, …; when i = 1, it is the starting pixel point of the n-th pixel unit. The slope between the pixel point P_11 and any pixel point in the n-th pixel unit can be defined as below:
(19)
(20) the position with a maximum slope in each pixel unit is expressed below:
max(k_ni) = [k_21, k_31, …, k_n1] (4);
(21) then the pixel point corresponding to the position with the maximum slope is the starting pixel point in each pixel unit:
[P_21, P_31, …, P_n1] = location(max(k_ni)) (5).
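Equations (3) to (5) can be sketched in code as follows; the sketch assumes the edge pixels have already been grouped into step-shaped "pixel units" (the grouping itself and the function name are illustrative), and picks the pixel of maximum slope to P_11 as each unit's starting pixel:

```python
import numpy as np

def starting_pixels(units, p11):
    """For each pixel unit (a run of edge pixels), pick the pixel whose
    slope to the global starting point P_11 is maximal (eqs. (3)-(5)).
    Assumes no pixel in the units shares its x-coordinate with P_11."""
    x0, y0 = p11
    starts = []
    for unit in units:
        pts = np.asarray(unit, dtype=float)
        slopes = (pts[:, 1] - y0) / (pts[:, 0] - x0)   # eq. (3)
        j = int(np.argmax(slopes))                     # eqs. (4)-(5)
        starts.append((int(pts[j, 0]), int(pts[j, 1])))
    return starts
```

For a rising step edge, the leftmost pixel of each horizontal run has the largest slope back to P_11, which is why the maximum identifies the starting pixel of the unit.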
(22) ② Regarding some of the positions of the tooth profile edge, since adjacent tooth profiles have relatively small variations in curvature, in the process of image binarization, there will be a distribution rule of pixel points as shown in
(23) Assuming that the coordinate of the starting point of an ideal straight line (i.e., the starting pixel point of the n-th pixel unit) is P_n1(x_n1, y_n1), and the coordinate of the terminal point (i.e., the starting pixel point of the (n+1)-th pixel unit) is P_(n+1)1(x_(n+1)1, y_(n+1)1), the equation of an implicit function of the straight line is as follows:
(24)
(25) wherein the value of b can be solved when P_n1(x_n1, y_n1) is substituted into the equation. The compensation principle for the amount of compensated pixel points between the two adjacent starting pixel points is as follows. Assuming that
D_n = |y_(n+1)1 − y_n1| (7),
(26) the amount of compensated pixel points between the adjacent starting pixel points is D_n − 1. For the coordinate of the compensated pixel point, the corresponding x_ni can be obtained by substituting y_n1+1, y_n1+2, …, y_n1+D_n−1 into equation (8):
(27)
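The compensation rule of equations (6) to (8) can be sketched as below: when two adjacent starting pixels differ by D_n rows, D_n − 1 pixels are inserted on the ideal straight line between them (the function name is illustrative):

```python
def compensate(p_a, p_b):
    """Insert D_n - 1 pixel points between adjacent starting pixels
    P_n1 and P_(n+1)1 along the straight line joining them (eqs. (6)-(8))."""
    (xa, ya), (xb, yb) = p_a, p_b
    d = abs(yb - ya)                    # D_n, eq. (7)
    if d <= 1:
        return []                       # no gap to fill
    step = 1 if yb > ya else -1
    pts = []
    for k in range(1, d):
        y = ya + step * k
        # x from the line through the two starting pixels, rounded to a pixel
        x = xa + (xb - xa) * (y - ya) / (yb - ya)
        pts.append((int(round(x)), y))
    return pts
```

Applying this between every pair of adjacent starting pixels fills the positions where the binarization left no pixel because the curvature variation was small.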
(28) 3. The techniques involved in the process of calibrating the coordinate of the cutter profile mainly include:
(29) ① the proportional scaling and translation transformation between the theoretical coordinate and the pixel coordinate; the coordinate transformation matrix is as follows:
(30)
(31) wherein Dx and Dy are the magnification ratios of the coordinates, and Δx and Δy are the translation displacements;
(32) ② the spline interpolation of the transformed pixel coordinates; namely, the X value of the pixel coordinate of the cutter is taken as an independent variable with a predetermined step size, and a spline interpolation is performed on the Y value of the pixel coordinate of the cutter profile.
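The two calibration sub-steps can be sketched as follows; the scaling/translation mirrors the transformation described above, while np.interp stands in (as a linear approximation) for the patent's spline interpolation, and all parameter names are illustrative:

```python
import numpy as np

def calibrate(theory_xy, Dx, Dy, dx, dy):
    """Map theoretical cutter coordinates to pixel coordinates:
    X = Dx*x + dx, Y = Dy*y + dy (scaling then translation)."""
    xy = np.asarray(theory_xy, dtype=float)
    return np.column_stack((Dx * xy[:, 0] + dx, Dy * xy[:, 1] + dy))

def resample(pix_xy, x_new):
    """Resample the calibrated profile at a fixed X step.
    np.interp is a linear stand-in for the spline interpolation."""
    xy = np.asarray(pix_xy, dtype=float)
    order = np.argsort(xy[:, 0])        # np.interp needs increasing X
    return np.interp(x_new, xy[order, 0], xy[order, 1])
```

A cubic spline (e.g. scipy.interpolate.CubicSpline) could replace np.interp when the smoother interpolation of the patent is required.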
(33) 4. Since there is only one engaging point at each engaging moment when the cutter is engaged with the gear tooth profile (the left and right tooth profiles each have one engaging point), the instantaneous engaging point of the cutter and the gear tooth profile is the coordinate coincidence point. In the pixel image, as shown in
(34) ① Horizontal approximation along the x-axis. The X-coordinate step size of the calibrated cutter is defined, and the corresponding interpolated Y is obtained. Then, the interpolated Y is rounded to an integer to obtain the coordinate of the cutter in pixel coordinates. The edge pixel point is horizontally translated by a displacement L until the nearest pixel point coincides with the pixel coordinate of the cutter (i.e., the position of minimal displacement L_min), and this point is the instantaneous engaging point, as shown in
(35) ② Approximation along the normal direction. The X-coordinate step size of the calibrated cutter is defined, and the corresponding interpolated coordinate Y is calculated to obtain the coordinate of the cutter in pixel coordinates. The image edge pixel points are translated along the normal direction until the nearest pixel point coincides with the pixel coordinate of the cutter, and this point is the instantaneous engaging point, as shown in
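The horizontal approximation ① can be sketched as below; the sketch assumes at most one cutter pixel per image row, and the function and variable names are illustrative:

```python
def engaging_point(edge_pts, cutter_pts):
    """Horizontal approximation: among edge pixels sharing a row (Y) with
    a cutter pixel, the engaging point is the pair with the smallest
    horizontal displacement L (the position L_min)."""
    cutter = {y: x for x, y in cutter_pts}   # one cutter pixel per row
    best, best_l = None, None
    for x, y in edge_pts:
        if y in cutter:
            l = abs(x - cutter[y])           # horizontal displacement L
            if best_l is None or l < best_l:
                best, best_l = (cutter[y], y), l
    return best, best_l
```

The normal-direction variant ② would differ only in measuring the displacement along the local normal of the edge rather than along the x-axis.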
(36) 5. In the process of converting the pixel points among different instantaneous engaging images:
(37) ① when the transmission mode is a gear transmission between intersecting axes, by extracting the displacement variation of the node position in each instantaneous engaging image, the movement displacement of each instantaneous engaging image can be obtained and the coordinate transformation matrix is established; the coordinate transformation matrix is as below:
(38)
(39) ② when the transmission mode is a gear transmission between parallel axes, the angular displacement of each instantaneous engaging image can be obtained according to the rotation angle variation of the node position in each instantaneous engaging image, and the coordinate transformation matrix is obtained:
(40)
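The two cases above can be sketched with homogeneous 3×3 matrices, a translation for the intersecting-axis case and a rotation for the parallel-axis case; the patent's exact matrices are not reproduced here, so this is only an assumed standard form:

```python
import numpy as np

def translate(dx, dy):
    """Homogeneous translation matrix (intersecting-axis transmission)."""
    return np.array([[1, 0, dx], [0, 1, dy], [0, 0, 1]], dtype=float)

def rotate(theta):
    """Homogeneous rotation matrix (parallel-axis transmission)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def transform(points, M):
    """Apply a 3x3 homogeneous transform to an Nx2 array of pixel points."""
    pts = np.asarray(points, dtype=float)
    homog = np.column_stack((pts, np.ones(len(pts))))
    return (homog @ M.T)[:, :2]
```

Composing such matrices per engaging instant maps the pixel points of every instantaneous engaging image into one common frame.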
(41) 6. Assuming the theoretical coordinate of the cutter is P′(x, y) and the pixel coordinate of the engaging point is P(X, Y), the final pixel coordinate of the tooth profile edge can be obtained by the following equation:
(42)
(43) Taking the face gear as an example, the method provided by the present invention can be used to analyze the tooth profile of the face gear (as shown in