Aspheric lens eccentricity detecting device based on wavefront technology and detecting method thereof
11506567 · 2022-11-22
Assignee
Inventors
- Guohua Shi (Jiangsu Province, CN)
- Yi He (Sichuan Province, CN)
- Feng Gao (Jiangxi Province, CN)
- Lina Xing (Jilin Province, CN)
- Xin Zhang (Jiangsu Province, CN)
- Wen Kong (Shandong Province, CN)
CPC classification
International classification
Abstract
The present invention discloses an aspheric lens eccentricity detecting device based on wavefront technology and a detecting method thereof. The device comprises: an upper optical fiber light source, an upper collimating objective lens, an upper light source spectroscope, an upper beam-contracting front lens, an upper beam-contracting rear lens, an upper imaging detector, an upper imaging spectroscope, an upper wavefront sensor, a lens-under-detection clamping mechanism, a lower light source spectroscope, a lower beam-contracting front lens, a lower beam-contracting rear lens, a lower imaging spectroscope, a lower wavefront sensor, a lower imaging detector, a lower collimating objective lens and a lower optical fiber light source. The present invention achieves non-contact detection with no risk of damaging the lens, and the device contains no moving parts, so system reliability and stability are high. In addition, various eccentricity errors within the effective aperture of the aspheric lens can be detected at one time, thereby avoiding errors caused by splicing detection and greatly reducing the detection time, making the method applicable to online detection on an assembly line.
Claims
1. A detecting method for an aspheric lens eccentricity detecting device based on wavefront technology, wherein the aspheric lens eccentricity detection device comprises: an upper optical fiber light source, an upper collimating objective lens, an upper light source spectroscope, an upper beam-contracting front lens, an upper beam-contracting rear lens, an upper imaging detector, an upper imaging spectroscope, an upper wavefront sensor, a lens-under-detection clamping mechanism, a lower light source spectroscope, a lower beam-contracting front lens, a lower beam-contracting rear lens, a lower imaging spectroscope, a lower wavefront sensor, a lower imaging detector, a lower collimating objective lens and a lower optical fiber light source, wherein light emitted by the upper optical fiber light source is collimated by the upper collimating objective lens, then is transmitted through the upper light source spectroscope, and becomes incident to an upper surface of a lens under detection on the lens-under-detection clamping mechanism; reflected light on the upper surface of the lens under detection is reflected by the upper light source spectroscope, then is subjected to aperture matching by the upper beam-contracting front lens and the upper beam-contracting rear lens successively, and reaches the upper imaging spectroscope, and the light is divided into two parts after reaching the upper imaging spectroscope, one part thereof being reflected by the upper imaging spectroscope and entering the upper imaging detector, and the other part thereof being transmitted through the upper imaging spectroscope and entering the upper wavefront sensor; the upper imaging detector acquires an image formed by the reflected light on the upper surface of the lens under detection, and processes a variable-curvature ring image in the acquired image to obtain an optical axis center position of the upper surface of the lens under detection; and the upper wavefront sensor acquires distortion 
information of the reflected light on the upper surface of the lens under detection, and processes the distortion information to obtain tilt information of the upper surface of the lens under detection; and wherein light emitted by the lower optical fiber light source is collimated by the lower collimating objective lens, then is transmitted through the lower light source spectroscope, and becomes incident to a lower surface of the lens under detection on the lens-under-detection clamping mechanism; reflected light on the lower surface of the lens under detection is reflected by the lower light source spectroscope, then is subjected to aperture matching by the lower beam-contracting front lens and the lower beam-contracting rear lens successively, and reaches the lower imaging spectroscope, and the light is divided into two parts after reaching the lower imaging spectroscope, one part thereof being reflected by the lower imaging spectroscope and entering the lower imaging detector, and the other part thereof being transmitted through the lower imaging spectroscope and entering the lower wavefront sensor; the lower imaging detector acquires an image formed by the reflected light on the lower surface of the lens under detection, and processes a variable-curvature ring image in the acquired image to obtain an optical axis center position of the lower surface of the lens under detection; and the lower wavefront sensor acquires distortion information of the reflected light on the lower surface of the lens under detection, and processes the distortion information to obtain tilt information of the lower surface of the lens under detection, wherein the detecting method comprises the following steps: step S1: turning on the upper optical fiber light source, the upper imaging detector and the upper wavefront sensor simultaneously, and adjusting the lens-under-detection clamping mechanism according to an image on the upper imaging detector, to adjust the position of the lens 
under detection to an imaging center area of the upper imaging detector; step S2: acquiring an image on the upper wavefront sensor and processing the wavefront image to obtain an amount of tilt (p.sub.x, p.sub.y) of the upper surface of the lens under detection; step S3: acquiring a pupil image (J.sub.p) on the upper imaging detector, and processing the pupil image to obtain an optical axis center position (O.sub.x, O.sub.y) of the upper surface of the lens under detection; step S4: turning off the upper optical fiber light source, turning on the lower optical fiber light source, acquiring a pupil image (I.sub.p) on the upper imaging detector, and performing calculation processing on the pupil image (I.sub.p) by using the method of step S3 to obtain an outer diameter center position (d.sub.x, d.sub.y) of the upper surface of the lens under detection; step S5: turning on the lower imaging detector and the lower wavefront sensor, acquiring an image on the lower wavefront sensor, and performing calculation processing according to component parameters of the lower wavefront sensor by using the method of step S2 to obtain an amount of tilt (p.sub.x′, p.sub.y′) of the lower surface of the lens under detection; step S6: acquiring a pupil image (J.sub.p′) on the lower imaging detector, and performing calculation processing on the pupil image (J.sub.p′) by using the method of step S3 to obtain an optical axis center position (O.sub.x′, O.sub.y′) of the lower surface of the lens under detection; step S7: turning off the lower optical fiber light source, turning on the upper optical fiber light source, acquiring a pupil image (I.sub.p′) on the lower imaging detector, and performing calculation processing on the pupil image (I.sub.p′) by using the method of step S3 to obtain an outer diameter center position (d.sub.x′, d.sub.y′) of the lower surface of the lens under detection; step S8: subtracting the amount of tilt of the upper surface of the lens under detection obtained in
step S2 from the amount of tilt of the lower surface obtained in step S5 to obtain face-wise tilt eccentricity of the upper and lower surfaces of the lens under detection: p=(p.sub.x′, p.sub.y′)−(p.sub.x, p.sub.y).
2. The aspheric lens eccentricity detecting method according to claim 1, wherein the light emitted by the lower optical fiber light source is collimated by the lower collimating objective lens to form a parallel light beam, which is transmitted through the lower light source spectroscope, becomes incident to the lens under detection on the lens-under-detection clamping mechanism, and is reflected by the upper light source spectroscope after passing through the lens under detection, and then the light reflected therefrom is subjected to aperture matching by the upper beam-contracting front lens and the upper beam-contracting rear lens successively, and reaches the upper imaging spectroscope; and part of the light is reflected by the upper imaging spectroscope and reaches the upper imaging detector to form a light transmittance image of the lens under detection, and an outer edge image of the lens under detection in the formed image is processed to obtain the outer diameter center position of the upper surface of the lens under detection.
3. The aspheric lens eccentricity detecting method according to claim 2, wherein the light emitted by the upper optical fiber light source is collimated by the upper collimating objective lens to form a parallel light beam, which is transmitted through the upper light source spectroscope, becomes incident to the lens under detection on the lens-under-detection clamping mechanism, and is reflected by the lower light source spectroscope after passing through the lens under detection, and then the light reflected therefrom is subjected to aperture matching by the lower beam-contracting front lens and the lower beam-contracting rear lens successively, and reaches the lower imaging spectroscope; and part of the light is reflected by the lower imaging spectroscope and reaches the lower imaging detector to form a light transmittance image of the lens under detection, and an outer edge image of the lens under detection in the formed image is processed to obtain the outer diameter center position of the lower surface of the lens under detection.
4. The aspheric lens eccentricity detecting method according to claim 3, wherein the positions of the upper imaging detector and the upper wavefront sensor in the optical path are both conjugate with the upper surface of the lens under detection; and the positions of the lower imaging detector and the lower wavefront sensor in the optical path are both conjugate with the lower surface of the lens under detection.
5. The aspheric lens eccentricity detecting method according to claim 4, wherein the optical axis center position information of the upper surface, the tilt information of the upper surface, the outer diameter center position information of the upper surface, the optical axis center position information of the lower surface, the outer diameter center position information of the lower surface, and the tilt information of the lower surface obtained for the lens under detection are subjected to comprehensive processing, to finally obtain data of the face-wise translation eccentricity of the upper and lower surfaces, the face-wise tilt eccentricity of the upper and lower surfaces, the outer diameter eccentricity of the upper surface, and the outer diameter eccentricity of the lower surface, of the lens under detection, thereby measuring eccentricity error information of the lens under detection.
6. The aspheric lens eccentricity detecting method according to claim 4, wherein the upper and/or lower wavefront sensor is a Hartmann wavefront sensor or a shearing interference wavefront sensor or a quadrangular pyramid wavefront sensor.
7. The aspheric lens eccentricity detecting method according to claim 5, wherein the optical axis center position information of the upper surface, the tilt information of the upper surface, the outer diameter center position information of the upper surface, the optical axis center position information of the lower surface, the outer diameter center position information of the lower surface, and the tilt information of the lower surface obtained for the lens under detection are subjected to comprehensive processing, to finally obtain data of the face-wise translation eccentricity of the upper and lower surfaces, the face-wise tilt eccentricity of the upper and lower surfaces, the outer diameter eccentricity of the upper surface, and the outer diameter eccentricity of the lower surface, of the lens under detection, thereby measuring eccentricity error information of the lens under detection.
8. The aspheric lens eccentricity detecting method according to claim 3, wherein the optical axis center position information of the upper surface, the tilt information of the upper surface, the outer diameter center position information of the upper surface, the optical axis center position information of the lower surface, the outer diameter center position information of the lower surface, and the tilt information of the lower surface obtained for the lens under detection are subjected to comprehensive processing, to finally obtain data of the face-wise translation eccentricity of the upper and lower surfaces, the face-wise tilt eccentricity of the upper and lower surfaces, the outer diameter eccentricity of the upper surface, and the outer diameter eccentricity of the lower surface, of the lens under detection, thereby measuring eccentricity error information of the lens under detection.
9. The aspheric lens eccentricity detecting method based on wavefront technology according to claim 1, wherein step S2 specifically comprises: step S21: acquiring a wavefront image on the upper wavefront sensor; step S22: detecting effective sub-apertures of an outermost ring in the wavefront image, after removing sub-apertures of an outermost circle and an innermost circle thereof, the remaining sub-apertures being referred to as effective sub-apertures; and letting the number of the effective sub-apertures be L, and calculating the slope of each effective sub-aperture, denoted by (g.sub.xi, g.sub.yi) wherein i=1, 2, . . . , L; step S23: calculating the average slope of the effective sub-apertures: g.sub.x=(g.sub.x1+g.sub.x2+ . . . +g.sub.xL)/L and g.sub.y=(g.sub.y1+g.sub.y2+ . . . +g.sub.yL)/L.
10. The aspheric lens eccentricity detecting method based on wavefront technology according to claim 9, wherein step S3 specifically comprises: step S31: acquiring a pupil image (J.sub.p) on the upper imaging detector; step S32: binarizing the pupil image (J.sub.p) to obtain a binarized image (J.sub.p2), wherein a binarization threshold is set by manually specifying a threshold or using an automatic threshold calculation method; step S33: performing edge extraction on the binarized image (J.sub.p2) to obtain an image (J.sub.p3); and step S34: performing a circle Hough transformation on the image (J.sub.p3) to obtain a circle, and extracting the center of the circle, denoted as (O.sub.x, O.sub.y), which is the optical axis center position of the upper surface of the lens under detection.
11. The aspheric lens eccentricity detecting method based on wavefront technology according to claim 10, wherein edge extraction is performed on the binarized image (J.sub.p2) by using a Sobel operator or a Laplacian operator or a Canny operator in step S33.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
REFERENCE NUMERALS
(14) upper optical fiber light source 1, upper collimating objective lens 2, upper light source spectroscope 3, upper beam-contracting front lens 4, upper beam-contracting rear lens 5, upper imaging detector 6, upper imaging spectroscope 7, upper wavefront sensor 8, lens-under-detection clamping mechanism 9, lower light source spectroscope 10, lower beam-contracting front lens 11, lower beam-contracting rear lens 12, lower imaging spectroscope 13, lower wavefront sensor 14, lower imaging detector 15, lower collimating objective lens 16, lower optical fiber light source 17.
DETAILED DESCRIPTION OF THE EMBODIMENTS
(15) The present invention will be further described in detail below in conjunction with the embodiments, so that those skilled in the art can carry it out with reference to the description.
(16) It should be understood that, as used herein, the terms such as “have”, “include” and “comprise” do not preclude the presence or addition of one or more other elements or combinations thereof.
(17) As shown in
(18) The wavefront sensor is a Hartmann wavefront sensor or a shearing interference wavefront sensor or a quadrangular pyramid wavefront sensor. Preferably, a Hartmann wavefront sensor is adopted in the embodiment.
(19) Light emitted by the upper optical fiber light source 1 is collimated by the upper collimating objective lens 2 to form a parallel light beam, which is transmitted through the upper light source spectroscope 3 and reaches the lens-under-detection clamping mechanism 9, and an upper surface of the lens under detection clamped on the lens-under-detection clamping mechanism 9 reflects the incident parallel light, and the reflected light is reflected by the upper light source spectroscope 3 and then is subjected to aperture matching by the upper beam-contracting front lens 4 and the upper beam-contracting rear lens 5, and reaches the upper imaging spectroscope 7. The light is divided into two parts after reaching the upper imaging spectroscope 7, one part thereof being reflected by the upper imaging spectroscope 7 and entering the upper imaging detector 6, and the other part thereof being transmitted through the upper imaging spectroscope 7 and entering the upper wavefront sensor 8. The positions of the upper imaging detector 6 and the upper wavefront sensor 8 in the optical path are both conjugate with the upper surface of the lens under detection. The upper imaging detector 6 acquires an image formed by the reflected light on the upper surface of the lens under detection, and processes a variable-curvature ring image in the acquired image to obtain an optical axis center position of the upper surface of the lens under detection; and the upper wavefront sensor 8 acquires distortion information of the reflected light on the upper surface of the lens under detection, and processes the distortion information to obtain tilt information of the upper surface of the lens under detection.
(20) Light emitted by the lower optical fiber light source 17 is collimated by the lower collimating objective lens 16 to form a parallel light beam, which is transmitted through the lower light source spectroscope 10, then reaches the lens-under-detection clamping mechanism 9, and is transmitted through the lens under detection clamped on the lens-under-detection clamping mechanism 9 and reflected by the upper light source spectroscope 3, and the reflected light, after being reflected by the upper light source spectroscope 3, is subjected to aperture matching by the upper beam-contracting front lens 4 and the upper beam-contracting rear lens 5, and reaches the upper imaging spectroscope 7. Part of the light that is reflected by the upper imaging spectroscope 7 is detected by the upper imaging detector 6 to form a light transmittance image of the lens under detection, and an outer edge image of the lens under detection in the formed image is processed to obtain an outer diameter center position of the upper surface of the lens under detection.
(21) Light emitted by the lower optical fiber light source 17 is collimated by the lower collimating objective lens 16 to form a parallel light beam, which is transmitted through the lower light source spectroscope 10 and reaches the lens-under-detection clamping mechanism 9, and a lower surface of the lens under detection clamped on the lens-under-detection clamping mechanism 9 reflects the incident parallel light, and the reflected light is reflected by the lower light source spectroscope 10 and then is subjected to aperture matching by the lower beam-contracting front lens 11 and the lower beam-contracting rear lens 12, and reaches the lower imaging spectroscope 13. The light is divided into two parts after reaching the lower imaging spectroscope 13, one part thereof being reflected by the lower imaging spectroscope 13 and entering the lower imaging detector 15, and the other part thereof being transmitted through the lower imaging spectroscope 13 and entering the lower wavefront sensor 14. The positions of the lower imaging detector 15 and the lower wavefront sensor 14 in the optical path are both conjugate with the lower surface of the lens under detection. The lower imaging detector 15 acquires an image formed by the reflected light on the lower surface of the lens under detection, and processes a variable-curvature ring image in the acquired image to obtain an optical axis center position of the lower surface of the lens under detection; and the lower wavefront sensor 14 acquires distortion information of the reflected light on the lower surface of the lens under detection, and processes the distortion information to obtain tilt information of the lower surface of the lens under detection.
(22) Light emitted by the upper optical fiber light source 1 is collimated by the upper collimating objective lens 2 to form a parallel light beam, which is transmitted through the upper light source spectroscope 3, then reaches the lens-under-detection clamping mechanism 9, and is transmitted through the lens under detection clamped on the lens-under-detection clamping mechanism 9 and reflected by the lower light source spectroscope 10, and the reflected light, after being reflected by the lower light source spectroscope 10, is subjected to aperture matching by the lower beam-contracting front lens 11 and the lower beam-contracting rear lens 12, and reaches the lower imaging spectroscope 13. Part of the light that is reflected by the lower imaging spectroscope 13 is detected by the lower imaging detector 15 to form a light transmittance image of the lens under detection, and an outer edge image of the lens under detection in the formed image is processed to obtain an outer diameter center position of the lower surface of the lens under detection.
(23) Optical axis center position information of the upper surface, tilt information of the upper surface, outer diameter center position information of the upper surface, optical axis center position information of the lower surface, tilt information of the lower surface, and outer diameter center position information of the lower surface obtained for the lens under detection as described above are subjected to comprehensive processing to finally obtain data of face-wise translation eccentricity of the upper and lower surfaces, face-wise tilt eccentricity of the upper and lower surfaces, outer diameter eccentricity of the upper surface, and outer diameter eccentricity of the lower surface, of the lens under detection, thereby measuring eccentricity error information of the lens under detection.
(24) The embodiment further provides a detecting method for the aspheric lens eccentricity detecting device based on wavefront technology, including the following steps:
(25) step S1: turning on the upper optical fiber light source 1, the upper imaging detector 6 and the upper wavefront sensor 8 simultaneously, and adjusting the lens-under-detection clamping mechanism 9 according to an image on the upper imaging detector 6, to adjust the position of the lens under detection to an imaging center area of the upper imaging detector 6;
(26) step S2: acquiring an image on the upper wavefront sensor 8 and processing the wavefront image to obtain an amount of tilt (p.sub.x, p.sub.y) of the upper surface of the lens under detection;
(27) step S3: acquiring a pupil image J.sub.p on the upper imaging detector 6, and processing the pupil image to obtain an optical axis center position (O.sub.x, O.sub.y) of the upper surface of the lens under detection;
(28) step S4: turning off the upper optical fiber light source 1, turning on the lower optical fiber light source 17, acquiring a pupil image I.sub.p on the upper imaging detector 6, and performing calculation processing on the pupil image I.sub.p by using the method of step S3 to obtain an outer diameter center position (d.sub.x, d.sub.y) of the upper surface of the lens under detection;
(29) step S5: turning on the lower imaging detector 15 and the lower wavefront sensor 14, acquiring an image on the lower wavefront sensor 14, and performing calculation processing according to component parameters of the lower wavefront sensor by using the method of step S2 to obtain an amount of tilt (p.sub.x′, p.sub.y′) of the lower surface of the lens under detection;
(30) step S6: acquiring a pupil image J.sub.p′ on the lower imaging detector 15, and performing calculation processing on the pupil image J.sub.p′ by using the method of step S3 to obtain an optical axis center position (O.sub.x′, O.sub.y′) of the lower surface of the lens under detection;
(31) step S7: turning off the lower optical fiber light source 17, turning on the upper optical fiber light source 1, acquiring a pupil image I.sub.p′ on the lower imaging detector 15, and performing calculation processing on the pupil image I.sub.p′ by using the method of step S3 to obtain an outer diameter center position (d.sub.x′, d.sub.y′) of the lower surface of the lens under detection;
(32) step S8: subtracting the amount of tilt of the upper surface of the lens under detection obtained in step S2 from the amount of tilt of the lower surface obtained in step S5 to obtain face-wise tilt eccentricity of the lens under detection: p=(p.sub.x′, p.sub.y′)−(p.sub.x, p.sub.y);
(33) subtracting the optical axis center position of the upper surface obtained in step S3 from the optical axis center position of the lower surface obtained in step S6, to obtain face-wise translation eccentricity of the lens under detection: O=(O.sub.x′, O.sub.y′)−(O.sub.x, O.sub.y);
(34) step S9: according to the optical axis center position (O.sub.x, O.sub.y) of the upper surface obtained in step S3 and the outer diameter center position (d.sub.x, d.sub.y) of the upper surface obtained in step S4, obtaining outer diameter eccentricity of the upper surface of the lens under detection:
(35) D=K·p·((d.sub.x, d.sub.y)−(O.sub.x, O.sub.y)),
(36) wherein the parameter K is a magnification of a beam-contracting-expanding optical system composed of the upper beam-contracting front lens 4 and the upper beam-contracting rear lens 5, and the parameter p is the pixel size of the upper imaging detector 6; and
(37) step S10: according to the optical axis center position (O.sub.x′, O.sub.y′) of the lower surface obtained in step S6 and the outer diameter center position (d.sub.x′, d.sub.y′) of the lower surface obtained in step S7, obtaining outer diameter eccentricity of the lower surface of the lens under detection:
(38) D′=k′·p′·((d.sub.x′, d.sub.y′)−(O.sub.x′, O.sub.y′)),
(39) wherein the parameter k′ is a magnification of a beam-contracting-expanding optical system composed of the lower beam-contracting front lens 11 and the lower beam-contracting rear lens 12, and the parameter p′ is the pixel size of the lower imaging detector 15.
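The arithmetic of steps S8 through S10 can be sketched as follows. The measured positions, tilt amounts, magnifications and pixel sizes below are hypothetical placeholder values, not data from the embodiment, and the scaling of the pixel offsets by the magnification and pixel size is a reconstruction from the parameter definitions given in the text.

```python
import numpy as np

# Hypothetical measured values; symbols follow steps S2-S7 (positions in
# detector pixels, magnifications dimensionless, pixel sizes in mm).
p_upper = np.array([0.010, 0.020])   # tilt of upper surface (p_x, p_y), step S2
p_lower = np.array([0.018, 0.032])   # tilt of lower surface (p_x', p_y'), step S5
O_upper = np.array([512.0, 498.0])   # optical-axis center (O_x, O_y), step S3
O_lower = np.array([509.0, 503.0])   # optical-axis center (O_x', O_y'), step S6
d_upper = np.array([515.0, 500.0])   # outer-diameter center (d_x, d_y), step S4
d_lower = np.array([508.0, 501.0])   # outer-diameter center (d_x', d_y'), step S7

K, pix = 4.0, 0.0055    # upper path: beam-contracting magnification, pixel size (mm)
K2, pix2 = 4.0, 0.0055  # lower path: the corresponding quantities k', p'

# Step S8: face-wise tilt and translation eccentricity (lower minus upper).
tilt_ecc = p_lower - p_upper
trans_ecc = O_lower - O_upper

# Steps S9 and S10: outer diameter eccentricity of each surface, scaled from
# pixels to millimetres by magnification times pixel size (reconstructed scaling).
ecc_upper = K * pix * (d_upper - O_upper)
ecc_lower = K2 * pix2 * (d_lower - O_lower)
```

With these placeholder numbers, `tilt_ecc` is (0.008, 0.012) and `ecc_upper` is (0.066, 0.044) mm; the subtraction order (lower minus upper) matches steps S8 through S10.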
(40) Step S2 specifically further includes:
(41) step S21: acquiring a wavefront image on the upper wavefront sensor 8;
(42) step S22: detecting effective sub-apertures of the outermost ring in the wavefront image, after removing sub-apertures of the outermost circle and the innermost circle thereof, the remaining sub-apertures being referred to as effective sub-apertures; and letting the number of the effective sub-apertures be L, and calculating the slope of each effective sub-aperture, denoted by (g.sub.xi, g.sub.yi), wherein i=1, 2, . . . , L;
(43) step S23: calculating the average slope of the effective sub-apertures:
(44) g.sub.x=(g.sub.x1+g.sub.x2+ . . . +g.sub.xL)/L and g.sub.y=(g.sub.y1+g.sub.y2+ . . . +g.sub.yL)/L;
and
(45) step S24: calculating the amount of tilt of the upper surface of the lens under detection:
(46) (p.sub.x, p.sub.y)=(K·p/q)·(g.sub.x, g.sub.y),
(47) wherein the parameter K is a magnification of a beam-contracting-expanding optical system composed of the upper beam-contracting front lens 4 and the upper beam-contracting rear lens 5, the parameter p is the pixel size of a detecting camera of the upper wavefront sensor 8, and the parameter q is the sub-aperture size of the upper wavefront sensor 8.
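A minimal sketch of the slope averaging and tilt scaling of steps S21 through S24 follows. The slope values and the parameters K, p and q are hypothetical, and the K·p/q scaling is a reconstruction from the parameter definitions given for step S24.

```python
import numpy as np

def surface_tilt(slopes, K, p, q):
    """Average the per-sub-aperture slopes (g_xi, g_yi) over the L effective
    sub-apertures (step S23), then scale the average by K*p/q to obtain the
    surface tilt (p_x, p_y) of step S24 (scaling reconstructed from the
    stated parameters: magnification K, camera pixel size p, sub-aperture
    size q)."""
    g_mean = slopes.mean(axis=0)   # (g_x, g_y): average slope, step S23
    return K * p / q * g_mean      # (p_x, p_y): amount of tilt, step S24

# Slopes for L = 4 hypothetical effective sub-apertures, in pixels.
slopes = np.array([[1.0, 2.0], [3.0, 2.0], [1.0, 4.0], [3.0, 4.0]])
tilt = surface_tilt(slopes, K=4.0, p=0.0055, q=0.150)
```

The same routine serves step S5 for the lower surface, with the component parameters of the lower wavefront sensor substituted for K, p and q.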
(48) Step S3 specifically further includes:
(49) step S31: acquiring a pupil image J.sub.p on the upper imaging detector 6;
(50) step S32: binarizing the pupil image J.sub.p to obtain a binarized image J.sub.p2, wherein a binarization threshold is set by manually specifying a threshold or using an automatic threshold calculation method;
(51) step S33: performing edge extraction on the binarized image J.sub.p2 to obtain an image J.sub.p3 by using a Sobel operator, a Laplacian operator, a Canny operator, or the like; and
(52) step S34: performing a circle Hough transformation on the image J.sub.p3 to obtain a circle, and extracting the center of the circle, denoted as (O.sub.x, O.sub.y), which is the optical axis center position of the upper surface of the lens under detection.
(53) Still further, in an embodiment, the detection result is: the tilt wavefront image of the upper surface obtained by the upper wavefront sensor 8 in step S2 is as shown in
(54) referring to
(55) referring to
(56) in step S9, the outer diameter eccentricity of the upper surface is (−0.755 degrees, −2.016 degrees);
(57) referring to
(58) referring to
(59) referring to
(60) in step S8, the face-wise tilt eccentricity of the upper and lower surfaces is (0.0082 mm, 0.0118 mm), and the overall eccentricity is 0.0144 mm; and the face-wise translation eccentricity of the upper and lower surfaces is (−0.3444 mm, 0.7145 mm).
(61) Although the embodiments of the present invention have been disclosed above, they are not limited to those listed in the specification and the embodiments section, and they are absolutely applicable to various fields suitable for the present invention. Other modifications may be readily made by those skilled in the art. Thus, the present invention is not limited to the specific details if such modifications do not depart from the general concept defined by the claims and equivalents thereof.