Image inspection device, image inspection method, and image inspection device component
11158034 · 2021-10-26
Assignee
Inventors
CPC classification
G01M11/00
PHYSICS
A61B3/10
HUMAN NECESSITIES
G02B17/00
PHYSICS
A61B3/14
HUMAN NECESSITIES
H04N5/64
ELECTRICITY
International classification
Abstract
An image inspection device includes a mounting unit on which an image projection device that directly projects an image on a retina of a user is to be mounted; a condensing lens that condenses a light beam emitted from the image projection device mounted on the mounting unit; a detector on which an inspection image is projected and detected; and a controller that inspects the inspection image detected by the detector. The image inspection device inspects the image projected by the image projection device, which directly projects the image on the retina of the user.
Claims
1. An image inspection device comprising: a mounter on which an image projection device that directly projects an image on a retina of a user is to be mounted; a condensing lens configured to condense a light beam emitted from the image projection device mounted on the mounter; a detector on which an inspection image is to be projected by irradiation with the light beam condensed by the condensing lens and is configured to detect the inspection image; and a controller configured to inspect the inspection image detected by the detector, wherein the detector is movable in a direction vertical to a plane of the detector, and the controller is configured to measure a size of a region of convergence of the light beam by identifying a position of the detector and a size of the inspection image at the position as the detector moves.
2. The image inspection device according to claim 1, wherein the detector has a planar shape.
3. An image inspection method comprising: projecting an inspection image on a detector by causing a light beam forming the inspection image to be emitted from an image projection device that directly projects an image on a retina of a user, causing the light beam to pass through a condensing lens, and irradiating the detector with the light beam; detecting the inspection image by the detector; inspecting the inspection image detected by the detector; identifying a position of the detector, which detects an image, and a size of the inspection image detected at the position as the detector moves, the detector being movable in a direction vertical to a plane of the detector; and measuring a size of a region of convergence of the light beam by an identified position of the detector and an identified size of the inspection image.
4. The image inspection method according to claim 3, wherein the detector has a planar shape.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
EMBODIMENTS FOR CARRYING OUT THE INVENTION
(33) Hereinafter, embodiments of the present invention will be described with reference to drawings.
First Embodiment
(35) An image projection device 30, as a test object, that directly projects an image on the retina of the user is mounted on the mounting unit 10. Here, an example of the image projection device 30 is described with use of
(36) As illustrated in
(37) The scanning unit 38 is arranged in the temple 46 of the spectacle type frame. The scanning unit 38 scans the light beam 50 emitted from the light source 32 in the horizontal direction and the vertical direction. The scanning unit 38 is, for example, a Micro Electro Mechanical System (MEMS) mirror. The light beam 50 emitted from the light source 32 is reflected by the mirror 34 and the mirror 36, and then enters the scanning unit 38.
(38) A scanning light formed of the light beam 50 scanned by the scanning unit 38 is reflected by the mirror 40 toward a lens 48 of the spectacle type frame. The projection unit 42 is arranged on the surface closer to the eye ball 90 of the lens 48. Accordingly, the light beam 50 scanned by the scanning unit 38 enters the projection unit 42. The projection unit 42 is a half mirror having a free curved surface or a half mirror having a composite structure of a free curved surface and a diffraction surface. Thus, the scanning light formed of the light beam 50 that has entered the projection unit 42 converges near a pupil 94 of the eye ball 90 and is then emitted to the retina 92. This allows the user to recognize the image formed of the light beam 50 and visually recognize an external world image through the lens.
(39) The control unit 44 is composed of a processor such as a Central Processing Unit (CPU) and memories such as a Random Access Memory (RAM) and a Read Only Memory (ROM). The processor operates according to programs stored in the memories and controls the entire image projection device 30, for example by controlling the light source 32 so that the light beam 50 based on input image data is emitted from the light source 32. The processor and the memories may be provided to the spectacle type frame, or may be provided to an external device such as a mobile terminal.
(41) The target projection unit 14 is located near the condensing spot that the condensing lens 12 forms from the light beam 50. The target projection unit 14 is a glass hemisphere that opens on the side closer to the condensing lens 12 and has a film translucent to the light beam 50 on its inner surface. The target projection unit 14 may instead be formed of a material translucent to the light beam 50. When the target projection unit 14 is irradiated with the light beam 50, an image is projected on the target projection unit 14. Since the target projection unit 14 is translucent to the light beam 50, it both displays the image projected by the light beam 50 and allows the image to pass through it.
(42) The above-described structure allows the condensing lens 12, which condenses the light beam 50, to be regarded as the crystalline lens of the eye ball, and the target projection unit 14, whose surface has the shape of a hemisphere, to be regarded as the retina of the eye ball. That is, the condensing lens 12 corresponding to the crystalline lens and the target projection unit 14 corresponding to the retina form a pseudo eye (often referred to as a dummy eye or an eye ball screen model; hereinafter described as the dummy eye). For this reason, the diameter of the target projection unit 14 preferably corresponds to the typical dimension of the eye ball, approximately 23 mm to 25 mm. In addition, when the hemispherical target projection unit 14 is regarded as completed into a sphere, the scanning light formed of the light beam 50 at the part corresponding to the pupil is preferably kept within the general dimension of the pupil (for example, approximately 5 mm to 7 mm), so that the configuration is equivalent to the situation in which the light beam 50 passes through the pupil of the eye ball.
(44) As illustrated in
(45) The control unit 18 is composed of a processor such as a Central Processing Unit (CPU) and memories such as a Random Access Memory (RAM) and a Read Only Memory (ROM). The processor operates according to programs stored in the memories and controls the entire image inspection device 100. For example, the control unit 18 inputs inspection image data to the image projection device 30 mounted on the mounting unit 10, and captures the inspection image projected on the target projection unit 14 with the imaging unit 16. The control unit 18 functions as an image transformation unit 20 that transforms the inspection image captured by the imaging unit 16 from a polar coordinate system, expressed by the moving radius from the center point of the hemispherical shape of the target projection unit 14 and the angle, into the Cartesian coordinate system, and as an inspection unit 22 that inspects the inspection image so captured and transformed. The display unit 24 is, for example, a liquid crystal display, and displays the inspection result of the inspection image.
(46) The shape of the target projection unit 14 is not limited to a complete hemispherical shape, and it is sufficient if the target projection unit 14 has a substantially hemispherical shape. A substantially hemispherical shape includes a spherical shape or a shape of a substantial sphere of which a part opens.
(48) Then, the control unit 18 of the image inspection device 100 inputs the inspection image data to the control unit 44 of the image projection device 30 to cause the light beam 50 forming an inspection image to be emitted from the image projection device 30, thereby projecting the inspection image on the target projection unit 14 (step S12). The light beam 50 emitted from the image projection device 30 is emitted to the target projection unit 14 through the condensing lens 12, and the inspection image is thereby projected on the target projection unit 14. For example, a lattice image can be used as the inspection image.
(49) Then, the control unit 18 captures the inspection image projected on the target projection unit 14 with the imaging unit 16 (step S14). The inspection image captured by the imaging unit 16 is transmitted to the control unit 18.
(50) Then, the control unit 18 executes curved image transformation that transforms the captured inspection image from the polar coordinate system expressed by the moving radius from the center point of the hemispherical shape of the target projection unit 14 and the angle into the Cartesian coordinate system (step S16). Here, the curved image transformation is described.
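The resampling at step S16 can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the sampling interface `sample(r, theta)`, which returns the captured brightness at moving radius r from the hemisphere's center point and angle theta, is an assumption, and the sketch shows only the polar-to-Cartesian resampling itself, not any camera-perspective correction.

```python
import math

def curved_image_transform(sample, size, r_max):
    """Resample a polar-coordinate capture onto a size x size Cartesian grid.

    sample(r, theta) -- brightness of the captured inspection image at
    moving radius r (from the hemisphere's center point) and angle theta.
    r_max -- largest moving radius covered by the capture.
    """
    half = size / 2.0
    out = []
    for row in range(size):
        line = []
        for col in range(size):
            # Cartesian coordinates of this output pixel, centered on the axis.
            x = (col - half) / half * r_max
            y = (row - half) / half * r_max
            r = math.hypot(x, y)
            theta = math.atan2(y, x)
            # Sample inside the covered radius; pad with zero outside it.
            line.append(sample(r, theta) if r <= r_max else 0.0)
        out.append(line)
    return out
```

Each Cartesian output pixel is mapped back to the moving radius and angle it corresponds to on the target projection unit 14, and the captured image is sampled there.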
(51) Then, the control unit 18 inspects distortion of the inspection image after the curved image transformation (hereinafter, may be referred to as a transformed inspection image) (step S18). The image after the curved image transformation is an image equivalent to the image to be viewed by the user wearing the image projection device 30. Thus, the distortion (geometric uniformity) of the image to be viewed by the user wearing the image projection device 30 can be inspected by inspecting the distortion of the transformed inspection image (the image of
(58) Then, the control unit 18 displays the inspection results of distortion of the image (e.g., distortions described in
(59) As described above, the image inspection device 100 of the first embodiment includes the condensing lens 12 that condenses the light beam 50 emitted from the image projection device 30 mounted on the mounting unit 10, the target projection unit 14 that is irradiated with the condensed light beam 50 and on which the inspection image is projected, and the inspection unit 22 that inspects the projected inspection image. That is, the inspection image is projected on the target projection unit 14 by emitting the light beam 50 forming the inspection image from the image projection device 30 and causing the light beam 50 to pass through the condensing lens 12 and irradiate the target projection unit 14, and the projected inspection image is then inspected. Accordingly, the image to be projected by the image projection device 30, which directly projects an image on the retina of the user, can be inspected.
(60) In addition, in the first embodiment, the target projection unit 14 has a substantially hemispherical shape with an opening at the condensing lens 12 side, and the image inspection device 100 includes the imaging unit 16 that captures the inspection image projected on the target projection unit 14 and the image transformation unit 20 that transforms the captured inspection image from the polar coordinate system, expressed by the moving radius from the center point of the substantial hemisphere and the angle, into the Cartesian coordinate system. The inspection unit 22 inspects the inspection image that has been transformed by the image transformation unit 20. As described above, the condensing lens 12 and the substantially hemispherical target projection unit 14 constitute a pseudo eye (a dummy eye). Accordingly, an image equivalent to the image to be viewed by the user wearing the image projection device 30 can be inspected by capturing the inspection image projected on the target projection unit 14, transforming it from the polar coordinate system into the Cartesian coordinate system, and inspecting the transformed inspection image.
(61) In the first embodiment, the target projection unit 14 allows the inspection image to pass through it, and the imaging unit 16 captures the inspection image that has passed through the target projection unit 14. The above-described configuration reduces the number of components of the image inspection device 100 and enables the image to be inspected with a simple structure.
Second Embodiment
(62) The first embodiment describes an example in which the distortion of an image is inspected, while a second embodiment will describe an example in which the resolution of an image is inspected. In the second embodiment, the image inspection device is the same as the image inspection device 100 of the first embodiment, and the description thereof is thus omitted.
(64) Then, the control unit 18 of the image inspection device 100 inputs the inspection image data to the control unit 44 of the image projection device 30 to cause the light beam 50 forming the inspection image to be emitted from the image projection device 30, thereby projecting the inspection image on the target projection unit 14 (step S34). A resolution chart image can be used as the inspection image, for example. That is, as illustrated in
(65) Then, the control unit 18 captures the inspection image 60 projected on the target projection unit 14 with the imaging unit 16 (step S36). Then, the control unit 18 executes the curved image transformation of the captured inspection image 60 (step S38). Then, the control unit 18 measures the resolution R1 of the inspection image 60 after the curved image transformation (step S40).
(66) After the measurement of the resolution R1 is completed, the user replaces the condensing lens 12a with the focal length f1, which is mounted to the image inspection device 100, with a condensing lens 12b with a focal length f2 different from the focal length f1, for example shorter than the focal length f1 (step S42). Accordingly, as illustrated in
(67) Then, the control unit 18 captures the inspection image 60 projected on the target projection unit 14 with the imaging unit 16 (step S44). Then, the control unit 18 executes the curved image transformation of the captured inspection image 60 (step S46). Then, the control unit 18 measures the resolution R2 of the inspection image 60 after the curved image transformation (step S48).
(68) Then, the control unit 18 calculates the ratio (ΔR/Δf) of the difference between the resolution R1 and the resolution R2 (ΔR=R1−R2) to the difference between the focal length f1 and the focal length f2 (Δf=f1−f2), and inspects whether the ratio is within a predetermined range preliminarily stored in the memory (step S50). The control unit 18 displays the inspection result on the display unit 24 (step S52).
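The check at step S50 reduces to a small calculation. The following Python sketch computes ΔR/Δf and compares it against the stored range; the function name and the bound arguments are illustrative, not taken from the patent.

```python
def focal_independence_check(r1, f1, r2, f2, lo, hi):
    """Return (ratio, passed) for the focal-independence inspection.

    r1, r2 -- resolutions measured with condensing lenses of focal
    lengths f1 and f2, respectively.
    lo, hi -- the predetermined range stored in the memory.
    """
    ratio = (r1 - r2) / (f1 - f2)  # ΔR / Δf
    return ratio, lo <= ratio <= hi
```

A small ratio within the stored range indicates a deep focal depth, which is the condition the second embodiment inspects for.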
(69) As described above, in the second embodiment, the resolution R1 of the inspection image formed of the light beam 50 condensed by the condensing lens 12a with the focal length f1 and the resolution R2 of the inspection image formed of the light beam 50 condensed by the condensing lens 12b with the focal length f2 are measured. The resolution R1 and the resolution R2 correspond to retina image resolutions. Then, it is inspected whether the ratio of the difference between the resolution R1 and the resolution R2 to the difference between the focal length f1 and the focal length f2 is within the predetermined range. When the ratio is within the predetermined range, the focal depth is considered deep, and a favorable image can therefore be provided to the user regardless of the difference among users wearing the image projection device 30. Accordingly, in the second embodiment, the resolution independent of the focal point (the focal point independent resolution) of the image to be projected by the image projection device 30, which directly projects the image on the retina of the user, can be measured, and it can be inspected whether the image projection device 30 can provide a favorable image regardless of the difference among users. Thus, the condensing lens 12a with the focal length f1 preferably has a condensing point anterior to the target projection unit 14, and the condensing lens 12b with the focal length f2 preferably has a condensing point posterior to the target projection unit 14.
Third Embodiment
(70) A third embodiment describes a second example in which the resolution of an image is inspected. Also in the third embodiment, the image inspection device is the same as the image inspection device 100 of the first embodiment, and the description thereof is thus omitted.
(72) Then, the control unit 18 of the image inspection device 100 inputs a plurality of inspection image data sets having different spatial frequencies to the control unit 44 of the image projection device 30, causes the light beam 50 forming the inspection image to be emitted from the image projection device 30, thereby projecting a plurality of inspection images having different spatial frequencies on the target projection unit 14 (step S74). An image in which a bright section and a dark section are alternately repeated can be used as the inspection image, for example. Then, the control unit 18 captures the inspection images projected on the target projection unit 14 with the imaging unit 16 (step S76). That is, as illustrated in
(73) Then, the control unit 18 measures the contrast ratio of each of the captured inspection images 60 (step S78). Then, the control unit 18 calculates the spatial frequency at which the contrast ratio is 0.5, and identifies the calculated spatial frequency as the resolution R1 (step S80). That is, the relationship between the spatial frequency and the contrast ratio as illustrated in
(74) After the identification of the resolution R1 is completed, the user replaces the condensing lens 12a with the focal length f1, which is mounted to the image inspection device 100, with the condensing lens 12b with the focal length f2 different from the focal length f1, for example shorter than the focal length f1 (step S82). Then, the control unit 18 inputs a plurality of inspection image data sets having different spatial frequencies to the control unit 44 of the image projection device 30 and causes the light beam 50 forming the inspection images to be emitted from the image projection device 30, thereby projecting the inspection images with different spatial frequencies on the target projection unit 14 (step S84). Accordingly, the inspection images 60 with different spatial frequencies are projected by the light beam 50 passing through the condensing lens 12b with the focal length f2 as illustrated in
(75) Then, the control unit 18 captures the inspection images 60 projected on the target projection unit 14 with the imaging unit 16 (step S86). Then, the control unit 18 measures the contrast ratio of each of the captured inspection images 60 (step S88). Then, the control unit 18 calculates the spatial frequency at which the contrast ratio is 0.5, and identifies the calculated spatial frequency as the resolution R2 (step S90). That is, the relationship between the spatial frequency and the contrast ratio as illustrated in
(76) Then, the control unit 18 calculates the ratio (ΔR/Δf) of the difference between the resolution R1 and the resolution R2 (ΔR=R1−R2) to the difference between the focal length f1 and the focal length f2 (Δf=f1−f2), and inspects whether the ratio is within a predetermined range preliminarily stored in the memory (step S92). The control unit 18 displays the inspection result on the display unit 24 (step S94).
(77) As described above, in the third embodiment, the spatial frequency at which the contrast ratio is 0.5 is identified as the resolution R1 with use of the inspection images with different spatial frequencies formed of the light beam 50 condensed by the condensing lens 12a with the focal length f1. In the same manner, the spatial frequency at which the contrast ratio is 0.5 is identified as the resolution R2 with use of the inspection images with different spatial frequencies formed of the light beam 50 condensed by the condensing lens 12b with the focal length f2. Then, it is inspected whether the ratio of the difference between the resolution R1 and the resolution R2 to the difference between the focal length f1 and the focal length f2 is within the predetermined range. Accordingly, as in the second embodiment, the resolution independent of the focal length (the focal length independent resolution) of the image to be projected by the image projection device 30, which directly projects the image on the retina of the user, can be measured, and it can be inspected whether the image projection device 30 can provide a favorable image to the user regardless of the difference among users.
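The identification of the resolution at steps S78 through S80 (and S88 through S90) amounts to locating the spatial frequency at which the measured contrast ratio crosses 0.5. A hedged Python sketch, assuming the measurements are given as parallel lists sorted by increasing frequency with monotonically falling contrast near the crossing:

```python
def resolution_at_half_contrast(freqs, contrasts):
    """Linearly interpolate the spatial frequency where contrast crosses 0.5.

    freqs -- spatial frequencies of the inspection images, ascending.
    contrasts -- measured contrast ratio of each inspection image.
    """
    points = list(zip(freqs, contrasts))
    for (f0, c0), (f1, c1) in zip(points, points[1:]):
        if c0 >= 0.5 >= c1:
            # Linear interpolation between the two bracketing measurements.
            return f0 + (c0 - 0.5) * (f1 - f0) / (c0 - c1)
    raise ValueError("contrast ratio never crosses 0.5")
```

The interpolated crossing frequency is then reported as R1 or R2 depending on which condensing lens is mounted.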
(78) A first variation of the third embodiment describes a third example in which the resolution of an image is inspected.
(80) Then, the control unit 18 of the image inspection device 100 inputs the inspection image data to the control unit 44 of the image projection device 30 to cause the light beam 50 forming the inspection image to be emitted from the image projection device 30, thereby projecting the inspection image 60 on the target projection unit 14 (step S132).
(81) Then, the control unit 18 captures the inspection image 60 projected on the target projection unit 14 with the imaging unit 16 (step S134). That is, as illustrated in
(82) Then, the control unit 18 obtains the brightness data from the inspection pattern 69 captured by the imaging unit 16 (step S136). For example, the brightness data of the inspection pattern 69 as illustrated in
(83) Then, the control unit 18 calculates the spatial frequency response (SFR) from the brightness data of the inspection pattern 69 (step S138). For example, the spatial frequency response characteristic as illustrated in
(84) Then, the control unit 18 identifies the resolution of the inspection pattern 69 from the spatial frequency response characteristic (step S140). For example, the control unit 18 calculates the spatial frequency at which the brightness is 0.5, and identifies the calculated spatial frequency as the resolution.
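Steps S136 through S140 follow the usual edge-based spatial frequency response calculation: differentiate the brightness profile across the pattern edge into a line-spread function, take its Fourier magnitude, and normalize by the zero-frequency component. A minimal Python sketch under those assumptions; the patent does not fix the exact algorithm, so this is illustrative only.

```python
import cmath
import math

def spatial_frequency_response(edge_profile):
    """Normalized SFR magnitudes from a sampled edge brightness profile."""
    # Line-spread function: first difference of the edge profile.
    lsf = [b - a for a, b in zip(edge_profile, edge_profile[1:])]
    n = len(lsf)
    sfr = []
    for k in range(n // 2 + 1):
        # Discrete Fourier transform coefficient at frequency index k.
        coeff = sum(v * cmath.exp(-2j * math.pi * k * i / n)
                    for i, v in enumerate(lsf))
        sfr.append(abs(coeff))
    dc = sfr[0] if sfr[0] else 1.0
    # Normalize so the zero-frequency response is 1.0.
    return [v / dc for v in sfr]
```

The resolution of the inspection pattern 69 can then be read off as the frequency at which this normalized response falls to 0.5, as in step S140.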
(85) Then, the control unit 18 determines whether the resolutions of all the inspection patterns 69 contained in the inspection image 60 have been identified (step S142). When there is an inspection pattern 69 whose resolution has not been identified yet (step S142: No), the control unit 18 returns to step S136. When the resolutions of all the inspection patterns 69 have been identified (step S142: Yes), the control unit 18 identifies the resolution of the inspection image 60 (step S144). For example, the control unit 18 identifies the average of the resolutions of the inspection patterns 69 as the resolution of the inspection image 60. The control unit 18 may instead identify the maximum or the minimum of the resolutions of the inspection patterns 69 as the resolution of the inspection image 60. Then, the control unit 18 displays the inspection result of the resolution on the display unit 24 (step S146).
Fourth Embodiment
(86) A fourth embodiment describes an example in which the brightness and the pattern shape of an image are inspected.
(88) Then, the control unit 18 of the image inspection device 400 inputs the inspection image data to the control unit 44 of the image projection device 30 to cause the light beam 50 forming the inspection image to be emitted from the image projection device 30, thereby projecting the inspection image on the target projection unit 14 (step S104). The inspection image 60 projected on the target projection unit 14 has a region 66 with decreased brightness due to the effect of the apertured plate 70 as illustrated in
(89) Then, the control unit 18 captures the inspection image 60 projected on the target projection unit 14 with the imaging unit 16 (step S106). Then, the control unit 18 executes the curved image transformation of the captured inspection image 60 (step S108). Then, the control unit 18 measures the average brightness and the pattern shape (such as the width) of the inspection image 60 after the curved image transformation (step S110).
(90) Then, the user moves the position of the apertured plate 70 by a predetermined distance (step S112). Then, the control unit 18 captures the inspection image 60 projected on the target projection unit 14 with the imaging unit 16 (step S114). Then, the control unit 18 executes the curved image transformation of the captured inspection image 60 (step S116). Then, the control unit 18 measures the average brightness and the pattern shape (such as the width) of the inspection image 60 after the curved image transformation (step S118). Steps S112 through S118 are repeated until the position of the apertured plate 70 reaches the final position (step S120). For example, as illustrated in
(91) The control unit 18 inspects whether the difference in measured average brightness among the inspection images 60 is within a predetermined range and the difference in pattern shape among the inspection images 60 is within a predetermined range (step S122). The control unit 18 displays the inspection result on the display unit 24 (step S124).
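The inspection at step S122 is a spread-within-tolerance test over the measurements taken at the successive aperture positions. A minimal illustrative sketch (the function name and the form of the allowance are assumptions):

```python
def within_tolerance(measurements, allowed_spread):
    """True when the measurements taken at the different aperture
    positions (average brightness, or pattern width) spread by no
    more than the predetermined allowance."""
    return max(measurements) - min(measurements) <= allowed_spread
```

The same test is applied once to the average-brightness measurements and once to the pattern-shape measurements.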
(92) As described above, in the fourth embodiment, the apertured plate 70 inserted near the condensing lens 12 is moved in the plane perpendicular to the optical axis of the condensing lens 12, and it is inspected whether the differences in average brightness and/or pattern shape among the images formed of the light beam 50 passing through the aperture 72 at the respective positions are within a predetermined range. The aperture 72 of the apertured plate 70 is considered as the pupil of the user wearing the image projection device 30. Thus, when the difference in average brightness and/or the difference in pattern shape is within the predetermined range, the change in brightness and/or pattern shape of the image viewed by the user is considered small even when the user wearing the image projection device 30 faces in various directions. Therefore, in the fourth embodiment, it can be inspected whether the image projection device 30 can provide an image with little change in brightness and/or pattern shape even when the user wearing it faces in different directions. Since the aperture 72 of the apertured plate 70 corresponds to the pupil, the apertured plate 70 is preferably located near the condensing lens 12 so as to model the positional relationship between the crystalline lens and the pupil.
(93) The fourth embodiment describes a case where the user moves the position of the apertured plate 70 as an example, but a drive unit such as an actuator capable of moving the position of the apertured plate 70 may be provided, and the control unit 18 may move the position of the apertured plate 70 with use of the drive unit.
Fifth Embodiment
(95) When the inspection image passing through the target projection unit 14 is captured by the imaging unit 16 as in the first embodiment, unnecessary light may affect the captured image. In contrast, in the fifth embodiment, the reflection system composed of the half mirror 80 is provided on the light path of the light beam 50 between the condensing lens 12 and the target projection unit 14, and the target projection unit 14 is made of a material with a high light-diffusion property; the effect of unnecessary light is then reduced by capturing the inspection image reflected by the target projection unit 14 and the half mirror 80.
Sixth Embodiment
(97) When the half mirror 80 is located between the condensing lens 12 and the target projection unit 14 as in the fifth embodiment, the amount of light entering the imaging unit 16 decreases. In contrast, in the sixth embodiment, the reflection system composed of the polarizer 82, the polarization beam splitter 84, and the quarter wavelength plate 86 is located on the light path of the light beam 50 between the condensing lens 12 and the target projection unit 14, and the inspection image reflected by the target projection unit 14 and the polarization beam splitter 84 is captured; the decrease in the amount of light entering the imaging unit 16 is thereby suppressed.
Seventh Embodiment
(100) The first through sixth embodiments describe the combination of the target projection unit 14 formed of glass and having a substantially hemispherical shape or a flat shape and the imaging unit 16, but a detector with a planar shape may be used as the target projection unit 14a as in the seventh embodiment. In this case, the inspection image can be inspected by using the target projection unit 14a to detect the inspection image projected on it by the light beam 50 that is emitted from the image projection device 30 and condensed by the condensing lens 12.
(101) The first through seventh embodiments describe a case where the distortion, the resolution, the brightness, and the pattern shape of an image are inspected as the inspection of the image as examples. However, at least one of the distortion, the resolution, the brightness, the pattern shape, the gamma characteristic, the contrast ratio, the aspect ratio, and the hue may be inspected. A conventionally known inspection method may be used as the inspection method. Hereinafter, examples of the inspection method will be described.
Eighth Embodiment
(105) An eighth embodiment describes an example of an inspection of the region of convergence of a scanning light projected from the image projection device 30.
(106) As illustrated in
(107) Then, the user identifies, as Z0, the position in the Z direction of the detector 110 at which the width of the inspection image 60 becomes equal to the width of the pupil 94 (step S154). Then, the user moves the position of the detector 110 in the Z direction until the width of the inspection image 60 detected by the detector 110 becomes minimum (step S156). That is, as illustrated in
(108) Then, the user identifies, as Z1, the position in the Z direction of the detector 110 when the width of the inspection image 60 becomes minimum (step S158). Then, the user identifies the width in the X direction when the inspection image 60 is minimum (step S160). That is, as illustrated in
(109) Then, the user calculates the size of the region of convergence with use of the values identified at steps S154, S158, and S160 (step S162). That is, the size of the region of convergence in the X direction is calculated as the difference (ID−Hi(Z1)) between the dimension ID of the pupil 94 and the width Hi(Z1) in the X direction when the inspection image 60 is minimum. The size of the region of convergence in the Y direction is calculated as two times the distance from the position Z0 in the Z direction of the detector 110, at which the width of the inspection image 60 is equal to the width of the pupil 94, to the position Z1 in the Z direction of the detector 110, at which the width of the inspection image 60 is minimum, i.e., 2(Z1−Z0).
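The calculation at step S162 can be sketched directly from the identified values; the function and argument names are illustrative, not from the patent.

```python
def convergence_region(z0, z1, width_min, pupil_dim):
    """Size of the region of convergence (eighth embodiment, step S162).

    z0 -- detector position where the image width equals the pupil width.
    z1 -- detector position where the image width is minimum.
    width_min -- minimum image width Hi(Z1) in the X direction.
    pupil_dim -- dimension ID of the pupil 94.
    """
    size_x = pupil_dim - width_min   # ID - Hi(Z1)
    size_y = 2.0 * (z1 - z0)         # 2(Z1 - Z0), the Y direction in the text
    return size_x, size_y
```

Both values come straight from the positions and widths identified while moving the detector 110 along the Z direction.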
(110) The eighth embodiment describes a case where the region of convergence of the scanning light is inspected by the user as an example, but the region of convergence of the scanning light may be inspected by the control unit of the inspection device (the control unit 18 in
(111) Although the embodiments of the present invention have been described in detail, the present invention is not limited to a certain embodiment, and it should be understood that various changes, substitutions, and alterations may be made without departing from the scope of the invention.
DESCRIPTION OF REFERENCE NUMERALS
(112)
10 mounting unit
12 through 12b condensing lens
14, 14a target projection unit
16 imaging unit
18 control unit
20 image transformation unit
22 inspection unit
24 display unit
30 image projection device
50 light beam
52 dummy eye
54 convergence point
58 center point
60 inspection image
70 apertured plate
80 half mirror
82 polarizer
84 polarization beam splitter
86 quarter wavelength plate
100 through 700 image inspection device
110 detector