METHOD AND DEVICE FOR MEASURING HUD GHOST IMAGE
20260112129 · 2026-04-23
Abstract
An optical characteristic measurement method according to embodiments may comprise the steps of: generating multi-view images, the multi-view images including a left image, a central image, and a right image; displaying the position of a virtual image for the multi-view images; calculating the depth of the position of the virtual image; estimating the depth of the virtual image; matching the depth of the position to the depth of the virtual image; and calibrating errors between the multi-view images on the basis of the matched depth. Additionally, the optical characteristic measurement method may comprise the steps of: generating multi-view images, the multi-view images including a left image, a central image, and a right image; displaying the positions of virtual images for the multi-view images; estimating the depths of the positions of the virtual images; grouping the positions on the basis of the depths; selecting some positions from the grouped positions; and projecting a corrected virtual object on the basis of the selected positions and the depths.
Claims
1. A method for measuring optical characteristics comprising: generating images including points in each pattern for a virtual image plane using one or more optical measurement devices, wherein each image is captured based on the one or more optical measurement devices, and each image corresponds to at least one of a left image, a central image, and a right image; generating positions of the points based on the one or more optical measurement devices and each pattern; and generating a level for at least one ghost image for the virtual image plane based on the generated positions, wherein the positions are obtained based on a field of view (FOV) of the one or more optical measurement devices, and a gap between a left optical measurement device and a right optical measurement device.
2. The method according to claim 1, wherein: the left image including points in a pattern is captured based on the left optical measurement device of the one or more optical measurement devices; the central image including points in a pattern is captured based on a central optical measurement device of the one or more optical measurement devices; and the right image including points in a pattern is captured based on the right optical measurement device of the one or more optical measurement devices.
3. The method according to claim 1, wherein: coordinate values of the positions are calculated based on a horizontal pixel index of the left optical measurement device, a horizontal pixel index of the right optical measurement device, a horizontal pixel index of the central optical measurement device, and a field of view (FOV) of the left optical measurement device.
4. The method according to claim 1, further comprising: measuring a virtual image distance for the virtual image plane based on a position within the pattern in a center region and the optical measurement device.
5. The method according to claim 4, further comprising: measuring a look-down angle and a look-up angle for the virtual image plane based on the position within the pattern and the virtual image distance.
6. The method according to claim 1, further comprising: measuring a horizontal field of view (FOV) for the virtual image plane based on a distance to a left point of a center region and a distance to a right point of the center region; and measuring a vertical field of view (FOV) for the virtual image plane based on a distance to a top point of the center region and a distance to a bottom point of the center region.
7. The method according to claim 1, further comprising: measuring a horizontal distortion for the virtual image plane based on a line connecting a center region, a top point of the center region, and a bottom point of the center region; and measuring a vertical distortion for the virtual image plane based on a line connecting the center region, a left point of the center region and a right point of the center region.
8. The method according to claim 1, wherein the generating the level for the ghost image includes: generating a test signal; generating positions of a test pattern of a virtual image for the test signal and positions of a ghost pattern for the positions; and generating a ghost level based on the positions of the test pattern and the positions of the ghost pattern.
9. The method according to claim 8, wherein: the ghost level is obtained based on an average of distances between positions of the test pattern and positions of the ghost pattern and an average of angles between the positions of the test pattern and the positions of the ghost pattern.
10. A device for measuring optical characteristics comprising: a photographing unit configured to generate images including points in each pattern for a virtual image plane using one or more optical measurement devices, wherein each image is captured based on the one or more optical measurement devices, and each image corresponds to at least one of a left image, a central image, and a right image; and a calculation unit configured to generate positions of the points based on the one or more optical measurement devices and each pattern, and generate a level for at least one ghost image for the virtual image plane based on the generated positions, wherein the positions are obtained based on a field of view (FOV) of the one or more optical measurement devices, and a gap between a left optical measurement device and a right optical measurement device.
11. The device according to claim 10, wherein: the left image including points in a pattern is captured based on the left optical measurement device of the one or more optical measurement devices; the central image including points in a pattern is captured based on a central optical measurement device of the one or more optical measurement devices; and the right image including points in a pattern is captured based on the right optical measurement device of the one or more optical measurement devices.
12. The device according to claim 10, wherein: coordinate values of the positions are calculated based on a horizontal pixel index of the left optical measurement device, a horizontal pixel index of the right optical measurement device, a horizontal pixel index of the central optical measurement device, and a field of view (FOV) of the left optical measurement device.
13. The device according to claim 10, wherein the calculation unit is configured to: measure a virtual image distance for the virtual image plane based on a position within the pattern in a center region and the optical measurement device.
14. The device according to claim 13, wherein the calculation unit is configured to: measure a look-down angle and a look-up angle for the virtual image plane based on the position within the pattern and the virtual image distance.
15. The device according to claim 10, wherein the calculation unit is configured to: measure a horizontal field of view (FOV) for the virtual image plane based on a distance to a left point of a center region and a distance to a right point of the center region; and measure a vertical field of view (FOV) for the virtual image plane based on a distance to a top point of the center region and a distance to a bottom point of the center region.
16. The device according to claim 10, wherein the calculation unit is configured to: measure a horizontal distortion for the virtual image plane based on a line connecting a center region, a top point of the center region, and a bottom point of the center region; and measure a vertical distortion for the virtual image plane based on a line connecting the center region, a left point of the center region and a right point of the center region.
17. The device according to claim 10, wherein the calculation unit is configured to: generate a test signal; generate positions of a test pattern of a virtual image for the test signal and positions of a ghost pattern for the positions; and generate a ghost level based on the positions of the test pattern and the positions of the ghost pattern.
18. The device according to claim 17, wherein: the ghost level is obtained based on an average of distances between positions of the test pattern and positions of the ghost pattern and an average of angles between the positions of the test pattern and the positions of the ghost pattern.
Description
DESCRIPTION OF DRAWINGS
BEST MODE
[0074] The present disclosure may be modified in various ways and implemented by various exemplary embodiments; specific exemplary embodiments are therefore shown in the drawings and will be described in detail herein. However, it is to be understood that the present disclosure is not limited to the specific exemplary embodiments, but includes all modifications, equivalents, and substitutions included in the spirit and the scope of the present disclosure. Similar reference numerals are assigned to similar components in the following description of the drawings.
[0075] Terms such as first, second, A, and B used in the specification may be used to describe various components, but the components are not to be construed as being limited by the terms. The terms are used only to distinguish one component from another component. For example, a first component may be named a second component, and vice versa, without departing from the scope of the present disclosure. The term "and/or" includes a combination of a plurality of relevant items or any one of a plurality of relevant items.
[0076] It is to be understood that when one element is referred to as being connected or coupled to another element, it may be connected or coupled directly to the other element, or intervening elements may be present therebetween. On the other hand, it should be understood that when one element is referred to as being connected or coupled directly to another element, no intervening element is present therebetween.
[0077] Terms used in the present specification are used only to describe specific exemplary embodiments and are not intended to limit the present disclosure. Singular forms are intended to include plural forms unless the context clearly indicates otherwise. It will be further understood that the terms "comprise" or "have", when used in this specification, specify the presence of stated features, numerals, steps, operations, components, parts, or combinations thereof, but do not preclude the presence or addition of one or more other features, numerals, steps, operations, components, parts, or combinations thereof.
[0078] Unless defined otherwise, all terms used in the specification, including technical and scientific terms, have the same meaning as commonly understood by those skilled in the art. It will be further understood that terms such as those defined in common dictionaries should be interpreted as having a meaning consistent with their meaning in the context of this specification and the relevant art, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[0079] Hereinafter, a preferred embodiment according to the present disclosure will be described in detail with reference to the accompanying drawings.
[0080] The present disclosure relates to a method and apparatus for measuring optical characteristics of an augmented reality device, and the measurement may be performed under the following environment. For example, referring to the accompanying drawings, a plurality of cameras may be disposed in an eye box facing the virtual plane on which the augmented reality device outputs a test image.
[0081] However, it should be noted that the present disclosure is not limited to being implemented only in this environment and may be implemented in various different environments. For example, the position and size of the eye box, the number and arrangement of cameras, the number and arrangement of patterns included in the test image, etc. may depend on the measurement environment.
[0083] In operation S110, an apparatus for measuring optical characteristics captures (photographs) a test image including a plurality of patterns that is output in a virtual plane by the augmented reality (AR) device, using at least one camera disposed around a predetermined measurement reference position.
[0084] For example, referring to the accompanying drawings, the cameras may be disposed at predetermined positions around the measurement reference position within the eye box, and each camera may capture the test image output in the virtual plane.
[0085] In this case, the apparatus for measuring optical characteristics is connected to a plurality of cameras by wire or wirelessly, and may thus transmit an instruction to capture a test image on the virtual plane.
[0086] In operation S120, the apparatus for measuring optical characteristics acquires field of view (FOV) information including information about the field of view (FOV) of the at least one camera, and arrangement information including information about the arrangement of the at least one camera.
[0087] For example, the apparatus for measuring optical characteristics may acquire field of view (FOV) information and arrangement information by receiving information of the field of view (FOV) of a camera and information about the arrangement of cameras from a user. Preferably, the information about the field of view (FOV) of the camera may be a horizontal field of view (FOV) and the information about the arrangement of cameras may be the distance between cameras symmetrically disposed at both sides of the measurement reference position.
[0088] Finally, in operation S130, the apparatus for measuring optical characteristics calculates the coordinates of a plurality of patterns with respect to the measurement reference position on the basis of a plurality of images captured by the at least one camera, field of view (FOV) information, and arrangement information.
[0089] In this case, the apparatus for measuring optical characteristics may calculate 3D coordinates of a plurality of patterns in a virtual plane having the measurement reference position as the origin (0, 0, 0), using information about the sizes of a plurality of captured images, information about the coordinates of a plurality of patterns included in the plurality of captured images, information about the field of view (FOV) of at least one camera, and information about the arrangement of the at least one camera.
[0090] Meanwhile, a detailed method of calculating the coordinates of the plurality of patterns will be described in detail in the following exemplary embodiment.
[0091] In another exemplary embodiment, when the at least one camera includes a central camera positioned at the measurement reference position, and a left camera and a right camera symmetrically positioned with the measurement reference position therebetween and when a plurality of patterns is horizontally and vertically arranged in one test image, the apparatus for measuring optical characteristics may calculate the coordinates of a plurality of patterns, using the number of horizontal pixels of a plurality of captured images, the coordinates of a plurality of patterns in the plurality of captured images, the field of view (FOV) of the at least one camera included in field of view (FOV) information, and the distance between the left camera and the right camera included in the camera arrangement information.
[0092] For example, referring to the accompanying drawings, nine patterns may be arranged horizontally and vertically (three by three) in one test image, and the at least one camera may include a central camera positioned at the measurement reference position and a left camera and a right camera symmetrically positioned at both sides thereof.
[0093] In this case, the apparatus for measuring optical characteristics may calculate the 3D coordinates of each of the nine patterns on the virtual plane having the measurement reference position as the origin (0, 0, 0), using the number of horizontal pixels of a plurality of captured images, the coordinates of a plurality of patterns in the plurality of captured images, the field of view (FOV) of the at least one camera included in field of view (FOV) information, and the distance between the left camera and the right camera included in the camera arrangement information.
[0094] In another exemplary embodiment, the apparatus for measuring optical characteristics may calculate the coordinates of a plurality of patterns, using Equation 1.
[0095] In Equation 1, x.sub.ij, y.sub.ij, and z.sub.ij are the X-axial, Y-axial, and Z-axial coordinates of the horizontal i-th and vertical j-th pattern with respect to the measurement reference position, a is the distance between the left camera and the right camera, M is the number of horizontal pixels of a plurality of captured images, θ is the field of view (FOV) of the at least one camera, x.sub.ij.sup.L is the horizontal coordinate of the horizontal i-th and vertical j-th pattern in the captured image of the left camera, x.sub.ij.sup.R is the horizontal coordinate of the pattern in the captured image of the right camera, and x.sub.ij.sup.C is the horizontal coordinate of the pattern in the captured image of the central camera.
[0096] In this case, referring to the accompanying drawings, the coordinates of each pattern may be obtained from the geometric relationship between the cameras and the virtual plane.
[0097] Meanwhile, referring to the accompanying drawings, the same calculation may be applied to each of the plurality of patterns.
[0098] Further, referring to the accompanying drawings, x.sub.ij.sup.L, x.sub.ij.sup.C, and x.sub.ij.sup.R may mean the coordinates of the patterns shown in the captured image of each of the left camera (cam.sub.L), the central camera (cam.sub.C), and the right camera (cam.sub.R). In this case, each coordinate may be the pixel coordinates of the center of the horizontal i-th and vertical j-th pattern.
[0099] Meanwhile, referring to the accompanying drawings, the z-axial distance from the measurement reference position to the virtual plane may be calculated using Equation 2.
[0100] In Equation 2, z is the z-axial distance from the measurement reference position to the virtual plane, θ is the field of view (FOV) of a camera, a is the distance between the left camera and the right camera, x.sub.ij.sup.L is the horizontal coordinate of the horizontal i-th and vertical j-th pattern in the captured image of the left camera, x.sub.ij.sup.R is the horizontal coordinate of the pattern in the captured image of the right camera, and M is the number of horizontal pixels of a captured image.
[0101] In this case, it is apparent that Equation 1 may be obtained by rearranging Equation 2.
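Although the equation images are not reproduced above, the variable definitions for Equations 1 and 2 match standard stereo triangulation under a pinhole-camera model. The following Python sketch illustrates one plausible form under that assumption; the function name and the use of the central camera's image-centre offset for the lateral coordinate are illustrative, not taken from the disclosure.

```python
import math

def triangulate_pattern(x_left, x_right, x_center, M, fov_deg, a):
    """Estimate a pattern's (x, z) position from the horizontal pixel
    indices observed by the left, right, and central cameras.

    x_left / x_right / x_center : horizontal pixel index in each image
    M       : number of horizontal pixels of a captured image
    fov_deg : horizontal field of view of each camera, in degrees
    a       : distance between the left camera and the right camera
    """
    # Pinhole model: focal length expressed in pixels.
    f = M / (2.0 * math.tan(math.radians(fov_deg) / 2.0))
    # Disparity between the left and right views gives the depth z.
    disparity = x_left - x_right
    z = a * f / disparity
    # The central camera sits at the measurement reference position, so its
    # pixel offset from the image centre gives the lateral coordinate x.
    x = (x_center - M / 2.0) * z / f
    return x, z
```

For instance, with M = 1000 pixels and a 90° FOV the focal length is 500 pixels, so a 10-pixel disparity with a = 0.1 m yields z = 5 m. The vertical coordinate y would follow analogously from the vertical pixel index and the vertical FOV.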
[0102] For example, referring to the accompanying drawings, the coordinates obtained in this way may be expressed on the virtual plane having the measurement reference position as the origin.
[0104] In operation S210, the apparatus for measuring optical characteristics captures a test image including a plurality of patterns that is output in a virtual plane by the augmented reality (AR) device, using at least one camera disposed around a predetermined measurement reference position.
[0105] In operation S220, the apparatus for measuring optical characteristics acquires field of view (FOV) information including information about the field of view (FOV) of the at least one camera, and the arrangement information including information about the arrangement of the at least one camera.
[0106] In operation S230, the apparatus for measuring optical characteristics calculates the coordinates of a plurality of patterns with respect to the measurement reference position on the basis of a plurality of images captured by the at least one camera, field of view (FOV) information, and the camera arrangement information.
[0107] Finally, in operation S240, the apparatus for measuring optical characteristics calculates a virtual image distance between the measurement reference position and the virtual plane, using the coordinates of the measurement reference position and the coordinates of at least one of the plurality of patterns.
[0108] For example, referring to the accompanying drawings, the virtual image distance may be calculated as the distance from the measurement reference position (0, 0, 0) to the central pattern P.sub.22 of the plurality of patterns.
[0109] In another exemplary embodiment, the apparatus for measuring optical characteristics may calculate a virtual image distance, using Equation 3.
[0110] In Equation 3, D.sub.VI is a virtual image distance, and x.sub.22, y.sub.22, z.sub.22 are three-dimensional (3D) coordinates of a pattern where i=2 and j=2.
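Consistent with the definitions above, Equation 3 can be read as the Euclidean distance from the measurement reference position to the pattern P.sub.22. A minimal sketch, assuming that form (the function name is illustrative):

```python
import math

def virtual_image_distance(x22, y22, z22):
    """Distance from the measurement reference position (0, 0, 0)
    to the central pattern P22 on the virtual plane."""
    return math.sqrt(x22**2 + y22**2 + z22**2)
```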
[0112] In operation S310, the apparatus for measuring optical characteristics captures a test image including a plurality of patterns that is output in a virtual plane by the augmented reality (AR) device, using at least one camera disposed around a predetermined measurement reference position.
[0113] In operation S320, the apparatus for measuring optical characteristics acquires field of view (FOV) information including information about the field of view (FOV) of the at least one camera, and arrangement information including information about the arrangement of the at least one camera.
[0114] In operation S330, the apparatus for measuring optical characteristics calculates the coordinates of a plurality of patterns with respect to the measurement reference position on the basis of a plurality of images captured by the at least one camera, field of view (FOV) information, and the arrangement information.
[0115] Finally, in operation S340, the apparatus for measuring optical characteristics calculates a look down/up angle to the virtual plane from the measurement reference position, using the coordinates of the measurement reference position and the coordinates of at least one of the plurality of patterns.
[0116] The look down/up angle, which is the angle representing the height difference between the eye box and the virtual plane, indicates whether a user looks up at or down at the virtual plane.
[0117] For example, when the coordinates (x.sub.22, y.sub.22, z.sub.22) of P.sub.22 are calculated with respect to (0, 0, 0), which is the measurement reference position where the user's eyes are positioned, it may be a looking-down situation if y.sub.22 < 0 and a looking-up situation if y.sub.22 > 0, as shown in the accompanying drawings.
[0118] In another exemplary embodiment, the apparatus for measuring optical characteristics may calculate the look down/up angle, using Equation 4.
[0119] In Equation 4, θ.sub.down/up is the look down/up angle, and x.sub.22, y.sub.22, and z.sub.22 are the three-dimensional (3D) coordinates of the pattern where i=2 and j=2.
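Equation 4 itself is not reproduced above; one plausible form consistent with the stated sign convention (y.sub.22 < 0 for looking down) is the elevation angle of P.sub.22 as seen from the origin. The sketch below assumes that form and may differ from the literal equation.

```python
import math

def look_down_up_angle(x22, y22, z22):
    """Elevation angle (degrees) of the central pattern P22 from the
    measurement reference position; negative values indicate looking down."""
    horizontal = math.sqrt(x22**2 + z22**2)   # in-plane distance to P22
    return math.degrees(math.atan2(y22, horizontal))
```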
[0121] In operation S410, the apparatus for measuring optical characteristics captures a test image including a plurality of patterns that is output in a virtual plane by the augmented reality (AR) device, using at least one camera disposed around a predetermined measurement reference position.
[0122] In operation S420, the apparatus for measuring optical characteristics acquires field of view (FOV) information including information about the field of view (FOV) of the at least one camera, and the arrangement information including information about the arrangement of the at least one camera.
[0123] In operation S430, the apparatus for measuring optical characteristics calculates the coordinates of a plurality of patterns with respect to the measurement reference position on the basis of a plurality of images captured by the at least one camera, field of view information (FOV) and the arrangement information.
[0124] Finally, in operation S440, the apparatus for measuring optical characteristics calculates the horizontal field of view (FOV) of the measurement reference position, using the coordinates of the measurement reference position and the coordinates of two patterns positioned at both horizontal ends of the plurality of patterns on the virtual plane.
[0125] For example, referring to the accompanying drawings, the horizontal field of view (FOV) may be calculated as the angle formed at the measurement reference position O by the two patterns P.sub.21 and P.sub.23 positioned at both horizontal ends.
[0126] In another exemplary embodiment, the apparatus for measuring optical characteristics may calculate a horizontal field of view (FOV), using Equation 5.
[0127] In Equation 5, θ.sub.H is the horizontal field of view (FOV), O is the coordinate of the measurement reference position, and P.sub.21 and P.sub.23 are the coordinates of the two patterns positioned at both horizontal ends of the plurality of patterns.
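Equation 5 can be read as the angle subtended at the measurement reference position O by the end patterns P.sub.21 and P.sub.23; the vertical field of view of Equation 6 would use the same computation with P.sub.12 and P.sub.32. A sketch using the dot product, with the vector handling assumed:

```python
import math

def subtended_angle(p_a, p_b, origin=(0.0, 0.0, 0.0)):
    """Angle (degrees) formed at `origin` by the two 3D points p_a and p_b."""
    u = [pa - o for pa, o in zip(p_a, origin)]
    v = [pb - o for pb, o in zip(p_b, origin)]
    dot = sum(ui * vi for ui, vi in zip(u, v))
    norm = math.sqrt(sum(ui**2 for ui in u)) * math.sqrt(sum(vi**2 for vi in v))
    return math.degrees(math.acos(dot / norm))
```

For example, two end patterns at (-1, 0, 1) and (1, 0, 1) subtend 90 degrees at the origin.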
[0129] In operation S510, the apparatus for measuring optical characteristics captures a test image including a plurality of patterns that is output in a virtual plane by the augmented reality (AR) device, using at least one camera disposed around a predetermined measurement reference position.
[0130] In operation S520, the apparatus for measuring optical characteristics acquires field of view (FOV) information including information about the field of view (FOV) of the at least one camera, and the arrangement information including information about the arrangement of the plurality of cameras.
[0131] In operation S530, the apparatus for measuring optical characteristics calculates the coordinates of a plurality of patterns with respect to the measurement reference position on the basis of a plurality of images captured by the at least one camera, field of view (FOV) information, and the arrangement information.
[0132] Finally, in operation S540, the apparatus for measuring optical characteristics calculates the vertical field of view (FOV) of the measurement reference position, using the coordinates of the measurement reference position and the coordinates of two patterns positioned at both vertical ends of the plurality of patterns on the virtual plane.
[0133] For example, referring to the accompanying drawings, the vertical field of view (FOV) may be calculated as the angle formed at the measurement reference position O by the two patterns P.sub.12 and P.sub.32 positioned at both vertical ends.
[0134] In another exemplary embodiment, the apparatus for measuring optical characteristics may calculate a vertical field of view (FOV), using Equation 6.
[0135] In Equation 6, θ.sub.V is the vertical field of view (FOV), O is the coordinate of the measurement reference position, and P.sub.12 and P.sub.32 are the coordinates of the two patterns positioned at both vertical ends.
[0137] In operation S610, the apparatus for measuring optical characteristics captures a test image including a plurality of patterns that is output in a virtual plane by the augmented reality (AR) device, using at least one camera disposed around a predetermined measurement reference position.
[0138] In operation S620, the apparatus for measuring optical characteristics acquires field of view (FOV) information including information about the field of view (FOV) of the at least one camera, and the arrangement information including information about the arrangement of the at least one camera.
[0139] In operation S630, the apparatus for measuring optical characteristics calculates the coordinates of a plurality of patterns with respect to the measurement reference position on the basis of a plurality of images captured by the at least one camera, field of view (FOV) information, and arrangement information.
[0140] Finally, in operation S640, the apparatus for measuring optical characteristics calculates a static distortion for each of three axes with respect to the measurement reference position on the basis of the coordinates of the plurality of patterns on the virtual plane.
[0141] In this case, referring to the accompanying drawings, the static distortion may be evaluated from the degree to which the coordinates of the plurality of patterns deviate from straight lines along each axis.
[0142] Meanwhile, the apparatus for measuring optical characteristics may calculate the static distortion for each of three axes, using Equation 7.
[0143] In Equation 7, the three distortion values are the linear distortion values for the x, y, and z axes, respectively, and x.sub.ab, y.sub.ab, and z.sub.ab are the x, y, and z coordinates of the horizontal a-th (a=1, 2, 3) and vertical b-th (b=1, 2, 3) pattern, respectively.
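Equation 7 is not reproduced above; one plausible per-line distortion measure, assumed here purely for illustration, is the perpendicular deviation of the middle pattern of each row or column of the 3x3 grid from the segment joining its two end patterns:

```python
import math

def line_deviation(p_start, p_mid, p_end):
    """Perpendicular distance of p_mid from the straight line through
    p_start and p_end -- a simple per-line distortion measure."""
    d = [e - s for e, s in zip(p_end, p_start)]   # line direction
    w = [m - s for m, s in zip(p_mid, p_start)]   # offset of the middle point
    # Project w onto the line direction, then measure the residual.
    t = sum(di * wi for di, wi in zip(d, w)) / sum(di**2 for di in d)
    residual = [wi - t * di for wi, di in zip(w, d)]
    return math.sqrt(sum(r**2 for r in residual))
```

For a perfectly straight row of patterns the deviation is zero; any bowing of the middle pattern off the line registers as a positive distortion value.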
[0145] In operation S710, the apparatus for measuring optical characteristics captures a test image including a plurality of patterns that is output in a virtual plane by the augmented reality (AR) device, using at least one camera disposed around a predetermined measurement reference position.
[0146] In operation S720, the apparatus for measuring optical characteristics acquires field of view (FOV) information including information about the field of view (FOV) of the at least one camera, and arrangement information including information about the arrangement of the at least one camera.
[0147] In operation S730, the apparatus for measuring optical characteristics calculates the coordinates of a plurality of patterns and the coordinates of a plurality of ghost patterns with respect to the measurement reference position on the basis of a plurality of images captured by the at least one camera, field of view (FOV) information, and arrangement information.
[0148] For example, a ghost pattern may be generated on an automotive windshield that transmits half of the input light and reflects the other half. In more detail, referring to the accompanying drawings, light reflected within the windshield may form a second, slightly offset copy of each pattern, which appears as a ghost pattern.
[0149] In this case, the apparatus for measuring optical characteristics may calculate the coordinates of a plurality of ghost patterns corresponding to the coordinates of a plurality of patterns, respectively, in the same way as the method of calculating the coordinates of a plurality of patterns.
[0150] Finally, in operation S740, the apparatus for measuring optical characteristics may calculate ghosting levels on the basis of the coordinates of the plurality of patterns and the coordinates of the plurality of ghost patterns.
[0151] In this case, the apparatus for measuring optical characteristics may calculate a ghosting level from the gap between an original pattern and a corresponding ghost pattern.
[0152] In more detail, the apparatus for measuring optical characteristics may calculate a ghosting level, using Equation 8.
[0153] In Equation 8, Ghost is a ghosting level, x.sub.ij, y.sub.ij, and z.sub.ij are the x, y, and z coordinates of a horizontal i-th (i=1, 2, 3) and vertical j-th (j=1, 2, 3) pattern, and x.sub.Gij, y.sub.Gij, and z.sub.Gij are the x, y, and z coordinates of a horizontal i-th and vertical j-th ghost pattern.
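A sketch of the distance component of the ghosting level follows, assuming a simple average of the pattern-to-ghost gaps over the 3x3 grid indicated by the index ranges above (the angle component described in claims 9 and 17 is omitted, and the function name is illustrative):

```python
import math

def ghost_level(patterns, ghosts):
    """Average Euclidean gap between each pattern and its ghost.

    patterns, ghosts : dicts mapping (i, j) -> (x, y, z) coordinates,
    with i, j in {1, 2, 3} for the 3x3 test pattern.
    """
    gaps = [math.dist(patterns[key], ghosts[key]) for key in patterns]
    return sum(gaps) / len(gaps)
```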
[0155] Referring to the accompanying drawings, the apparatus for measuring optical characteristics may include a photographing unit 810, an acquisition unit 820, and a calculation unit 830.
[0156] The photographing unit 810 captures a test image including a plurality of patterns that is output in a virtual plane by the augmented reality (AR) device, using at least one camera disposed around a predetermined measurement reference position.
[0157] The acquisition unit 820 acquires field of view (FOV) information including information about the field of view (FOV) of the at least one camera, and arrangement information including information about the arrangement of the at least one camera.
[0158] Finally, the calculation unit 830 calculates the coordinates of a plurality of patterns with respect to the measurement reference position on the basis of a plurality of images captured by the at least one camera, field of view (FOV) information, and arrangement information.
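The division of labor among the three units described in paragraphs [0156] to [0158] can be illustrated with a minimal skeleton; the class and method names are illustrative and not part of the disclosure.

```python
class OpticalMeasurementApparatus:
    """Skeleton of apparatus 800: a photographing unit 810 captures images,
    an acquisition unit 820 gathers FOV and arrangement information, and a
    calculation unit 830 derives the pattern coordinates."""

    def __init__(self, cameras, fov_deg, camera_gap):
        self.cameras = cameras        # e.g. left, central, and right cameras
        self.fov_deg = fov_deg        # per-camera horizontal FOV (unit 820)
        self.camera_gap = camera_gap  # left-right camera distance (unit 820)

    def capture(self):
        """Photographing unit 810: capture one image per camera."""
        return [cam() for cam in self.cameras]

    def calculate(self, images):
        """Calculation unit 830: triangulate pattern coordinates from the
        images, the FOV information, and the arrangement information."""
        raise NotImplementedError
```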
[0159] In another exemplary embodiment, the at least one camera is a central camera positioned at the measurement reference position, and a left camera and a right camera symmetrically positioned with the measurement reference position therebetween. When a plurality of patterns is horizontally and vertically arranged in one test image, the calculation unit 830 may calculate the coordinates of a plurality of patterns, using the number of horizontal pixels of a plurality of captured images, the coordinates of a plurality of patterns in the plurality of captured images, the field of view (FOV) of at least one camera included in field of view (FOV) information, and the distance between the left camera and the right camera included in the arrangement information.
[0160] In another exemplary embodiment, the calculation unit 830 may calculate the coordinates of a plurality of patterns, using Equation 9.
[0161] In Equation 9, x.sub.ij, y.sub.ij, and z.sub.ij are the x-axial, y-axial, and z-axial coordinates of the horizontal i-th and vertical j-th pattern with respect to the measurement reference position, a is the distance between the left camera and the right camera, M is the number of horizontal pixels of a plurality of captured images, θ is the field of view (FOV) of at least one camera, x.sub.ij.sup.L is the horizontal coordinate of the horizontal i-th and vertical j-th pattern in the captured image of the left camera, x.sub.ij.sup.R is the horizontal coordinate of the pattern in the captured image of the right camera, and x.sub.ij.sup.C is the horizontal coordinate of the pattern in the captured image of the central camera.
[0162] In another exemplary embodiment, the calculation unit 830 may further calculate a virtual image distance between a measurement reference position and a virtual plane, using the coordinates of the measurement reference position and the coordinates of at least one of a plurality of patterns on the virtual plane.
[0163] In another exemplary embodiment, the calculation unit 830 may calculate a virtual image distance, using Equation 10.
[0164] In Equation 10, D.sub.VI is a virtual image distance, and x.sub.22, y.sub.22, and z.sub.22 are the coordinates of one of a plurality of patterns.
[0165] In another exemplary embodiment, the calculation unit 830 may further calculate a look down/up angle from a measurement reference position to a virtual plane, using the coordinates of the measurement reference position and the coordinates of at least one of a plurality of patterns on the virtual plane.
[0166] In another exemplary embodiment, the calculation unit 830 may calculate the look down/up angle, using Equation 11.
[0167] In Equation 11, the left-hand term is the look down/up angle, and x.sub.22, y.sub.22, and z.sub.22 are the coordinates of one of the plurality of patterns.
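Equation 11 also depends only on (x.sub.22, y.sub.22, z.sub.22). One plausible form, assumed here for illustration only, is the elevation angle of the central pattern as seen from the measurement reference position:

```python
import math

def look_down_up_angle(x22, y22, z22):
    """Elevation angle (degrees) from the measurement reference position
    (origin) to the central pattern P22 -- a hypothetical reading of
    Equation 11. Negative values indicate a look-down angle."""
    return math.degrees(math.atan2(y22, math.hypot(x22, z22)))
```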
[0168] In another exemplary embodiment, the calculation unit 830 may further calculate the horizontal field of view (FOV) of a measurement reference position, using the coordinates of the measurement reference position and the coordinates of two patterns positioned at both horizontal ends of a plurality of patterns in a virtual plane.
[0169] In another exemplary embodiment, the calculation unit 830 may calculate a horizontal field of view (FOV), using Equation 12.
[0170] In Equation 12, the left-hand term is the horizontal field of view (FOV), O is the coordinate of a measurement reference position, and P.sub.21 and P.sub.23 are the coordinates of the two patterns positioned at both horizontal ends.
[0171] In another exemplary embodiment, the calculation unit 830 may further calculate the vertical field of view (FOV) of a measurement reference position, using the coordinates of the measurement reference position and the coordinates of two patterns positioned at both vertical ends of a plurality of patterns in a virtual plane.
[0172] In another exemplary embodiment, the calculation unit 830 may calculate a vertical field of view (FOV), using Equation 13.
[0173] In Equation 13, the left-hand term is the vertical field of view (FOV), O is the coordinate of a measurement reference position, and P.sub.12 and P.sub.32 are the coordinates of the two patterns positioned at both vertical ends.
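Equations 12 and 13 are shown only in the figures, but both FOVs are determined by the reference position O and two edge patterns. A plausible sketch computes the angle subtended at O, which serves for the horizontal case (P.sub.21, P.sub.23) and the vertical case (P.sub.12, P.sub.32) alike:

```python
import math

def field_of_view(o, p_a, p_b):
    """Angle (degrees) subtended at the measurement reference position o
    by two edge patterns p_a and p_b -- a sketch consistent with the
    description of Equations 12 and 13; the exact forms are in the figures."""
    va = [a - b for a, b in zip(p_a, o)]   # vector O -> P_a
    vb = [a - b for a, b in zip(p_b, o)]   # vector O -> P_b
    dot = sum(x * y for x, y in zip(va, vb))
    na = math.sqrt(sum(x * x for x in va))
    nb = math.sqrt(sum(x * x for x in vb))
    return math.degrees(math.acos(dot / (na * nb)))
```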
[0174] In another exemplary embodiment, the calculation unit 830 may further calculate a static distortion for each of three axes with respect to a measurement reference position on the basis of the coordinates of a plurality of patterns in a virtual plane.
[0175] In another exemplary embodiment, the calculation unit 830 may further calculate the coordinates of a plurality of ghost patterns corresponding to the coordinates of a plurality of patterns, respectively, on the basis of a plurality of captured images, field of view (FOV) information, and arrangement information, and may further calculate ghosting levels on the basis of the coordinates of the plurality of patterns and the coordinates of the plurality of ghost patterns.
[0176] Meanwhile, the exemplary embodiments of the present disclosure described above may be written as programs executable on a computer and may be implemented on a general-purpose digital computer that executes the programs using a computer-readable recording medium.
[0177] The computer-readable recording medium includes a magnetic storage medium (e.g., a ROM, a floppy disk, and a hard disk) and an optical reading medium (e.g., a CD-ROM and a DVD).
[0178] Preferred exemplary embodiments of the present disclosure have been described. It would be understood by those skilled in the art that the present disclosure may be modified without departing from the scope of the present disclosure. Therefore, the disclosed exemplary embodiments should be considered in terms of explaining, not limiting. The scope of the present disclosure is defined by the claims, not the above description, and all differences within an equivalent range should be construed as being included in the present disclosure.
[0179] Referring to
[0180] Referring to
[0181] Referring to
[0182] Based on Equation 1, the coordinate values of the positions for the pattern of the image may be calculated.
[0183] Referring to Equation 1, in the optical characteristic measurement method/apparatus according to the embodiments, the left image including the points in the pattern may be captured by the left optical measurement device among the one or more optical measurement devices, the central image by the central optical measurement device, and the right image by the right optical measurement device. The index according to the embodiments may indicate the order of such capture, and the field of view (FOV) may correspond to the angle of view.
[0184] Referring to
[0185] Referring to
[0186] Referring to
[0187] Referring to
[0188] The optical characteristic measurement method according to the embodiments may further include: measuring a horizontal distortion for the virtual image plane based on a line connecting the center point, the top point in the center region and the bottom point in the center region; and measuring a vertical distortion for the virtual image plane based on a line connecting the center point, the left point in the center region and the right point in the center region.
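The distortion measures above are each defined by a line through three pattern points, but the patent gives no closed form here. The helper below is therefore a hypothetical metric: the perpendicular deviation of the middle point from the straight line through the two end points (zero for an undistorted, collinear triple):

```python
import math

def line_deviation(end_a, end_b, mid):
    """Perpendicular distance of a middle pattern point from the straight
    line through two end patterns -- a hypothetical distortion metric,
    not a formula taken from the patent."""
    d = [b - a for a, b in zip(end_a, end_b)]   # line direction end_a -> end_b
    v = [m - a for a, m in zip(end_a, mid)]     # vector end_a -> mid
    # 3D cross product v x d; its norm divided by |d| is the point-line distance.
    cross = (v[1] * d[2] - v[2] * d[1],
             v[2] * d[0] - v[0] * d[2],
             v[0] * d[1] - v[1] * d[0])
    return math.sqrt(sum(c * c for c in cross)) / math.sqrt(sum(x * x for x in d))
```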
[0189] The optical characteristic measurement method/device according to the embodiments may be referred to as a method/device according to the embodiments.
[0190]
[0191] The optical characteristic measurement device according to the embodiments of
[0192] The method/device according to the embodiments may include the following configuration for measuring image quality characteristics.
[0193] Referring to
[0194] When the user's eyes are positioned in the eye box, it is assumed that the user can see the entire virtual image through natural eye movement. The eye box position may be specified by the supplier; otherwise, it may be estimated according to the method provided in IEC 62629-62-11 (see especially 4.2.3). The imaging LMD may be set within the eye box position. A 3D image may be provided on the front or back surface of the virtual image plane. The 3D coordinate system of xyz shown in
[0195] Referring to
[0196] The configuration and conditions for measurement of the contrast and color of the 3D virtual overlay under the surrounding environment conditions are as follows.
[0197] Various virtual image content, such as numbers, characters (letters), and other symbols, may be reproduced as a virtual image by a 3D display such as a 3D HUD. In general, a clear 3D virtual image can be displayed against the real environment. To evaluate the image quality from this perspective, the contrast and color-related attributes should be measured for the 3D virtual image overlapping the real environment.
[0198] However, the real environment is very diverse. The background effect, in which the color and brightness characteristics of the 3D virtual image appear different from those of adjacent real objects, is a visual recognition problem and is therefore excluded from this criterion. To improve visibility, the brightness of the 3D virtual image may be controlled, according to a predetermined algorithm embedded in the 3D HUD, in response to changes in the illuminance level of the real environment. Therefore, a measurement method is proposed to evaluate changes in contrast (reflecting changes in black-and-white brightness) and changes in chromaticity of the 3D virtual image under various ambient conditions. The illuminance level and the correlated color temperature of the ambient conditions can be referenced.
[0199] The measurement is performed in a completely dark room. A white diffuser larger than both the horizontal and vertical FOVs of the virtual image plane is located behind the virtual image plane. Since the entire virtual plane can be observed when the user's eye is inside the eye box, the white diffuser screen should also be large enough to overlap the entire virtual plane when a light measurement device (LMD) is placed inside the eye box.
[0200] The illuminance and correlated color temperature of the ambient conditions are measured at the center of the white diffuser. The luminance reflected by the white diffuser is measured with zero input applied to the display to be measured. This measurement value is evaluated for the presence of stray light.
[0201]
[0202]
[0203] The optical characteristic measurement device according to the embodiments of
[0204] The 3D virtual image may be generated through a process of reflection and magnification through the half mirror and optical system of the 3D HUD of
[0205] Conditions for the measurement method according to the embodiments are as follows:
[0206] a) test pattern: test image with nine circles in
[0210] The flowchart of the measurement method according to the embodiments includes the following steps a), b), c), and d).
[0211] In step (a), a test signal is applied to the measurement device.
[0212] In step (b), three test pattern images are acquired by three imaging LMDs (i.e., LMD(L), LMD(C) and LMD(R)) shown in
[0213] In step (c), all positions (x.sub.ij, y.sub.ij, z.sub.ij), i, j=1, 2, 3, for P11 to P33 of the original test pattern and the corresponding positions (x.sub.Gij, y.sub.Gij, z.sub.Gij), i, j=1, 2, 3, for PG11 to PG33 of the ghost patterns are determined according to the procedure described above.
[0214] In step (d), the level of the ghost image is calculated as the average of the distances or angles between the original pattern positions P11 to P33 and the corresponding ghost pattern positions, as shown in
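Step (d) can be sketched as follows, under the assumption (for illustration) that the measurement reference position is the coordinate origin, so the angular separation of a pattern/ghost pair is the angle between their position vectors:

```python
import math

def ghost_levels(pattern, ghost):
    """Ghost level as the average point-to-point distance and the average
    angular separation (degrees), seen from the measurement reference
    position at the origin, over the pattern/ghost pairs (P11..P33 vs
    PG11..PG33 in the patent's notation)."""
    dists, angles = [], []
    for p, g in zip(pattern, ghost):
        dists.append(math.dist(p, g))
        dot = sum(a * b for a, b in zip(p, g))
        norm_p = math.sqrt(sum(a * a for a in p))
        norm_g = math.sqrt(sum(a * a for a in g))
        # Clamp to guard against rounding just outside [-1, 1].
        cos_t = max(-1.0, min(1.0, dot / (norm_p * norm_g)))
        angles.append(math.degrees(math.acos(cos_t)))
    n = len(dists)
    return sum(dists) / n, sum(angles) / n
```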
[0215]
[0216] The method/device according to the embodiments may generate a ghost level (distance) and a ghost level (angle) based on a difference value between the positions of the test pattern and the positions of the ghost patterns of the ghost image, as shown in
[0217] Optical measurement actions related to the ghost image according to the embodiments can be performed based on the levels obtained as described above.
[0218]
[0219] The method for measuring the optical characteristics according to the embodiments may include the following flowchart as shown in
[0220] In operation S2200, the optical measurement method according to the embodiment may include generating a test signal.
[0221] In operation S2201, the optical measurement method according to the embodiments may further include acquiring a test pattern for a virtual image.
[0222] In operation S2202, the optical measurement method according to the embodiments may further include generating positions of a test pattern for a virtual image and positions of a ghost pattern. The method of generating the positions has been described above with reference to
[0223] In operation S2203, the optical measurement method according to the embodiments may further include generating a ghost level. The method of obtaining the ghost level has been described above with reference to
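The four operations S2200 to S2203 can be strung together as a minimal pipeline. Here `capture` and `locate_patterns` are hypothetical callables standing in for the imaging LMDs and the triangulation procedure described above, and the returned ghost level is the distance-based variant:

```python
import math

def measure_ghost(capture, locate_patterns):
    """Sketch of the S2200-S2203 flow; `capture` and `locate_patterns`
    are hypothetical stand-ins for the hardware and geometry steps."""
    test_signal = "nine-circle test image"       # S2200: generate test signal
    images = capture(test_signal)                # S2201: acquire test pattern
    pattern, ghost = locate_patterns(images)     # S2202: pattern/ghost positions
    dists = [math.dist(p, g) for p, g in zip(pattern, ghost)]
    return sum(dists) / len(dists)               # S2203: ghost level (distance)
```

With stubbed capture and localization, for example, `measure_ghost(lambda s: ["img"], lambda im: ([(0, 0, 0)], [(0, 1, 0)]))` reduces the single pair to its separation.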
[0224]
[0225] The method according to the embodiments may be performed by the device of
[0226] The optical measurement unit may be a camera. The processor may be a processor that performs operations according to the embodiments. The memory may store data and information related to the operation of the processor. The memory may provide necessary data to the processor. The memory may be connected to the optical measurement unit, the processor, etc.
[0227] With respect to ghost level measurement, referring to the drawings described above, the method according to the embodiments may further include generating images including points in each pattern for a virtual image plane using one or more optical measurement devices, wherein each image is captured based on the one or more optical measurement devices and corresponds to at least one of a left image, a central image, and a right image. The method may further include generating positions of the points based on the one or more optical measurement devices and each pattern, and generating a level for a ghost image for the virtual image plane based on the generated positions, wherein the positions can be obtained based on a field of view (FOV) of the one or more optical measurement devices and a gap between the left optical measurement device and the right optical measurement device.
[0228] In addition, the step of generating a level for the ghost image may include generating a test signal; generating positions of a test pattern of a virtual image for the test signal and positions of a ghost pattern for the positions; and generating a ghost level based on the positions of the test pattern and the positions of the ghost pattern.
[0229] In addition, the ghost level may be obtained based on the average of distances between the positions of the test pattern and the positions of the ghost pattern and the average of angles between the positions of the test pattern and the positions of the ghost pattern.
[0230] The method according to the embodiments may be performed by the device. The device according to the embodiments may include a photographing unit that generates images including points in each pattern for a virtual image plane using one or more optical measurement devices, each image being captured based on the one or more optical measurement devices, and each image corresponding to at least one of a left image, a central image, and a right image; and a calculation unit that generates positions of points based on the one or more optical measurement devices and each pattern. The calculation unit may generate a level for a ghost image for the virtual image plane based on the generated positions, and the positions may be obtained based on a field of view (FOV) of the one or more optical measurement devices, and a gap between the left optical measurement device and the right optical measurement device. Each component of the device may include an interface for transmitting and receiving signals, a memory for storing operation-related information, and a processor for controlling the operation. The operations of the photographing unit and the calculation unit may be performed by the processor.
[0231] In addition, the processor may perform a step of generating a test signal, a step of generating positions of a test pattern of a virtual image for the test signal and positions of a ghost pattern for the positions, and a step of generating a ghost level based on the positions of the test pattern and the positions of the ghost pattern.
[0232] In addition, the ghost level may be obtained based on the average of distances between the positions of the test pattern and the positions of the ghost pattern, and the average of angles between the positions of the test pattern and the positions of the ghost pattern.
[0233] As a result, the 3D HUD vehicle service can be improved in terms of driver visibility and convenience. In addition, since a light measurement device is mounted on the vehicle, measurement parameters for measuring optical characteristics such as points and depth for a HUD virtual image can be efficiently obtained. Based on the obtained parameter information, a calibration process that can reduce errors from the driver's perspective can be efficiently performed. The 3D HUD can be combined with autonomous driving technology to enable safe and accurate autonomous driving. In addition, accurate and safe vehicle services can be provided to the driver by processing ghost images.
[0234] Embodiments have been described from the method and/or device perspective, and descriptions of methods and devices may be applied so as to complement each other.
[0235] Although the accompanying drawings have been described separately for simplicity, it is possible to design new embodiments by merging the embodiments illustrated in the respective drawings. Designing a recording medium readable by a computer on which programs for executing the above-described embodiments are recorded as needed by those skilled in the art also falls within the scope of the appended claims and their equivalents. The devices and methods according to embodiments may not be limited by the configurations and methods of the embodiments described above. Various modifications can be made to the embodiments by selectively combining all or some of the embodiments. Although preferred embodiments have been described with reference to the drawings, those skilled in the art will appreciate that various modifications and variations may be made in the embodiments without departing from the spirit or scope of the disclosure described in the appended claims. Such modifications are not to be understood individually from the technical idea or perspective of the embodiments.
[0236] Various elements of the devices of the embodiments may be implemented by hardware, software, firmware, or a combination thereof. Various elements in the embodiments may be implemented by a single chip, for example, a single hardware circuit. According to embodiments, the components according to the embodiments may be implemented as separate chips, respectively. According to embodiments, at least one or more of the components of the device according to the embodiments may include one or more processors capable of executing one or more programs. The one or more programs may perform any one or more of the operations/methods according to the embodiments or include instructions for performing the same. Executable instructions for performing the method/operations of the device according to the embodiments may be stored in a non-transitory CRM or other computer program products configured to be executed by one or more processors, or may be stored in a transitory CRM or other computer program products configured to be executed by one or more processors. In addition, the memory according to the embodiments may be used as a concept covering not only volatile memories (e.g., RAM) but also nonvolatile memories, flash memories, and PROMs. In addition, it may also be implemented in the form of a carrier wave, such as transmission over the Internet. In addition, the processor-readable recording medium may be distributed to computer systems connected over a network such that the processor-readable code may be stored and executed in a distributed fashion.
[0237] In this specification, the terms "/" and "," should be interpreted as "and/or." For instance, the expression "A/B" may mean "A and/or B." Further, "A, B" may mean "A and/or B." Further, "A/B/C" may mean "at least one of A, B, and/or C." Also, "A, B, C" may mean "at least one of A, B, and/or C." Further, in this specification, the term "or" should be interpreted as "and/or." For instance, the expression "A or B" may mean 1) only A, 2) only B, or 3) both A and B. In other words, the term "or" used in this document should be interpreted as "additionally or alternatively."
[0238] Terms such as first and second may be used to describe various elements of the embodiments. However, various components according to the embodiments should not be limited by the above terms. These terms are only used to distinguish one element from another. For example, a first user input signal may be referred to as a second user input signal. Similarly, the second user input signal may be referred to as a first user input signal. Use of these terms should be construed as not departing from the scope of the various embodiments. The first user input signal and the second user input signal are both user input signals, but do not mean the same user input signals unless context clearly dictates otherwise.
[0239] The terms used to describe the embodiments are used for the purpose of describing specific embodiments, and are not intended to limit the embodiments. As used in the description of the embodiments and in the claims, the singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise. The expression "and/or" is used to include all possible combinations of terms. Terms such as "includes" or "has" are intended to indicate the existence of figures, numbers, steps, elements, and/or components, and should be understood as not precluding the possibility of additional figures, numbers, steps, elements, and/or components. As used herein, conditional expressions such as "if" and "when" are not limited to an optional case and are intended to be interpreted, when a specific condition is satisfied, to perform the related operation or interpret the related definition according to the specific condition.
[0240] Operations according to the embodiments described in this specification may be performed by a transmission/reception device including a memory and/or a processor according to embodiments. The memory may store programs for processing/controlling the operations according to the embodiments, and the processor may control various operations described in this specification. The processor may be referred to as a controller or the like. In embodiments, operations may be performed by firmware, software, and/or a combination thereof. The firmware, software, and/or a combination thereof may be stored in the processor or the memory.
[0241] The operations according to the above-described embodiments may be performed by the transmission device and/or the reception device according to the embodiments. The transmission/reception device includes a transmitter/receiver configured to transmit and receive media data, a memory configured to store instructions (program code, algorithms, flowcharts and/or data) for a process according to embodiments, and a processor configured to control operations of the transmission/reception device.
[0242] The processor may be referred to as a controller or the like, and may correspond to, for example, hardware, software, and/or a combination thereof. The operations according to the above-described embodiments may be performed by the processor. In addition, the processor may be implemented as an encoder/decoder for the operations of the above-described embodiments.
[0243] As described above, related contents have been described in the best mode for carrying out the embodiments.
INDUSTRIAL APPLICABILITY
[0244] As described above, the embodiments may be fully or partially applied to the method and device for measuring the HUD ghost image.
[0245] It will be apparent to those skilled in the art that various changes or modifications can be made to the embodiments within the scope of the embodiments.
[0246] Thus, it is intended that the embodiments cover the modifications and variations of this disclosure provided they come within the scope of the appended claims and their equivalents.