ORAL SCANNER AND 3D OVERLAY IMAGE DISPLAY METHOD USING SAME
20210298582 · 2021-09-30
Assignee
Inventors
CPC classification
A61C9/006
HUMAN NECESSITIES
A61B1/24
HUMAN NECESSITIES
A61B1/0005
HUMAN NECESSITIES
International classification
A61B1/00
HUMAN NECESSITIES
A61B1/24
HUMAN NECESSITIES
A61C9/00
HUMAN NECESSITIES
Abstract
Disclosed are an oral scanner which is inserted into a patient's mouth and scans the mouth in a non-contact manner to generate a three-dimensional model, and a three-dimensional overlay image display method using the same. The oral scanner includes a projector which irradiates source light including a visible light source and an IR light source; a camera which senses light reflected from the source light to acquire a three-dimensional image of the mouth surface and an IR image of the inside of the gingiva; and an image processor which combines the IR image and the three-dimensional image to generate a three-dimensional duplication model, wherein the projector irradiates the IR image into the mouth as an overlay based on the three-dimensional duplication model.
Claims
1. An oral scanner, comprising: a projector which irradiates source light including a visible light source and an IR light source; a camera which senses light reflected from the source light to acquire a three-dimensional image of a mouth surface and an IR image of an inside of the gingiva; and an image processor which combines the IR image and the three-dimensional image to generate a three-dimensional duplication model, wherein the projector irradiates the IR image into the mouth as an overlay based on the three-dimensional duplication model.
2. The oral scanner according to claim 1, further comprising: a position sensing unit which senses an orientation of the oral scanner, wherein the image processor corrects the three-dimensional duplication model based on the orientation information received from the position sensing unit.
3. The oral scanner according to claim 1, wherein the projector includes: a light source which includes a beam combiner, and a visible light source and an IR light source disposed in the vicinity thereof; an illuminating unit which includes a display panel and a TIR prism; and a projective lens unit.
4. The oral scanner according to claim 1, wherein the camera includes: a focal lens unit into which reflected light is incident; and a sensing unit which includes a beam splitter, and a visual sensor and an IR sensor disposed in the vicinity thereof.
5. A three-dimensional overlay image display method, comprising: (a) acquiring a three-dimensional image of a surface in a mouth by irradiating a visible light source as a pattern and sensing reflected light thereof; (b) acquiring an IR image of the inside of the gingiva by irradiating an IR light source and sensing reflected light thereof; (c) generating a three-dimensional duplication model by combining the three-dimensional image and the IR image; and (d) irradiating the IR image into the mouth as an overlay by referring to the three-dimensional duplication model.
6. The 3D overlay image display method according to claim 5, wherein in steps (a), (b), and (d), the light sources are irradiated using the same irradiation optical system.
7. The 3D overlay image display method according to claim 6, wherein the irradiation optical system includes: a light source which includes a beam combiner, and a visible light source and an IR light source disposed in the vicinity thereof; an illuminating unit which includes a display panel and a TIR prism; and a projective lens unit.
8. The 3D overlay image display method according to claim 5, wherein in steps (a) and (b), the reflected light is sensed using the same sensing optical system.
9. The 3D overlay image display method according to claim 8, wherein the sensing optical system includes: a focal lens unit into which reflected light is incident; and a sensing unit which includes a beam splitter, and a visual sensor and an IR sensor disposed in the vicinity thereof.
10. The 3D overlay image display method according to claim 5, wherein in step (d), the irradiation is corrected according to the orientation of the three-dimensional duplication model.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0021]
[0022]
[0023]
[0024]
[0025]
[0026]
[0027]
[0028]
[0029]
DETAILED DESCRIPTION
[0030] Hereinafter, the present invention will be described in more detail with reference to exemplary embodiments. Prior to this, terms or words used in the present specification and claims should not be interpreted as being limited to their typical or dictionary meanings, but should be interpreted as having meanings and concepts which comply with the technical spirit of the present invention, based on the principle that an inventor can appropriately define the concept of a term to describe his or her own invention in the best manner. Therefore, the configurations illustrated in the embodiments described in the present specification are only the most preferred embodiments of the present invention and do not represent all of the technical spirit of the present invention, and thus it is to be understood that various equivalents and modifications which may replace these configurations were possible at the time of filing the present application. In the drawings, like components are denoted by like reference numerals. In addition, in the specification, unless explicitly described to the contrary, the word “comprise” and variations such as “comprises” or “comprising” will be understood to imply the inclusion of stated elements but not the exclusion of any other elements. In describing the present invention, when it is determined that a detailed description of known art related to the present invention may obscure the gist of the present invention, that detailed description will be omitted.
[0031]
[0032] Referring to
[0033]
[0034] Referring to
[0035] The visible light source 1124 basically serves as the source light required to acquire the “first image”. In this case, the visible light source 1124 is converted into a pattern by the display panel 1142 of the illuminating unit 114 before being irradiated. The visible light source 1124 also serves as the source light which irradiates the “second image”, which relates to the position of an artifact in the gingiva, for example, the implanted position of a metal fixture during an implant procedure, onto the mouth surface as an overlay so that it can be identified with the naked eye. The visible light source 1124 may be configured as an LED light source, and may use white light or, if necessary, separate RGB light to extract a 3D image using the chromatic aberration of each wavelength. An optical system with a large chromatic aberration has a focal surface which varies depending on the RGB light, that is, a projection length which varies depending on the wavelength, which is advantageous for extracting 3D information.
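The depth-from-chromatic-aberration idea in paragraph [0035] above can be sketched in a few lines: with a deliberately large axial chromatic aberration, each color channel focuses at a different distance, so the sharpest channel at an image patch hints at the local surface depth. The following is a minimal illustrative sketch, not part of the disclosure; the function names and the Laplacian-variance focus measure are assumptions of this note.

```python
import numpy as np

def sharpness(channel: np.ndarray) -> float:
    """Focus measure: variance of a discrete 5-point Laplacian of one channel."""
    lap = (-4.0 * channel[1:-1, 1:-1]
           + channel[:-2, 1:-1] + channel[2:, 1:-1]
           + channel[1:-1, :-2] + channel[1:-1, 2:])
    return float(lap.var())

def sharpest_channel(rgb: np.ndarray) -> int:
    """Return the index (0=R, 1=G, 2=B) of the in-focus channel for a patch.

    Because each wavelength has a different focal distance in a high-chromatic-
    aberration optic, the sharpest channel indicates the approximate depth band
    of the surface at this patch.
    """
    return int(np.argmax([sharpness(rgb[..., c]) for c in range(3)]))
```

In a real scanner the per-channel focus measure would be mapped to calibrated depth values; here only the channel selection step is shown.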
[0036] The IR light source 1126 serves as the source light required for acquiring the second image, and in this case, the wavelength of the IR light source 1126 is desirably controlled so that it penetrates deeply into the hypodermis of the gingiva. The IR light source 1126 also has a large chromatic aberration, which is advantageous for extracting a 3D image.
[0037] The beam combiner 1122 serves to combine the visible light source 1124 and the IR light source 1126 onto the same optical axis so that the same optical system, excluding the light sources, can be used simultaneously or sequentially. The beam combiner 1122 may use a prism or may have a plate shape with a coating treatment. When a dichroic coating treatment is applied, the beam combiner is designed to reflect the visible light source 1124 and transmit the IR light source 1126.
[0038] The display panel 1142 of the illuminating unit 114 may be configured as a micro element such as a DMD of Texas Instruments Incorporated, and serves to change the angle of the light irradiated for every pixel according to the image information so as to transmit the light to the projective lens unit 116. During this process, the TIR prism 1144 of the illuminating unit 114, which introduces little angle change, serves to control the light to satisfy the total reflection condition so as to separate the incident light and the emitted light within the same space.
[0039] In the meantime, a relay lens 118 interposed between the beam combiner 1122 and the illuminating unit 114 serves to guide the light emitted from the beam combiner 1122 so that it is irradiated onto the TIR prism 1144 of the illuminating unit 114 with a predetermined angle and illuminance.
[0040] The projective lens unit 116 serves to project the image information generated by the display panel 1142 of the illuminating unit 114 into the mouth, and generates a chromatic aberration to enhance the 3D effect.
[0041] Referring to
[0042] The image processor 130 generates 3D data from the “first image” and the “second image” and generates a three-dimensional duplication model based thereon. The three-dimensional duplication model is 3D data obtained by combining the “first image” and the “second image”, and serves as the reference during the process of irradiating the “second image” into the mouth as an overlay.
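The combining step in paragraph [0042] can be illustrated with a toy sketch: each vertex of the 3D surface model (first image) is projected into the IR image (second image) with a pinhole camera model, and the sampled IR intensity is attached to the vertex, yielding a combined "duplication model". All names, the pinhole intrinsics, and nearest-pixel sampling are illustrative assumptions of this note, not the disclosed implementation.

```python
import numpy as np

def build_duplication_model(vertices: np.ndarray, ir_image: np.ndarray,
                            focal_px: float, cx: float, cy: float) -> np.ndarray:
    """Attach an IR intensity to each 3D surface vertex (toy sketch).

    vertices : (N, 3) points in the camera frame, z > 0
    ir_image : (H, W) IR intensities from the gingiva scan
    Returns an (N, 4) array [x, y, z, ir]: a minimal combined model.
    """
    h, w = ir_image.shape
    # Pinhole projection of each vertex into IR pixel coordinates,
    # clipped to the image bounds (hypothetical intrinsics focal_px, cx, cy).
    u = np.clip((focal_px * vertices[:, 0] / vertices[:, 2] + cx).astype(int), 0, w - 1)
    v = np.clip((focal_px * vertices[:, 1] / vertices[:, 2] + cy).astype(int), 0, h - 1)
    ir = ir_image[v, u]  # nearest-pixel sampling
    return np.column_stack([vertices, ir])
```

A production pipeline would also register the two images and interpolate rather than sample the nearest pixel; only the data-fusion step is shown.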
[0043] The position sensing unit 140 serves to sense the orientation of the oral scanner 10. The orientation information of the oral scanner 10 measured by the position sensing unit 140 is transmitted to the image processor 130, and the image processor 130 corrects the three-dimensional duplication model based on the orientation information. Since the overlay irradiation of the second image into the mouth is performed based on the three-dimensional duplication model corrected in real time according to the movement of the oral scanner 10, the irradiation position of the overlaid second image can be kept constant even when the oral scanner 10 moves during the procedure.
[0044] The position sensing unit 140 may be implemented by hardware or software means. As a hardware means, for example, an acceleration sensor, a geomagnetic sensor, or a GPS sensor may be used to sense the XYZ coordinate position and the αβγ angle information of the oral scanner 10. As a software means, for example, the real-time image information sensed by the sensing optical system of the camera 120 may be compared with image information acquired in advance to sense the orientation information of the oral scanner 10.
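The real-time correction described in paragraphs [0043] and [0044] amounts to re-expressing the duplication model in the scanner's current pose so that the projected second image stays fixed on the teeth while the scanner moves. A minimal sketch, assuming XYZ translation plus Z-Y-X Euler angles (αβγ) as the sensed pose; the function names and the Euler convention are assumptions of this note.

```python
import numpy as np

def euler_to_matrix(alpha: float, beta: float, gamma: float) -> np.ndarray:
    """Rotation matrix from intrinsic Z-Y-X Euler angles (radians)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    rz = np.array([[ca, -sa, 0.0], [sa, ca, 0.0], [0.0, 0.0, 1.0]])
    ry = np.array([[cb, 0.0, sb], [0.0, 1.0, 0.0], [-sb, 0.0, cb]])
    rx = np.array([[1.0, 0.0, 0.0], [0.0, cg, -sg], [0.0, sg, cg]])
    return rz @ ry @ rx

def correct_model(points: np.ndarray, translation: np.ndarray,
                  alpha: float, beta: float, gamma: float) -> np.ndarray:
    """Re-express model points in the scanner's current frame.

    Applying the inverse of the sensed pose keeps the overlaid second image
    at a constant position in the mouth as the scanner moves.
    """
    r = euler_to_matrix(alpha, beta, gamma)
    # Row-vector times R equals R^T applied to each point: the inverse rotation.
    return (points - translation) @ r
```

In practice this correction would be re-run for every sensor update before the projector refreshes the overlay.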
[0045]
[0046]
[0047] In the meantime, after generating a three-dimensional duplication model (S30 of
[0048]
[0049] The above description relates to specific exemplary embodiments of the present invention. The above-described exemplary embodiments are not to be understood as limiting the matters disclosed for the purpose of explanation or the scope of the present invention, and it is understood that those skilled in the art can make various modifications or changes without departing from the gist of the present invention. Therefore, all such changes and modifications correspond to the scope of the invention disclosed in the claims or equivalents thereof.