Detecting tooth shade
11701208 · 2023-07-18
Assignee
Inventors
- Bo Esbech (Gentofte, DK)
- Rune Fisker (Virum, DK)
- Lars Henriksen (Bagsvaerd, DK)
- Tais Clausen (Klagshamn, SE)
CPC classification
G16H20/40
PHYSICS
G01J3/504
PHYSICS
A61C13/082
HUMAN NECESSITIES
International classification
Abstract
Disclosed is a method, a user interface and a system for use in determining shade of a patient's tooth, wherein a digital 3D representation including shape data and texture data for the tooth is obtained. A tooth shade value for at least one point on the tooth is determined based on the texture data of the corresponding point of the digital 3D representation and on known texture values of one or more reference tooth shade values.
Claims
1. A method for designing a denture, wherein the method comprises: obtaining a digital 3D representation of the patient's soft tissue, said digital 3D representation comprising: shape data expressing the shape of the soft tissue, and texture data expressing a texture profile of the soft tissue; and designing the denture such that a color of a soft tissue part of the denture is selected based on the texture profile of the corresponding part of the digital 3D representation, wherein the step of obtaining the digital 3D representation is provided by recording a series of sub-scans that comprise the shape data and the texture data, wherein the texture data at least partly is derived by combining texture information from corresponding parts of a number of the sub-scans.
2. The method according to claim 1, wherein the denture is an aesthetically pleasing denture defined by having a color that matches the color of the patient's soft tissue.
3. The method according to claim 1, wherein combining the texture information from the sub-scans comprises interpolating the texture information.
4. The method according to claim 1, wherein combining the texture information from the sub-scans comprises calculating an average value of the texture information.
5. The method according to claim 4, wherein the calculated average value is a weighted average of the texture information.
6. The method according to claim 1, wherein the method further comprises comparing the texture data with known texture values for soft tissue.
7. The method according to claim 1, wherein the method further comprises designing the denture such that a shape of the soft tissue part of the denture is selected based on the shape data of the corresponding part of the digital 3D representation.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The above and/or additional objects, features and advantages of the present invention will be further elucidated by the following illustrative and non-limiting detailed description of embodiments of the present invention, with reference to the appended drawings, wherein:
DETAILED DESCRIPTION
(8) In the following description, reference is made to the accompanying figures, which show by way of illustration how the invention may be practiced.
(10) In step 102 a series of sub-scans of the patient's set of teeth is recorded, where a plurality of said sub-scans comprises both texture information and shape information for the tooth.
(11) In step 103 a digital 3D representation of the tooth is generated from said sub-scans, where the digital 3D representation comprises texture data expressing a texture profile of the tooth. The digital 3D representation further comprises shape data expressing the shape of the tooth such that the shape of the tooth can be visualized in a user interface.
(12) In step 104 a tooth shade value for a point on the tooth is determined based on the texture data. This is done at least in part by comparing the texture data of the corresponding point of the digital 3D representation with a known texture value of one or more reference tooth shade values. The reference tooth shade values may be provided in the form of a library file and comprise tooth shade values and corresponding texture values based on e.g. the VITA 3D-Master and/or the VITA Classic tooth shade systems.
(14) The point or points on the tooth for which the tooth shade value(s) is/are determined can be selected by an operator. This can be the case e.g. when the digital 3D representation of the tooth is visualized in a user interface and the operator uses a pointing tool, such as a computer mouse, to indicate where on the digital 3D representation of the tooth he wishes to determine the tooth shade value. The point or points can also be selected by a computer implemented algorithm based on predetermined positions on the digital 3D representation of the tooth, such as a point arranged at a certain distance to the incisal edge of the tooth.
(15) The screen shot 210 seen in
(16) The screen shot 310 seen in
(17) The second region 314 is located at the patient's soft tissue. An anatomically correct tooth shade value can hence not be calculated from the texture data of that part of the digital 3D representation of the patient's teeth, and the corresponding certainty score is accordingly very low, as seen in the vertical bars of tooth value section 319.
(20) In step 531 a digital restoration design is created e.g. based on the shape data of a digital 3D representation of the patient's set of teeth and/or on a template digital restoration design loaded from a library. Template digital restoration designs may e.g. be used when the tooth is broken.
(21) In step 532 the tooth shade values of different points or regions of the teeth are derived from the texture data of the digital 3D representation of the patient's set of teeth. From the derived tooth shade values, or from tooth shade profiles created based on the derived tooth shade values, a desired shade profile for the dental restoration can be determined. This can be based on e.g. feature extraction, where shade values are extracted from the other teeth by e.g. identifying shade zones on these teeth and copying these zones to the dental restoration. It can also be based on established shade rules for teeth, e.g. a rule describing a relation between the tooth shade values or profiles of the canines and the anterior teeth.
(22) In step 533 the desired tooth shade value(s) for the dental restoration is merged into the digital restoration design.
(23) When the dental restoration is to be milled from a multicolored milling block, it is important that the dental restoration is milled from the correct parts of the milling block. In step 534 a CAD model of the milling block is provided, where the CAD model comprises information about the shade profile of the milling block material. The optimal position of the digital restoration design relative to the CAD model of the milling block is then determined in 535, where different criteria can be applied to provide the best fit between the desired shade profile and what actually can be obtained as dictated by the shade profile of the milling block.
(24) In step 536 the dental restoration is manufactured from the milling block by removing milling block material until the dental restoration is shaped according to the digital restoration design.
(26) Many scanning devices have Bayer color filter arrays with red, green, and blue filters and hence record color information in the RGB color space. For instance, a focus scanner can record a series of 2D color images for the generation of sub-scans, where the color information is provided in the RGB color space. The processor 644 then comprises algorithms for transforming the recorded color data into e.g. the L*a*b* or L*C*h color spaces.
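As an illustration, such a transformation from recorded RGB data to the L*a*b* color space can be sketched as below. This is a minimal sketch assuming 8-bit sRGB input and the D65 reference white; the scanner's actual transformation algorithms are not specified in this document.

```python
def srgb_to_lab(r, g, b):
    """Convert 8-bit sRGB values to CIE L*a*b* (D65 reference white)."""
    # Linearize sRGB (inverse gamma correction).
    def lin(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    rl, gl, bl = lin(r), lin(g), lin(b)

    # Linear RGB -> CIE XYZ using the standard sRGB matrix (D65).
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl

    # Normalize by the D65 white point and apply the CIE L*a*b* transfer function.
    xn, yn, zn = 0.95047, 1.0, 1.08883

    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116

    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    L = 116 * fy - 16
    a = 500 * (fx - fy)
    b_lab = 200 * (fy - fz)
    return L, a, b_lab
```

For example, a pure white pixel (255, 255, 255) maps to L* near 100 with a* and b* near zero.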
(27) The system may further comprise a unit 648 for transmitting a digital restoration design and a CAD model of a milling block to e.g. a computer aided manufacturing (CAM) device 649 for manufacturing a shaded dental restoration, or to another computer system e.g. located at a milling center where the dental restoration is manufactured. The unit for transmitting the digital restoration design can use a wired or a wireless connection.
(28) The scanning of the patient's set of teeth using the scanning device 641 can be performed at a dentist's clinic, while deriving the tooth shade values can be performed at a dental laboratory. In such cases the digital 3D representation of the patient's set of teeth can be provided via an internet connection between the dentist and the dental laboratory.
(30) Different scanner configurations can be used to acquire sub-scans comprising both shape and texture information. In some scanner designs the scanner is mounted on axes with encoders, so that sub-scans acquired from different orientations can be combined using position and orientation readings from the encoders. When the scanner operates by the focus-scanning technique, the individual sub-scans of the tooth are derived from a sequence of 2D images obtained while scanning a focus plane over a portion of the tooth. The focus scanning technique is described in detail in WO2010145669. The shape information of the sub-scans for an object, such as a tooth, can be combined by algorithms for stitching and registration as widely known in the literature. Texture data relating to the tooth color can be obtained using a scanner having a multi-chromatic light source, e.g. a white light source, and a color image sensor. Color information from multiple sub-scans can be interpolated and averaged by methods such as texture weaving, or by simply averaging corresponding color components of the sub-scans corresponding to the same point/location on the surface. Texture weaving is described by e.g. Callieri M, Cignoni P, Scopigno R. "Reconstructing textured meshes from multiple range rgb maps". VMV 2002, Erlangen, Nov. 20-22, 2002.
(31) In
(32) A digital 3D representation of the tooth can be generated by combining sub-scans acquired from different orientations relative to the teeth, e.g. by sub-scan registration. Sub-scans acquired from three such different orientations are illustrated in
(33) One way of doing this is to calculate the average value for each of the parameters used to describe the texture. For example, when the L*a*b* color system is used to describe the color information provided in each sub-scan, the color data of the digital 3D representation can be derived by averaging over each of the L*, a*, and b* parameters of the sub-scans. For example, the L* parameter of the color data for a given point P is then given by
(34) L*(P)=Σ.sub.i.sup.N L*.sub.i(P)/N
where N is the number of sub-scans used in deriving the texture data and L*.sub.i(P) is the L* parameter of the i'th sub-scan for the segment relating to P. Equivalent expressions are true for the a* and b* parameters for point P. The color parameters for each point on the digital 3D representation of the tooth can be determined for sections of or for the entire surface of the tooth, such that the generated digital 3D representation comprises both shape and texture information about the tooth. The spatial resolution of the color data does not necessarily have to be identical to the resolution of the shape data of the digital 3D representation. The point P can be described e.g. in Cartesian, cylindrical or polar coordinates.
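The per-point averaging above can be sketched in a few lines. The list of (L*, a*, b*) tuples is a hypothetical stand-in for the sub-scan segments covering point P; the document does not prescribe a data structure.

```python
def average_color(subscan_colors):
    """Average the L*, a*, b* parameters over N sub-scans for one surface point.

    subscan_colors: list of (L*, a*, b*) tuples, one per sub-scan segment
    covering the point P.
    """
    n = len(subscan_colors)
    L = sum(c[0] for c in subscan_colors) / n
    a = sum(c[1] for c in subscan_colors) / n
    b = sum(c[2] for c in subscan_colors) / n
    return L, a, b
```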
(35) When the color data is derived for a point on the tooth, the tooth shade value for that point can be determined by comparing the derived color data with the known color data of the reference tooth shade values of a tooth shade guide such as the VITA 3D-Master.
(38) In order to obtain more precise color data the averaging of the color information described above in relation to
(39) In
(40) This can be expressed by a modification of the equation given above. For a weighted averaging of the color information, the L* parameter of the color data for a given point P is given by L*(P)=Σ.sub.i.sup.N{α.sub.i(P)·L*.sub.i(P)}/Σ.sub.i.sup.Nα.sub.i(P), where α.sub.i(P) is the weight factor for the color information of the i'th sub-scan in the segment at P. When a given sub-scan (e.g. the j'th sub-scan) is recorded at an angle relative to the tooth surface which causes the optical path to be e.g. perpendicular to the tooth surface at P, the corresponding weight factor α.sub.j(P) is given a lower value than the weight factors of sub-scans acquired with an oblique angle between the optical path and the tooth surface.
(41) Equivalent equations are true for the a* and b* parameters of the color data for point P.
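A sketch of the weighted averaging for the L* parameter, assuming each sample carries the angle between the optical path and the surface normal at P. The sin-based weight, which down-weights near-perpendicular views, is an illustrative assumption rather than a weighting function prescribed by this document.

```python
import math

def weighted_average_L(samples):
    """Weighted average of the L* parameter over sub-scans at point P.

    samples: list of (L*_i, theta_i), where theta_i is the angle (radians)
    between the scanner's optical path and the surface normal at P.
    Near-perpendicular views (theta ~ 0) get a low weight here, consistent
    with the text's rule that perpendicular incidence is weighted lower.
    """
    # Hypothetical weight function: sin(theta) vanishes at perpendicular incidence.
    weights = [math.sin(theta) for _, theta in samples]
    num = sum(w * L for (L, _), w in zip(samples, weights))
    return num / sum(weights)
```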
(43) For a given point P on the digital 3D representation, the color data (L*.sub.P, a*.sub.P, b*.sub.P) has been determined, e.g. by combining the color information of a series of sub-scans used for generating the digital 3D representation. If the color information is originally recorded in the RGB color space, it is transformed into the L*a*b* color space using algorithms known to the skilled person.
(44) In the example illustrated by
(45) The reference shade values of the Vita Classical shade guide are: B1, A1, B2, D2, A2, C1, C2, D4, A3, D3, B3, A3.5, B4, C3, A4, and C4. The color data of these reference shades can be provided by scanning the corresponding pre-manufactured teeth of the shade guide. These color data are then also initially obtained in the RGB color space and can be converted to the L*a*b* color space using the same algorithms applied to the color information/data for the point P.
(46) The tooth shade value for the point is determined as the reference tooth shade value which has the smallest Euclidean distance to the point in the L*a*b* color space. The Euclidean distance ΔE.sub.P-Ri from the color (L*.sub.P, a*.sub.P, b*.sub.P) to the known colors of the reference tooth shade values is calculated using the expression:
(47) ΔE.sub.P-Ri=√((L*.sub.P-L*.sub.Ri).sup.2+(a*.sub.P-a*.sub.Ri).sup.2+(b*.sub.P-b*.sub.Ri).sup.2)
where Ri refers to the i'th reference tooth shade.
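The nearest-shade lookup via the Euclidean distance ΔE can be sketched as follows. The L*a*b* values in the table are placeholders, not measured shade-guide data; as the text notes, real values would be obtained by scanning the pre-manufactured teeth of the shade guide.

```python
import math

# Hypothetical L*a*b* values for a few Vita Classical reference shades.
REFERENCE_SHADES = {
    "A1": (78.0, 1.0, 15.0),
    "A2": (75.0, 2.5, 18.0),
    "B1": (79.0, 0.5, 12.0),
}

def nearest_shade(lab_p):
    """Return (shade, deltaE) for the reference shade whose known color has
    the smallest Euclidean distance to the point's color data in L*a*b* space."""
    def delta_e(ref):
        return math.sqrt(sum((p - r) ** 2 for p, r in zip(lab_p, ref)))

    shade = min(REFERENCE_SHADES, key=lambda s: delta_e(REFERENCE_SHADES[s]))
    return shade, delta_e(REFERENCE_SHADES[shade])
```

The returned ΔE can also serve as the basis for a certainty score, since a small distance to the selected reference shade indicates a more reliable match.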
(48) In
(49) The certainty score for the tooth shade value determined for point P depends on how close the color data of the point P is to the known color value of the selected reference tooth shade value. This can be quantified by the Euclidean distance and since point P is not particularly close to R2 in
(50) An alternative approach to using the Euclidean distance is to determine the individual parameters of the tooth shade value one at a time. This approach can be used e.g. when the reference tooth shade values are those of the Vita 3D-Master system.
(51) The reference tooth shade values of the Vita 3D-Master shade guide are expressed in codes consisting of the three parameters Lightness-hue-Chroma, where Lightness is given in values between 1 and 5, the Chroma in values between 1 and 3, and the hue as one of "L", "M", or "R". A shade code in the Vita 3D-Master can e.g. be 2M1, where the Lightness parameter equals 2, the Chroma 1 and the hue "M".
(52) The known color data of the VITA 3D-Master shade guide reference shades can be provided by scanning the pre-manufactured teeth of the shade guide. These color data are then also initially obtained in the RGB color space and can be converted to the L*a*b* color space using the same algorithms applied to the color information/data for the point P. The known color data of each reference shade (having a code expressed in terms of Lightness, hue and Chroma) is then provided in terms of the L*a*b* color space.
(53) Since the Lightness has the largest impact on the human perception of the tooth color, the value of the Lightness parameter L*.sub.P in the point is determined first. The value of L*.sub.P is compared with the values of the L* parameters for the reference tooth shades. If L*.sub.P is close to the L*-value for the i'th reference tooth shade value, L*.sub.Ri, the L* parameter for point P may be set equal to L*.sub.Ri.
(54) In some cases the Lightness parameter is not close to any of the references but instead is located almost in the middle between two L*-values. For example, when L*.sub.P in the point is between the values of L*.sub.Ri=2 and L*.sub.Ri+1=3 with almost equal distance to each of these as illustrated in
(55) The same procedure is then performed, first for the Chroma parameter and finally for the hue, such that all three parameters of the tooth shade value are determined.
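One hypothetical way to implement the per-parameter matching of a numeric shade parameter, including the intermediate-value case where the point falls almost halfway between two reference levels (e.g. a Lightness of 2.5 between levels 2 and 3). The function name and the midpoint tolerance are assumed choices for illustration.

```python
def match_parameter(value, reference_values, tol=0.25):
    """Match one numeric shade parameter (e.g. Lightness) against reference levels.

    Returns the nearest reference level, or the midpoint between two adjacent
    levels (e.g. 2.5) when the value falls within `tol` of that midpoint.
    """
    refs = sorted(reference_values)
    nearest = min(refs, key=lambda r: abs(value - r))
    # Find the reference levels bracketing the value.
    lower = max((r for r in refs if r <= value), default=nearest)
    upper = min((r for r in refs if r >= value), default=nearest)
    if upper > lower:
        mid = (lower + upper) / 2
        # Report the intermediate value when almost equidistant to both levels.
        if abs(value - mid) < tol:
            return mid
    return nearest
```

For the Lightness levels 1 to 5 of the Vita 3D-Master, a measured value of 2.45 would then yield the intermediate value 2.5, while 2.1 would simply yield level 2.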
(56) Although some embodiments have been described and shown in detail, the invention is not restricted to them, but may also be embodied in other ways within the scope of the subject matter defined in the following claims. In particular, it is to be understood that other embodiments may be utilized and structural and functional modifications may be made without departing from the scope of the present invention.
(57) In device claims enumerating several means, several of these means can be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims or described in different embodiments does not indicate that a combination of these measures cannot be used to advantage.
(58) A claim may refer to any of the preceding claims, and “any” is understood to mean “any one or more” of the preceding claims.
(59) It should be emphasized that the term “comprises/comprising” when used in this specification is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.
(60) The features of the method described above and in the following may be implemented in software and carried out on a data processing system or other processing means caused by the execution of computer-executable instructions. The instructions may be program code means loaded in a memory, such as a RAM, from a storage medium or from another computer via a computer network. Alternatively, the described features may be implemented by hardwired circuitry instead of software or in combination with software.
REFERENCES
(61) Hassel 2012: Hassel et al. "Determination of VITA Classical shades with the 3D-Master shade guide". Acta Odontol Scand. 2013; 71(3-4):721-6.
(62) Dozic 2007: Dozic et al. “Performance of five commercially available tooth color-measuring devices”, J Prosthodont. 2007; 16(2):93-100.
EMBODIMENTS
(63) 1. A method for determining shade of a patient's tooth, wherein the method comprises: obtaining a digital 3D representation of the tooth, where the digital 3D representation comprises shape data and texture data for the tooth; and determining a tooth shade value for at least one point on the tooth based on the texture data of the corresponding point of the digital 3D representation and on known texture values of one or more reference tooth shade values.
(64) 2. The method according to embodiment 1, wherein determining the tooth shade value for the point comprises selecting the reference tooth shade value with the known texture value closest to the texture data of the point.
(65) 3. The method according to embodiment 1 or 2, wherein determining the tooth shade value for the point comprises an interpolation of the two or more reference tooth shade values having known texture values close to the texture data of the point.
(66) 4. The method according to any one of the preceding embodiments, wherein the method comprises deriving a certainty score expressing the certainty of the determined tooth shade value.
(67) 5. The method according to embodiment 4, wherein the method comprises generating a visual representation of the certainty score and displaying this visual representation in a user interface.
(68) 6. The method according to embodiment 5, wherein the visual representation of the certainty score is displayed together with or is mapped onto the digital 3D representation of the tooth.
(69) 7. The method according to any one of embodiments 4 to 6, wherein the method comprises comparing the derived certainty score with a range of acceptable certainty score values.
(70) 8. The method according to any one of embodiments 4 to 7, wherein the certainty score relates to how uniform the sub-scan texture information is at the point, and/or to how close the texture data is to the known texture value of the determined tooth shade value, and/or to the amount of texture information used to derive the texture data at the point.
(71) 9. The method according to any one of embodiments 4 to 8, wherein the visual representation of the certainty score comprises a binary code, a bar structure with a color gradient, a numerical value, and/or a comparison between the texture data and the known texture value of the determined tooth shade value.
(72) 10. The method according to any one of the preceding embodiments, wherein the one or more reference tooth shade values relate to shade values for natural teeth with intact surface and/or to shade values for teeth prepared for a dental restoration.
(73) 11. The method according to any one of the preceding embodiments, wherein the method comprises comparing the texture data with known texture values for soft oral tissue, such as gum tissue and gingiva.
(74) 12. The method according to any of the previous embodiments, wherein the texture information comprises at least one of tooth color or surface roughness.
(75) 13. The method according to any one of the preceding embodiments, wherein the method comprises creating a shade profile for the tooth from tooth shade values determined at one or more points on the tooth.
(76) 14. The method according to embodiment 13, wherein the tooth shade profile comprises one or more tooth shade regions on the tooth surface, where an average tooth shade is derived for each region from tooth shade values determined for a number of points within the region.
(77) 15. The method according to any of the previous embodiments, wherein obtaining the digital 3D representation of the tooth comprises recording a series of sub-scans of the tooth, where at least one of said sub-scans comprises both texture information and geometry information for said tooth, and generating the digital 3D representation of the tooth from the recorded series of sub-scans.
(78) 16. The method according to embodiment 15, wherein the texture data at least partly are derived by combining the texture information from corresponding parts of a number of the sub-scans.
(79) 17. The method according to embodiment 16, wherein combining the texture information from the sub-scans comprises interpolating the texture information and/or calculating an average value of the texture information.
(80) 18. The method according to embodiment 17, wherein the calculated average value is a weighted average of the texture information.
(81) 19. A user interface for determining and displaying shade of a patient's tooth, wherein the user interface is configured for: obtaining a digital 3D representation of the tooth, said digital 3D representation comprising shape data and texture data for the tooth; displaying at least the shape data of the digital 3D representation such that the shape of the tooth is visualized in the user interface; determining a tooth shade value for at least one point on the tooth based on the texture data of the corresponding point of the digital 3D representation and on known texture values of one or more reference tooth shade values; and
displaying the determined tooth shade value.