Detecting tooth shade

11701208 · 2023-07-18

Abstract

Disclosed is a method, a user interface and a system for use in determining the shade of a patient's tooth, wherein a digital 3D representation including shape data and texture data for the tooth is obtained. A tooth shade value for at least one point on the tooth is determined based on the texture data of the corresponding point of the digital 3D representation and on known texture values of one or more reference tooth shade values.

Claims

1. A method for designing a denture, wherein the method comprises: obtaining a digital 3D representation of the patient's soft tissue, said digital 3D representation comprising: shape data expressing the shape of the soft tissue, and texture data expressing a texture profile of the soft tissue; and designing the denture such that a color of a soft tissue part of the denture is selected based on the texture profile of the corresponding part of the digital 3D representation, wherein the step of obtaining the digital 3D representation is provided by recording a series of sub-scans that comprise the shape data and the texture data, wherein the texture data at least partly is derived by combining texture information from corresponding parts of a number of the sub-scans.

2. The method according to claim 1, wherein the denture is an aesthetically pleasing denture defined by having the color that matches the color of the patient's soft tissue.

3. The method according to claim 1, wherein combining the texture information from the sub-scans comprises interpolating the texture information.

4. The method according to claim 1, wherein combining the texture information from the sub-scans comprises calculating an average value of the texture information.

5. The method according to claim 4, wherein the calculated average value is a weighted average of the texture information.

6. The method according to claim 1, wherein the method further comprises comparing the texture data with known texture values for soft tissue.

7. The method according to claim 1, wherein the method further comprises designing the denture such that a shape of the soft tissue part of the denture is selected based on the shape data of the corresponding part of the digital 3D representation.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) The above and/or additional objects, features and advantages of the present invention, will be further elucidated by the following illustrative and non-limiting detailed description of embodiments of the present invention, with reference to the appended drawings, wherein:

(2) FIG. 1 shows an example of a flow chart for an embodiment.

(3) FIGS. 2 to 4 show parts of screen shots of user interfaces.

(4) FIG. 5 shows steps of a method for designing a dental restoration.

(5) FIG. 6 shows a schematic of a system for determining tooth shade values.

(6) FIGS. 7A-7D and 8A-8B show schematics of intra-oral scanning.

(7) FIGS. 9A-9B illustrate one way of determining tooth shade values from texture data.

DETAILED DESCRIPTION

(8) In the following description, reference is made to the accompanying figures, which show by way of illustration how the invention may be practiced.

(9) FIG. 1 shows an example of a flow chart 100 for an embodiment of the method for determining shade of a patient's tooth.

(10) In step 102 a series of sub-scans of the patient's set of teeth is recorded, where a plurality of said sub-scans comprises both texture information and shape information for the tooth.

(11) In step 103 a digital 3D representation of the tooth is generated from said sub-scans, where the digital 3D representation comprises texture data expressing a texture profile of the tooth. The digital 3D representation further comprises shape data expressing the shape of the tooth such that the shape of the tooth can be visualized in a user interface.

(12) In step 104 a tooth shade value for a point on the tooth is determined based on the texture data. This is done at least in part by comparing the texture data of the corresponding point of the digital 3D representation with a known texture value of one or more reference tooth shade values. The reference tooth shade values may be provided in the form of a library file and comprise tooth shade values and corresponding texture values based on e.g. the VITA 3D-Master and/or the VITA Classic tooth shade systems.
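As a sketch, such a library file can be modeled as a mapping from reference shade codes to representative texture values. The shade codes below follow the VITA naming, but the L*a*b* numbers are illustrative placeholders, not measured VITA data:

```python
# Hypothetical reference library: shade code -> representative L*a*b* value.
# The numbers are illustrative placeholders, not measured VITA data.
REFERENCE_SHADES = {
    "A1": (78.0, 1.0, 15.0),
    "A2": (74.5, 1.8, 18.5),
    "B1": (79.5, 0.5, 13.0),
    "B2": (75.0, 1.2, 17.0),
}

def lookup_candidates(lab, library=REFERENCE_SHADES):
    """Return shade codes sorted nearest-first by Euclidean distance in L*a*b*."""
    L, a, b = lab
    def dist(item):
        Lr, ar, br = item[1]
        return ((L - Lr) ** 2 + (a - ar) ** 2 + (b - br) ** 2) ** 0.5
    return [code for code, _ in sorted(library.items(), key=dist)]

print(lookup_candidates((78.8, 0.7, 13.5)))  # nearest candidate listed first
```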

(13) FIGS. 2 to 4 show parts of screen shots from user interfaces in which derived tooth shade values and visual representations of the corresponding certainty scores for a number of tooth regions are displayed on the digital 3D representation of the patient's set of teeth.

(14) The point or points on the tooth for which the tooth shade value(s) is/are determined can be selected by an operator. This can be the case e.g. when the digital 3D representation of the tooth is visualized in a user interface and the operator uses a pointing tool, such as a computer mouse, to indicate where on the digital 3D representation of the tooth he wishes to determine the tooth shade value. The point or points can also be selected by a computer implemented algorithm based on predetermined positions on the digital 3D representation of the tooth, such as a point arranged at a certain distance to the incisal edge of the tooth.

(15) The screen shot 210 seen in FIG. 2 shows three regions 212, 213, 214 on the digital 3D representation of the patient's set of teeth. Two of these, 212 and 213, are selected at the part of the digital 3D representation corresponding to the tooth 211, while the third, 214, is selected on the soft tissue part 215 of the digital 3D representation. The average tooth shade value for a region can be calculated by averaging over tooth shade values derived for a number of points within the region or by calculating an average texture value for the region and determining the average tooth shade value therefrom. The average tooth shade values are displayed in tooth value sections 217, 218, 219 linked to the regions in the user interface. In the tooth value sections 217, 218 relating to the regions 212, 213, two tooth shade values are displayed, where the upper shade value is derived using known texture values corresponding to the reference tooth shade values of the VITA 3D-Master tooth shade system and the lower shade value relates to the VITA Classic tooth shade system. It is also seen that for the region 213 closest to the gingiva, the tooth shade is determined to be 2L1.5 in the VITA 3D-Master system and B1 in the VITA Classic system. In FIG. 2 the certainty scores for the derived tooth shade values are visualized as a certainty score indicator displayed next to the tooth shade values. In FIG. 2 the visualization of the certainty score indicator is in the form of a checkmark which indicates that the certainty score is sufficiently good that the derived tooth shade values can be relied upon. The color of the checkmark may provide further information about the certainty score, such as in cases where a green checkmark indicates a more certain tooth shade value than a yellow checkmark. The third region 214 is located at the patient's soft tissue. An anatomically correct tooth shade value can hence not be calculated from the texture data of that part of the digital 3D representation of the patient's teeth, and the corresponding certainty score is accordingly very low. The visualization of the certainty score in the tooth value section 219 is hence a cross indicating that the derived shade value was rejected. Further, no shade value is indicated in the tooth value section 219.
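The region averaging described above can be sketched as follows, assuming each sampled point within a region carries an L*a*b* texture value (the data below are illustrative):

```python
def region_average_lab(points):
    """Average the L*, a*, b* components over all sampled points in a region;
    a shade value can then be determined from the averaged texture value."""
    n = len(points)
    sums = [0.0, 0.0, 0.0]
    for lab in points:
        for k in range(3):
            sums[k] += lab[k]
    return tuple(s / n for s in sums)

region = [(76.0, 1.0, 16.0), (78.0, 1.4, 17.0), (77.0, 1.2, 15.0)]
print(region_average_lab(region))  # ≈ (77.0, 1.2, 16.0)
```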

(16) The screen shot 310 seen in FIG. 3 shows two regions 312, 314 on the digital 3D representation of the patient's set of teeth. One of these regions, 312, is selected at the part of the digital 3D representation corresponding to the tooth 311, while the second region, 314, is selected on the soft tissue part 315 of the digital 3D representation. The average tooth shade value for a region can be calculated as described above in relation to FIG. 2. Shade value sections 317, 319 are also displayed for the regions 312, 314. Two tooth shade values 321 are derived for the region 312 and displayed in the corresponding tooth value section 317, where the upper value is derived using known texture values corresponding to the reference tooth shade values of the VITA 3D-Master tooth shade system (derived tooth shade value 1.5M1) and the lower value using the VITA Classic tooth shade system (derived tooth shade value B1). In FIG. 3 the certainty score is visualized in the form of a certainty score indicator 322 comprising a vertical bar with a color gradient going from red, representing a poor certainty score, to green, representing a good certainty score. The certainty score indicator has a marker indicating the certainty score on the bar. It is seen that the tooth shade value 1.5M1 of the VITA 3D-Master system is more certain than the tooth shade value B1 of the VITA Classic system for this region. The tooth shade value of 1.5M1 is found by interpolation of the reference tooth shades 1M1 and 2M2.

(17) The second region 314 is located at the patient's soft tissue. An anatomically correct tooth shade value can hence not be calculated from the texture data of that part of the digital 3D representation of the patient's teeth, and the corresponding certainty score is accordingly very low, as seen in the vertical bars of tooth value section 319.

(18) FIG. 4 shows a screen shot 410 where determined tooth shade values are derived for a total of 15 regions on the digital 3D representation of the tooth 411. The tooth shade values are all derived based on the known texture values of the reference tooth shade values of the VITA 3D-Master tooth shade system. The certainty scores are visualized in the form of a certainty score indicator comprising a vertical bar with a color gradient going from red, representing a poor certainty score, to green, representing a good certainty score. As can be seen in the tooth value sections 417, 418 of the user interface, there are large variations in the certainty scores. For example, the certainty score for the region 412 is almost at maximum, while the certainty score of the region 413 is much closer to the threshold for acceptable certainty score values. When tooth shade values are determined for a number of points on the tooth, the points may be arranged in a grid over the part of the digital 3D representation of the tooth.

(19) FIG. 5 shows steps of a method for designing a dental restoration.

(20) In step 531 a digital restoration design is created, e.g. based on the shape data of a digital 3D representation of the patient's set of teeth and/or on a template digital restoration design loaded from a library. Template digital restoration designs may e.g. be used when the tooth is broken.

(21) In step 532 the tooth shade values of different points or regions of the teeth are derived from the texture data of the digital 3D representation of the patient's set of teeth. From the derived tooth shade values or from tooth shade profiles created based on the derived tooth shade values a desired shade profile for the dental restoration can be determined. This can be based on e.g. feature extraction where shade values are extracted from the other teeth by e.g. identifying shade zones on these teeth and copying these zones to the dental restoration. It can also be based on established shade rules for teeth, e.g. a rule describing a relation between the tooth shades values or profiles of the canines and the anterior teeth.

(22) In step 533 the desired tooth shade value(s) for the dental restoration is merged into the digital restoration design.

(23) When the dental restoration is to be milled from a multicolored milling block it is important that the dental restoration is milled from the correct parts of the milling block. In step 534 a CAD model of the milling block is provided, where the CAD model comprises information about the shade profile of the milling block material. The optimal position of the digital restoration design relative to the CAD model of the milling block is then determined in step 535, where different criteria can be applied to provide the best fit between the desired shade profile and what can actually be obtained as dictated by the shade profile of the milling block.
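The placement step can be illustrated with a deliberately simplified sketch: the block's shade is assumed to vary linearly with depth, and candidate vertical offsets of the restoration design are scored by their squared mismatch against the desired shade profile. The gradient model, the sampling and all names are hypothetical, not taken from the text:

```python
# Hypothetical model: the multicolored block's lightness varies linearly with
# depth z; the restoration design samples its desired lightness at a few depths.
def block_lightness(z, top=80.0, bottom=70.0, height=10.0):
    """Lightness of the block material at depth z (assumed linear gradient)."""
    return top + (bottom - top) * (z / height)

def placement_cost(offset, desired):
    """Sum of squared mismatches between the desired lightness values and the
    block lightness at each sample depth, for a given vertical offset."""
    return sum((block_lightness(z + offset) - want) ** 2 for z, want in desired)

# (depth within the design, desired L*) sample pairs
desired = [(0.0, 78.0), (2.0, 76.0), (4.0, 74.0)]
# Evaluate candidate offsets 0.0 .. 6.0 in steps of 0.1 and keep the best fit
best = min((o / 10 for o in range(0, 61)), key=lambda o: placement_cost(o, desired))
print(best)  # offset that best matches the desired shade profile
```

A real system would search over full 3D poses against a measured block shade model; the one-dimensional search above only illustrates the fitting criterion.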

(24) In step 536 the dental restoration is manufactured from the milling block by removing milling block material until the dental restoration is shaped according to the digital restoration design.

(25) FIG. 6 shows a schematic of a system for determining tooth shade values. The system 640 comprises a computer device 642 comprising a computer readable medium 643 and a processor 644. The system further comprises a visual display unit 647, a computer keyboard 645 and a computer mouse 646 for entering data and activating virtual buttons in a user interface visualized on the visual display unit 647. The visual display unit can be a computer screen. The computer device 642 is capable of receiving a digital 3D representation of the patient's set of teeth from a scanning device 641, such as the TRIOS intra-oral color scanner manufactured by 3Shape A/S, or capable of receiving scan data from such a scanning device and forming a digital 3D representation of the patient's set of teeth based on such scan data. The obtained digital 3D representation can be stored in the computer readable medium 643 and provided to the processor 644. The processor is configured for implementing the method according to any of the embodiments. This may involve presenting one or more options to the operator, such as where to derive the tooth shade value and whether to accept a derived tooth shade value. The options can be presented in the user interface visualized on the visual display unit 647.

(26) Many scanning devices have Bayer color filters with red, green and blue filters and hence record color information in the RGB color space. For instance, a focus scanner can record series of 2D color images for the generation of sub-scans, where the color information is provided in the RGB color space. The processor 644 then comprises algorithms for transforming the recorded color data into e.g. the L*a*b* or L*C*h color spaces.
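Such a transform can be sketched with the standard sRGB to CIE L*a*b* conversion (D65 white point, CIE76 formulation); a real scanner pipeline would use the device's own calibrated transform rather than the generic sRGB assumption below:

```python
def srgb_to_lab(r, g, b):
    """Convert an 8-bit sRGB color to CIE L*a*b* (D65 white point)."""
    def linearize(c):
        # Undo the sRGB gamma encoding
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = linearize(r), linearize(g), linearize(b)
    # Linear sRGB -> CIE XYZ (D65)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    # Normalize by the D65 reference white
    xn, yn, zn = x / 0.95047, y / 1.0, z / 1.08883
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(xn), f(yn), f(zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

print(srgb_to_lab(255, 255, 255))  # ≈ (100, 0, 0): white
```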

(27) The system may further comprise a unit 648 for transmitting a digital restoration design and a CAD model of a milling block to e.g. a computer aided manufacturing (CAM) device 649 for manufacturing a shaded dental restoration or to another computer system e.g. located at a milling center where the dental restoration is manufactured. The unit for transmitting the digital restoration design can be a wired or a wireless connection.

(28) The scanning of the patient's set of teeth using the scanning device 641 can be performed at a dentist while deriving the tooth shade values can be performed at a dental laboratory. In such cases the digital 3D representation of the patient's set of teeth can be provided via an internet connection between the dentist and the dental laboratory.

(29) FIGS. 7A-7D and 8A-8B show schematics of intra-oral scanning.

(30) Different scanner configurations can be used to acquire sub-scans comprising both shape and texture information. In some scanner designs the scanner is mounted on axes with encoders, so that sub-scans acquired from different orientations can be combined using position and orientation readings from the encoders. When the scanner operates by the focus-scanning technique, the individual sub-scans of the tooth are derived from a sequence of 2D images obtained while scanning a focus plane over a portion of the tooth. The focus scanning technique is described in detail in WO2010145669. The shape information of the sub-scans for an object, such as a tooth, can be combined by algorithms for stitching and registration as widely known in the literature. Texture data relating to the tooth color can be obtained using a scanner having a multi-chromatic light source, e.g. a white light source, and a color image sensor. Color information from multiple sub-scans can be interpolated and averaged by methods such as texture weaving, or by simply averaging corresponding color components of the sub-scans corresponding to the same point/location on the surface. Texture weaving is described by e.g. Callieri M, Cignoni P, Scopigno R. "Reconstructing textured meshes from multiple range rgb maps". VMV 2002, Erlangen, Nov. 20-22, 2002.

(31) In FIG. 7A the scanner 741 (here represented by a cross-sectional view of the scanner tip) is held in one position relative to the teeth 711, 760 (also represented by a cross-sectional view) while recording a sequence of 2D images for one sub-scan. The illustrated teeth can e.g. be the anterior teeth in the lower jaw. The size of the Field of View (here represented by the full line 761 on the teeth) of the scanner is determined by the light source, the optical components and the image sensor of the scanner. In the illustrated example, the Field of View 761 covers part of the surface of the tooth 711 and part of the surface of the neighbor tooth 760. The generated digital 3D representation can thus also contain data for the neighbor teeth. This is often advantageous, e.g. when the generated digital 3D representation is used for creating a digital restoration design for the manufacture of a dental restoration for the tooth. In the Figure, the scanner is arranged such that the acquired sub-scan comprises shape and color information for the incisal edge 762 of the teeth. The probe light rays 763 from the scanner corresponding to the perimeter of the Field of View are also shown in the Figure. These probe light rays 763 define the optical path 764 of the scanner probe light at the tooth 711.

(32) A digital 3D representation of the tooth can be generated by combining sub-scans acquired from different orientations relative to the teeth, e.g. by sub-scan registration. Sub-scans acquired from three such different orientations are illustrated in FIGS. 7B, 7C and 7D, where only the optical path 763 of the scanner probe light is used to represent the relative scanner/tooth orientation in FIGS. 7C and 7D. The sub-scans (here represented by the full line 765 on the teeth) cover different but overlapping sections of the tooth surface such that the sub-scans can be combined by registration into a common coordinate system using e.g. an Iterative Closest Point (ICP) algorithm as described above. A segment of each of the sub-scans corresponds to the point P on the tooth surface. When the sub-scans are registered to generate a digital 3D representation of the tooth, a correlation between these segments is established and the texture information of these sub-scan segments can be combined to determine the texture data for point P on the generated digital 3D representation of the tooth.

(33) One way of doing this is to calculate the average value for each of the parameters used to describe the texture. For example, when the L*a*b* color system is used to describe the color information provided in each sub-scan, the color data of the digital 3D representation can be derived by averaging over each of the L*, a*, and b* parameters of the sub-scans. For example, the L* parameter of the color data for a given point P is then given by

(34) L*(P) = (1/N) Σ_{i=1}^{N} L*_i(P)
where N is the number of sub-scans used in deriving the texture data and L*_i(P) is the L* parameter of the i'th sub-scan for the segment relating to P. Equivalent expressions hold for the a* and b* parameters for point P. The color parameters for each point on the digital 3D representation of the tooth can be determined for sections of or for the entire surface of the tooth, such that the generated digital 3D representation comprises both shape and texture information about the tooth. The spatial resolution of the color data does not necessarily have to be identical to the resolution of the shape data of the digital 3D representation. The point P can be described e.g. in Cartesian, cylindrical or polar coordinates.
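Equation (34) applied per color component can be sketched as follows, with illustrative data for a single surface point P seen in three registered sub-scans:

```python
def average_color(subscan_colors):
    """Unweighted per-component average of (L*, a*, b*) values from N sub-scans,
    following L*(P) = (1/N) * sum_i L*_i(P), and likewise for a* and b*."""
    n = len(subscan_colors)
    return tuple(sum(c[k] for c in subscan_colors) / n for k in range(3))

# Color information for the same surface point P in three sub-scans (placeholder data)
samples = [(75.0, 1.0, 16.0), (77.0, 1.5, 17.0), (76.0, 2.0, 18.0)]
print(average_color(samples))  # (76.0, 1.5, 17.0)
```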

(35) When the color data is derived for a point on the tooth, the tooth shade value for that point can be determined by comparing the derived color data with the known color data of the reference tooth shade values of a tooth shade guide such as the VITA 3D-Master.

(36) FIGS. 8A-8B illustrate some potentially problematic tooth surface areas for particular arrangements of the scanner 841 relative to the tooth 811.

(37) FIG. 8A shows two points P_i and P_ii on the tooth 811 where the tooth surface is either substantially perpendicular or parallel to the optical path, such that the texture information recorded at P_i and P_ii may be unreliable. This is because the tooth surface at P_i is perpendicular to the optical path 864i at point P_i, which introduces the risk of specular reflections of the probe light. The optical path 864ii at point P_ii is parallel to the tooth surface at P_ii, such that the signal recorded from this part of the tooth surface in this sub-scan is relatively weak. This may render the color information in this section of the sub-scan unreliable.

(38) In order to obtain more precise color data, the averaging of the color information described above in relation to FIG. 7 can be a weighted averaging where the color information of unreliable sub-scan segments is assigned a lower weight than others.

(39) In FIG. 8B three different optical paths 864i, 864ii and 864iii at which sub-scans are acquired are indicated. When combining the color information for point P, the color information of the segments of the sub-scans recorded with optical paths 864i and 864ii should be given a lower weight than the color information of the segment of the sub-scan recorded with the optical path 864iii.

(40) This can be expressed by a modification of the equation given above. For a weighted averaging of the color information, the L* parameter of the color data for a given point P is given by L*(P) = Σ_{i=1}^{N} α_i(P)·L*_i(P) / Σ_{i=1}^{N} α_i(P), where α_i(P) is the weight factor for the color information of the i'th sub-scan in the segment at P. When a given sub-scan (e.g. the j'th sub-scan) is recorded at an angle relative to the tooth surface which causes the optical path to be e.g. perpendicular to the tooth surface at P, the corresponding weight factor α_j(P) is given a lower value than the weight factors of sub-scans acquired with an oblique angle between the optical path and the tooth surface.

(41) Equivalent equations hold for the a* and b* parameters of the color data for point P.
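A sketch of this weighted averaging follows, where each sub-scan segment is weighted by the angle between the optical path and the tooth surface. The specific sine-based weighting, which peaks for oblique views and vanishes for perpendicular and parallel ones, is one plausible choice and is not prescribed by the text:

```python
import math

def weight(angle_deg):
    """Assumed weighting: oblique views (~45 deg between optical path and
    surface) get full weight, while near-perpendicular views (0 deg, specular
    risk) and near-parallel views (90 deg, weak signal) are down-weighted."""
    return max(0.0, math.sin(math.radians(2 * angle_deg)))  # peaks at 45 deg

def weighted_average(samples):
    """samples: list of (angle_deg, (L, a, b)) for the same point P.
    Implements L*(P) = sum_i(alpha_i * L*_i(P)) / sum_i(alpha_i), etc."""
    total = sum(weight(a) for a, _ in samples)
    return tuple(
        sum(weight(a) * lab[k] for a, lab in samples) / total for k in range(3)
    )

samples = [(0.0, (90.0, 0.0, 0.0)),    # perpendicular: weight ~0
           (45.0, (76.0, 1.5, 17.0)),  # oblique: full weight
           (90.0, (50.0, 5.0, 5.0))]   # parallel: weight ~0
print(weighted_average(samples))  # ≈ (76.0, 1.5, 17.0)
```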

(42) FIGS. 9A-9B illustrate how a tooth shade value for a point P on a tooth can be determined based on reference tooth shade values.

(43) For a given point P on the digital 3D representation, the color data (L*_P, a*_P, b*_P) has been determined, e.g. by combining the color information of a series of sub-scans used for generating the digital 3D representation. If the color information is originally recorded using the RGB color space, it is transformed into the L*a*b* color space using algorithms known to the skilled person.

(44) In the example illustrated by FIG. 9A, the color data of the digital 3D representation and the known color values of the reference tooth shades are expressed in the L*a*b* color space, and the reference tooth shades are those from the VITA Classical shade guide.

(45) The reference shade values of the VITA Classical shade guide are: B1, A1, B2, D2, A2, C1, C2, D4, A3, D3, B3, A3.5, B4, C3, A4, and C4. The color data of these reference shades can be provided by scanning the corresponding pre-manufactured teeth of the shade guide. These color data are then also initially obtained in the RGB color space and can be converted to the L*a*b* color space using the same algorithms applied to the color information/data for the point P.

(46) The tooth shade value for the point is determined as the reference tooth shade value which has the smallest Euclidean distance to the point in the L*a*b* color space. The Euclidean distance ΔE_P-Ri from the color (L*_P, a*_P, b*_P) to the known colors of the reference tooth shade values is calculated using the expression:

(47) ΔE_P-Ri = √[(L*_P - L*_Ri)² + (a*_P - a*_Ri)² + (b*_P - b*_Ri)²]
where Ri refers to the i'th reference tooth shade.

(48) In FIG. 9A only the known colors (L*_R1, a*_R1, b*_R1) and (L*_R2, a*_R2, b*_R2) for the two closest reference values R1 and R2, respectively, are illustrated for simplicity. It can be seen that the Euclidean distance in the color space from P to R2 is the smallest, and the tooth shade in point P is hence selected as that of R2.

(49) The certainty score for the tooth shade value determined for point P depends on how close the color data of the point P is to the known color value of the selected reference tooth shade value. This can be quantified by the Euclidean distance, and since point P is not particularly close to R2 in FIG. 9A, the determined tooth shade has a poor certainty score.
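The nearest-reference selection together with a distance-based certainty score can be sketched as follows; the library values are illustrative placeholders, not measured VITA data, and the linear distance-to-certainty mapping is an assumption:

```python
def delta_e(p, r):
    """CIE76 Euclidean distance between two L*a*b* colors, as in equation (47)."""
    return sum((pc - rc) ** 2 for pc, rc in zip(p, r)) ** 0.5

def match_shade(lab, library, max_acceptable=10.0):
    """Pick the reference shade with the smallest delta-E and report a
    certainty score in [0, 1] that falls off linearly with distance
    (the linear mapping and the 10.0 cutoff are assumptions)."""
    code, ref = min(library.items(), key=lambda kv: delta_e(lab, kv[1]))
    certainty = max(0.0, 1.0 - delta_e(lab, ref) / max_acceptable)
    return code, certainty

library = {"B1": (79.5, 0.5, 13.0), "A2": (74.5, 1.8, 18.5)}  # placeholder data
code, certainty = match_shade((79.0, 0.6, 13.4), library)
print(code)  # 'B1'
```

A point far from every reference, e.g. one sampled on soft tissue, would receive a certainty near zero and could be rejected as in the user interfaces of FIGS. 2 to 4.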

(50) An alternative approach to using the Euclidean distance is to determine the individual parameters of the tooth shade value one at a time. This approach can be used e.g. when the reference tooth shade values are those of the Vita 3D-Master system.

(51) The reference tooth shade values of the Vita 3D-Master shade guide are expressed in codes consisting of the three parameters Lightness, hue and Chroma, where the Lightness is given as a value between 1 and 5, the Chroma as a value between 1 and 3, and the hue as one of "L", "M", or "R". A shade code in the Vita 3D-Master can e.g. be 2M1, where the Lightness parameter equals 2, the Chroma equals 1 and the hue is "M".

(52) The known color data of the VITA 3D-Master shade guide reference shades can be provided by scanning the pre-manufactured teeth of the shade guide. These color data are then also initially obtained in the RGB color space and can be converted to the L*a*b* color space using the same algorithms applied to the color information/data for the point P. The known color data of each reference shade (having a code expressed in terms of Lightness, hue and Chroma) are then provided in terms of the L*a*b* color space.

(53) Since the Lightness has the largest impact on the human perception of the tooth color, the value of the Lightness parameter L*_P in the point is determined first. The value of L*_P is compared with the values of the L* parameters for the reference tooth shades. If L*_P is close to the L* value for the i'th reference tooth shade value, L*_Ri, the L* parameter for point P may be set equal to L*_Ri.

(54) In some cases the Lightness parameter is not close to any of the references but instead is located almost in the middle between two L* values, for example when L*_P in the point is between the values of L*_Ri = 2 and L*_R(i+1) = 3 with almost equal distance to each of these, as illustrated in FIG. 9B. Since the L*a*b* color space is a linear space, the individual parameters of the shade values can be interpolated such that the Lightness for point P, L*_P, can be set to 2.5.

(55) The same procedure is then performed, first for the Chroma parameter and finally for the hue, such that all three parameters of the tooth shade value are determined.
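The parameter-wise determination, with interpolation when a value falls almost midway between two reference levels, can be sketched as below. The midpoint tolerance is an assumed threshold; the text only states that near-midway values may be interpolated:

```python
def nearest_or_midpoint(value, levels, tol=0.15):
    """Snap a measured parameter to the nearest reference level, or interpolate
    to the midpoint when the value lies almost exactly between two adjacent
    levels. tol (fraction of the level spacing) is an assumed threshold."""
    levels = sorted(levels)
    best = min(levels, key=lambda v: abs(value - v))
    for lo, hi in zip(levels, levels[1:]):
        if lo < value < hi:
            mid = (lo + hi) / 2
            if abs(value - mid) < tol * (hi - lo):
                return mid  # e.g. 2.5, as for the Lightness in FIG. 9B
    return best

# Lightness codes 1-5 are determined first; Chroma (1-3) and hue follow analogously
print(nearest_or_midpoint(2.52, [1, 2, 3, 4, 5]))  # 2.5 (interpolated)
print(nearest_or_midpoint(2.9, [1, 2, 3, 4, 5]))   # 3 (snapped)
```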

(56) Although some embodiments have been described and shown in detail, the invention is not restricted to them, but may also be embodied in other ways within the scope of the subject matter defined in the following claims. In particular, it is to be understood that other embodiments may be utilized and structural and functional modifications may be made without departing from the scope of the present invention.

(57) In device claims enumerating several means, several of these means can be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims or described in different embodiments does not indicate that a combination of these measures cannot be used to advantage.

(58) A claim may refer to any of the preceding claims, and “any” is understood to mean “any one or more” of the preceding claims.

(59) It should be emphasized that the term “comprises/comprising” when used in this specification is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.

(60) The features of the method described above and in the following may be implemented in software and carried out on a data processing system or other processing means caused by the execution of computer-executable instructions. The instructions may be program code means loaded in a memory, such as a RAM, from a storage medium or from another computer via a computer network. Alternatively, the described features may be implemented by hardwired circuitry instead of software or in combination with software.

EMBODIMENTS

(63) 1. A method for determining shade of a patient's tooth, wherein the method comprises: obtaining a digital 3D representation of the tooth, where the digital 3D representation comprises shape data and texture data for the tooth; and determining a tooth shade value for at least one point on the tooth based on the texture data of the corresponding point of the digital 3D representation and on known texture values of one or more reference tooth shade values.

(64) 2. The method according to embodiment 1, wherein determining the tooth shade value for the point comprises selecting the reference tooth shade value with the known texture value closest to the texture data of the point.

(65) 3. The method according to embodiment 1 or 2, wherein determining the tooth shade value for the point comprises an interpolation of the two or more reference tooth shade values having known texture values close to the texture data of the point.

(66) 4. The method according to any one of the preceding embodiments, wherein the method comprises deriving a certainty score expressing the certainty of the determined tooth shade value.

(67) 5. The method according to embodiment 4, wherein the method comprises generating a visual representation of the certainty score and displaying this visual representation in a user interface.

(68) 6. The method according to embodiment 5, wherein the visual representation of the certainty score is displayed together with or is mapped onto the digital 3D representation of the tooth.

(69) 7. The method according to any one of embodiments 4 to 6, wherein the method comprises comparing the derived certainty score with a range of acceptable certainty score values.

(70) 8. The method according to any one of embodiments 4 to 7, wherein the certainty score relates to how uniform the sub-scan texture information is at the point, and/or to how close the texture data is to the known texture value of the determined tooth shade value, and/or to the amount of texture information used to derive the texture data at the point.

(71) 9. The method according to any one of embodiments 4 to 8, wherein the visual representation of the certainty score comprises a binary code, a bar structure with a color gradient, a numerical value, and/or a comparison between the texture data and the known texture value of the determined tooth shade value.

(72) 10. The method according to any one of the preceding embodiments, wherein the one or more reference tooth shade values relate to shade values for natural teeth with intact surface and/or to shade values for teeth prepared for a dental restoration.

(73) 11. The method according to any one of the preceding embodiments, wherein the method comprises comparing the texture data with known texture values for soft oral tissue, such as gum tissue and gingiva.

(74) 12. The method according to any of the previous embodiments, wherein the texture information comprises at least one of tooth color or surface roughness.

(75) 13. The method according to any one of the preceding embodiments, wherein the method comprises creating a shade profile for the tooth from shade values determined for one or more points on the tooth.

(76) 14. The method according to embodiment 13, wherein the tooth shade profile comprises one or more tooth shade regions on the tooth surface where an average tooth shade is derived for each region from tooth shade values determined for a number of points within the region.

(77) 15. The method according to any of the previous embodiments, wherein obtaining the digital 3D representation of the tooth comprises recording a series of sub-scans of the tooth, where at least one of said sub-scans comprises both texture information and geometry information for said tooth, and generating the digital 3D representation of the tooth from the recorded series of sub-scans.

(78) 16. The method according to embodiment 15, wherein the texture data at least partly are derived by combining the texture information from corresponding parts of a number of the sub-scans.

(79) 17. The method according to embodiment 16, wherein combining the texture information from the sub-scans comprises interpolating the texture information and/or calculating an average value of the texture information.

(80) 18. The method according to embodiment 17, wherein the calculated average value is a weighted average of the texture information.

(81) 19. A user interface for determining and displaying shade of a patient's tooth, wherein the user interface is configured for: obtaining a digital 3D representation of the tooth, said digital 3D representation comprising shape data and texture data for the tooth; displaying at least the shape data of the digital 3D representation such that the shape of the tooth is visualized in the user interface; determining a tooth shade value for at least one point on the tooth based on the texture data of the corresponding point of the digital 3D representation and on known texture values of one or more reference tooth shade values; and
displaying the determined tooth shade value.