DETECTING AND MONITORING DEVELOPMENT OF A DENTAL CONDITION

20220287811 · 2022-09-15

Abstract

A method, user interface and system for detecting and monitoring development of a dental condition. In particular, detecting and monitoring such a development by comparing digital 3D representations of the patient's set of teeth recorded at a first and a second point in time. For example, determining tooth movement for at least one tooth between the first and second point in time based on derived distances.

Claims

1. A non-transitory computer readable medium encoded with a computer program for causing a processor to perform a method for detecting or monitoring a dental condition of a patient's teeth, wherein the method comprises: obtaining a first digital 3D representation of the teeth recorded at a first point in time and segmenting the first digital 3D representation such that a first 3D tooth model is formed for at least one tooth; obtaining a second digital 3D representation of the teeth recorded at a second point in time and segmenting the second digital 3D representation such that a second 3D tooth model is formed for the at least one tooth; aligning the first and second 3D tooth models; comparing one or more corresponding regions on the aligned first and second 3D tooth models; and subsequently detecting a change in a parameter representing a dental condition based on the comparison, wherein, when a detected change exceeds a set threshold, an operator is prompted through a graphical user interface.

2. The non-transitory computer readable medium of claim 1, wherein the operator is prompted with a notification on the graphical user interface.

3. The non-transitory computer readable medium of claim 1, wherein the parameter representing a dental condition is caries, and wherein the detecting comprises: detecting a change in color and/or shade value between the one or more corresponding regions on the aligned first and second digital 3D tooth models, wherein a detected change in color and/or shade value is correlated with an expected change in the color or shade in response to the development of caries.

4. The non-transitory computer readable medium of claim 3, wherein when the detected change reaches a set threshold value defined by a darkening color change, the detected change triggers prompting the operator.

5. The non-transitory computer readable medium of claim 3, wherein the detecting comprises determining a color or shade difference map for the one or more corresponding regions by comparing the shade data or color data of the one or more corresponding regions.

6. The non-transitory computer readable medium of claim 1, wherein the detecting comprises selecting a region of interest as the one or more corresponding regions, wherein the detecting comprises determining a color value for at least said region of interest in the first and second digital 3D tooth models and determining the change in the color value between the first and second digital 3D tooth models.

7. The non-transitory computer readable medium of claim 1, wherein the parameter representing a dental condition is tooth wear, wherein the detecting comprises: detecting a change in the shape and/or size of at least one tooth; and correlating a detected change in the tooth size with a threshold value, wherein, when the detected change exceeds a set threshold value, the operator is prompted.

8. The non-transitory computer readable medium of claim 7, wherein the threshold value relates to an expected depth of a patient's enamel.

9. The non-transitory computer readable medium of claim 7, wherein the threshold value is set on the basis of a measured reduction in tooth height.

10. The non-transitory computer readable medium of claim 7, wherein the threshold value is set on the basis of a tooth shape at an occlusal surface of the patient's teeth.

11. The non-transitory computer readable medium of claim 1, wherein the parameter representing a dental condition is gingival recession.

12. The non-transitory computer readable medium of claim 11, wherein the detecting comprises detecting a change in the position of the gingival boundary by comparing gingival boundaries in the first and second digital 3D tooth models.

13. The non-transitory computer readable medium of claim 1, wherein the alignment is based on at least two teeth in the digital 3D representation, such as the neighboring teeth.

14. The non-transitory computer readable medium of claim 1, wherein aligning is based on the patient's rugae.

15. The non-transitory computer readable medium of claim 1, wherein aligning the first and second 3D tooth models comprises determining a transformation matrix which provides the alignment, and where distances are derived from the transformation matrix.

16. A system comprising a non-transitory computer readable medium encoded with a computer program for causing a processor to perform a method for visualizing a change in a dental condition of a patient's set of teeth and a graphical user interface, wherein the graphical user interface is configured to visualize the change in a dental condition of a patient's set of teeth by the method, the method comprising: obtaining a first digital 3D representation of the teeth recorded at a first point in time; obtaining a second digital 3D representation of the teeth recorded at a second point in time; comparing at least parts of the first and second digital 3D representations; and detecting, based on the comparison, a change in a parameter relating to the dental condition, wherein, when a change exceeding a set threshold is detected, an operator is prompted through the graphical user interface.

17. The system of claim 16, wherein the graphical user interface is configured to visualize a development of a dental condition by aligning the digital 3D representations and controlling transparency of the digital 3D representations based on a timeline indicator.

18. The system of claim 16, wherein the graphical user interface is configured to visualize a change in color over different parts of the patient's teeth by displaying a difference map.

19. A method for detecting or monitoring a dental condition of a patient's teeth, the method comprising: obtaining a first digital 3D representation of the teeth recorded at a first point in time and segmenting the first digital 3D representation such that a first 3D tooth model is formed for at least one tooth; obtaining a second digital 3D representation of the teeth recorded at a second point in time and segmenting the second digital 3D representation such that a second 3D tooth model is formed for the at least one tooth; aligning the first and second 3D tooth models; comparing one or more corresponding regions on the aligned first and second 3D tooth models; subsequently detecting a change in a parameter representing a dental condition based on the comparison; and prompting an operator through a graphical user interface when a detected change exceeds a set threshold.

20. The method of claim 19, wherein the operator is prompted with a notification on the graphical user interface.

21. The method of claim 19, wherein the parameter representing a dental condition is caries, and wherein the detecting comprises: detecting a change in color and/or shade value between the one or more corresponding regions on the aligned first and second digital 3D tooth models, wherein a detected change in color and/or shade value is correlated with an expected change in the color or shade in response to the development of caries.

22. The method of claim 21, wherein when the detected change reaches a set threshold value defined by a darkening color change, the detected change triggers prompting the operator.

23. The method of claim 21, wherein the detecting comprises determining a color or shade difference map for the one or more corresponding regions by comparing the shade data or color data of the one or more corresponding regions.

24. The method of claim 19, wherein the detecting comprises selecting a region of interest as the one or more corresponding regions, wherein the detecting comprises determining a color value for at least said region of interest in the first and second digital 3D tooth models and determining the change in the color value between the first and second digital 3D tooth models.

25. The method of claim 19, wherein the parameter representing a dental condition is tooth wear, wherein the detecting comprises: detecting a change in the shape and/or size of at least one tooth; and correlating a detected change in the tooth size with a threshold value, wherein, when the detected change exceeds a set threshold value, the operator is prompted.

26. The method of claim 25, wherein the threshold value relates to an expected depth of a patient's enamel.

27. The method of claim 25, wherein the threshold value is set on the basis of a measured reduction in tooth height.

28. The method of claim 25, wherein the threshold value is set on the basis of a tooth shape at an occlusal surface of the patient's teeth.

29. The method of claim 19, wherein the parameter representing a dental condition is gingival recession.

30. The method of claim 29, wherein the detecting comprises detecting a change in the position of the gingival boundary by comparing gingival boundaries in the first and second digital 3D tooth models.

31. The method of claim 19, wherein the alignment is based on at least two teeth in the digital 3D representation, such as the neighboring teeth.

32. The method of claim 19, wherein aligning is based on the patient's rugae.

33. The method of claim 19, wherein aligning the first and second 3D tooth models comprises determining a transformation matrix which provides the alignment, and where distances are derived from the transformation matrix.

34. A system comprising: a non-transitory computer readable medium encoded with a computer program for causing a processor to perform a method for visualizing a change in a dental condition of a patient's set of teeth; and a graphical user interface, wherein the method comprises: obtaining a first digital 3D representation of the teeth recorded at a first point in time and segmenting the first digital 3D representation such that a first 3D tooth model is formed for at least one tooth; obtaining a second digital 3D representation of the teeth recorded at a second point in time and segmenting the second digital 3D representation such that a second 3D tooth model is formed for the at least one tooth; aligning the first and second 3D tooth models; comparing one or more corresponding regions on the aligned first and second 3D tooth models; and subsequently detecting a change in a parameter representing a dental condition based on the comparison, wherein, when a detected change exceeds a set threshold, an operator is prompted through the graphical user interface.

35. The system of claim 34, wherein the operator is prompted with a notification on the graphical user interface.

36. The system of claim 34, wherein the parameter representing a dental condition is caries, and wherein the detecting comprises: detecting a change in color and/or shade value between the one or more corresponding regions on the aligned first and second digital 3D tooth models, wherein a detected change in color and/or shade value is correlated with an expected change in the color or shade in response to the development of caries.

37. The system of claim 36, wherein when the detected change reaches a set threshold value defined by a darkening color change, the detected change triggers prompting the operator.

38. The system of claim 36, wherein the detecting comprises determining a color or shade difference map for the one or more corresponding regions by comparing the shade data or color data of the one or more corresponding regions.

39. The system of claim 34, wherein the detecting comprises selecting a region of interest as the one or more corresponding regions, wherein the detecting comprises determining a color value for at least said region of interest in the first and second digital 3D tooth models and determining the change in the color value between the first and second digital 3D tooth models.

40. The system of claim 34, wherein the parameter representing a dental condition is tooth wear, wherein the detecting comprises: detecting a change in the shape and/or size of at least one tooth; and correlating a detected change in the tooth size with a threshold value, wherein, when the detected change exceeds a set threshold value, the operator is prompted.

41. The system of claim 40, wherein the threshold value relates to an expected depth of a patient's enamel.

42. The system of claim 40, wherein the threshold value is set on the basis of a measured reduction in tooth height.

43. The system of claim 40, wherein the threshold value is set on the basis of a tooth shape at an occlusal surface of the patient's teeth.

44. The system of claim 34, wherein the parameter representing a dental condition is gingival recession.

45. The system of claim 44, wherein the detecting comprises detecting a change in the position of the gingival boundary by comparing gingival boundaries in the first and second digital 3D tooth models.

46. The system of claim 34, wherein the alignment is based on at least two teeth in the digital 3D representation, such as the neighboring teeth.

47. The system of claim 34, wherein aligning is based on the patient's rugae.

48. The system of claim 34, wherein aligning the first and second 3D tooth models comprises determining a transformation matrix which provides the alignment, and where distances are derived from the transformation matrix.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0123] The above and/or additional objects, features and advantages of the present invention, will be further elucidated by the following illustrative and non-limiting detailed description of embodiments of the present invention, with reference to the appended drawings, wherein:

[0124] FIG. 1 shows a workflow for an embodiment.

[0125] FIGS. 2A-2B show a set of teeth and segmentation of a tooth.

[0126] FIGS. 3A-3C illustrate an embodiment for detecting gingival retraction.

[0127] FIGS. 4A-4D illustrate how anatomically correct measurement of tooth movement can be made.

[0128] FIG. 5 shows an outline of a system.

DETAILED DESCRIPTION

[0129] In the following description, reference is made to the accompanying figures, which show by way of illustration how the invention may be practiced.

[0130] FIG. 1 shows a workflow for an embodiment of the method for detecting development of a dental condition for a patient's set of teeth between a first and a second point in time. The workflow 100 includes steps 102, 103 for obtaining a first and a second digital 3D representation of the patient's set of teeth. The digital 3D representations can be recorded using an intra-oral scanner, such as the TRIOS 3 intra-oral scanner by 3Shape A/S, which can record both the topography and color of the patient's set of teeth. The recorded digital 3D representations then express both the geometry and colors of the scanned teeth at the first and second points in time. Color calibrating the scanner regularly, or just prior to the scanning, ensures that the measured colors are true and that the colors recorded at one visit to the dentist can be compared with the colors measured at another visit.

[0131] In step 104 the first and second digital 3D representations are globally aligned using, e.g., a 3-point alignment where three corresponding regions are marked on the first and second digital 3D representations. The aligned digital 3D representations can then be visualized in the same user interface, and comparisons between the shapes and sizes of teeth can be made straightforwardly. The global alignment of the digital 3D representations can be performed using a computer implemented algorithm, such as an Iterative Closest Point (ICP) algorithm, employed to minimize the difference between the digital 3D representations.
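The 3-point alignment mentioned above can be sketched as a least-squares rigid transform between corresponding marked points; the following is a minimal illustration using the Kabsch algorithm (the function name and use of NumPy are assumptions for illustration, not part of the original disclosure):

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (rotation R, translation t) that maps
    the marked points `src` onto the corresponding points `dst`,
    computed with the Kabsch algorithm."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)         # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```

With three or more non-collinear corresponding points, `R` and `t` bring the first representation onto the second; an ICP step would then refine the alignment by minimizing the remaining surface distance.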

[0132] In step 105 the aligned first and second digital 3D representations are compared, e.g., by calculating a difference map showing the distance between the digital 3D representations at different parts of the teeth. Such a difference map can, e.g., be used for monitoring tooth movement during an orthodontic treatment. Based on the comparison, a change in a parameter relating to the dental condition can be detected in step 106, and the change in the parameter can be correlated with a development of a dental condition in step 107.
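A difference map of the kind described can be sketched as per-vertex nearest-neighbor distances between the two aligned point sets. This brute-force NumPy version is only illustrative; for full-size scans a spatial index such as a KD-tree would be used:

```python
import numpy as np

def difference_map(points_a, points_b):
    """For each vertex of the first scan, the distance to the nearest
    vertex of the second scan (O(n*m) brute force for illustration)."""
    a = np.asarray(points_a, float)[:, None, :]   # shape (n, 1, 3)
    b = np.asarray(points_b, float)[None, :, :]   # shape (1, m, 3)
    return np.sqrt(((a - b) ** 2).sum(axis=2)).min(axis=1)
```

The resulting per-vertex distances can be rendered as a color-coded map over the tooth surface, as in the monitoring of tooth movement described above.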

[0133] When the dental condition corresponds to caries and the development of the caries is monitored using the change in tooth color from white to brown in the infected region, the global alignment and comparison of the digital 3D representations ensure that a change in the tooth color toward a more brownish color in a region of the teeth can be detected, and the region can be visualized to the operator. The change in the color can be measured using color values of, e.g., the RGB color system and can be correlated with knowledge of the usual changes in tooth colors during the development of caries.
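The color-based check described above can be sketched as follows. The threshold values and the use of simple mean-RGB statistics are illustrative assumptions only, not clinically validated values from the disclosure:

```python
import numpy as np

def caries_color_flag(rgb_t1, rgb_t2, dark_thresh=15.0, red_shift_thresh=0.02):
    """Flag a region whose mean color both darkens and shifts toward
    red/brown between two visits. rgb_t1/rgb_t2 are per-vertex RGB values
    for the same (aligned) region; thresholds are illustrative only."""
    m1 = np.asarray(rgb_t1, float).mean(axis=0)
    m2 = np.asarray(rgb_t2, float).mean(axis=0)
    darkening = m1.mean() - m2.mean()                # drop in mean brightness
    red_shift = m2[0] / m2.sum() - m1[0] / m1.sum()  # rise in red fraction
    return bool(darkening > dark_thresh and red_shift > red_shift_thresh)
```

A region flagged this way could then trigger the operator prompt described in the claims, once the change exceeds the set threshold.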

[0134] FIGS. 2A-2B show a digital 3D representation of a set of teeth and segmentation of the digital 3D representation to create a 3D model of a tooth. The digital 3D representation 230 has topography data for four anterior teeth 2311, 2312, 2313, 2314 and for a portion of the corresponding gingiva with the gingival boundary 232, as indicated in FIG. 2A. The segmentation of the digital 3D representation provides a 3D tooth model 233 which has the shape of the corresponding tooth 2312 of the digital 3D representation and is bounded by the gingival boundary 232. In FIG. 2B the 3D tooth model 233 is still arranged along with the other parts of the digital 3D representation according to the arrangement of the tooth in the digital 3D representation.

[0135] FIGS. 3A-3C illustrate an embodiment for detecting gingival retraction at the patient's left central incisor 3313. This tooth is segmented in both the first 340 and second 341 digital 3D representations of the set of teeth, showing the two central incisors in the lower jaw and the gingival boundary 332, as seen in FIGS. 3A and 3B. The change in the position of the gingival boundary is so small that when the two digital 3D representations are seen separately the change is hardly visible. A section having topography data relating to both the tooth surface and the gingiva is selected, and the two sections are locally aligned based on the tooth topography data. The local alignment can be performed using an Iterative Closest Point (ICP) algorithm. All data of the sections are aligned according to this local transformation, as seen in FIG. 3C, and the gingival retraction from the boundary 3321 in the first digital 3D representation to the boundary 3322 in the second digital 3D representation becomes clearly visible. The gingival retraction can now be measured as the distance between the two boundaries.

[0136] FIG. 4A shows cross sections of a first 3D tooth model 451 and a second 3D tooth model 452 segmented from a first and a second digital 3D representation, respectively. The first digital 3D representation can for example represent the patient's teeth at the onset of an orthodontic treatment and the second digital 3D representation at some point during the treatment. A transformation T which aligns the first 3D tooth model with the second 3D tooth model is determined and applied to the first 3D tooth model to provide that the two tooth models are locally aligned, as illustrated in FIG. 4B. In FIG. 4C three portions 4551, 4561 and 4571 are selected on the first 3D tooth model 451. Since the first and second 3D tooth models are locally aligned, the anatomically corresponding portions 4552, 4562 and 4572 can readily and precisely be identified on the second 3D tooth model 452. In FIG. 4D portions 4551, 4561 and 4571 are marked on corresponding portions of the first digital 3D representation 460 and portions 4552, 4562 and 4572 are marked on the corresponding portions of the second digital 3D representation 461. The first and second digital 3D representations are globally aligned based, e.g., on the other teeth of the same quadrant or same arch. The anatomically correct distance between the marked regions can then be determined, and based on these distances a measure for the movement of the tooth between the first and second points in time can be derived.
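The transformation T is commonly stored as a 4x4 homogeneous matrix, and the claims note that distances can be derived from such a matrix. A minimal sketch of extracting a translation magnitude and rotation angle from it (function name and decomposition choice are illustrative assumptions):

```python
import numpy as np

def movement_from_transform(T):
    """Translation magnitude and rotation angle (degrees) encoded in a
    4x4 homogeneous transformation matrix T aligning two tooth models."""
    R, t = T[:3, :3], T[:3, 3]
    translation = np.linalg.norm(t)
    # Rotation angle from the trace of the rotation block.
    cos_angle = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    angle = np.degrees(np.arccos(cos_angle))
    return translation, angle
```

For a pure translation the angle is zero, and for a pure rotation the translation is zero; real tooth movement generally combines both.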

[0137] In short, the workflow described here has the following steps:

[0138] selecting one or more corresponding regions on the locally aligned segmented teeth;

[0139] globally aligning the first and second digital 3D representations;

[0140] identifying the selected corresponding regions on the globally aligned first and second digital 3D representations;

[0141] deriving the distances between the selected corresponding regions on the globally aligned first and second digital 3D representations; and

[0142] determining the tooth movement based on the derived distances.
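The final two steps of this workflow can be sketched as a distance computation between centroids of the selected corresponding regions, assuming the regions have already been identified in the globally aligned representations (names and the centroid-based summary are illustrative assumptions):

```python
import numpy as np

def tooth_movement(regions_t1, regions_t2):
    """Displacement of each selected region between the globally aligned
    first and second representations, plus the mean displacement.
    Inputs are one centroid (x, y, z) per corresponding region."""
    c1 = np.asarray(regions_t1, float)
    c2 = np.asarray(regions_t2, float)
    distances = np.linalg.norm(c2 - c1, axis=1)
    return distances, distances.mean()
```

The per-region distances give a local picture of how the tooth moved, while the mean serves as a single summary measure of the movement between the two points in time.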

[0143] In a computer program product configured for implementing the method, the portions on the first 3D tooth model can be selected by an operator, or by the computer program product when it is configured for detecting appropriate portions, such as characteristic portions on the cusp. The selected portion can also be the entire tooth surface, such that a distance map is derived showing the movement for the entire surface.

[0144] Other workflows can also be used to measure the distance, such as:

[0145] selecting one or more corresponding regions on the locally aligned segmented teeth;

[0146] arranging the first and second 3D tooth models according to the global alignment;

[0147] deriving the distances between the selected corresponding regions in the global alignment of the first and second 3D tooth models; and

[0148] determining the tooth movement based on the derived distances.

[0149] FIG. 5 shows a schematic of a system according to an embodiment. The system 570 has a computer device 571 with a data processing unit in the form of a microprocessor 572 and a non-transitory computer readable medium 573 encoded with a computer program product providing a digital tool for determining the movement of teeth, e.g., during an orthodontic treatment. The system further has a visual display unit 576, a computer keyboard 574 and a computer mouse 575 for entering data and activating virtual buttons of a user interface visualized on the visual display unit 576. The visual display unit 576 can, e.g., be a computer screen. The computer device 571 is capable of storing obtained digital 3D representations of the patient's teeth in the computer readable medium 573 and loading these into the microprocessor 572 for processing. The digital 3D representations can be obtained from a 3D color scanner 577, such as the 3Shape TRIOS 3 intra-oral scanner, which is capable of recording a digital 3D representation containing both geometrical data and color data for the teeth.

[0150] Besides color and geometry data, the digital 3D representation can also include diagnostic data, such as fluorescence data obtained using an intra-oral scanner.

[0151] The computer readable medium 573 can further store computer implemented algorithms for segmenting a digital 3D representation to create digital 3D models of the individual teeth and for selecting regions on the surface for a local alignment. When digital 3D models for the same tooth are created from different digital 3D representations, such as digital 3D representations recorded at different points in time, the digital 3D models can be locally aligned using, e.g., Iterative Closest Point (ICP) algorithms for minimizing the distance between the surfaces of the digital 3D representations. The digital 3D representations of the patient's entire set of teeth, or sections thereof, can also be globally aligned using such ICP algorithms. When the digital 3D representations of the teeth are globally aligned, with the anatomically correct corresponding regions of a given tooth identified by the local alignment procedure applied to the digital 3D model of that tooth, a precise measure of the movement of the tooth between the points in time at which the two digital 3D representations were recorded can be determined.

[0152] When the tooth movement has been determined it can be visualized to the operator in the visual display unit 576, e.g., as a distance map or using a cross sectional view of the 3D tooth models or the digital 3D representations.

[0153] The digital 3D models of the individual teeth can be stored on the computer readable medium and be re-used at the next visit for the identification of individual teeth in a digital 3D representation recorded at the next visit.

[0154] Although some embodiments have been described and shown in detail, the invention is not restricted to them, but may also be embodied in other ways within the scope of the subject matter defined in the following claims. In particular, it is to be understood that other embodiments may be utilized and structural and functional modifications may be made without departing from the scope of the present invention.

[0155] In device claims enumerating several means, several of these means can be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims or described in different embodiments does not indicate that a combination of these measures cannot be used to advantage.

[0156] A claim may refer to any of the preceding claims, and “any” is understood to mean “any one or more” of the preceding claims.

[0157] It should be emphasized that the term “comprises/comprising” when used in this specification is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.

[0158] The features of the method described above and in the following may be implemented in software and carried out on a data processing system or other processing means caused by the execution of computer-executable instructions. The instructions may be program code means loaded in a memory, such as a RAM, from a storage medium or from another computer via a computer network. Alternatively, the described features may be implemented by hardwired circuitry instead of software or in combination with software.