DETECTING AND MONITORING DEVELOPMENT OF A DENTAL CONDITION
20220287811 · 2022-09-15
CPC classification
A61C9/0053
HUMAN NECESSITIES
A61C19/04
HUMAN NECESSITIES
International classification
A61B5/11
HUMAN NECESSITIES
A61C19/04
HUMAN NECESSITIES
Abstract
A method, user interface and system for detecting and monitoring development of a dental condition. In particular, detecting and monitoring such a development by comparing digital 3D representations of the patient's set of teeth recorded at a first and a second point in time. For example, determining tooth movement for at least one tooth between the first and second point in time based on derived distances.
Claims
1. A non-transitory computer readable medium encoded with a computer program for causing a processor to perform a method for detecting or monitoring a dental condition of a patient's teeth, wherein the method comprises: obtaining a first digital 3D representation of the teeth recorded at a first point in time and segmenting the first digital 3D representation such that a first 3D tooth model is formed for at least one tooth; obtaining a second digital 3D representation of the teeth recorded at a second point in time and segmenting the second digital 3D representation such that a second 3D tooth model is formed for the at least one tooth; aligning the first and second 3D tooth models; comparing one or more corresponding regions on the aligned first and second 3D tooth models; and subsequently detecting a change in a parameter representing a dental condition based on the comparison, wherein, when a detected change exceeds a set threshold, an operator is prompted through a graphical user interface.
2. The non-transitory computer readable medium of claim 1, wherein the operator is prompted with a notification on the graphical user interface.
3. The non-transitory computer readable medium of claim 1, wherein the parameter representing a dental condition is caries, and wherein the detecting comprises: detecting a change in color and/or shade value between the one or more corresponding regions on the aligned first and second digital 3D tooth models, wherein a detected change in color and/or shade value is correlated with an expected change in the color or shade in response to the development of caries.
4. The non-transitory computer readable medium of claim 3, wherein when the detected change reaches a set threshold value defined by a darkening color change, the detected change triggers prompting the operator.
5. The non-transitory computer readable medium of claim 3, wherein the detecting comprises determining a color or shade difference map for the one or more corresponding regions by comparing the shade data or color data of the one or more corresponding regions.
6. The non-transitory computer readable medium of claim 1, wherein the detecting comprises selecting a region of interest as the one or more corresponding regions, wherein the detecting comprises determining a color value for at least said region of interest in the first and second digital 3D tooth models and determining the change in the color value between the first and second digital 3D tooth models.
7. The non-transitory computer readable medium of claim 1, wherein the parameter representing a dental condition is tooth wear, wherein the detecting comprises: detecting a change in the shape and/or size of at least one tooth; and correlating a detected change in the tooth size with a threshold value, wherein, when the detected change exceeds a set threshold value, the operator is prompted.
8. The non-transitory computer readable medium of claim 7, wherein the threshold value relates to an expected depth of a patient's enamel.
9. The non-transitory computer readable medium of claim 7, wherein the threshold value is set on the basis of a measured reduction in tooth height.
10. The non-transitory computer readable medium of claim 7, wherein the threshold value is set on the basis of a tooth shape at an occlusal surface of the patient's teeth.
11. The non-transitory computer readable medium of claim 1, wherein the parameter representing a dental condition is gingival recession.
12. The non-transitory computer readable medium of claim 11, wherein the detecting comprises detecting a change in the position of the gingival boundary by comparing gingival boundaries in the first and second digital 3D tooth models.
13. The non-transitory computer readable medium of claim 1, wherein the alignment is based on at least two teeth in the digital 3D representation, such as the neighboring teeth.
14. The non-transitory computer readable medium of claim 1, wherein aligning is based on the patient's rugae.
15. The non-transitory computer readable medium of claim 1, wherein aligning the first and second 3D tooth models comprises determining a transformation matrix which provides the alignment, and where distances are derived from the transformation matrix.
16. A system comprising a non-transitory computer readable medium encoded with a computer program for causing a processor to perform a method for visualizing a change in a dental condition of a patient's set of teeth and a graphical user interface, wherein the graphical user interface is configured to visualize the change in a dental condition of a patient's set of teeth by the method, the method comprising: obtaining a first digital 3D representation of the teeth recorded at a first point in time; obtaining a second digital 3D representation of the teeth recorded at a second point in time; comparing at least parts of the first and second digital 3D representations; and detecting, based on the comparison, a change in a parameter relating to the dental condition, wherein, when a change exceeding a set threshold is detected, an operator is prompted through the graphical user interface.
17. The system of claim 16, wherein the graphical user interface is configured to visualize a development of a dental condition by aligning the digital 3D representations and controlling transparency of the digital 3D representations based on a timeline indicator.
18. The system of claim 16, wherein the graphical user interface is configured to visualize a change in color over different parts of the patient's teeth by displaying a difference map.
19. A method for detecting or monitoring a dental condition of a patient's teeth, the method comprising: obtaining a first digital 3D representation of the teeth recorded at a first point in time and segmenting the first digital 3D representation such that a first 3D tooth model is formed for at least one tooth; obtaining a second digital 3D representation of the teeth recorded at a second point in time and segmenting the second digital 3D representation such that a second 3D tooth model is formed for the at least one tooth; aligning the first and second 3D tooth models; comparing one or more corresponding regions on the aligned first and second 3D tooth models; subsequently detecting a change in a parameter representing a dental condition based on the comparison; and prompting an operator through a graphical user interface when a detected change exceeds a set threshold.
20. The method of claim 19, wherein the operator is prompted with a notification on the graphical user interface.
21. The method of claim 19, wherein the parameter representing a dental condition is caries, and wherein the detecting comprises: detecting a change in color and/or shade value between the one or more corresponding regions on the aligned first and second digital 3D tooth models, wherein a detected change in color and/or shade value is correlated with an expected change in the color or shade in response to the development of caries.
22. The method of claim 21, wherein when the detected change reaches a set threshold value defined by a darkening color change, the detected change triggers prompting the operator.
23. The method of claim 21, wherein the detecting comprises determining a color or shade difference map for the one or more corresponding regions by comparing the shade data or color data of the one or more corresponding regions.
24. The method of claim 19, wherein the detecting comprises selecting a region of interest as the one or more corresponding regions, wherein the detecting comprises determining a color value for at least said region of interest in the first and second digital 3D tooth models and determining the change in the color value between the first and second digital 3D tooth models.
25. The method of claim 19, wherein the parameter representing a dental condition is tooth wear, wherein the detecting comprises: detecting a change in the shape and/or size of at least one tooth; and correlating a detected change in the tooth size with a threshold value, wherein, when the detected change exceeds a set threshold value, the operator is prompted.
26. The method of claim 25, wherein the threshold value relates to an expected depth of a patient's enamel.
27. The method of claim 25, wherein the threshold value is set on the basis of a measured reduction in tooth height.
28. The method of claim 25, wherein the threshold value is set on the basis of a tooth shape at an occlusal surface of the patient's teeth.
29. The method of claim 19, wherein the parameter representing a dental condition is gingival recession.
30. The method of claim 29, wherein the detecting comprises detecting a change in the position of the gingival boundary by comparing gingival boundaries in the first and second digital 3D tooth models.
31. The method of claim 19, wherein the alignment is based on at least two teeth in the digital 3D representation, such as the neighboring teeth.
32. The method of claim 19, wherein aligning is based on the patient's rugae.
33. The method of claim 19, wherein aligning the first and second 3D tooth models comprises determining a transformation matrix which provides the alignment, and where distances are derived from the transformation matrix.
34. A system comprising: a non-transitory computer readable medium encoded with a computer program for causing a processor to perform a method for visualizing a change in a dental condition of a patient's set of teeth; and a graphical user interface, wherein the method comprises: obtaining a first digital 3D representation of the teeth recorded at a first point in time and segmenting the first digital 3D representation such that a first 3D tooth model is formed for at least one tooth; obtaining a second digital 3D representation of the teeth recorded at a second point in time and segmenting the second digital 3D representation such that a second 3D tooth model is formed for the at least one tooth; aligning the first and second 3D tooth models; comparing one or more corresponding regions on the aligned first and second 3D tooth models; and subsequently detecting a change in a parameter representing a dental condition based on the comparison, wherein, when a detected change exceeds a set threshold, an operator is prompted through the graphical user interface.
35. The system of claim 34, wherein the operator is prompted with a notification on the graphical user interface.
36. The system of claim 34, wherein the parameter representing a dental condition is caries, and wherein the detecting comprises: detecting a change in color and/or shade value between the one or more corresponding regions on the aligned first and second digital 3D tooth models, wherein a detected change in color and/or shade value is correlated with an expected change in the color or shade in response to the development of caries.
37. The system of claim 36, wherein when the detected change reaches a set threshold value defined by a darkening color change, the detected change triggers prompting the operator.
38. The system of claim 36, wherein the detecting comprises determining a color or shade difference map for the one or more corresponding regions by comparing the shade data or color data of the one or more corresponding regions.
39. The system of claim 34, wherein the detecting comprises selecting a region of interest as the one or more corresponding regions, wherein the detecting comprises determining a color value for at least said region of interest in the first and second digital 3D tooth models and determining the change in the color value between the first and second digital 3D tooth models.
40. The system of claim 34, wherein the parameter representing a dental condition is tooth wear, wherein the detecting comprises: detecting a change in the shape and/or size of at least one tooth; and correlating a detected change in the tooth size with a threshold value, wherein, when the detected change exceeds a set threshold value, the operator is prompted.
41. The system of claim 40, wherein the threshold value relates to an expected depth of a patient's enamel.
42. The system of claim 40, wherein the threshold value is set on the basis of a measured reduction in tooth height.
43. The system of claim 40, wherein the threshold value is set on the basis of a tooth shape at an occlusal surface of the patient's teeth.
44. The system of claim 34, wherein the parameter representing a dental condition is gingival recession.
45. The system of claim 44, wherein the detecting comprises detecting a change in the position of the gingival boundary by comparing gingival boundaries in the first and second digital 3D tooth models.
46. The system of claim 34, wherein the alignment is based on at least two teeth in the digital 3D representation, such as the neighboring teeth.
47. The system of claim 34, wherein aligning is based on the patient's rugae.
48. The system of claim 34, wherein aligning the first and second 3D tooth models comprises determining a transformation matrix which provides the alignment, and where distances are derived from the transformation matrix.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0123] The above and/or additional objects, features and advantages of the present invention will be further elucidated by the following illustrative and non-limiting detailed description of embodiments of the present invention, with reference to the appended drawings, wherein:
DETAILED DESCRIPTION
[0129] In the following description, reference is made to the accompanying figures, which show by way of illustration how the invention may be practiced.
[0131] In step 104 the first and second digital 3D representations are globally aligned using, e.g., a 3-point alignment where 3 corresponding regions are marked on the first and second digital 3D representations. The aligned digital 3D representations can then be visualized in the same user interface, and comparisons between the shapes and sizes of teeth can be made straightforwardly. The global alignment of the digital 3D representations can be performed using a computer implemented algorithm, such as an Iterative Closest Point (ICP) algorithm, employed to minimize the difference between the digital 3D representations.
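The rigid alignment underlying both the 3-point alignment and each ICP iteration can be computed in closed form from corresponding points. The sketch below, a minimal illustration not part of the patent disclosure, uses the Kabsch/SVD solution; the function name and array shapes are assumptions for illustration only.

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (Kabsch/SVD) mapping src onto dst.

    src, dst: (N, 3) arrays of corresponding points, e.g. the three
    operator-marked regions used for an initial 3-point alignment.
    Returns (R, t) such that dst ≈ src @ R.T + t.
    """
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction keeps R a proper rotation (no reflection).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

In a full ICP loop, this step would be repeated after re-estimating point correspondences by nearest-neighbor search until the residual distance converges.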
[0132] In step 105 the aligned first and second digital 3D representations are compared, e.g., by calculating a difference map showing the distance between the digital 3D representations at different parts of the teeth. Such a difference map can, e.g., be used for monitoring tooth movement during an orthodontic treatment. Based on the comparison a change in a parameter relating to the dental condition can be detected in step 106 and the change in the parameter can be correlated with a development of a dental condition in step 107.
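A difference map of the kind described can be sketched as a per-vertex distance from one aligned surface to the other. The function below is an illustrative assumption, using brute-force point-to-point distances rather than the true point-to-surface distance a production system would use.

```python
import numpy as np

def difference_map(verts_a, verts_b):
    """Per-vertex distance map between two aligned surface scans.

    For each vertex in verts_a, returns the distance to the closest
    vertex in verts_b (a point-to-point stand-in for point-to-surface
    distance). verts_a: (N, 3), verts_b: (M, 3).
    """
    # Full pairwise distance matrix; fine for illustration.
    # Real meshes would use a KD-tree or spatial hash instead.
    diff = verts_a[:, None, :] - verts_b[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=2)).min(axis=1)
```

The resulting per-vertex values can be color-coded on the model surface to visualize where the teeth have moved or worn between the two recordings.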
[0133] When the dental condition corresponds to caries and the development of the caries is monitored using a change in tooth color from white to brown in the infected region, the global alignment and comparison of the digital 3D representations allow a change toward a more brownish color in a region of the teeth to be detected, and the region can be visualized to the operator. The change in color can be measured using color values of, e.g., the RGB system and can be correlated with knowledge of the typical changes in tooth color during the development of caries.
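One way to operationalize the white-to-brown color shift in RGB values is sketched below. The heuristic (brightness drop plus blue channel falling faster than red) and the threshold value are illustrative assumptions, not clinically validated criteria from the disclosure.

```python
import numpy as np

def flag_caries_regions(rgb_t1, rgb_t2, threshold=0.15):
    """Flag regions whose color shifts toward brown between two scans.

    rgb_t1, rgb_t2: (N, 3) per-region mean RGB values in [0, 1] for the
    same corresponding regions at the first and second point in time.
    A brownish shift is approximated as an overall brightness drop
    combined with blue decreasing faster than red.
    """
    brightness_drop = rgb_t1.mean(axis=1) - rgb_t2.mean(axis=1)
    blue_loss = (rgb_t1[:, 2] - rgb_t2[:, 2]) - (rgb_t1[:, 0] - rgb_t2[:, 0])
    return (brightness_drop > threshold) & (blue_loss > 0)
```

Regions flagged this way could then trigger the operator prompt described in the claims once the change exceeds the set threshold.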
[0137] In short, the workflow described here has the following steps:
[0138] selecting one or more corresponding regions on the locally aligned segmented teeth;
[0139] globally aligning the first and second digital 3D representations;
[0140] identifying the selected corresponding regions on the globally aligned first and second digital 3D representations;
[0141] deriving the distances between the selected corresponding regions on the globally aligned first and second digital 3D representations;
[0142] determining the tooth movement based on the derived distances.
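The final two steps of this workflow can be sketched as follows: given corresponding regions identified through the local alignment, apply the global alignment to one scan and measure the remaining displacement. The function signature and return values are assumptions for illustration only.

```python
import numpy as np

def tooth_movement(region_t1, region_t2, R_global, t_global):
    """Tooth movement from corresponding regions on two scans.

    region_t1, region_t2: (N, 3) corresponding surface points on the
    same tooth, identified via the local (per-tooth) alignment.
    R_global, t_global: the global rigid alignment mapping the first
    scan into the coordinate frame of the second. Returns the per-point
    distances and their mean as a simple stand-in for a distance map.
    """
    # Bring the first-scan points into the second scan's frame.
    moved = region_t1 @ R_global.T + t_global
    # Any residual offset is movement of the tooth itself.
    dists = np.linalg.norm(region_t2 - moved, axis=1)
    return dists, dists.mean()
```

With the global alignment anchored on stable structures, the residual per-point distance isolates the movement of the individual tooth between the two recordings.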
[0143] In a computer program product configured for implementing the method the portions on the first 3D tooth model can be selected by an operator or by the computer program product when this is configured for detecting appropriate portions, such as characteristic portions on the cusp. The selected portion can also be the entire tooth surface such that a distance map is derived showing the movement for the entire surface.
[0144] Other workflows can also be used to measure the distance, such as:
[0145] selecting one or more corresponding regions on the locally aligned segmented teeth;
[0146] arranging the first and second 3D tooth models according to the global alignment;
[0147] deriving the distances between the selected corresponding regions in the global alignment of the first and second 3D tooth models;
[0148] determining the tooth movement based on the derived distances.
[0150] Besides color and geometry data, the digital 3D representation can also include diagnostic data, such as fluorescence data obtained using an intra-oral scanner.
[0151] The computer readable medium 573 can further store computer implemented algorithms for segmenting a digital 3D representation to create digital 3D models of the individual teeth and for selecting regions on the surface for a local alignment. When digital 3D models for the same tooth are created from different digital 3D representations, such as digital 3D representations recorded at different points in time, the digital 3D models can be locally aligned using, e.g., Iterative Closest Point (ICP) algorithms for minimizing the distance between the surfaces of the digital 3D representations. The digital 3D representations of the patient's entire set of teeth or sections thereof can also be globally aligned using such ICP algorithms. When the digital 3D representations of the teeth are globally aligned and the anatomically corresponding regions of a given tooth are identified by the local alignment procedure applied to the digital 3D model of that tooth, a precise measure of the movement of the tooth between the points in time where the two digital 3D representations were recorded can be determined.
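Claims 15, 33 and 48 recite deriving distances from the transformation matrix that provides the alignment. As an illustrative sketch (the decomposition shown is standard rigid-body geometry, not a formula taken from the disclosure), a 4x4 homogeneous transform can be split into a translation distance and a rotation angle:

```python
import numpy as np

def movement_from_transform(T):
    """Translation distance and rotation angle from a 4x4 rigid transform.

    T: homogeneous transformation matrix aligning the first 3D tooth
    model with the second. The translation norm gives the linear
    displacement, and the rotation angle follows from the trace of the
    3x3 rotation block via trace(R) = 1 + 2*cos(angle).
    """
    t = T[:3, 3]
    R = T[:3, :3]
    distance = np.linalg.norm(t)
    # Clip guards against tiny numerical overshoot outside [-1, 1].
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    return distance, angle
```

These two scalars give a compact summary of tooth movement that can be compared against the set threshold before prompting the operator.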
[0152] When the tooth movement has been determined it can be visualized to the operator in the visual display unit 576, e.g., as a distance map or using a cross sectional view of the 3D tooth models or the digital 3D representations.
[0153] The digital 3D models of the individual teeth can be stored on the computer readable medium and re-used for identifying the individual teeth in a digital 3D representation recorded at the next visit.
[0154] Although some embodiments have been described and shown in detail, the invention is not restricted to them, but may also be embodied in other ways within the scope of the subject matter defined in the following claims. In particular, it is to be understood that other embodiments may be utilized and structural and functional modifications may be made without departing from the scope of the present invention.
[0155] In device claims enumerating several means, several of these means can be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims or described in different embodiments does not indicate that a combination of these measures cannot be used to advantage.
[0156] A claim may refer to any of the preceding claims, and “any” is understood to mean “any one or more” of the preceding claims.
[0157] It should be emphasized that the term “comprises/comprising” when used in this specification is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.
[0158] The features of the method described above and in the following may be implemented in software and carried out on a data processing system or other processing means caused by the execution of computer-executable instructions. The instructions may be program code means loaded in a memory, such as a RAM, from a storage medium or from another computer via a computer network. Alternatively, the described features may be implemented by hardwired circuitry instead of software or in combination with software.