LOCALIZATION SENSING METHOD FOR AN ORAL CARE DEVICE

20210059395 · 2021-03-04

    Abstract

    A method for monitoring the position of an oral care device in the mouth of a user, the method comprising emitting energy towards the user's face, receiving reflected energy from the user's face corresponding to the emitted energy, and determining the position of an oral care device in the mouth of the user using the received reflected energy and facial characteristics information of the user which relates to one or more facial features of the user.

    Claims

    1. A method for monitoring the position of an oral care device in the mouth of a user, the method comprising: emitting energy towards the user's face; receiving reflected energy from the user's face corresponding to the emitted energy; and determining the position of an oral care device in the mouth of the user using the received reflected energy and facial characteristics information of the user which relates to one or more facial features of the user.

    2. The method as claimed in claim 1, wherein the facial characteristics information of the user further comprises at least one of: data relating to one or more facial characteristics of the user, metadata related to the user, facial characteristics information derived from the received reflected energy.

    3. The method as claimed in claim 2, wherein the facial characteristics information of the user is at least one of: obtained from an image of the user, input by the user, obtained by processing the received reflected energy.

    4. The method as claimed in claim 2, wherein the metadata is based on information on at least one of: the user's weight, height, complexion, gender, age.

    5. The method as claimed in claim 1, wherein the position of the oral care device in the mouth of the user is determined using a mapping which indicates the position of the oral care device in the mouth of the user based on the received reflected energy and facial characteristics information of the user.

    6. The method as claimed in claim 5, wherein the mapping is selected from a plurality of mappings, the selected mapping being a mapping which is determined to be the most relevant based on the facial characteristics information of the user.

    7. The method as claimed in claim 6, wherein each of the plurality of mappings relates to a different group of people, each group sharing at least one of: certain facial characteristics, metadata; the at least one of: certain facial characteristics, metadata, of each group and the facial characteristics information of the user are used to identify which group is most relevant to the facial characteristics information of the user; and the mapping corresponding to the identified group is selected.

    8. The method as claimed in claim 5, wherein the mapping is adjusted based on the facial characteristics information.

    9. The method as claimed in claim 1, wherein the method further comprises emitting setting energy towards the user's face; and receiving reflected setting energy from the user's face corresponding to the emitted energy; and determining the amount of the energy to be emitted towards the user's face in the step of emitting energy based on the reflected setting energy; or determining the amount of energy to be emitted towards the user's face in the step of emitting energy based on at least one of: data relating to one or more facial characteristics of the user, metadata related to the user.

    10. The method as claimed in claim 1, wherein the energy is at least one of: electromagnetic energy, acoustic energy.

    11. The method as claimed in claim 1, wherein the receiving of reflected energy comprises a measurement based on a measurement of at least one of: capacitance, reflected intensity, reflected polarization.

    12. (canceled)

    13. (canceled)

    14. An oral care system comprising: an oral care device having an energy emitter and an energy detector; and a computing device configured to receive and process signals from energy emitted and received by the oral care device, and wherein the oral care system is configured to perform the method claimed in claim 1.

    15. An oral care system as claimed in claim 14, wherein at least one of: the oral care device (10) is a device chosen from a group of devices comprising: a toothbrush, a flossing device, an oral irrigator, a handle for receiving a care head for any of the foregoing devices, a care head for any of the foregoing devices; and the computing device is comprised in at least one of: a remote server, an interface device to provide user information about the use of the oral care device; wherein the interface device is chosen from a group of interface devices comprising: a smart phone, a tablet, the oral care device, the care head.

    16. An oral care device comprising an energy emitter and an energy detector, wherein the oral care device is configured to communicate with the computing device as claimed in claim 14.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0052] Embodiments of the present disclosure may take form in various components and arrangements of components, and in various steps and arrangements of steps. Accordingly, the drawings are for purposes of illustrating the various embodiments and are not to be construed as limiting the embodiments. In the drawing figures, like reference numerals refer to like elements. In addition, it is to be noted that the figures may not be drawn to scale.

    [0053] FIG. 1 is a diagram of a power toothbrush to which embodiments of aspects of the present invention may be applied;

    [0054] FIG. 2 is a schematic diagram of an oral care system according to an aspect of an embodiment;

    [0055] FIG. 3 is a diagram illustrating energy being emitted and detected to/from a face of the user according to an aspect of an embodiment;

    [0056] FIG. 4 is a flow diagram illustrating a method of monitoring the position of an oral care device in the mouth of the user according to an aspect of an embodiment;

    [0057] FIG. 5 is a diagram illustrating the relative position and dimensions of facial characteristics of the user according to an aspect of an embodiment;

    [0058] FIG. 6 is a flow diagram illustrating a method of monitoring the position of an oral care device in the mouth of the user according to an aspect of an embodiment;

    [0059] FIG. 7 is a flow diagram illustrating a method of a step of determining the position of an oral care device in the mouth of the user according to an aspect of an embodiment;

    [0060] FIG. 8 is a flow diagram illustrating a method of monitoring the position of an oral care device in the mouth of the user according to an aspect of an embodiment;

    [0061] FIG. 9 is a graph illustrating grouping of people based on their facial characteristics according to an aspect of an embodiment;

    [0062] FIG. 10 is a flow chart illustrating a method of determining the amount of energy to be emitted towards the user's face according to an aspect of an embodiment; and

    [0063] FIG. 11 is a flow chart illustrating a method of determining the amount of energy to be emitted towards the user's face according to an aspect of an embodiment.

    DETAILED DESCRIPTION

    [0064] The embodiments of the present disclosure and the various features and advantageous details thereof are explained more fully with reference to the non-limiting examples that are described and/or illustrated in the drawings and detailed in the following description. It should be noted that the features illustrated in the drawings are not necessarily drawn to scale, and features of one embodiment may be employed with other embodiments as the skilled artisan would recognize, even if not explicitly stated herein. Descriptions of well-known components and processing techniques may be omitted so as to not unnecessarily obscure the embodiments of the present disclosure. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments of the present disclosure may be practiced and to further enable those of skill in the art to practice the same. Accordingly, the examples herein should not be construed as limiting the scope of the embodiments of the present disclosure, which is defined solely by the appended claims and applicable law.

    [0065] FIG. 1 shows an exemplary oral care device in which the teaching of the present disclosure may be implemented. The oral care device in FIG. 1 is in the form of an electric toothbrush (power toothbrush), but it will be appreciated that this is not limiting, and the teaching of the present disclosure may be implemented in other devices where location sensing is required. For example the teachings may be applied to personal care devices such as tongue cleaners, shavers, hair clippers or trimmers, hair removal devices, or skin care devices, and the position which is determined may be in relation to the surface of the face of the user, rather than the position within the mouth of the user.

    [0066] Referring to FIG. 1 a handheld oral care device 10 is provided that includes a body portion 12 and a head member 14 removably or non-removably mounted on the body portion 12. The body portion 12 includes a housing, at least a portion of which is hollow, to contain components of the device, for example, a drive assembly/circuit, a computing device, and/or a power source (e.g., battery or power cord), not shown. The particular configuration and arrangement shown in FIG. 1 is by way of example only and does not limit the scope of the embodiments disclosed below.

    [0067] Oral care device 10 includes one or more energy emitters 20 and one or more energy detectors 22 located in the handheld oral care device 10. The energy emitters and detectors 20, 22 may be directly integrated in the body portion 12 of the oral care device 10 (as shown in FIG. 1). Alternatively, the emitters and detectors 20, 22 may be in a device attachment such as head member 14 or a module that may be attached to the device body portion 12. In this example, energy emitter 20 is configured to generate near infrared light energy using light emitting diodes and the energy detector 22 is configured to detect light at the wavelength emitted by the energy emitter 20.

    [0068] Referring to FIG. 1, body portion 12 includes a long axis, a front side, a back side, a left side, and a right side. The front side is typically the side of the oral care device 10 that contains the operating components and actuators. Typically, operating components are components such as the bristles of a power toothbrush, the nozzle of a flossing device, the blade of a shaver, the brush head of a face cleansing device, etc. If the operating side is the front side of the body portion 12, the energy emitter 20 may be located on the right side of the body portion, opposite the left side, at its end proximate to the head member 14. However, the energy emitter 20 may be located anywhere within the device along the long axis or around a circumference of the oral care device 10. Similarly, the energy detector 22 may be located on the right side of the body portion, opposite the left side, at its end proximate to the head member 14. Although FIG. 1 depicts energy detector 22 located adjacent to the energy emitter 20, the energy detector 22 may be located anywhere within the device along the long axis or around a circumference of the device. Additional sensors may be included in the oral care device 10 shown in FIG. 1, including but not limited to a proximity sensor and other types of sensors, such as an accelerometer, a gyroscope, a magnetic sensor, a capacitive sensor, a camera, a photocell, a clock, a timer, any other types of sensors, or any combination of sensors, including, for example, an inertial measurement unit.

    [0069] FIG. 2 shows a schematic representation of an example of an oral care system 200. The oral care system comprises an energy emitter 20 and an energy detector 22 and a computing device 30. The oral care system 200 may be implemented in one or more devices. For example, all the modules may be implemented in an oral care device. Alternatively, one or more of the modules or components may be implemented in a remote device, such as a smart phone, tablet, wearable device, computer, or other computing device. The computing device may communicate with a user interface via a connectivity module.

    [0070] The oral care system 200 includes the computing device 30 having a processor and a memory (not shown), which may store an operating system as well as sensor data. System 200 also includes an energy emitter 20 and an energy detector 22 configured to generate and provide sensor data to computing device 30. The system 200 may include a connectivity module (not shown) which may be configured and/or programmed to transmit sensor data to a wireless transceiver. For example, the connectivity module may transmit sensor data via a Wi-Fi connection over the Internet or an Intranet to a dental professional, a database, or other location. Alternatively, the connectivity module may transmit sensor or feedback data via a Bluetooth or other wireless connection to a local device (e.g., a separate computing device), database, or other transceiver. For example, connectivity module allows the user to transmit sensor data to a separate database to be saved for long-term storage, to transmit sensor data for further analysis, to transmit user feedback to a separate user interface, or to share data with a dental professional, among other uses. The connectivity module may also be a transceiver that may receive user input information. Other communication and control signals described herein may be effectuated by a hard wire (non-wireless) connection, or by a combination of wireless and non-wireless connections. System 200 may also include any suitable power source. In embodiments, system 200 also includes the user interface which may be configured and/or programmed to transmit information to the user and/or receive information from the user. The user interface may be or may comprise a feedback module that provides feedback to the user via haptic signal, audio signal, visual signal, and/or any other type of signal.

    [0071] Computing device 30 may receive the sensor data in real-time or periodically. For example, a constant stream of sensor data may be provided by the energy detector 22 to the computing device 30 for storage and/or analysis, or the energy detector 22 may temporarily store and aggregate or process data prior to sending it to computing device 30. Once received by computing device 30, the sensor data may be processed by a processor. The computing device 30 may relay information and/or receive information from the energy emitter and the energy detector 22.

    [0072] FIG. 3 shows an example of the oral care device 10 in use. In use, the oral care device 10 is inserted into the mouth of a user 300. Typically, the user 300 will move the oral care device around their mouth so that the teeth of the user 300 are brushed by the bristles of the head of the oral care device 10. In the example shown in FIG. 3, the energy emitter 20 provided on the oral care device 10 emits energy towards the face of the user 300. As is shown in this figure, the energy may be directed to a particular portion of the face of the user 300, in this case the nose of the user 300. Energy reflects off the nose of the user 300 and is detected by the energy detector 22 which is also provided on the oral care device 10. The detected energy which has been reflected from the face of the user indicates the dimensions of the portion of the face onto which the emitted energy was directed, the distance of the feature from the oral care device 10, and the orientation of the oral care device 10 relative to the feature. In this case, the reflected energy will indicate the dimensions and position of the nose of the user 300.

    [0073] The movement of the oral care device 10 relative to the face of the user 300 will cause energy to be directed onto different portions of the face of the user 300, for example, the eyes or mouth.

    [0074] FIG. 4 shows a flow chart of an example of a method which may be performed for monitoring the position of the oral care device. In step S100, energy is emitted towards the user's face. In step S102, reflected energy is received from the user's face which corresponds to the emitted energy. For example, at least a portion of the energy which is emitted towards the face of the user will be reflected or scattered from the user's face. A portion of the reflected or scattered energy will be received. At step S104, facial characteristics information is obtained. For example, the facial characteristics information may be obtained from the reflected energy, or from data relating to one or more facial characteristics of the user, for example, an image of the user, or from metadata relating to the user, or a combination thereof. In step S106, the position of an oral care device in the mouth of the user is determined using the reflected energy and the obtained facial characteristics information.
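    The sequence of steps S100-S106 above may be sketched in outline as follows. All function names, signal values, and the position rule in this sketch are illustrative assumptions introduced for exposition only and form no part of the disclosure:

```python
# Illustrative sketch of the FIG. 4 monitoring method (steps S100-S106).
# All names, numeric values, and the toy position rule are hypothetical.

def emit_energy():
    """S100: emit energy towards the user's face (arbitrary intensity units)."""
    return 1.0

def receive_reflected_energy(emitted):
    """S102: a portion of the emitted energy is reflected back and received."""
    assumed_reflectance = 0.4  # assumed fraction reflected back to the detector
    return emitted * assumed_reflectance

def obtain_facial_characteristics():
    """S104: e.g. relative feature dimensions from an image or metadata."""
    return {"nose_to_mouth_mm": 32.0}

def determine_position(reflected, characteristics):
    """S106: combine reflected energy and facial characteristics into a
    mouth position (a toy rule stands in for the trained mapping)."""
    threshold = 0.3 * characteristics["nose_to_mouth_mm"] / 32.0
    return "upper-left molars" if reflected > threshold else "lower incisors"

emitted = emit_energy()
reflected = receive_reflected_energy(emitted)
characteristics = obtain_facial_characteristics()
position = determine_position(reflected, characteristics)
```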

    [0075] FIG. 5 shows an example of the relative position and dimensions of facial features of the user. FIG. 5 represents an image of the user from which facial characteristics are determined in step S104 of FIG. 4. In FIG. 5, the regions of the facial features of interest are indicated by dotted lines. In this case, the nose, mouth and eye of the user are indicated as regions of facial features of interest. The positions of the eye and nose relative to the mouth are determined as indicated by the arrows in FIG. 5. Facial characteristics such as the dimensions and location of each of the facial features indicated by dotted lines are determined based on the image of the user. One facial feature may be used as the facial characteristics, or several facial features may be used, or the whole face of the user may be used. In this example, the facial characteristics information is obtained from an image of the user. The image is obtained by the user taking a self-image which in this case is a two dimensional image but may be a three dimensional image. A three dimensional image may be obtained by moving an imaging device, such as that found in a mobile device, around the head and/or face of the user, and processing the image to determine the dimensions and/or the relative position of the features of the user. Alternatively, a three dimensional image may be obtained by a multi focus imaging device.
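    The derivation of relative feature positions indicated by the arrows in FIG. 5 may be sketched as follows; the feature coordinates below are illustrative pixel values, not measurements from the disclosure:

```python
# Hypothetical sketch of deriving FIG. 5 facial characteristics: the centres
# of the detected feature regions (illustrative pixel coordinates) yield the
# relative distances indicated by the arrows in the figure.
import math

feature_centres = {
    "mouth": (100.0, 160.0),
    "nose": (100.0, 120.0),
    "eye": (70.0, 80.0),
}

def relative_distance(a, b):
    """Euclidean distance between two feature centres."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

characteristics = {
    "nose_to_mouth": relative_distance(feature_centres["nose"],
                                       feature_centres["mouth"]),
    "eye_to_mouth": relative_distance(feature_centres["eye"],
                                      feature_centres["mouth"]),
}
```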

    [0076] Additionally or alternatively, the information on the facial characteristics of the user may be obtained using the emitter and detector. Prior to use the user may perform a scanning motion of their face using the oral care device, where the oral care device is positioned at a predetermined distance from the face of the user. The received reflected energy may be used to collect information on the topology of the face of the user, for example, to create a three dimensional image. The facial characteristics information may be collected in real time, as the user uses the oral care device.

    [0077] Additionally or alternatively, the facial characteristics information may include metadata such as the weight, height, complexion, gender and/or age of the user. This data may be collected by processing an image of the user or may be input by the user, using an application on a mobile phone or the like. The metadata may be used to estimate, or improve the estimation, of facial features of the user using a predetermined correlation between the size/position of facial features and metadata.

    [0078] FIG. 6 shows an example of a method involved in step S106 of determining the position of the oral care device in the mouth of the user as shown in FIG. 4. Step S106 comprises the steps of S110, inputting the received reflected energy to the mapping, and thereby S112, estimating the location of the oral care device in the mouth of the user. The mapping is a trained (machine learning) algorithm which is developed during controlled, or guided, sessions with a diverse group of people, whereby the location of the oral care device in the mouth of a person is monitored while reflected energy is received. For example, the facial characteristics of each person of the group of people may be represented as a vector of parameters describing the surface of the face of a user which is used to train the algorithm. Thus, a general algorithm based on a generic person is provided which will estimate the location of an oral care device in the mouth of the user using the received reflected energy from the user.

    [0079] FIG. 7 shows an example of the method as shown in FIG. 6, comprising the additional step S108, which comprises adjusting a mapping based on the obtained facial characteristics information. Where the mapping is a machine learned algorithm, the user's facial characteristics are added as additional inputs to the algorithm, whereby the mapping is then adapted based on the user's facial characteristics. The facial characteristics of the user may be represented as a vector of parameters describing the surface of the face of a user, which may be fed as the additional input to the algorithm. The facial characteristics information is used to determine the location and dimensions of each facial feature of the user relative to one another. The location and dimensions of the facial features of the user are compared to the location and dimensions of the facial features upon which the mapping is based, and the mapping is adjusted so that when reflected energy from the face of the user is received, the mapping correlates the reflected energy with a facial feature of the user, rather than with a facial feature of the generic person. Thus, the location of the oral care device is determined more accurately with respect to the mouth of the user.

    [0080] FIG. 8 shows an example of a method of monitoring the position of an oral care device in the mouth of the user which may be implemented alternatively or additionally to the method shown in FIG. 7. In FIG. 8, data and/or metadata related to the user and/or reflected energy are obtained, S101, and information on facial characteristics of the user is obtained or extracted from the data and/or metadata and/or reflected energy, S103. Information on facial characteristics relating to groups of people sharing similar facial characteristics is obtained, S105. For example, the information may be obtained from a database which stores information on the facial characteristics of groups of people. The information on facial characteristics of the groups and the information on facial characteristics of the user are compared to determine which group of people has facial characteristics most similar to those of the user, S107. A mapping which corresponds to the determined group of people is then selected, S109. The selected mapping is an algorithm which has been trained using data compiled during a controlled brushing session, where the location of the oral care device relative to the facial characteristics of each member of the group of people is monitored while each member uses the oral care device, such that reflected energy can be correlated to the location of the oral care device. The reflected energy is then input to the selected mapping in order to determine the position of the oral care device in the mouth of the user, S106. The process of step S106 may include the steps S108-S112 shown in FIG. 7.
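    The group-selection step S107 may be sketched as follows; the group centroids and the distance metric are assumptions introduced for illustration only:

```python
# Sketch of step S107: choose, from several groups of people, the group
# whose facial characteristics are most similar to the user's, so that the
# corresponding mapping can be selected (S109). Centroid values and the
# Euclidean similarity metric are hypothetical.
import math

group_centroids = {  # (first feature, second feature) per group
    "group_A": (30.0, 45.0),
    "group_B": (36.0, 52.0),
}

def select_group(user_characteristics):
    """Return the group whose centroid is closest to the user's features."""
    ux, uy = user_characteristics
    return min(
        group_centroids,
        key=lambda g: math.hypot(group_centroids[g][0] - ux,
                                 group_centroids[g][1] - uy),
    )
```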

    [0081] FIG. 9 shows an example of the grouping of people used to develop a plurality of mappings for different facial characteristics. In FIG. 9, a first feature, such as the distance of the nose from the mouth, is correlated with a second feature, such as the distance from the nose to the eyes. Each point on the graph indicates a different person. People with similar first and second features are grouped together, as indicated by the rings shown in FIG. 9. The data relating to people who are grouped together are used to develop a mapping which corresponds to that group of people. The point indicated by an arrow illustrates the user of the oral care device. The facial characteristics of the user are extracted using one of the abovementioned techniques. For example, the first and second facial features described above are extracted. As is shown in this figure, the user has first and second facial features which are similar to a particular group of people, as they fall within the perimeter defined by a ring surrounding a particular group. The perimeter of the ring represents threshold values of the first and second characteristics. If the user falls within the threshold values of the first and second facial characteristics of a particular group of people, the user has facial characteristics most similar to those people. There may be provided any number of groups. The groups may comprise any number of people. Each group may comprise only one person, in which case the mapping corresponding to the person with the most similar facial features to the user is used as the selected mapping.

    [0082] FIG. 10 shows an example of a method which may be applied additionally or alternatively to any of the previously specified methods. The method of FIG. 10 is performed before, for example, the step of emitting energy towards the user's face (step S100 in FIG. 4). At step S114, setting energy is emitted towards the user's face. This may be energy which is of a predefined intensity. The energy may be emitted from a predefined location, for example the oral care device may be held in front of a particular feature of the user, for example the nose, at a predefined distance. The setting energy may be emitted from the oral care device onto the particular feature of the user and reflected back. The reflected setting energy corresponding to the emitted setting energy is received from the user's face, S116. The reflected setting energy is then analysed, S117, for example the amount or intensity of reflected setting energy which is received is compared to a predefined value of required energy. The amount of energy to be subsequently emitted towards the user's face is then determined, or corrected, based on the result of the comparison. For example, the amount of energy is increased or decreased based on the result of the comparison, so that the subsequently emitted energy returns a desired amount or intensity of reflected energy. Subsequently, any of the methods as described above may be implemented.
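    The calibration of steps S114-S117 may be sketched as a simple proportional correction; the required reflected value and setting intensity below are illustrative assumptions:

```python
# Sketch of the FIG. 10 calibration: emit setting energy of a predefined
# intensity, measure the received reflection, and scale the subsequently
# emitted energy so that it is expected to return the required reflection.
# Both constants are hypothetical values.

REQUIRED_REFLECTED = 0.5   # predefined value of required reflected energy
SETTING_INTENSITY = 1.0    # predefined intensity of the setting energy

def calibrate_emission(received_setting_energy):
    """Return the corrected emission intensity based on the measured
    reflectance of the user's face (S117)."""
    measured_reflectance = received_setting_energy / SETTING_INTENSITY
    return REQUIRED_REFLECTED / measured_reflectance
```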

    [0083] Alternatively or additionally, the method as set out in FIG. 11 may be implemented. At step S120, facial characteristics information may be extracted as is described above. At step S122, the amount of energy to be emitted may be determined based on the extracted information. For example, an image of the user may be analysed to determine the skin tone of the user. This may be used as the facial characteristics information to determine the amount of energy to be emitted in the step of emitting: the amount of energy is increased or decreased based on a comparison of the user's skin tone with a predefined skin tone and its associated energy level.
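    The skin-tone comparison of steps S120-S122 may be sketched as a lookup against predefined tone/energy pairs; the tone scale and energy values below are hypothetical:

```python
# Sketch of FIG. 11 (S120-S122): determine the emission level from the
# user's skin tone by comparison with predefined tone/energy pairs.
# All values on the tone scale and the energy levels are hypothetical.

tone_energy_table = [  # (predefined skin-tone value, emission level)
    (0.2, 0.8),
    (0.5, 1.0),
    (0.8, 1.3),
]

def energy_for_tone(user_tone):
    """Return the emission level of the closest predefined skin tone."""
    return min(tone_energy_table, key=lambda t: abs(t[0] - user_tone))[1]
```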

    [0084] While embodiments described herein include near infrared light energy sources and detectors, other types of energy may also be used. For example, alternative wavelengths of light (such as within the visible spectrum), radio frequency electromagnetic radiation forming a radar sensor, or electrostatic energy, such as in a mutual capacitance sensor, may also be used. The sensor output may be derived from different aspects of the detected energy, such as the magnitude of the detected energy and/or the phase or time delay between the energy source and the detected signal (time of flight).
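    The time-of-flight reading mentioned above relates the round-trip delay between emission and detection to distance via the standard relation; the sketch below assumes electromagnetic propagation at the speed of light:

```python
# Standard time-of-flight relation: a round-trip delay t at propagation
# speed c corresponds to a one-way distance of c * t / 2. Used here purely
# to illustrate how a delay measurement can yield a distance estimate.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_time_of_flight(round_trip_delay_s):
    """One-way distance to the reflecting surface from round-trip delay."""
    return SPEED_OF_LIGHT * round_trip_delay_s / 2.0
```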

    [0085] It is understood that the embodiments of the present disclosure are not limited to the particular methodology, protocols, devices, apparatus, materials, applications, etc., described herein, as these may vary. It is also to be understood that the terminology used herein is used for the purpose of describing particular embodiments only, and is not intended to be limiting in scope of the embodiments as claimed. It must be noted that as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural reference unless the context clearly dictates otherwise.

    [0086] Unless defined otherwise, all technical and scientific terms used herein have the same meanings as commonly understood by one of ordinary skill in the art to which the embodiments of the present disclosure belong. Preferred methods, devices, and materials are described, although any methods and materials similar or equivalent to those described herein may be used in the practice or testing of the embodiments.

    [0087] Although only a few exemplary embodiments have been described in detail above, those skilled in the art will readily appreciate that many modifications are possible in the exemplary embodiments without materially departing from the novel teachings and advantages of the embodiments of the present disclosure. The above-described embodiments of the present invention may advantageously be used independently of any other of the embodiments or in any feasible combination with one or more others of the embodiments.

    [0088] Accordingly, all such modifications are intended to be included within the scope of the embodiments of the present disclosure as defined in the following claims. In the claims, means-plus-function clauses are intended to cover the structures described herein as performing the recited function and not only structural equivalents, but also equivalent structures.

    [0089] In addition, any reference signs placed in parentheses in one or more claims shall not be construed as limiting the claims. The words "comprising" and "comprises," and the like, do not exclude the presence of elements or steps other than those listed in any claim or the specification as a whole. The singular reference of an element does not exclude the plural references of such elements and vice-versa. One or more of the embodiments may be implemented by means of hardware comprising several distinct elements. In a device or apparatus claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures may not be used to an advantage.