MOBILE DEVICE, METHOD OF CONTROLLING THE SAME, AND COMPUTER PROGRAM STORED IN RECORDING MEDIUM
20220096015 · 2022-03-31
Assignee
Inventors
CPC classification
HUMAN NECESSITIES (A61B): A61B5/0059; A61B5/165; A61B5/02416; A61B5/0816; A61B5/7455; A61B5/721; A61B5/684; A61B5/0205; A61B5/6898; A61B5/6843
International classification
HUMAN NECESSITIES (A61B): A61B5/00; A61B5/0205; A61B5/08; A61B5/1455
Abstract
A mobile device is provided. The mobile device includes a display, a front camera provided to face in a forward direction of the display, a touch sensor provided at a front side of the display, and a processor configured to acquire a photoplethysmography (PPG) signal from an image of a finger captured by the front camera in a PPG measurement mode, and output guide information that guides at least one of a position of the finger or a contact pressure with which the finger presses the front camera based on at least one of an output of the touch sensor or the image of the finger.
Claims
1. A mobile device comprising: a display; a front camera provided to face in a forward direction of the display; a touch sensor provided at a front side of the display; and a processor configured to: acquire a photoplethysmography (PPG) signal from an image of a finger captured by the front camera in a PPG measurement mode, and output guide information that guides at least one of a position of the finger or a contact pressure with which the finger presses the front camera based on at least one of an output of the touch sensor or the image of the finger.
2. The mobile device of claim 1, wherein the display is configured to emit light of a specific wavelength in an area corresponding to the front camera in the PPG measurement mode, and wherein the front camera is configured to, upon the light emitted from the display in the PPG measurement mode being reflected from or transmitted through the finger, receive the light reflected from or transmitted through the finger thereby capturing the image of the finger.
3. The mobile device of claim 2, wherein the processor is further configured to: identify whether the finger is located in a predetermined area on the display based on the output of the touch sensor, and output the guide information with at least one of a visual method, an auditory method, or a tactile method based on a result of the identification.
4. The mobile device of claim 3, wherein the processor is further configured to control the display to display a guide image that guides the position of the finger to the predetermined area.
5. The mobile device of claim 2, wherein the processor is further configured to: identify whether the contact pressure is included within a predetermined range based on the output of the touch sensor, and output the guide information with at least one of a visual method, an auditory method, or a tactile method based on a result of the identification.
6. The mobile device of claim 5, further comprising a speaker, wherein the processor is further configured to control the speaker to present a guide speech that guides the contact pressure to be provided within the predetermined range.
7. The mobile device of claim 2, wherein the processor is further configured to: identify whether a motion of the finger occurs based on the at least one of the output of the touch sensor or the image of the finger, and based on identifying that the motion of the finger occurs, output the guide information with at least one of a visual method, an auditory method, or a tactile method.
8. The mobile device of claim 2, wherein the processor is further configured to, based on at least one of the output of the touch sensor or the image of the finger, perform a distortion preventive process that prevents distortion due to a motion of the finger.
9. The mobile device of claim 8, wherein the processor is further configured to: track the motion of the finger based on at least one of the output of the touch sensor or the image of the finger, and determine, based on a current position of the finger, at least one pixel from among a plurality of pixels of the image of the finger to use to acquire the PPG signal.
10. The mobile device of claim 8, wherein the processor is further configured to: identify a degree to which the finger has moved based on at least one of the output of the touch sensor or the image of the finger, and determine, based on the degree to which the finger has moved, to use a single pixel or multiple pixels to acquire the PPG signal.
11. The mobile device of claim 2, further comprising a motion sensor configured to detect a motion of the mobile device, wherein the processor is further configured to output guide information related to the motion of the mobile device with at least one of a visual method, an auditory method, or a tactile method, based on an output of the motion sensor.
12. The mobile device of claim 2, wherein the processor is further configured to identify a change in the contact pressure based on at least one of the output of the touch sensor or the image of the finger, and control at least one of a size or a brightness of a light emission area in which the light of the specific wavelength is emitted, based on the change in the contact pressure.
13. The mobile device of claim 2, wherein the processor is further configured to control the display to alternately emit light rays of a plurality of different specific wavelengths in the PPG measurement mode.
14. The mobile device of claim 2, wherein the processor is further configured to acquire biometric information of a user based on the acquired PPG signal, and wherein the biometric information of the user includes at least one of a heart rate, a blood oxygenation, a stress index, a respiration rate, a blood pressure, an oxygen delivery time, or a pulse speed.
15. A method of controlling a mobile device including a display, a front camera provided to face in a forward direction of the display, and a touch sensor provided at a front side of the display, the method comprising: identifying at least one of a position of a finger or a contact pressure with which the finger presses the front camera based on at least one of an output of the touch sensor or an image of the finger captured by the front camera; outputting guide information that guides at least one of the position of the finger or the contact pressure based on a result of the identifying step; and acquiring a photoplethysmography (PPG) signal from the image of the finger captured by the front camera.
16. The method of claim 15, wherein the identifying of at least one of the position of the finger or the contact pressure includes identifying whether the finger is located in a predetermined area on the display based on the output of the touch sensor, and wherein the outputting of the guide information includes outputting the guide information with at least one of a visual method, an auditory method, or a tactile method based on a result of the identifying whether the finger is located in the predetermined area on the display.
17. The method of claim 15, wherein the identifying of at least one of the position of the finger or the contact pressure includes identifying whether the contact pressure is included within a predetermined range based on the output of the touch sensor, and wherein the outputting of the guide information includes outputting the guide information with at least one of a visual method, an auditory method, or a tactile method based on a result of the identification.
18. The method of claim 15, wherein the identifying of the at least one of the position of the finger or the contact pressure includes identifying whether a motion of the finger occurs based on at least one of the output of the touch sensor or the image of the finger, and wherein the outputting of the guide information includes, in response to identifying that the motion of the finger has occurred, outputting the guide information with at least one of a visual method, an auditory method, or a tactile method.
19. The method of claim 15, further comprising performing a distortion preventive process that prevents distortion due to a motion of the finger based on at least one of the output of the touch sensor or the image of the finger.
20. The method of claim 19, wherein the performing of the distortion preventive process includes: tracking the motion of the finger based on at least one of the output of the touch sensor or the image of the finger, and determining, based on a current position of the finger, at least one pixel from among a plurality of pixels of the image of the finger to use to acquire the PPG signal; or identifying a change in the contact pressure based on at least one of the output of the touch sensor or the image of the finger, and controlling at least one of a size or a brightness of a light emission area in the display, in which light of a specific wavelength is emitted, based on the change in the contact pressure.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings.
DETAILED DESCRIPTION
[0037] Like reference numerals refer to like elements throughout the specification. Not all elements of the embodiments of the present disclosure will be described, and descriptions of what is commonly known in the art or of what overlaps between the embodiments will be omitted. The terms used throughout the specification, such as “˜ part”, “˜ module”, “˜ member”, “˜ block”, etc., may be implemented in software and/or hardware, and a plurality of “˜ parts”, “˜ modules”, “˜ members”, or “˜ blocks” may be implemented in a single element, or a single “˜ part”, “˜ module”, “˜ member”, or “˜ block” may include a plurality of elements.
[0038] It will be further understood that the term “connect” or its derivatives refer both to direct and indirect connection, and the indirect connection includes a connection over a wireless communication network or a connection through an electrical wire.
[0039] It should be further understood that the terms “comprises” and/or “comprising,” when used in this specification, identify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof, unless the context clearly indicates otherwise.
[0040] In the specification, it should be understood that, when a member is referred to as being “on/under” another member, it can be directly on/under the other member, or one or more intervening members may also be present.
[0041] Further, it will be understood that, when a signal or data is transferred, sent, or transmitted from one element to another element, an intervening element through which the signal or data passes is not excluded, unless the context clearly indicates otherwise.
[0042] Although the terms “first,” “second,” “A,” “B,” etc. may be used to describe various components, the terms do not limit the corresponding components, but are used only for the purpose of distinguishing one component from another component. The ordinal numbers used do not indicate the arrangement order, manufacturing order, or importance between components.
[0043] As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
[0044] As used herein, expressions such as “at least one of,” when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list. For example, the expression, “at least one of a, b, or c,” should be understood as including only a, only b, only c, both a and b, both a and c, both b and c, or all of a, b, and c.
[0045] Reference numerals used for method steps are just used for convenience of explanation, but not to limit an order of the steps. Thus, unless the context clearly dictates otherwise, the written order may be practiced otherwise.
[0046] Hereinafter, embodiments of a mobile device, a method of controlling the same, and a computer program stored in a recording medium according to an aspect will be described in detail with reference to the accompanying drawings.
[0048] A mobile device according to an embodiment may be a portable electronic device having a display and a camera, such as a smart phone or a tablet personal computer (PC). For example, referring to
[0049] Referring to the cross-sectional side view of
[0050] The touch sensor 130 may include an upper plate and a lower plate on which a transparent electrode is deposited, and when information about the position at which a contact has occurred or a change in electrical capacitance has occurred is transmitted to a processor (such as the processor 140 of
[0051] The front camera 121 may be installed into the display 110 and may be located on the rear surface of the touch sensor 130. Referring to
[0052] Due to the structure of the mobile device 100, when the user touches the lens of the front camera 121, the user is caused to come into contact with the touch sensor 130 provided at the front surface of the front camera 121. Details thereof will be described below.
[0053] The rear camera 122 may be mounted in a housing 101 that accommodates and supports the display 110 and other components of the mobile device 100 such that a lens of the rear camera 122 faces in a backward direction of the mobile device 100.
[0054] Referring to
[0055] The structure of the mobile device 100 described with reference to
[0057] Referring to
[0058] The display 110 may employ one of various types of displays, such as a light emitting diode (LED) display, an organic light emitting diode (OLED) display, and a liquid crystal display (LCD).
[0059] The display 110 may include a plurality of pixels arranged in two dimensions to implement a two-dimensional image, and each of the pixels may include a plurality of sub-pixels to implement a plurality of colors. For example, in order to implement an RGB image, each of the pixels may include a red sub-pixel, a green sub-pixel, and a blue sub-pixel, and may further include a white sub-pixel or an infrared sub-pixel.
[0060] The front camera 121 may include an image sensor, such as a complementary metal oxide semiconductor (CMOS) sensor or a charge coupled device (CCD) sensor. In addition, although not shown in the control block diagram of
[0061] The touch sensor 130 may be arranged on the front surface of the display 110 in the form of a layer. As a method for the touch sensor 130 to detect a touch, one of various well-known methods, such as a capacitive method, a resistive (pressure-sensitive membrane) method, an ultrasonic method, or an infrared method, may be employed.
[0062] The mobile device 100 may perform various functions, such as sending/receiving calls and messages, web browsing, and executing various applications. In particular, the mobile device 100 according to an embodiment may perform a PPG measurement function.
[0063] The PPG signal is one of the indicators representing changes in blood volume synchronized with the heartbeat. When light of a specific wavelength is transmitted to a human body using a light source, some light is absorbed by blood, bones, and tissues, and some other light is reflected or transmitted and reaches a light receiver. The degree of absorption of light may vary depending on blood, bones, and tissues located in a path through which light passes. Since components except for a change in blood flow caused by a heartbeat are unchanging components, a change in the transmitted light or reflected light received by the light receiver reflects a change in blood volume synchronized with a heartbeat.
[0064] The mobile device 100 according to an embodiment may use the display 110 as a light source and the front camera 121 as a light receiver to measure the PPG signal. Accordingly, when the mobile device 100 operates in a PPG measurement mode, the display 110 emits light of a specific wavelength for PPG measurement, and the front camera 121 captures an image of a human body by receiving the light reflected from or transmitted through the human body. That is, the mobile device 100 according to an embodiment may measure the PPG signal using components basically provided in the mobile device 100 without having additional devices, such as additional sensors or light sources.
[0065] In the embodiments to be described below, a human body to be subjected to PPG measurement will be referred to as a user, and an image captured by the front camera 121 receiving light reflected from or transmitted through the human body will be referred to as a user image.
[0066] Here, the user image only needs to include information (e.g., wavelength information, intensity, etc.) about the light reflected from or transmitted through the user, and does not need to be an image in which the user is identified.
[0067] The processor 140 may acquire a PPG signal from the user image captured by the front camera 121. In addition, the processor 140 may acquire biometric information of the user based on the acquired PPG signal, and the biometric information being acquired by the processor 140 may include at least one of a heart rate, a blood oxygenation, a stress index, a respiration rate, a blood pressure, an oxygen delivery time, or a pulse speed. However, the above described biometric information is only an example applicable to the embodiment of the mobile device 100, and it should be understood that various types of biometric information may be acquired by the processor 140.
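As an illustration of how biometric information might be derived from an acquired PPG signal, the sketch below estimates a heart rate by counting peaks in a sampled waveform. The function name, the simple peak rule, and the synthetic signal are illustrative assumptions, not the algorithm the processor 140 actually uses.

```python
import numpy as np

def heart_rate_from_ppg(ppg, fs):
    """Estimate heart rate in beats per minute from a sampled PPG signal.

    ppg: 1-D sequence of samples; fs: sampling rate in Hz (here, the
    camera frame rate). Local maxima above the mean level are counted
    as heartbeats, and the average peak interval gives the rate.
    """
    centered = np.asarray(ppg, dtype=float)
    centered = centered - centered.mean()
    peaks = [i for i in range(1, len(centered) - 1)
             if centered[i] > 0
             and centered[i] > centered[i - 1]
             and centered[i] >= centered[i + 1]]
    if len(peaks) < 2:
        return 0.0  # not enough beats captured to estimate a rate
    mean_interval = (peaks[-1] - peaks[0]) / (len(peaks) - 1) / fs
    return 60.0 / mean_interval

# Synthetic 1.25 Hz (75 bpm) pulse sampled at a 30 fps camera frame rate.
fs = 30.0
t = np.arange(0.0, 10.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 1.25 * t)
print(round(heart_rate_from_ppg(signal, fs)))  # → 75
```

A production implementation would suppress motion artifacts before peak counting; the distortion preventive process discussed later addresses exactly that problem.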
[0068] In addition, the processor 140 may control the overall operation of the mobile device 100. For example, the processor 140 may control the display 110 to emit light of a specific wavelength, and may control the front camera 121 to capture a user image. In the embodiments to be described below, although not mentioned for the sake of convenience of description, it is assumed that operations performed by the display 110, the front camera 121, and other components of the mobile device 100 may be controlled by the processor 140.
[0069] A program for executing an operation performed by the processor 140 and various types of data required for executing the program may be stored in the memory 150. A program related to PPG measurement may be stored in the form of an application, and such an application may be installed by default in the mobile device 100 or may be installed by a user after the mobile device 100 is sold.
[0070] In the latter case, the user may install the application for PPG measurement in the mobile device 100 by downloading the application for PPG measurement from a server providing the application.
[0072] When an application for PPG measurement is executed in the mobile device 100 according to an embodiment, the mobile device 100 may operate in a PPG measurement mode. As described above, the mobile device 100 operating in the PPG measurement mode may measure at least one type of biometric information among a heart rate, a blood oxygenation (SpO2), a stress index, a respiration rate, a blood pressure, an oxygen delivery time and a pulse speed. The mobile device 100 may provide a result of the measurement to the user.
[0073] For example, when an application for PPG measurement is executed in the mobile device 100, a screen for selecting biometric information desired to be measured may be displayed on the display 110 as shown in
[0074] When the biometric information displayed on the screen is not biometric information desired to be measured, the user may swipe the screen to move to the next screen, and when a screen of desired biometric information is displayed, the user may touch the measurement button m.
[0075] Alternatively, a plurality of measurement buttons m respectively corresponding to a plurality of pieces of measurable biometric information may be displayed on one screen.
[0076] When biometric information is selected by the user, the mobile device 100 may perform a series of operations for measuring the selected biometric information. Hereinafter, the operations will be described in detail.
[0078] The pressure generated by the heartbeat allows blood to flow in blood vessels, and the pressure by the heartbeat acts up to the end capillaries of the human body. Arterial blood from the capillaries of the fingertips supplies blood to the tissues, enters the veins, and returns to the heart. Accordingly, the arterial blood volume in the fingertip capillaries repeatedly increases and decreases in synchronization with the heartbeat.
[0079] As described above, the PPG signal is an index indicating a change in blood volume synchronized with a heartbeat. Therefore, the measurement of the PPG signal may be performed at the extremities of the body, such as a finger, toe, or earlobe. For the sake of convenience of measurement, the following description will be made in relation to a case in which a PPG signal is measured on a finger of a user as an example.
[0080] Referring to
[0081] When the PPG signal is measured in a reflective type, light of a specific wavelength may be emitted from an area adjacent to the front camera 121. For example, as shown in
[0082] Depending on the design of the mobile device 100, light-emitting components of the display 110 may or may not be disposed on the front surface of the lens of the front camera 121. In the former case, light may be emitted from the front surface of the lens, and in the latter case, light may not be emitted from the front surface of the lens (i.e., the emission area EA may have a shape in which the center is empty).
[0083] In addition, the touch sensor 130 may or may not be located on the front surface of the lens of the front camera 121. Because the user's finger 600 is larger than the lens of the front camera 121, when the user places the finger 600 on the lens of the front camera 121, the finger 600 is caused to come into contact with the touch sensor 130 around the lens regardless of whether the touch sensor 130 is located on the front surface of the lens of the front camera 121.
[0084] Referring again to
[0085] The front camera 121 may capture the finger image by receiving the reflected light incident onto the lens, and the processor 140 may acquire a PPG signal from the captured finger image.
[0086] Referring to
[0087] That is, when the mobile device 100 is implemented in a foldable form, light of a specific wavelength is emitted from the opposite side rather than from the same side as the front camera 121 such that the front camera 121 may receive the light transmitting through the finger.
[0088] For example, referring to
[0089] However, even when the mobile device 100 is implemented in a foldable form, the emission area EA may be formed in an area adjacent to the front camera 121 as shown in
[0091] The wavelength of light emitted from the emission area EA may vary depending on biometric information to be measured. For example, as shown in
[0092] When it is desired to measure a blood pressure, a combination of red light, green light, blue light, and infrared light may be emitted from the emission area EA; when it is desired to measure a respiration rate, red light or infrared light may be emitted; and when it is desired to measure a heart rate, green light may be emitted.
[0093] However, the table of
[0094] Information about the emission wavelengths each matched with corresponding biometric information may be stored in the memory 150, and when biometric information is selected by a user, the processor 140 may control the display 110 to emit light of an emission wavelength matched with the selected biometric information from the emission area EA.
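The lookup that matches a selected biometric quantity to emission wavelengths can be sketched as a small table keyed by the selection. The Python structure below is an illustrative sketch using the example pairings from paragraph [0092]; it is not how the memory 150 actually stores the information.

```python
# Example pairings from paragraph [0092]; a real device may store more
# entries and different wavelength combinations.
EMISSION_TABLE = {
    "heart_rate": ("green",),
    "respiration_rate": ("red", "infrared"),
    "blood_pressure": ("red", "green", "blue", "infrared"),
}

def emission_wavelengths(biometric):
    """Return the wavelengths the display should emit from the emission
    area EA for the biometric quantity selected by the user."""
    try:
        return EMISSION_TABLE[biometric]
    except KeyError:
        raise ValueError(f"no emission wavelengths stored for {biometric!r}")

print(emission_wavelengths("respiration_rate"))  # → ('red', 'infrared')
```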
[0095] The depth to which light penetrates human tissue may vary depending on the wavelength band. Accordingly, when the PPG signal is measured using multiple wavelengths, more diverse and accurate information may be acquired. Referring to the example of
[0096] As described above, the display 110 of the mobile device 100, which includes a plurality of sub-pixels for each single pixel, may implement various colors, and thus may emit light of various wavelengths for acquiring biometric information.
[0097] In order to measure the PPG signal using multi-wavelength light, the processor 140 may control the display 110 to emit light from at least two of the red sub-pixel, the green sub-pixel, or the blue sub-pixel included in the emission area EA. In some cases, an infrared sub-pixel may also be used.
[0098] For example, when using red light and infrared light to measure the respiration rate, as shown in
[0099] Light emission from the emission area EA may be performed after biometric information is selected. The light emission may be performed immediately after selection of biometric information, or may be performed after the user's finger 600 contacts the lens of the front camera 121, or may be performed when it is confirmed that the user's finger is properly positioned.
[0100] Light emitted from the emission area EA of the display 110 may be reflected from or transmitted through a finger 600 and then be incident onto the lens of the front camera 121. The front camera 121 may capture frame images according to a set frame rate, and each of the frame images captured by receiving light reflected from or transmitted through a finger 600 may be referred to as a finger image.
[0101] The finger image captured by the front camera 121 may be transmitted to the processor 140, and the processor 140 may acquire a PPG signal from the finger image. In addition, the processor 140 may identify or calculate the biometric information selected by the user using the acquired PPG signal.
[0102] For example, the processor 140 may extract a specific wavelength component from the finger images captured at regular time intervals. A change in a value of the specific wavelength component according to time change may indicate a PPG signal. The processor 140 may divide a specific wavelength component into an alternating current (AC) component and a direct current (DC) component, and calculate the selected biometric information using the divided AC component and DC component.
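The two steps above, collapsing each captured frame to a specific wavelength component and then separating that time series into AC and DC parts, can be sketched as follows. The moving-average baseline is an assumption for illustration; the patent does not specify how the components are divided.

```python
import numpy as np

def ppg_from_frames(frames, channel=1):
    """Collapse each frame to the mean of one color channel.

    frames: array of shape (n_frames, height, width, 3); channel 1 is
    green in RGB order. The per-frame means over time form the raw
    PPG waveform.
    """
    frames = np.asarray(frames, dtype=float)
    return frames[..., channel].mean(axis=(1, 2))

def split_ac_dc(signal, window=31):
    """Split a PPG waveform into a DC baseline and a pulsatile AC part.

    The DC part is estimated with a moving average whose window should
    span a few heartbeats; the AC part is the residual.
    """
    kernel = np.ones(window) / window
    dc = np.convolve(signal, kernel, mode="same")
    return signal - dc, dc

# A flat signal is all DC: its AC residual is (numerically) zero.
ac, dc = split_ac_dc(np.full(100, 5.0))
```

The ratio of the AC amplitude to the DC level at two wavelengths is, for instance, the classical starting point for blood-oxygenation estimates.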
[0103] The calculated biometric information may be provided to the user through the display 110 or the speaker 160, and may be used to provide healthcare-related services. For example, the calculated biometric information may be used to monitor a health status of a user having a specific disease. When the calculated biometric information is out of a reference range, a warning message may be output or relevant information may be transmitted to a related medical institution.
[0105] For accurate measurement of the PPG signal, it is important that the user's finger 1200 is placed at an accurate position corresponding to the front camera 121. Accordingly, the mobile device 100 according to an embodiment may output guide information for guiding the position of the finger. For example, guide information for guiding the position of a finger may be output using at least one of a visual method, an auditory method, or a tactile method.
[0106] For example, as shown in
[0107] For example, the position guide image may include an arrow pointing to the lens of the front camera 121 or a finger-shaped image FI. When the finger-shaped image FI is displayed on the display 110, the user may place his or her finger to overlap the finger-shaped image FI displayed on the display 110.
[0108] As another example, when the mobile device 100 is implemented in a foldable form, as shown in
[0109] The embodiment of the mobile device 100 is not limited to the examples of
[0110] The mobile device 100 according to an embodiment may output guide information for guiding at least one of a finger position or a finger contact pressure, or perform a distortion preventive process for preventing distortion due to motion of the finger based on an output of the touch sensor 130 and an output of the front camera 121. The guide information output in this case may also be output using at least one of a visual method, an auditory method, or a tactile method. Hereinafter, detailed operations thereof will be described.
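As one hypothetical sketch of the distortion preventive process recited in claims 9 and 10, the function below picks which camera pixels feed the PPG signal from the tracked finger position: a single pixel while the finger is steady, and a patch of pixels when it moves, so that averaging can smooth the distortion. The threshold, the patch size, and the policy that small motion means one pixel are all assumptions, not taken from the patent.

```python
def pixels_for_ppg(finger_center, motion_magnitude, frame_shape,
                   motion_threshold=2.0, patch_radius=3):
    """Select (row, col) pixels of the finger image to use for the PPG.

    finger_center: tracked (row, col) of the finger in the frame;
    motion_magnitude: how far the finger moved since the last frame,
    in pixels; frame_shape: (height, width) of the captured image.
    """
    height, width = frame_shape
    row, col = finger_center
    if motion_magnitude < motion_threshold:
        return [(row, col)]  # steady finger: a single pixel suffices
    # Moving finger: a clamped square patch around the tracked center.
    return [(r, c)
            for r in range(max(0, row - patch_radius),
                           min(height, row + patch_radius + 1))
            for c in range(max(0, col - patch_radius),
                           min(width, col + patch_radius + 1))]

print(len(pixels_for_ppg((10, 10), 0.5, (20, 20))))  # → 1
print(len(pixels_for_ppg((10, 10), 5.0, (20, 20))))  # → 49
```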
[0112] Referring to
[0113] The guide speech output through the speaker 160 may include information for guiding the position of the finger 1500 or the contact pressure. The contact pressure may refer to a pressure of the user's finger 1500 with which the lens of the front camera 121 is pressed.
[0114] When the above-described guide image is displayed on the display 110, a guide speech corresponding to the displayed guide image (e.g., a guide speech, such as “place your finger on the lens of the front camera”) may be output through the speaker 160 together with the guide image.
[0115] In addition, when the user's finger 1500 is not placed according to a predetermined position or a predetermined pressure, a guide speech for correcting the position of the finger 1500 or the contact pressure may be output.
[0116] The processor 140 may identify whether the user's finger 1500 is located in a predetermined area on the display 110 based on the output of the touch sensor 130, and output guide information based on a result of the identification using at least one of a visual method, an auditory method, or a tactile method. In the present example, a case of outputting with an auditory method will be described.
[0117] The predetermined area may be an area in which a finger 1500 needs to be positioned for PPG signal measurement, and has a predetermined size or a predetermined shape at a predetermined position. For example, the predetermined area may be defined as a circular or rectangular area having a predetermined size with respect to the center of the front camera 121.
[0118] The output of the touch sensor 130 indicates the position at which an object is in contact with the touch sensor 130. Accordingly, the processor 140 may identify the position of the finger 1500 being in contact with the touch sensor 130 based on an output of the touch sensor 130. Alternatively, the processor 140 may identify the position of the finger 1500 in contact with the touch sensor 130 based on an output of the front camera 121 (i.e., a finger image captured by the front camera 121). In particular, when the resolution of the front camera 121 is higher than the resolution of the touch sensor 130, the accuracy of position identification may be improved using the output of the front camera 121.
[0119] Information about the above-described predetermined area may be stored in the memory 150, and the processor 140 may compare the finger position, for which the output of the touch sensor 130 is provided, with the information about the predetermined area stored in the memory 150 to identify whether the finger 1500 is located in the predetermined area.
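The comparison just described, checking the touched position against the stored predetermined area, might look like the sketch below, assuming the circular area of paragraph [0117]. The coordinates and radius are hypothetical values, not taken from the patent.

```python
import math

def finger_in_area(touch_xy, area_center, area_radius):
    """Report whether the touch position from the touch sensor lies
    inside the predetermined circular area around the front camera.

    touch_xy and area_center are (x, y) in display coordinates;
    area_radius is the allowed distance from the camera center.
    """
    dx = touch_xy[0] - area_center[0]
    dy = touch_xy[1] - area_center[1]
    return math.hypot(dx, dy) <= area_radius

camera_center = (540, 80)  # hypothetical lens position, in pixels
print(finger_in_area((545, 85), camera_center, 40))   # → True
print(finger_in_area((700, 300), camera_center, 40))  # → False
```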
[0120] Referring to
[0121] Alternatively, guide information may be provided in a visual manner by outputting information, which is output as a guide speech, in the form of text, and upon identifying that the finger 1500 is not located in the predetermined area, vibration may be generated in the mobile device 100 to provide guide information in a tactile method.
[0122] On the other hand, when the finger 1500 comes in contact with the lens of the front camera 121 serving as a light receiver with a suitable pressure, a more accurate PPG signal may be measured. When the pressure of the finger 1500 pressing the lens of the front camera 121 is weak, a PPG signal having a shape as shown in (a) of
[0123] Among the three PPG signals shown in
[0124] The processor 140 may determine the contact pressure of the finger based on at least one of the output of the touch sensor 130 or the output of the front camera 121, and may output guide information for guiding the finger contact pressure to fall within a predetermined range using at least one of a visual manner, an auditory manner, or a tactile manner.
[0125] When the touch sensor 130 is implemented in a resistive membrane method and thus is capable of directly measuring the finger contact pressure, the processor 140 may directly identify the finger contact pressure based on the output of the touch sensor 130.
[0126] When the touch sensor 130 is implemented in a non-resistive membrane method, and thus is incapable of directly measuring the contact pressure of the finger, the processor 140 may indirectly identify the contact pressure of the finger 1500 based on the contact area between the finger 1500 and the touch sensor 130. For example, the contact pressure may be identified as being greater when the contact area between the finger 1500 and the touch sensor 130 is larger, and as being weaker when the contact area is smaller.
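The indirect area-to-pressure mapping can be sketched as follows. The area thresholds are hypothetical illustration values; an actual device would calibrate them per sensor and per user.

```python
def estimate_contact_pressure(contact_area_mm2: float,
                              area_lo: float = 30.0,
                              area_hi: float = 120.0) -> str:
    """Map the measured contact area to a coarse pressure level, assuming
    a larger contact patch means the finger presses harder (used when the
    touch sensor cannot measure pressure directly)."""
    if contact_area_mm2 < area_lo:
        return "weak"
    if contact_area_mm2 > area_hi:
        return "strong"
    return "suitable"
```

The returned level would then drive the guide output ("press harder" / "press more gently") described above.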
[0127] The contact area between the finger 1500 and the touch sensor 130 may be identified based on the output of the touch sensor 130, or may be identified based on the output of the front camera 121 (i.e., the finger image captured by the front camera 121).
[0128] As a result of the determination, when it is identified that the contact pressure of the finger 1500 is weaker than a predetermined range of pressures as shown in
[0129] Alternatively, text having the same content as that of the guide speech may be displayed on the display 110 to output guide information in a visual manner, or vibration may be generated in the mobile device 100 to output guide information in a tactile manner.
[0130] For example, the mobile device 100 may guide the contact pressure of the finger 1500 after first guiding the position of the finger. That is, as described above, the processor 140 may identify the position of the user's finger 1500 based on the output of the touch sensor 130 or the output of the front camera 121. When the finger 1500 is not located in the predetermined area, the processor 140 may output information guiding the finger to the predetermined area. Then, upon re-identification showing that the finger 1500 is located in the predetermined area, the processor 140 may identify the finger contact pressure and output information for guiding the contact pressure according to the identification result.
[0131] As another example, the contact pressure of the finger 1500 may be guided first, or the position of the finger and the contact pressure of the finger may be simultaneously guided.
[0132]
[0133] Referring to
[0134] The processor 140 may determine whether the mobile device 100 moves based on the output of the motion sensor 170, and may output guide information related to the motion of the mobile device 100.
[0135] When the processor 140 identifies that the mobile device 100 has moved in the PPG measurement mode, the processor 140 may output guide information for indicating that a motion of the mobile device 100 is not allowed, such that distortion of the PPG signal due to the motion of the mobile device 100 is prevented. In the example of
[0136] The processor 140 may identify the motion of the user's finger 1900 based on the output of the touch sensor 130 or the output of the front camera 121. The motion of the finger 1900 may include at least one of a change in position of the finger 1900 or a change in a contact pressure of the finger 1900. Accordingly, the processor 140 may identify the change in position of the finger 1900 or the change in contact pressure of the finger 1900 based on the output of the touch sensor 130 or the output of the front camera 121.
[0137] When the processor 140 identifies that the user's finger 1900 has moved based on the output of the touch sensor 130 or the output of the front camera 121, the processor 140 may output guide information for indicating that a motion is not allowed using at least one of a visual method, an auditory method, or a tactile method similar to the above.
[0138] On the other hand, when the motion of the finger 1900 occurs in response to the guide information for preventing the motion of the finger 1900 being output during a PPG measurement, the processor 140 may perform a distortion preventive process to prevent distortion due to the motion of the finger 1900 using various components provided in the mobile device 100. The distortion caused by the motion of the finger 1900 may include at least one of noise or artifacts appearing in the PPG signal.
[0139] For example, when a finger motion has occurred in response to the guide information being output a predetermined number of times, the processor 140 may perform the distortion preventive process, which will be described below.
[0140] Alternatively, the output of the guide information for the motion may be omitted, and the distortion preventive process, which will be described below, may be performed.
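The switch from repeated guidance to the distortion preventive process can be sketched with a simple counter. The maximum number of guide outputs is a hypothetical illustration value.

```python
class MotionGuide:
    """Count how many times motion guide information has been output;
    after a predetermined number of repetitions, switch to the
    distortion preventive process instead of guiding again."""

    def __init__(self, max_guides: int = 3):
        self.max_guides = max_guides
        self.count = 0

    def on_motion_detected(self) -> str:
        """Called whenever finger motion is identified during measurement."""
        if self.count < self.max_guides:
            self.count += 1
            return "output_guide"
        return "distortion_preventive_process"
```

Setting `max_guides = 0` models the alternative in which guide output is omitted entirely and the distortion preventive process is performed immediately.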
[0141] FIG. 20 is a diagram illustrating an operation performed by a mobile device to correct distortion caused by a change in position of a finger according to an embodiment.
[0142] When assuming a frame area FA of the front camera 121 shown in
[0143] In the example, the frame area FA of the front camera 121 refers to an area included in a frame image captured by the front camera 121 (i.e., a coverage of the front camera 121). The frame area FA may be an area set assuming a case in which the user's finger is in contact with the lens of the front camera 121.
[0144] In order to prevent distortion due to the motion of the finger, the processor 140 may track the motion of the finger based on the output of the touch sensor 130 or the output of the front camera 121, and determine at least one pixel to be used for acquiring a PPG signal from a finger image based on the current position of the finger.
[0145] As described above, a plurality of frame images captured by the front camera 121 may be used to acquire the PPG signal, and the plurality of frame images may be captured according to a set frame rate and transmitted to the processor 140.
[0146] The processor 140 may extract the PPG signal from at least one pixel corresponding to the current position of the finger in the transmitted frame image. That is, when the finger is located in the first area PPG_A1, the processor 140 may extract the PPG signal from the pixel in the first area PPG_A1, and when the finger moves to be located in the second area PPG_A2, the processor 140 may extract the PPG signal from the pixel in the second area PPG_A2. Accordingly, when the finger moves, the PPG signal may be acquired from the same part of the finger.
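The per-frame extraction that follows the finger can be sketched as below, assuming a grayscale frame and a tracked finger position; the neighborhood size is a hypothetical illustration value.

```python
import numpy as np

def extract_ppg_sample(frame: np.ndarray, finger_xy: tuple, half: int = 2) -> float:
    """Average a small pixel neighborhood centered on the tracked finger
    position, so each PPG sample comes from the same part of the finger
    even when the finger moves between frames."""
    x, y = finger_xy
    h, w = frame.shape[:2]
    y0, y1 = max(0, y - half), min(h, y + half + 1)
    x0, x1 = max(0, x - half), min(w, x + half + 1)
    return float(frame[y0:y1, x0:x1].mean())

# Two frames with the "finger" region at different positions yield the
# same sample when the extraction window follows the tracked position.
frame1 = np.zeros((40, 40)); frame1[10:15, 10:15] = 200.0
frame2 = np.zeros((40, 40)); frame2[20:25, 20:25] = 200.0
s1 = extract_ppg_sample(frame1, (12, 12))
s2 = extract_ppg_sample(frame2, (22, 22))
```

Running this per frame at the set frame rate produces the sample sequence from which the PPG waveform is formed.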
[0147] The PPG signal may be extracted from a single pixel or may be extracted from multiple pixels. When extracting a PPG signal from multi-pixels, the processor 140 may remove a motion component from a pixel value.
[0148] For example, an input signal intensity (input intensity: I) may be expressed as a function I (t, x, y) of time t and position (x, y) on a two-dimensional plane, and may be decomposed into an amplitude component and a motion component using a Gaussian distribution as shown in Equation (1) below.
[0149] In Equation (1), A(t) represents the amplitude component, and the component following the amplitude component represents the motion component. Therefore, when the motion component is removed and only the amplitude component is used, a PPG signal from which motion artifacts have been removed may be acquired.
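A plausible form of this decomposition, assuming the motion component is a Gaussian spatial profile centered at the time-varying finger position $(x_0(t), y_0(t))$ with spread $\sigma$ (the exact form of Equation (1) may differ), is:

```latex
I(t, x, y) = A(t)\,\exp\!\left(-\frac{\left(x - x_0(t)\right)^2 + \left(y - y_0(t)\right)^2}{2\sigma^2}\right)
```

Here the prefactor $A(t)$ carries the pulsatile (PPG) information, while the exponential term varies only with the finger position and can be estimated and discarded.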
[0150]
[0151] Referring to
[0152] Referring to
[0153] Accordingly, when acquiring the PPG signal, the processor 140 may use the multi-pixels of the front camera 121 to reduce noise appearing in the PPG signal due to the motion of a finger.
[0154] Alternatively, whether to use a single pixel or multi-pixels may be determined according to the motion of a finger. The processor 140 may identify the degree to which a finger moves based on at least one of the output of the touch sensor 130 or the output of the front camera 121, and when the degree to which the finger moves is less than a predetermined threshold level, the PPG signal may be acquired from a single pixel. When the degree to which the finger moves is equal to or greater than the predetermined threshold level, the PPG signal may be acquired from multi-pixels.
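The single-pixel versus multi-pixel decision can be sketched as a simple threshold rule; the threshold value and the motion-level scale are hypothetical illustration choices.

```python
def choose_pixel_mode(motion_level: float, threshold: float = 0.5) -> str:
    """Pick the PPG extraction mode from an estimated finger-motion level:
    below the threshold a single pixel suffices; at or above it, multiple
    pixels are used so the motion component can be removed."""
    return "single" if motion_level < threshold else "multi"
```

The `motion_level` input would come from the touch sensor or the front camera, e.g. as the displacement of the tracked finger position between frames.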
[0155]
[0156] Distortion may occur in the PPG signal when the contact pressure of the finger changes during measurement of the PPG signal. In order to prevent distortion due to a change in contact pressure of a finger, the processor 140 may control at least one of a brightness or a size of the emission area EA of the display 110.
[0157] The processor 140 may identify a change in the contact pressure of the finger based on at least one of an output of the touch sensor 130 or an output of the front camera 121. A method of identifying the contact pressure is the same as described above.
[0158] When it is identified that the finger contact pressure has changed during measurement of the PPG signal, the processor 140 may control at least one of the brightness or the size of the emission area EA of the display 110.
[0159] For example, in response to the contact pressure of the finger decreasing during measurement of the PPG signal, the processor 140 may increase the size of the emission area EA of the display 110, or may increase the brightness of the emission area EA of the display 110, as shown in
[0160] Conversely, in response to the contact pressure of the finger increasing during measurement of the PPG signal, the processor 140 may reduce the size of the emission area EA of the display 110 or the brightness of the emission area EA of the display 110, as shown in
[0161] As described above, the size or brightness of the emission area EA of the display 110 may be dynamically changed according to a change in the contact pressure of the finger, such that distortion appearing in the PPG signal may be prevented.
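The dynamic adjustment described above can be sketched as follows. The step size and the normalized [0, 1] scale for brightness and emission-area size are hypothetical illustration choices.

```python
def adjust_emission(brightness: float, size: float, pressure_change: float,
                    step: float = 0.1) -> tuple:
    """Adjust the emission area EA when the finger contact pressure changes
    during measurement: pressure decrease -> brighter and/or larger,
    pressure increase -> dimmer and/or smaller. Values clamp to [0, 1]."""
    def clamp(v: float) -> float:
        return max(0.0, min(1.0, v))
    if pressure_change < 0:      # finger pressing more weakly
        return clamp(brightness + step), clamp(size + step)
    if pressure_change > 0:      # finger pressing harder
        return clamp(brightness - step), clamp(size - step)
    return brightness, size
```

Calling this on every identified pressure change keeps the amount of light received through the finger roughly constant, which is the stated goal of the adjustment.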
[0162] Hereinafter, a method of controlling a mobile device according to an embodiment will be described. In performing the method of controlling the mobile device according to the embodiment, the above-described mobile device 100 may be used. Accordingly, the contents described above with reference to
[0163]
[0164] Referring to
[0165] Guide information for guiding the position of the finger may be output as shown in
[0166] When the user's finger is placed on the lens of the front camera 121 according to the output guide information, the touch sensor 130 in an area adjacent to the front camera 121 may come into contact with the user's finger. Accordingly, the processor 140 may identify at least one of the position or the contact pressure of the finger based on the output of the touch sensor 130.
[0167] In operation 330, the guide information may be output based on the identification result, and the outputting of the guide information may be achieved using at least one of a visual method, an auditory method, or a tactile method.
[0168] In operation 340, when the user's finger is placed at a predetermined position with a predetermined range of pressures, the processor 140 may acquire a PPG signal from a finger image captured by the front camera 121.
[0169] In order to acquire the PPG signal, the display 110 may emit light of a specific wavelength from an area (an emission area) corresponding to the front camera 121. The wavelength of light emitted from the emission area EA may be determined based on biometric information to be measured. In this case, light of a single wavelength or multi-wavelengths may be used according to the type of biometric information.
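The selection of single- or multi-wavelength emission by biometric type can be illustrated with a purely hypothetical mapping (green light is commonly used for pulse measurement in the PPG literature; which wavelengths an actual device emits would depend on its display and measurement algorithm):

```python
# Hypothetical mapping for illustration only.
EMISSION_COLORS = {
    "heart_rate": ["green"],           # single wavelength for pulse rate
    "oxygen_saturation": ["red", "blue"],  # multi-wavelength estimation
}

def emission_colors(biometric: str) -> list:
    """Return the display emission color(s) for the requested biometric."""
    return EMISSION_COLORS[biometric]
```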
[0170] The method of acquiring the PPG signal from the finger image is the same as described above in the embodiment of the mobile device 100.
[0171] When the PPG signal is acquired, the processor 140 may acquire biometric information based on the acquired PPG signal, and the acquired biometric information may be provided to the user through the display 110 or the speaker 160.
[0172]
[0173] Referring to
[0174] In response to the finger not being located in the predetermined area (“No” in operation 322), in operation 331, the processor 140 may output guide information for guiding the finger to be located in the predetermined area. The guide information may be visually output through the display 110, audibly output through the speaker 160, or tactilely output by generating vibration in the mobile device 100.
[0175] The identifying of the position and the outputting of the guide information may be repeatedly performed until the finger is located in the predetermined area, and in response to the finger being located in the predetermined area (“Yes” in operation 322), in operation 340, a PPG signal may be acquired from a finger image captured by the front camera 121 as described above.
[0176]
[0177] Referring to
[0178] In response to the finger contact pressure not being included in the predetermined range (“No” in operation 324), in operation 332, the processor 140 may output guide information for guiding the finger contact pressure to fall within the predetermined range. The guide information may be visually output through the display 110, audibly output through the speaker 160, or tactilely output by generating vibrations in the mobile device 100.
[0179] The identifying of the contact pressure and the outputting of the guide information may be repeatedly performed until the finger contact pressure falls within the predetermined range, and in response to the finger contact pressure being within the predetermined range (“Yes” in operation 324), in operation 340, a PPG signal may be acquired from the finger image captured by the front camera 121 as described above.
[0180]
[0181] When the finger moves during measurement of the PPG signal, noise or artifacts due to the motion may occur. Accordingly, the method of controlling a mobile device according to an embodiment may guide the user not to move while measuring the PPG signal.
[0182] Referring to
[0183] In response to identifying that a motion has occurred (“Yes” in operation 422), in operation 430, guide information for indicating that a motion is not allowed may be output. The guide information may be visually output through the display 110, audibly output through the speaker 160, or tactilely output by generating vibration in the mobile device 100.
[0184] In response to no motion having occurred (“No” in operation 422), in operation 440, the processor 140 may acquire a PPG signal from the finger image captured by the front camera.
[0185] When a motion of the mobile device 100 has occurred during measurement of the PPG signal, distortion may occur in the PPG signal. Referring to
[0186] In response to identifying that a motion has occurred (“Yes” in operation 522), in operation 530, guide information indicating that a motion is not allowed may be output. The guide information may be visually output through the display 110, audibly output through the speaker 160, or tactilely output by generating vibration in the mobile device 100.
[0187] In response to no motion having occurred (“No” in operation 522), in operation 540, the processor 140 may acquire a PPG signal from the finger image captured by the front camera.
[0188] In the method of controlling a mobile device according to an embodiment, when the user moves during measurement of the PPG signal, the process of measuring the PPG signal may be controlled to prevent distortion of the PPG signal. Hereinafter, the control of the measurement process of the PPG signal will be described with reference to
[0189]
[0190] Referring to
[0191] In operation 630, the processor 140 may determine a pixel to be used for acquiring a PPG signal based on the current position of the finger, and, in operation 640, the processor 140 may acquire a PPG signal from the determined pixel. As described above, the acquiring of the PPG signal may include using a plurality of frame images captured by the front camera 121, in which the plurality of frame images may be captured according to a set frame rate and transmitted to the processor 140. The processor 140 may extract the PPG signal from at least one pixel corresponding to the current position of the finger in the transmitted frame image. That is, as described above with reference to
[0192] Referring to
[0193] In response to the contact pressure of the finger decreasing during measurement of the PPG signal (“Yes” in operation 721), in operation 731, the processor 140 may increase at least one of the brightness or size of the emission area EA.
[0194] In response to the contact pressure of the finger increasing during measurement of the PPG signal (“No” in operation 721, and “Yes” in operation 722), in operation 732, the processor 140 may decrease at least one of the brightness or size of the emission area EA.
[0195] In response to no change in the contact pressure (“No” in operation 722) or in response to at least one of the brightness or size of the emission area EA being adjusted according to the change in the contact pressure, in operation 740, the processor 140 may acquire a PPG signal from a finger image captured by the front camera.
[0196] According to the examples of
[0197] On the other hand, the computer program according to an embodiment may be stored in a recording medium and, in combination with the mobile device 100, may perform the operations of the processor 140 described in the embodiment of the mobile device 100 and the operations of the method of controlling the mobile device described above. The computer program may be installed in the mobile device 100 by default as described above, or may be installed by a user after the mobile device 100 is sold.
[0198] Since all operations executed by the computer program according to the embodiment are the same as those described in the embodiment of the mobile device 100 and the embodiment of the method of controlling the mobile device, descriptions thereof will be omitted herein.
[0199] According to the mobile device, the control method thereof, and the computer program stored in the recording medium in combination with the mobile device described above, a PPG signal may be measured using components provided in the mobile device, such as a display, a front camera, and a touch sensor, without adding additional equipment.
[0200] In addition, the position or contact pressure of an object may be identified based on the output of the front camera or the output of the touch sensor, and suitable guide information may be output based on the identification result, such that accurate measurement of the PPG signal may be allowed.
[0201] In addition, the user's motion may be tracked based on the output of the front camera or the output of the touch sensor, and the pixel from which the PPG signal is to be extracted may be changed, or the size or brightness of the emission area may be adjusted, such that distortion is prevented from occurring in the PPG signal.
[0202] As disclosed herein, a mobile device, a method of controlling the same, and a computer program stored in a recording medium may measure a PPG signal using a display and a camera provided in the mobile device to thereby measure a PPG signal without having additional components or equipment and acquire biometric information based on the PPG signal.
[0203] As disclosed herein, a mobile device, a method of controlling the same, and a computer program stored in a recording medium may provide guide information to a user or correct distortion due to a motion based on a position or contact pressure of a finger identified using a touch sensor or a front camera provided in the mobile device to thereby improve the accuracy and reliability of a PPG signal using a basic configuration provided in the mobile device without having additional components or equipment.
[0204] The foregoing detailed description is merely an example of the disclosure, and the aspects disclosed herein may be used through various combinations, modifications, and environments. The aspects disclosed herein may be amended or modified without departing from the scope, technical idea, or knowledge in the art. Further, it is not intended that the scope of this application be limited to these specific embodiments or to their specific features or benefits. Rather, it is intended that the scope of this application be limited solely to the claims which now follow and to their equivalents, and the appended claims should be construed as encompassing other embodiments as well.