Display control device, display control system, display control method, display control program, and recording medium
10973441 · 2021-04-13
Assignee
Inventors
CPC classification
A61B5/107
HUMAN NECESSITIES
H04N23/11
ELECTRICITY
A61B5/744
HUMAN NECESSITIES
G08B13/19691
PHYSICS
H04N23/633
ELECTRICITY
H04N7/18
ELECTRICITY
A61G12/00
HUMAN NECESSITIES
G08B13/19686
PHYSICS
G09G5/00
PHYSICS
International classification
A61B5/1171
HUMAN NECESSITIES
A61B5/00
HUMAN NECESSITIES
Abstract
A display control device (10) comprises an image acquisition component (11), an orientation estimation component (12), and a controller (13). The image acquisition component (11) acquires an infrared image (G1) sensed by infrared array sensors (21a and 21b) installed in a room. The orientation estimation component (12) estimates the orientation of a care receiver (P1) in a room (30) on the basis of the infrared image (G1) acquired by the image acquisition component (11). The controller (13) controls a display device (22) so that a dummy image (D1), which shows a simplified view of the orientation of the care receiver (P1) estimated by the orientation estimation component (12), is displayed superimposed over the infrared image (G1).
Claims
1. A display control device comprising: an image acquisition component configured to acquire image data sensed by an image sensor installed in a room; an orientation estimation component configured to estimate the orientation of a person in the room on the basis of the image data acquired by the image acquisition component; a controller configured to control a display device so that a first dummy image, which shows a simplified view of the orientation of the person estimated by the orientation estimation component, is displayed superimposed over the image data; and a determination component configured to determine a care urgency level according to an amount of flickering of an image near the person in the image data by referring to a plurality of pieces of image data continuously acquired by the image acquisition component, wherein the controller superimposes and displays the head or face portion of the first dummy image, using as a reference the position of the head or face of the person included in the image data.
2. The display control device according to claim 1, further comprising: a storage component configured to store a plurality of the first dummy images in a state of being associated with orientations of the person.
3. The display control device according to claim 1, wherein the image sensor is an infrared sensor configured to acquire a thermal image, and the determination component makes a determination according to the degree of fluctuation in the position of the thermal center of gravity in an area near the person in the thermal image continuously acquired by the infrared sensor.
4. The display control device according to claim 1, wherein the image sensor is an infrared sensor configured to acquire a thermal image, and the determination component makes a determination by detecting enlargement of a heat source in an area near the person in the thermal image continuously acquired by the infrared sensor.
5. The display control device according to claim 1, further comprising a count estimator configured to estimate the number of people included in the image data acquired by the image acquisition component.
6. The display control device according to claim 5, wherein the controller performs display control of the display device when it is determined by the count estimator that the number of people in the image data is one.
7. The display control device according to claim 1, wherein the controller changes a color of the first dummy image on the display device according to an estimation result produced by the orientation estimation component.
8. The display control device according to claim 1, wherein the controller flashes the first dummy image on the display device according to an estimation result produced by the orientation estimation component.
9. The display control device according to claim 1, wherein the controller displays the first dummy image superimposed over the image data and displays an emergency message on the display device, according to the estimation result produced by the orientation estimation component.
10. The display control device according to claim 1, wherein the orientation estimation component estimates the orientation by using detection results produced by a plurality of the image sensors, which detect the person who is in the room from different directions.
11. The display control device according to claim 1, wherein the controller displays the first dummy image, corresponding to at least one of a standing position, a seated position, a supine position, a lateral recumbent position, a prone position, and a half-sitting position of the person, superimposed over the image data.
12. The display control device according to claim 1, wherein the controller displays a second dummy image corresponding to furniture and equipment installed in the room so as to be superimposed with the locations of the furniture and the equipment in the image data.
13. A display control system, comprising: the display control device according to claim 1; and an image sensor configured to supply the image data to the image acquisition component.
14. The display control system according to claim 13, further comprising a display device whose display is controlled by the controller of the display control device.
15. The display control system according to claim 14, wherein the display device includes a display device of a host terminal used by a caregiver who cares for the person in the room, or of a portable terminal owned by the caregiver.
16. The display control system according to claim 13, wherein the image sensor includes: a first image sensor configured to detect the person from above in the room; and a second image sensor configured to detect the person from a side.
17. The display control system according to claim 13, wherein the image sensor is either an infrared array sensor or a distance sensor.
18. A display control method, comprising: acquiring image data sensed by an image sensor installed in a room; estimating the orientation of a person in the room on the basis of the acquired image data; controlling a display device so that a first dummy image, which shows a simplified view of the estimated orientation of the person, is displayed superimposed over the image data, and superimposing and displaying the head or face portion of the first dummy image, using as a reference the position of the head or face of the person included in the image data; and determining a care urgency level according to an amount of flickering of an image near the person in the image data by referring to a plurality of pieces of image data continuously acquired.
19. A non-transitory computer-readable storage medium that stores a display control program for causing a computer to execute a display control method, the display control method comprising: acquiring image data sensed by an image sensor installed in a room; estimating the orientation of a person in the room on the basis of the acquired image data; controlling a display device so that a first dummy image, which shows a simplified view of the estimated orientation of the person, is displayed superimposed over the image data, and superimposing and displaying the head or face portion of the first dummy image, using as a reference the position of the head or face of the person included in the image data; and determining a care urgency level according to an amount of flickering of an image near the person in the image data by referring to a plurality of pieces of image data continuously acquired.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
(26) A display control device 10 according to an embodiment of the present invention, and a display control system 20 comprising this device, will be described below through reference to
(27) 1. Configuration of Display Control System 20
(28) The display control system 20 according to this embodiment is, for example, a system for monitoring a care receiver in need of nursing care in his or her everyday activities, such as in a nursing home or a hospital, and allowing a caregiver to accurately recognize the state of the care receiver. As shown in
(29) Here, with the display control system 20 in this embodiment, in order to watch over the care receiver (person) P1 in a room 30, as shown in
(30) A bed 31, a wheelchair 32, and other such in-room equipment are installed in the room 30, forming the living space of the care receiver P1. A coordinate system X-Y-Z of the inside of the room is shown in
(31) The display control device 10 uses an infrared image G1 (see
(32) Here, with the display control device 10 in this embodiment, the infrared image G1 (see
(33) The infrared array sensors 21a and 21b are designed so that the portion of a person's face in the captured infrared image G1 is made up of 2 to 6 pixels. Consequently, an individual cannot be identified even with individual recognition technology using a portion of a face in an image, for example, which allows someone to watch over the care receiver P1 while still protecting the privacy of the care receiver.
(34) As shown in
(35) As shown in
(36) As shown in
(37) As shown in
(38) The infrared image data acquired by the infrared array sensors 21a and 21b is transmitted to the image acquisition component 11 of the display control device 10.
(39) The infrared image data captured by the infrared array sensors 21a and 21b is such that the higher the pixel value, the higher the temperature of the imaged region, and the lower the pixel value, the lower the temperature. That is, since an area where a person is located will have a higher temperature, the value of pixels in an area where a person was imaged will be higher. Thus, an area where a person is present (a region of higher temperature) can be identified from the infrared image data by finding an area with higher pixel values.
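For illustration, the identification described in paragraph (39) can be sketched as follows. The 16×16 frame size matches the coordinate ranges used later in this description, while the temperature threshold and sample values are illustrative assumptions.

```python
# Minimal sketch: locate the warm region (candidate person area) in a
# 16x16 thermal frame by thresholding pixel values. The threshold and
# the sample temperatures are assumptions, not taken from the patent.

def warm_pixels(frame, threshold=30.0):
    """Return (x, y) coordinates of pixels at or above the threshold."""
    return [(x, y)
            for y, row in enumerate(frame)
            for x, v in enumerate(row)
            if v >= threshold]

def bounding_box(pixels):
    """Bounding box (x_min, y_min, x_max, y_max) of the warm area."""
    if not pixels:
        return None
    xs = [p[0] for p in pixels]
    ys = [p[1] for p in pixels]
    return (min(xs), min(ys), max(xs), max(ys))

# Example: a cool 16x16 room with a warm 2x3 patch where a person lies.
frame = [[20.0] * 16 for _ in range(16)]
for y in range(4, 7):
    for x in range(5, 7):
        frame[y][x] = 33.0

print(bounding_box(warm_pixels(frame)))  # (5, 4, 6, 6)
```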
(40) The display device 22 is, for example, a monitor of a host computer 22a installed in a nursing facility (see
(41) The host computer 22a is installed in a caregiver station in a nursing home, for example. This allows a caregiver to remotely check on a number of care receivers P1.
(42) The portable terminal 22b encompasses a mobile phone, smart phone, tablet terminal, or the like owned by a caregiver.
(43) Consequently, even if the caregiver is in the room of another care receiver P1, or is moving around somewhere, if an image is transmitted to the portable terminal 22b, in the event of an emergency the caregiver can rush straight to the care receiver P1.
(44) 2. Configuration of Display Control Device 10
(45) As shown in
(46) The image acquisition component 11 acquires from the infrared array sensors 21a and 21b the infrared image data captured by the infrared array sensors 21a and 21b.
(47) The orientation estimation component 12 uses the infrared image data acquired by the image acquisition component 11 to estimate the orientation of the care receiver P1 in the room 30.
(48) The orientation estimation done by the orientation estimation component 12 is accomplished using a known method (see Patent Literature 1) involving a transition model (described below). The orientations of the care receiver P1 estimated by the orientation estimation component 12 include, for example, a standing position, a seated position, a supine position, a lateral recumbent position, a prone position, a half-sitting position, and so forth.
(49) The controller 13 selects a dummy image (first dummy image) D1 (see
(50) Here, the dummy image D1 is an image showing a simplified view of the appearance and shape of the care receiver P1 whose orientation has been estimated, and images corresponding to a plurality of types of orientation are prepared. In this embodiment, dummy images D1 are prepared for the respective directions needed to superimpose them over images of the care receiver P1 captured from above and from the side.
(51) The display control of the display device 22 by the controller 13 will be described below in greater detail.
(52) The storage component 14 stores the infrared image data acquired by the image acquisition component 11, in association with a plurality of dummy images D1 corresponding to a plurality of types of orientation of the care receiver P1 estimated by the orientation estimation component 12.
(53) The dummy images D1 corresponding to the orientation of the care receiver P1 that are stored in the storage component 14 include, for example, a standing position, a seated position, a supine position, a lateral recumbent position, a prone position, and a half-sitting position.
(54) These dummy images D1 are stored in the storage component 14 in a number of variations corresponding to the infrared image G1 in which the room 30 is imaged from above and the infrared image G1 imaged from the side.
(55) For example, with a standing position, a dummy image D1 to be superimposed over the infrared image G1 captured from above, and a dummy image D1 to be superimposed over the infrared image G1 captured from the side may be provided. Similarly, with other orientations, dummy images D1 corresponding to the infrared image G1 from above and to the infrared image G1 from the side are stored in the storage component 14.
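The storage scheme in paragraphs (52) to (55) amounts to one dummy image per pair of orientation and viewing direction. A minimal sketch, in which the placeholder file names are assumptions:

```python
# One dummy image per (orientation, viewing direction) pair, as
# described for the storage component 14. File names are illustrative.

ORIENTATIONS = ("standing", "seated", "supine",
                "lateral_recumbent", "prone", "half_sitting")
VIEWS = ("above", "side")

dummy_images = {(o, v): f"dummy_{o}_{v}.png"
                for o in ORIENTATIONS for v in VIEWS}

def select_dummy_image(orientation, view):
    """Look up the dummy image for an estimated orientation and view."""
    return dummy_images[(orientation, view)]

print(select_dummy_image("standing", "above"))  # dummy_standing_above.png
```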
(56) The determination component 15 determines how urgently the care receiver P1 in the room 30 needs care, on the basis of the infrared image data acquired by the image acquisition component 11.
(57) More specifically, the determination component 15 refers, for example, to a plurality of infrared images G1 that have been acquired continuously (a moving picture), detects changes in flickering in the image, the expansion of a heat source, or the like, and thereby determines how urgently care is needed.
(58) For instance, if the orientation of the care receiver P1 estimated by the orientation estimation component 12 is a seated position, and there is little change in flickering in the plurality of images continuously acquired by the image acquisition component 11, there is the risk that the care receiver P1 has slumped over while in a seated position.
(59) Conversely, if the orientation of the care receiver P1 is a seated position, and there is a large amount of flicker in the plurality of images continuously acquired by the image acquisition component 11, it is presumed that the care receiver P1 is performing some operation, such as changing clothes or operating a portable terminal, while seated.
(60) Thus, if there is almost no flickering over a long period while the same orientation is maintained in the plurality of infrared images G1 that are continuously acquired, the determination component 15 determines that there is an urgent need for care, and the controller 13 causes the display device 22 to give a warning display.
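The flicker-based determination in paragraphs (57) to (60) can be sketched as follows, assuming a simple statistic (the sum of absolute frame-to-frame pixel differences near the person) and an illustrative threshold; neither the statistic nor the threshold value is specified in this description.

```python
# Hedged sketch of the flicker-based urgency check: sum absolute
# frame-to-frame pixel changes over a window of consecutive thermal
# frames; a persistently small total while the same orientation is
# held suggests an urgent need for care. Threshold is an assumption.

def flicker_amount(frames):
    """Total absolute pixel change across consecutive frames."""
    total = 0.0
    for prev, cur in zip(frames, frames[1:]):
        for row_p, row_c in zip(prev, cur):
            total += sum(abs(a - b) for a, b in zip(row_p, row_c))
    return total

def needs_urgent_care(frames, same_orientation, threshold=1.0):
    return same_orientation and flicker_amount(frames) < threshold

still = [[[25.0] * 4 for _ in range(4)]] * 5                 # no change
moving = [[[25.0 + t] * 4 for _ in range(4)] for t in range(5)]

print(needs_urgent_care(still, True))    # True
print(needs_urgent_care(moving, True))   # False
```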
(61) Also, if the orientation of the care receiver P1 estimated by the orientation estimation component 12 is a recumbent position, and a heat source in the plurality of images continuously acquired by the image acquisition component 11 has expanded, there is a high probability that the care receiver P1 is vomiting (see
(62) Thus, in this case, the determination component 15 determines that there is an urgent need for care, and the controller 13 causes the display device 22 to give a warning display.
(63) Here, examples of a warning display include flashing of the dummy image D1 on the display device 22, changing the color of the display, and giving a warning that combines text information. When the color is changed, for example, it may be changed from the normal display color of the dummy image D1 to red. Also, a warning display that combines text information may include, for example, a text display such as “care receiver needs nursing care!” A warning sound such as a beep may also be used in conjunction with the display.
(64) The count estimation component 16 uses the infrared image data acquired by the image acquisition component 11 to estimate the number of people in the room 30.
(65) More specifically, the count estimation component 16 senses a high-temperature portion presumed to be a human head included in the infrared image G1, and counts the number of such portions to estimate the number of people in the room 30.
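The count estimation in paragraph (65) can be sketched as counting connected clusters of high-temperature pixels (presumed heads); the 4-neighbour flood fill and the temperature threshold are illustrative assumptions.

```python
# Sketch of the head-count estimate: count connected clusters of
# high-temperature pixels in a thermal frame. Threshold is assumed.

def count_warm_clusters(frame, threshold=32.0):
    h, w = len(frame), len(frame[0])
    seen = set()
    clusters = 0
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and (x, y) not in seen:
                clusters += 1
                stack = [(x, y)]              # flood-fill one cluster
                while stack:
                    cx, cy = stack.pop()
                    if (cx, cy) in seen:
                        continue
                    seen.add((cx, cy))
                    for nx, ny in ((cx + 1, cy), (cx - 1, cy),
                                   (cx, cy + 1), (cx, cy - 1)):
                        if (0 <= nx < w and 0 <= ny < h
                                and frame[ny][nx] >= threshold):
                            stack.append((nx, ny))
    return clusters

frame = [[20.0] * 8 for _ in range(8)]
frame[1][1] = frame[1][2] = 35.0              # first head (two pixels)
frame[5][6] = 35.0                            # second head
print(count_warm_clusters(frame))             # 2
```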
(66) The number of people estimated to be in the room 30 by the count estimation component 16 is sent to the controller 13. If the number of people in the room 30 estimated by the count estimation component 16 is two or more, the controller 13 performs no display control of the display device 22.
(67) That is, with the display control device 10 in this embodiment, if the number of care receivers P1 residing in the room 30 is usually just one, for example, it is assumed that there is less need for watching over the room if there is someone in it other than the care receiver P1 (such as a caregiver, a close relative, or a friend).
(68) Thus, with the display control device 10 in this embodiment, when there is someone other than the care receiver P1 in the room 30, that is, when the count estimated by the count estimation component 16 is two or more, no display control is performed on the display device 22.
(69) 3. Display Control
(70) 3-1. Orientation Estimation
(71) With the display control device 10 in this embodiment, as described above, the image acquisition component 11 acquires infrared image data captured by the infrared array sensors 21a and 21b. The orientation estimation component 12 then uses the infrared image G1 taken of the room 30 from above and the infrared image G1 taken from the side (both acquired by the image acquisition component 11) to estimate the orientation of the care receiver P1.
(72) Estimation of the orientation of the care receiver P1 by the orientation estimation component 12 is performed by the following procedure.
(73) That is, the orientation estimation component 12 performs orientation estimation by using examination information that includes region information representing an inspection region of the infrared image data (from above and from the side) at each orientation in a transition model of orientation of the human body, and reference value information representing a reference value for determining whether a transition has been made to each orientation.
(74) The transition model information and examination information are used to estimate the orientation of the body at the current point in time.
(75)
(76) The region RV1 is the total region of the infrared image data, that is, a region in the range of 0≤X≤15, 0≤Y≤15.
(77) The region RV2 is the region a specific width (3 pixels) from a position adjacent to the boundary in the lengthwise direction of the region corresponding to the bed 31 (see
(78) The region RV3 is the region corresponding to the bed 31, that is, a region in the range of 3≤X≤6, 3≤Y≤10.
(79) The region RV4 is the region excluding the region RV3, and is made up of four regions. The first region is a region in the range of 0≤X≤2, 0≤Y≤15. The second region is a region in the range of 7≤X≤15, 0≤Y≤15. The third region is a region in the range of 3≤X≤6, 0≤Y≤2. The fourth region is a region in the range of 3≤X≤6, 11≤Y≤15.
(80)
(81) The region RH1 is the entire region of the second image data, that is, a region in the range of 0≤X≤15, 0≤Z≤15.
(82) The range in the vertical direction (Z direction) of the region RH2 is the total range. The range in the horizontal direction (X direction) of the region RH2 is a range of a specific width (3 pixels) from a position adjacent to the range corresponding to the bed 31 (11≤X≤15). Therefore, the region RH2 is a region in the range of 8≤X≤10, 0≤Z≤15.
(83) The range in the vertical direction (Z direction) of the region RH3 is a range of a specific width (5 pixels) from a position adjacent to the upper boundary of the range corresponding to the bed 31. The range in the horizontal direction (X direction) of the region RH3 is a range the same as the range corresponding to the bed 31. Therefore, the region RH3 is a region in the range of 11≤X≤15, 5≤Z≤9.
(84) The range in the vertical direction (Z direction) of the region RH4 is a range of a specific width (three pixels) from a position adjacent to the upper boundary of the range corresponding to the bed 31. The range in the horizontal direction (X direction) of the region RH4 is a range the same as the range corresponding to the bed 31. Therefore, the region RH4 is a region in the range of 11≤X≤15, 7≤Z≤9.
(85) The range in the horizontal direction of the region RH5 is a range obtained by excluding the range corresponding to the bed 31 from the entire range. The range in the vertical direction of the region RH5 is a range of a specific width (six pixels) upward from the lowest position (Z=15). Therefore, the region RH5 is a region in a range of 0≤X≤10, 10≤Z≤15.
(86) The range in the horizontal direction of the region RH6 is a range obtained by excluding the range corresponding to the bed 31 from the entire range. The range in the vertical direction of the region RH6 is a range of a specific width (three pixels) upward from a specific position (Z=12). Therefore, the region RH6 is a region in a range of 0≤X≤10, 10≤Z≤12.
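The inspection regions in paragraphs (76) to (86) can be written down directly as pixel ranges and tested for membership. The ranges below follow the text; RV2's exact range is truncated in the description above, so it is omitted here rather than guessed.

```python
# Inspection regions as lists of inclusive rectangles. Top-view
# rectangles are (x_min, x_max, y_min, y_max); side-view rectangles
# are (x_min, x_max, z_min, z_max). RV2 is omitted because its exact
# range is truncated in the source text.

REGIONS = {
    # top view (X, Y)
    "RV1": [(0, 15, 0, 15)],                  # whole frame
    "RV3": [(3, 6, 3, 10)],                   # bed
    "RV4": [(0, 2, 0, 15), (7, 15, 0, 15),    # everything except the
            (3, 6, 0, 2), (3, 6, 11, 15)],    # bed, as four rectangles
    # side view (X, Z)
    "RH1": [(0, 15, 0, 15)],
    "RH2": [(8, 10, 0, 15)],
    "RH3": [(11, 15, 5, 9)],
    "RH4": [(11, 15, 7, 9)],
    "RH5": [(0, 10, 10, 15)],
    "RH6": [(0, 10, 10, 12)],
}

def in_region(name, a, b):
    """True if the pixel (a, b) lies inside the named region."""
    return any(a0 <= a <= a1 and b0 <= b <= b1
               for a0, a1, b0, b1 in REGIONS[name])

print(in_region("RV3", 4, 5))   # True: on the bed
print(in_region("RV4", 4, 5))   # False: RV4 excludes the bed
```

As a check on paragraph (79), RV3 and RV4 partition the full frame: every top-view pixel falls in exactly one of the two.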
(87)
(88) The care receiver P1 (the person being monitored) is not in the room 30 in the initial state (A).
(89) The orientation that comes after the orientation (A) is either that the person is in the room (B) or the original orientation (A).
(90) The orientation that comes after the orientation (B) is either standing (C) next to the bed 31, fallen (X), not in the room (A), or the original orientation (B).
(91) The orientation that comes after the orientation (C) is either lying on the bed 31 (D), sitting on the bed 31 (E), sitting on the end of the bed 31 (F), fallen (X), in the room (B), or the original orientation (C).
(92) The orientation that comes after the orientation (D) is either sitting on the bed 31 (E), sitting on the end of the bed 31 (F), standing next to the bed 31 (C), fallen from the bed 31 (first pattern) (Y1), or the original orientation (D).
(93) The orientation that comes after the orientation (E) is either lying on the bed 31 (D), sitting on the end of the bed 31 (F), standing next to the bed 31 (C), fallen from the bed 31 (first pattern) (Y1), or the original orientation (E).
(94) The orientation that comes after the orientation (F) is either lying on the bed 31 (D), sitting on the bed 31 (E), standing next to the bed 31 (C), fallen from the bed 31 (second pattern) (Y2), or the original orientation (F).
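The transition model in paragraphs (88) to (94) can be recorded as an adjacency map, with only the listed successors (including staying in the same orientation) considered from each state:

```python
# Orientation transition model from (88)-(94). Keys are current
# orientations; values are the permitted next orientations.

TRANSITIONS = {
    "A": {"A", "B"},                      # not in the room
    "B": {"B", "C", "X", "A"},            # in the room
    "C": {"C", "D", "E", "F", "X", "B"},  # standing next to the bed
    "D": {"D", "E", "F", "C", "Y1"},      # lying on the bed
    "E": {"E", "D", "F", "C", "Y1"},      # sitting on the bed
    "F": {"F", "D", "E", "C", "Y2"},      # sitting on the end of the bed
}

def can_transition(cur, nxt):
    """True if the model permits moving from cur to nxt."""
    return nxt in TRANSITIONS.get(cur, set())

print(can_transition("D", "Y1"))  # True: fall from bed, first pattern
print(can_transition("A", "X"))   # False: cannot fall while absent
```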
(95)
(96) In order to determine whether or not there has been a transition to absence (A) in the room 30, the infrared array sensors 21a and 21b calculate the sum of the statistical quantity of the region RV1 of the infrared image data captured from above, plus the statistical quantity of the region RH1 of the infrared image data captured from the side. When this sum of the statistical quantities is at or above a reference value THA, it is determined that there has been a shift to the orientation (A).
(97) In order to determine whether or not there has been a transition to being in the room 30 (B), the infrared array sensors 21a and 21b calculate the sum of the statistical quantity of the region RV1 of the infrared image data captured from above, plus the statistical quantity of the region RH1 of the infrared image data captured from the side. When this sum of the statistical quantities is at or above a reference value THB, it is determined that there has been a shift to the orientation (B).
(98) In order to determine whether or not there has been a transition to standing next to the bed 31 (C), the infrared array sensors 21a and 21b calculate the sum of the statistical quantity of the region RV2 of the infrared image data captured from above, plus the statistical quantity of the region RH2 of the infrared image data captured from the side. When this sum of the statistical quantities is at or above a reference value THC, it is determined that there has been a shift to the orientation (C).
(99) In order to determine whether or not there has been a transition to lying on the bed 31 (D), the infrared array sensors 21a and 21b calculate the sum of the statistical quantity of the region RV3 of the infrared image data captured from above, plus the statistical quantity of the region RH4 of the infrared image data captured from the side. When this sum of the statistical quantities is at or above a reference value THD, it is determined that there has been a shift to the orientation (D).
(100) In order to determine whether or not there has been a transition to sitting on the bed 31 (E), the infrared array sensors 21a and 21b calculate the sum of the statistical quantity of the region RV3 of the infrared image data captured from above, plus the statistical quantity of the region RH3 of the infrared image data captured from the side. When this sum of the statistical quantities is at or above a reference value THE, it is determined that there has been a shift to the orientation (E).
(101) In order to determine whether or not there has been a transition to sitting on the end of the bed 31 (F), the infrared array sensors 21a and 21b calculate the sum of the statistical quantity of the region RV3 of the infrared image data captured from above, plus the statistical quantity of the region RH2 of the infrared image data captured from the side. When this sum of the statistical quantities is at or above a reference value THF, it is determined that there has been a shift to the orientation (F).
(102) In order to determine whether or not there has been a transition to having fallen (X), the infrared array sensors 21a and 21b calculate the sum of the statistical quantity of the region RV4 of the infrared image data captured from above, plus the statistical quantity of the region RH5 of the infrared image data captured from the side. When this sum of the statistical quantities is at or above a reference value THX, it is determined that there has been a shift to the orientation (X).
(103) Here, in order to determine whether or not there has been a transition to having fallen from the bed 31 (first pattern) (Y1), the infrared array sensors 21a and 21b calculate the sum of the statistical quantity of the region RV2 of the infrared image data captured from above, plus the statistical quantity of the region RH6 of the infrared image data captured from the side. When this sum of the statistical quantities is at or above a reference value THY1, it is determined that there has been a shift to the orientation (Y1).
(104) The reason why the region RH6 is used for this determination is that the location where the infrared array sensor 21b is installed is only a short distance from the location where the care receiver P1 has fallen out of the bed 31, so with the infrared image data captured from the side, the captured portion of the care receiver P1 who has fallen is at a position higher than the lowest line (Z=15).
(105) Also, in order to determine whether or not there has been a transition to having fallen from the bed 31 (second pattern) (Y2), the infrared array sensors 21a and 21b calculate the sum of the statistical quantity of the region RV2 of the infrared image data captured from above, plus the statistical quantity of the region RH6 of the infrared image data captured from the side. When this sum of the statistical quantities is at or above a reference value THY2, it is determined that there has been a shift to the orientation (Y2).
(106) The reason why the region RH6 is used for this determination is the same as with falling out of the bed 31 (first pattern).
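Each of the determinations in paragraphs (96) to (105) pairs one top-view region with one side-view region and compares the sum of their statistical quantities against a per-orientation reference value. In this sketch, the statistic used (a regional pixel sum) and the threshold values are assumptions, since the description does not define them.

```python
# Region pairs used by the transition tests in (96)-(105), and the
# threshold comparison. The regional statistic and the reference
# values are illustrative assumptions.

CHECKS = {            # orientation: (top-view region, side-view region)
    "A": ("RV1", "RH1"), "B": ("RV1", "RH1"), "C": ("RV2", "RH2"),
    "D": ("RV3", "RH4"), "E": ("RV3", "RH3"), "F": ("RV3", "RH2"),
    "X": ("RV4", "RH5"), "Y1": ("RV2", "RH6"), "Y2": ("RV2", "RH6"),
}

def region_statistic(frame, pixels):
    """Illustrative statistic: sum of pixel values inside the region."""
    return sum(frame[y][x] for x, y in pixels)

def transitioned(orientation, stat_top, stat_side, thresholds):
    """Accept the transition when the summed statistic reaches the
    orientation's reference value (THA, THB, ... in the text)."""
    return stat_top + stat_side >= thresholds[orientation]

thresholds = {"D": 50.0}                          # assumed value of THD
print(transitioned("D", 30.0, 25.0, thresholds))  # True: 55 >= 50
print(transitioned("D", 10.0, 25.0, thresholds))  # False: 35 < 50
```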
(107) 3-2. Display Control
(108) With the display control device 10 in this embodiment, as a result of the above-mentioned orientation estimation, the display device 22 is controlled so that the dummy image D1 corresponding to each orientation stored in the storage component 14 is displayed superimposed over the infrared image G1 on the basis of the estimated orientation of the care receiver P1.
(109) That is, the storage component 14 stores a plurality of the dummy images D1 corresponding to the orientations of the care receiver P1 estimated by the orientation estimation component 12. Therefore, as shown in
(110) Here, when the dummy image D1 corresponding to the estimated orientation is displayed superimposed over the infrared images G1, the superposition is based on the position of the head of the care receiver P1.
(111) More precisely, in the infrared image G1, the head and face portions are usually located higher up on the body and are usually exposed, so these portions are represented as having a higher temperature. Thus, the head and face portions of the dummy image D1 are positioned so as to be superimposed in their display on the display device 22, based on the estimated position of the head and face portions in the infrared image G1.
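The head-anchored superimposition in paragraphs (110) and (111) can be sketched as follows. Locating the head at the hottest pixel, and the dummy image's head anchor offset, are illustrative assumptions.

```python
# Sketch of the head-anchored overlay: find the hottest pixel
# (presumed head/face), then place the dummy image so that its head
# reference point lands on that pixel. Anchor offsets are assumed.

def hottest_pixel(frame):
    best, best_v = (0, 0), float("-inf")
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v > best_v:
                best, best_v = (x, y), v
    return best

def overlay_position(frame, head_anchor):
    """Top-left corner at which to draw the dummy image so that its
    head anchor (dx, dy) coincides with the detected head pixel."""
    hx, hy = hottest_pixel(frame)
    dx, dy = head_anchor
    return (hx - dx, hy - dy)

frame = [[20.0] * 8 for _ in range(8)]
frame[2][5] = 36.0                         # head at (5, 2)
print(overlay_position(frame, (1, 0)))     # (4, 2)
```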
(112) 3-3. Flow of Display Control
(113) The display control device 10 in this embodiment carries out display control according to the flowchart shown in
(114) That is, in step S11, the image acquisition component 11 acquires the infrared image G1 captured in the room 30 from above and from the side, using the two infrared array sensors 21a and 21b.
(115) Next, in step S12, the count estimation component 16 uses the infrared image G1 thus acquired to estimate the number of people in the room 30.
(116) Next, in step S13, it is determined whether or not the number of people in the room 30 estimated by the count estimation component 16 is just one. If there is only one person, the process proceeds to step S14. On the other hand, if there are two or more people, it is determined that someone other than the care receiver P1 is in the room 30 and can provide care, and the display control process is ended.
(117) Next, in step S14, the orientation estimation component 12 estimates the orientation of the care receiver P1 determined to be the only person in the room 30 in step S13.
(118) Next, in step S15, the controller 13 reads from the storage component 14 the dummy image D1 corresponding to the orientation (such as a standing position) of the care receiver P1 estimated in step S14.
(119) Next, in step S16, the controller 13 superimposes and displays the dummy image D1 read out from the storage component 14, based on the head or face portion of the care receiver P1 included in the infrared image G1.
(120) Consequently, the caregiver can recognize the orientation of the care receiver P1 displayed on the display device 22, and can easily determine whether or not there is any abnormality.
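The S11 to S16 flow above can be sketched as one function. The sensor, estimator, and display calls are stand-in callables (assumptions), wired in the order the flowchart prescribes: display is skipped entirely unless exactly one person is in the room.

```python
# Display control flow of steps S11-S16, with each collaborator
# injected as a callable. All callables here are illustrative stand-ins.

def display_control_step(acquire, count_people, estimate_orientation,
                         load_dummy, show_overlay):
    frames = acquire()                          # S11: top + side images
    if count_people(frames) != 1:               # S12-S13: end unless one
        return None
    orientation = estimate_orientation(frames)  # S14: estimate posture
    dummy = load_dummy(orientation)             # S15: read dummy image
    return show_overlay(frames, dummy)          # S16: superimposed display

# Toy wiring to exercise the control flow.
result = display_control_step(
    acquire=lambda: "frames",
    count_people=lambda f: 1,
    estimate_orientation=lambda f: "standing",
    load_dummy=lambda o: f"dummy_{o}",
    show_overlay=lambda f, d: f"showing {d}",
)
print(result)  # showing dummy_standing
```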
(121) 3-4. Flow of Orientation Estimation
(122) With the display control device 10 in this embodiment, the orientation estimation in step S14 in the flowchart of
(123) That is, after the infrared image G1 is acquired from the infrared array sensors 21a and 21b in step S11, it is determined in step S21 whether or not the orientation of the care receiver P1 in the infrared image G1 indicates that the person has fallen down. Here, if it is determined that the person has fallen down, the process proceeds to step S22. On the other hand, if it is determined that the person has not fallen down, that means the orientation is something other than having fallen down, so the process proceeds to step S23.
(124) Next, in step S22, the orientation estimation component 12 determines that the care receiver P1 is in a fallen orientation based on the determination result in step S21.
(125) Next, in step S23, it is determined whether or not the care receiver P1 is at the side of the bed in the infrared image G1. Here, if the care receiver P1 is determined to be at the side of the bed, the process proceeds to step S24. On the other hand, if the care receiver P1 has been determined not to be at the side of the bed, the process proceeds to step S25.
(126) Next, in step S24, it is determined from the result of determination by the orientation estimation component 12 in step S23 that the care receiver P1 is next to the bed.
(127) Next, in step S25, it is determined whether or not the care receiver P1 in the infrared image G1 is on the bed. If it is determined that the care receiver P1 is on the bed, the process proceeds to step S26. On the other hand, if it is determined that the care receiver P1 is not on the bed, the process proceeds to step S29.
(128) Next, in step S26, it is determined whether or not the care receiver P1 in the infrared images G1 is in a seated position on the bed. Here, if it has been determined that the care receiver P1 is in a seated position on the bed, the process proceeds to step S27. On the other hand, if the care receiver P1 has been determined not to be in a seated position on the bed, the process proceeds to step S28.
(129) Next, in step S27, it is determined from the result of determination by the orientation estimation component 12 in step S26 that the care receiver P1 is in a seated position on the bed.
(130) Next, in step S28, it is determined from the result of determination by the orientation estimation component 12 in step S26 that the care receiver P1 is in a recumbent position on the bed.
(131) Next, in step S29, since the care receiver P1 is neither beside the bed nor on the bed, it is determined whether or not the care receiver P1 is within the measurement region in the infrared image G1. Here, if it is determined that the care receiver P1 is within the measurement region, the process proceeds to step S30. On the other hand, if it is determined that the care receiver P1 is outside the measurement region, the process proceeds to step S31.
(132) Next, in step S30, it is determined that the care receiver P1 is moving somewhere within the room 30 other than next to the bed or on the bed.
(133) Next, in step S31, it is checked once again whether the care receiver P1 is on the bed. Here, if it is determined that the care receiver P1 is on the bed, the process proceeds to step S33. On the other hand, if the person is determined not to be on the bed, the process proceeds to step S32.
(134) Next, in step S32, it is determined from the result of determination using the infrared image G1 that the care receiver P1 has not fallen down, is not next to the bed or on the bed, and is not within the measurement region, and is therefore outside the measurement region.
(135) Next, in step S33, since it has been determined in step S31 that the care receiver P1 is on the bed, it is determined that the person is in a recumbent position covered by a blanket, for example.
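Although the patent does not specify an implementation, the flow of steps S21 through S33 amounts to a simple decision chain. The following Python sketch illustrates it; the boolean arguments, the returned labels, and the separate blanket-aware re-check `on_bed_recheck` (step S31 may use different criteria than step S25) are assumptions for illustration:

```python
def estimate_orientation(fallen, beside_bed, on_bed, seated, in_region, on_bed_recheck):
    """Decision flow of steps S21-S33. Each boolean argument stands for
    the result of the corresponding image-based check on the infrared
    image G1; the return value labels the estimated state."""
    if fallen:                              # S21 -> S22
        return "fallen"
    if beside_bed:                          # S23 -> S24
        return "beside bed"
    if on_bed:                              # S25 -> S26
        return "seated on bed" if seated else "recumbent on bed"  # S27 / S28
    if in_region:                           # S29 -> S30
        return "moving in room"
    # S31: the person was not found next to the bed, on the bed, or in the
    # measurement region, so the bed is checked once again (the person may
    # be recumbent under a blanket and hard to detect in the thermal image).
    if on_bed_recheck:
        return "recumbent under blanket"    # S33
    return "outside measurement region"     # S32
```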
(136) 4. Display Example of Dummy Image for Each Orientation
(137) 4-1. Standing Position
(138) As shown in
(139) More specifically, as shown in
(140) On the other hand, as shown in
(141) A dummy image (second dummy image) D2 showing the bed 31 as a piece of interior furnishing is displayed on the infrared image G1 shown in
(142) Consequently, where the care receiver P1 is located within the room 30 can be recognized while referring to the position of the bed 31 or other such interior furnishings.
(143) 4-2. Seated Position
(144) As shown in
(145) More specifically, as shown in
(146) On the other hand, as shown in
(147) 4-3. Recumbent Position (with Blanket)
(148) As shown in
(149) More specifically, as shown in
(150) If the care receiver P1 sleeping on the bed 31 is covered by a blanket (dummy image D3), as shown in
(151) Meanwhile, as shown in
(152) Similarly to
(153) 4-4. Recumbent Position (Blanket Turned Down)
(154) As shown in
(155) More specifically, as shown in
(156) If the lower half of the body of the care receiver P1 sleeping on the bed 31 is covered by a blanket (dummy image D3), as shown in
(157) On the other hand, as shown in
(158) Similarly to
(159) 4-5. Recumbent Position (Fallen)
(160) As shown in
(161) More specifically, as shown in
(162) In this case, the care receiver P1 is sleeping in a different location from that in the dummy image D2 showing the bed 31, and is assumed to have fallen, so the dummy image D1 is, for example, shown in red, yellow, or some other color that is different from the normal color, or is flashing.
(163) On the other hand, as shown in
(164) Similarly to
(165) This allows a caregiver to check the infrared image G1 shown in
(166) 4-6. Seated Position (Fallen)
(167) As shown in
(168) More specifically, as shown in
(169) In this case, since the care receiver P1 is assumed to have fallen away from the dummy image D2 showing the bed 31 and is sitting next to it, the dummy image D1 is, for example, shown in red, yellow, or some other color that is different from the normal color, or is flashing.
(170) On the other hand, as shown in
(171) Similarly to
(172) This allows a caregiver to check the infrared image G1 shown in
(173) The urgency of a care receiver P1 who is in a seated position next to the bed 31 can be determined by checking the continuously acquired infrared images G1 and looking for any movement.
(174) 4-7. Recumbent Position (with Flickering)
(175) As shown in
(176) Thus, in this case, the determination component 15 determines that this is not a state in which the care receiver P1 needs care.
(177) As shown in
(178) Furthermore, as shown in
(179) In this case, the care receiver P1 needs help getting up, so the determination component 15 determines this to be a state in which care is required.
(180) Here, determination using the flickering of pixels in the infrared image G1 is carried out as follows.
(181) That is, in performing determination based on the flickering of a continuously acquired infrared image, the determination component 15 makes its determination according to the fluctuation of the position of the thermal center of gravity in the image.
(182) More specifically, the position of the thermal center of gravity is found, for example, by cutting out a region in which the temperature is at least a specific amount above room temperature from the infrared image G1, and finding the average position in the XY direction of the pixels of the cut-out region. The position of this thermal center of gravity may then be used as a measure of how urgently care is needed, according to whether or not the position has moved (fluctuated) by more than a specific threshold.
(183) Consequently, when an infrared sensor is used as the image sensor, the characteristics of an infrared image (a thermal image) can be utilized to determine a state in which the care receiver P1 needs care when the change in the position of the thermal center of gravity is smaller than a predetermined threshold, for example.
(184) Conversely, if the change in the position of the thermal center of gravity is greater than a predetermined threshold, it can be determined, for example, that the care receiver is changing clothes, operating a portable terminal, or in some other such state that does not require care.
(185) As a result, the caregiver can look at the image of the care receiver P1 displayed on the display device 22 and accurately determine whether or not care is needed.
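The thermal-center-of-gravity determination described in paragraphs (181) to (184) can be sketched as follows. This is a minimal Python illustration; the `margin` and `threshold` values, and the use of Manhattan distance for the drift, are assumptions for illustration rather than values from the patent:

```python
def thermal_centroid(frame, room_temp, margin=2.0):
    """Centroid (mean x, mean y) of the pixels at least `margin` degrees
    above room temperature; `frame` is a list of rows of temperatures.
    Returns None if no pixel is warm enough."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, t in enumerate(row):
            if t >= room_temp + margin:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return sum(xs) / len(xs), sum(ys) / len(ys)

def needs_care(frames, room_temp, threshold=0.5):
    """Flag care when the thermal centroid barely moves across the
    continuously acquired frames (the person is lying still); a large
    fluctuation suggests ordinary activity such as changing clothes."""
    cents = [c for c in (thermal_centroid(f, room_temp) for f in frames) if c]
    if len(cents) < 2:
        return False
    drift = max(abs(a[0] - b[0]) + abs(a[1] - b[1])
                for a, b in zip(cents, cents[1:]))
    return drift < threshold
```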
(186) 4-8. Recumbent Position (Heat Source Expansion)
(187) As shown in
(188) Therefore, in this case, the determination component 15 determines that this is a state in which the care receiver P1 needs care urgently.
(189) The controller 13 takes into account the result of estimating the orientation by the orientation estimation component 12 and the result of determining the urgency of care by the determination component 15, and controls the display so that the dummy image D1 is changed to another color, is flashed, etc., as shown in
(190) This allows the caregiver to check the screen displayed on the display device 22 and easily recognize that the care receiver P1 is in a recumbent position and urgently needs care.
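The heat-source-expansion determination can likewise be sketched by comparing the size of the warm region across the continuously acquired frames; the `growth_ratio` threshold and the first-versus-last comparison are assumed parameters for illustration, not specified in the patent:

```python
def warm_area(frame, room_temp, margin=2.0):
    """Number of pixels at least `margin` degrees above room temperature."""
    return sum(1 for row in frame for t in row if t >= room_temp + margin)

def heat_source_expanding(frames, room_temp, growth_ratio=1.5):
    """Flag urgent care when the warm region near the care receiver grows
    well beyond its initial size, comparing the last frame against the
    first of a continuously acquired sequence."""
    if len(frames) < 2:
        return False
    first = warm_area(frames[0], room_temp)
    last = warm_area(frames[-1], room_temp)
    return first > 0 and last >= first * growth_ratio
```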
(191) 4-9. Seated Position (with Flickering)
(192) As shown in
(193) Thus, in this case, the determination component 15 determines that this is not a state in which the care receiver P1 needs care.
(194) Conversely, when the care receiver P1 is in a seated position in the wheelchair 32 and there is almost no flickering for at least a certain length of time near the upper half of the body of the care receiver P1 in the continuously acquired infrared image, there is the risk that the care receiver P1 is slumped over in the wheelchair.
(195) Thus, in this case, the determination component 15 determines that this is a state in which the care receiver P1 needs care.
(196) In this case, the controller 13 takes into account the result of estimating the orientation by the orientation estimation component 12 and the result of determining the urgency of care by the determination component 15, and controls the display so that the dummy image D1 is changed to another color, is flashed, etc.
(197) Consequently, the caregiver can recognize that the care receiver P1 has been stationary in the wheelchair for a long time, and can rush to the room of the care receiver P1.
Other Embodiments
(198) An embodiment of the present invention was described above, but the present invention is not limited to or by the above embodiment, and various modifications are possible without departing from the gist of the invention.
(199) (A)
(200) In the above embodiment, an example was given in which the present invention was implemented as the display control device 10 for conducting the above-mentioned display control method, and as the display control system 20 comprising said device, but the present invention is not limited thereto.
(201) For instance, the present invention may be implemented as the above-mentioned display control method.
(202) Alternatively, the present invention may be implemented as a display control program for causing a computer to execute the above-mentioned display control method. Furthermore, the present invention may be implemented as a non-transitory computer-readable storage medium for storing this display control program.
(203) Regardless of how the invention is implemented, it allows the same effect as described above to be obtained.
(204) (B)
(205) In the above embodiment, as shown in
(206) For example, as shown in
(207) That is, the display control device of the present invention may be configured not to have an internal storage component, so long as the configuration is one in which dummy images are read from an external storage component (cloud space, server, etc.).
(208) (C)
(209) In the above embodiment, an example was described in which the dummy image D1 was displayed superimposed over the infrared image G1 on the display device 22, but the present invention is not limited to this.
(210) For instance, as shown in
(211) In this case, the urgency of the care can be displayed in a way that is more easily understandable to the caregiver.
(212) This text information may consist of a plurality of sets of text information corresponding to the determination result by the determination component that determines how urgently the care receiver needs care, with this information stored ahead of time in the storage component 14 or the like.
(213) (D)
(214) In the above embodiment, as shown in
(215) For example, an image sensor may be provided on just the ceiling or just on the wall, or an image sensor that images the human body from an angle may be used.
(216) However, in order to determine the body orientation correctly, it is preferable to adopt a configuration as in the above embodiment, in which image sensors are provided on both the ceiling and the wall.
(217) (E)
(218) In the above embodiment, an example was described in which the infrared array sensors 21a and 21b that sensed the temperature distribution were used as image sensors for transmitting images to the image acquisition component 11, but the present invention is not limited to this.
(219) For example, it is also possible to use other image sensors, such as a surveillance camera or a distance sensor, as the image sensors.
(220) (F)
(221) In the above embodiment, an example was described in which the display controller 10 and the display control system 20 of the present invention were applied to monitoring in a care facility or the like where a plurality of care receivers P1 are cared for, but the present invention is not limited to this.
(222) For example, in addition to nursing facilities, the present invention may also be applied to a facility such as a hospital or the like in which disabled people or elderly people live.
(223) Alternatively, the present invention may be applied to the monitoring of elderly people who are living alone.
(224) In this case, mobile terminals owned by the children who live apart from the elderly people can be used as display devices, allowing the monitoring to be carried out from a remote location.
INDUSTRIAL APPLICABILITY
(225) The display control device of the present invention has the effect of allowing a situation in which a care receiver is in need of care, for example, to be properly recognized by a caregiver, and therefore can be widely applied to various kinds of devices used for monitoring seniors, children, pets, and so forth.
REFERENCE SIGNS LIST
(226) 10 display control device 11 image acquisition component 12 orientation estimation component 13 controller 14 storage component 15 determination component 16 count estimation component 20 display control system 21a infrared array sensor (first image sensor) (ceiling) 21b infrared array sensor (second image sensor) (side) 22 display device 22a host computer (display device) 22b mobile terminal (display device) 30 room 30a ceiling 30b wall 31 bed 32 wheelchair 33 shelf 110 display control device 114 cloud space 120 display control system G1 infrared image (image) G1a flickering region G1b heat source expansion region D1 dummy image (caregiver) (first dummy image) D2 dummy image (bed) (second dummy image) D3 dummy image (blanket) (second dummy image) D4 dummy image (wheelchair) (second dummy image) P1 care receiver (person)