AUGMENTED REALITY PATIENT INTERFACE DEVICE FITTING APPARATUS
20170315359 · 2017-11-02
CPC classification
G16H20/30 (PHYSICS)
G06F3/011 (PHYSICS)
G02B2027/0187 (PHYSICS)
A61M16/0605 (HUMAN NECESSITIES)
G06T19/00 (PHYSICS)
G02B27/0179 (PHYSICS)
Abstract
An augmented reality apparatus for facilitating a patient interface device fitting process includes a device or system for providing a real-time image of the patient, and a processor apparatus structured to: (i) store a plurality of augmented reality component data files, each of the augmented reality component data files relating to either a structure or an aspect of one or more particular patient interface devices or a problem that may be encountered in the fitting process, and (ii) cause an augmented real-time image to be created and transmitted by the augmented reality apparatus by augmenting the real-time image using at least one of the augmented reality component data files.
Claims
1. An augmented reality apparatus for facilitating a patient interface device fitting process, comprising: means for providing a real-time image of the patient; and a processor apparatus structured to: (i) store a plurality of augmented reality component data files, each of the augmented reality component data files relating to either a structure or an aspect of one or more particular patient interface devices or a problem that may be encountered in the fitting process, and (ii) cause an augmented real-time image to be created and transmitted by the augmented reality apparatus by augmenting the real-time image using at least one of the augmented reality component data files.
2. An augmented reality apparatus according to claim 1, wherein the means for providing a real-time image comprises a combiner, wherein the augmented reality apparatus includes a projector, and wherein the processor apparatus is structured to cause the augmented real-time image to be created by causing the projector to create and transmit to the combiner a supplemental image which is combined with the real-time image.
3. An augmented reality apparatus according to claim 2, wherein the augmented reality apparatus is a head mounted display apparatus, wherein the combiner is a partially transmissive and partially reflective lens, and wherein the projector is mounted on a frame member of the head mounted display apparatus in a manner wherein the supplemental image is transmitted to a reflective surface of the lens.
4. An augmented reality apparatus according to claim 3, wherein the head mounted display apparatus is a pair of eyeglasses.
5. An augmented reality apparatus according to claim 1, wherein the means for providing a real-time image comprises an image capture device, wherein the augmented reality apparatus includes a display device, and wherein the processor apparatus is structured to cause the augmented real-time image to be created by creating a supplemental image using the at least one of the augmented reality component data files, supplementing the real-time image using the supplemental image to create the augmented real-time image, and causing the augmented real-time image to be transmitted by the display device.
6. An augmented reality apparatus according to claim 1, wherein the at least one of the augmented reality component data files represents a 2D or 3D patient interface device image, and wherein the augmented real-time image comprises the 2D or 3D patient interface device image rendered on the real-time image.
7. An augmented reality apparatus according to claim 1, wherein the at least one of the augmented reality component data files represents a contact region image showing where the one or more particular patient interface devices would contact a wearer's face, and wherein the augmented real-time image comprises the contact region image rendered on the real-time image.
8. An augmented reality apparatus according to claim 1, wherein the at least one of the augmented reality component data files represents an instructional tag image indicating a potential problem area, and wherein the augmented real-time image comprises the instructional tag image rendered on the real-time image.
9. An augmented reality apparatus according to claim 1, wherein the at least one of the augmented reality component data files represents an instructional tag image indicating instructions for using the one or more particular patient interface devices, and wherein the augmented real-time image comprises the instructional tag image rendered on the real-time image.
10. An augmented reality patient interface device fitting method, comprising: storing a plurality of augmented reality component data files, each of the augmented reality component data files relating to either a structure or an aspect of one or more particular patient interface devices or a problem that may be encountered in the fitting method; providing a real-time image of a patient; and creating and transmitting an augmented real-time image by augmenting the real-time image using at least one of the augmented reality component data files.
11. An augmented reality patient interface device fitting method according to claim 10, wherein the providing the real-time image employs a combiner, and wherein the creating and transmitting the augmented real-time image comprises causing a projector to create and transmit to the combiner a supplemental image which is combined with the real-time image.
12. An augmented reality patient interface device fitting method according to claim 10, wherein the providing the real-time image employs an image capture device, and wherein the creating and transmitting the augmented real-time image comprises creating a supplemental image using the at least one of the augmented reality component data files, supplementing the real-time image using the supplemental image to create the augmented real-time image, and causing the augmented real-time image to be transmitted by a display device.
13. An augmented reality patient interface device fitting method according to claim 10, wherein the at least one of the augmented reality component data files represents a 2D or 3D patient interface device image, and wherein the augmented real-time image comprises the 2D or 3D patient interface device image rendered on the real-time image.
14. An augmented reality patient interface device fitting method according to claim 10, wherein the at least one of the augmented reality component data files represents a contact region image showing where the one or more particular patient interface devices would contact a wearer's face, and wherein the augmented real-time image comprises the contact region image rendered on the real-time image.
15. An augmented reality patient interface device fitting method according to claim 10, wherein the at least one of the augmented reality component data files represents an instructional tag image indicating a potential problem area, and wherein the augmented real-time image comprises the instructional tag image rendered on the real-time image.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
[0016] As used herein, the singular form of “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise. As used herein, the statement that two or more parts or components are “coupled” shall mean that the parts are joined or operate together either directly or indirectly, i.e., through one or more intermediate parts or components, so long as a link occurs. As used herein, “directly coupled” means that two elements are directly in contact with each other. As used herein, “fixedly coupled” or “fixed” means that two components are coupled so as to move as one while maintaining a constant orientation relative to each other.
[0017] As used herein, the word “unitary” means a component is created as a single piece or unit. That is, a component that includes pieces that are created separately and then coupled together as a unit is not a “unitary” component or body. As employed herein, the statement that two or more parts or components “engage” one another shall mean that the parts exert a force against one another either directly or through one or more intermediate parts or components. As employed herein, the term “number” shall mean one or an integer greater than one (i.e., a plurality).
[0018] Directional phrases used herein, such as, for example and without limitation, top, bottom, left, right, upper, lower, front, back, and derivatives thereof, relate to the orientation of the elements shown in the drawings and are not limiting upon the claims unless expressly recited therein.
[0019] As used herein, the term “real-time” shall mean a response that appears to take place instantaneously or in the same timeframe as its real world counterpart action, process or event.
[0020] Augmented reality (AR) is a real-time (i.e., live) direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or other visual data or GPS data. Since the augmentation is done in real-time, AR technology functions by enhancing one's current perception of reality.
[0021] Currently, there are two basic options for combining real and virtual (computer generated) elements to implement an AR system with augmented visuals: (i) optical systems, and (ii) video systems. Optical systems work by placing one or two optical combiners in front of the user's eyes. The combiners are partially transmissive, so that the user can look directly through them to see the real world, and partially reflective, so that the user can also simultaneously see virtual images bounced off of the combiners from a projection system. Video systems use a closed view display apparatus in conjunction with one or two head mounted video cameras. The video cameras capture the user's view of the real world, and video from these cameras is combined with virtual, computer generated images to blend the real and the virtual into augmented image data. The augmented image data is sent to the closed view display apparatus to be viewed by the user.
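The video-system option can be illustrated with a minimal Python sketch that composites a computer-generated RGBA overlay onto a captured camera frame by per-pixel alpha blending. This is a sketch of the general technique only, not the implementation disclosed herein; the function name and array layout are assumptions.

```python
import numpy as np

def composite_video_ar(frame: np.ndarray, overlay_rgba: np.ndarray) -> np.ndarray:
    """Blend a virtual RGBA overlay onto a captured BGR video frame.

    frame        : H x W x 3 uint8 image from the head mounted camera
    overlay_rgba : H x W x 4 uint8 computer-generated image; the alpha
                   channel marks where virtual content should appear
    """
    alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0  # H x W x 1
    virtual = overlay_rgba[..., :3].astype(np.float32)
    real = frame.astype(np.float32)
    # Per-pixel linear blend: virtual content covers the real view only
    # where the overlay's alpha channel is opaque.
    augmented = alpha * virtual + (1.0 - alpha) * real
    return augmented.astype(np.uint8)
```

In an optical system, by contrast, no such digital blend occurs: the real world passes through the combiner optically, and only the virtual imagery is generated electronically.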
[0022] As described in greater detail herein, the disclosed concept provides a number of AR apparatus implementations that are structured to facilitate and improve the patient interface device fitting process. For example, the disclosed concept would allow clinicians to visualize different patient interface device geometries and styles in real-time directly on the patient's face to assess sizing, fit and potential problems. An AR apparatus as described herein would thus be much more effective and informative than a sizing gauge or similar fitting tool since the entire interface device can be visualized in real-time with respect to the patient's facial geometry rather than just measuring a few discrete landmarks on the patient's face. The clinician could try different sizes and/or styles of interface devices on the patient virtually without needing to open any physical product (which wastes product). Thus, an AR apparatus as described herein would provide a much more natural and intuitive way to view and fit a patient interface device on a person's face/head.
[0024] Processor apparatus 8 comprises a processor 10 and a memory 12. Processor 10 may be, for example and without limitation, a microprocessor (μP), a microcontroller, or some other suitable processing device, that interfaces with memory 12 (which may be separate from or included as part of processor 10). Memory 12 can be any of one or more of a variety of types of internal and/or external storage media such as, without limitation, RAM, ROM, EPROM(s), EEPROM(s), FLASH, and the like that provide a storage register, i.e., a machine readable medium, for data storage such as in the fashion of an internal storage area of a computer, and can be volatile memory or nonvolatile memory. Processor apparatus 8 may also employ cloud-based memory and processing and/or connectivity to a base station that includes memory and processing capabilities.
[0025] Memory 12 has stored therein a number of routines 14 that are executable by processor 10. One or more of the routines implement (by way of computer/processor executable instructions) a system for controlling AR apparatus 2 to facilitate the fitting of patient interface devices as described herein. Memory 12 also has stored therein a database of a plurality of AR component data files 16. Each AR component data file 16 comprises electronic data that may be used to augment the user's (e.g., clinician's) view of the physical, real-world environment during the mask fitting process. Such AR component data files 16 may include, for example and without limitation, data representing (i) a number of 2D or 3D patient interface device images that may be rendered on the real-world view of a patient's face, (ii) a number of contact region images showing where various patient interface devices would actually contact the wearer's face that may be rendered on the real-world view of a patient's face, or (iii) a number of informational tags that may be rendered on the real-world view of a patient's face that, for example and without limitation, indicate for the clinician potential problem areas on the patient's face/head or instructions for using a particular patient interface device.
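One plausible way to organize such a database of AR component data files is sketched below in Python. The class names, fields, and category labels are hypothetical illustrations derived from the three file types listed above, not structures recited in the disclosure.

```python
from dataclasses import dataclass
from enum import Enum, auto

class ComponentKind(Enum):
    DEVICE_IMAGE_2D = auto()   # renderable 2D patient interface device image
    DEVICE_IMAGE_3D = auto()   # renderable 3D patient interface device model
    CONTACT_REGION = auto()    # image of where the device contacts the face
    INFO_TAG = auto()          # instructional or problem-area annotation

@dataclass
class ARComponentFile:
    kind: ComponentKind
    device_model: str          # e.g., a mask model/size identifier
    payload_path: str          # path to the stored image/mesh/tag data

def components_for_device(db: list[ARComponentFile],
                          model: str) -> list[ARComponentFile]:
    """Return every stored AR component relating to one device model."""
    return [c for c in db if c.device_model == model]
```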
[0026] In the exemplary embodiment, processor apparatus 8 and input/output device 6 are housed by a housing member 17 that is coupled to the exterior surface of earpiece 5A.
[0027] AR apparatus 2 also includes a first video projector 18A and a second video projector 18B, an image capture device 19, and a first combiner 20A and a second combiner 20B. Image capture device 19 is any device capable of capturing an image in digital form, such as a CCD camera or an analog camera coupled to an A/D converter. As seen in
[0034] In the exemplary embodiment, processor apparatus 8 and input/output device 6 are housed by a housing member 76 that is coupled to the exterior surface of earpiece 75B.
[0035] AR apparatus 72 also includes a first image capture device 78A and a second image capture device 78B and a first display device 80A and a second display device 80B. Image capture devices 78A and 78B are each any device capable of capturing an image in digital form, such as a CCD camera or an analog camera coupled to an A/D converter, or any device capable of capturing a 3D image, such as a Time of Flight camera or dual camera system. As seen in
[0036] AR apparatus 72 may be used to implement any of the options shown in
[0038] AR apparatus 92 also includes an image capture device 94 and a display device 96. Image capture device 94 is any device capable of capturing an image in digital form, such as a CCD camera found on many smartphones and tablet computers, or an analog camera coupled to an A/D converter, or a device capable of capturing a 3D image, such as a Time of Flight camera or dual camera system. Display device 96 comprises a display apparatus, such as an LCD found on many smartphones and tablet computers, that is capable of displaying an image. Image capture device 94 and display device 96 are operatively coupled to processor apparatus 8 and are controlled by processor apparatus 8 to: (i) capture a video image of the physical, real-world environment at which image capture device 94 is pointed, (ii) generate overlay imagery from the AR component data files 16, (iii) combine the captured video image and the generated overlay imagery to create an augmented video image, and (iv) cause the augmented video image to be displayed by display device 96. Thus, AR apparatus 92 is structured to implement a video AR system wherein real world views captured by image capture device 94 may be augmented in real-time using images based on the AR component data files 16.
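Steps (i) through (iv) can be made concrete with a short OpenCV sketch of the capture/augment/display loop. The rendered instructional tag, its text, and the camera index are hypothetical placeholders for overlay imagery that, in practice, would be generated from the AR component data files 16.

```python
import cv2
import numpy as np

def run_video_ar_loop(camera_index: int = 0) -> None:
    """Illustrative video AR loop: capture, generate overlay, combine, display."""
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()          # (i) capture the real-world view
            if not ok:
                break
            overlay = np.zeros_like(frame)  # (ii) generate overlay imagery
            cv2.rectangle(overlay, (50, 50), (230, 90), (0, 255, 0), -1)
            cv2.putText(overlay, "check nose bridge", (55, 78),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.5, (32, 32, 32), 1)
            mask = overlay.any(axis=2)      # (iii) combine real and virtual
            augmented = frame.copy()
            augmented[mask] = overlay[mask]
            cv2.imshow("augmented fitting view", augmented)  # (iv) display
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()
```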
[0039] AR apparatus 92 may be used to implement any of the options shown in
[0040] In still another embodiment, AR apparatus 92 may be implemented in the form of a first person embodiment such that the augmented reality image that is displayed would show the user the renderings on his or her own face. Such a first person embodiment could be implemented in AR apparatus 92 by using a front facing camera as image capture device 94. Another first person embodiment may be an optical system employing a smart mirror wherein augmented reality imagery as described herein is added to the image as reflected by the mirror.
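In a first person embodiment of this kind, the front facing camera feed would typically be mirrored horizontally so that the augmented view moves the way a mirror reflection does. A minimal illustrative sketch (the helper name is an assumption):

```python
import cv2

def to_mirror_view(frame):
    """Flip a front-facing camera frame horizontally so the displayed
    augmented image behaves like a mirror image, as a user fitting a
    mask on his or her own face would expect."""
    return cv2.flip(frame, 1)  # flipCode=1 flips about the vertical axis
```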
[0041] In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word “comprising” or “including” does not exclude the presence of elements or steps other than those listed in a claim. In a device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The word “a” or “an” preceding an element does not exclude the presence of a plurality of such elements. The mere fact that certain elements are recited in mutually different dependent claims does not indicate that these elements cannot be used in combination.
[0042] Although the invention has been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred embodiments, it is to be understood that such detail is solely for that purpose and that the invention is not limited to the disclosed embodiments, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present invention contemplates that, to the extent possible, one or more features of any embodiment can be combined with one or more features of any other embodiment.