Virtual Reality Medical Application System
20170319814 · 2017-11-09
CPC classification
G16H20/30
PHYSICS
A61M21/00
HUMAN NECESSITIES
G16H20/40
PHYSICS
A61B5/05
HUMAN NECESSITIES
A61N5/1049
HUMAN NECESSITIES
A61B5/70
HUMAN NECESSITIES
A61N5/1068
HUMAN NECESSITIES
A61N5/1001
HUMAN NECESSITIES
A61B5/0046
HUMAN NECESSITIES
A61B5/107
HUMAN NECESSITIES
A61B6/0492
HUMAN NECESSITIES
A61B5/055
HUMAN NECESSITIES
A61B6/46
HUMAN NECESSITIES
A61B5/744
HUMAN NECESSITIES
A61B5/318
HUMAN NECESSITIES
A61B6/04
HUMAN NECESSITIES
A61B5/11
HUMAN NECESSITIES
A61B5/02055
HUMAN NECESSITIES
A61B1/0005
HUMAN NECESSITIES
A61B2562/0219
HUMAN NECESSITIES
International classification
A61M21/00
HUMAN NECESSITIES
A61B6/00
HUMAN NECESSITIES
A61B5/107
HUMAN NECESSITIES
A61N5/10
HUMAN NECESSITIES
A61B5/00
HUMAN NECESSITIES
A61B5/11
HUMAN NECESSITIES
A61B5/055
HUMAN NECESSITIES
G06T19/00
PHYSICS
A61B1/00
HUMAN NECESSITIES
Abstract
Systems and methods are disclosed for monitoring a patient by positioning the patient for a predetermined medical mission; sensing biometric and physical conditions of a patient during the mission, and displaying a multimedia interaction with the patient to keep the patient in a predetermined position to improve efficacy of a medical mission.
Claims
1. A virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission comprising: a motion detection system to capture a real-time physical position data corresponding to a real-time physical position of a patient; a console system to establish a physical position data range; a medical mission system to perform a medical mission when the real-time physical position data is within the physical position data range; a communications system that provides sensory feedback of the real-time physical position to the patient, the communications system displaying to the patient an avatar in a virtual environment representative of the real-time physical position of the patient; and wherein, by referencing the movements of the avatar in the virtual environment displayed in the communications system, the patient adjusts the real-time physical position to manipulate the real-time physical position data to be within the physical position data range to perform the medical mission.
2. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 1, further comprising: a technician controller system to allow a technician to interact with the virtual reality medical application system.
3. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 2, further comprising: a patient controller system to allow the patient to interact with the virtual reality medical application system.
4. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 1, further comprising: a biofeedback system to capture a real-time biometric data corresponding to real-time biometric measurements of the patient; wherein the console system establishes a biometric data range, the console system further configured to alert the patient through the communications system when the real-time biometric data is not within the biometric data range; and wherein the patient may adjust the real-time biometric measurements to manipulate the real-time biometric data to be within the biometric data range.
5. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 4, wherein the medical mission system is further configured to perform the medical mission when the real-time biometric data is within the biometric data range.
6. A virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission comprising: a motion detection system to capture a real-time physical position data corresponding to a real-time physical position of a patient; a console system to establish a physical position data range; a medical mission system to perform a medical mission when the real-time physical position data is within the physical position data range; a head mounted display system that provides sensory feedback of the real-time physical position data to the patient, the head mounted display system displaying an avatar in a virtual environment representative of the real-time physical position data of the patient; and wherein the patient adjusts the real-time physical position, by referencing the movements of the avatar in the virtual environment, to manipulate the real-time physical position data to be within the physical position data range to perform the medical mission.
7. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 6, wherein the motion detection system comprises: a plurality of motion detectors positioned at reference locations away from the patient.
8. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 7, wherein the motion detector system further comprises: a plurality of position locator tags positioned on the patient at marker positions, wherein the plurality of motion detectors detect the plurality of position locator tags to capture real-time physical position data corresponding to the real-time physical position of the patient.
9. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 7, wherein the motion detector system further comprises: a plurality of motion sensors positioned on the patient.
10. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 9, further comprising: a patient controller system to allow the patient to interact with the virtual reality medical application system.
11. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 10, further comprising: a biofeedback system to capture a real-time biometric data corresponding to real-time biometric measurements of the patient; wherein the console system establishes a biometric data range, the console system configured to alert the patient through the head mounted display system when the real-time biometric data is not within the biometric data range; and wherein the patient may adjust the real-time biometric measurements to manipulate the real-time biometric data to be within the biometric data range.
12. A virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission comprising: a medical mission system having a reference plane to perform a medical mission; a motion detector system with a field of view of the reference plane to capture a real-time physical position data corresponding to a real-time physical position of a patient located on the reference plane; a head mounted display system to provide sensory feedback of the real-time physical position data to the patient, the head mounted display system displaying an avatar in a virtual environment representative of the real-time physical position data of the patient; a console system in communication with the medical mission system, the motion detector system, and the head mounted display system, the console system configured to establish a physical position data range; wherein the medical mission system performs a medical mission when the real-time physical position data is within the physical position data range; and wherein, by referencing the movements of the avatar in the virtual environment, the patient adjusts the real-time physical position to manipulate the real-time physical position data to be within the physical position data range to perform the medical mission.
13. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 12, further comprising: a biofeedback system positioned on the patient to capture a real-time biometric data corresponding to real-time biometric measurements of the patient.
14. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 13, wherein the motion detection system comprises: a plurality of motion detectors; and a plurality of position locator tags positioned on the patient at marker positions, wherein the plurality of motion detectors detect the position locator tags to capture the real-time physical position data corresponding to the real-time physical position of the patient.
15. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 13, further comprising: a secondary motion detection system comprising a plurality of motion sensors positioned on the patient.
16. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 13, further comprising: a patient controller system to allow the patient to interact with the virtual reality medical application system.
17. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 16, further comprising: a technician controller system to allow a technician to interact with the virtual reality medical application system.
18. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 13, wherein the medical mission system performs the medical mission when the real-time biometric data is within a biometric data range established by the console system; and wherein the patient may adjust the real-time biometric measurements to manipulate the real-time biometric data to be within the biometric data range.
19. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 18, wherein the head mounted display system visually alerts the patient when the real-time physical position data is outside the physical position data range and when the real-time biometric data is outside the biometric data range.
20. The virtual reality medical application system for maintaining a physical positioning of a patient during a medical mission of claim 19, wherein the head mounted display system audibly alerts the patient when the real-time physical position data is outside the physical position data range and when the real-time biometric data is outside the biometric data range.
Description
BRIEF DESCRIPTION OF THE DRAWING
[0058] The nature, objects, and advantages of the present invention will become more apparent to those skilled in the art after considering the following detailed description in connection with the accompanying drawings, in which like reference numerals designate like parts throughout, and wherein:
DETAILED DESCRIPTION OF A PREFERRED EMBODIMENT
Hardware System Description
[0084] Referring initially to
[0085] System 100 also includes a number of imaging devices 110, 112, 114, and 116, each having a field of view 117 (shown in dashed lines) to perceive movement of patient 102. Signals from imaging devices 110, 112, 114, and 116 provide data along connection 118 to the video image receiver 120 in VR Medical Console 106, which also includes a video image processor 122 to process the video images to detect patient movement and monitor patient position. A biometric receiver 124 is provided and includes a processor for monitoring patient biometric conditions, including a motion sensor for 3-D spatial position. Console 106 also includes a virtual reality video driver 126 and virtual reality audio driver 128.
[0086] A treatment protocol database 130, biometric data database 132 and physical data database 134 store and provide data for patient treatment, and can provide specific patient data from prior treatment sessions, as well as store current treatment data for later use. Also, all treatment data may be stored in hard drive 140.
[0087] A patient controller receiver 136 and game software 138 reside in VR medical console 106 to provide the virtual reality experience for patient 102. Console 106 interfaces with the virtual reality headset, headphone and microphone via input 142, and biometric data is provided to console 106 via input 146, and patient controller input is received on input 148.
[0088] A VR Medical Application controller 108 includes a treatment selection database 152, a patient biometric database 154 and a patient physical database 156, and may include a computer console 150 for the technician to operate and interface with the Virtual Reality Medical Application System of the present invention. Control signals are provided from console 150 to treatment apparatus 104 via channel 170, and may include safety features, such as device interrupts, and error condition notifications.
[0089] Data may be exchanged between console 106 and user controller 108, including control inputs 160 carrying historical data and patient data, alerts 162 carrying interrupts, biometric data and physical data, and a bidirectional channel 164 carrying audio and video signals. These data channels give a technician seated apart from the patient the ability to fully monitor the patient 102, and the patient's interaction with the Virtual Reality Medical Application System and the treatment apparatus 104.
[0090] Radiation from treatment apparatus 104 causes cellular degradation due to damage to DNA and other key molecular structures within the cells of various tissues. It was discovered that, in controlled amounts, the effects caused by radiation can be utilized to treat certain diseases such as cancer, and several previously untreatable diseases can now be successfully treated using radiation therapies. However, along with the curative effects of radiation come negative side effects. Prolonged unnecessary exposure of normal organs to radiation causes both immediate and delayed side effects. The immediate side effects of radiation exposure include nausea and vomiting, diarrhea, pain, headaches, and skin or organ burns, whereas the delayed effects can include fatigue, weakness, hemorrhaging, leukopenia, epilation, paralysis, brain damage, blindness, perforation, ulceration, malabsorption, organ failure, and various other side effects. Additionally, prolonged overexposure to unnecessary radiation may cause more serious complications and may lead to mortality. The system of
[0091] The Virtual Reality Medical Application System of one embodiment of the invention is suitable for a variety of medical applications. For instance, patients of all ages who receive radiation therapy (including brachytherapy) will have an improved experience using the present invention. About 700,000-800,000 patients receive radiation treatments yearly, and many of these patients are unable to remain still due to pain or anxiety; these patients will benefit from an improvement in their treatment from the present invention.
[0092] The Virtual Reality Medical Application System of one embodiment also provides benefits when used with respiratory gating/tracking of a tumor during radiation therapy. While a patient is receiving radiation treatment over a duration of about 10-15 minutes, there is tumor motion due to respiration, particularly for tumors in the lung, breast, and liver. The normal radiation field or treatment portal has to be enlarged to account for this tumor motion, which means more normal tissue is exposed to the radiation treatment, and hence more treatment side effects. One way to reduce the size of the treatment portal or radiation field is to turn the radiation on only during part of the respiration cycle by having the patient hold his or her breath for a period of 20-60 seconds. This process is called “respiration gating”. With the input from the body motion sensor, the present invention facilitates the accurate timing of the radiation beam (“respiration gating”) with the respiratory cycle to minimize the effect of tumor motion. The body sensor will send feedback to the patient via the headset goggle to instruct the patient to hold his or her breath at the right moment or a similar 3-D chest position, and will send a signal to the treatment machine and/or operator to allow the radiation beam to turn on.
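As a rough illustration of the gating decision described above (a minimal sketch, not the patented implementation), the beam-enable check can be modeled as a window test on the sensed chest position; the function name, units, and thresholds here are illustrative assumptions:

```python
# Hypothetical respiration-gating sketch: the beam is enabled only while the
# sensed chest displacement stays inside a technician-defined gating window.
# Names, units (mm), and thresholds are illustrative assumptions.

def beam_enabled(chest_position_mm, gate_center_mm, gate_width_mm):
    """Return True when the chest marker lies within the gating window."""
    lower = gate_center_mm - gate_width_mm / 2
    upper = gate_center_mm + gate_width_mm / 2
    return lower <= chest_position_mm <= upper

# During a breath hold the chest settles near the reference position, so
# samples inside the window would gate the beam on:
samples = [3.1, 0.4, 0.2, 0.1, 0.3, 2.8]  # chest displacement in mm
gated = [beam_enabled(s, gate_center_mm=0.0, gate_width_mm=1.0) for s in samples]
```

In practice the gate signal would be sent to the treatment machine and/or operator, as the paragraph above describes, rather than merely recorded.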
[0093] The Virtual Reality Medical Application System of one embodiment also benefits patients because the body sensors and goggle feedback form a system that is non-invasive, more objective, easier to comply with, patient-controlled, and less expensive. This system also provides additional sensory input to patients and operators, such as 3-D body spatial position, oxygen saturation level, heart rate, body temperature, etc.
[0094] The software component provides a variety of patient-specific 3-D games that are controlled by patients, virtual simulation of real-life scenery (car racing, hot-air balloon, fish-eye view, etc.), and may include real-time images of the procedure that a healthcare provider wants to share with the patient, such as an inside image of the colon from a colonoscopy. The proposed system can be used to decrease the discomfort, anxiety and pain of patients during procedures such as medical imaging (CT, MRI, PET), medical endoscopy procedures (bronchoscopy, colonoscopy, proctoscopy, cystoscopy, etc.) or biopsy.
[0095] The 3-D games and virtual simulation software, in a preferred embodiment, run on Linux, and the technician console is cross-platform, running on Microsoft Windows, Apple Mac OS, and Linux on a custom medical-grade computer with input/output ports and a storage system.
[0096] The sensor component performs two major functions: motion tracking and biofeedback. For motion tracking of the patient, the head and body are tracked separately. The tracking of the head and the body both utilize sensors based on Inertial Measurement Units (IMUs). IMUs are electrical components utilizing a combination of gyroscopes, accelerometers, and magnetometers to track motion in three-dimensional space. For the head, the IMUs are built into the 3-D goggles. For the body, 3-D motion is tracked via an array of IMUs placed on the various joints of the body. The array of IMUs transmits data to a central hub, which then sends it to the Virtual Reality Medical Application System. For biofeedback, sensors such as blood pressure, heart rate, EEG, and EKG, among others, help a technician keep informed of the patient's condition, and will tend to determine the game play mechanics.
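To illustrate how an IMU's gyroscope and accelerometer readings can be combined into a single orientation estimate, here is a minimal complementary-filter sketch for one axis. The blend factor, sample values, and function names are assumptions for illustration, not the system's actual tracking algorithm:

```python
# Minimal one-axis complementary filter: blend the gyroscope's integrated
# angular rate (responsive but drifting) with the accelerometer's absolute
# tilt estimate (noisy but drift-free). Alpha near 1 trusts the gyro
# short-term; the accelerometer slowly corrects long-term drift.
# All names and values are illustrative assumptions.

def fuse_tilt(prev_angle_deg, gyro_rate_dps, accel_angle_deg, dt_s, alpha=0.98):
    """Return the fused tilt angle in degrees after one sample interval."""
    gyro_angle = prev_angle_deg + gyro_rate_dps * dt_s  # integrate gyro rate
    return alpha * gyro_angle + (1 - alpha) * accel_angle_deg

# Three samples at a 10 ms interval: (gyro deg/s, accelerometer tilt deg)
angle = 0.0
for gyro, accel in [(10.0, 0.2), (10.0, 0.4), (0.0, 0.5)]:
    angle = fuse_tilt(angle, gyro, accel, dt_s=0.01)
```

A full body-tracking array would run a 3-D variant of this per joint and forward the fused poses to the central hub described above.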
[0097] Alternatively, motion may be detected using a video based motion sensing device, capable of three-dimensional motion detection. One such sensor is commercially available as the Kinect sensor. Kinect builds on software technology developed internally by Rare, a subsidiary of Microsoft Game Studios owned by Microsoft, and on range camera technology by Israeli developer PrimeSense, which developed a system that can interpret specific gestures, making completely hands-free control of electronic devices possible by using an infrared projector and camera and a special microchip to track the movement of objects and individuals in three dimensions. This 3D scanner system, often called Light Coding, employs a variant of image-based 3D reconstruction.
[0098] The Kinect sensor is a horizontal bar connected to a small base with a motorized pivot and is designed to be positioned lengthwise above or below the video display. The device features an “RGB camera, depth sensor and multi-array microphone running proprietary software” which provide full-body 3D motion capture, facial recognition and voice recognition capabilities. Kinect sensor's microphone array enables its attached devices, such as an Xbox 360, to conduct acoustic source localization and ambient noise suppression, allowing for things such as headset-free party chat over Xbox Live.
[0099] The depth sensor consists of an infrared laser projector combined with a monochrome CMOS sensor, which captures video data in 3D under any ambient light conditions. The sensing range of the depth sensor is adjustable, and Kinect software is capable of automatically calibrating the sensor based on gameplay and the player's physical environment, accommodating for the presence of furniture or other obstacles.
[0100] The Virtual Reality Medical Application System of the present invention, in a preferred embodiment, is particularly suited for use with patients undergoing radiation therapy, but may also be used with patients undergoing brachytherapy, Computed Tomography (CT), PET, Magnetic Resonance Imaging (MRI), angiography, biopsy procedures, and endoscopy such as bronchoscopy or colonoscopy, for example. In addition to keeping the technician informed of the status of the patient, the Virtual Reality Medical Application System of one embodiment of the invention also helps the patient relax and distracts from the anxiety and pain of these procedures. The visual and audio system allows the healthcare provider to share clinical information with the patient (e.g., the images of abnormal findings from the colonoscopy). The Virtual Reality Medical Application System of one embodiment of the invention also provides benefits for patient setup before each radiation treatment. When a patient comes in every day for radiation treatment, he or she must be positioned in exactly the same position as the reference or planned position. This is performed by using restraint devices and adjustment by operators/technicians, and then verified by using invasive X-ray images or Computed Tomography (CT scan). All of these add time (5-10 minutes), cost (hundreds of dollars), and unnecessary radiation exposure for patients. The Virtual Reality Medical Application System of one embodiment of the invention uses body sensors to detect the 3-D body position, compares it to the reference or desired position, and gives feedback to the patient via a game avatar instructing the patient to adjust specific body parts on his or her own to get within the reference position. It therefore provides a much less expensive, noninvasive (no radiation imaging required), more accurate, and time-efficient approach.
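The setup comparison described above can be sketched as a per-marker distance check against the reference position, reporting which body parts the avatar should prompt the patient to adjust. The marker names, coordinates, and the 5 mm tolerance are illustrative assumptions, not values from the disclosure:

```python
import math

# Hedged sketch of the patient-setup check: compare each tracked marker's
# 3-D position (mm) to its reference position and list the body parts that
# lie outside a tolerance sphere. Names and tolerance are assumptions.

def setup_deviations(current, reference, tolerance_mm=5.0):
    """Return the body parts whose marker is farther than tolerance_mm
    from its reference position."""
    out_of_position = []
    for part, pos in current.items():
        if math.dist(pos, reference[part]) > tolerance_mm:
            out_of_position.append(part)
    return out_of_position

reference = {"head": (0, 0, 0), "chest": (0, -300, 0), "pelvis": (0, -600, 0)}
current = {"head": (2, 1, 0), "chest": (0, -300, 12), "pelvis": (1, -601, 2)}
needs_adjustment = setup_deviations(current, reference)
```

Each entry in the result would map to an avatar cue (e.g., highlighting the out-of-position body part) so the patient can self-correct without X-ray verification.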
The benefit of IMU-based motion tracking over video-based (Kinect) motion tracking is that IMU-based motion tracking does not require a patient's skin to be exposed, so treatment can be done while the patient is covered. With video-based motion capture, the body needs to be exposed, which could be uncomfortable for patients in certain areas such as the breast and pelvic regions.
[0101] The Virtual Reality Medical Application System relies on motion sensors, a goggle, and a 3-D game to monitor patients during the radiation treatment and provide instant feedback to patients reminding them to stay still via such processes as pausing of the game, commands from a game character, etc. Since radiation treatment is typically delivered over 15-30 minutes during which patients must remain still, any involuntary movement will cause radiation to be delivered incorrectly to normal structures while missing the tumor. Within radiation therapy, the Virtual Reality Medical Application System of one embodiment of the invention is particularly well suited for use with children with cancer who undergo radiation therapy. There are about 10,000 cases of childhood cancer in the U.S. every year, and about half of these children will undergo radiation therapy daily for several weeks.
[0102] Because children typically can be very anxious, staying still is a significant problem during these procedures. The alternative to keeping still voluntarily is to sedate them via intravenous sedatives and anesthetic gas. Sedation comes with anesthesia medication risk, costs several thousand dollars per day, and extends daily treatment time from less than an hour to more than 4 hours per day. The Virtual Reality Medical Application System of the present invention will be useful because it will reduce the complications that can come from sedating a child, reduce costs since sedation will not be necessary, and speed up the procedure since there is no need to wait for the anesthesia and its recovery.
[0103] The Virtual Reality Medical Application System of one embodiment also helps patients who undergo medical imaging (MRI, CT, and PET) and are too anxious, such as due to claustrophobia, to remain calm. An estimated 30 million patients receive MRI scans yearly, and about 5% of these patients have claustrophobia that requires sedation with medication. The Virtual Reality Medical Application System of one embodiment is also useful for patients undergoing minor procedures such as biopsy, angiography, colonoscopy, endoscopy, bronchoscopy, dental surgery, cosmetic surgeries, interventional radiology procedures, etc. The Virtual Reality Medical Application System of one embodiment of the invention provides distraction, or escape, from the procedure, thus reducing the need for sedatives and pain medication. The visual and audio system allows the healthcare provider to share clinical information with the patient (e.g., the images of abnormal findings from the colonoscopy, and informed consent on what to do with certain findings). The Virtual Reality Medical Application System of one embodiment can also allow patients to visualize the live images of the procedure that the physicians see on their scope. Since the patients are not sedated, their physicians can communicate and obtain patients' consent to certain procedures, such as a decision to biopsy or remove tissue. This is particularly beneficial when the lack of patient consent would prohibit a physician from performing a procedure that would otherwise be routine, but was unanticipated at the start of the procedure. Moreover, such active patient participation would eliminate countless repeat medical procedures.
[0104] Another use of the Virtual Reality Medical Application System of one embodiment of the invention includes psychotherapy for patients who are receiving chemotherapy. These patients typically spend 4-8 hours per day in the chemo-infusion rooms or in the hospital receiving their chemotherapy drugs intravenously. The effects of chemotherapy drugs on the brain, and exposure to the environment, can affect patients' cognitive functions (memory, fluency, attention, motor coordination) and cause depression, anxiety, and hopelessness. This condition is sometimes called “chemo-brain”. The Virtual Reality Medical Application System provides a virtual reality escape or “cancer fighting games” that can relieve patients' stress, anxiety, and other cancer-related symptoms from chemo-brain effects.
[0105] Also, the Virtual Reality Medical Application System of one embodiment is suitable for psychotherapy for patients who have acute pain, such as those who have just had surgery or trauma, such as accidents or burns. The Virtual Reality Medical Application System provides a virtual reality escape or “games” that can relieve patients' stress, anxiety, and other related symptoms, as well as provide psychotherapy for patients who have chronic pain, depression, anxiety disorder, or other personality/mood/affect disorders (autism, OCD, etc.). Also, the Virtual Reality Medical Application System of one embodiment of the invention is suitable for psychotherapy for patients who suffer pain from missing arms or legs (“phantom limb pain”). There are approximately 70,000 patients in the U.S. who lose their arms or legs due to military combat or diseases such as diabetes or cancer. These patients suffer chronic pain and depression due to the missing limbs. The Virtual Reality Medical Application System provides a virtual reality escape or “games” coupled with body motion sensors and EEG and EMG sensors that can relieve patients' chronic pain and depression.
[0106] Overall, the Virtual Reality Medical Application System of one embodiment provides for high levels of patient safety, provides for a patient-controlled treatment session, provides non-invasive patient position monitoring using no X-ray scanning and automated patient position management without requiring operator involvement, provides for instant communication with a patient during treatment, and provides an overall cost savings over traditional treatment approaches. Moreover, this “patient control” experience, as compared to other systems where the patient is “passive”, provides an overall improvement to the patient experience, and allows for the real-time sharing of information and communication with the patient. This provides a more efficient procedure, particularly when a procedure requires obtaining patient consent during treatment, which is problematic and time-consuming when the patient is under anesthesia.
[0107] Referring to
[0109] Also, a number of motion sensors 232 may be positioned on patient 102 to provide mechanical measurement of the patient during treatment, and may include a number of different measurement techniques including gyroscopes, strain gauges, Hall-effect sensors, and other techniques known in the art. The signals from the mechanical measurement devices are provided to motion sensor output 234 for routing to VR medical console 106.
[0110] In addition to video imaging devices 116 and 110 and mechanical motion sensing devices 232, a number of position locator tags, or reference markers, 226, 228 and 230 may be positioned on the patient. These markers may have wavelength-specific reflection properties (such as infrared or ultraviolet), enabling an imaging system to focus very specifically on the patient position using the markers and compare those marker positions to known reference positions.
[0111] In one embodiment of the present invention, motion sensors 116 may be of the type utilized in the Kinect2 system. Incorporation of such a system provides several benefits: (a) it uses an infrared laser projector and a 3-D camera to capture the 3-D body surface; (b) it enables facial recognition; and (c) it enables voice recognition. The (b) and (c) features are very important for patient safety by correctly identifying the patient being treated. This is particularly advantageous in critical health care treatment, where 130,000 medical mistakes occur annually in which patients receive the wrong surgery or radiation treatment.
[0112] A representative radiation target may be placed on the patient abdomen to indicate the location of the radiation treatment, or this target may be optically generated by the treatment device, such as a radiation emitter.
[0113] Referring now to
[0114] Console sub-system 302 is in communication via channel 306 with the virtual reality head mounted display (HMD) 304, which shields the patient's eyes from the external environment, tracks movement of the patient's head and is used during game play, and provides the three-dimensional stereoscopic display for the patient's virtual environment. In a preferred embodiment of the HMD 304, noise cancelling headphones may be incorporated to provide a sound-proof environment for the patient, and may include a microphone for bi-directional communication with the technician at the console 108.
[0115] A patient input controller 310 captures patient input to allow the patient to navigate in the virtual world, and allows the patient to utilize and navigate through the user interface. In a preferred embodiment, the patient input controller can include one or more of eye movement sensors, a mouse, joystick, gamepad, touchpad, button, voice input, and gesture input; with such input signals received on input 312.
[0116] A motion detection subsystem 320 detects patient motion from motion input devices, such as a video input, kinetic sensors, strain gauges, IMU motion sensors, and accelerometers on signal line 322, and provides the motion-related data to the game software 138 running in the console 106; it can also provide data to the technician workstation 108. These motion sensing devices provide highly accurate patient position and motion data to motion detection subsystem 320, which in turn communicates this data to console subsystem 302 using interface 324.
[0117] Biofeedback sensor subsystem 326 is in communication with a variety of biometric monitoring devices, such as blood pressure monitors, EMG, EKG, EEG, heart rate, respiration rate, body temperature, skin conductivity, and oxygen level monitors along input channel 328. These data measurements may be returned to the console subsystem 302 via channel 330.
[0118] The constant measurement of critical patient biometric data enables the Virtual Reality Medical Application System of the present invention to accurately monitor the physical and mental condition of the patient before, during, and after a treatment. These measured biometric signals allow a technician or health care provider to determine in real time the level of pain and anxiety the patient is experiencing, and thus allow for near real-time adjustment to the treatment. Further, this measured patient condition gives the Virtual Reality Medical Application System of the present invention the unique ability to adjust the virtual reality environment to provide the patient with an increased or decreased level of interaction, making the treatment more successful and the patient experience more positive.
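The biometric-driven adjustment described above could be sketched as a simple mapping from monitored vitals to a VR engagement level. This is an illustrative assumption, not the specification's method: the thresholds, field names, and three-level scale are hypothetical.

```python
# Hedged sketch: derive a coarse patient-stress indication from monitored
# vitals and map it to a VR engagement level that the game software could
# use to increase or decrease interaction. All thresholds are illustrative.

def engagement_level(heart_rate, respiration_rate, skin_conductivity):
    """Return 'calm', 'elevated', or 'high' to drive VR interaction intensity."""
    score = 0
    if heart_rate > 100:          # beats per minute, illustrative threshold
        score += 1
    if respiration_rate > 20:     # breaths per minute, illustrative threshold
        score += 1
    if skin_conductivity > 10.0:  # microsiemens, illustrative threshold
        score += 1
    return ("calm", "elevated", "high")[min(score, 2)]
```

For example, elevated heart and respiration rates would push the game toward a higher-engagement mode intended to refocus the patient.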
[0119] The technician's workstation 336 allows the technician or health care provider to view what a patient is seeing, and may also provide a bidirectional communication link between the technician and the patient. Also, the technician workstation 336 may allow a patient to view certain procedure-specific data, such as images from a surgical biopsy, colonoscopy, etc., thus facilitating real-time consent from a patient during a procedure. This is of particular benefit when the inability to secure consent from a patient during a procedure could result in the procedure being terminated prematurely, repeated, or supplemented with redundant procedures to accomplish tasks that could have been easily completed were consent available. Technician workstation 336 may also provide real-time data to a treatment technician via channel 340 to a patient display 342.
Software System Description
[0120]
[0121] A head mounted display (HMD) interface 406 interfaces the display to the game module subsystem and supports the rendering of the graphics in the patient-worn virtual reality display. The HMD interface 406 also provides motion sensing data back to the motion detection interface 410, and may provide the patient with a live video feed from the treatment procedure, facilitating information sharing or consent as mentioned above. To facilitate this communication, communication interface 408 interconnects various modules and VOIP services to the game module system 402.
[0122] Motion detection interface 410 receives motion data from various imaging sources 412 and connects the motion detection sensors to the game module subsystem. In a preferred embodiment, module 410 utilizes an adaptor design to accept multiple sensors that can simultaneously or cooperatively monitor the patient position and movement.
[0123] Biometric interface 414 connects the various biofeedback sensor inputs to the game module subsystem 402, and utilizes an adapter designed so that the various biometric sensors may be selectively or simultaneously monitored, and can receive information from a biometric data display 416.
[0124] A technician station 418 includes a technician console 420 and a technician display 422 that allows bidirectional communication between the technician and the patient, and allows the technician to set motion and treatment parameters and monitor those parameters during the treatment period, and also allows the technician to monitor the patient biofeedback data. The technician station 418 is in communication with the game module subsystem 402 via technician console interface 424.
Operation of the Invention
[0125]
[0126] The physician will decide whether biometric sensors will be used and, if so, which ones, as well as the allowable deviations for each sensor. In step 508, the patient's baseline biometric data is acquired, such as skin conductivity, EKG, EEG, heart rate, respiration rate, body temperature, and oxygen saturation. Once the baseline biometric data is acquired, it is stored in step 510 in the patient database for later retrieval. Additionally, limits on the biometric data, such as normal ranges and range limitations for detecting unsafe conditions, are determined in step 512.
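One way to realize steps 508-512 is to derive each sensor's normal range and hard safety limits from its measured baseline. The following is a minimal sketch under assumed percentage margins; in practice the physician would set the allowable deviations, as stated above.

```python
# Hypothetical sketch of steps 508-512: compute per-sensor normal ranges and
# safety limits from a baseline measurement. The margin percentages are
# illustrative assumptions, not values from the specification.

def biometric_limits(baseline, normal_margin=0.15, safety_margin=0.30):
    """Return {sensor: (normal_lo, normal_hi, safe_lo, safe_hi)}."""
    limits = {}
    for sensor, value in baseline.items():
        limits[sensor] = (
            value * (1 - normal_margin), value * (1 + normal_margin),
            value * (1 - safety_margin), value * (1 + safety_margin),
        )
    return limits

# Hypothetical baseline acquired in step 508.
limits = biometric_limits({"heart_rate": 80.0, "respiration_rate": 16.0})
```

The resulting table would be stored with the patient record (step 510) and consulted during treatment to distinguish normal readings, tolerable deviations, and unsafe conditions.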
[0127] In addition to the biometric baseline data, a patient's physical positioning data is acquired in step 514. In treatments where patient position is critical, the treating physician has the opportunity to accurately and precisely position the patient for his or her treatment in the simulation environment thus making certain that the position of the patient is optimum for the best treatment outcome possible.
[0128] A typical simulation would occur one or two weeks before the radiation treatment and would include the patient undergoing a treatment planning process. During this simulation, the physician will decide on a certain treatment position (called the "reference position"), and at that time a CT (Computed Tomography) scan is done (sometimes with MRI and/or PET) to plan the radiation treatment (number of beams and their directions, number of treatments, outlining the tumor, etc.).
[0129] Once this optimum reference position is achieved, the patient position data is stored in step 516 for later retrieval. Following the successful positioning during the simulation, the patient positional limitations are determined and stored in step 518. This provides for position limit setting based on the particular treatment to be performed. For instance, in treatments to the abdomen, such as radiation of cervix or ovarian cancer, the precise position of the lower abdomen is critical, whereas the position of the patient's foot is less critical. In such circumstances, the positional limitations on the abdominal position may be very small, but the positional limitations on the patient's foot may be relaxed.
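The region-dependent limit setting of step 518 can be sketched as a per-region tolerance table. This is an illustrative assumption: the region names, millimeter tolerances, and default value are hypothetical, not taken from the specification.

```python
# Illustrative sketch of step 518: positional limits are set per body region,
# tight near the treatment site (e.g., abdomen) and relaxed elsewhere (e.g.,
# foot). Tolerances in millimeters are assumptions for illustration.

REGION_TOLERANCE_MM = {"abdomen": 2.0, "chest": 5.0, "foot": 30.0}

def position_ok(deviations_mm, default_tolerance_mm=10.0):
    """True when every monitored region is within its allowed deviation."""
    return all(
        dev <= REGION_TOLERANCE_MM.get(region, default_tolerance_mm)
        for region, dev in deviations_mm.items()
    )
```

Under this sketch, a 20 mm foot movement would be tolerated during an abdominal treatment, while a 3 mm abdominal shift would not.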
[0130] Once all data from the simulation has been gathered and stored, method 500 returns to its calling procedure in step 520.
[0131]
[0132] Prior to treatment initiating, the patient's current positional data is acquired in step 544, and compared to the previously stored reference position data in step 546. If necessary, the patient or operator makes an adjustment to the patient's position to match the reference position data in step 548.
[0133] Once positions are matched, the patient is ready to start treatment in step 550. Once ready, the patient biometric and positional data is monitored starting in step 552, and the patient's virtual reality protocol is started in step 554. Once fully immersed in the virtual reality environment, the treatment of the patient begins in step 556.
[0134] The patient's positional data is measured in step 558 and compared to the set limitations. If the patient's position is within the set limits as determined in step 560, and the treatment is not complete as determined in step 562, flow chart 530 returns to step 558 for continued monitoring. If the treatment is complete as determined in step 562, the treatment ends in step 564.
[0135] If the patient position is not within the set limits as determined in step 560, the magnitude of the deviation is determined in step 568. If the deviation is small, the patient is alerted to make an adjustment in step 570, and control returns to step 558 for continued monitoring and treatment. However, if the deviation is not small, then the treatment is paused in step 572 until the patient makes a proper position adjustment, and treatment resumes in step 574 and control returns to step 558 for continued monitoring and treatment.
[0136] Referring now to
[0137] The patient's starting biometrics are measured in step 608, and may include the measurement of 3-D body position, skin conductivity, EKG, EMG, EEG, heart rate, respiration rate, body temperature and oxygen level in the blood, and any other biometric data which may be useful for the procedure being performed.
[0138] The patient's reference position is stored in step 610, and the patient starting biometric data is stored in step 612. Once complete, the control returns in step 614 to the calling procedure.
[0139] Referring to
[0140] The patient is positioned in his or her proper reference position in step 630, and verified in step 632. If the patient is not in the proper reference position, feedback is provided to the patient and technician in step 634, and the patient position is then verified in step 628. Once proper positioning is achieved as determined in step 632, control returns in step 636 to the calling procedure.
[0141]
[0142] Treatment begins with the activation of the treatment apparatus in step 710, and patient biometric data is monitored in step 712, and compared to patient biometric data tolerances in step 714. If the treatment is complete as determined in step 716, the treatment ends in step 736. If the treatment is not complete, the treatment continues in step 718 and the patient's physical data is monitored in step 720. If the patient physical data is within tolerance in step 722, treatment continues with return to step 712. If the patient physical data is not within tolerances, the physical data is analyzed in step 724 to determine if it is within a safe range. If so, the patient is provided feedback in step 732 to facilitate position self-correction, and the technician is alerted in step 734 of the position violation, and method 700 returns to step 720 for continued monitoring.
[0143] If the physical data is outside the safe range as determined in step 724, the technician is alerted of the safety issue in step 726, the treatment apparatus is interrupted in step 728, and the treatment ends in step 730.
[0144] If it is determined in step 714 that the patient biometric data is not within the preset tolerances, it is determined in step 738 whether the biometric data is outside the safety range. If safe, the technician is alerted in step 740 of the biometric error, and the treatment is adjusted in step 742; the virtual reality application may increase patient focus in step 746, or may adjust the program or game to return the patient biometric readings to within range in step 748. Once these corrective steps have been taken, method 700 returns to step 712 for continued monitoring.
[0145] If the biometric data as measured is unsafe, as determined in step 738, the technician is alerted in step 750, the treatment apparatus is interrupted in step 752, and the treatment ends in step 754.
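The decision order of method 700 described above can be sketched as one pass of a monitoring loop: unsafe readings of either kind interrupt treatment, an out-of-tolerance biometric reading triggers a VR adjustment, and an out-of-tolerance position triggers patient feedback. The function name and return labels are assumptions for illustration.

```python
# Hedged sketch of one pass through the method-700 monitoring loop
# (steps 712-754). The boolean inputs would come from comparing measured
# data against the stored tolerance and safety ranges.

def monitor_step(biometric_ok, biometric_safe, physical_ok, physical_safe):
    """Return the action for the current monitoring cycle."""
    if not biometric_safe or not physical_safe:
        return "interrupt"          # alert technician, stop apparatus, end treatment
    if not biometric_ok:
        return "adjust_vr"          # steps 740-748: refocus the patient via the game
    if not physical_ok:
        return "position_feedback"  # steps 732-734: patient self-corrects, technician alerted
    return "continue"               # steps 716-718: treatment proceeds
```

Note the precedence: safety checks dominate, so an unsafe reading interrupts treatment even if the other channel is nominal.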
[0146] In a preferred embodiment of the method of the present invention, on the day of treatment, a typical treatment process is as follows:
[0147] a) Patient is set up on the treatment table;
[0148] b) An imaging scan of the patient's face and a voice recording verify that it is the correct patient. This "Time-out" is required at all treatment centers to ensure proper patient identification and treatment;
[0149] c) Patient puts on the virtual reality goggles and headset, and the sensors are attached to the patient;
[0150] d) An imaging scan assists the patient/technician in making adjustments so the patient is in the reference position;
[0151] e) This current position is “captured” or recorded together with biometric sensor data;
[0152] f) At this time, the virtual reality game is turned on, and the operator will leave the treatment room and go to the treatment console;
[0153] h) The operator will check the position monitor and biometric monitor to see that they are within limits ("Green light");
[0154] i) The operator will turn on the treatment machine and inform the patient via headset;
[0155] j) The position and biometric sensor monitors continuously feed signals that are displayed via the monitor: [0156] "Green Light" (position is good); [0157] "Yellow Light" (small deviation that can be corrected by the patient; treatment can continue, and the technician stands by to pause treatment if necessary); and [0158] "Red Light" (major deviation; treatment must be paused automatically, and adjustment must be made by the patient first, then by the technician if needed, before treatment can be resumed);
[0159] k) If the respiration gating feature is used, one more step is added in which the patient holds his or her breath to a certain level, guided by the Green, Yellow, Red light system; and
[0160] l) Treatment completed.
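The Green/Yellow/Red signaling in step j) amounts to classifying the measured deviation into one of three bands. The sketch below is illustrative only: the millimeter thresholds are assumptions, not values from the specification.

```python
# Hedged sketch of the Green/Yellow/Red light system: map the deviation from
# the reference position to a light color and the corresponding action.
# Threshold values in millimeters are illustrative assumptions.

def traffic_light(deviation_mm, yellow_mm=3.0, red_mm=10.0):
    """Return (light, action) for the current positional deviation."""
    if deviation_mm < yellow_mm:
        return ("green", "position is good; treatment continues")
    if deviation_mm < red_mm:
        return ("yellow", "patient self-corrects; technician stands by to pause")
    return ("red", "treatment paused automatically; patient then technician adjust")
```

The same three-band scheme would apply to the respiration gating feature of step k), with breath level substituted for positional deviation.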
[0161] What has been described here are exemplary methods enabled by the Virtual Reality Medical Application System of the present invention. It is to be appreciated that other variations of these methods may be achieved using the present invention, and no limitation as to these methods is intended by the few exemplars presented herein.
[0162] Referring now to
[0163] The patient baseline biometric data is input in step 816, and the biometric data range for the treatment is established in step 818. The baseline physical data is input in step 820, and the physical data range is established in step 822. The virtual reality program begins in step 824 followed by the activation of the treatment apparatus in step 826.
[0164] Biometric data is monitored in step 828, and physical data is monitored in step 830. If the data is within range as determined in step 832, the virtual reality program continues in step 834 to receive input from the patient in step 836 and to process the patient input in step 838. As the virtual reality program continues, it adjusts in response to patient input in step 840, in response to biometric data readings in step 842, and in response to physical data readings in step 844. If treatment is complete as determined in step 846, treatment ends in step 848; otherwise, control returns to step 828 to continue monitoring biometric data.
[0165] If the data measured is out of range as determined in step 832, the magnitude of the data deviation is determined in step 850. If the deviation is small, a yellow alert is provided to the patient in step 852, and the virtual reality program adjusts to address the range violation and provide the patient with self-correction guidance. If the data deviation is not small as determined in step 850, a red alert is given to the patient in step 854, the treatment is paused in step 856, and the virtual reality program is adjusted to address the range violation in step 858.
[0166] Once the virtual reality program has been adjusted, the technician is alerted of the range violation in step 860, and the out-of-range data is rechecked in step 862. Again, in step 832, it is determined if the data is out of range, and the method continues.
[0167] Referring now to
[0168] A patient avatar is selected in step 908, the introduction is displayed to the patient in step 910, and instructions are provided in step 912. If the patient does not understand, as determined in step 914, then additional instruction is given in step 916. If the patient understood the instructions, a tutorial is provided in step 918. If the patient does not adequately understand or complete the tutorial, an additional tutorial is provided in step 922. Once the patient completes the tutorial successfully as determined in step 920, step 924 displays the "cut scene" that transports the patient avatar to the various game environments as determined in step 926, based, for example, on the patient's prior game history or patient age.
[0169] The patient avatar may be sent to any number of levels or worlds, such as level 1 928, level 2 930, level 3 932, or level 4 934. Once on the particular level or world, the patient's reward screen is displayed in step 936. If the game is over, as determined in step 940, the patient game data is recorded in step 942, and the game ends in step 944. If the game is not over, control returns to step 926, and the game continues as the patient avatar navigates through the various levels and game worlds and successfully completes each challenge within the game. Preferably, the game duration corresponds to the duration of the patient's treatment, such that the attention of the patient is drawn fully into the virtual reality world.
Exemplary Game and User Interface
[0170] A number of exemplary game interfaces are presented herein. For instance,
[0171] This simple example of the Virtual Reality Medical Application System of the present invention provides for the virtual reality system to give physical feedback to the patient during treatment to maintain proper positional placement. Moreover, the biometric feedback data from patient monitoring can be incorporated into the virtual reality system to change the avatar environment, increasing or decreasing patient interaction in order to improve focus and decrease stress on the patient during treatment.
Game Description for
[0172]
[0173] An exemplary title page is shown in
[0174] Exemplary features of a virtual reality environment of an embodiment of the present invention include:
[0175] Kid patients, ages from 3-10 (older kids may need to be considered also)
[0176] Target platform: Personal Computer (PC) or Laptop
[0177] Display hardware: Oculus Rift II or similar goggle
[0178] Input device: N/A
[0179] Extra device: monitor camera
[0180] Game engine: Unity 3D
[0181] In the exemplary game depicted in
[0182] In order to help Dr. Patel build a dinosaur park on the earth, the child patient avatar is chosen to be transported to another planet named "D-Planet," which has an ecological environment similar to the one the earth had millions of years ago. There are multiple islands on "D-Planet," and on each island lives a type of dinosaur. The ultimate task for the chosen child avatar is to utilize special equipment named the "Aero Skate Board" and by all means search for and collect eggs of different types of dinosaurs on the different islands and bring them all back to the earth.
[0183] Because one target audience is child patients aged 3-10, it is important to create a virtual world in which they can be fully immersed for 30 minutes, so the art style in general has to be cute and cartoonish, with colorful and bright palettes. Below are some examples of the look and feel of a preferred embodiment of the present invention depicting characters, dinosaurs, items, and environments (levels).
[0184] For instance, if the child patient is a boy, then the game system will assign this boy avatar to him as his virtual identity. He will play the role as the little adventurer chosen by Dr. Patel and accomplish his mission of dinosaur eggs collecting on D-Planet. Alternatively, if the child patient is a girl, then the game system will assign this girl avatar to her as her virtual identity. She will play the role as the little adventurer chosen by Dr. Patel and accomplish her mission of dinosaur eggs collecting on D-Planet.
[0185]
[0186]
[0202] Using the virtual reality environment, a variety of gameplay options are available, and the specific examples presented herein are not to be construed as any limitation of the present invention.
Gameplay Option 1
[0203] In each level, the chosen child avatar must hide behind a certain object and keep still in the game scene for a certain period of time (60 sec-90 sec) while the Egg Collector is searching for and collecting dinosaur eggs. If the patient moves any part of the body during this period, the patient avatar will correspondingly move in the virtual reality environment, and the dinosaur parents will become aware, go to protect the eggs, and destroy the Egg Collector.
[0204] The Egg Collector has a countdown clock on its surface to show time countdown. If the child patient moves any part of the body, the movement will be detected by the motion monitoring system and child patient's avatar in the game will move correspondingly.
[0205] The Egg Collector will also stop collecting eggs for a few seconds. If the child patient moves 3 times during the period of time, the egg collection mission fails, and the patient avatar will need to redo the same mission.
[0206] The patient avatar will be rewarded with a bronze, silver, or gold egg based on the behavior in the finished level. The egg reward for each level has a different, unique design, and given that many treatments require multiple sessions, it is conceivable that a patient will be able to collect many different rewards over the course of treatment.
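The movement-penalty rule of Gameplay Option 1 (each detected movement pauses the Egg Collector; three movements fail the mission) can be sketched as follows. The class and method names are hypothetical, introduced for illustration only.

```python
# Illustrative sketch of Gameplay Option 1's movement penalty. The motion
# monitoring system would call on_patient_movement() each time patient
# movement is detected during the hold-still period.

class EggCollectionMission:
    MAX_MOVES = 3  # three movements fail the mission, per the description

    def __init__(self):
        self.moves = 0
        self.failed = False

    def on_patient_movement(self):
        """Register one detected movement and return the resulting game event."""
        self.moves += 1
        if self.moves >= self.MAX_MOVES:
            self.failed = True          # the mission must be redone
            return "mission_failed"
        return "collector_paused"       # Egg Collector stops for a few seconds

# Hypothetical session in which the patient moves three times.
mission = EggCollectionMission()
events = [mission.on_patient_movement() for _ in range(3)]
```

A level completed with zero, one, or two movements could then map to the gold, silver, or bronze egg reward described above.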
Gameplay Option 2
[0207] The mother dinosaur has lost 5 eggs in each level, and it is hard for her to find them because she has other eggs that need to be taken care of. The chosen kid volunteers to find all 5 eggs.
[0208] Riding a hi-tech "aero skateboard" and carrying an egg-protector on his or her back, the avatar starts the journey in the level to look for the lost eggs. He/she needs to stand on the skateboard and keep balance (in order to make it fly smoothly and stably). If the child patient moves any part of the body, the movement will be detected by the monitor camera and body position sensors, and the patient's avatar in the game will move correspondingly, making the skateboard unbalanced.
[0209] If the child patient moves 3 times during the period of time: the 1st time, the caution light on the tail of the skateboard will turn orange and flicker; the 2nd time, the caution light will turn orange and flicker rapidly; the 3rd time, the mission fails.
[0210] The patient avatar will receive a reward of an egg from the mother dinosaur if the level is completed. The egg reward for each level has a different, unique design (to indicate a certain type of dinosaur).
[0211]
[0212] Referring now to
[0213] The estimated time to finish the level will be around 4-5 to 10 minutes. If the child patient moves his or her body during the game and causes the dropping or loss of any eggs, the level will restart from the very beginning.
[0214]
[0215] A number of game statistics may be displayed on image 1100, including egg value 1110, eggs collected 1114 and time counter 1116. This data provides the patient with information related to his or her current score, the duration of the game, and the progress through the game session.
[0216]
[0217] The image on the bottom shows the scene the child patient will see in the virtual reality goggles when the game starts. For instance, five dinosaur eggs can be seen on the way (determined by the route set up), and the skateboard will lead the way to them automatically. Once a dinosaur egg is collected, an egg-figured space on the top left of the screen will be filled with an egg 1122 icon correspondingly.
[0218] At the conclusion of the game session, a representative game end screen is shown in
[0219] Referring to
[0220]
[0221] Another embodiment of the invention is the respiratory gating application.
[0222] As shown in the lower portion of
[0223] In a preferred embodiment, the game incorporated into the present invention is about a girl on a skateboard who travels inside a sci-fi tunnel. When the patient breathes in, the skateboard moves up, and when he/she breathes out, the skateboard moves down. The values from the sensors are linked to the movement of the skateboard. The patient should not breathe in more than the recorded value, as the skateboard might hit the top of the tunnel, slowing it down; nor should he/she breathe out more, since the skateboard might hit the floor, slowing it down.
[0224] The ultimate goal of the game is to avoid hitting the tunnel and cover a maximum distance within the prescribed time of 45 seconds to 1 minute. Game points will be awarded based on how long the breath is held, minimal chest movement up and down, and speed. Once the patient treatment is complete, the virtual reality environment returns the patient to the real world gradually in order to provide a smooth transition from the virtual reality environment and minimize stress upon the patient.
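The respiratory gating linkage described above (sensor value drives skateboard height; touching the tunnel's ceiling or floor slows the board) can be sketched as a linear mapping with a clamp. This is a minimal sketch under assumed units: the calibrated breath range, tunnel dimensions, and penalty factor are illustrative, not from the specification.

```python
# Hedged sketch of the respiratory gating game: the breathing sensor value
# recorded at calibration defines the usable range, which is mapped linearly
# onto the tunnel's height. Values and units are illustrative assumptions.

def skateboard_height(breath_value, breath_min, breath_max, tunnel_height=10.0):
    """Linearly map the sensed breath level into tunnel coordinates, clamped."""
    frac = (breath_value - breath_min) / (breath_max - breath_min)
    return max(0.0, min(tunnel_height, frac * tunnel_height))

def speed_penalty(height, tunnel_height=10.0, margin=1.0):
    """Slow the skateboard when it is near the tunnel ceiling or floor."""
    near_boundary = height <= margin or height >= tunnel_height - margin
    return 0.5 if near_boundary else 1.0  # illustrative penalty factor
```

In a game loop, the distance covered per frame would be multiplied by `speed_penalty(...)`, so staying near mid-tunnel (i.e., breathing within the calibrated range) maximizes the distance scored in the 45-second to 1-minute session.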
[0225] The Virtual Reality Medical Application System described herein is capable of attaining the objects of the present invention. The particular embodiments presented herein are merely exemplary of preferred embodiments, and various alternative combinations of the features and components of the present invention may be constructed and are fully contemplated by the present invention.