Patent classifications
A61B5/744
VR-Based Treatment System and Method
An XR-based system (a virtual reality, augmented reality, or mixed reality system) is provided to visualize and resolve at least one condition of a subject. A dynamic virtual representation of the subject's body is generated from physical traits and movement captured by at least one motion tracking device, and is rendered in the extended reality environment. The dynamic virtual representation is synchronized with the movement of the subject's body. The system generates a virtual representation of at least one condition of the subject in response to one or more inputs, overlays or renders that virtual representation of the condition on the virtual representation of the subject's body, and receives and processes one or more inputs representing one or more attributes of the condition to adjust the virtual representation of the condition in the extended reality environment.
ICU Monitor With Medication Data
Systems and methods for remotely monitoring hospital patients are provided. Medical devices may capture biometric data associated with a patient positioned in an ICU environment, and a remote display device positioned external to the ICU environment may display an indication of the health of the patient based on the captured biometric data. The indication of the health of the patient may include a graph mapping normalized health statuses associated with the patient over time. Each normalized health status may be based on normalizing a particular type of biometric data according to a medical protocol, so each data point of the graph includes an indication of a normalized health status associated with the patient at a given time. The indication of the health of the patient may include an animated three-dimensional representation that is dynamically updated to reflect biometric data associated with the patient captured at a time selected by a user.
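The abstract does not specify how normalization works; a minimal sketch of one plausible reading, assuming each biometric type has a protocol-defined acceptable range (the ranges and the 0-to-1 scoring below are illustrative, not from the patent):

```python
# Hypothetical protocol ranges per biometric type (illustrative values).
PROTOCOL_RANGES = {
    "heart_rate": (60.0, 100.0),   # beats per minute
    "spo2": (95.0, 100.0),         # percent oxygen saturation
}

def normalize_reading(kind: str, value: float) -> float:
    """Return 1.0 when the reading sits inside the protocol range,
    decaying toward 0.0 as it moves outside the range."""
    low, high = PROTOCOL_RANGES[kind]
    if low <= value <= high:
        return 1.0
    span = high - low
    distance = (low - value) if value < low else (value - high)
    return max(0.0, 1.0 - distance / span)

def health_status_series(readings):
    """Map (timestamp, kind, value) tuples to (timestamp, status)
    points suitable for the over-time graph the abstract describes."""
    return [(t, normalize_reading(kind, v)) for t, kind, v in readings]
```

Because every type is mapped onto the same normalized scale, heterogeneous biometric streams can share one graph axis, which is the property the abstract relies on.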
PATIENT MONITORING DEVICE WITH IMPROVED USER INTERFACE
A system for monitoring a patient's orientation to reduce a risk of the patient developing a pressure ulcer can include one or more hardware processors that can receive and process data from a sensor to determine the patient's orientation. The one or more hardware processors can maintain a plurality of timers associated with available orientations of the patient. The one or more hardware processors can modify a value of the plurality of timers based on an amount of time the patient is oriented in the plurality of orientations. The one or more hardware processors can generate an interactive graphical user interface on a display screen. The user interface can include a graphic for illustrating an orientation history of the patient. The one or more hardware processors can modify an appearance of the graphic based upon the values of the plurality of timers.
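The per-orientation timer logic above can be sketched as follows; the orientation names and the two-hour repositioning threshold are assumptions for illustration, not values from the patent:

```python
ORIENTATIONS = ("supine", "left_side", "right_side", "prone")
REPOSITION_THRESHOLD_S = 2 * 60 * 60  # assumed repositioning protocol

class OrientationTimers:
    """One timer per available orientation, as the abstract describes."""

    def __init__(self):
        self.timers = {o: 0.0 for o in ORIENTATIONS}

    def update(self, orientation: str, elapsed_s: float):
        """Add elapsed time to the timer for the sensed orientation."""
        self.timers[orientation] += elapsed_s

    def reset(self, orientation: str):
        """Clear a timer once the patient leaves that orientation."""
        self.timers[orientation] = 0.0

    def over_threshold(self):
        """Orientations held longer than the threshold; the UI could
        change the graphic's appearance for these."""
        return [o for o, t in self.timers.items()
                if t > REPOSITION_THRESHOLD_S]
```

A caregiver-facing graphic would then only need to poll `over_threshold()` to decide which orientations to highlight.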
Mixed-reality surgical system with physical markers for registration of virtual models
An example method includes obtaining a virtual model of a portion of an anatomy of a patient from a virtual surgical plan for an orthopedic joint repair surgical procedure to attach a prosthetic to the anatomy; identifying, based on data obtained by one or more sensors, positions of one or more physical markers positioned relative to the anatomy of the patient; and registering, based on the identified positions, the virtual model of the portion of the anatomy with a corresponding observed portion of the anatomy.
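The abstract does not name a registration algorithm; a common choice for marker-based registration is rigid point-set alignment (the Kabsch algorithm), sketched below under that assumption. The model and observed marker coordinates are hypothetical inputs:

```python
import numpy as np

def register(model_pts: np.ndarray, observed_pts: np.ndarray):
    """Least-squares rigid transform (rotation R, translation t)
    mapping virtual-model marker coordinates onto the observed
    marker positions: observed ~= model @ R.T + t."""
    mc, oc = model_pts.mean(axis=0), observed_pts.mean(axis=0)
    H = (model_pts - mc).T @ (observed_pts - oc)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t
```

Applying the recovered transform to the whole virtual model would then place it over the observed anatomy in the mixed-reality view.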
User interfaces for health applications
The present disclosure generally relates to user interfaces for health applications. In some embodiments, exemplary user interfaces for managing health and safety features on an electronic device are described. In some embodiments, exemplary user interfaces for managing the setup of a health feature on an electronic device are described. In some embodiments, exemplary user interfaces for managing background health measurements on an electronic device are described. In some embodiments, exemplary user interfaces for managing a biometric measurement taken using an electronic device are described. In some embodiments, exemplary user interfaces for providing results for captured health information on an electronic device are described.
ENABLING AN IM USER TO NAVIGATE A VIRTUAL WORLD
A user is enabled to interact with a virtual world environment using an instant messenger application by enabling the user to enter the virtual world environment through the instant messenger application, which includes an instant messaging (IM) user interface, generating and managing an avatar to represent the user in the virtual world environment, monitoring a sub-portion of the virtual world environment corresponding to a current location of the user in the virtual world environment, determining descriptions of activities taking place in the sub-portion of the virtual world environment based on the monitoring, and providing the user with the determined descriptions of activities via the IM user interface.
SYSTEMS AND METHODS FOR USING VIRTUAL REALITY, AUGMENTED REALITY, AND/OR A SYNTHETIC 3-DIMENSIONAL INFORMATION FOR THE MEASUREMENT OF HUMAN OCULAR PERFORMANCE
A system or method for measuring human ocular performance can be implemented using an eye sensor, a head orientation sensor, an electronic circuit and a display that presents one of virtual reality information, augmented reality information, or synthetic computer-generated 3-dimensional information. The device is configured for measuring saccades, pursuit tracking during visual pursuit, nystagmus, vergence, eyelid closure, or focused position of the eyes. The eye sensor comprises a video camera that senses vertical movement and horizontal movement of at least one eye. The head orientation sensor senses pitch and yaw in the range of frequencies between 0.01 Hertz and 15 Hertz. The system uses a Fourier transform to generate a vertical gain signal and a horizontal gain signal.
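The abstract states that a Fourier transform yields vertical and horizontal gain signals but not how; a hedged sketch of one plausible reading follows, in which gain is the ratio of the eye signal's Fourier amplitude to the head signal's Fourier amplitude at the head-motion frequency. The sampling rate and signals are illustrative:

```python
import cmath
import math

def dft_amplitude(samples, freq_hz, sample_rate_hz):
    """Single-frequency discrete Fourier amplitude of a real signal."""
    n = len(samples)
    acc = sum(samples[k] *
              cmath.exp(-2j * math.pi * freq_hz * k / sample_rate_hz)
              for k in range(n))
    return abs(acc) * 2 / n

def gain(eye_samples, head_samples, freq_hz, sample_rate_hz):
    """Eye/head amplitude ratio at the head-motion frequency; a gain
    near 1.0 would indicate the eye fully compensates for head motion."""
    return (dft_amplitude(eye_samples, freq_hz, sample_rate_hz)
            / dft_amplitude(head_samples, freq_hz, sample_rate_hz))
```

Evaluating this ratio separately on the vertical and horizontal eye channels against pitch and yaw would produce the vertical and horizontal gain signals the abstract mentions.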
Video rebroadcasting with multiplexed communications and display via smart mirrors
During a first time period and for a first user, a second user is automatically selected based on competitive data of the first user and competitive data of the second user, and a workout selection is sent to cause a video of a workout to be displayed during a second time period on a smart mirror of the first user and a smart mirror of the second user. During the second time period, a live stream of the first user exercising is displayed at the smart mirror of the second user, and a live stream of the second user exercising is received and displayed at the smart mirror of the first user. During the second time period, a performance score of the first user and a performance score of the second user are displayed at the smart mirrors of the first user and the second user.
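The automatic pairing step above could be implemented many ways; a minimal sketch, assuming the competitive data reduces to a single hypothetical "score" field and that the closest score makes the best match:

```python
def select_opponent(first_user, candidates):
    """Pick the candidate whose competitive score is closest to the
    first user's, as one simple pairing heuristic."""
    return min(candidates,
               key=lambda c: abs(c["score"] - first_user["score"]))
```

The selected user's smart mirror would then receive the same workout selection so both sessions start on the shared video.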
BIOMETRIC ENABLED VIRTUAL REALITY SYSTEMS AND METHODS FOR DETECTING USER INTENTIONS AND MODULATING VIRTUAL AVATAR CONTROL BASED ON THE USER INTENTIONS FOR CREATION OF VIRTUAL AVATARS OR OBJECTS IN HOLOGRAPHIC SPACE, TWO-DIMENSIONAL (2D) VIRTUAL SPACE, OR THREE-DIMENSIONAL (3D) VIRTUAL SPACE
Biometric enabled virtual reality (VR) systems and methods are disclosed for detecting user intention(s) and modulating virtual avatar control based on the user intention(s) for creation of virtual avatar(s) or object(s) in holographic space, two-dimensional (2D) virtual space, or three-dimensional (3D) virtual space. A virtual representation of an intended motion of a user, corresponding to an intention of muscle activation of the user, is determined based on analysis of biometric signal data of the user as collected by a biometric detection device. The virtual representation of the intended motion is used to modulate virtual avatar control or output to create at least one of a virtual avatar representing aspect(s) of the user or an object manipulated by the user in a holographic space, virtual 2D space, or virtual 3D space. The avatar or the object is created based on: (1) the biometric signal data of the user, or (2) user-specific specifications as provided by the user.
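The abstract leaves the intention-detection step unspecified; a hypothetical sketch of one common approach is below, in which a rectified, smoothed biometric signal (e.g. an EMG envelope) is thresholded to detect muscle-activation intent and its magnitude scales the avatar motion. The threshold, window, and speed mapping are assumed values, not from the patent:

```python
ACTIVATION_THRESHOLD = 0.2  # assumed normalized activation level

def envelope(signal, window=5):
    """Moving average of the rectified signal (a simple EMG envelope)."""
    rect = [abs(s) for s in signal]
    return [sum(rect[max(0, i - window + 1): i + 1]) / min(window, i + 1)
            for i in range(len(rect))]

def intended_motion(signal):
    """Map the latest envelope value to an avatar motion command:
    None below the activation threshold, otherwise a speed
    proportional to the activation level (capped at 1.0)."""
    level = envelope(signal)[-1]
    if level < ACTIVATION_THRESHOLD:
        return None
    return {"speed": min(1.0, level)}
```

Feeding the returned command into the avatar controller each frame would modulate avatar motion by detected intent, in the spirit of the abstract.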