Patent classifications
A61B90/36
CLINICAL DIAGNOSTIC AND PATIENT INFORMATION SYSTEMS AND METHODS
A patient information system is described. A virtual representation of a patient is generated using a captured image of the patient. A graphical user interface is displayed on a display, the graphical user interface including a first screen portion configured to display the virtual representation of the patient and a second screen portion configured to display information. A selection of an area of interest on the virtual representation of the patient is received. Clinical and diagnostic information associated with the selected area of interest is received and displayed in the second screen portion of the graphical user interface. The graphical user interface also includes a timeline that displays events of interest. A selection of an event of interest causes the second screen portion of the graphical user interface to display clinical and diagnostic information associated with the selected event of interest.
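The selection flow described above — an area of interest or a timeline event keys a lookup of the associated clinical and diagnostic information — can be sketched as follows. All names and record values here are illustrative placeholders, not from the patent:

```python
# Hypothetical sketch of the abstract's data flow: selecting an area of
# interest on the virtual patient, or an event on the timeline, retrieves
# the clinical/diagnostic entries shown in the second screen portion.
RECORDS = {
    ("region", "left_knee"): ["2023-04-01 MRI: meniscal tear"],
    ("event", "2023-04-01"): ["Surgery note", "Post-op X-ray"],
}

def info_for_selection(kind: str, key: str) -> list[str]:
    """Return clinical/diagnostic entries for a region or timeline event."""
    return RECORDS.get((kind, key), [])

print(info_for_selection("region", "left_knee"))
```

The same lookup serves both screen interactions; only the selection kind ("region" vs. "event") differs.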
THREE-DIMENSIONAL INSTRUMENT POSE ESTIMATION
The present disclosure relates to systems, devices, and methods for augmenting a two-dimensional image with three-dimensional pose information of instruments shown in the two-dimensional image.
Vision-based position and orientation determination for endovascular tools
Systems and methods for vision-based position and orientation determination for endovascular tools are disclosed. In one example, a method includes receiving a two-dimensional medical image including a view of at least a distal portion of a medical instrument, the distal portion of the medical instrument including one or more fiducials positioned thereon, the one or more fiducials being radio-opaque and visible in the medical image. The method also includes detecting, within the medical image, a two-dimensional appearance of the one or more fiducials and, based on the two-dimensional appearance of the one or more fiducials, determining at least one of a roll angle of the distal portion of the medical instrument and an incline of the distal portion of the medical instrument.
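One plausible geometric reading of this abstract (a sketch, not the patented method): a fiducial mounted off-axis on the shaft projects at a lateral offset that varies sinusoidally with roll, and a fiducial segment of known length foreshortens by the cosine of the out-of-plane incline. Both functions below are illustrative assumptions:

```python
import math

def roll_from_offset(offset_px: float, radius_px: float) -> float:
    """Roll angle from a fiducial's lateral offset from the shaft
    centerline: offset = radius * sin(roll) under orthographic projection."""
    return math.asin(max(-1.0, min(1.0, offset_px / radius_px)))

def incline_from_foreshortening(apparent_len: float, true_len: float) -> float:
    """Out-of-plane incline from cosine foreshortening of a known-length
    fiducial segment: apparent = true * cos(incline)."""
    return math.acos(max(-1.0, min(1.0, apparent_len / true_len)))

# A fiducial band of true length 10 px appearing 5 px long
# implies an incline of 60 degrees.
print(math.degrees(incline_from_foreshortening(5.0, 10.0)))
```

Note the sign ambiguity inherent in a single 2D view: asin and acos each admit two physical poses, which is presumably why multiple fiducials or asymmetric markings are used in practice.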
ULTRASOUND SLICE ENHANCEMENT
In one embodiment, a system includes an ultrasound probe to capture 2D ultrasonic images of a body part of a living subject, and a processor to: generate a 3D anatomical map of the body part, the 3D anatomical map and the 2D ultrasonic images being registered with a 3D coordinate space; add a 3D indication of an anatomical structure to the 3D anatomical map; render to a display the 3D anatomical map including the 3D indication of the anatomical structure; and render to the display a given one of the 2D ultrasonic images with a 2D indication of the anatomical structure on the given 2D ultrasonic image, responsively to the 3D indication of the anatomical structure.
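Because the slices and the 3D map share a registered coordinate space, deriving the 2D indication from the 3D one reduces to projecting the annotated 3D point into a slice's in-plane frame when the point lies near that slice. A minimal sketch, with an assumed slice pose (origin plus two in-plane unit axes) and an illustrative thickness tolerance:

```python
import numpy as np

def project_to_slice(point_3d, slice_origin, u_axis, v_axis, tol=2.0):
    """Map a 3D annotation into a registered 2D slice's (u, v) frame.
    Returns None if the point lies farther than `tol` mm from the plane."""
    p = np.asarray(point_3d, float) - np.asarray(slice_origin, float)
    u, v = np.asarray(u_axis, float), np.asarray(v_axis, float)
    normal = np.cross(u, v)  # unit normal, since u and v are orthonormal
    if abs(p @ normal) > tol:
        return None  # structure not visible in this slice
    return float(p @ u), float(p @ v)

# Structure at (10, 5, 1) mm, axial slice through the origin:
print(project_to_slice([10, 5, 1], [0, 0, 0], [1, 0, 0], [0, 1, 0]))
```

Rendering the returned (u, v) as an overlay on the given 2D image would produce the "2D indication" the abstract describes.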
SYSTEMS AND METHODS FOR AIDING NON-CONTACT DETECTOR PLACEMENT IN NON-CONTACT PATIENT MONITORING SYSTEMS
Systems and methods for aiding a clinician in properly positioning, placing, or otherwise locating a non-contact detector component of a non-contact patient monitoring system are described. The systems and methods may employ a targeting aid superimposed on a display screen component of the non-contact patient monitoring system, the targeting aid being designed to assist the clinician in properly locating the non-contact detector for proper and accurate functioning of the non-contact patient monitoring system. The systems and methods described herein may also employ a bendable mounting arm to which the non-contact detector is attached, such that the non-contact detector can be easily moved into the proper location when used in conjunction with the targeting aid superimposed on the display.
Device and system for generating ultrasonic waves in a target region of a soft solid and method for locally treating a tissue
This device (2) for generating ultrasonic waves in a target region of a soft solid includes at least two ultrasound sources (32), light sources (40) distributed around a central axis (X2) of the device (2) for illuminating a zone of the soft solid via subsurface scattering, and a video camera (50) for capturing images of the zone illuminated by the light sources. The ultrasound sources (32), the light sources (40) and the video camera (50) are mounted on a body (20) of the device (2) and oriented toward a common target zone that includes a focal point of the ultrasound sources (32). A boresight of the video camera is aligned with the central axis (X2).
Method of using a manually-operated light plane generating module to make accurate measurements of the dimensions of an object seen in an image taken by an endoscopic camera
Presented herein is a method of using a manually-operated light plane generating module to make accurate measurements of the dimensions of an object seen in an image taken by an endoscopic camera. The method comprises: providing the light plane generating module with distinctive features, introducing the light plane generating module until the distinctive features are visible in the image, aligning the light plane across the object, and providing a processor device and software configured to analyze the camera images. Also described are diagnostic or therapeutic endoscopic tools that comprise an attached light plane generating module to provide the tool with integrated light plane measurement capabilities, wherein the tool is configured to be used in the described method.
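The measurement principle behind a light plane aligned across an object is laser-plane triangulation: each illuminated pixel back-projects to a camera ray, the ray is intersected with the known plane to recover a metric 3D point, and distances between such points give object dimensions. A sketch under assumed pinhole intrinsics and an assumed plane pose (all values illustrative, not from the method described):

```python
import numpy as np

def pixel_ray(px, py, fx, fy, cx, cy):
    """Back-project a pixel through a pinhole camera into a unit-length ray."""
    d = np.array([(px - cx) / fx, (py - cy) / fy, 1.0])
    return d / np.linalg.norm(d)

def intersect_plane(ray, plane_point, plane_normal):
    """Intersect a camera-origin ray with the light plane to recover the
    metric 3D point illuminated at that pixel."""
    t = (plane_point @ plane_normal) / (ray @ plane_normal)
    return t * ray

# Assumed intrinsics; light plane 50 mm ahead, tilted 45 degrees:
plane_pt = np.array([0.0, 0.0, 50.0])
plane_n = np.array([0.0, 1.0, 1.0]) / np.sqrt(2)
a = intersect_plane(pixel_ray(320, 240, 500, 500, 320, 240), plane_pt, plane_n)
b = intersect_plane(pixel_ray(420, 240, 500, 500, 320, 240), plane_pt, plane_n)
print(np.linalg.norm(a - b))  # dimension in mm spanned by the two pixels
```

The distinctive features on the module presumably let the processor recover the plane's pose in the camera frame; here that pose is simply assumed known.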
Virtual reality training for medical events
Systems and methods for virtual reality (VR) training of medical events are described herein. In one aspect, a method for generating a VR medical training environment can include displaying a medical event through a VR headset, receiving, from a user of the VR headset, a set of verbal responses corresponding to the user reacting to the medical event, determining a timestamp for at least one verbal response received from the user, determining a medical event score for the user based on the set of verbal responses and the timestamp, and displaying a summary of the medical event score via the VR headset or a display screen.
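A score "based on the set of verbal responses and the timestamp" could combine correctness with reaction time in many ways; the rule below is one illustrative choice (full credit when the timestamp meets a target reaction time, linear decay afterward), not the patented scoring method:

```python
def score_response(correct: bool, response_time_s: float,
                   target_time_s: float = 10.0) -> float:
    """Illustrative scoring rule: full credit for a correct verbal response,
    decayed linearly when its timestamp exceeds the target reaction time."""
    if not correct:
        return 0.0
    if response_time_s <= target_time_s:
        return 1.0
    return max(0.0, 1.0 - (response_time_s - target_time_s) / target_time_s)

# (correct?, seconds from event onset to the verbal response)
responses = [(True, 4.0), (True, 15.0), (False, 6.0)]
total = sum(score_response(ok, t) for ok, t in responses)
print(total)  # 1.0 + 0.5 + 0.0
```

The summary displayed via the headset or screen would then aggregate such per-response scores across the medical event.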
METHOD AND SYSTEM FOR REPRODUCING AN INSERTION POINT FOR A MEDICAL INSTRUMENT
The invention relates to a method for displaying an insertion point for a medical instrument. The method comprises the following steps: providing at least one marker on a surface of an object, the marker exhibiting the property that it can be recorded both tomographically, in particular fluoroscopically, and optically; generating tomographic image data that can be used to reconstruct a fluoroscopic image of the at least one marker, located on the surface of the object, together with the object; determining the insertion point for the medical instrument on the surface of the object relative to the at least one marker in the coordinate system of the tomographic image data; generating visual image data that can be used to reconstruct a visual image of the at least one marker, located on the surface of the object, together with the object; transforming the coordinates of the insertion point from the coordinate system of the tomographic image data into the coordinate system of the visual image data using the position of the insertion point relative to the at least one marker; and displaying the insertion point for the medical instrument in real time in a view of the object.
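The transformation step above is a classic registration problem: because the markers are visible in both the tomographic and the visual image data, their paired coordinates determine a rigid transform, which then carries the insertion point into the visual frame. A sketch using the standard Kabsch/Umeyama least-squares solution with illustrative marker coordinates (not the patent's specific algorithm):

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch) mapping marker coordinates
    in the tomographic frame (src) to the visual frame (dst)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    sc, dc = src.mean(0), dst.mean(0)
    H = (src - sc).T @ (dst - dc)               # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = dc - R @ sc
    return R, t

# Markers seen in both coordinate systems (illustrative values:
# a 90-degree yaw plus a (1, 2, 3) offset):
ct = [[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]]
cam = [[1, 2, 3], [1, 12, 3], [-9, 2, 3], [1, 2, 13]]
R, t = rigid_transform(ct, cam)
insertion_ct = np.array([5.0, 5.0, 0.0])
print(R @ insertion_ct + t)  # insertion point in the visual frame
```

With the transform in hand, redisplaying the insertion point in real time only requires re-detecting the markers in each new visual frame and reapplying the mapping.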