A61B2090/502

Calibration and image processing methods and systems for obtaining accurate pupillary distance measurements
11707191 · 2023-07-25 ·

Accurate measurement of pupillary distance (PD) is necessary for making prescription eyeglasses, configuring VR headsets, and using other binocular optical devices. Today many people order eyeglasses online, and obtaining their PD is often problematic because the prior art fails to provide consumer-friendly PD measurement systems. A disclosed eyeglass frame system comprises reference marks at known locations on the frames. A smartphone may be used to locate the consumer's pupils while the consumer is wearing the frames, and the pupils may be marked or tagged on a digital image of the consumer wearing the frames. Using the angles of the camera lens sight lines, other variable values, and the known relative distances of the frame markings, the consumer's pupillary distance can be quickly and accurately derived.
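The core idea of deriving PD from frame marks of known spacing can be illustrated with a minimal planar sketch: the known mark separation yields a millimetres-per-pixel scale, which is then applied to the measured pupil-to-pupil pixel distance. This omits the camera sight-line angle corrections the patent describes, and all names and values here are illustrative, not from the patent.

```python
import math

def pupillary_distance_mm(mark_px_a, mark_px_b, mark_spacing_mm,
                          pupil_px_left, pupil_px_right):
    """Estimate PD by scaling the pupil pixel distance with the known
    physical spacing between two frame reference marks (planar model)."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    mm_per_px = mark_spacing_mm / dist(mark_px_a, mark_px_b)
    return dist(pupil_px_left, pupil_px_right) * mm_per_px

# Marks 200 px apart representing 100 mm -> 0.5 mm/px;
# pupils 126 px apart -> 63 mm PD.
pd = pupillary_distance_mm((100, 200), (300, 200), 100.0,
                           (140, 210), (266, 210))
```

A full implementation along the patent's lines would additionally correct for perspective using the camera's sight-line angles, since the frame plane and pupil plane are not coincident.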

ERGONOMIC REFRACTION STATION AND METHOD FOR USING SAME
20230233073 · 2023-07-27 ·

The ergonomic refraction station and its procedure of use consist of a phoropter helmet, chair, work table, monitor, and electronic circuit, and seek to perform a refraction test under conditions as similar as possible to the patient's usual work environment. To this end, the station includes a lightweight phoropter helmet that adjusts to the size of the user and is made of transparent material, allowing the user to remain in contact with the surroundings and execute the usual movements of head, neck, and eyes at the usual working distance. These parameters are captured by optical, distance, and inclination sensors located on the phoropter helmet or on the flexible, adjustable table with "swan neck" arms.

SYSTEMS, METHODS, APPARATUSES, AND COMPUTER-READABLE MEDIA FOR IMAGE MANAGEMENT IN IMAGE-GUIDED MEDICAL PROCEDURES
20230233264 · 2023-07-27 ·

Presented herein are methods, systems, devices, and computer-readable media for image management in image-guided medical procedures. Some embodiments herein allow a physician to use multiple instruments for a surgery and simultaneously provide image-guidance data for those instruments. Various embodiments disclosed herein provide information to physicians about procedures they are performing, the devices (such as ablation needles, ultrasound transducers or probes, scalpels, cauterizers, etc.) they are using during the procedure, the relative emplacements or poses of these devices, prediction information for those devices, and other information. Some embodiments provide useful information about 3D data sets and allow the operator to control the presentation of regions of interest. Additionally, some embodiments provide for quick calibration of surgical instruments or attachments for surgical instruments.

PATIENT-SPECIFIC SIMULATION DATA FOR ROBOTIC SURGICAL PLANNING

A method for creating a patient-specific surgical plan includes receiving one or more pre-operative images of a patient having one or more infirmities affecting one or more anatomical joints. A three-dimensional anatomical model of the one or more anatomical joints is created based on the one or more pre-operative images. One or more transfer functions and the three-dimensional anatomical model are used to identify a patient-specific implantation geometry that corrects the one or more infirmities. The transfer functions model performance of the one or more anatomical joints as a function of anatomical geometry and anatomical implantation features. A surgical plan comprising the patient-specific implantation geometry may then be displayed.
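The role of the transfer functions can be sketched as a search over candidate implantation geometries for the one minimizing a modeled performance error. The transfer function below is a hypothetical stand-in (a quadratic penalty on deviation from a target alignment); the patent's actual functions model joint performance from anatomy and implant features.

```python
def joint_performance_error(geometry_mm, target_alignment_mm=0.0):
    """Hypothetical transfer function: modeled performance error grows
    quadratically with deviation from a target joint alignment."""
    return (geometry_mm - target_alignment_mm) ** 2

def best_implant_geometry(candidates, transfer_fn):
    """Select the candidate implantation geometry that minimizes the
    error predicted by the transfer function."""
    return min(candidates, key=transfer_fn)

# Among coarse alignment offsets, the zero-deviation geometry wins.
choice = best_implant_geometry([-2.0, -1.0, 0.0, 1.0, 2.0],
                               joint_performance_error)
```

In practice the candidate space would be multi-dimensional (implant size, position, orientation) and the transfer functions would be fitted to biomechanical or outcome data rather than assumed.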

METHODS FOR OPTICAL TRACKING AND SURFACE ACQUISITION IN SURGICAL ENVIRONMENTS AND DEVICES THEREOF

A computer assisted system is disclosed that includes an optical tracking system and one or more computing devices. The optical tracking system includes an RGB sensor and is configured to capture color images of an environment in the visible light spectrum and tracking images of fiducials in the environment in a near-infrared spectrum. The computer assisted system is configured to generate a color image of the environment using the color images, identify fiducial locations using the tracking images, generate depth maps from the color images, reconstruct three-dimensional surfaces of structures based on the depth maps, and output a display comprising the reconstructed three-dimensional surface and one or more surgical objects that are associated with the tracked fiducials. The computer assisted system can further include a monitor or a head-mounted display (HMD) configured to present augmented reality (AR) images during a procedure.
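The step of reconstructing 3D surfaces from depth maps typically starts by back-projecting each depth pixel into a camera-frame point using a pinhole model. This is a minimal, dependency-free sketch of that back-projection; the intrinsics (fx, fy, cx, cy) and the list-of-rows depth format are assumptions, not details from the patent.

```python
def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth map (rows of metres; 0 = no reading) into
    camera-frame 3D points via the pinhole model:
    X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z > 0:  # skip invalid / missing depth readings
                points.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return points

# A single valid pixel at the principal point, 2 m away.
cloud = depth_to_points([[2.0]], fx=1.0, fy=1.0, cx=0.0, cy=0.0)
```

A surface mesh would then be fitted over the resulting point cloud, and the near-infrared fiducial detections would anchor that surface in the tracker's coordinate frame.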

Electrogram Annotation System

In an embodiment, an electrogram (EGM) processing system provides, for display by a head-mounted display (HMD) worn by a user, a holographic rendering of intracardiac geometry. The HMD also displays an electrogram waveform. The EGM processing system determines a gaze direction of the user by processing sensor data from the HMD. The HMD displays a marker overlaid on the electrogram waveform at a location based on an intersection point between the gaze direction and the electrogram waveform. The EGM processing system determines a measurement of the electrogram waveform using the location of the marker. The HMD displays the measurement of the electrogram waveform.
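Placing the marker at the intersection of the gaze direction and the displayed waveform reduces, in the simplest case, to a ray-plane intersection: cast a ray from the head pose along the gaze direction and intersect it with the plane holding the waveform panel. The sketch below assumes that simplification; it is not the patent's specific method.

```python
def gaze_plane_intersection(origin, direction, plane_point, plane_normal):
    """Intersect a gaze ray with the plane containing the waveform panel.
    Returns the 3D hit point, or None if the ray misses (parallel or
    pointing away from the plane)."""
    dot = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(dot) < 1e-9:
        return None  # gaze ray is parallel to the panel plane
    t = sum((p - o) * n
            for p, o, n in zip(plane_point, origin, plane_normal)) / dot
    if t < 0:
        return None  # panel is behind the viewer
    return tuple(o + t * d for o, d in zip(origin, direction))

# Looking straight ahead (+Z) at a panel plane 2 m away.
hit = gaze_plane_intersection((0.0, 0.0, 0.0), (0.0, 0.0, 1.0),
                              (0.0, 0.0, 2.0), (0.0, 0.0, 1.0))
```

The hit point would then be mapped into the waveform's local coordinates to pick the sample index at which the marker and measurement are taken.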

Method of hub communication with surgical instrument systems

A method for adjusting the operation of a surgical instrument using machine learning in a surgical suite is disclosed. The method comprises gathering data during surgical procedures that include the use of a surgical instrument, analyzing the gathered data to determine an appropriate operational adjustment, and applying that adjustment to improve the instrument's operation.

State assessment system, diagnosis and treatment system, and method for operating the diagnosis and treatment system

A state assessment system, a diagnosis and treatment system and a method for operating the diagnosis and treatment system are disclosed. An oscillator model converts a physiological signal of a subject into a defined feature image. A classification model analyzes state information of the subject based on the feature image. An analysis model outputs a treatment suggestion for the subject based on the state information of the subject. An AR projection device projects acupoint positions of a human body onto the subject, for the subject to be treated based on the treatment suggestion.

Body-mounted or object-mounted camera system

An object or body-mounted camera apparatus for recording surgery is provided that is adapted for tracking a relevant visual field of an on-going operation. To help maintain visibility and/or focus of the visual field, specific machine learning approaches are proposed in combination with control commands to shift a physical positioning or a perspective of the camera apparatus. Additional variations are directed to tracking obstructions based on the visual field of the camera, which can be utilized for determining a primary recording for use when there are multiple cameras being used in concert.

System and method for mapping navigation space to patient space in a medical procedure

An apparatus is provided that is visible by both a three dimensional (3D) scanner system of a medical navigation system and a camera of the medical navigation system. The apparatus involves a rigid member and a plurality of markers attached to the rigid member. Each of the plurality of markers includes a reflective surface portion visible by the camera and a distinct identifiable portion visible by the 3D scanner system. The apparatus further involves a connector mechanism to connect the apparatus to a reference location. The apparatus is in a field of view of the 3D scanner system and the camera within a timeframe of the 3D scan.
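Mapping between the 3D scanner's frame and the camera's frame from shared markers is, at its core, a point-based rigid registration. As a minimal sketch, the closed-form 2D version below recovers the rotation and translation aligning corresponding marker positions (the 3D case uses the same centroid-and-rotation structure, typically via an SVD/Kabsch solve). Function names and the 2D restriction are simplifications for illustration.

```python
import math

def register_2d(src, dst):
    """Closed-form 2D rigid registration: find rotation angle theta and
    translation (tx, ty) mapping src marker points onto dst marker points."""
    n = len(src)
    cs = [sum(p[i] for p in src) / n for i in (0, 1)]  # src centroid
    cd = [sum(p[i] for p in dst) / n for i in (0, 1)]  # dst centroid
    s = [(p[0] - cs[0], p[1] - cs[1]) for p in src]
    d = [(p[0] - cd[0], p[1] - cd[1]) for p in dst]
    num = sum(a[0] * b[1] - a[1] * b[0] for a, b in zip(s, d))  # cross terms
    den = sum(a[0] * b[0] + a[1] * b[1] for a, b in zip(s, d))  # dot terms
    theta = math.atan2(num, den)
    c, si = math.cos(theta), math.sin(theta)
    tx = cd[0] - (c * cs[0] - si * cs[1])
    ty = cd[1] - (si * cs[0] + c * cs[1])
    return theta, (tx, ty)

# Markers rotated 90 degrees and shifted by (2, 3).
theta, t = register_2d([(0, 0), (1, 0), (0, 1)],
                       [(2, 3), (2, 4), (1, 3)])
```

With the same markers identified by both the camera (reflective surfaces) and the 3D scanner (distinct identifiable portions) within the scan timeframe, this transform maps navigation space onto patient space.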