A61B8/4245

Tracking Apparatus For Tracking A Patient Limb

A tracking apparatus for tracking a bone of a patient limb is provided. The tracking apparatus includes a body configured to couple to the patient limb. The body includes first and second arms, each including an exterior surface, an opposing interior surface, and opposing sides connecting the exterior and interior surfaces. The tracking apparatus also includes a wing portion extending from one of the sides of the first or second arm, the wing portion sharing the interior surface of the first or second arm. The tracking apparatus also includes one or more ultrasonic sensors coupled to the interior surface of the body and the interior surface of the wing portion, the one or more ultrasonic sensors being configured to transmit ultrasonic waves to and receive ultrasonic waves from the bone. The tracking apparatus also includes one or more trackable elements coupled to the body and the wing portion.

Control apparatus

A control apparatus detects a misalignment between an irradiation position of a transdermal treatment device and a target irradiation position. When the misalignment is detected, the control apparatus stops irradiation of the transdermal treatment device or moves the irradiation position closer to the target irradiation position.
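
As a rough illustration of that behavior, the sketch below compares the current and target irradiation positions and either keeps irradiating, nudges the position toward the target, or stops irradiation. The 2 mm tolerance, the planar coordinates, and the TreatmentDevice interface are assumptions for illustration, not details from the abstract.

```python
# Minimal sketch of the misalignment-handling logic; tolerances and interfaces are assumed.
from dataclasses import dataclass


@dataclass
class TreatmentDevice:            # hypothetical stand-in for the transdermal treatment device
    irradiating: bool = True
    x: float = 0.0
    y: float = 0.0

    def stop_irradiation(self):
        self.irradiating = False

    def move_by(self, dx, dy):
        self.x += dx
        self.y += dy


def control_step(device, target, tolerance_mm=2.0, can_reposition=True):
    """One control cycle: keep irradiating, reposition, or stop, depending on the misalignment."""
    dx, dy = target[0] - device.x, target[1] - device.y
    error = (dx ** 2 + dy ** 2) ** 0.5          # misalignment magnitude in mm
    if error <= tolerance_mm:
        return                                  # aligned: keep irradiating
    if can_reposition:
        device.move_by(dx, dy)                  # move the irradiation position closer to the target
    else:
        device.stop_irradiation()               # otherwise stop irradiation
```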

SYSTEM AND METHOD FOR EVALUATING THE PERFORMANCE OF A USER IN CAPTURING AN ULTRASOUND IMAGE OF AN ANATOMICAL REGION
20230037923 · 2023-02-09 ·

A training platform, a method, and a computer-readable medium for evaluating users in capturing images of an internal anatomical region for the analysis of organs. One or more automated machine learning models, trained on a dataset of labelled training images associated with different imaging device positions, are used. The models process an image generated when a user positions an imaging device at various positions relative to a training manikin, a human, or an animal, to determine whether the generated image corresponds to a predefined view required for the analysis of the organ features shown therein. An output indicative of whether the generated image corresponds to the predefined view expected for organ analysis and measurements is provided.
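
A minimal sketch of the view-check step is given below. The class labels, the confidence threshold, and the model's predict interface are hypothetical, since the abstract does not name a particular machine learning framework.

```python
# Illustrative check of whether a captured frame matches the required predefined view.
import numpy as np

VIEW_LABELS = ["apical_four_chamber", "parasternal_long_axis", "other"]  # hypothetical labels


def matches_required_view(model, frame: np.ndarray, required_view: str,
                          threshold: float = 0.8) -> bool:
    """Return True when the trained model is confident the frame shows the required view."""
    batch = frame.astype(np.float32)[None, ...]   # add a batch dimension
    probs = model.predict(batch)[0]               # assumed: per-label probabilities
    best = int(np.argmax(probs))
    return VIEW_LABELS[best] == required_view and float(probs[best]) >= threshold
```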

ULTRASOUND DEVICES FOR MAKING EYE MEASUREMENTS
20230043585 · 2023-02-09 ·

The disclosed ultrasound devices may include at least one ultrasound transmitter positioned and configured to transmit ultrasound signals toward a user's face to reflect off a facial feature, and at least one ultrasound receiver positioned and configured to receive and detect the ultrasound signals reflected off the facial feature. At least one processor may be configured to receive data from the at least one ultrasound receiver and to determine, based on that data, at least one of the following eye measurements: an interpupillary distance of the user; an eye relief; or a position of a head-mounted display relative to the facial feature of the user. Various other devices, systems, and methods are also disclosed.
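
One way the range to a facial feature could be derived is from the echo's round-trip time, as in the sketch below. The speed-of-sound constant and the example delay are assumptions; the abstract does not state how the eye measurements are computed.

```python
# Time-of-flight ranging sketch: round-trip echo delay -> one-way distance.
SPEED_OF_SOUND_AIR_M_S = 343.0  # approximate speed of sound in air at room temperature


def echo_delay_to_range(round_trip_delay_s: float) -> float:
    """Convert a round-trip echo delay (s) to a one-way distance to the facial feature (m)."""
    return SPEED_OF_SOUND_AIR_M_S * round_trip_delay_s / 2.0


# Example: a 0.3 ms round trip corresponds to roughly 51 mm, a plausible eye relief.
eye_relief_m = echo_delay_to_range(0.3e-3)
```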

Ultrasonic diagnostic apparatus, medical image processing apparatus, and non-transitory computer-readable medium storing a computer program

The ultrasonic diagnostic apparatus according to the present embodiment includes processing circuitry. The processing circuitry is configured to: acquire multiple pieces of position data associated with respective pieces of two-dimensional ultrasonic image data related to multiple cross sections; smooth the acquired position data; and arrange the two-dimensional image data in accordance with the smoothed position data to generate volume data.
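
The sketch below illustrates the two steps in order: a moving-average smoother over the per-frame positions, followed by placing each frame into a volume slot indexed by its smoothed elevation. The smoothing window, millimetre spacing, and single-axis stacking are simplifying assumptions rather than the patent's method; a full reconstruction would use complete 3-D poses and interpolation.

```python
# Sketch: smooth per-frame positions, then stack 2-D frames into a volume.
import numpy as np


def smooth_positions(positions: np.ndarray, window: int = 5) -> np.ndarray:
    """Moving-average smoothing of an (N, 3) array of frame positions."""
    kernel = np.ones(window) / window
    return np.column_stack(
        [np.convolve(positions[:, k], kernel, mode="same") for k in range(3)]
    )


def stack_frames(frames: np.ndarray, positions: np.ndarray,
                 spacing_mm: float = 1.0) -> np.ndarray:
    """Place each 2-D frame into a volume slot indexed by its smoothed elevation."""
    elevation = positions[:, 2]
    idx = np.round((elevation - elevation.min()) / spacing_mm).astype(int)
    volume = np.zeros((idx.max() + 1, *frames.shape[1:]), dtype=frames.dtype)
    for frame, i in zip(frames, idx):
        volume[i] = frame          # last frame wins if two frames map to the same slot
    return volume
```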

HYBRID ROBOTIC-IMAGE PLANE CONTROL OF A TEE PROBE

The following relates generally to systems and methods of trans-esophageal echocardiography (TEE) automation. Some aspects relate to a TEE probe with ultrasonic transducers on a distal end of the TEE probe. In some implementations, if a target is in a field of view (FOV) of the ultrasonic transducers, an electronic beam steering of the probe is adjusted; if the target is at an edge of the FOV, both the electronic beam steering and mechanical joints of the probe are adjusted; and if the target is not in the FOV, only the mechanical joints of the probe are adjusted.
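A compact sketch of that three-way policy follows; the field-of-view half-angle, the edge margin, and the returned mode names are illustrative assumptions, not values from the abstract.

```python
# Sketch of the hybrid control policy: beam steering inside the FOV, both at the
# FOV edge, and mechanical joints only when the target is outside the FOV.
def choose_adjustment(target_angle_deg: float, fov_half_angle_deg: float = 45.0,
                      edge_margin_deg: float = 5.0) -> str:
    """Decide which control mode to use for a target at a given off-axis angle."""
    angle = abs(target_angle_deg)
    if angle <= fov_half_angle_deg - edge_margin_deg:
        return "electronic_beam_steering"               # target well inside the FOV
    if angle <= fov_half_angle_deg:
        return "beam_steering_and_mechanical_joints"    # target at the edge of the FOV
    return "mechanical_joints_only"                     # target not in the FOV
```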

System and method for orientating capture of ultrasound images

A downloadable navigator for a mobile ultrasound unit having an ultrasound probe, implemented on a portable computing device. The navigator includes a trained orientation neural network to receive a non-canonical image of a body part from the mobile ultrasound unit and to generate a transformation associated with the non-canonical image, the transformation transforming from a position and rotation associated with a canonical image to a position and rotation associated with the non-canonical image; and a result converter to convert the transformation into orientation instructions for a user of the probe and to provide and display the orientation instructions to the user to change the position and rotation of the probe.
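The sketch below illustrates one possible form of the result-converter step, mapping a residual translation and rotation (the offset still needed to reach the canonical pose) into coarse textual instructions for the probe user. The axis conventions, thresholds, and wording are assumptions made for illustration.

```python
# Hypothetical result converter: residual pose offset -> orientation instructions.
import numpy as np


def transform_to_instructions(translation_mm: np.ndarray, rotation_deg: np.ndarray,
                              move_thresh_mm: float = 5.0,
                              rot_thresh_deg: float = 5.0) -> list[str]:
    """Map a residual offset toward the canonical pose into coarse user instructions."""
    steps = []
    translation_axes = ["left/right", "toward head/feet", "press/lift"]   # assumed axis meanings
    for value, axis in zip(translation_mm, translation_axes):
        if abs(value) > move_thresh_mm:
            steps.append(f"move probe {value:+.0f} mm ({axis})")
    rotation_names = ["tilt", "rock", "rotate"]                           # assumed rotation order
    for value, name in zip(rotation_deg, rotation_names):
        if abs(value) > rot_thresh_deg:
            steps.append(f"{name} probe by {value:+.0f} degrees")
    return steps or ["hold position: canonical view reached"]
```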

PORTABLE ULTRASONIC MEASURING DEVICE SUITABLE FOR MEASURING PELVIC TILT

An ultrasound measuring device includes: a support bearing two ultrasound probes movable relative to each other by a slide link, each of the two probes being movable relative to the support by a ball-joint link, wherein the probes are capable of simultaneously acquiring two ultrasound images. The device includes a first set of measuring elements to measure a relative positioning of the probes, including one travel sensor and at least two orientation sensors. The device includes a second set of measuring elements to measure a positioning of the device relative to a reference plane, including at least one orientation sensor. The device localizes at least one point of interest on each of the two ultrasound images and processes data coming from the first and second sets of measuring elements, delivering a relative spatial position of the points of interest located in the images.
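
As a rough sketch of the final processing step, each localized image point can be lifted into a common device frame using that probe's orientation and the travel-sensor offset, after which the two points of interest can be compared directly. The frame conventions, pixel scale, and function names below are assumptions for illustration, not the patent's method.

```python
# Sketch: lift a 2-D image point into a common device frame using the probe pose.
import numpy as np


def image_point_to_device_frame(uv_px, pixel_mm, probe_rotation, probe_offset_mm):
    """uv_px: 2-D point in the ultrasound image; returns a 3-D point in the device frame (mm)."""
    p_probe = np.array([uv_px[0] * pixel_mm, uv_px[1] * pixel_mm, 0.0])  # point in the probe's image plane
    return probe_rotation @ p_probe + probe_offset_mm


# Relative spatial position of the two located points of interest:
# delta = image_point_to_device_frame(p1, ...) - image_point_to_device_frame(p2, ...)
```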

Image-based probe positioning

A framework for image-based probe positioning is disclosed herein. The framework receives a current image from a probe. The current image is acquired by the probe within a structure of interest. The framework predicts a position of the probe and generates a recommendation of a next maneuver to be performed using the probe by applying the current image to a trained classifier. The framework then outputs the predicted position and the recommendation of the next maneuver.
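The inference step might look like the sketch below, where a single call to the trained classifier returns both the predicted position and the recommended next maneuver. The label sets and the classifier interface (two lists of per-label probabilities) are hypothetical.

```python
# Hypothetical two-headed inference: one call yields position and next maneuver.
POSITIONS = ["antrum", "body", "fundus"]                                 # hypothetical position labels
MANEUVERS = ["advance", "withdraw", "rotate_cw", "rotate_ccw", "hold"]   # hypothetical maneuvers


def predict_position_and_maneuver(classifier, current_image):
    """Run the trained classifier once and read out both predictions."""
    position_probs, maneuver_probs = classifier(current_image)   # assumed: two probability lists
    position = POSITIONS[position_probs.index(max(position_probs))]
    maneuver = MANEUVERS[maneuver_probs.index(max(maneuver_probs))]
    return position, maneuver
```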

SYSTEMS, METHODS, APPARATUSES, AND COMPUTER-READABLE MEDIA FOR IMAGE MANAGEMENT IN IMAGE-GUIDED MEDICAL PROCEDURES
20230233264 · 2023-07-27 ·

Presented herein are methods, systems, devices, and computer-readable media for image management in image-guided medical procedures. Some embodiments herein allow a physician to use multiple instruments for a surgery and simultaneously provide image-guidance data for those instruments. Various embodiments disclosed herein provide information to physicians about procedures they are performing, the devices (such as ablation needles, ultrasound transducers or probes, scalpels, cauterizers, etc.) they are using during the procedure, the relative emplacements or poses of these devices, prediction information for those devices, and other information. Some embodiments provide useful information about 3D data sets and allow the operator to control the presentation of regions of interest. Additionally, some embodiments provide for quick calibration of surgical instruments or attachments for surgical instruments.