Patent classifications
A61B1/00004
Single source photoacoustic remote sensing (SS-PARS)
A photoacoustic remote sensing system for imaging a subsurface structure in a sample, comprising exactly one laser source configured to generate a pulsed or intensity-modulated excitation beam configured to generate ultrasonic pressure signals in the sample at an excitation location, and an interrogation beam incident on the sample at the excitation location, a portion of the interrogation beam returning from the sample that is indicative of the generated ultrasonic pressure signals, an optical system configured to focus the excitation beam and the interrogation beam below a surface of the sample, a detector configured to detect the returning portion of the interrogation beam, and a processor configured to calculate an image of the sample based on a detected intensity modulation of the returning portion of the interrogation beam from below the surface of the sample.
DISTAL TIP TRACKING AND MAPPING
Methods and systems for determining and mapping a location of a distal end region of an elongate shaft. An illustrative method may comprise obtaining data from an accelerometer located in the elongate shaft adjacent a distal end thereof, determining a length of the elongate shaft inserted into a body from a reference point, merging the accelerometer data and the length of the elongate shaft to localize the distal end region of the elongate shaft, reconstructing a line of travel of the elongate shaft within the body, and superimposing the reconstructed line of travel over an image of the patient's anatomy.
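The merging step above can be sketched as a simple dead-reckoning scheme. The following is a minimal illustration, not the patented method: it assumes the insertion length is sampled incrementally and that the accelerometer yields a pitch angle per increment, and it reconstructs a 2D line of travel from a reference point (the names and the 2D simplification are hypothetical).

```python
import math

def localize_tip(samples, reference=(0.0, 0.0)):
    """Hypothetical sketch: merge per-increment insertion lengths with
    accelerometer-derived pitch angles to dead-reckon the distal tip's
    position and line of travel from a reference point.

    samples: list of (inserted_length_delta_mm, pitch_radians) tuples.
    Returns the reconstructed line of travel as a list of (x, y) points.
    """
    x, y = reference
    path = [(x, y)]
    for delta_mm, pitch in samples:
        # Advance along the current heading by the newly inserted length.
        x += delta_mm * math.cos(pitch)
        y += delta_mm * math.sin(pitch)
        path.append((x, y))
    return path
```

The returned polyline is what would be superimposed over the anatomical image.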
Method and Apparatus for Extending Battery Life of Capsule Endoscope
A method for extending battery life and a capsule endoscope using the method are disclosed. According to this method, a peak current in a current profile consumed by the capsule endoscope is identified, where the peak current is contributed by at least two sub-tasks associated with operations of the capsule endoscope and said at least two sub-tasks are performed overlapping in time. A running voltage indicating a battery output voltage at or near time instances of the peak current is determined. When the running voltage meets a condition caused by IR (battery internal-resistance) voltage drop, the sub-tasks are adjusted to reduce overlapping so as to reduce the peak current to a second peak current. According to another method, at least one sub-task is switched to another sub-task when the running voltage is below a threshold.
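The overlap-reduction idea above can be illustrated with a small scheduling sketch. This is an assumption-laden toy, not the claimed implementation: sub-tasks are modeled as fixed current draws over time windows, and when the running voltage sags below a threshold the sketch simply serializes them (all field names and the serialization policy are hypothetical).

```python
def reduce_peak_current(subtasks, running_voltage, voltage_threshold):
    """Hypothetical sketch: if the running voltage falls below a
    threshold (an IR-drop condition), stagger overlapping sub-tasks so
    their currents no longer sum at the same instant.

    subtasks: list of dicts with 'start_ms', 'duration_ms', 'current_ma'.
    Returns (adjusted_schedule, resulting_peak_ma).
    """
    def peak(sched):
        # Sweep start/end events to find the maximum summed current.
        events = []
        for t in sched:
            events.append((t['start_ms'], t['current_ma']))
            events.append((t['start_ms'] + t['duration_ms'], -t['current_ma']))
        events.sort()
        level = peak_ma = 0
        for _, delta in events:
            level += delta
            peak_ma = max(peak_ma, level)
        return peak_ma

    schedule = [dict(t) for t in subtasks]
    if running_voltage < voltage_threshold:
        # Serialize: push each sub-task to start after the previous one ends.
        next_free = 0
        for t in sorted(schedule, key=lambda t: t['start_ms']):
            t['start_ms'] = max(t['start_ms'], next_free)
            next_free = t['start_ms'] + t['duration_ms']
    return schedule, peak(schedule)
```

With two overlapping 50 mA sub-tasks, the unadjusted peak is 100 mA; after serialization it drops to 50 mA, matching the abstract's "second peak current" notion.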
Wireless sensors for nerve integrity monitoring systems
A sensor including electrodes, a control module and a physical layer module. The electrodes are configured to (i) attach to a patient, and (ii) receive a first electromyographic signal from the patient. The control module is connected to the electrodes. The control module is configured to (i) detect the first electromyographic signal, and (ii) generate a first voltage signal. The physical layer module is configured to: receive a payload request from a console interface module or a nerve integrity monitoring device; and based on the payload request, (i) upconvert the first voltage signal to a first radio frequency signal, and (ii) wirelessly transmit the first radio frequency signal from the sensor to the console interface module or the nerve integrity monitoring device.
Medical manipulator system and image display method therefor
A medical manipulator system includes: an endoscope; a first manipulator equipped with a first treatment tool at a distal end thereof; a second manipulator equipped with a second treatment tool at a distal end thereof; a display for a user to view; and a controller configured to generate an image to be displayed on the display. The controller is configured to: acquire a first image taken by the endoscope, the first image contains the first treatment tool; and in response to determining that the second treatment tool does not exist in the first image: calculate a relative distance and a relative direction between the first treatment tool and the second treatment tool; generate a second image showing the relative distance and the relative direction between the first treatment tool and the second treatment tool; and send the first image and the second image to the display.
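The relative distance and direction calculation above reduces to vector arithmetic once both tool positions are known in a common frame (e.g. from the manipulators' kinematics). A minimal sketch under that assumption, with hypothetical names:

```python
import math

def relative_pose(tool_a, tool_b):
    """Hypothetical sketch: distance and unit direction from a first
    treatment tool to a second, for display when the second tool is
    outside the endoscope image.

    tool_a, tool_b: (x, y, z) positions in a common frame (mm).
    Returns (distance_mm, unit_direction_vector).
    """
    dx = tool_b[0] - tool_a[0]
    dy = tool_b[1] - tool_a[1]
    dz = tool_b[2] - tool_a[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist == 0.0:
        return 0.0, (0.0, 0.0, 0.0)
    return dist, (dx / dist, dy / dist, dz / dist)
```

The second image sent to the display would render this distance and direction, e.g. as an arrow pointing toward the off-screen tool.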
DEVICE AND METHOD FOR ASSISTING LAPAROSCOPIC SURGERY - DIRECTING AND MANEUVERING ARTICULATING TOOL
A surgical controlling system for controlling the 3D spatial position of at least one articulating surgical tool includes a controller configured to provide instructions to control movement of the surgical tool. The controller comprises a processor to determine a location of the surgical tool using at least one location estimating feature and to determine movement of the surgical tool using a database in communication with at least one movement detection feature. The location estimating feature is configured to locate, in real time, the 3D spatial position of the surgical tool at any given time t, and the movement detection feature is configured to detect movement of the surgical tool.
Wireless stimulation probe device for wireless nerve integrity monitoring systems
A stimulation probe device including a first electrode, a stimulation module, a control module and a physical layer module. The stimulation module is configured to (i) wirelessly receive a payload signal from a console interface module or a nerve integrity monitoring device, and (ii) supply a voltage or an amount of current to the first electrode to stimulate a nerve or a muscle in a patient. The control module is configured to generate a parameter signal indicating the voltage or the amount of current supplied to the electrode. The physical layer module is configured to (i) upconvert the parameter signal to a first radio frequency signal, and (ii) wirelessly transmit the first radio frequency signal from the stimulation probe to the console interface module or the nerve integrity monitoring device.
ELECTRONIC ENDOSCOPE SYSTEM
An electronic endoscope system includes: an electronic endoscope including, at a distal end portion, an image sensor that captures an image of a living tissue, and an ultrasound probe that applies ultrasonic waves to the living tissue to obtain an echo signal; a captured image processor including an image processing unit that processes an imaging signal output from the image sensor and generates a captured image; and an ultrasonic image processor including an ultrasonic image processing unit that processes the echo signal output from the ultrasound probe and generates an ultrasonic image, a noise detection unit that detects a periodic noise component included in the echo signal and generated at a level equal to or higher than a preset threshold level, and a noise suppression unit that performs processing of suppressing the detected noise component.
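The periodic-noise detection and suppression described above can be sketched in the frequency domain. This is an illustrative assumption, not the patented processing chain: a periodic interference component in an echo line shows up as a spectral bin whose magnitude crosses a preset threshold, and suppression is modeled as notching that bin out (the function name and threshold semantics are hypothetical).

```python
import numpy as np

def suppress_periodic_noise(echo, threshold):
    """Hypothetical sketch: flag spectral bins of an echo line whose
    magnitude meets or exceeds a preset threshold (treated as periodic
    noise) and zero them out.

    echo: 1-D real-valued echo signal; threshold: magnitude cutoff.
    Returns (cleaned_signal, detected_bin_indices).
    """
    spectrum = np.fft.rfft(echo)
    mags = np.abs(spectrum)
    # Ignore the DC bin; flag bins whose magnitude crosses the threshold.
    noisy = [i for i in range(1, len(mags)) if mags[i] >= threshold]
    for i in noisy:
        spectrum[i] = 0.0  # notch out the detected periodic component
    return np.fft.irfft(spectrum, n=len(echo)), noisy
```

For a pure sinusoidal interferer the flagged bin corresponds to its frequency, and the cleaned signal is essentially zero.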
Surgical Systems with Intraluminal and Extraluminal Cooperative Instruments
Surgical systems are provided. In one exemplary embodiment, a surgical system includes a first scope device having a first portion within an extraluminal space and a second portion positioned within an intraluminal space. The first scope device transmits image data of a first scene. A second scope device is disposed within the extraluminal space and transmits image data of a second scene. The first portion of the first scope device is present within the field of view of the second scope device to track the first scope device relative to the second scope device. A controller receives the transmitted image data of the first and second scenes, determines a relative distance from the first scope device to the second scope device within the extraluminal space, and provides a merged image. At least one of the first and second scope devices in the merged image is a representative depiction thereof.