Patent classifications
A61B1/00043
POWERED ENDOSCOPIC DEVICE WITH HAPTIC FEEDBACK
An endoscopic deployment device includes a body mountable on an endoscopic device, the body having a movable carrier couplable to an end effector device, the end effector device having an end effector shaft covered by an outer sheath and an end effector extending from a distal end of the end effector shaft, the outer sheath being sized and shaped for insertion through a working channel of the endoscopic device, the body having a carrier channel for the carrier to slide therein, wherein the end effector is actuatable between an open position and a closed position; and a motor having a drive shaft coupled to the carrier, rotation of the drive shaft sliding the carrier in the carrier channel and actuating the end effector in response to a signal from one or more actuation buttons; wherein at least one vibration motor generates vibrations as an angular position of the motor changes.
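The haptic-feedback behavior described above can be sketched as a simple control loop: vibration is generated while the drive motor's angular position is changing. This is an illustrative sketch only; the class name, threshold, and encoder interface are assumptions, not details from the patent.

```python
# Hypothetical sketch: energize a vibration motor whenever the drive
# motor's angular position changes. All names are illustrative.

class HapticController:
    def __init__(self, threshold_deg: float = 0.5):
        self.threshold = threshold_deg   # minimum change that triggers feedback
        self.last_angle = None           # last observed drive-shaft angle (degrees)
        self.vibration_on = False

    def update(self, angle_deg: float) -> bool:
        """Return True if the vibration motor should be energized."""
        if self.last_angle is None:
            self.last_angle = angle_deg
            self.vibration_on = False
        else:
            delta = abs(angle_deg - self.last_angle)
            self.vibration_on = delta >= self.threshold
            self.last_angle = angle_deg
        return self.vibration_on
```

In use, `update()` would be called on each encoder sample, so feedback tracks motion of the drive shaft rather than button state.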
Combination emitter and camera assembly
A system and method for controlling an emitter assembly comprising a single electromagnetic radiation source for visualizing a surgical site. The emitter assembly comprises a light valve assembly that is coupled to a control circuit. The emitter assembly is configured to emit visible light, infrared radiation, or a combination thereof in either structured or unstructured formats. The control circuit controls the light valve assembly to select which emitter of the emitter assembly emits electromagnetic radiation. The light valve assembly can include light valves for controlling whether an emitter receives electromagnetic radiation. Further, the control circuit can control the wavelength of the electromagnetic radiation emitted by the source in accordance with which emitter is receiving electromagnetic radiation.
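The valve-selection logic above can be sketched in a few lines: one valve opens, the rest close, and the commanded source wavelength follows the selected emitter. The emitter names and wavelengths here are assumptions for illustration, not values from the abstract.

```python
# Illustrative mapping from emitter to source wavelength (assumed values).
EMITTER_WAVELENGTH_NM = {
    "visible": 550,      # unstructured visible light
    "infrared": 850,     # infrared illumination
    "structured": 550,   # structured-light pattern, visible band
}

def select_emitter(valves: dict, emitter: str):
    """Open the light valve for one emitter, close the others, and return
    the new valve states plus the wavelength to command at the source."""
    if emitter not in valves:
        raise ValueError(f"unknown emitter: {emitter}")
    new_valves = {name: (name == emitter) for name in valves}
    return new_valves, EMITTER_WAVELENGTH_NM[emitter]
```

The single-source design means wavelength is a property of the source, so it must be retuned whenever a different emitter's valve opens.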
ROBOTIC SYSTEMS WITH SEPARATE PHOTOACOUSTIC RECEIVERS
A surgical robotic visualization system comprises a first robotic arm, a second robotic arm, a photoacoustic receiver coupled to the first robotic arm, an emitter assembly coupled to the second robotic arm, and a control circuit. The control circuit is configured to cause the emitter assembly to emit electromagnetic radiation toward an anatomical structure at a plurality of wavelengths capable of penetrating the anatomical structure and reaching an embedded structure located below a surface of the anatomical structure, receive an input of the photoacoustic receiver indicative of an acoustic response signal of the embedded structure, and detect the embedded structure based on the input from the photoacoustic receiver.
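The detection step described above amounts to a wavelength sweep: emit at each tissue-penetrating wavelength, read the photoacoustic receiver, and flag an embedded structure when the acoustic response is strong enough. This is a hedged sketch; `emit`, `receive`, and the threshold are stand-ins for the robot's actual hardware interface.

```python
# Sketch of multi-wavelength photoacoustic detection (hypothetical API).
def detect_embedded_structure(wavelengths_nm, emit, receive,
                              threshold: float) -> bool:
    """Return True if any wavelength produces an acoustic response above
    threshold, indicating a structure below the tissue surface."""
    for wl in wavelengths_nm:
        emit(wl)                 # emitter assembly on the second robotic arm
        response = receive()     # photoacoustic receiver on the first arm
        if response > threshold:
            return True
    return False
```

Because the receiver and emitter sit on separate arms, the acoustic pickup can be positioned independently of the illumination angle.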
INSTRUMENT ROLL CONTROL
A medical instrument includes an elongate shaft defining a roll axis and a handle coupled to the elongate shaft. The handle includes a robotic drive input operable to rotate the elongate shaft with respect to the handle about the roll axis and a lockout mechanism movable between an engaged position in which the lockout mechanism impedes rotation of the elongate shaft with respect to the handle about the roll axis, and a disengaged position in which the lockout mechanism permits rotation of the elongate shaft with respect to the handle about the roll axis.
APPARATUS, SYSTEMS, AND METHODS FOR INTRAOPERATIVE VISUALIZATION
Errors in a blended stream that would result in non-display or obscuring of a live video stream from a medical device may be automatically detected, and a failover stream corresponding to the first live video stream may be displayed to medical personnel. For example, one or more second input streams that are being blended may contain no data or invalid data, which may result in the blended stream not displaying (or obscuring) the live video stream if the blended stream were displayed. Switching from blending to a failover buffer may occur within the time to process a single video image frame. Upon detection (prior to display) that the blended stream would not display the live video stream, display of the live video stream from the failover buffer may be initiated. Other aspects are also described and claimed.
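The per-frame failover decision above can be sketched directly: validate each composited frame before display, and fall back to the buffered live frame when validation fails. The frame objects and `is_valid` predicate are hypothetical stand-ins for the blending pipeline.

```python
# Sketch of the pre-display failover check (all names are assumptions).
def choose_display_frame(blended_frame, failover_frame, is_valid):
    """Display the blended frame only if it would actually show the live
    video stream; otherwise fall back to the buffered live frame. The
    check runs per frame, so switching stays within one frame time."""
    if blended_frame is not None and is_valid(blended_frame):
        return blended_frame
    return failover_frame
```

Checking validity before display, rather than after, is what allows the switch to complete without a visible dropout.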
CAMERA NAVIGATION TRAINING SYSTEM
A system for training surgical camera navigation skills is provided. A plurality of two-dimensional targets is printed on an upper surface of a flat sheet of material. The sheet is easily transportable and placed onto a base of a typical box trainer that defines a simulated abdominal cavity between the base and a top. A scope is inserted through a port in the top, and the targets are viewed on a live video feed displayed to a trainee on a screen, the targets being otherwise obscured from view by the box trainer. The trainee can move the scope back and forth, and roll and angulate it about the port, in order to view the targets on the sheet at different angles and distances. The trainee is instructed to follow a sequence of targets marked on the sheet and manipulate the scope to consecutively align each target with the edges of the screen in the sequence provided.
Surgical visualization of multiple targets
A surgical visualization system is disclosed. The surgical visualization system is configured to identify one or more structure(s) and/or determine one or more distances with respect to obscuring tissue and/or the identified structure(s). The surgical visualization system can facilitate avoidance of the identified structure(s) by a surgical device. The surgical visualization system can comprise a first emitter configured to emit a plurality of tissue-penetrating light waves and a second emitter configured to emit structured light onto the surface of tissue. The surgical visualization system can also include an image sensor configured to detect reflected visible light, tissue-penetrating light, and/or structured light. The surgical visualization system can convey information to one or more clinicians regarding the position of one or more hidden identified structures and/or provide one or more proximity indicators.
Force sensor through structured light deflection
A surgical visualization system is disclosed. The surgical visualization system includes a control circuit communicatively coupled to a straight line laser source, a structured light emitter, and an image sensor; and a memory communicatively coupled to the control circuit. The memory stores instructions which, when executed, cause the control circuit to control the straight line laser source to project a straight laser line reference; control the structured light emitter to emit a structured light pattern onto a surface of an element of a surgical device; control the image sensor to detect the projected straight laser line and structured light reflected from the surface of the element of the surgical device; and determine a position of the element of the surgical device relative to the projected straight laser line reference.
Multi-organ imaging system with a single, multi-examination illumination unit
A multi-organ imaging system including a camera lens, a stationary, multi-examination illumination unit (SMEIU), and an attachment holder is provided. An industrial camera unit (ICU) for imaging multiple organs, for example, ear, nose, throat, and skin, is housed in a camera body. The camera lens has a fixed focal length and an iris for optimizing examination and imaging of the organs. The SMEIU is integrated to the camera body and includes illuminators arranged in a geometrical configuration. The attachment holder accommodates an organ examination attachment selected for examining an organ. The illuminators, in optical communication with one or more reflective surfaces in the organ examination attachment, produce shadowless illumination during examination and imaging of each organ, without requiring replacement of the SMEIU for examining each organ. A display unit, accommodated in a display holder detachably attached to the camera body, assists in aiming the camera lens and visualizing each organ.
SURGICAL VISUALIZATION AND MONITORING
A surgical visualization system is disclosed. The surgical visualization system is configured to identify one or more structure(s) and/or determine one or more distances with respect to obscuring tissue and/or the identified structure(s). The surgical visualization system can facilitate avoidance of the identified structure(s) by a surgical device. The surgical visualization system can comprise a first emitter configured to emit a plurality of tissue-penetrating light waves and a second emitter configured to emit structured light onto the surface of tissue. The surgical visualization system can also include an image sensor configured to detect reflected visible light, tissue-penetrating light, and/or structured light. The surgical visualization system can convey information to one or more clinicians regarding the position of one or more hidden identified structures and/or provide one or more proximity indicators. In various instances, a robotic camera of the surgical visualization system can monitor and track one or more tagged structures.