Patent classifications
A61B2090/304
Stapling assembly including a controller for monitoring a clamping load
Various examples are directed to systems and methods for operating a surgical instrument comprising a firing member translatable proximally and distally along a longitudinal axis from a stroke begin position to a stroke end position distal of the stroke begin position; a knife coupled to the firing member; and a motor coupled to the firing member to translate the firing member between the stroke begin position and the stroke end position. A control circuit may receive a firing signal and begin a firing member stroke by providing an initial motor setting to the motor. The control circuit may maintain the initial motor setting for an open-loop portion of the firing member stroke. The control circuit may receive firing member motion data describing a motion of the firing member during the open-loop portion of the firing member stroke and may select a firing control program based at least in part on that motion.
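The program-selection step described in this abstract could be sketched as follows. This is a hypothetical illustration only: the function name, the velocity heuristic, and the threshold are assumptions, not details from the patent.

```python
# Hypothetical sketch: selecting a firing control program from
# firing-member motion data gathered during the open-loop portion
# of the stroke. All names and thresholds are illustrative.

def select_firing_program(motion_samples_mm, open_loop_time_s,
                          slow_threshold_mm_s=2.0):
    """Choose a control program from the average firing-member
    velocity observed during the open-loop portion of the stroke.

    motion_samples_mm: firing-member positions (mm from stroke
    begin) sampled during the open-loop portion.
    """
    if len(motion_samples_mm) < 2:
        raise ValueError("need at least two position samples")
    distance = motion_samples_mm[-1] - motion_samples_mm[0]
    velocity = distance / open_loop_time_s
    # Slow travel under the fixed initial motor setting suggests
    # higher tissue resistance, so pick a slower, higher-torque
    # program; otherwise keep the nominal program.
    if velocity < slow_threshold_mm_s:
        return "thick_tissue_program"
    return "nominal_program"
```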
System and method for determining, adjusting, and managing resection margin about a subject tissue
A surgical visualization system that can include a structured light emitter, a spectral light emitter, an image sensor, and a control circuit is disclosed herein. The structured light emitter can emit a structured pattern of electromagnetic radiation onto an anatomical structure. The spectral light emitter can emit electromagnetic radiation including a plurality of wavelengths. At least one of the wavelengths can penetrate a portion of the anatomical structure and reflect off a subject tissue. The image sensor can detect the structured pattern of electromagnetic radiation reflected off the anatomical structure and the at least one wavelength reflected off the subject tissue. The control circuit can receive signals from the image sensor, construct a model of the anatomical structure, detect a location of the subject tissue, and determine a margin about the subject tissue, based on at least one signal received from the image sensor.
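The final step of this abstract, determining a margin about the detected subject tissue, could be reduced to a simple geometric sketch. The 2-D circular margin below is purely illustrative; the patent's margin determination operates on a constructed 3-D model.

```python
import math

# Hypothetical sketch: computing points on a resection-margin
# boundary at a fixed distance around a detected tissue location
# (2-D simplification; all names are illustrative).

def margin_points(center, radius_mm, n=8):
    """Return n points on a circle of radius radius_mm around the
    detected tissue location `center` (an (x, y) pair in mm)."""
    cx, cy = center
    return [(cx + radius_mm * math.cos(2 * math.pi * k / n),
             cy + radius_mm * math.sin(2 * math.pi * k / n))
            for k in range(n)]
```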
Combination emitter and camera assembly
A system and method are disclosed for controlling an emitter assembly comprising a single electromagnetic radiation source for visualizing a surgical site. The emitter assembly comprises a light valve assembly that is coupled to a control circuit. The emitter assembly is configured to emit visible light, infrared radiation, or a combination thereof in either structured or unstructured formats. The control circuit is configured to control the light valve assembly to select which emitter of the emitter assembly is emitting electromagnetic radiation. The light valve assembly can include light valves for controlling whether an emitter receives electromagnetic radiation. Further, the control circuit can control the wavelength of the electromagnetic radiation emitted by the source in accordance with which emitter is receiving electromagnetic radiation.
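The valve-routing logic this abstract describes could be sketched as below. The emitter names, wavelengths, and valve representation are assumptions for illustration; the patent does not specify them.

```python
# Hypothetical sketch: a single radiation source feeds several
# emitters; opening exactly one light valve selects which emitter
# radiates, and the source wavelength is set to match it.

EMITTER_WAVELENGTH_NM = {
    "visible_structured": 532,
    "visible_unstructured": 532,
    "infrared": 850,
}

def configure(selected_emitter):
    """Return (valve open/closed states, source wavelength in nm)
    for the selected emitter."""
    if selected_emitter not in EMITTER_WAVELENGTH_NM:
        raise ValueError(f"unknown emitter: {selected_emitter}")
    valves = {name: (name == selected_emitter)
              for name in EMITTER_WAVELENGTH_NM}
    return valves, EMITTER_WAVELENGTH_NM[selected_emitter]
```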
Adaptive visualization by a surgical system
Various adaptive surgical visualization systems are disclosed. Surgical visualizations can compensate for obscured, incomplete, damaged, or interfered-with portions of captured images by substituting those portions of the images with corresponding portions of other images. The other images could include images that were previously generated by the surgical visualization system or images that were generated using multispectral imaging techniques.
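The substitution described here amounts to per-pixel compositing. A minimal sketch, using plain nested lists as frames and `None` to mark an obscured pixel (both representational assumptions, not details from the patent):

```python
# Hypothetical sketch: replacing obscured pixels of a captured
# frame with the corresponding pixels of a fallback image (e.g. a
# previously generated frame or a multispectral image).

def composite(current, fallback):
    """Return `current` with every obscured (None) pixel replaced
    by the corresponding pixel from `fallback`."""
    return [[cur if cur is not None else fb
             for cur, fb in zip(row_c, row_f)]
            for row_c, row_f in zip(current, fallback)]
```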
Robotic systems with separate photoacoustic receivers
A surgical robotic visualization system comprises a first robotic arm, a second robotic arm, a photoacoustic receiver coupled to the first robotic arm, an emitter assembly coupled to the second robotic arm, and a control circuit. The control circuit is configured to cause the emitter assembly to emit electromagnetic radiation toward an anatomical structure at a plurality of wavelengths capable of penetrating the anatomical structure and reaching an embedded structure located below a surface of the anatomical structure, receive an input of the photoacoustic receiver indicative of an acoustic response signal of the embedded structure, and detect the embedded structure based on the input from the photoacoustic receiver.
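The detection step, taking the photoacoustic receiver's input and deciding whether an embedded structure is present, could be sketched as an amplitude threshold per wavelength. The data layout and threshold are illustrative assumptions.

```python
# Hypothetical sketch: detecting an embedded structure from the
# photoacoustic receiver input. Each emitted wavelength yields an
# acoustic response amplitude; a structure is reported at the
# wavelengths whose response exceeds a detection threshold.

def detect_embedded(responses, threshold=0.5):
    """responses: {wavelength_nm: acoustic amplitude}. Return the
    sorted wavelengths indicating an embedded structure."""
    return sorted(wl for wl, amp in responses.items()
                  if amp > threshold)
```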
Surgical visualization of multiple targets
A surgical visualization system is disclosed. The surgical visualization system is configured to identify one or more structure(s) and/or determine one or more distances with respect to obscuring tissue and/or the identified structure(s). The surgical visualization system can facilitate avoidance of the identified structure(s) by a surgical device. The surgical visualization system can comprise a first emitter configured to emit a plurality of tissue-penetrating light waves and a second emitter configured to emit structured light onto the surface of tissue. The surgical visualization system can also include an image sensor configured to detect reflected visible light, tissue-penetrating light, and/or structured light. The surgical visualization system can convey information to one or more clinicians regarding the position of one or more hidden identified structures and/or provide one or more proximity indicators.
Force sensor through structured light deflection
A surgical visualization system is disclosed. The surgical visualization system includes a control circuit communicatively coupled to a straight line laser source, a structured light emitter, and an image sensor; and a memory communicatively coupled to the control circuit. The memory stores instructions which, when executed, cause the control circuit to control the straight line laser source to project a straight laser line reference; control the structured light emitter to emit a structured light pattern onto a surface of an element of a surgical device; control the image sensor to detect the projected straight laser line and structured light reflected from the surface of the element of the surgical device; and determine a position of the element of the surgical device relative to the projected straight laser line reference.
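Per the title, the measured position relative to the laser-line reference serves as a deflection that can be mapped to force. A minimal sketch of that mapping, assuming a linear (spring-like) element; the scale and stiffness constants are illustrative, not from the patent:

```python
# Hypothetical sketch: the image sensor reports where the element's
# surface appears (via reflected structured light) relative to the
# projected straight-line reference; the pixel offset gives a
# deflection, which a linear spring model converts to force.

def deflection_force(reference_px, observed_px, mm_per_px=0.1,
                     stiffness_n_per_mm=4.0):
    """Return (deflection in mm, inferred force in N)."""
    deflection_mm = (observed_px - reference_px) * mm_per_px
    return deflection_mm, deflection_mm * stiffness_n_per_mm
```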
Surgical instrument with multiple program responses during a firing motion
A surgical instrument is disclosed. The surgical instrument includes an elongated channel configured to support a staple cartridge, an anvil pivotably connected to the elongated channel, a knife mechanically coupled to the staple cartridge, an electric motor, and a control circuit electrically connected to the electric motor. The control circuit is configured to change a firing motion in a first way based on a first value of a projected peak firing force and in a second way based on a second value of the projected peak firing force.
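The two-way response to the projected peak firing force could be sketched as below. The threshold, speed values, and the choice of motor speed as the adjusted quantity are all assumptions for illustration.

```python
# Hypothetical sketch: adjusting the firing motion in one of two
# ways depending on the projected peak firing force.

def firing_response(projected_peak_force_n,
                    nominal_speed_mm_s=10.0,
                    high_force_threshold_n=150.0):
    """Return a motor speed command for the rest of the stroke."""
    if projected_peak_force_n >= high_force_threshold_n:
        # First response: slow the stroke to limit the peak force.
        return nominal_speed_mm_s * 0.5
    # Second response: proceed at the nominal firing speed.
    return nominal_speed_mm_s
```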
Loupe-based intraoperative fluorescence imaging device for the guidance of tumor resection
This application concerns a loupe-based wearable device enhanced by a visualization aid mounted on the housing body of at least one of the loupe eyepieces. The aid provides a dual light source, a beam splitter, and a camera directed along the same optical path as the user's eyesight, so that both visible light and fluorescent-dye-exciting light can be directed at the operative site to enhance real-time visualization of tissue resection.
Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto
A method of imaging a surgical site is disclosed. The method comprises obtaining, by a controller of an automated surgical hub system, first image data of a surgical site; controlling, by the controller, at least one illumination source to illuminate a visible surface of the surgical site in a first manner by projecting structured light onto the visible surface; obtaining, by the controller, second image data of the visible surface of the surgical site under illumination in the first manner by the at least one illumination source; calculating, by the controller, a three-dimensional model of the visible surface based on the second image data obtained by the controller; and integrating, by the controller, the three-dimensional model of the visible surface with the first image data of the surgical site. The second image data is based on sensing the structured light projected onto the visible surface.
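The model-calculation step of this method could be sketched with a standard structured-light idea: the displacement of the projected pattern in the second image data is proportional to surface height. The linear approximation and its constants below are illustrative assumptions, not the patent's actual reconstruction.

```python
# Hypothetical sketch of the structured-light step: per-pixel
# stripe displacement in the second image data is converted to
# surface height, yielding a simple height-map "model" of the
# visible surface. Geometry constants are illustrative.

def height_map(displacements_px, mm_per_px=0.05,
               baseline_factor=2.0):
    """Convert per-pixel stripe displacements (pixels) into
    surface heights (mm) via a linear triangulation
    approximation."""
    return [[d * mm_per_px * baseline_factor for d in row]
            for row in displacements_px]
```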