Y10S901/09

Deformable sensors and methods for detecting pose and force against an object

Systems and methods for detecting pose and force against an object are provided. A method includes receiving a signal from a deformable sensor, the signal comprising data from a deformation region in a deformable membrane resulting from contact with the object, utilizing an internal sensor disposed within an enclosure and having a field of view directed through a medium and toward a bottom surface of the deformable membrane. The method determines a pose of the object based on the deformation region of the deformable membrane, and determines an amount of force applied between the deformable membrane and the object based on the same deformation region.
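The pose-and-force idea can be sketched from the internal sensor's point of view: given a depth map of the membrane's bottom surface, the contact region gives a crude pose (its centroid) and the displacement gives a force estimate. The per-pixel linear-spring model, the 1 mm contact threshold, and the stiffness constant below are illustrative assumptions, not the patent's method.

```python
import numpy as np

def estimate_contact(depth_map, k_per_pixel=2.0, threshold=0.001):
    """Estimate contact pose and force from membrane deformation.

    depth_map: 2-D array of membrane displacement in metres, as imaged
    by the internal sensor. Returns (contact centroid in pixel
    coordinates, estimated normal force in newtons). The linear-spring
    force model and k_per_pixel (N/m per pixel) are assumptions.
    """
    mask = depth_map > threshold          # pixels displaced > 1 mm
    if not mask.any():
        return None, 0.0                  # no contact detected
    ys, xs = np.nonzero(mask)
    centroid = (float(xs.mean()), float(ys.mean()))
    force = float(k_per_pixel * depth_map[mask].sum())
    return centroid, force
```

A richer pose estimate would fit a plane or object model to the deformation region rather than taking its centroid.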

Configurable robotic surgical system with virtual rail and flexible endoscope

Systems and methods for moving or manipulating robotic arms are provided. A group of robotic arms is configured to form a virtual rail, or line, between the end effectors of the robotic arms. The robotic arms are responsive to outside forces, such as those applied by a user. When a user moves a single one of the robotic arms, the other robotic arms automatically move to maintain the virtual rail alignment. The virtual rail of the robotic arm end effectors may be translated in one or more of three dimensions. The virtual rail may be rotated about a point on the virtual rail line. The robotic arms can detect the nature of the contact from the user and move accordingly. Holding, shaking, tapping, pushing, pulling, and rotating different parts of a robotic arm elicit different movement responses from different parts of the arm.
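The rail-maintenance behaviour can be sketched for the translation case: when the user drags one end effector, applying the same offset to every effector preserves both collinearity and spacing. Treating the rail as a rigid set of points is an illustrative simplification of the disclosed behaviour.

```python
import numpy as np

def maintain_virtual_rail(effectors, moved_idx, new_pos):
    """Translate a virtual rail of end effectors.

    effectors: (N, 3) array of end-effector positions lying on a
    common line (the virtual rail). When the user drags effector
    `moved_idx` to `new_pos`, shift every effector by the same offset
    so the rail translates rigidly. Rotation about a point on the rail
    would instead apply a common rotation matrix about that point.
    """
    effectors = np.asarray(effectors, dtype=float)
    delta = np.asarray(new_pos, dtype=float) - effectors[moved_idx]
    return effectors + delta              # rigid translation of the rail
```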

Automated heart valve manufacturing devices and methods

An automated system that can be used for prosthetic heart valve manufacturing or suturing procedures. The system can include a first automated fixture that includes an articulating arm and a target device holder. The system can also include one or more additional automated fixtures, which can be configured as one or more suturing arms that include another articulating arm and a needle holder. The first automated fixture can be configured to rotate a target device held by the holder to allow the one or more additional automated fixtures to perform operations, such as forming sutures, on the target device without intervention of a human operator. The system can include a targeting system configured to provide positioning feedback to the system.
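The coordination loop implied by the abstract — rotate the target to present each suture site, correct the position using targeting feedback, then form the suture — can be sketched as follows. The `rotator`, `suture_arm`, and `targeting` interfaces, the tolerance, and the evenly spaced sites are all hypothetical, not from the disclosure.

```python
def run_suture_pass(rotator, suture_arm, targeting, n_sites, tol_mm=0.1):
    """Coordinate the two automated fixtures for one pass of sutures.

    The first fixture (rotator) presents each of n_sites evenly spaced
    suture sites, the targeting system reports the residual positioning
    error in millimetres, and the suturing arm forms a suture once the
    site is within tolerance. All three interfaces are hypothetical.
    """
    for site in range(n_sites):
        rotator.rotate_to(site * 360.0 / n_sites)     # degrees
        while abs(targeting.position_error()) > tol_mm:
            rotator.adjust(targeting.position_error())  # servo on feedback
        suture_arm.form_suture()
```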

Surgical system and method utilizing impulse modeling for controlling an instrument
11471232 · 2022-10-18

A surgical system for applying an energy applicator to a target tissue, and methods of operating the same, are disclosed. The energy applicator extends from a surgical instrument. The surgical system comprises a sensor to measure external forces and torques placed on the surgical instrument. A surgical manipulator is configured to move the energy applicator in a manual mode in response to the external forces and torques. At least one controller is configured to: model the surgical instrument and the energy applicator as a virtual rigid body having a virtual mass; calculate, using impulse modeling, constraining forces and torques to be applied to the virtual rigid body; determine total forces and torques based on the external forces and torques and the constraining forces and torques; and advance the energy applicator in the manual mode based on the total forces and torques.
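The core of the controller's loop — sum the measured external force with the impulse-derived constraining force, then advance the virtual rigid body — can be sketched with a single integration step. Semi-implicit Euler integration and the scalar virtual mass are illustrative choices; the disclosed system also handles torques and rotational motion.

```python
import numpy as np

def advance_virtual_body(pos, vel, f_external, f_constraint,
                         mass=1.0, dt=0.001):
    """One manual-mode step for the instrument's virtual rigid body.

    Total force = measured external force + constraining force from
    the impulse model; the body is then advanced by semi-implicit
    Euler integration (velocity first, then position).
    """
    pos = np.asarray(pos, dtype=float)
    vel = np.asarray(vel, dtype=float)
    f_total = np.asarray(f_external, float) + np.asarray(f_constraint, float)
    vel = vel + (f_total / mass) * dt     # a = F_total / m_virtual
    pos = pos + vel * dt
    return pos, vel
```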

Characterising robot environments
11597094 · 2023-03-07

A method for characterising the environment of a first robot, the robot having a flexible arm with a plurality of joints, a datum carried by the arm, a plurality of drivers arranged to drive the joints, and a plurality of position sensors for sensing the position of each of the joints. The method comprises: contacting the datum carried by the arm with a first datum on a second robot in the environment of the first robot, wherein the second robot has a flexible arm with a plurality of joints and a plurality of drivers arranged to drive those joints; calculating, in dependence on the outputs of the position sensors, a distance between a reference location defined in a frame of reference local to the first robot and the first datum; and controlling the drivers to reconfigure the first arm in dependence on at least the calculated distance.
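The distance calculation relies on forward kinematics: the joint position sensors locate the datum at the arm's tip in the robot's local frame. A planar serial arm keeps the sketch short; the planar model and revolute-only joints are illustrative simplifications of the patent's flexible arm.

```python
import numpy as np

def datum_distance(joint_angles, link_lengths, reference=(0.0, 0.0)):
    """Distance from a reference point to the arm-tip datum.

    Forward kinematics for a planar serial arm: accumulate joint
    angles (radians) and link lengths along the chain to locate the
    datum at the tip, then measure its distance from a reference
    location in the robot's local frame.
    """
    theta, x, y = 0.0, 0.0, 0.0
    for angle, length in zip(joint_angles, link_lengths):
        theta += angle                    # joint angles compound along the chain
        x += length * np.cos(theta)
        y += length * np.sin(theta)
    return float(np.hypot(x - reference[0], y - reference[1]))
```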

Robotic control
11472026 · 2022-10-18

Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for controlling robot lighting. One or more non-real-time processors receive data defining a light illumination pattern for a robotic device and generate, based on the data, a spline that represents the light illumination pattern, where a knot vector of the spline defines a timing profile of the pattern. The spline is provided to one or more real-time processors of the robotic system, which calculate an illumination value from the spline at each of a plurality of time steps and control illumination of a lighting display of the robotic system in accordance with the illumination value at each respective time step.
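The real-time side of this split can be sketched by evaluating a spline at fixed control ticks. A degree-1 (piecewise-linear) spline keeps the example dependency-free: its knot vector is simply the sequence of times at which each control brightness is reached, which is exactly the timing profile. The degree-1 choice is an illustrative simplification; the patent does not restrict the spline's degree.

```python
import numpy as np

def illumination_values(knots, control_values, time_steps):
    """Evaluate a light-illumination pattern encoded as a spline.

    For a degree-1 B-spline the knot vector `knots` gives the times
    at which each brightness in `control_values` is reached (the
    timing profile), so evaluation reduces to linear interpolation.
    A real-time loop would call this once per control tick.
    """
    return np.interp(time_steps, knots, control_values)
```

The non-real-time processors would fit `knots` and `control_values` from the pattern definition; the real-time processors only evaluate, which keeps the per-tick cost small and deterministic.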

Robot navigation using 2D and 3D path planning
20230123298 · 2023-04-20

Methods, systems, and apparatus, including computer-readable storage devices, for robot navigation using 2D and 3D path planning. In the disclosed method, a robot accesses map data indicating a two-dimensional layout of objects in a space and evaluates candidate paths for the robot to traverse. In response to determining that the candidate paths do not include a collision-free path across the space for a two-dimensional profile of the robot, the robot evaluates a three-dimensional shape of the robot with respect to a three-dimensional shape of an object in the space. Based on the evaluation of the three-dimensional shapes, the robot determines a collision-free path to traverse through the space.
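The 2D-first, 3D-fallback strategy can be sketched on a grid where each cell stores the vertical clearance above it: the 2D pass treats any obstacle as blocking, and the 3D pass lets the robot pass wherever the clearance exceeds its height (e.g. under a table top). The grid model, BFS planner, and clearance rule are illustrative assumptions, not the disclosed method.

```python
from collections import deque

def find_path(clearance, start, goal, robot_height):
    """2D-first planning with a 3D fallback on a grid map.

    clearance[r][c] is the vertical free space above cell (r, c) in
    metres: 0 for a solid wall, float('inf') for open floor, and a
    finite value under an overhanging object.
    """
    def bfs(passable):
        rows, cols = len(clearance), len(clearance[0])
        prev, queue = {start: None}, deque([start])
        while queue:
            cell = queue.popleft()
            if cell == goal:              # reconstruct path back to start
                path = []
                while cell is not None:
                    path.append(cell)
                    cell = prev[cell]
                return path[::-1]
            r, c = cell
            for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                nr, nc = nxt
                if (0 <= nr < rows and 0 <= nc < cols
                        and nxt not in prev and passable(clearance[nr][nc])):
                    prev[nxt] = cell
                    queue.append(nxt)
        return None

    # 2D pass: conservative profile, any obstacle blocks the cell
    path = bfs(lambda h: h == float('inf'))
    if path is None:
        # 3D pass: the robot fits wherever clearance covers its height
        path = bfs(lambda h: h >= robot_height)
    return path
```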

User-assisted robotic control systems

Exemplary embodiments relate to user-assisted robotic control systems, user interfaces for remote control of robotic systems, vision systems in robotic control systems, and modular grippers for use by robotic systems. The described systems, methods, apparatuses, and computer-readable media instructions interact with and control robotic systems, in particular pick-and-place systems that use soft robotic actuators to grasp, move, and release target objects.

Systems and methods for establishing virtual constraint boundaries

Systems and methods are disclosed involving an instrument, an instrument tracking device for tracking movement of the instrument, a first boundary tracking device for tracking movement of a first virtual boundary associated with an anatomy of a patient to be treated by the instrument, and a second boundary tracking device for tracking movement of a second virtual boundary associated with a periphery of an opening in the patient to be avoided by the instrument. One or more controllers receive information associated with the tracking devices including positions of the instrument relative to the first and second virtual boundaries, detect movement of the first and second virtual boundaries relative to one another, and generate a response upon detecting movement of the first and second virtual boundaries relative to one another.
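The controllers' detection step — comparing how the two tracked boundaries move relative to one another — can be sketched by comparing the offset between the boundaries across two tracking samples. Reducing each tracked boundary's pose to a 3-D position and the fixed tolerance are illustrative simplifications of the disclosed tracking.

```python
import numpy as np

def boundaries_moved(first_a, second_a, first_b, second_b, tol=1e-6):
    """Detect relative movement between two tracked virtual boundaries.

    first_* / second_*: positions of the first (anatomy) and second
    (opening-periphery) boundaries at two tracking samples a and b.
    The boundaries have moved relative to one another when the offset
    between them changes; a controller would generate a response
    (e.g. halt or re-plan) when this returns True.
    """
    offset_a = np.asarray(second_a, float) - np.asarray(first_a, float)
    offset_b = np.asarray(second_b, float) - np.asarray(first_b, float)
    return bool(np.linalg.norm(offset_b - offset_a) > tol)
```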