B25J19/061

SYSTEM FOR CHECKING INSTRUMENT STATE OF A SURGICAL ROBOTIC ARM
20230024362 · 2023-01-26

A surgical robotic system includes: a surgical console having a display and a user input device configured to generate a user input; a surgical robotic arm having a surgical instrument configured to treat tissue and actuatable in response to the user input; and a video camera configured to capture video data that is displayed on the display. The system also includes a control tower coupled to the surgical console and the surgical robotic arm. The control tower is configured to: process the user input to control the surgical instrument and record the user input as input data; train a machine learning system using the input data and the video data; and execute the machine learning system to determine a probability of failure of the surgical instrument.
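The abstract above describes fusing recorded input data and video-derived features into a failure probability. A minimal Python sketch of such a fusion step, assuming hypothetical feature names and placeholder (not learned) weights in a logistic model:

```python
import math

def failure_probability(grip_force_std: float, actuation_count: int,
                        video_anomaly_score: float) -> float:
    """Toy fusion of recorded input data (grip-force variability,
    actuation count) and a video-derived anomaly score into a failure
    probability via a logistic model. Weights are illustrative
    placeholders, not values learned from real surgical data."""
    z = (0.8 * grip_force_std
         + 0.002 * actuation_count
         + 1.5 * video_anomaly_score
         - 3.0)
    return 1.0 / (1.0 + math.exp(-z))  # squash score into [0, 1]
```

In a real system these weights would come from training on the recorded input and video data; the sketch only shows the shape of the inference step.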

Communicating closure effort for robotic surgical tools

A method includes grasping a user input device in communication with a surgical tool of a robotic surgical system, the surgical tool including an end effector with opposing jaws; squeezing the user input device and thereby actuating a motor that closes the jaws and clamps down on tissue at a surgical site; and calculating, with a computer system in communication with the surgical tool, the work completed by the motor to close the jaws and clamp down on the tissue. The computer system generates one or more effort indicators when the work completed by the motor meets or exceeds one or more predetermined work increments corresponding to operation of the motor, and communicates the one or more effort indicators to an operator.
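The mechanism above (accumulate motor work, fire an indicator at each predetermined work increment) can be sketched as follows; the torque/angle sampling interface is an assumption, not the patented implementation:

```python
def effort_indicators(torque_samples, angle_deltas, increments):
    """Accumulate motor work W = sum(torque * d_angle) over sampled
    motion, and report which predetermined work increments have been
    met or exceeded so an effort indicator can be shown per increment."""
    work = 0.0
    fired = []
    for tau, d_theta in zip(torque_samples, angle_deltas):
        work += tau * d_theta          # incremental work this sample
        for w in increments:
            if work >= w and w not in fired:
                fired.append(w)        # indicator for this increment
    return work, fired
```

Each entry in `fired` would correspond to one effort indicator communicated to the operator.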

LIGHT FOR TEACH PENDANT AND/OR ROBOT
20230211506 · 2023-07-06

A teach pendant can be communicatively coupled to a light. The teach pendant can be communicatively coupled to a robot. The teach pendant can include one or more processors. The teach pendant can be configured to control the light. For instance, the teach pendant can be configured to selectively activate and deactivate the light. The light can be operatively connected to the teach pendant or the robot. The light can help robot programmers or technicians see in dark or dimly lit work environments. The light can improve user safety in such environments and/or enhance the work environment.

AUTONOMOUS MOBILE BODY, INFORMATION PROCESSING METHOD, PROGRAM, AND INFORMATION PROCESSING APPARATUS
20220413795 · 2022-12-29

The present technology relates to an autonomous mobile body, an information processing method, a program, and an information processing apparatus capable of improving the user experience through the output sound of the autonomous mobile body.

The autonomous mobile body includes: a recognition unit that recognizes a motion of its own device; and a sound control unit that controls an output sound emitted from the own device. The sound control unit controls output of a plurality of operation sounds corresponding to a plurality of motions of the own device, and changes the operation sound in a case where the plurality of motions has been recognized. The present technology can be applied to, for example, a robot.
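A minimal sketch of the sound-control rule described above (one sound per recognized motion, a changed sound when multiple motions are recognized at once); the sound names and mapping are hypothetical:

```python
def select_operation_sound(recognized_motions, sound_map, combo_sound):
    """Pick the per-motion operation sound, or switch to a distinct
    combined sound when a plurality of motions is recognized."""
    if not recognized_motions:
        return None                                  # no motion, no sound
    if len(recognized_motions) == 1:
        return sound_map.get(recognized_motions[0])  # single-motion sound
    return combo_sound                               # changed sound for multiple motions
```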

SYSTEMS AND METHODS TO CONFIGURE A ROBOTIC WELDING SYSTEM

An example robotic welding system includes: a robotic manipulator configured to manipulate a welding torch; and a robotic controller comprising: a processor; and a machine-readable storage medium comprising machine-readable instructions which, when executed by the processor, cause the processor to, in response to initiation of a robotic welding procedure involving the robotic manipulator: prior to starting the robotic welding procedure, output at least one of a visual notification or an audible notification proximate to the robotic manipulator; and after satisfying at least one weld-ready condition, control the robotic manipulator to perform the robotic welding procedure using the welding torch.
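The notify-then-gate sequence described above can be sketched as a small control function; the callback interface is an assumption for illustration:

```python
def start_weld(weld_ready_checks, notify, weld):
    """Output the pre-start notification first, then perform the weld
    only once every weld-ready condition passes. Returns True if the
    weld procedure was started."""
    notify("visual/audible: weld starting")      # notification precedes welding
    if all(check() for check in weld_ready_checks):
        weld()                                   # all weld-ready conditions satisfied
        return True
    return False                                 # gate closed, no weld
```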

Method of detecting human and/or animal motion and performing mobile disinfection

Implementations of the disclosed subject matter provide a method of moving a mobile robot within an area. The movement of the mobile robot and the emission of ultraviolet (UV) light may be stopped when a human and/or animal is determined to be within the area. Using at least one sensor, the method may determine whether there is at least one of human identification, animal identification, motion, heat, and/or sound within the area for a predetermined period of time. When there is no human identification, animal identification, motion, heat, and/or sound within the predetermined period of time, UV light may be emitted and the drive system may be controlled to move the mobile robot within the area. When there is at least one of human identification, animal identification, motion, heat, and/or sound within the predetermined period of time, a light source may be controlled to prohibit the emission of UV light.
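The gating logic above reduces to a small decision rule. A minimal sketch, assuming the detection list and quiet-period bookkeeping are supplied by the sensor layer:

```python
def uv_control(detections, quiet_seconds, required_quiet):
    """Permit UV emission and driving only after the area has been clear
    of human/animal identification, motion, heat, and sound for the
    predetermined quiet period; inhibit UV on any presence signal."""
    if detections:                      # any presence signal -> stop UV and motion
        return {"uv_on": False, "drive": False}
    if quiet_seconds >= required_quiet: # area clear for the full period
        return {"uv_on": True, "drive": True}
    return {"uv_on": False, "drive": False}  # clear, but not long enough yet
```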

ROBOT CONTROL DEVICE, METHOD, AND PROGRAM

A robot control device (10) includes an attribute determination unit (71) that determines an attribute of an object person (T) around a robot (1), and a decision unit (74) that decides a notification action by which the robot (1) notifies the object person (T) of its presence, on the basis of the attribute determined by the attribute determination unit (71) and the risk of harm that the robot (1) may cause to the object person (T).
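One way to picture the decision unit is as a lookup from (attribute, risk) to a notification action. The attribute values, risk levels, and action names below are purely illustrative, not taken from the patent:

```python
def decide_notification(attribute, risk):
    """Map a detected person attribute and a harm-risk level to a
    presence-notification action (all labels are illustrative)."""
    if risk == "high":
        return "stop_and_announce"       # highest risk overrides attribute
    if attribute in ("child", "visually_impaired"):
        return "sound_and_light"         # more salient cue for these attributes
    if risk == "medium":
        return "sound"
    return "light"                       # low-risk default
```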

METHOD FOR MAINTAINING SYSTEMS, IN PARTICULAR MACHINES IN WAREHOUSES
20220331994 · 2022-10-20

A method for maintaining, commissioning and checking systems in warehouses, in which a service technician has a view of the respective system and makes wireless contact with the controller of the system via a mobile computer in order to take over control thereof. Takeover of control by the mobile computer is permitted by a central controller only if the service technician has visual contact with the corresponding system. For this purpose, the position and/or orientation of the technician's mobile computer with respect to the respective system is determined via optical and/or acoustic recognition of a fingerprint of the system.
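The central controller's gate can be sketched as a predicate over the recognized fingerprint and the technician's position; the fingerprint encoding and distance threshold here are assumptions for illustration:

```python
def permit_takeover(fingerprint_observed, fingerprint_expected,
                    distance_m, max_distance_m=10.0):
    """Grant control to the mobile computer only when the optically or
    acoustically captured system fingerprint matches the expected one
    and the technician is close enough to keep the system in view."""
    return (fingerprint_observed == fingerprint_expected
            and distance_m <= max_distance_m)
```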

DYNAMIC, INTERACTIVE SIGNALING OF SAFETY-RELATED CONDITIONS IN A MONITORED ENVIRONMENT
20230191635 · 2023-06-22

Systems and methods for determining safe and unsafe zones in a workspace - where safe actions are calculated in real time based on all relevant objects (e.g., some observed by sensors and others computationally generated based on analysis of the sensed workspace) and on the current state of the machinery (e.g., a robot) in the workspace - may utilize a variety of workspace-monitoring approaches as well as dynamic modeling of the robot geometry. The future trajectory of the robot(s) and/or the human(s) may be forecast using, e.g., a model of human movement and other forms of control. Modeling and forecasting of the robot may, in some embodiments, make use of data provided by the robot controller that may or may not include safety guarantees.
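The core safety computation described above, comparing forecast robot and human trajectories against a safety margin, can be sketched in a few lines; the point-list trajectory representation is an assumption for illustration:

```python
def min_separation(robot_traj, human_traj):
    """Minimum forecast separation between robot and human trajectories,
    given as lists of (x, y) points at matching time steps."""
    return min(((rx - hx) ** 2 + (ry - hy) ** 2) ** 0.5
               for (rx, ry), (hx, hy) in zip(robot_traj, human_traj))

def safe_action(robot_traj, human_traj, safety_margin_m):
    """Flag a planned motion unsafe when any forecast step brings the
    robot within the safety margin of the forecast human position."""
    return min_separation(robot_traj, human_traj) >= safety_margin_m
```

A real system would use full 3D robot geometry and uncertainty-aware human-motion models rather than point trajectories; the sketch only shows the zone check itself.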