B25J9/16

MULTI-DIRECTIONAL THREE-DIMENSIONAL PRINTING WITH A DYNAMIC SUPPORTING BASE

A computer-implemented dynamic supporting base creation method that interacts with a three-dimensional (3D) printer that prints an object, the method including providing physical support for the object, via a first robotic gripper, during 3D printing using a printing head of the 3D printer, and transferring the object to a second robotic gripper that provides physical support at a different location on the object.

REPEATING PATTERN DETECTION WITHIN USAGE RECORDINGS OF ROBOTIC PROCESS AUTOMATION TO FACILITATE REPRESENTATION THEREOF
20230052190 · 2023-02-16 ·

Improved techniques for examining a plurality of distinct recordings pertaining to user interactions with one or more software applications, where each recording concerns performing at least one task. The examined recordings can be processed such that the recordings can be organized and/or rendered in a consolidated manner which facilitates a user's understanding of the higher-level operations being performed by the examined recordings to carry out the associated task. In one embodiment, a robotic process automation system can, for example, operate to: acquire a plurality of recordings, the recordings including a series of user interactions with one or more application programs, the recordings being acquired via the robotic process automation system; identify repeating sequences of user interactions within the recordings; select at least one of the identified repeating sequences that occurs most often; create a first level pattern for the selected at least one of the identified repeating sequences; and associate a descriptive label with the first level pattern created for the selected at least one of the identified repeating sequences.
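The steps enumerated in the abstract (acquire recordings, find repeating sequences, select the most frequent, label it) can be illustrated with a minimal sketch. The function names, the fixed sub-sequence length, and the recordings themselves are hypothetical; a real RPA system would match interactions far more robustly.

```python
from collections import Counter

def find_repeating_patterns(recordings, length=2):
    """Count every contiguous sub-sequence of interaction steps of the
    given length across all recordings."""
    counts = Counter()
    for steps in recordings:
        for i in range(len(steps) - length + 1):
            counts[tuple(steps[i:i + length])] += 1
    # Keep only sequences that actually repeat.
    return {seq: n for seq, n in counts.items() if n > 1}

def label_most_frequent(recordings, length=2, label="pattern-1"):
    """Select the most frequent repeating sequence and attach a
    descriptive label, forming a first-level pattern."""
    repeating = find_repeating_patterns(recordings, length)
    if not repeating:
        return None
    seq = max(repeating, key=repeating.get)
    return {"label": label, "steps": list(seq), "count": repeating[seq]}

# Two toy recordings sharing an opening sequence of interactions.
recs = [
    ["open_app", "copy", "paste", "save"],
    ["open_app", "copy", "paste", "close"],
]
pattern = label_most_frequent(recs, length=3, label="copy-paste block")
```

Here the shared three-step prefix is the only sequence occurring more than once, so it becomes the labeled first-level pattern.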

ROBOTIC PROCESS AUTOMATION SUPPORTING HIERARCHICAL REPRESENTATION OF RECORDINGS
20230053260 · 2023-02-16 ·

Improved techniques for examining a plurality of distinct recordings pertaining to user interactions with one or more software applications, where each recording concerns performing at least one task. The examined recordings can be processed such that the recordings can be organized and/or rendered in a consolidated manner which facilitates a user's understanding of the higher-level operations being performed by the examined recordings to carry out the associated task. Advantageously, the improved techniques enable a robotic process automation (RPA) system to recognize and represent repetitive tasks within multiple recordings as multi-level (e.g., hierarchical) patterns of steps, sub-tasks, or some combination thereof. In doing so, an RPA system can identify and define such patterns within recordings and can also accommodate variants in such patterns. The resulting multi-level representation of the recordings allows users to better understand and visualize what tasks or sub-tasks are being carried out by portions of the recordings.
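One way the hierarchical representation described above can be pictured: replace each occurrence of a first-level pattern with its label, yielding an abstracted view of the recording; repeating that substitution builds levels (steps, sub-tasks, tasks). This is a minimal sketch with hypothetical names, not the patent's actual mechanism.

```python
def collapse(steps, pattern, label):
    """Replace each occurrence of a first-level pattern with its label,
    producing an abstracted (higher-level) view of the recording."""
    out, i, n = [], 0, len(pattern)
    while i < len(steps):
        if steps[i:i + n] == pattern:
            out.append(label)
            i += n
        else:
            out.append(steps[i])
            i += 1
    return out

recording = ["open", "copy", "paste", "save", "copy", "paste"]
level1 = collapse(recording, ["copy", "paste"], "CLIPBOARD")
# Applying collapse again to level1 with patterns over labels would
# produce the next level of the hierarchy.
```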

SYSTEM AND METHOD FOR ROBOTIC OBJECT PLACEMENT
20230052515 · 2023-02-16 ·

A computing system including a processing circuit in communication with a robot and a camera having a field of view. The processing circuit obtains image information based on the objects in the field of view and a loading environment, which includes loading areas, an object queue, and a buffer zone. The computing system is configured to use the obtained image information in motion planning operations for the retrieval and placement of objects from the object queue into the loading environment. Pallets provided within the loading environment (i.e., within the loading areas) are dedicated to receiving objects having corresponding object type identifiers. The computing system further uses the image information to determine the fill status of pallets existing within the loading environment, and whether new pallets need to be brought into the loading environment and/or swapped with existing pallets to account for future planning and placement operations.
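The routing logic described (type-dedicated pallets, fill-status tracking, swapping in fresh pallets) can be sketched as follows. The capacity, the pallet dictionary shape, and the action names are all illustrative assumptions, not the claimed implementation.

```python
def plan_placement(queue, pallets, capacity=4):
    """Assign each queued object to the pallet dedicated to its type;
    when a pallet is missing or full, swap a new one into the loading
    area before placing."""
    actions = []
    for obj_type in queue:
        pallet = pallets.get(obj_type)
        if pallet is None or pallet["count"] >= capacity:
            actions.append(("swap_in_pallet", obj_type))
            pallets[obj_type] = pallet = {"count": 0}
        pallet["count"] += 1
        actions.append(("place", obj_type))
    return actions

# One nearly-full pallet for type "boxA"; nothing yet for "boxB".
pallets = {"boxA": {"count": 3}}
actions = plan_placement(["boxA", "boxA", "boxB"], pallets)
```

The second "boxA" object fills the pallet past capacity, so the planner schedules a pallet swap before placing it, and "boxB" triggers a swap-in of its own dedicated pallet.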

AUTONOMOUSLY NAVIGATING ROBOT CAPABLE OF CONVERSING AND SCANNING BODY TEMPERATURE TO HELP SCREEN FOR COVID-19 AND OPERATION SYSTEM THEREOF
20230047316 · 2023-02-16 ·

This application relates to an autonomously navigating robot. In one aspect, the robot includes an end effector configured to measure a person's body temperature and, when the body temperature exceeds a standard fever temperature, activate a chatbot to check for symptoms of Covid-19. The robot may also include a manipulator configured to align the end effector with the person's forehead. The robot may further include a mobile robot configured to detect the person and, by performing autonomous navigation, move the end effector and the manipulator to the person's location.

EVALUATION OF CALIBRATION FOR SURGICAL TOOL
20230046044 · 2023-02-16 ·

The disclosed embodiments relate to systems and methods for a surgical tool or a surgical robotic system. An example computer-implemented method for evaluating calibrations of a surgical tool includes fixating a joint of the surgical tool at a first angle, the joint being driven by an actuator, measuring an actuator position corresponding to the first angle, accessing a calibrated offset corresponding to the first angle, determining an expected joint angle based on the measured actuator position and the calibrated offset, and reporting a first difference between the expected joint angle and the first angle.
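The arithmetic in this evaluation method is straightforward enough to sketch: the expected joint angle is derived from the measured actuator position and the stored calibration offset, and the reported value is its difference from the angle at which the joint was fixated. The gear-ratio parameter and the function signature are assumptions for illustration.

```python
def evaluate_calibration(measured_actuator_pos, calibrated_offset,
                         fixated_angle, gear_ratio=1.0):
    """Compute the joint angle expected from the actuator reading plus
    the calibrated offset, then report the difference from the angle
    at which the joint was fixated."""
    expected_angle = measured_actuator_pos * gear_ratio + calibrated_offset
    return expected_angle - fixated_angle

# Joint fixated at 30 deg; actuator reads 29.5 deg-equivalent with a
# stored +0.8 deg calibration offset.
diff = evaluate_calibration(29.5, 0.8, 30.0)
```

A difference near zero indicates the stored calibration still matches the tool; a large difference flags a calibration that needs attention.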

ROBOTIC PROCESS AUTOMATION SYSTEM FOR MANAGING HUMAN AND ROBOTIC TASKS
20230050430 · 2023-02-16 ·

Improved techniques for combining human tasks and robotic tasks in an organized manner to define an automation workflow process. A workflow process platform can assist a developer in creating an automation workflow process, and/or manage performance of an automation workflow process. The improved techniques enable a Robotic Process Automation (RPA) system to support programmatically combining various robotic tasks with human actions, interrelating human tasks and automated tasks within a single workflow.
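A workflow that interleaves human and robotic steps can be pictured with a minimal dispatcher: each step is tagged with who performs it, and the platform routes it accordingly. The step names and handler shapes below are invented for illustration; the human handler stands in for a real approval prompt.

```python
def run_workflow(steps, handlers):
    """Execute a workflow whose steps are marked as either human or
    robotic, dispatching each to the matching handler and logging
    the result."""
    log = []
    for kind, name in steps:
        result = handlers[kind](name)
        log.append((kind, name, result))
    return log

steps = [
    ("robot", "extract_invoice"),
    ("human", "approve_payment"),
    ("robot", "post_to_ledger"),
]
handlers = {
    "robot": lambda name: "done",
    "human": lambda name: "approved",  # stand-in for an interactive prompt
}
log = run_workflow(steps, handlers)
```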

MACHINE-LEARNABLE ROBOTIC CONTROL PLANS

Methods, systems, and apparatus, including computer programs encoded on computer storage media, for using learnable robotic control plans. One of the methods comprises obtaining a learnable robotic control plan comprising data defining a state machine that includes a plurality of states and a plurality of transitions between states, wherein: one or more states are learnable states, and each learnable state comprises data defining (i) one or more learnable parameters of the learnable state and (ii) a machine learning procedure for automatically learning a respective value for each learnable parameter of the learnable state; and processing the learnable robotic control plan to generate a specific robotic control plan, comprising: obtaining data characterizing a robotic execution environment; and for each learnable state, executing, using the obtained data, the respective machine learning procedures defined by the learnable state to generate a respective value for each learnable parameter of the learnable state.
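The structure described, a state machine whose learnable states carry parameters plus a procedure for learning their values from environment data, can be sketched as below. The class, the "learning procedure" (here a trivial lookup), and the toy environment data are all assumptions for illustration, not the claimed system.

```python
class LearnableState:
    """A state whose parameters are filled in by a learning procedure
    run against data characterizing the execution environment."""
    def __init__(self, name, params, learn):
        self.name = name
        self.params = dict(params)  # learnable parameters, initially None
        self.learn = learn          # procedure: (param, env_data) -> value

def specialize(plan, env_data):
    """Turn a learnable plan into a specific control plan by executing
    each state's learning procedure for each of its parameters."""
    specific = {}
    for state in plan:
        specific[state.name] = {
            p: state.learn(p, env_data) for p in state.params
        }
    return specific

# Toy "learning procedure": read the value straight from sensed data.
env = {"grasp_force": 12.5, "approach_speed": 0.2}
plan = [
    LearnableState("grasp", {"grasp_force": None}, lambda p, d: d[p]),
    LearnableState("approach", {"approach_speed": None}, lambda p, d: d[p]),
]
specific = specialize(plan, env)
```

In a real system the learning procedure would be a trained model or optimization routine rather than a dictionary lookup, but the control flow, resolve every learnable parameter of every learnable state against environment data, is the same.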

Visual annotations in robot control interfaces

Methods, apparatus, systems, and computer-readable media are provided for visually annotating rendered multi-dimensional representations of robot environments. In various implementations, an entity may be identified that is present with a telepresence robot in an environment. A measure of potential interest of a user in the entity may be calculated based on a record of one or more interactions between the user and one or more computing devices. In some implementations, the one or more interactions may be for purposes other than directly operating the telepresence robot. In various implementations, a multi-dimensional representation of the environment may be rendered as part of a graphical user interface operable by the user to control the telepresence robot. In various implementations, a visual annotation may be selectively rendered within the multi-dimensional representation of the environment in association with the entity based on the measure of potential interest.
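The selection step, scoring an entity's potential interest from the user's past interactions and annotating only entities above some measure, can be sketched minimally. The scoring rule (substring counting), threshold, and example interaction record are hypothetical simplifications.

```python
def interest_score(entity, interactions):
    """Score potential interest in an entity from a record of the
    user's past interactions (searches, emails, opened documents)."""
    return sum(1 for record in interactions if entity in record)

def select_annotations(entities, interactions, threshold=2):
    """Pick which entities in the rendered environment receive a
    visual annotation, based on the interest measure."""
    return [e for e in entities
            if interest_score(e, interactions) >= threshold]

interactions = [
    "searched printer manual",
    "emailed about printer",
    "opened whiteboard photo",
]
annotated = select_annotations(["printer", "whiteboard", "couch"],
                               interactions)
```

Only "printer" appears in enough interaction records to clear the threshold, so it alone would be annotated in the telepresence interface.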

Fault diagnosis device for robot and robot system
11582850 · 2023-02-14 ·

A fault diagnosis device is configured to diagnose a fault in a light emitting unit that emits light of a color according to an operating state of a robot by individually energizing and lighting a plurality of types of LEDs of different emission colors. The fault diagnosis device includes an energization control unit that controls energization of the LEDs, a voltage detection unit that detects a diagnostic voltage that varies depending on a terminal voltage of the LEDs, and a fault detection unit that detects a fault in the light emitting unit based on a control state of energization by the energization control unit and a detected value of the diagnostic voltage by the voltage detection unit.
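The comparison the fault detection unit performs, checking the diagnostic voltage against what the current energization state predicts, can be sketched as follows. The voltage levels, tolerance, and fault classifications are illustrative assumptions, not values from the patent.

```python
def detect_led_fault(energized, measured_voltage,
                     v_forward=2.0, v_supply=5.0, tol=0.3):
    """Compare the diagnostic voltage against the energization state:
    an energized LED should drop roughly its forward voltage, while a
    de-energized one should sit near the supply rail. Returns a fault
    description, or None if the reading is consistent."""
    expected = v_forward if energized else v_supply
    if abs(measured_voltage - expected) <= tol:
        return None
    if energized and measured_voltage > expected + tol:
        return "open-circuit LED"       # no current path: voltage rises
    if energized and measured_voltage < expected - tol:
        return "short-circuit LED"      # shorted junction: voltage collapses
    return "drive-circuit fault"

# Energized LED whose terminal voltage has collapsed to 0.1 V.
fault = detect_led_fault(energized=True, measured_voltage=0.1)
```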