B25J9/161

LOOP CLOSURE DETECTION METHOD AND SYSTEM, MULTI-SENSOR FUSION SLAM SYSTEM, ROBOT, AND MEDIUM
20230045796 · 2023-02-16

The present invention provides a loop closure detection method and system, a multi-sensor fusion SLAM system, a robot, and a medium. The system runs on a mobile robot and comprises a similarity detection unit, a visual pose solving unit, and a laser pose solving unit. The loop closure detection system, multi-sensor fusion SLAM system, and robot provided in the present invention significantly improve the speed and accuracy of loop closure detection under changes in the robot's viewing angle, changes in environmental brightness, weak texture, and similar conditions.
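
The abstract does not disclose an algorithm, but a similarity detection unit of this kind is often built on descriptor comparison. A minimal sketch, assuming cosine similarity over hypothetical keyframe descriptor vectors (the function names and the threshold are illustrative, not from the patent):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two descriptor vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def detect_loop_candidates(current_desc, keyframe_descs, threshold=0.9):
    # Indices of past keyframes whose descriptors are similar enough
    # to the current frame to be treated as loop-closure candidates.
    return [i for i, desc in enumerate(keyframe_descs)
            if cosine_similarity(current_desc, desc) >= threshold]
```

In a full system such candidates would then be passed to the visual and laser pose solving units for geometric verification.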

ROBOT CONTROL APPARATUS, ROBOT CONTROL SYSTEM, ROBOT CONTROL METHOD, AND COMPUTER-READABLE STORAGE MEDIUM STORING A ROBOT CONTROL PROGRAM
20230046793 · 2023-02-16

A robot control apparatus according to one or more embodiments may include: a calculating unit configured to calculate an interference range of a robot based on a model of the robot in a state in which an object is gripped by a gripper with which the robot is equipped; and a planning unit configured to plan a motion of the robot based on the model and the interference range.
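
As a rough illustration of computing an interference range from a robot model plus a gripped object, here is a minimal axis-aligned bounding-box sketch; the box representation and function names are assumptions, not the patent's method:

```python
def union_aabb(boxes):
    # Smallest axis-aligned box enclosing every (min_corner, max_corner) box.
    mins = tuple(min(b[0][i] for b in boxes) for i in range(3))
    maxs = tuple(max(b[1][i] for b in boxes) for i in range(3))
    return mins, maxs

def interference_range(link_boxes, object_box):
    # The gripped object enlarges the volume the motion planner
    # must treat as occupied by the robot.
    return union_aabb(list(link_boxes) + [object_box])
```

The planner would then avoid collisions against this enlarged range rather than against the bare robot model.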

CONFIGURING A NEURAL NETWORK FOR EQUIVARIANT OR INVARIANT BEHAVIOR

A method for configuring a neural network designed to map measured data to one or more output variables. The method includes: specifying one or more transformations of the measured data which, when applied to the measured data, are meant to induce invariant or equivariant behavior in the output variables supplied by the neural network; setting up at least one equation that links the condition that the desired invariance or equivariance holds to the architecture of the neural network; solving the at least one equation to obtain a feature that characterizes the desired architecture and/or a distribution of weights of the neural network in at least one location of that architecture; and configuring a neural network such that its architecture and/or its distribution of weights in at least one location of that architecture has all of the features ascertained in this way.
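
The idea of obtaining invariance from the architecture itself can be illustrated with a toy case: sum pooling over a set of features is permutation-invariant by construction. This sketch (with hypothetical helper names) demonstrates only the concept, not the patent's equation-solving procedure:

```python
def sum_pool(features):
    # Summation over set elements is independent of their ordering, so a
    # readout built on sum pooling is permutation-invariant by
    # construction -- an architectural feature, not a trained one.
    return sum(features)

def check_invariance(f, x, transform):
    # Verify that applying the specified transformation to the input
    # leaves the output unchanged.
    return f(transform(x)) == f(x)
```

A per-element maximum, by contrast, is permutation-invariant but not invariant under, say, adding a constant to every input, which is the kind of distinction the specified transformations pin down.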

TEMPLATE ROBOTIC CONTROL PLANS
20230050174 · 2023-02-16

Methods, systems, and apparatus, including computer programs encoded on computer storage media, for using template robotic control plans. One of the methods comprises obtaining a template robotic control plan that is configurable for a plurality of different robotics applications, wherein the template robotic control plan comprises data defining (i) an adaptation procedure and (ii) a set of one or more open parameters; obtaining a user input defining a respective value or range of values for each open parameter in the set of open parameters, wherein the user input characterizes a specific robotics application for which the template robotic control plan can be configured; and executing, using the obtained values for the set of open parameters, the adaptation procedure to generate a specific robotic control plan from the template robotic control plan.
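
A template plan with open parameters and an adaptation procedure might be sketched as follows; the pick-and-place template, parameter names, and step tuples are entirely hypothetical:

```python
def instantiate_template(template, user_values):
    # Bind every open parameter, then run the template's adaptation
    # procedure to produce a specific control plan.
    missing = [p for p in template["open_parameters"] if p not in user_values]
    if missing:
        raise ValueError(f"unbound open parameters: {missing}")
    return template["adapt"](user_values)

# Hypothetical pick-and-place template: the open parameters configure it
# for one concrete robotics application.
pick_template = {
    "open_parameters": ["grip_force", "approach_height"],
    "adapt": lambda v: [
        ("move_above", v["approach_height"]),
        ("grip", v["grip_force"]),
        ("lift", v["approach_height"]),
    ],
}
```

Supplying a range of values instead of a single value, as the abstract allows, would simply make the adaptation procedure responsible for choosing a point in that range.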

AUTONOMOUSLY NAVIGATING ROBOT CAPABLE OF CONVERSING AND SCANNING BODY TEMPERATURE TO HELP SCREEN FOR COVID-19 AND OPERATION SYSTEM THEREOF
20230047316 · 2023-02-16

This application relates to an autonomously navigating robot. In one aspect, the robot includes an end effector configured to measure a person's body temperature and, when the body temperature exceeds a standard fever temperature, activate a chatbot to check symptoms of Covid-19. The robot may also include a manipulator configured to align the end effector with the person's forehead. The robot may further include a mobile robot configured to detect the person and move the end effector and the manipulator to a position where the person is located by performing autonomous navigation.
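
The screening logic described above reduces to a simple conditional; the threshold value and function names below are assumptions for illustration only:

```python
FEVER_THRESHOLD_C = 37.5  # assumed standard fever temperature, not from the patent

def screen(body_temp_c, chatbot):
    # Activate the symptom-check chatbot only when the measured
    # temperature exceeds the fever threshold.
    if body_temp_c > FEVER_THRESHOLD_C:
        return chatbot()
    return "pass"
```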

ROBOTIC PROCESS AUTOMATION SYSTEM FOR MANAGING HUMAN AND ROBOTIC TASKS
20230050430 · 2023-02-16

Improved techniques for combining human tasks and robotic tasks in an organized manner to define an automation workflow process. A workflow process platform can assist a developer in creating an automation workflow process, and/or manage performance of an automation workflow process. The improved techniques enable a Robotic Process Automation (RPA) system to support programmatically combining various robotic tasks with human actions to provide an interrelated relationship of both human tasks and automated tasks.
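
Interleaving robotic tasks with human actions in one workflow might look like the following sketch; the step format and dispatcher are hypothetical, not the platform's actual API:

```python
def run_workflow(steps, bot_runner, human_queue):
    # Dispatch each step to a software bot or to a human work queue,
    # preserving the interleaved ordering of the workflow definition.
    results = []
    for kind, task in steps:
        if kind == "robot":
            results.append(bot_runner(task))
        else:
            human_queue.append(task)
            results.append(f"awaiting human: {task}")
    return results
```

A real RPA platform would additionally block downstream robotic steps on completion of the queued human tasks, which is the interrelationship the abstract refers to.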

MACHINE-LEARNABLE ROBOTIC CONTROL PLANS

Methods, systems, and apparatus, including computer programs encoded on computer storage media, for using learnable robotic control plans. One of the methods comprises obtaining a learnable robotic control plan comprising data defining a state machine that includes a plurality of states and a plurality of transitions between states, wherein: one or more states are learnable states, and each learnable state comprises data defining (i) one or more learnable parameters of the learnable state and (ii) a machine learning procedure for automatically learning a respective value for each learnable parameter of the learnable state; and processing the learnable robotic control plan to generate a specific robotic control plan, comprising: obtaining data characterizing a robotic execution environment; and for each learnable state, executing, using the obtained data, the respective machine learning procedures defined by the learnable state to generate a respective value for each learnable parameter of the learnable state.
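
Processing a learnable state machine into a specific plan could be sketched as below; the dictionary representation and the trivial "learning procedure" (a lookup into environment data) are illustrative stand-ins for the machine learning procedures the abstract describes:

```python
def specialize(states, env_data):
    # For each learnable state, run its learning procedure against the
    # observed environment to bind every learnable parameter; fixed
    # parameters are carried over unchanged.
    specific = {}
    for name, state in states.items():
        params = dict(state.get("fixed", {}))
        learn = state.get("learn")
        for p in state.get("learnable", []):
            params[p] = learn(p, env_data)
        specific[name] = params
    return specific
```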

Teaching apparatus, robot system, and teaching program
11577381 · 2023-02-14

A teaching apparatus includes a display unit and a display control unit that controls actuation of the display unit. The display unit displays a command display area in which a plurality of input motion commands of a robot are displayed, an extraction display area in which at least one motion command extracted from the plurality of motion commands displayed in the command display area is displayed, and a settings input area in which details of the extracted motion command are set. The display control unit extracts, from the plurality of motion commands displayed in the command display area, a motion command related to one of position information, velocity information, and acceleration information of the robot, and displays it in the extraction display area.
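
The extraction behavior amounts to filtering commands by information category; a minimal sketch with assumed category tags:

```python
# Assumed category tags; the patent does not specify a data model.
MOTION_CATEGORIES = {"position", "velocity", "acceleration"}

def extract_commands(commands):
    # Keep only commands carrying position, velocity, or acceleration
    # information for display in the extraction display area.
    return [c for c in commands if c.get("category") in MOTION_CATEGORIES]
```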

Method and system for detecting and picking up objects

A method includes steps of: capturing an image of a container; recognizing at least one object in the container based on the image; determining at least one first coordinate set corresponding to the at least one object; determining at least one second coordinate set that corresponds to target one(s) of the at least one first coordinate set and that relates to a fixed picking device of a robotic arm; adjusting position(s) of unfixed picking device(s) of the robotic arm if necessary; and controlling the robotic arm to pick up one(s) of the at least one object that correspond(s) to the at least one second coordinate set with the fixed picking device and/or at least one unfixed picking device.
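
Relating first coordinate sets (object positions) to second coordinate sets for the fixed picking device might, under a simple rigid-offset assumption, look like this sketch; the offset model and function name are assumptions, not the patent's transformation:

```python
def second_coordinates(first_coords, target_indices, tool_offset):
    # For each targeted object, the arm coordinate the fixed picking
    # device must reach: the object position shifted by the (assumed
    # constant) offset between the arm flange and the picking device.
    return [tuple(c - o for c, o in zip(first_coords[i], tool_offset))
            for i in target_indices]
```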

Automatic robot perception programming by imitation learning

Apparatus, systems, methods, and articles of manufacture for automatic robot perception programming by imitation learning are disclosed. An example apparatus includes a percept mapper to identify a first percept and a second percept from data gathered from a demonstration of a task and an entropy encoder to calculate a first saliency of the first percept and a second saliency of the second percept. The example apparatus also includes a trajectory mapper to map a trajectory based on the first percept and the second percept, the first percept skewed based on the first saliency, the second percept skewed based on the second saliency. In addition, the example apparatus includes a probabilistic encoder to determine a plurality of variations of the trajectory and create a collection of trajectories including the trajectory and the variations of the trajectory. The example apparatus also includes an assemble network to imitate an action based on a first simulated signal from a first neural network of a first modality and a second simulated signal from a second neural network of a second modality, the action representative of a perceptual skill.
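
Entropy-based saliency and saliency-skewed percepts can be sketched minimally as follows; Shannon entropy is a standard choice, but its use here is only an inference from the abstract's term "entropy encoder", and the function names are hypothetical:

```python
import math

def entropy(probs):
    # Shannon entropy (in bits) of a discrete distribution; a percept
    # whose feature distribution is more surprising scores higher.
    return -sum(p * math.log2(p) for p in probs if p > 0)

def skew_percept(percept, saliency):
    # Weight a percept's features by its saliency before the trajectory
    # mapper combines percepts into a trajectory.
    return [x * saliency for x in percept]
```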