G05B2219/40391

Method and device for training manipulation skills of a robot system
11590651 · 2023-02-28

A method of training a robot system for manipulation of objects, the robot system being able to perform a set of skills, wherein each skill is learned as a skill model, the method comprising: receiving physical input from a human trainer, in the form of a set of kinesthetic demonstrations of the skill to be learned by the robot; determining for the skill model a set of task parameters, including determining, for each task parameter of the set, whether it is an attached task parameter, which is related to an object that is part of a kinesthetic demonstration, or a free task parameter, which is not related to a physical object; obtaining data for each task parameter of the set of task parameters from the set of kinesthetic demonstrations; and training the skill model with the set of task parameters and the data obtained for each task parameter.
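The attached-vs-free distinction above can be sketched as a simple classification over a parameter set; the class fields, parameter names, and objects below are invented for illustration and are not from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TaskParameter:
    name: str
    attached: bool                   # True: tied to an object in the demonstration
    object_id: Optional[str] = None  # only meaningful when attached

def classify_parameters(params):
    """Split the task-parameter set into attached and free parameters."""
    attached = [p for p in params if p.attached]
    free = [p for p in params if not p.attached]
    return attached, free

# Hypothetical task parameters for a pick-up skill
params = [
    TaskParameter("grasp_frame", attached=True, object_id="cup"),
    TaskParameter("approach_height", attached=False),
]
attached, free = classify_parameters(params)
```

In the claimed method, data for both kinds of parameters would then be extracted from the kinesthetic demonstrations before training the skill model.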

Robotic end effector interface systems
11707837 · 2023-07-25

Embodiments of the present disclosure are directed to methods, computer program products, and computer systems of a robotic apparatus with robotic instructions replicating a food preparation recipe. In one embodiment, a robotic control platform comprises: one or more sensors; a mechanical robotic structure including one or more end effectors and one or more robotic arms; an electronic library database of minimanipulations; a robotic planning module configured for real-time planning and adjustment, based at least in part on sensor data received from the one or more sensors, of an electronic multi-stage process recipe file, the recipe file including a sequence of minimanipulations and associated timing data; a robotic interpreter module configured for reading the minimanipulation steps from the minimanipulation library and converting them to machine code; and a robotic execution module configured for executing the minimanipulation steps by the robotic platform to accomplish a functional result.
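The library-interpreter-executor pipeline can be sketched as follows; the minimanipulation names and the command strings standing in for "machine code" are invented for the example.

```python
# Stand-in for the electronic library database of minimanipulations:
# each named minimanipulation maps to a list of low-level commands.
MINIMANIPULATION_LIBRARY = {
    "grasp_spoon": ["move_arm(x=0.3)", "close_gripper()"],
    "stir": ["rotate_wrist(deg=360)"],
}

def interpret(recipe_steps):
    """Read minimanipulation steps from the library and flatten them
    into an ordered list of low-level commands."""
    code = []
    for step in recipe_steps:
        code.extend(MINIMANIPULATION_LIBRARY[step])
    return code

def execute(code, send):
    """Dispatch each command to the robotic platform via `send`."""
    for cmd in code:
        send(cmd)

log = []
execute(interpret(["grasp_spoon", "stir"]), log.append)
```

The real planning module would also adjust this sequence in real time from sensor data; that feedback loop is omitted here.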

Method and device for robot interactions
11548147 · 2023-01-10

Embodiments of the disclosure provide a method and device for robot interactions. In one embodiment, a method comprises: collecting to-be-processed data reflecting an interaction output behavior; determining robot interaction output information corresponding to the to-be-processed data; controlling a robot to execute the robot interaction output information to imitate the interaction output behavior; collecting, in response to an imitation termination instruction triggered when the imitation succeeds, interaction trigger information corresponding to the robot interaction output information; and storing the interaction trigger information in relation to the robot interaction output information to generate an interaction rule.
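The rule-generation step, storing trigger information in relation to the imitated output, can be sketched as a simple mapping; the trigger and output labels below are invented placeholders.

```python
class InteractionRules:
    """Minimal store relating interaction trigger information to
    robot interaction output information."""

    def __init__(self):
        self._rules = {}

    def record(self, trigger, output):
        # Store trigger info in relation to the output behavior,
        # generating an interaction rule.
        self._rules[trigger] = output

    def respond(self, trigger):
        # Look up the output the robot should execute for a trigger.
        return self._rules.get(trigger)

rules = InteractionRules()
rules.record("wave_detected", "wave_back")
```

In the claimed method the trigger information is collected only after the imitation succeeds, so each stored rule pairs a verified output behavior with its trigger.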

Integrating sensor streams for robotic demonstration learning

Methods, systems, and apparatus, including computer programs encoded on computer storage media, for integrating sensor streams for robotic demonstration learning. One of the methods includes selecting, by a learning system for a robot, a base update rate for combining multiple sensor streams into a task state representation. The learning system repeatedly generates the task state representation at the base update rate, including generating, during each time period defined by the update rate, the task state representation from the most recently updated sensor data processed by a plurality of neural networks. The learning system repeatedly uses the task state representations to generate commands for the robot at the base update rate.
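The base-rate combination can be sketched as keeping only the most recent value per stream and snapshotting all of them once per update period; the stream names are invented, and real time is replaced by explicit `tick()` calls.

```python
class StateCombiner:
    """Combine asynchronous sensor streams into a task state
    representation at a fixed base update rate."""

    def __init__(self, streams):
        self.latest = {s: None for s in streams}

    def on_sensor(self, stream, value):
        # Overwrite: only the most recently updated data is kept.
        self.latest[stream] = value

    def tick(self):
        # Called once per base-rate period: build the task state
        # representation from the freshest value of each stream.
        return dict(self.latest)

c = StateCombiner(["camera", "force"])
c.on_sensor("camera", "frame_1")
c.on_sensor("camera", "frame_2")  # frame_1 superseded before the tick
c.on_sensor("force", 3.5)
state = c.tick()
```

In the described system, the values would be per-stream neural-network outputs rather than raw readings, and the snapshot would drive command generation at the same rate.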

ROBOTIC DEMONSTRATION RETRIEVAL SYSTEMS AND METHODS

A robot system includes a selection module configured to select a stored demonstration for a robot from a database of stored demonstrations for different tasks of the robot; an encoder module of an attention model, the encoder module configured to determine a similarity value reflecting a similarity between: a user input demonstration for the robot; and the stored demonstration for the robot; and an indicator module configured to indicate whether the stored demonstration is the same as the user input demonstration and belongs to the same task based on the similarity value.
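As a toy stand-in for the attention-model encoder, the similarity value can be sketched as cosine similarity between fixed demonstration embeddings; the vectors and the 0.9 threshold are invented parameters, not from the patent.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def same_task(user_demo, stored_demo, threshold=0.9):
    """Indicator: does the stored demonstration match the user input
    demonstration, based on the similarity value?"""
    return cosine(user_demo, stored_demo) >= threshold

# Identical embeddings -> similarity 1.0 -> same task
match = same_task([1.0, 0.0], [1.0, 0.0])
```

A real encoder would produce these embeddings from the demonstrations themselves; only the thresholded comparison is shown here.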

ROBOT INSTRUCTION DISTRIBUTION FRAMEWORK
20220395977 · 2022-12-15

Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for distributing skill bundles that can guide robot execution. One of the methods includes receiving data for a skill bundle from a skill developer. The data can include a definition of one or more preconditions for a robotic system to execute a skill; one or more effects on an operating environment after the robotic system has executed the skill; and a software module implementing the skill. The software module can define a state machine of subtasks. A skill bundle can be generated from the data received from the skill developer. Data identifying the generated skill bundle can be added to a skill registry. The skill bundle can be provided to a robot execution system for installation on that system.
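A skill bundle record and registry can be sketched as below; the field names mirror the abstract (preconditions, effects, module), but the concrete skill, predicates, and module path are invented.

```python
from dataclasses import dataclass, field

@dataclass
class SkillBundle:
    name: str
    preconditions: list = field(default_factory=list)  # must hold before execution
    effects: list = field(default_factory=list)        # hold after execution
    module: str = ""                                   # stand-in for the software module

registry = {}

def register(bundle):
    """Add data identifying the generated skill bundle to the registry."""
    registry[bundle.name] = bundle

register(SkillBundle(
    name="pick",
    preconditions=["gripper_empty"],
    effects=["holding_part"],
    module="pick.py",
))
```

The registry entry is what the distribution framework would hand to a robot execution system at installation time.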

User feedback for robotic demonstration learning

Methods, systems, and apparatus, including computer programs encoded on computer storage media, for providing user feedback for robotic demonstration learning. One of the methods includes initiating a local demonstration learning process to collect respective local demonstration data for each of one or more demonstration subtasks defined by a skill template to be executed by a robot. Local demonstration data is repeatedly collected for each of the one or more demonstration subtasks of the skill template while a user manipulates a robot to perform each of the one or more demonstration subtasks defined by the skill template. A respective progress value for each of the one or more demonstration subtasks defined by the skill template is maintained. A user interface presentation is generated that presents a suggested demonstration to be performed by the user based on a respective progress value for each demonstration subtask.
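Maintaining per-subtask progress values and suggesting the next demonstration can be sketched as follows; the subtask names and the "fewest demonstrations first" suggestion policy are assumptions for the example.

```python
class DemoProgress:
    """Track a progress value per demonstration subtask of a skill
    template and suggest which subtask to demonstrate next."""

    def __init__(self, subtasks):
        self.progress = {s: 0 for s in subtasks}

    def record_demo(self, subtask):
        # One more local demonstration collected for this subtask.
        self.progress[subtask] += 1

    def suggest(self):
        # Suggest the subtask with the lowest progress value.
        return min(self.progress, key=self.progress.get)

p = DemoProgress(["approach", "insert", "release"])
p.record_demo("approach")
p.record_demo("insert")
suggestion = p.suggest()
```

The user interface presentation in the abstract would render this suggestion to the person manipulating the robot.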

Learning skills from video demonstrations

A method includes: determining motion imitation information for causing a system to imitate a physical task using a first machine learning model that is trained using motion information that represents a performance of the physical task; determining a predicted correction based on the motion information and a current state from the system using a second machine learning model that is trained using the motion information; determining an action to be performed by the system based on the motion imitation information and the predicted correction; and controlling motion of the system in accordance with the action.
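The imitation-plus-correction structure can be sketched with two stub functions standing in for the trained models; the linear trajectory and the 0.1 gain are invented values, not from the patent.

```python
def imitation_model(t):
    """First model: nominal motion from the demonstrated trajectory
    (here, a target position growing linearly with time)."""
    return 1.0 * t

def correction_model(t, current_state):
    """Second model: predicted correction from the motion information
    and the system's current state."""
    return 0.1 * (imitation_model(t) - current_state)

def next_action(t, current_state):
    # The action combines the imitation term and the predicted correction.
    return imitation_model(t) + correction_model(t, current_state)

# System lagging slightly behind the demonstrated motion at t = 2.0
action = next_action(2.0, current_state=1.5)
```

A controller would then drive the system's motion in accordance with this action, closing the loop each timestep.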

CONTROL DEVICE, CONTROL SYSTEM, ROBOT SYSTEM, AND CONTROL METHOD

A control device includes: first circuitry that generates a command to cause a robot to autonomously grind a grinding target portion; second circuitry that generates a command to cause the robot to grind a grinding target portion according to manipulation information from an operation device; third circuitry that controls operation of the robot according to the command; storage that stores image data of a grinding target portion and operation data of the robot corresponding to the command; and fourth circuitry that performs machine learning by using image data of a grinding target portion and the operation data for the grinding target portion, receives the image data as input data, and outputs an operation correspondence command corresponding to the operation data as output data. The first circuitry generates the command, based on the operation correspondence command.
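The fourth circuitry's learned mapping from image data to an operation-correspondence command, which the autonomous circuitry then uses, can be sketched with a lookup table standing in for the trained model; the image features and command names are invented.

```python
# Stand-in for the machine-learned model: pairs of (image feature,
# operator's grinding command) recorded during manual operation.
LEARNED = {
    "rough_surface": "grind_hard",
    "smooth_surface": "grind_light",
}

def operation_correspondence(image_feature):
    """Fourth circuitry: image data in, operation-correspondence
    command out (default to a safe stop for unseen input)."""
    return LEARNED.get(image_feature, "stop")

def autonomous_command(image_feature):
    # First circuitry: generate the autonomous grinding command
    # based on the operation-correspondence command.
    return operation_correspondence(image_feature)

cmd = autonomous_command("rough_surface")
```

The real device would train this mapping from stored image and operation data rather than hard-code it.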

Method and system for robot manipulation planning

A method for planning a manipulation task of an agent, particularly a robot. The method includes: learning a number of manipulation skills wherein a symbolic abstraction of the respective manipulation skill is generated; determining a concatenated sequence of manipulation skills selected from the number of learned manipulation skills based on their symbolic abstraction so that a given goal specification indicating a given complex manipulation task is satisfied; and executing the sequence of manipulation skills.
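Concatenating learned skills via their symbolic abstractions until a goal specification is satisfied can be sketched as a simple forward search over precondition/effect sets; the skills, predicates, and greedy search policy are all invented for the example.

```python
# Symbolic abstractions of two learned manipulation skills:
# preconditions, add-effects, and delete-effects over a set of predicates.
SKILLS = {
    "grasp": {"pre": {"free_hand"}, "add": {"holding"}, "del": {"free_hand"}},
    "place": {"pre": {"holding"}, "add": {"placed", "free_hand"}, "del": {"holding"}},
}

def plan(state, goal, max_steps=10):
    """Greedy forward search: apply any applicable skill whose effects
    are not yet achieved, until the goal specification is satisfied."""
    seq = []
    for _ in range(max_steps):
        if goal <= state:          # goal predicates all hold
            return seq
        for name, s in SKILLS.items():
            if s["pre"] <= state and not s["add"] <= state:
                state = (state - s["del"]) | s["add"]
                seq.append(name)
                break
        else:
            return None            # no applicable skill
    return seq if goal <= state else None

sequence = plan({"free_hand"}, {"placed"})
```

A full planner would search over alternatives rather than greedily committing, but the concatenation of skills by matching symbolic preconditions to effects is the same idea.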