Robot control device, robot system, and robot control method

A robot control device includes: a trained model built by being trained on work data; a control data acquisition section which acquires control data of the robot based on data from the trained model; base trained models built for each of a plurality of simple operations by being trained on work data; an operation label storage section which stores operation labels corresponding to the base trained models; a base trained model combination information acquisition section which acquires combination information when the trained model is represented by a combination of a plurality of the base trained models, by acquiring a similarity between the trained model and the respective base trained models; and an information output section which outputs the operation label corresponding to each of the base trained models which represent the trained model.
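As a rough illustration of acquiring a similarity between a trained model and base trained models, the sketch below compares flattened parameter vectors by cosine similarity and returns the operation labels of sufficiently similar base models. The flat-list weight representation, the threshold, and all names are assumptions for illustration, not details from the abstract.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length parameter vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def combination_labels(trained, base_models, labels, threshold=0.5):
    """Return operation labels of base models similar to the trained model."""
    out = []
    for base, label in zip(base_models, labels):
        if cosine_similarity(trained, base) >= threshold:
            out.append(label)
    return out
```

In practice the comparison would act on real model parameters or activations; the point here is only the shape of the combination-information step.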

ROBOTIC SYSTEM

The present disclosure relates generally to robotic systems, and more specifically to systems and methods for a robotic platform comprising an on-demand intelligence component. An exemplary computer-enabled method for operating a robot comprises obtaining an instruction for the robot, wherein the instruction is associated with a first user; identifying, based on the instruction, a task; transmitting the task to the robot; receiving, from the robot, a request associated with the task; determining whether the request can be solved by one or more trained machine-learning algorithms; if the request cannot be solved by the one or more trained machine-learning algorithms, transmitting a query to a second user's electronic device; receiving a response to the query from the second user; and causing the task to be performed by the robot based on the response.
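The escalation flow described above can be sketched minimally: offer the request to the trained models first, and query a human operator (the "second user") only if no model is confident enough. The confidence threshold, the callable interfaces, and all names are illustrative assumptions.

```python
def handle_request(request, models, ask_human, confidence_threshold=0.8):
    """Resolve a robot's request with ML if possible, else escalate to a human."""
    for model in models:
        answer, confidence = model(request)
        if confidence >= confidence_threshold:
            return answer, "ml"          # solved by a trained algorithm
    # No model was confident enough: escalate to the second user's device.
    return ask_human(request), "human"
```

A real system would transmit the query over a network and wait asynchronously; this sketch only shows the decision logic.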

ROBOT SYSTEM, LEARNING APPARATUS, INFORMATION PROCESSING APPARATUS, LEARNED MODEL, CONTROL METHOD, INFORMATION PROCESSING METHOD, METHOD FOR MANUFACTURING PRODUCT, AND RECORDING MEDIUM

A robot system includes a robot, and an information processing portion. The information processing portion is configured to obtain a learned model by learning first force information about a force applied by a worker to a workpiece, first position information about a position of a first portion of the worker, and first workpiece information about a state of the workpiece, and to control the robot on the basis of output data of the learned model.
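One minimal way to picture learning from such demonstrations is below: each sample concatenates force, position, and workpiece-state features, and a 1-nearest-neighbour lookup stands in for the learned model. This is a placeholder for whatever model the patent actually contemplates; all names are assumptions.

```python
def make_feature(force, position, workpiece):
    """Concatenate force, position, and workpiece-state features."""
    return list(force) + list(position) + list(workpiece)

def predict(samples, commands, query):
    """Return the command of the demonstration closest to the query features."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(range(len(samples)), key=lambda i: dist(samples[i], query))
    return commands[best]
```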

Unified collaborative environments

A unified collaboration environment is formed by establishing a local workspace positional frame of reference using a plurality of UWB transceivers. With the frame of reference established, a communication link is established between each of the workspaces and a collaboration module to form a peer-to-peer network. Data is received from each of the workspaces, including the local workspace frame of reference, the set of available assets, and workspace behavior (tasks). The collaboration module crafts a unified collaboration environment by transforming each local workspace frame into a collaborative positional frame of reference. A user, through a user interface, can offer real-time input to a virtualized version of the workspace to augment actions within the workspace environment.
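The frame transformation at the heart of this can be sketched as a planar rigid transform: a point expressed in a local UWB-derived workspace frame is rotated and translated into the shared collaborative frame. The 2-D simplification and all parameter names are assumptions for illustration.

```python
import math

def to_collaborative_frame(point, theta, offset):
    """Map a 2-D point from a local workspace frame into the collaborative frame.

    theta  -- rotation of the local frame relative to the collaborative frame
    offset -- translation of the local frame's origin in the collaborative frame
    """
    x, y = point
    cx = math.cos(theta) * x - math.sin(theta) * y + offset[0]
    cy = math.sin(theta) * x + math.cos(theta) * y + offset[1]
    return (cx, cy)
```

A full system would use 3-D homogeneous transforms; the 2-D case shows the idea.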

Position/force controller, and position/force control method and program

A position/force controller performs: detecting information relating to a position based on the effect of an actuator; converting, by distributing control energy into speed or positional energy and force energy, in accordance with functions realized on the basis of speed (position) and force information corresponding to the information relating to the position and on the basis of information serving as a reference for control; calculating the control amount for speed or position on the basis of the speed or positional energy; calculating the force control amount on the basis of the force energy; and integrating the speed or position control amount and the force control amount and performing a reverse conversion on them to determine the input to be returned to the actuator.
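A highly simplified sketch of this split-and-recombine structure: the measurement is distributed into a position channel and a force channel, each channel computes its own control amount against its reference, and the two amounts are recombined into one actuator input. The proportional gains and all names are assumptions, not the patent's control law.

```python
def hybrid_control(position, force, pos_ref, force_ref, kp=1.0, kf=0.5):
    """Combine a position-channel and a force-channel control amount."""
    pos_amount = kp * (pos_ref - position)      # position-channel control amount
    force_amount = kf * (force_ref - force)     # force-channel control amount
    return pos_amount + force_amount            # integrated actuator input
```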

SYSTEMS, APPARATUS, AND METHODS FOR ROBOTIC LEARNING AND EXECUTION OF SKILLS

Systems, apparatus, and methods are described for robotic learning and execution of skills. A robotic apparatus can include a memory, a processor, sensors, and one or more movable components (e.g., a manipulating element and/or a transport element). The processor can be operatively coupled to the memory, the movable components, and the sensors, and configured to obtain information of an environment, including one or more objects located within the environment. In some embodiments, the processor can be configured to learn skills through demonstration, exploration, user inputs, etc. In some embodiments, the processor can be configured to execute skills and/or arbitrate between different behaviors and/or actions. In some embodiments, the processor can be configured to learn an environmental constraint. In some embodiments, the processor can be configured to learn using a general model of a skill.
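Arbitration between behaviors can be pictured as priority-based selection: each behavior proposes an action with an applicability flag and a priority, and the arbiter executes the highest-priority applicable proposal. The dictionary schema and names are illustrative assumptions.

```python
def arbitrate(proposals):
    """Pick the action from the highest-priority applicable behavior proposal."""
    applicable = [p for p in proposals if p["applicable"]]
    if not applicable:
        return None                              # no behavior wants control
    return max(applicable, key=lambda p: p["priority"])["action"]
```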

MULTI-SENSOR ARRAY INCLUDING AN IR CAMERA AS PART OF AN AUTOMATED KITCHEN ASSISTANT SYSTEM FOR RECOGNIZING AND PREPARING FOOD AND RELATED METHODS

An automated kitchen assistant system inspects a food preparation area in the kitchen environment using a novel sensor combination. The combination of sensors includes an Infrared (IR) camera that generates IR image data and at least one secondary sensor that generates secondary image data. The IR image data and secondary image data are processed to obtain combined image data. A trained convolutional neural network is employed to automatically compute an output based on the combined image data. The output includes information about the identity and the location of the food item. The output may further be utilized to command a robotic arm, kitchen worker, or otherwise assist in food preparation. Related methods are also described.
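The sensor-fusion step (before the trained network sees the data) can be sketched as per-pixel channel stacking: the IR reading and the secondary-sensor reading for each pixel are combined into one multi-channel image. Nested lists stand in for image arrays, and the CNN itself is out of scope; all names are assumptions.

```python
def combine_images(ir, secondary):
    """Stack per-pixel IR and secondary-sensor values into combined image data."""
    assert len(ir) == len(secondary) and len(ir[0]) == len(secondary[0])
    return [
        [(ir[r][c], secondary[r][c]) for c in range(len(ir[0]))]
        for r in range(len(ir))
    ]
```

In a real pipeline the two sensors would first be registered to a common viewpoint so that corresponding pixels observe the same point in the food preparation area.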

ROBOT AND CONTROL METHOD FOR ROBOT
20180319017 · 2018-11-08

Provided is a robot which allows a user to set a parameter without a separate, dedicated input section via which the parameter is set. A robot (1) includes (i) a right arm part, (ii) a servomotor, serving as a right shoulder pitch, which is configured to drive the right arm part, (iii) an obtaining section (105) which is configured to obtain positional information on a position of the right arm part which has been operated, and (iv) a setting section (107) which is configured to set a value of a predetermined parameter to a value corresponding to the positional information that is obtained by the obtaining section (105).
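The setting-section idea amounts to using the arm as an input device: the measured arm angle is mapped onto a parameter range. The linear mapping, the angle range, and the parameter range below are illustrative assumptions.

```python
def position_to_parameter(angle, angle_range=(0.0, 180.0), param_range=(0.0, 100.0)):
    """Linearly map a measured arm angle onto a parameter value, with clamping."""
    lo, hi = angle_range
    plo, phi = param_range
    ratio = (angle - lo) / (hi - lo)
    ratio = min(max(ratio, 0.0), 1.0)           # clamp to the valid angle range
    return plo + ratio * (phi - plo)
```

For example, raising the arm halfway would set a volume-like parameter to the middle of its range.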

SYSTEM AND METHOD FOR FLEXIBLE HUMAN-MACHINE COLLABORATION
20180290303 · 2018-10-11

Methods and systems for enabling human-machine collaborations include a generalizable framework that supports dynamic adaptation and reuse of robotic capability representations and human-machine collaborative behaviors. Specifically, a method of feedback-enabled user-robot collaboration includes obtaining a robot capability that models a robot's functionality for performing task actions, specializing the robot capability with an information kernel that encapsulates task-related parameters associated with the task actions, and providing an instance of the specialized robot capability as a robot capability element that controls the robot's functionality based on the task-related parameters. The method also includes obtaining, based on the robot capability element's user interaction requirements, user interaction capability elements, via which the robot capability element receives user input and provides user feedback, controlling, based on the task-related parameters, the robot's functionality to perform the task actions in collaboration with the user input; and providing the user feedback including task-related information generated by the robot capability element in association with the task actions.
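The specialization step described above can be pictured as binding a generic capability to an "information kernel" of task-related parameters, yielding a capability element that can be invoked during collaboration. The class structure and names are assumptions for illustration, not the patent's framework.

```python
class RobotCapability:
    """A generic model of a robot's functionality for performing task actions."""

    def __init__(self, name, action):
        self.name = name
        self.action = action                    # callable(**task_params) -> command

    def specialize(self, kernel):
        """Bind task-related parameters, producing a robot capability element."""
        def element():
            return self.action(**kernel)
        return element
```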

Work Teaching Device and Work Teaching Method for Robot

A work teaching device for a robot that teaches the robot work performed by a teacher is configured to include: a teaching pose measurement unit that measures a position and posture of an object grasped by the teacher; a positioning detection unit for detecting that the object moved by the teacher has been positioned; a grasping motion detection unit for detecting that the object is grasped by the teacher; a functional operation detection unit for detecting that the teacher operates a function of the object; a work state confirming motion detection unit that detects that the teacher has confirmed a work state of the object; a teaching program generation unit that receives signals from the teaching pose measurement unit, the positioning detection unit, the grasping motion detection unit, the functional operation detection unit, and the work state confirming motion detection unit and generates a teaching program for the robot in which the signals are divided for each movement of the teacher; and a teaching program execution unit that executes the teaching program generated by the teaching program generation unit.
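The generation unit's "divided for each movement" behavior can be sketched as event segmentation: detector events stream in, and a positioning event is assumed to close the current movement and start a new program step. The event names and the boundary rule are illustrative assumptions.

```python
def segment_events(events):
    """Group a stream of detector events into per-movement teaching-program steps."""
    program, current = [], []
    for event in events:
        current.append(event)
        if event == "positioned":               # assumed movement boundary
            program.append(current)
            current = []
    if current:                                 # keep any trailing partial movement
        program.append(current)
    return program
```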