Patent classifications
B25J9/163
Grasping of an object by a robot based on grasp strategy determined using machine learning model(s)
Grasping of an object, by an end effector of a robot, based on a grasp strategy that is selected using one or more machine learning models. The grasp strategy utilized for a given grasp is one of a plurality of candidate grasp strategies. Each candidate grasp strategy defines a different group of one or more values that influence performance of a grasp attempt in a way that distinguishes it from the other candidate strategies. For example, the value(s) of a grasp strategy can define a grasp direction for grasping the object (e.g., “top”, “side”), a grasp type for grasping the object (e.g., “pinch”, “power”), the grasp force applied in grasping the object, pre-grasp manipulations to be performed on the object, and/or post-grasp manipulations to be performed on the object.
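The selection described above can be sketched as scoring each candidate strategy and picking the best one. This is a minimal illustration, not the patented method: `GraspStrategy`, `score_strategy`, and the width-based heuristic standing in for the learned model are all hypothetical names invented here.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class GraspStrategy:
    direction: str    # e.g. "top" or "side"
    grasp_type: str   # e.g. "pinch" or "power"
    force: float      # grasp force to apply (N)

def score_strategy(strategy: GraspStrategy, object_width: float) -> float:
    # Stand-in for a learned scoring model: prefer pinch grasps for
    # narrow objects and power grasps for wide ones.
    if object_width < 0.05:
        return 1.0 if strategy.grasp_type == "pinch" else 0.2
    return 1.0 if strategy.grasp_type == "power" else 0.3

def select_strategy(candidates: List[GraspStrategy],
                    object_width: float) -> GraspStrategy:
    # Evaluate every candidate strategy and keep the highest-scoring one.
    return max(candidates, key=lambda s: score_strategy(s, object_width))

candidates = [
    GraspStrategy("top", "pinch", 5.0),
    GraspStrategy("side", "power", 20.0),
]
best = select_strategy(candidates, object_width=0.12)
```

In a real system the scorer would be one or more trained models evaluating sensed object features rather than a single width threshold.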
Gripping force adjustment device and gripping force adjustment system
During machining of a workpiece, a gripping force adjustment device takes the machining state and the workpiece state into account in order to set a more appropriate gripping force. The device acquires data indicating the machining state produced by a machine tool and data relating to the gripping state applied to the workpiece by a jig, and creates data to be used in machine learning on the basis of the acquired data. On the basis of the created data, the device then executes machine learning processing relating to the gripping force that the jig exerts on the workpiece in the environment in which the machine tool machines the workpiece.
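The learning step can be sketched as mapping recorded machining states to gripping forces that held. A minimal sketch, assuming a toy nearest-neighbour lookup in place of the patent's unspecified learning method; `training` and `predict_force` are hypothetical names.

```python
# Toy training set built from acquired data:
# (spindle_load, vibration) -> gripping force (N) that held the workpiece
training = [
    ((0.2, 0.1), 40.0),
    ((0.5, 0.3), 60.0),
    ((0.9, 0.6), 90.0),
]

def predict_force(spindle_load: float, vibration: float) -> float:
    # 1-nearest-neighbour over the recorded machining/gripping states:
    # reuse the force from the most similar past machining state.
    def dist(sample):
        (load, vib), _ = sample
        return (load - spindle_load) ** 2 + (vib - vibration) ** 2
    return min(training, key=dist)[1]
```

Heavier cutting conditions thus map to a firmer grip, e.g. `predict_force(0.85, 0.55)` selects the 90 N sample.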
Robot control system
A robot control system includes a control device for controlling a robot and a portable operating panel connected to the control device. The portable operating panel and at least one other device include contact points connected in series. The control device includes a reception circuit that can detect the opening of at least one of the contact points. The portable operating panel includes a smart device having a sensor. The contact point included in the portable operating panel is opened and closed in conjunction with a physical movement of a switch member attached to an exterior of the smart device. The sensor can detect a physical quantity that changes in conjunction with the physical movement of the switch member. The portable operating panel transmits, to the control device, a detection signal indicating the physical quantity detected by the sensor or information about the physical quantity.
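The series wiring and the sensor cross-check can be illustrated in a few lines. This is a hypothetical sketch of the logic only, not the circuit itself; `contact_open`, `series_circuit_open`, and the displacement threshold are invented for illustration.

```python
def contact_open(sensor_value: float, threshold: float = 0.5) -> bool:
    # The smart device's sensor measures a physical quantity (here,
    # switch-member displacement) that changes with the switch movement;
    # beyond the threshold the contact is treated as open.
    return sensor_value > threshold

def series_circuit_open(contact_states) -> bool:
    # Contacts wired in series: the reception circuit sees an open
    # circuit as soon as any single contact point opens.
    return any(contact_states)

panel_open = contact_open(sensor_value=0.8)          # panel's contact
circuit_open = series_circuit_open([panel_open, False])
```

The sensor reading transmitted to the control device lets it corroborate an electrically detected open contact against the measured switch movement.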
System and method for augmenting a visual output from a robotic device
A method for visualizing data generated by a robotic device is presented. The method includes displaying an intended path of the robotic device in an environment. The method also includes displaying a first area in the environment identified as drivable for the robotic device. The method further includes receiving an input to identify a second area in the environment as drivable and transmitting the second area to the robotic device.
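The drivable-area augmentation step amounts to merging the robot-identified area with the operator-marked one. A minimal sketch under that reading; `augment_drivable` and the grid-cell representation are assumptions made here.

```python
def augment_drivable(identified: set, operator_marked: set) -> set:
    # Merge the first area (identified as drivable by the robot) with
    # the second area (marked drivable via operator input); the merged
    # set is what would be transmitted back to the robotic device.
    return identified | operator_marked

first_area = {(0, 0), (0, 1)}   # robot-identified drivable cells
second_area = {(1, 1)}          # operator-marked drivable cells
drivable = augment_drivable(first_area, second_area)
```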
Robot control parameter interpolation
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for computing interpolated robot control parameters. One of the methods includes receiving, by a real-time bridge from a control agent for a robot, a non-real-time command for the robot, wherein the non-real-time command specifies a trajectory to be attained by a component of the robot and a target value for a control parameter, wherein the control parameter controls how a real-time controller will cause the robot to react to one or more external stimuli encountered during a control cycle of the real-time controller. The real-time bridge translates the non-real-time command into one or more real-time commands and provides them, together with interpolated control parameter information, to the real-time controller, thereby causing the robot to effectuate the trajectory of the non-real-time command according to the interpolated control parameter information.
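The interpolation itself can be sketched as stepping a control parameter from its current value to the commanded target over a number of real-time control cycles. A minimal linear-interpolation sketch; `interpolate_parameter` is a hypothetical name, and the patent does not specify that the interpolation is linear.

```python
def interpolate_parameter(current: float, target: float, cycles: int):
    """Yield one interpolated control-parameter value per real-time
    control cycle, moving linearly from current to target."""
    for i in range(1, cycles + 1):
        yield current + (target - current) * i / cycles

# Ramp a parameter (e.g. a stiffness or reaction gain) over 4 cycles.
values = list(interpolate_parameter(current=0.0, target=1.0, cycles=4))
# values == [0.25, 0.5, 0.75, 1.0]
```

Gradual interpolation avoids a discontinuous jump in how the real-time controller reacts to external stimuli mid-trajectory.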
Apparatus and method for generating robot interaction behavior
Disclosed herein are an apparatus and method for generating robot interaction behavior. The method includes generating a co-speech gesture of a robot corresponding to utterance input of a user; generating a nonverbal behavior of the robot, that is, a sequence of next joint positions estimated from the joint positions of the user and the current joint positions of the robot by a pre-trained neural network model for robot pose estimation; and generating a final behavior using at least one of the co-speech gesture and the nonverbal behavior.
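The next-joint-position estimation can be sketched with a stand-in for the neural network: given the user's joint positions and the robot's current ones, produce the robot's next pose. `estimate_next_joints` and the blending rule are invented here purely for illustration; the patent's model is a trained network, not this heuristic.

```python
def estimate_next_joints(user_joints, robot_joints, mimic_weight=0.3):
    # Stand-in for the pre-trained pose-estimation network: move each
    # robot joint a fraction of the way toward the user's corresponding
    # joint, yielding the next pose in the behavior sequence.
    return [r + mimic_weight * (u - r)
            for u, r in zip(user_joints, robot_joints)]

# One step: the user has raised a joint to 1.0 rad; the robot follows.
next_pose = estimate_next_joints(user_joints=[1.0, 0.0],
                                 robot_joints=[0.0, 0.0])
```

Repeating this step produces the sequence of next joint positions that constitutes the nonverbal behavior.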
Hybrid computing architectures with specialized processors to encode/decode latent representations for controlling dynamic mechanical systems
Provided is a robot that includes: a first sensor having a first output and configured to sense state of a robot or an environment of the robot; a first hardware machine-learning accelerator coupled to the first output of the first sensor and configured to transform information sensed by the first sensor into a first latent-space representation; a second sensor having a second output and configured to sense state of the robot or the environment of the robot; a second hardware machine-learning accelerator configured to transform information sensed by the second sensor into a second latent-space representation; and a processor configured to control the robot based on both the first latent-space representation and the second latent-space representation.
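The data flow, two accelerators each compressing a sensor stream into a latent vector that a central processor fuses for control, can be sketched as follows. `encode` and `control` are hypothetical stand-ins; real hardware accelerators would run trained encoder networks, not a fixed projection.

```python
def encode(sensor_reading: float, weights):
    # Stand-in for a hardware ML accelerator: project the raw sensor
    # reading into a small latent-space representation.
    return [w * sensor_reading for w in weights]

def control(latent_a, latent_b) -> float:
    # The processor conditions its control output on both latent
    # representations (here, a trivial fusion by summation).
    return sum(latent_a) + sum(latent_b)

camera_latent = encode(2.0, [0.1, 0.2])   # first sensor -> accelerator
imu_latent = encode(1.0, [0.5])           # second sensor -> accelerator
command = control(camera_latent, imu_latent)
```

Encoding at the sensor keeps only the compact latent vectors moving to the processor, rather than the full raw sensor streams.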
Operation Parameter Adjusting Method And Operation Parameter Adjusting Device For Adjusting Operation Parameters Of Robot
An operation parameter adjusting method according to an aspect includes a detecting step for causing a robot to execute a plurality of adjustment operations using candidate values of operation parameters and acquiring detection values of a detecting section, an operation parameter updating step for executing optimization processing for the operation parameters using the acquired detection values to thereby obtain new candidate values of the operation parameters, a repeating step for repeating the operation parameter updating step and the detecting step, and an operation parameter determining step for determining, based on one or more candidate values of the operation parameters obtained by the repeating step, the operation parameters to be used in the robot system. The detecting step includes a suspension determining step for deciding whether to continue or suspend the detecting step based on a comparison of the detection values acquired from part of the adjustment operations with a reference value.
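The detect/update/repeat loop with a suspension check can be sketched as follows. This is a toy illustration, `tune_parameter`, the gradient-like update, and the objective are all invented here, and the patent's optimization processing is not specified to be this rule.

```python
def tune_parameter(run_operation, candidate, iterations, reference):
    """Toy detect/update loop: run the adjustment operation with the
    candidate value, suspend early once the detection value reaches the
    reference, otherwise update the candidate and repeat."""
    for _ in range(iterations):
        detection = run_operation(candidate)   # detecting step
        if detection <= reference:             # suspension determining step
            break
        candidate -= 0.1 * detection           # operation parameter update
    return candidate                           # determined operation parameter

# Toy objective: the detection value shrinks as the parameter nears 0.
result = tune_parameter(lambda p: abs(p), candidate=1.0,
                        iterations=50, reference=0.05)
```

The early-exit comparison against the reference value is what lets detection be suspended without running every remaining adjustment operation.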
Information processing apparatus, information processing method, and system
An information processing apparatus includes an acquisition unit that acquires a first image and a second image, the first image being an image of a target area in an initial state, and the second image being an image of the target area in which a first object conveyed from a supply area has been placed. An estimation unit estimates one or more second areas in the target area based on a feature of a first area estimated using the first image and the second image, the first area being where the first object is placed, and each of the one or more second areas being an area, different from the first area, where an object in the supply area can be placed. A control unit controls a robot to convey a second object different from the first object from the supply area to any of the one or more second areas.
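Treating the two images as occupancy grids, the estimation amounts to diffing them to locate the first area and then proposing free cells as second areas. A minimal sketch under that assumption; `changed_cells`, `placement_candidates`, and the grid encoding are invented for illustration.

```python
# Toy "images" of the target area as occupancy grids (0 = empty).
before = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 0}  # first image
after  = {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 0}  # second image

def changed_cells(before, after):
    # The first area: cells that differ between the two images,
    # i.e. where the first object was placed.
    return {c for c in after if after[c] != before[c]}

def placement_candidates(after, first_area):
    # Second areas: empty cells of the target area, excluding the
    # cell(s) the first object now occupies.
    return sorted(c for c, v in after.items()
                  if v == 0 and c not in first_area)

first_area = changed_cells(before, after)
second_areas = placement_candidates(after, first_area)
```

A real estimation unit would also use features of the first area (e.g. the object footprint) to constrain which empty regions qualify as second areas.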
Triggering dynamic robotic process automation
A device and method for robotic process automation (RPA) using speech recognition. The device receives a voice input; invokes, using the received voice input, an RPA workflow comprising a sequence of tasks; retrieves, based at least on the invoked RPA workflow, an argument from a cloud device; modifies, with the retrieved argument, at least one task of the sequence of tasks; and executes the modified at least one task as part of the RPA workflow.
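The invoke/retrieve/modify/execute chain can be sketched as follows. A minimal illustration only: `run_rpa`, the workflow table, and the `{}` placeholder convention are hypothetical, and real RPA platforms define workflows quite differently.

```python
def run_rpa(voice_text, workflows, fetch_argument):
    # voice_text stands in for the output of speech recognition.
    tasks = workflows[voice_text]           # invoke the matching workflow
    argument = fetch_argument(voice_text)   # stand-in for cloud retrieval
    # Modify a task of the sequence with the retrieved argument.
    tasks = [tasks[0].format(argument)] + tasks[1:]
    return tasks                            # "execute" the modified tasks

workflows = {"generate report": ["open {}", "export pdf"]}
log = run_rpa("generate report", workflows, lambda _: "sales.xlsx")
```

Fetching the argument at invocation time is what makes the workflow dynamic: the same spoken trigger can act on whatever the cloud device currently supplies.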