Patent classifications
G05B2219/40571
UTILIZING PAST CONTACT PHYSICS IN ROBOTIC MANIPULATION (E.G., PUSHING) OF AN OBJECT
Utilization of past dynamics sample(s), which reflect past contact physics information, in training and/or utilizing a neural network model. The neural network model represents a learned value function (e.g., a Q-value function) that, when trained, can be used to select a sequence of robotic actions to implement in robotic manipulation (e.g., pushing) of an object by a robot. In various implementations, a past dynamics sample for an episode of robotic manipulation can include at least two past images from the episode, as well as one or more past force sensor readings that temporally correspond to those past images.
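As an illustrative sketch only (the function names, the linear stand-in for the neural network, and all shapes are assumptions, not taken from the classification definition), a Q-value model conditioned on two past images and temporally aligned force readings, used for greedy action selection, might look like:

```python
import numpy as np

def q_value(past_imgs, past_forces, action, weights):
    """Score a candidate action given past contact-physics context.

    past_imgs:   two past images from the episode, stacked
    past_forces: force-sensor readings temporally aligned with the images
    action:      candidate robot action (e.g., a push vector)
    weights:     learned parameters (a linear stand-in for a neural network)
    """
    features = np.concatenate([past_imgs.ravel(), past_forces.ravel(), action])
    return float(features @ weights)

def select_action(past_imgs, past_forces, candidates, weights):
    """Greedy selection: pick the candidate action with the highest Q-value."""
    scores = [q_value(past_imgs, past_forces, a, weights) for a in candidates]
    return candidates[int(np.argmax(scores))]
```

In a real implementation the linear scoring function would be replaced by a trained network, and selection could be repeated step-by-step to build the sequence of actions.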
Object grasp system and method
A grasping system includes a robotic arm having a gripper. A fixed sensor monitors a grasp area, and an onboard sensor that moves with the gripper also monitors the area. A controller receives information indicative of a position of an object to be grasped and operates the robotic arm to bring the gripper into a grasp position adjacent the object based on information provided by the fixed sensor. The controller is also programmed to operate the gripper to grasp the object in response to information provided by the onboard sensor.
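A minimal sketch of this coarse/fine division of labor (names and return values are illustrative assumptions, not from the abstract): the fixed sensor's object pose drives the approach, while the gripper-mounted sensor's reading gates the actual grasp.

```python
def grasp_controller(object_pose, onboard_contact_ok):
    """Two-stage grasp logic.

    object_pose:        object position reported by the fixed sensor
    onboard_contact_ok: whether the gripper-mounted sensor confirms the
                        object is within grasp reach
    """
    approach_cmd = ("move_to", object_pose)                   # fixed sensor: coarse approach
    gripper_cmd = "close" if onboard_contact_ok else "wait"   # onboard sensor: grasp trigger
    return approach_cmd, gripper_cmd
```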
ROBOT APPARATUS, METHOD FOR CONTROLLING ROBOT APPARATUS, IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, METHOD FOR MANUFACTURING PRODUCT, AND RECORDING MEDIUM
A robot apparatus includes a robot, an image pickup portion, and a controller configured to control the robot. The controller obtains information about force by comparing a predetermined image with a captured image obtained by the image pickup portion imaging the robot, and performs force control of the robot on the basis of the information about force.
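One way to ground this image-to-force idea, sketched under stated assumptions (a linear stiffness model and a centroid-shift deflection estimate; neither is specified by the abstract): the apparent displacement of the robot between the predetermined image and the captured image is read as a deflection, and force follows from an assumed stiffness.

```python
import numpy as np

def estimate_force(reference_img, captured_img, stiffness, metres_per_px):
    """Estimate contact force from apparent deflection of the robot.

    Deflection is approximated by the centroid shift of bright pixels
    between the predetermined (reference) image and the captured image;
    force follows from an assumed linear stiffness (N/m).
    """
    def centroid(img):
        ys, xs = np.nonzero(img > img.mean())
        return np.array([xs.mean(), ys.mean()])
    deflection_px = np.linalg.norm(centroid(captured_img) - centroid(reference_img))
    return stiffness * deflection_px * metres_per_px
```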
Robotic systems and methods for robustly grasping and targeting objects
Embodiments are generally directed to generating a training dataset of labelled examples of sensor images and grasp configurations using a set of three-dimensional (3D) models of objects, one or more analytic mechanical representations of either or both of grasp forces and grasp torques, and statistical sampling to model uncertainty in either or both sensing and control. Embodiments can also include using the training dataset to train a function approximator that takes as input a sensor image and returns data that is used to select grasp configurations for a robot grasping or targeting mechanism.
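A hedged sketch of the dataset-generation loop described above (the quality metric, noise model, and threshold are placeholder assumptions, not the patent's analytic representations): sample grasps against object geometry, label each with an analytic quality score, and perturb the sensed input to model sensing uncertainty.

```python
import numpy as np

def grasp_quality(contact_normal, grasp_axis):
    """Analytic stand-in: a grasp axis aligned with the contact normal
    is scored as higher quality."""
    return abs(float(np.dot(contact_normal, grasp_axis)))

def make_dataset(n, noise_std, rng):
    """Generate labelled (sensed grasp, success) pairs, perturbing the
    sensed grasp axis to model uncertainty in sensing."""
    data = []
    for _ in range(n):
        normal = rng.normal(size=3); normal /= np.linalg.norm(normal)
        axis = rng.normal(size=3); axis /= np.linalg.norm(axis)
        sensed = axis + rng.normal(scale=noise_std, size=3)   # noisy observation
        label = grasp_quality(normal, axis) > 0.5             # analytic label
        data.append((sensed, label))
    return data
```

A function approximator would then be trained on such pairs to map a sensor input to a grasp-quality prediction.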
ROBOTIC PALLETIZATION SYSTEM WITH VARIABLE CONVEYOR HEIGHT
A robotic palletization/depalletization system is disclosed. In various embodiments, data associated with a plurality of items to be stacked on or in a destination location is received, and a plan to stack the items on or in the destination location is generated based at least in part on the received data. Generating the plan includes determining, for each item, a source location from which to pick it based at least in part on (i) an attribute of the source location, and (ii) a state of a platform or receptacle on which one or more items are to be stacked.
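A toy sketch of that two-factor source selection (the attribute names, scoring, and penalty are illustrative assumptions only): each candidate source is scored by its own attribute and by whether the current stack state can accept its item.

```python
def choose_source(sources, stack_height, max_height):
    """Pick a source location by combining (i) an attribute of the source
    (here, an accessibility score) with (ii) the state of the stack
    (here, whether adding the item would exceed the allowed height)."""
    def score(src):
        overflow = stack_height + src["item_height"] > max_height
        penalty = 10.0 if overflow else 0.0
        return src["accessibility"] - penalty
    return max(sources, key=score)
```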
Robotic multi-item type palletizing and depalletizing
Techniques are disclosed to use a robotic arm to palletize or depalletize diverse items. In various embodiments, data associated with a plurality of items to be stacked on or in a destination location is received. A plan to stack the items on or in the destination location is generated based at least in part on the received data. The plan is implemented at least in part by controlling the robotic arm to pick up the items and stack them on or in the destination location according to the plan, including, for each item: using one or more first order sensors to move the item to a first approximation of its destination position at the destination location; and using one or more second order sensors to snug the item into a final position.
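The two-phase placement can be sketched in one dimension (the function shape and the idea of a residual-offset callback are assumptions for illustration, not the patent's sensor interface): a first-order sensor yields an approximate position, and a second-order sensor measures the remaining offset used to snug the item home.

```python
def place_item(first_order_pos, second_order_offset):
    """Two-phase placement.

    first_order_pos:     approximate destination position from first-order sensing
    second_order_offset: callable returning the residual offset measured by
                         second-order sensing at a given position
    """
    approx = first_order_pos                       # phase 1: coarse move
    final = approx + second_order_offset(approx)   # phase 2: snug into final position
    return approx, final
```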
ROBOTIC DEVICE, ROBOTIC DEVICE CONTROLLING SYSTEM, AND ROBOTIC DEVICE CONTROLLING METHOD
A robotic device performs work defined by a series of unit jobs. The robotic device includes a first notice section and a controller. The controller includes a first calculation section that calculates the end time of the work based on the time required for the work, and controls the first notice section so that it issues notice of the work ending before that end time. The controller further includes a first determination section that determines, based on the end time, the time at which to issue the notice; this time is before the end time.
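The timing rule reduces to simple arithmetic; a minimal sketch (parameter names are assumptions): the end time is the start time plus the time required, and the notice time is determined from the end time by subtracting a lead interval.

```python
def notice_schedule(start, required, lead):
    """Compute (notice_time, end_time) for a piece of work.

    end_time    = start + time required for the work
    notice_time = end_time - lead, i.e. before the end time
    """
    end = start + required
    return end - lead, end
```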
ASSEMBLING PARTS IN AN ASSEMBLY LINE
A method for assembling parts in an assembly line, such as an automotive final assembly line, is disclosed. The method includes advancing a part along the assembly line with an Automated Guided Vehicle (AGV), arranging a first real time vision system to monitor the position of the AGV in at least two directions, and providing the readings of the first real time vision system to a controller arranged to control an assembly unit of the assembly line to perform an automated operation on the part that is advanced or supported by the AGV. An assembly line is also disclosed.
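As a sketch of the feedback path (the correction form is an assumption; the abstract only states that the vision readings are provided to the controller): the vision system's two-direction AGV position is compared with the expected position, and the difference becomes a correction for the assembly unit operating on the advancing part.

```python
def agv_correction(agv_xy, expected_xy):
    """Turn real-time vision readings of the AGV position (two directions)
    into a correction applied by the assembly unit so the automated
    operation tracks the part carried by the AGV."""
    dx = expected_xy[0] - agv_xy[0]
    dy = expected_xy[1] - agv_xy[1]
    return dx, dy
```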