Patent classifications
G05B2219/40609
ROBOT CONTROL APPARATUS, ROBOT CONTROL SYSTEM, ROBOT CONTROL METHOD, AND COMPUTER-READABLE STORAGE MEDIUM STORING A ROBOT CONTROL PROGRAM
A robot control apparatus according to one or more embodiments may include: a calculating unit configured to calculate an interference range of a robot based on a model of the robot in a state in which an object is gripped by a gripper with which the robot is equipped; and a planning unit configured to plan a motion of the robot based on the model and the interference range.
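As a rough sketch of the idea, one might approximate the interference range as a bounding sphere over the gripper and the gripped object, and have the planner reject waypoints whose sphere overlaps an obstacle. The function names and the sphere simplification are hypothetical, not taken from the patent:

```python
import numpy as np

def interference_radius(gripper_points, object_points):
    """Approximate the interference range of the tool as the radius of a
    bounding sphere (about the flange origin) over the gripper point set
    and the gripped-object point set."""
    pts = np.vstack([gripper_points, object_points])
    return float(np.linalg.norm(pts, axis=1).max())

def plan_is_collision_free(waypoints, obstacles, radius):
    """Accept a planned motion only if no waypoint's interference sphere
    overlaps an obstacle; `obstacles` is a list of (center, r) spheres."""
    for wp in waypoints:
        for center, r in obstacles:
            if np.linalg.norm(wp - center) < radius + r:
                return False
    return True
```

A real planner would sweep the full robot model, but the sphere test captures why the interference range must be recomputed whenever the gripped object changes.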
Apparatus and method for building a pallet load
A pallet building apparatus for automatically building a pallet load of pallet load article units onto a pallet support. The apparatus includes a frame defining a pallet building base, at least one articulated robot to transport and place the pallet load article units, a controller to control articulated robot motion and effect therewith a pallet load build, and at least one three-dimensional time-of-flight camera to generate three-dimensional imaging of the pallet support and pallet load build. The controller registers, from the three-dimensional camera, real-time three-dimensional imaging data embodying different corresponding three-dimensional images of the pallet support and pallet load build. From that data it determines, in real time, a pallet support variance or article unit variance and generates, in real time, an articulated robot motion signal. The motion signal is generated and performed in real time by the at least one articulated robot between placement of one pallet load article unit and the serially consecutive article unit, enabling substantially continuous building of the pallet load.
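The variance-detection step could be sketched, under heavy simplification, as a comparison between the planned build surface and the measured time-of-flight height map. The function name and tolerance value are assumptions for illustration:

```python
import numpy as np

def placement_variance(expected_height_map, measured_height_map, tol=0.005):
    """Compare the planned pallet-build surface with the measured
    time-of-flight height map (both 2-D arrays in metres) and report the
    peak deviation plus whether a corrective robot motion is warranted."""
    deviation = np.abs(measured_height_map - expected_height_map)
    peak = float(deviation.max())
    return peak, bool(peak > tol)
```

In the apparatus this check would run between consecutive article-unit placements, so a correction never interrupts the continuous build.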
Pre-filled parenteral drug inspection station and method of using the same
The invention is a flexible, configurable system for inspecting container units. It integrates a holding assembly for multiple containers with servo-controlled rotation of the units, container transport and positioning that simulate human handling, and camera stations employing automated vision inspection. The system performs horizontal inspection for particulate and any other container defect, an orientation that encourages particulate to move into the inspection area of the cameras. Inspection sequences and product recipes combine typical manual-inspection agitation with automated rotational inspection techniques to optimize detection. The system supports semi-automatic operation, with an operator at the front of the station feeding and out-feeding material manually, or fully automated operation, with a conveyance system feeding and out-feeding material from the back of the station.
Robot
A robot includes a first arm having a hole and extending along a first axis, a second arm coupled to the first arm and rotating around a second axis crossing the first axis, a sensor configured to detect a target, and an attachment member provided on the second arm and configured to support the sensor, wherein the attachment member is inserted through the hole and extends along the second axis. Further, the sensor may be located outside an outer surface of the first arm.
Machine learning control of object handovers
A robotic control system directs a robot to take an object from a human grasp by obtaining an image of a human hand holding an object, estimating the pose of the human hand and the object, and determining a grasp pose for the robot that will not interfere with the human hand. In at least one example, a depth camera is used to obtain a point cloud of the human hand holding the object. The point cloud is provided to a deep network that is trained to generate a grasp pose for a robotic gripper that can take the object from the human's hand without pinching or touching the human's fingers.
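A purely geometric stand-in for the learned grasp selection (the actual system feeds the point cloud to a trained deep network) might simply pick the object point with the greatest clearance from the detected finger keypoints:

```python
import numpy as np

def safe_grasp_point(object_points, finger_points):
    """Choose the candidate grasp point on the object that maximises
    clearance to the nearest human finger keypoint. A geometric toy,
    not the trained network described in the abstract."""
    # pairwise distances: each object point vs. each finger keypoint
    d = np.linalg.norm(object_points[:, None, :] - finger_points[None, :, :],
                       axis=2)
    clearance = d.min(axis=1)           # nearest finger per object point
    return object_points[int(clearance.argmax())]
```

The learned approach generalises this by also scoring grasp orientation and gripper feasibility, which a nearest-point heuristic cannot capture.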
DEVICE AND METHOD FOR TRAINING A NEURAL NETWORK FOR CONTROLLING A ROBOT FOR AN INSERTING TASK
A method for training a neural network to derive, from the force and moment exerted on an object when it is pressed onto a plane in which an insertion for the object is located, a movement vector to insert the object into the insertion. The method includes, for a plurality of positions in which the object, or the part of the object held by the robot, touches the plane in which the insertion is located: controlling the robot to move to the position; controlling the robot to press the object onto the plane; measuring the force and moment experienced by the object; scaling the force-moment pair by a number randomly chosen between zero and a predetermined positive maximum; and labelling the scaled pair with the movement vector between the position and the insertion. The neural network is then trained using the labelled pairs of force and moment.
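The sample-generation step, random scaling of the measured force-moment pair, labelled with the vector to the insertion, could be sketched as follows; the function name and argument layout are assumptions:

```python
import random

def make_training_pair(force, moment, position, insertion_pos,
                       max_scale=2.0, rng=None):
    """Build one labelled sample: scale the measured (force, moment) pair
    by a random factor in [0, max_scale] and label it with the movement
    vector from the touch position to the insertion."""
    rng = rng or random.Random()
    s = rng.uniform(0.0, max_scale)
    features = ([s * f for f in force], [s * m for m in moment])
    label = [i - p for p, i in zip(position, insertion_pos)]
    return features, label
```

Random scaling makes the network invariant to how hard the robot presses, so only the direction of the contact wrench informs the movement vector.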
DEVICE AND METHOD FOR CONTROLLING A ROBOT TO INSERT AN OBJECT INTO AN INSERTION
A method for controlling a robot to insert an object into an insertion. The method includes: controlling the robot to hold the object; generating an estimate of the target position for inserting the object into the insertion; controlling the robot to move to the estimated target position; taking a camera image with a camera mounted on the robot after the robot has moved to the estimated target position; feeding the camera image into a neural network trained to derive, from camera images, movement vectors specifying the movements, from the positions at which the images are taken, needed to insert objects into insertions; and controlling the robot to move according to the movement vector derived by the neural network from the camera image.
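The control flow amounts to a visual-servoing sketch: move to the estimate, then let the network's movement vector correct the pose until the correction is negligible. Here `robot`, `camera`, and `policy` are hypothetical stand-ins, not a specific API:

```python
def insert_with_visual_correction(robot, camera, policy,
                                  target_estimate, max_steps=5, tol=1e-3):
    """Move to the estimated target, then repeatedly turn the wrist-camera
    image into a corrective movement vector via the trained policy."""
    robot.move_to(target_estimate)
    for _ in range(max_steps):
        image = camera.capture()
        delta = policy(image)                  # movement vector from the network
        if max(abs(d) for d in delta) < tol:   # correction negligible: done
            break
        robot.move_by(delta)
    return robot.position
```

Iterating the correction (rather than applying it once) lets the method tolerate an imprecise initial target estimate.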
DEVICE AND METHOD FOR TRAINING A NEURAL NETWORK FOR CONTROLLING A ROBOT FOR AN INSERTING TASK
A method for training a neural network to derive, from an image taken by a camera mounted on a robot, a movement vector for the robot to insert an object into an insertion. The method includes: controlling the robot to hold the object; bringing the robot into a target position in which the object is inserted in the insertion; for each of a plurality of positions different from the target position, controlling the robot to move away from the target position to that position, taking a camera image with the camera, and labelling the camera image with the movement vector to move back from the position to the target position; and training the neural network using the labelled camera images.
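Dataset collection as described, starting at the inserted pose, backing away by known offsets, and labelling each image with the reverse vector, might look like this sketch (all names hypothetical):

```python
def collect_training_images(robot, camera, target_pose, offsets):
    """For each offset, move away from the target pose, take an image,
    and label it with the movement vector back to the target."""
    data = []
    for off in offsets:
        pose = [t + o for t, o in zip(target_pose, off)]
        robot.move_to(pose)
        image = camera.capture()
        label = [-o for o in off]   # vector that moves back to the target
        data.append((image, label))
    return data
```

The appeal of this scheme is that labels are free: every perturbation from the known target position yields an exact ground-truth movement vector.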
DEVICE AND METHOD FOR TRAINING A NEURAL NETWORK FOR CONTROLLING A ROBOT FOR AN INSERTING TASK
A method for training a neural network to derive, from an image taken by a camera mounted on a robot, a movement vector to insert an object into an insertion. The method includes: for each of a plurality of positions in which the object held by the robot touches the plane in which the insertion is located, controlling the robot to move to that position, taking a camera image with the camera, and labelling the camera image with the movement vector between the position and the insertion in the plane; and training the neural network using the labelled camera images.
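Once the touch-plane images are labelled with in-plane movement vectors, the training step can be illustrated with a toy linear model standing in for the neural network, a least-squares fit of flattened image features to the vectors:

```python
import numpy as np

def train_linear_policy(images, labels):
    """Toy stand-in for network training: fit flattened image features to
    movement vectors by least squares. Returns a weight matrix W such
    that image.ravel() @ W approximates the movement vector."""
    X = np.stack([np.asarray(img).ravel() for img in images])
    Y = np.stack([np.asarray(lab) for lab in labels])
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return W
```

A real implementation would use a convolutional network, but the supervised objective, images in, in-plane movement vectors out, is the same.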
Robot Teaching System
A robot teaching system includes: a photographing unit that photographs an image including a welding target and a marker installed on an industrial robot; a camera coordinate system setting unit that sets a camera coordinate system on the basis of the marker included in the image; an operation path setting unit that sets an operation path of the industrial robot, in the camera coordinate system, on the basis of a welding position of the welding target included in the image; and a program generation unit that generates a working program while converting the set operation path from the camera coordinate system into a robot coordinate system set in a robot control apparatus, on the basis of the position of the marker installed on the industrial robot. The robot teaching system thus generates a working program allowing appropriate welding at the welding position.
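The coordinate conversion hinges on the marker being known in both frames: given the marker pose in the camera frame and the same marker's pose in the robot frame, a welding point can be mapped between them. A minimal sketch, with rotation matrices and translations as assumed inputs:

```python
import numpy as np

def camera_to_robot(p_cam, R_cm, t_cm, R_rm, t_rm):
    """Map a point from camera to robot coordinates via the marker:
    (R_cm, t_cm) is the marker pose in the camera frame and
    (R_rm, t_rm) the same marker's pose in the robot frame."""
    p_marker = R_cm.T @ (p_cam - t_cm)   # camera frame -> marker frame
    return R_rm @ p_marker + t_rm        # marker frame -> robot frame
```

Applying this to every waypoint of the operation path yields the path in robot coordinates that the working program encodes.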