Patent classifications
G05B2219/39543
3D PRINTED OBJECT CLEANING
In one example in accordance with the present disclosure, a system is described. The system includes a reader to extract cleaning instructions associated with a three-dimensional (3D) printed object. The cleaning instructions include a termination condition to indicate when object cleaning is complete. The system also includes a controller to instruct at least one cleaning device to clean the 3D printed object based on the cleaning instructions, and a measurement system to determine when the termination condition is met.
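The control flow the abstract describes can be sketched as a loop that runs the cleaning device until the measurement system reports the termination condition. This is a minimal illustration, not the patent's implementation; all class and key names here are assumptions.

```python
# Hypothetical sketch: run cleaning cycles until the measurement system
# reports that the termination condition from the extracted instructions
# is met. Names are illustrative, not from the disclosure.

def clean_object(instructions, cleaning_device, measurement_system):
    """Run cleaning cycles until the termination condition is satisfied."""
    while not measurement_system.condition_met(instructions["termination"]):
        cleaning_device.run_cycle(instructions["parameters"])

class MockDevice:
    def __init__(self):
        self.cycles = 0
    def run_cycle(self, params):
        self.cycles += 1

class MockMeasurement:
    # Stands in for the measurement system; here the "termination
    # condition" is approximated by a required number of cycles.
    def __init__(self, device, target_cycles):
        self.device, self.target_cycles = device, target_cycles
    def condition_met(self, termination):
        return self.device.cycles >= self.target_cycles

device = MockDevice()
meas = MockMeasurement(device, target_cycles=3)
clean_object({"termination": "powder_cleared", "parameters": {}}, device, meas)
print(device.cycles)  # 3
```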
Picking facility
A picking facility is realized that can shorten the time required to transfer an article from a first support body to a second support body. Among a plurality of articles 50 supported by the first support body 51, the article 50 located at the highest position and any article 50 whose upper face T1 lies within a set distance D below the upper face T1 of the highest article 50 are set as transfer-target articles 50A. The control device performs a selection control to preferentially select, from among the transfer-target articles 50A, a transfer-target article 50A in the normal orientation SC, and a transfer control to control the transfer device so as to transfer the transfer-target article 50A selected through the selection control from the first support body 51 to the second support body.
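The selection control above amounts to a height-threshold filter followed by an orientation preference. A minimal sketch, assuming a simple list-of-dicts representation of the articles (the field names are illustrative, not from the patent):

```python
# Illustrative sketch of the selection control: keep only articles whose
# upper face lies within distance D of the highest upper face, then
# prefer a candidate that is already in the normal orientation.

def select_transfer_target(articles, D):
    """articles: list of dicts with 'top' (height of upper face) and
    'normal_orientation' (bool). Returns the preferred transfer target."""
    highest = max(a["top"] for a in articles)
    candidates = [a for a in articles if highest - a["top"] <= D]
    # Preferentially select a candidate in the normal orientation.
    for a in candidates:
        if a["normal_orientation"]:
            return a
    return candidates[0]

articles = [
    {"id": 1, "top": 0.90, "normal_orientation": False},
    {"id": 2, "top": 0.88, "normal_orientation": True},
    {"id": 3, "top": 0.40, "normal_orientation": True},  # below the D range
]
print(select_transfer_target(articles, D=0.05)["id"])  # 2
```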
MACHINE LEARNING METHOD AND ROBOT SYSTEM
A machine learning method learns an action of a robot that includes a hand for picking a workpiece out of a container containing a plurality of workpieces stacked in bulk, and for installing the workpiece such that the workpiece is in a predetermined installation state. The method includes learning a reverse-order action in which the hand removes a workpiece that is in the predetermined installation state after completion of installation, and learning an installation order of the workpieces based on the learning result of the reverse-order action.
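At its core, deriving the installation order from the learned removal action means reversing the demonstrated disassembly sequence. A minimal sketch of that idea (function and workpiece names are assumptions for illustration):

```python
# Hedged sketch: the removal (disassembly) sequence learned by
# demonstration is reversed to obtain a feasible installation order.

def installation_order_from_removal(removal_sequence):
    """Given the learned order in which installed workpieces were
    removed, return the corresponding installation order."""
    return list(reversed(removal_sequence))

removal = ["workpiece_C", "workpiece_B", "workpiece_A"]
print(installation_order_from_removal(removal))
# ['workpiece_A', 'workpiece_B', 'workpiece_C']
```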
Parametric and Modal Work-holding Method for Automated Inspection
A system for inspecting each workpiece of a plurality of non-identical workpieces includes a controller in control communication with the instruments of the system, and a ruleset corresponding to one or more such non-identical workpieces, the system reconfiguring the inspection instruments to customize part tending operations for each such non-identical workpiece. A method for inspecting each workpiece of a plurality of non-identical workpieces includes providing a controller in control communication with the instruments of the system, and a ruleset corresponding to each such non-identical workpiece, the controller causing reconfiguration of the inspection instruments to customize part tending operations for each such non-identical workpiece.
Automated Work-holding for Precise Fastening of Light Parts during Automated Transfer
Illustrative embodiments improve holding of a workpiece in an industrial process by placing the workpiece on or in a workpiece interface of a workholder and, prior to securing the workpiece on or in the workholder, vibrating the workpiece interface to settle the workpiece onto or into the workpiece interface. The act of vibrating the workpiece interface is separate and distinct from the act of securing the workpiece to the workpiece interface, and the vibration from the act of vibrating the workpiece interface is separate and distinct from vibration that may occur incidental to the act of securing the workpiece to the workpiece interface. Some embodiments of a workholder include a vibration actuator distinct from a workpiece interface actuator that opens and closes the workpiece interface. Some embodiments of the workpiece interface include a set of one or more tapered guides to guide a workpiece onto the workpiece interface.
ROBOT SYSTEM, CONTROL METHOD, IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, METHOD OF MANUFACTURING PRODUCTS, AND RECORDING MEDIUM
A robot system includes a robot, an image capture apparatus, an image processing portion, and a control portion. The image processing portion is configured to specify, in an image of a plurality of objects captured by the image capture apparatus, at least one area in which a predetermined object having a predetermined posture exists, and to obtain information on the position and/or posture of the predetermined object in the area. The control portion is configured to control the robot, based on the information on the position and/or posture of the predetermined object, for the robot to hold the predetermined object.
Robot hand controller, robot system, and robot hand control method
A robot hand controller includes an air supply unit configured to supply air into fingers of a robot hand and to discharge air from the fingers, and a controller configured to control the air supply unit. The air supply unit includes two or more air passages respectively connected to different fingers, the air passages capable of supplying air into the fingers and discharging air from the fingers independently of each other, and the controller controls supply and discharge of the air through each of the two or more air passages according to the shape of the workpiece and any object in the vicinity of the transport destination of the workpiece.
ADAPTIVE GRIPPER DEVICE
A gripper device and method are provided. The method includes capturing, using an electronic device, information about an object that is indicative of a holding position; determining, using the information, by a hardware processor, an optimal holding orientation and an optimal movement of at least one of (i) a plurality of fingers, or (ii) a plurality of suction cups of a gripper device; identifying the at least one of (i) the plurality of fingers, and (ii) the plurality of suction cups as one or more grasping components based on the information, the optimal holding orientation, and the optimal movement; and enabling, using an actuator, the one or more identified grasping components to grasp the object based on the information, the optimal holding orientation, and the optimal movement.
MACHINE LEARNING CONTROL OF OBJECT HANDOVERS
A robotic control system directs a robot to take an object from a human grasp by obtaining an image of a human hand holding an object, estimating the pose of the human hand and the object, and determining a grasp pose for the robot that will not interfere with the human hand. In at least one example, a depth camera is used to obtain a point cloud of the human hand holding the object. The point cloud is provided to a deep network that is trained to generate a grasp pose for a robotic gripper that can take the object from the human's hand without pinching or touching the human's fingers.
METHOD FOR TEACHING A ROBOTIC ARM TO PICK OR PLACE AN OBJECT
A method for teaching a robotic arm to pick or place an object includes the following steps. Firstly, the robot arm is pushed until a target appears within the field of vision. Then, the appearance position of the target is set as a visual point. Then, a first image is captured. Then, the robot arm is pushed from the visual point to a target position. Then, the target position is set as a pick and place point. Then, the automatic movement control of the robot arm is activated. Then, the robot arm automatically picks and places the object and returns from the pick and place point to the visual point. Then, a second image is captured. Then, a differential image is formed by subtracting the second image from the first image, the target image is set according to the differential image, and image characteristics of the target are automatically learned.
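The final teaching step above isolates the object's pixels by differencing the two captures: where the first image (object present) and the second image (object removed) agree, the difference is zero, leaving only the target. A minimal sketch using an absolute pixel-wise difference over plain nested lists as stand-in grayscale images (the data and function name are illustrative, not from the patent):

```python
# Hedged sketch of the differential-image step: subtracting the second
# image (object picked away) from the first image (object present)
# leaves nonzero pixels only where the target object was.

def differential_image(first, second):
    """Pixel-wise absolute difference of two equally sized images."""
    return [
        [abs(a - b) for a, b in zip(row1, row2)]
        for row1, row2 in zip(first, second)
    ]

first = [  # scene with the object (bright 9s) at the pick and place point
    [0, 9, 9],
    [0, 9, 9],
    [0, 0, 0],
]
second = [  # same scene after the object has been picked and placed
    [0, 0, 0],
    [0, 0, 0],
    [0, 0, 0],
]
diff = differential_image(first, second)
print(diff)  # [[0, 9, 9], [0, 9, 9], [0, 0, 0]]
```

The nonzero region of `diff` would then serve as the target image from which characteristics are learned.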