Patent classification: B25J9/1612
OBJECT HEIGHT DETECTION FOR PALLETIZING AND DEPALLETIZING OPERATIONS
Various embodiments described herein relate to techniques for object height detection in palletizing and/or depalletizing operations. An automated industrial system comprises at least a column portion, a robot arm portion, and an end effector configured to grasp an object. An image-capturing device mounted on the automated industrial system is configured to rotate, based on movement of the robot arm portion, to scan the object grasped by the end effector and to generate image-capturing data associated with the object. A processing device is configured to determine height data for the object based on the image-capturing data, and to determine location data for the object with respect to a conveyor system based on the height data.
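The height-to-location step described above can be sketched as follows. This is a minimal illustration, not the patented method: all names, geometry, and constants (scan points, grip plane, conveyor height, clearance) are assumptions introduced here.

```python
# Illustrative sketch: derive object height from scan data, then a
# placement height relative to a conveyor. All values are assumptions.

def object_height(scan_points, grip_plane_z):
    """Estimate object height from scanned surface points (x, y, z),
    assuming the end effector grips the object's top face."""
    lowest_z = min(p[2] for p in scan_points)
    return grip_plane_z - lowest_z

def placement_z(height, conveyor_surface_z, clearance=0.005):
    """Target release height so the object's bottom sits just above
    the conveyor surface."""
    return conveyor_surface_z + height + clearance

scan = [(0.10, 0.20, 0.35), (0.12, 0.18, 0.30), (0.11, 0.22, 0.32)]
h = object_height(scan, grip_plane_z=0.50)   # ~0.20 m
z = placement_z(h, conveyor_surface_z=0.10)  # ~0.305 m
```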
Robot controller and system
A robot controller controls, via a hand control device, a robot hand that grips an article with two or more gripping portions. The robot controller includes a size-information acquisition unit, which acquires size information about the article based on an image obtained by a visual sensor that detects the article, and a gripping adjustment unit, which changes, in response to the size information, either the gripping distance (the space between the gripping portions in the gripping state) or the gripping force of the gripping portions in the gripping state.
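The size-to-grip adjustment could look something like the sketch below. The linear force model and every constant are assumptions for illustration only; the abstract does not specify how distance or force is derived from size.

```python
# Hypothetical mapping from measured article size to gripping distance
# and gripping force. Model and constants are illustrative assumptions.

def grip_parameters(article_width_mm, squeeze=0.95,
                    base_force_n=5.0, force_per_mm=0.1):
    """Return (gripping distance in mm, gripping force in N) for an
    article of the given measured width."""
    distance_mm = article_width_mm * squeeze   # close slightly past contact
    force_n = base_force_n + force_per_mm * article_width_mm
    return distance_mm, force_n

d, f = grip_parameters(40.0)   # a 40 mm article
```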
Gripper system
A device for gripping an object includes a pair of grippers, each having a gripping surface formed with a plurality of fine channels; a driver for driving the pair of grippers; a pump connected to at least one of the fine channels to supply a fluid to it; and a controller that controls the driver to enable the pair of grippers to grip the object, or controls the pump to adjust the amount of fluid supplied to the gripping surface.
CONTROL DEVICE, CONTROL METHOD, AND PROGRAM
The present technology relates to a control device, a control method, and a program that enable predetermined motion while a gripped object is kept stable. A control device according to one aspect of the present technology detects the gripped state of an object gripped by a hand unit and, in accordance with the result of that detection, limits motion of a motion unit while the object is gripped. The present technology can be applied to a device that controls a robot including a hand unit capable of gripping an object.
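One simple way to "limit motion in accordance with the detected gripped state" is to scale the allowed speed by a grip-quality estimate. This is a sketch under assumed conventions (grip quality in [0, 1], a linear scaling rule), not the method claimed above.

```python
# Illustrative motion limiting: the less stable the detected grip,
# the lower the motion unit's allowed speed. Scaling rule is assumed.

def limited_speed(max_speed, grip_quality, floor=0.1):
    """Scale allowed speed by grip quality, clamped to [floor, 1.0]."""
    return max_speed * max(floor, min(1.0, grip_quality))

fast = limited_speed(1.0, 0.9)   # stable grip: near full speed
slow = limited_speed(1.0, 0.2)   # marginal grip: strongly limited
```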
ENERGY CONSERVATION OF A MOTOR-DRIVEN DIGIT
Routines and methods disclosed herein can increase the power efficiency of a prosthetic hand without drastically reducing the speed at which it operates. A prosthesis can implement an acceleration profile, which can reduce the energy consumption of a motor, or the amount of electrical and/or mechanical noise the motor produces, as the motor transitions from an idle state to a non-idle state. A prosthesis can likewise implement a deceleration profile, which can reduce the energy consumption of the motor, or the amount of electrical and/or mechanical noise it produces, as the motor transitions from a non-idle state to an idle state.
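The simplest form of such profiles is a stepped ramp: commanding the speed in small increments rather than a single step change limits peak motor current. The linear shape and step count below are assumptions for illustration; the abstract does not specify the profile's form.

```python
# Illustrative linear acceleration and deceleration profiles: lists of
# speed commands issued one per control tick. Shape is an assumption.

def accel_profile(target_speed, steps):
    """Speed commands ramping up from idle to target_speed."""
    return [target_speed * (i + 1) / steps for i in range(steps)]

def decel_profile(current_speed, steps):
    """Speed commands ramping down from current_speed to idle."""
    return [current_speed * (steps - 1 - i) / steps for i in range(steps)]

up = accel_profile(100.0, 4)
down = decel_profile(100.0, 4)
```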
METHOD FOR OPERATING A PICKING ROBOT AND RELATED DEVICES
A method for operating a picking robot comprising an end effector assembly and a vision assembly, and a related controller device, are disclosed. The method comprises: picking a subject with the end effector assembly from a bin comprising a plurality of subjects; moving the subject to a delivery station; and releasing the subject on the delivery station. The method further comprises locking a joint connection of the end effector assembly prior to and/or during the act of moving the subject to the delivery station.
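The ordering of operations, with the joint lock applied before transport, can be made concrete with a stub. The `Robot` class and its method names are hypothetical; only the sequence reflects the method described above.

```python
# Illustrative pick-move-release sequence with the joint locked before
# the move. The Robot stub simply records the order of operations.

class Robot:
    def __init__(self):
        self.log = []
    def pick(self, pose): self.log.append(("pick", pose))
    def lock_joint(self): self.log.append(("lock",))
    def move_to(self, pose): self.log.append(("move", pose))
    def unlock_joint(self): self.log.append(("unlock",))
    def release(self): self.log.append(("release",))

def pick_and_deliver(robot, bin_pose, delivery_pose):
    robot.pick(bin_pose)
    robot.lock_joint()            # lock prior to moving the subject
    robot.move_to(delivery_pose)
    robot.unlock_joint()
    robot.release()

r = Robot()
pick_and_deliver(r, "bin", "delivery_station")
```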
RECONFIGURABLE, FIXTURELESS MANUFACTURING SYSTEM AND METHOD
Systems and methods for reconfigurable, fixtureless manufacturing are provided. Material handling robots grasp and move parts within an assembly area to adjoin one another in a predetermined orientation. While the parts remain grasped and suspended within the assembly area, out of contact with any fixtures, work surfaces, jigs, and locators, a machine vision system performs an alignment scan to determine the locations of datums on the parts, which are transmitted to a controller for comparison against stored virtual datums for a subassembly comprising the joined parts. The locations of the datums are transmitted to a joining robot, which joins the parts to form the subassembly. The machine vision system performs an inspection scan of the datums on the parts after joining.
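The comparison of scanned datum locations against stored virtual datums amounts to a tolerance check. This sketch assumes datums are 3-D points and a single distance tolerance; both are illustrative assumptions, not details from the abstract.

```python
# Illustrative alignment check: every measured datum must lie within a
# tolerance of its stored virtual datum. Tolerance value is assumed.
import math

def datums_aligned(measured, virtual, tol_mm=0.5):
    """True when each measured datum (x, y, z) is within tol_mm of the
    corresponding virtual datum."""
    return all(math.dist(m, v) <= tol_mm for m, v in zip(measured, virtual))

measured = [(0.0, 0.0, 0.0), (100.0, 0.1, 0.0)]
virtual  = [(0.0, 0.0, 0.0), (100.0, 0.0, 0.0)]
ok = datums_aligned(measured, virtual)
```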
Arithmetic device, control program, machine learner, grasping apparatus, and control method
An arithmetic device performs calculations for controlling the motion of a grasping apparatus that performs work involving sliding a grasped object. The device includes: an acquisition unit configured to acquire a state variable indicating the state of the grasping apparatus during the work; a storage unit storing a learned neural network, trained on a plurality of training data sets each composed of a state variable acquired in advance combined with correct-answer data corresponding to that state variable; an arithmetic unit configured to calculate a target value for each of the actuators related to the work by inputting the state variable to the learned neural network read from the storage unit; and an output unit configured to output those target values to the grasping apparatus.
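The arithmetic unit's role, feeding the state variable through the learned network to get one target value per actuator, reduces to a forward pass. The tiny network below stands in for the learned model; its architecture and weights are illustrative assumptions.

```python
# Illustrative forward pass: state variable in, one target value per
# actuator out. Weights here are made up, not learned.

def matvec(W, x):
    """Multiply matrix W (list of rows) by vector x."""
    return [sum(w * v for w, v in zip(row, x)) for row in W]

def relu(xs):
    return [max(0.0, x) for x in xs]

def actuator_targets(state, W1, b1, W2, b2):
    """state -> hidden layer (ReLU) -> target value for each actuator."""
    hidden = relu([s + b for s, b in zip(matvec(W1, state), b1)])
    return [s + b for s, b in zip(matvec(W2, hidden), b2)]

# 2-dim state, 2 hidden units, 2 actuators:
targets = actuator_targets(
    state=[0.5, -0.25],
    W1=[[1.0, 0.0], [0.0, 1.0]], b1=[0.0, 0.0],
    W2=[[2.0, 0.0], [0.0, 2.0]], b2=[0.1, 0.1],
)
```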
GRIP DEVICE AND ROBOT DEVICE COMPRISING SAME
A grip device is provided. A grip device according to an embodiment of the present disclosure includes: a first finger; a second finger facing the first finger; a first link part that includes a first guide slot and supports the first finger; a second link part that supports the second finger, includes a second guide slot, and intersects the first link part; a hinge configured to move inside the first and second guide slots and connecting the first and second link parts at their intersection point; a first actuator configured to adjust the distance between the first and second fingers by moving the first link part and/or the second link part; and a second actuator configured to move the hinge inside the first and second guide slots.
Robotic control using value distributions
Techniques are described herein for robotic control using value distributions. In various implementations, as part of performing a robotic task, state data associated with the robot in an environment may be generated based at least in part on vision data captured by a vision component of the robot. A plurality of candidate actions may be sampled, e.g., from continuous action space. A trained critic neural network model that represents a learned value function may be used to process a plurality of state-action pairs to generate a corresponding plurality of value distributions. Each state-action pair may include the state data and one of the plurality of sampled candidate actions. The state-action pair corresponding to the value distribution that satisfies one or more criteria may be selected from the plurality of state-action pairs. The robot may then be controlled to implement the sampled candidate action of the selected state-action pair.
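The selection step, scoring each state-action pair's value distribution against a criterion and keeping the best, can be sketched as below. The mean-minus-spread criterion and the stub critic are assumptions for illustration; the abstract only says the selected distribution must satisfy "one or more criteria," and a real system would query the trained critic network.

```python
# Illustrative action selection over value distributions. The criterion
# (mean minus a fraction of the spread) and the stub critic are assumed.
import statistics

def select_action(state, candidate_actions, critic, risk_weight=0.5):
    """Score each (state, action) pair's value distribution and return
    the action with the best risk-adjusted score."""
    def score(action):
        dist = critic(state, action)   # list of sampled return values
        return statistics.mean(dist) - risk_weight * statistics.pstdev(dist)
    return max(candidate_actions, key=score)

# Stub critic: action "a" has a higher-mean but wider distribution
# than action "b".
def stub_critic(state, action):
    return {"a": [0.0, 2.0], "b": [0.8, 1.0]}[action]

best = select_action("s", ["a", "b"], stub_critic)
```

With the risk-averse criterion, the narrower distribution of "b" wins despite its slightly lower mean; setting `risk_weight=0` would recover plain expected-value selection.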