G05B2219/40543

Virtual teach and repeat mobile manipulation system

A method for performing a task by a robotic device includes mapping a group of task image pixel descriptors, associated with a first group of pixels in a task image of a task environment, to a group of teaching image pixel descriptors associated with a second group of pixels in a teaching image, after positioning the robotic device within the task environment. The method also includes determining a relative transform between the task image and the teaching image based on the descriptor mapping. The relative transform indicates a change in one or more points of 3D space between the task image and the teaching image. The method further includes updating one or more parameters of a set of parameterized behaviors associated with the teaching image based on the relative transform, and performing the task according to the updated parameterized behaviors.
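The core step described above, estimating a rigid transform between matched 3D points from the teaching image and the task image and then re-parameterizing a taught behavior, can be sketched as follows. This is a minimal illustration using the standard Kabsch (SVD) alignment, not the patented implementation; all function names are hypothetical.

```python
import numpy as np

def relative_transform(teach_pts, task_pts):
    """Estimate the rigid transform (R, t) mapping teaching-image 3D
    points onto task-image 3D points via Kabsch / SVD alignment.
    Inputs are (N, 3) arrays of corresponding points."""
    teach_c = teach_pts.mean(axis=0)
    task_c = task_pts.mean(axis=0)
    H = (teach_pts - teach_c).T @ (task_pts - task_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # repair an improper (reflected) solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = task_c - R @ teach_c
    return R, t

def update_behavior(taught_point, R, t):
    """Re-parameterize a taught 3D behavior parameter (e.g. a grasp
    position) by the relative transform."""
    return R @ taught_point + t
```

In practice the matched points would come from the pixel-descriptor correspondences the abstract describes, lifted to 3D with depth data.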

Object pickup strategies for a robotic device

Example embodiments may relate to methods and systems for selecting a grasp point on an object. In particular, a robotic manipulator may identify characteristics of a physical object within a physical environment. Based on the identified characteristics, the robotic manipulator may determine potential grasp points on the physical object corresponding to points at which a gripper attached to the robotic manipulator is operable to grip the physical object. Subsequently, the robotic manipulator may determine a motion path for the gripper to follow in order to move the physical object to a drop-off location for the physical object and then select a grasp point, from the potential grasp points, based on the determined motion path. After selecting the grasp point, the robotic manipulator may grip the physical object at the selected grasp point with the gripper and move the physical object through the determined motion path to the drop-off location.
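The selection logic described above, ranking candidate grasp points by the motion path each one implies, reduces to a cost-minimization over candidates. A toy sketch, with a straight-line distance standing in for a real motion planner's path cost (all names are hypothetical, not from the patent):

```python
import numpy as np

def select_grasp_point(candidates, drop_off, path_cost):
    """Pick the candidate grasp point whose planned motion path to the
    drop-off location is cheapest."""
    costs = [path_cost(c, drop_off) for c in candidates]
    return candidates[int(np.argmin(costs))]

def toy_cost(grasp, drop_off):
    """Stand-in cost: straight-line distance from grasp to drop-off.
    A real system would query a motion planner here."""
    return float(np.linalg.norm(np.asarray(grasp) - np.asarray(drop_off)))
```

The key design point from the abstract is the ordering: the motion path is evaluated per candidate *before* the grasp point is committed, rather than grasping first and planning afterwards.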

Detachable support member for robot gripper
09981379 · 2018-05-29

An example robotic gripping device includes two opposable gripping fingers and at least one actuator configured to move the two opposable gripping fingers towards and away from each other within a plane of motion. The robotic gripping device further includes a gripper base coupled to the two opposable gripping fingers, where the gripper base comprises an attachment interface. The robotic gripping device further includes a detachable elongated support member that mates with the attachment interface of the gripper base, such that when the elongated support member is attached to the attachment interface of the gripper base, the elongated support member extends parallel to the plane of motion of the two opposable gripping fingers.

FEEDER MANAGEMENT DEVICE

Provided is a feeder management device that appropriately controls editing of management information related to feeders so as to prevent electronic components of an incorrect type from being supplied. The feeder management device includes: a data management section holding management data in which the component type of the electronic component loaded on each feeder is linked to an identification code that identifies each of the multiple feeders; and an editing control section configured to determine, when editing of the management information is attempted in order to change the component type linked to a feeder's identification code, whether the editing is allowable based on the loading state of the electronic component on that feeder.
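The editing-control rule described above amounts to a guard: a component-type edit is refused while the feeder is still loaded. A minimal sketch, with hypothetical class and field names:

```python
class FeederManager:
    """Sketch of the described editing control: component-type edits are
    allowed only when the target feeder is not currently loaded."""

    def __init__(self):
        self.component_type = {}   # feeder ID -> component type
        self.loaded = {}           # feeder ID -> loading state (bool)

    def edit_component_type(self, feeder_id, new_type):
        if self.loaded.get(feeder_id, False):
            raise PermissionError(
                f"feeder {feeder_id} is loaded; unload it before editing")
        self.component_type[feeder_id] = new_type
```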

System and method for piece picking or put-away with a mobile manipulation robot

A method and system for piece-picking or piece put-away within a logistics facility. The system includes a central server and at least one mobile manipulation robot. The central server is configured to communicate with the robots to send and receive piece-picking data, which includes a unique identification for each piece to be picked, a location within the logistics facility of the pieces to be picked, and a route for the robot to take within the logistics facility. The robots can then autonomously navigate and position themselves within the logistics facility by recognizing landmarks with at least one of a plurality of sensors. The sensors also provide signals related to detection, identification, and location of a piece to be picked or put-away, and processors on the robots analyze the sensor information to generate movements of a unique articulated arm and end effector on the robot to pick or put-away the piece.

Systems and methods for SKU induction, decanting and automated-eligibility estimation

An object induction system is disclosed for assigning handling parameters to an object. The system includes an analysis system, an association system, and an assignment system. The analysis system includes at least one characteristic perception system for providing perception data regarding an object to be processed. The association system includes an object information database and assigns association data to the object responsive to commonality of any of the characteristic perception data with any of the characteristic recorded data. The assignment system assigns programmable motion device handling parameters to the indicia perception data based on the association data, and includes a workflow management system as well as a separate operational controller.

Determining a Virtual Representation of an Environment By Projecting Texture Patterns
20180093377 · 2018-04-05

Example methods and systems for determining 3D scene geometry by projecting patterns of light onto a scene are provided. In an example method, a first projector may project a first random texture pattern having a first wavelength and a second projector may project a second random texture pattern having a second wavelength. A computing device may receive sensor data that is indicative of an environment as perceived from a first viewpoint of a first optical sensor and a second viewpoint of a second optical sensor. Based on the received sensor data, the computing device may determine corresponding features between sensor data associated with the first viewpoint and sensor data associated with the second viewpoint. Based on the determined corresponding features, the computing device may determine an output including a virtual representation of the environment that includes depth measurements indicative of distances to at least one object.
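Once corresponding features have been found between the two viewpoints, the depth measurements the abstract mentions follow from standard stereo triangulation. A minimal sketch assuming a rectified stereo pair, so that depth is Z = f·B / disparity (the function name and parameters are illustrative, not from the patent):

```python
import numpy as np

def depth_from_matches(x_left, x_right, focal_px, baseline_m):
    """Triangulate depth (meters) for corresponding feature columns
    x_left, x_right (pixels) in a rectified stereo pair:
        Z = focal_px * baseline_m / (x_left - x_right)."""
    disparity = np.asarray(x_left, dtype=float) - np.asarray(x_right, dtype=float)
    return focal_px * baseline_m / disparity
```

The projected random texture patterns serve to make this matching step reliable on otherwise featureless surfaces; using two wavelengths lets the two patterns coexist without interfering.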

Information processing apparatus, control method thereof, and computer readable storage medium that calculate an accuracy of correspondence between a model feature and a measurement data feature and collate, based on the accuracy, a geometric model and an object in an image

An information processing apparatus predicts, based on a geometric model of an object to be measured, an image degradation of the object in an image in which the object is captured by a capturing device. Furthermore, the information processing apparatus searches a two-dimensional image in which the object is captured by the capturing device for a measurement data feature that corresponds to a model feature of the geometric model, and evaluates, using the two-dimensional image, an image degradation with respect to the found measurement data feature. The information processing apparatus calculates, based on the predicted image degradation and the evaluated image degradation, the accuracy of the correspondence between the model feature and the measurement data feature, and collates, based on this accuracy, the geometric model and the object in the two-dimensional image.
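The accuracy computation described above weighs each model-to-measurement correspondence by how well the degradation measured around the found image feature agrees with the degradation predicted from the geometric model. A hypothetical scoring sketch (the exponential weighting is my assumption, not the patent's formula):

```python
import numpy as np

def correspondence_accuracy(predicted_degradation, measured_degradation):
    """Score each correspondence in [0, 1]: 1 when the measured image
    degradation (e.g. a blur estimate) matches the model-predicted
    degradation exactly, decaying as they disagree."""
    diff = np.abs(np.asarray(predicted_degradation, dtype=float)
                  - np.asarray(measured_degradation, dtype=float))
    return np.exp(-diff)
```

Correspondences scored this way could then weight the final collation of the geometric model against the object in the image, down-weighting matches whose appearance is inconsistent with the predicted viewing conditions.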

Photogrammetric identification of locations for performing work

Systems and methods are provided for identifying a location to perform work (e.g., drilling). One exemplary method includes acquiring images of a hole in a first object from multiple lighting angles, processing the images to identify shadows cast by a wall of the hole at each of the lighting angles, and analyzing the shadows to determine an orientation of a central axis of the hole in a coordinate system of the first object. The method also includes, at a second object, selecting a location to drill that will be aligned with the central axis of the hole of the first object, and drilling the second object at the location.
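The shadow-analysis step can be illustrated with a deliberately simplified model: if the hole axis leans in some direction, the shadow cast by the hole wall lengthens and shortens sinusoidally as the lighting azimuth sweeps around the hole, so a least-squares sinusoid fit recovers the lean direction. This toy geometry and every name below are assumptions for illustration, not the patented method:

```python
import numpy as np

def fit_axis_tilt(azimuths, shadow_lengths):
    """Toy model: fit s(phi) ~= c0 + a*cos(phi) + b*sin(phi) to shadow
    lengths measured at lighting azimuths phi (radians).  Returns the
    azimuth toward which the axis leans and the modulation amplitude."""
    phi = np.asarray(azimuths, dtype=float)
    A = np.column_stack([np.ones_like(phi), np.cos(phi), np.sin(phi)])
    c0, a, b = np.linalg.lstsq(A, np.asarray(shadow_lengths, dtype=float),
                               rcond=None)[0]
    return np.arctan2(b, a), float(np.hypot(a, b))
```

A real implementation would relate the fitted modulation back to a 3D axis direction through the known camera and light geometry; the sketch only shows why imaging under multiple lighting angles makes the axis orientation observable.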
