Patent classifications
G05B2219/39106
Simulation device
A simulation device according to the present disclosure includes: a model arrangement unit which arranges three-dimensional models of a feeder, a robot and a receiving device in virtual space; a workpiece supply unit which arranges a three-dimensional model of a workpiece on a conveyor surface of the feeder; and a robot operation control unit which causes the robot to operate so as to move the workpiece on the conveyor surface to a position over a receiving surface of the receiving device. The workpiece supply unit includes: a reference position calculation part which calculates a position and orientation of a workpiece to be newly arranged, according to set conditions; a workpiece area setting part which sets a workpiece area occupied by the workpiece; an interference area setting part which sets an interference area adjacent to the workpiece area; a display control part which displays a screen indicating the workpiece area and the interference area; and a supply position adjustment part which adjusts the position of the workpiece to be newly arranged so that its workpiece area does not overlap the workpiece area or the interference area of any previously arranged workpiece.
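The placement rule in this abstract amounts to a collision test against each earlier workpiece's area plus a surrounding margin. The following is a minimal sketch of that idea, assuming axis-aligned rectangular areas and an interference area formed by inflating the workpiece area by a fixed margin; the `Area`/`can_place` names and the rectangle model are assumptions, not the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class Area:
    """Axis-aligned rectangle on the conveyor surface (units illustrative)."""
    x: float
    y: float
    width: float
    height: float

    def inflate(self, margin: float) -> "Area":
        # Grow the rectangle on all sides to form an interference area.
        return Area(self.x - margin, self.y - margin,
                    self.width + 2 * margin, self.height + 2 * margin)

    def overlaps(self, other: "Area") -> bool:
        # Standard axis-aligned rectangle intersection test.
        return (self.x < other.x + other.width and other.x < self.x + self.width and
                self.y < other.y + other.height and other.y < self.y + self.height)

def can_place(new_piece: Area, placed: list[Area], margin: float) -> bool:
    """Accept a newly arranged workpiece only if its area avoids both the
    workpiece area and the surrounding interference area of every
    previously arranged workpiece."""
    return all(not new_piece.overlaps(prev.inflate(margin)) for prev in placed)
```

A supply position adjustment part could then nudge the candidate position and retry `can_place` until it succeeds.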
SYSTEMS AND METHODS FOR DYNAMIC PROCESSING OF OBJECTS USING BOX TRAY ASSEMBLIES
A box handling system is disclosed for use in an object processing system. The box handling system includes a box tray including a recessed area for receiving a box, and the recessed area includes a plurality of floor and edge portions for receiving the box that contains objects to be processed.
SYSTEMS AND METHODS FOR SKU INDUCTION, DECANTING AND AUTOMATED-ELIGIBILITY ESTIMATION
- John Richard Amend, Jr.
- Timothy Barber
- Benjamin Cohen
- Christopher Geyer
- Evan Glasgow
- James Guillochon
- Kirsten Wang
- Victoria Hinchey
- Jennifer Eileen King
- Thomas Koletschka
- Guoming Alex LONG
- Kyle Maroney
- Matthew T. Mason
- William Chu-Hyon McMahan
- Samuel Naseef
- Kevin O'Brien
- Dimitry Pechyoni
- Joseph Romano
- Max Saccoccio
- Jessica Scolnic
- Prasanna Velagapudi
An object induction system is disclosed for assigning handling parameters to an object. The system includes an analysis system, an association system, and an assignment system. The analysis system includes at least one characteristic perception system for providing perception data regarding an object to be processed. The association system includes an object information database and assigns association data to the object responsive to commonality of any of the characteristic perception data with any of the characteristic recorded data. The assignment system assigns programmable motion device handling parameters to the object based on the association data, and includes a workflow management system as well as a separate operational controller.
Machine for working glass slabs with a computerized numeric control assembly and related production process
A machine for working glass slabs includes a supporting structure, a slab grinding section having grinding heads, a conveyor assembly adapted to move the glass slab, and a slab drilling section having a conveyor adjacent to the conveyor assembly. The slab grinding section has suckers for holding the glass slab against a working plane that is spaced and/or offset with respect to the advancement plane, and a computerized numeric control assembly performs machining operations on the glass slab.
Following robot and work robot system
A robot includes an arm, one or more visual sensors provided on the arm, a storage unit that stores a first feature value regarding at least a position and an orientation of a following target, the first feature value being stored as target data for causing the visual sensors provided on the arm to follow the following target, a feature value detection unit that detects a second feature value regarding at least a current position and a current orientation of the following target, the second feature value being detected using an image obtained by the visual sensors, a movement amount calculation unit that calculates a movement command for the arm based on a difference between the second feature value and the first feature value, and a movement command unit that moves the arm based on the movement command.
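The movement amount calculation described above resembles proportional visual servoing: the command is derived from the difference between the stored target feature value and the currently detected one. A minimal sketch, assuming a four-component feature value (position x, y, z and a yaw angle) and a scalar gain; both are illustrative assumptions, not details disclosed in the abstract.

```python
import numpy as np

def movement_command(first_feature, second_feature, gain=0.5):
    """Proportional follow command for the arm.

    first_feature  -- stored target data (position and orientation).
    second_feature -- feature value detected from the current sensor image.
    Returns a per-axis movement amount that drives the detected value
    toward the stored target value.
    """
    first = np.asarray(first_feature, dtype=float)
    second = np.asarray(second_feature, dtype=float)
    return gain * (first - second)
```

Applied repeatedly, each command shrinks the remaining difference, so the sensors on the arm converge on (follow) the target.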
ENVIRONMENTAL CHANGE PROPOSING SYSTEM AND COMPUTER-READABLE STORAGE MEDIUM STORING AN ENVIRONMENTAL CHANGE PROPOSING PROGRAM
The environmental change proposing system includes: an acquiring unit configured to acquire processing information regarding processing of a conveyance task from a conveyance robot that has executed the conveyance task using an environmental map; an evaluation computing unit configured to perform an evaluation computation for evaluating, based on the processing information, a virtual layout in which the arrangement of installation objects on the environmental map has been changed; and an output unit configured to generate and output proposal information regarding a change in arrangement of the installation objects, based on an evaluation result of the evaluation computing unit.
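One plausible form of the evaluation computation is to replay the recorded conveyance tasks against each virtual layout and score the layout by total estimated travel time. The sketch below assumes processing information reduced to (origin, destination, count) triples and a per-layout travel-time table; both data shapes and the function names are assumptions made for illustration.

```python
def evaluate_layout(task_logs, layout_travel_time):
    """Score a virtual layout by the estimated total travel time of the
    recorded conveyance tasks under the changed arrangement.

    task_logs          -- iterable of (origin, destination, count) triples
                          taken from the robot's processing information.
    layout_travel_time -- maps an (origin, destination) pair to seconds
                          under the candidate arrangement.
    """
    return sum(count * layout_travel_time[(src, dst)]
               for src, dst, count in task_logs)

def propose_best(task_logs, candidate_layouts):
    """Return the candidate layout with the lowest evaluated cost."""
    return min(candidate_layouts,
               key=lambda travel: evaluate_layout(task_logs, travel))
```

The output unit would then turn the winning candidate into proposal information describing how the installation objects should be rearranged.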
Robotic system control method and controller
The present disclosure provides a control method for a robotic system. The control method includes: deriving an approach location at which an end effector grips an operation object; deriving a scan location for scanning an identifier of the operation object; and, based on the approach location and the scan location, creating or deriving a control sequence and instructing the robot to execute it. The control sequence includes (1) gripping the operation object at a start location; (2) scanning an identifier of the operation object with a scanner located between the start location and a task location; (3) when a predetermined condition is satisfied, temporarily releasing the operation object from the end effector and regripping it at a shift location so as to shift the grip; and (4) moving the operation object to the task location.
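The four-step control sequence, with its conditional regrip, can be sketched as a simple sequence builder. The step names, argument layout, and the boolean regrip predicate are all hypothetical; the abstract does not disclose how the "predetermined condition" is evaluated.

```python
def build_control_sequence(start, scan, task, shift, needs_regrip):
    """Assemble the control sequence described above as (step, location)
    pairs. The regrip step is inserted only when the predetermined
    condition (here reduced to a boolean) is satisfied."""
    sequence = [("grip", start), ("scan_identifier", scan)]
    if needs_regrip:
        sequence.append(("release_and_regrip", shift))
    sequence.append(("move_to_task", task))
    return sequence
```

A controller would iterate over the returned pairs and dispatch each step to the robot.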
Hide sorting systems and methods
Methods and systems for sorting hides are provided. In particular, one or more embodiments comprise a tanning control system that enhances the traceability of hides by capturing and utilizing data related to the unloading, tanning, sorting, and packaging of hides. Furthermore, one or more embodiments enable the tanning control system to improve efficiency by sorting hides based, at least in part, on data generated during prior tanning processes. Additionally, one or more embodiments facilitate the tanning control system in customizing the sorting and packaging of hides based, at least in part, on one or more hide characteristics and/or customer specifications.
Conveyor system with multiple robot singulators
A conveyor system includes: a pick conveyor defining a picking area for a bulk flow of parcels; a place conveyor positioned downstream of the picking area; a first robot singulator and a second robot singulator, which work in parallel to transfer parcels within a picking area of the pick conveyor to the place conveyor; and a vision and control subsystem that communicates instructions to control operation of some or all of the foregoing components. The vision and control subsystem includes a target camera for acquiring one or more images of the picking area, which are processed within the system to determine the location of parcels positioned within the picking area. The vision and control subsystem can execute one or more routines or subroutines to reduce system downtime associated with image acquisition and processing, parcel transfer to the place conveyor, and/or parcel delivery to the picking area.
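When two singulators share one picking area, a dispatch routine must split the detected parcels between them. A minimal proximity heuristic is sketched below, assigning each parcel (reduced here to a position along the pick conveyor) to the nearer robot; this is an illustrative assumption, not the patented vision-and-control routine.

```python
def assign_parcels(parcel_positions, robot1_x, robot2_x):
    """Split detected parcels between two singulators by proximity along
    the pick conveyor. Ties go to the first robot."""
    first, second = [], []
    for pos in parcel_positions:
        if abs(pos - robot1_x) <= abs(pos - robot2_x):
            first.append(pos)
        else:
            second.append(pos)
    return first, second
```

A fuller routine would also rebalance the split when one robot's queue backs up, to keep both arms busy and reduce downtime.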
VISION-GUIDED PICKING AND PLACING METHOD, MOBILE ROBOT AND COMPUTER-READABLE STORAGE MEDIUM
A vision-guided picking and placing method for a mobile robot that has a manipulator having a hand and a camera, includes: receiving a command instruction that instructs the mobile robot to grasp a target item among at least one object; controlling the mobile robot to move to a determined location, controlling the manipulator to reach for the at least one object, and capturing one or more images of the at least one object using the camera; extracting visual feature data from the one or more images, matching the extracted visual feature data to preset feature data of the target item to identify the target item, and determining a grasping position and a grasping vector of the target item; and controlling the manipulator and the hand to grasp the target item according to the grasping position and the grasping vector, and placing the target item to a target position.
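The identification step above matches extracted visual feature data against preset feature data of the target item. A common concrete form is nearest-neighbour matching of feature vectors under a distance threshold, sketched below; the vector representation, threshold value, and function name are assumptions for illustration only.

```python
import numpy as np

def identify_target(extracted_features, preset_feature, threshold=0.3):
    """Return the index of the detected object whose feature vector lies
    closest to the preset feature data of the target item, or None if no
    object matches within the distance threshold."""
    distances = [np.linalg.norm(np.asarray(vec) - np.asarray(preset_feature))
                 for vec in extracted_features]
    best = int(np.argmin(distances))
    return best if distances[best] < threshold else None
```

Once the target index is known, the corresponding image region can be used to determine the grasping position and grasping vector passed to the manipulator.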