G05B2219/40543

System and method for piece-picking or put-away with a mobile manipulation robot

A method and system for piece-picking or piece put-away within a logistics facility. The system includes a central server and at least one mobile manipulation robot. The central server is configured to communicate with the robots to send and receive piece-picking data which includes a unique identification for each piece to be picked, a location within the logistics facility of the pieces to be picked, and a route for the robot to take within the logistics facility. The robots can then autonomously navigate and position themselves within the logistics facility by recognition of landmarks by at least one of a plurality of sensors. The sensors also provide signals related to detection, identification, and location of a piece to be picked or put-away, and processors on the robots analyze the sensor information to generate movements of a unique articulated arm and end effector on the robot to pick or put-away the piece.
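The piece-picking data the central server sends to a robot could be modeled as a simple record. A minimal sketch, assuming illustrative field names (the abstract only names the data categories: unique piece identification, piece location, and a route):

```python
from dataclasses import dataclass, field

@dataclass
class PickTask:
    """One piece-picking assignment sent from the central server to a robot.

    Field names are illustrative; the abstract specifies only the data
    categories (unique identification, location, route)."""
    piece_id: str                              # unique identification of the piece
    location: tuple                            # e.g. (aisle, shelf, bin) in the facility
    route: list = field(default_factory=list)  # waypoints for the robot to follow

task = PickTask(piece_id="SKU-0042",
                location=("A3", "S2", "B7"),
                route=["dock", "A1", "A3"])
```

A dataclass keeps the server/robot message schema explicit while leaving serialization (e.g. to JSON) to the transport layer.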

Assisted assigning of a workpiece to a mobile unit of an indoor location system

A method for assigning a workpiece to be processed to a mobile unit of an indoor location system used in a manufacturing hall in the industrial processing of workpieces is provided. The method includes: providing a manufacturing control system for the industrial processing of workpieces with a machine tool in accordance with workpiece-specific processing plans; providing an assistance system for collecting measurement-assistance-workpiece data sets for the workpieces; detecting a workpiece to be assigned with the assistance system and generating a measurement-assistance-workpiece data set for the workpiece; comparing the measurement-assistance-workpiece data set with a processing-plan-assistance-workpiece data set corresponding to the workpiece to identify a processing plan; providing a mobile unit assigned to the identified processing plan, the indoor location system being configured to determine a position of the mobile unit; and spatially assigning the workpiece to the mobile unit assigned to the identified processing plan.
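The plan-identification step, comparing a measured workpiece data set against plan-specific data sets, could be sketched as follows, assuming a hypothetical schema where each data set is a dict of measured dimensions:

```python
def identify_processing_plan(measured, plans, tol=1e-3):
    """Return the ID of the processing plan whose assistance-workpiece data
    set matches the measured data set within a tolerance.

    Schema assumed for illustration: dicts mapping dimension name to value."""
    for plan_id, plan_data in plans.items():
        if plan_data.keys() == measured.keys() and all(
                abs(measured[k] - v) <= tol for k, v in plan_data.items()):
            return plan_id
    return None  # no matching plan: the workpiece cannot be assigned

# Example: two processing plans distinguished by workpiece dimensions.
plans = {"PLAN-7": {"length": 120.0, "width": 40.0},
         "PLAN-9": {"length": 80.0, "width": 40.0}}
```

Once a plan is identified, the workpiece would be spatially assigned to the mobile unit already associated with that plan.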

Robot Control Device, Method and Program
20220032459 · 2022-02-03

When an obstruction is detected during operation of a robot, a path generation section acquires environment information at the periphery of the robot after the obstruction occurred, robot specification information, and safe-pose information representing a recovery pose for the robot, and generates a path for the robot from its post-obstruction pose to the safe pose based on the acquired information.
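A minimal sketch of the recovery-path generation, assuming 2D poses and using an `is_free` predicate as a stand-in for the acquired environment information (a real planner would search around blocked regions rather than fail):

```python
def plan_recovery_path(current_pose, safe_pose, is_free, steps=10):
    """Generate a straight-line path from the pose after the obstruction
    to the safe (recovery) pose, validating each waypoint against the
    environment via the is_free predicate."""
    path = []
    for i in range(steps + 1):
        t = i / steps
        waypoint = tuple(a + t * (b - a) for a, b in zip(current_pose, safe_pose))
        if not is_free(waypoint):
            return None  # blocked: a real planner would replan around the obstacle
        path.append(waypoint)
    return path
```

Robot specification information (reach, joint limits) would constrain which waypoints are admissible; it is omitted here for brevity.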

Actuator control system, actuator control method, information processing program, and storage medium

An actuator control system includes: a transmission control unit configured to transmit final data, the final result of a computation by a sensor output computation unit, and to transmit intermediate data before transmitting the final data; and a command value computation unit configured to compute a command value for driving an actuator based on the intermediate data and the final data transmitted by the transmission control unit.
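The command value computation could be sketched as below: a provisional command is issued from the intermediate data before the final result arrives, after which the final data takes precedence. The blending rule and the `gain` parameter are illustrative assumptions, not from the abstract:

```python
def compute_command(intermediate, final=None, gain=0.5):
    """Compute an actuator command value.

    While only intermediate data has been transmitted, act provisionally on
    it (scaled by an assumed gain); once the final data is transmitted, use
    it directly. This lets the actuator start responding before the sensor
    output computation completes."""
    if final is None:
        return gain * intermediate  # provisional command from intermediate data
    return final                    # final data supersedes the provisional value
```

The benefit is latency: acting on intermediate data hides part of the sensor computation time behind actuator motion.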

SYSTEMS AND METHODS FOR 3D SCENE AUGMENTATION AND RECONSTRUCTION
20210383115 · 2021-12-09

A computer-implemented visual input reconstruction system for enabling selective insertion of content into preexisting media content frames may include at least one processor configured to perform operations. The operations may include accessing a memory storing object image identifiers associated with objects and transmitting, to one or more client devices, an object image identifier. The operations may include receiving bids from one or more client devices and determining a winning bid. The operations may include receiving winner image data from a winning client device and storing the winner image data in the memory. The operations may include identifying, in a preexisting media content frame, an object insertion location. The operations may include generating a processed media content frame by inserting a rendition of the winner image data at the object insertion location in the preexisting media content frame and transmitting the processed media content frame to one or more user devices.
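The bid-determination step could be sketched as follows, assuming each bid is a `(client_id, amount)` pair and that the highest amount wins (the abstract does not specify the auction rule):

```python
def determine_winning_bid(bids):
    """Select the winning bid for an object image identifier.

    Assumed rule: each bid is a (client_id, amount) tuple and the highest
    amount wins; an empty bid list yields no winner."""
    return max(bids, key=lambda bid: bid[1], default=None)
```

The winning client would then supply the winner image data to be rendered at the identified object insertion location.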

Holding apparatus, container provided with tag, object holding program and object holding method

A holding apparatus includes a first detection unit configured to detect an indicated holding object, a second detection unit configured to detect, when the first detection unit detects the holding object, a tag proximate to the holding object, and a holding part configured to hold a container provided with the tag based on tag information of the tag detected by the second detection unit.
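The second detection stage, finding the tag proximate to an already-detected holding object, could be sketched as a nearest-tag lookup. The positional schema and distance threshold are assumptions for illustration:

```python
import math

def find_proximate_tag(object_pos, tags, max_dist=0.2):
    """Return the tag information of the tag nearest the detected holding
    object, if any tag lies within max_dist (threshold assumed).

    tags: iterable of (position, tag_info) pairs; positions are 2D tuples."""
    best = None
    for pos, info in tags:
        dist = math.dist(object_pos, pos)
        if dist <= max_dist and (best is None or dist < best[0]):
            best = (dist, info)
    return best[1] if best else None
```

The holding part would then grip the container identified by the returned tag information.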

METHOD AND SYSTEM FOR ROBOTIC ASSEMBLY
20210370513 · 2021-12-02

A method for robotic assembly includes: receiving product data including product structure data and/or product geometry data of a product with a base component and at least one assembly part to be assembled; analyzing the product data to determine robot functions relating to functions of a robot for assembly of the product as determined robot functions; generating a robot program including assembly instructions dependent on the determined robot functions and the product data; and executing the generated robot program so as to identify and/or localize the at least one assembly part and assemble the product.
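The program-generation step could be sketched as translating product structure data into a sequence of assembly instructions. The product schema and instruction vocabulary are illustrative assumptions:

```python
def generate_robot_program(product):
    """Derive assembly instructions from product structure data.

    Assumed schema: product is a dict with a 'base' component name and a
    list of 'parts', each with a name and a target pose on the base."""
    program = [("locate", product["base"])]
    for part in product["parts"]:
        program += [("identify", part["name"]),
                    ("pick", part["name"]),
                    ("place", part["name"], part["pose"])]
    return program

product = {"base": "chassis",
           "parts": [{"name": "gear", "pose": (0.1, 0.0, 0.02)},
                     {"name": "cover", "pose": (0.0, 0.0, 0.05)}]}
```

Executing such a program corresponds to the final step of the method: identifying and localizing each assembly part, then assembling the product.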

System and method for piece picking or put-away with a mobile manipulation robot

A method and system for picking or put-away within a logistics facility. The system includes a central server and at least one mobile manipulation robot. The central server is configured to communicate with the robots to send and receive picking data which includes a unique identification for each item to be picked, a location within the logistics facility of the items to be picked, and a route for the robot to take within the logistics facility. The robots can then autonomously navigate and position themselves within the logistics facility by recognition of landmarks by at least one of a plurality of sensors. The sensors also provide signals related to detection, identification, and location of an item to be picked or put-away, and processors on the robots analyze the sensor information to generate movements of a unique articulated arm and end effector on the robot to pick or put-away the item.

AUTONOMOUS TASK PERFORMANCE BASED ON VISUAL EMBEDDINGS

A method for controlling a robotic device is presented. The method includes capturing an image corresponding to a current view of the robotic device. The method also includes identifying a keyframe image comprising a first set of pixels matching a second set of pixels of the image. The method further includes performing, by the robotic device, a task corresponding to the keyframe image.
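The keyframe-identification step could be sketched as finding the stored keyframe whose pixels best match the current view. A pixel-equality score is used here as a simple stand-in for the visual-embedding similarity the title refers to; images are flat pixel lists and the threshold is an assumption:

```python
def match_keyframe(current, keyframes, threshold=0.7):
    """Identify the keyframe image whose pixel set best matches the current
    view, or None if no keyframe clears the similarity threshold.

    current: flat list of pixel values; keyframes: dict mapping a task
    label to an equally sized pixel list (schema assumed)."""
    best_id, best_score = None, 0.0
    for kf_id, kf_pixels in keyframes.items():
        score = sum(a == b for a, b in zip(current, kf_pixels)) / len(current)
        if score >= threshold and score > best_score:
            best_id, best_score = kf_id, score
    return best_id
```

The robotic device would then perform the task associated with the matched keyframe.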
