Patent classifications
G05B2219/40532
ROBOT CONTROLLING SYSTEM
An application processor processes an application. A sensor processor acquires image data from an image sensor and analyzes the image data. A motion controlling processor controls motion of a movable part of a robot. The motion controlling processor provides the sensor processor with posture information specifying an orientation of the image sensor, bypassing the application processor. The posture information also includes information specifying a position of the image sensor.
Domain adaptation using simulation to simulation transfer
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training a generator neural network to adapt input images.
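The abstract above describes training a generator network to adapt images from one simulator so they resemble another. As a minimal, hedged illustration of the idea (not the patented method), the sketch below replaces the neural generator with the simplest possible adapter: a per-channel affine map fit so adapted source-simulator images match the target simulator's channel statistics. All array shapes and names are assumptions.

```python
import numpy as np

# Hypothetical sketch: a per-channel affine "generator" for sim-to-sim
# transfer. A real system would train a generator neural network; here
# we only fit gain/bias so source-sim channel statistics match the
# target simulator's.

def fit_channel_affine(source, target):
    """Fit per-channel gain/bias mapping source stats onto target stats.

    source, target: float arrays of shape (N, H, W, C).
    Returns (gain, bias), each of shape (C,).
    """
    src_mean = source.mean(axis=(0, 1, 2))
    src_std = source.std(axis=(0, 1, 2)) + 1e-8   # avoid divide-by-zero
    tgt_mean = target.mean(axis=(0, 1, 2))
    tgt_std = target.std(axis=(0, 1, 2))
    gain = tgt_std / src_std
    bias = tgt_mean - gain * src_mean
    return gain, bias

def adapt(images, gain, bias):
    """Apply the fitted affine map to a batch of source-sim images."""
    return images * gain + bias

# Toy "renders" from two simulators with different pixel statistics.
rng = np.random.default_rng(0)
source = rng.normal(0.3, 0.1, size=(8, 16, 16, 3))   # simulator A
target = rng.normal(0.6, 0.2, size=(8, 16, 16, 3))   # simulator B
gain, bias = fit_channel_affine(source, target)
adapted = adapt(source, gain, bias)
```

After fitting, `adapted` has the target simulator's per-channel mean and standard deviation by construction; a learned generator generalizes this idea to content-dependent, spatially varying adaptation.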
ROBOTIC LAUNDRY SORTING DEVICES, SYSTEMS, AND METHODS OF USE
Devices, systems, and methods for autonomously sorting dirty laundry articles into batched loads for washing are described. For example, an autonomous sorting device includes an enclosed channel with a plurality of sequential work volumes and a stationary floor extending between an inlet end and an outlet end of the channel. A plurality of arms is disposed in series along the channel; each arm rotates, tilts, extends, and retracts a terminal gripper into an associated work volume to grab at least one of a plurality of deformable dirty laundry articles and pass it to an adjacent work volume for grasping and hoisting by an adjacent arm. The device includes an inlet orifice for receiving the dirty laundry articles into the enclosed channel and an outlet orifice adjacent the outlet end through which each separated article exits the channel into sorting bins.
Robot teaching by human demonstration
A method for teaching a robot to perform an operation based on human demonstration with images from a camera. The method includes a teaching phase in which a 2D or 3D camera detects a human hand grasping and moving a workpiece, and images of the hand and workpiece are analyzed to determine a robot gripper pose and positions that correspond to the pose and positions of the hand and the corresponding pose and positions of the workpiece. Robot programming commands are then generated from the computed gripper pose and position relative to the workpiece pose and position. In a replay phase, the camera identifies the workpiece pose and position, and the programming commands cause the robot to move the gripper to pick, move, and place the workpiece as demonstrated. A teleoperation mode is also disclosed, in which camera images of a human hand are used to control movement of the robot in real time.
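The pose bookkeeping behind teach-and-replay can be sketched briefly: record the demonstrated gripper pose relative to the workpiece, then at replay compose it with the newly observed workpiece pose. The sketch below uses planar rigid transforms as 3x3 homogeneous matrices and assumes the camera-based hand/workpiece detection already happened; it is an illustration of the geometry, not the patented pipeline.

```python
import numpy as np

# Hedged sketch of teach-and-replay pose composition. Poses are planar
# rigid transforms (x, y, heading) as 3x3 homogeneous matrices; the
# vision side (detecting hand and workpiece poses) is assumed given.

def pose2d(x, y, theta):
    """Build a 2D rigid transform from translation and heading."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

def teach(gripper_pose, workpiece_pose):
    """Teaching phase: express the gripper pose in the workpiece frame."""
    return np.linalg.inv(workpiece_pose) @ gripper_pose

def replay(relative_pose, new_workpiece_pose):
    """Replay phase: target gripper pose for the re-detected workpiece."""
    return new_workpiece_pose @ relative_pose

# Demonstration: the hand grasps the part at a fixed offset and angle.
demo_workpiece = pose2d(0.50, 0.20, 0.0)
demo_gripper = pose2d(0.55, 0.20, np.pi / 2)
rel = teach(demo_gripper, demo_workpiece)

# At replay the workpiece is found elsewhere; the same relative grasp applies.
new_workpiece = pose2d(0.10, 0.70, np.pi / 4)
target_gripper = replay(rel, new_workpiece)
```

Replaying against the original workpiece pose reproduces the demonstrated gripper pose exactly, which is the invariant the teach phase is designed to capture.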
Robotic laundry sorting devices, systems, and methods of use
Devices, systems, and methods for autonomously sorting dirty laundry articles into batched loads for washing are described. For example, an autonomous sorting system includes an enclosed channel with a stationary floor extending between an inlet end and an outlet end of the channel, and a plurality of arms disposed in series along the channel for selectively grasping a plurality of deformable articles in sequence. The system includes an outlet orifice adjacent the outlet end through which each separated article exits the channel upon release by an arm's terminal gripper, and one or more conveyors adjacent the outlet end configured to receive a plurality of bins that collect, for washing together, two or more of the deformable articles released through the outlet orifice that share one or more sensor-detected characteristics.
Sensorized Robotic Gripping Device
A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
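One way the two palm-mounted sensors could be combined is as a simple gate on grasping: the time-of-flight reading indicates whether something lies within reach between the digits, and the actively illuminated IR grayscale image confirms a reflective surface is actually present. The sketch below is a hypothetical fusion rule with invented thresholds, not logic from the patent.

```python
import numpy as np

# Hypothetical sensor-fusion gate for the palm-mounted sensors.
# Thresholds (max_reach_m, min_brightness) are invented for illustration.

def object_between_digits(tof_distance_m, ir_image,
                          max_reach_m=0.10, min_brightness=40.0):
    """Return True when both sensors agree an object is in the grasp volume."""
    in_range = 0.0 < tof_distance_m < max_reach_m     # ToF: close enough
    bright_enough = ir_image.mean() > min_brightness  # IR: reflective surface
    return bool(in_range and bright_enough)

# Example: an object 4 cm from the palm, reflecting the IR illumination.
palm_ir = np.full((24, 24), 120.0)       # bright IR frame
ready = object_between_digits(0.04, palm_ir)
```

Requiring both modalities to agree guards against either failure mode alone: a spurious ToF echo with nothing visible, or IR glare with nothing in reach.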
DEVICE AND METHOD FOR CONTROLLING A ROBOT FOR PICKING UP AN OBJECT
A method for controlling a robot for picking up an object. The method includes: receiving a camera image of an object; ascertaining an image region in the camera image that shows an area of the object where it may not be picked up, by conveying the camera image to a machine learning model trained to allocate values to regions in camera images indicating whether those regions show areas of an object where it may not be picked up; allocating the ascertained image region to a spatial region; and controlling the robot to grasp the object in a spatial region other than the ascertained spatial region.
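The selection step described above can be sketched as masking a grasp-quality map with the model's per-pixel "do not pick" scores and choosing the best remaining pixel. Both maps below are stand-ins: `no_pick` plays the role of the trained model's output, and `quality` an assumed grasp-quality heuristic; neither comes from the patent.

```python
import numpy as np

# Sketch of choosing a grasp point that avoids model-flagged regions.
# `no_pick` stands in for the trained model's per-pixel output (high =
# "this area must not be picked"); `quality` is an assumed grasp-quality
# map (e.g. flatness or distance from the object edge).

def choose_grasp(quality, no_pick_scores, threshold=0.5):
    """Return (row, col) of the best allowed pixel, or None if none is safe."""
    allowed = no_pick_scores < threshold
    if not allowed.any():
        return None                                 # nowhere safe to grasp
    masked = np.where(allowed, quality, -np.inf)    # exclude flagged pixels
    r, c = np.unravel_index(np.argmax(masked), masked.shape)
    return (int(r), int(c))

quality = np.array([[0.2, 0.9, 0.4],
                    [0.1, 0.8, 0.7],
                    [0.3, 0.2, 0.6]])
no_pick = np.array([[0.0, 0.9, 0.1],    # middle column flagged "no pick"
                    [0.0, 0.8, 0.2],
                    [0.0, 0.7, 0.1]])
grasp = choose_grasp(quality, no_pick)
```

Here the highest-quality pixels sit in the flagged middle column, so the selection falls back to the best pixel outside the forbidden region, mirroring the "grasp in a spatial region other than the ascertained one" step.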
ROBOT CONTROLLING SYSTEM
A control system includes an application processor that executes a first operating system to process an application, a sensor processor that executes a second operating system to process image data acquired by an image sensor, and a motion controlling processor that executes a third operating system to control motion of a movable part of a robot. The first operating system, the second operating system, and the third operating system are different from each other.