
METHOD OF TEACHING ROBOT AND ROBOT SYSTEM
20210114221 · 2021-04-22

A robot system includes a robot, a vision sensor, and a controller. The vision sensor is configured to be detachably attached to the robot. The controller is configured to measure a reference object by using the vision sensor and calibrate a relative relationship between a sensor portion of the vision sensor and an engagement portion of the vision sensor, and teach the robot by referring to the relative relationship and by using the vision sensor, after the vision sensor is attached to the robot.
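As an illustrative sketch only (not taken from the patent), the calibration step described above can be thought of as solving for the fixed transform between the sensor portion and the engagement (mounting) portion from one measurement of a reference object whose pose in the robot base frame is known. All function and variable names here are hypothetical:

```python
import numpy as np

def pose(rz, t):
    """Homogeneous transform: rotation rz (radians, about Z) plus translation t."""
    c, s = np.cos(rz), np.sin(rz)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = t
    return T

def calibrate_sensor_to_engagement(T_base_engage, T_sensor_ref, T_base_ref):
    """Solve for T_engage_sensor from one reference-object measurement:
    T_base_ref = T_base_engage @ T_engage_sensor @ T_sensor_ref
    => T_engage_sensor = inv(T_base_engage) @ T_base_ref @ inv(T_sensor_ref)."""
    return np.linalg.inv(T_base_engage) @ T_base_ref @ np.linalg.inv(T_sensor_ref)
```

Once this relative relationship is known, the controller can teach positions through the camera and convert them into poses of the engagement portion. A production implementation would use several measurements and a least-squares (hand-eye) solve rather than a single one.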

Automated proxy picker system for non-fungible goods

A system and method are provided for proxy picking of non-fungible goods within an automated storage and retrieval system, repurposing one or more automated mobile robots operating within the automated inventory management system to perform a plurality of tasks across multiple areas of an automated store. The proxy picking system and method are configured to pick individually identified non-fungible goods according to a customer selection made on an ordering screen, based on measured attributes and images of the goods, the attributes being selected by the customer.
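The selection logic behind such proxy picking can be sketched as matching each customer's attribute preferences against the measured attributes of individually identified items. This is a minimal illustrative sketch, not the patented method; the attribute names and scoring rule are assumptions:

```python
def select_item(inventory, preferences):
    """Return the id of the in-stock item whose measured attributes best
    match the customer's selection (smallest total attribute deviation)."""
    def score(item):
        return sum(abs(item["attrs"][k] - v) for k, v in preferences.items())
    return min(inventory, key=score)["id"]
```

A real system would weight attributes differently, break ties, and fall back when no item is close enough to the customer's selection.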

Robot simulation apparatus and robot simulation method

The robot simulation apparatus includes: a workpiece model setting unit 11 that sets a workpiece model obtained by forming a model of a three-dimensional shape of a workpiece; a bulk pile data generating unit 20 that generates virtual bulk pile data of a plurality of workpiece models piled up in a virtual work space as a virtually formed work space; a region estimating unit 22 that identifies an estimated region that is estimated to be three-dimensionally measurable by a sensor unit which is disposed above the work space, based on a position and a posture of each workpiece model in the bulk pile data; and a picking motion simulating unit 30 that executes a simulation for verifying the picking motion from the bulk pile of the workpiece models in the virtual work space, based on data of the estimated region identified by the region estimating unit 22.
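The region-estimation idea above — deciding which piled workpieces an overhead sensor can actually measure — can be illustrated with a simple occlusion check. This is a hypothetical sketch under a crude assumption (a workpiece is unmeasurable if another workpiece lies above it within a fixed footprint radius), not the patented estimation method:

```python
def estimate_measurable(pile, radius=0.05):
    """Mark each workpiece as measurable by an overhead sensor if no other
    workpiece lies above it within the given occlusion radius (meters)."""
    visible = []
    for w in pile:
        occluded = any(
            o is not w and o["z"] > w["z"]
            and abs(o["x"] - w["x"]) < radius
            and abs(o["y"] - w["y"]) < radius
            for o in pile
        )
        visible.append(not occluded)
    return visible
```

The apparatus described in the abstract would instead use the full position and posture of each workpiece model and the sensor's measurement geometry.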

Positioning a robot sensor for object classification
10967507 · 2021-04-06

In one embodiment, a method includes receiving, from a first sensor on a robot, first sensor data indicative of an environment of the robot. The method also includes identifying, based on the first sensor data, an object of an object type in the environment of the robot, where the object type is associated with a classifier that takes sensor data from a predetermined pose relative to the object as input. The method further includes causing the robot to position a second sensor on the robot at the predetermined pose relative to the object. The method additionally includes receiving, from the second sensor, second sensor data indicative of the object while the second sensor is positioned at the predetermined pose relative to the object. The method further includes determining, by inputting the second sensor data into the classifier, a property of the object.
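The two-stage loop described in this abstract — detect with a first sensor, move a second sensor to the pose the classifier expects, then classify — can be sketched as follows. This is an illustrative skeleton, not the patented method; the pose table, callables, and return values are all assumptions:

```python
# Hypothetical table: required sensor offset (relative to the object)
# that each object type's classifier was trained for.
CLASSIFIER_POSES = {"mug": (0.0, 0.0, 0.25)}

def classify_object(detect, move_sensor, capture, classifiers):
    """Detect an object, reposition the second sensor to the classifier's
    predetermined pose, then classify the captured data."""
    obj_type, obj_pos = detect()                    # first sensor: coarse detection
    offset = CLASSIFIER_POSES[obj_type]             # pose the classifier expects
    move_sensor(tuple(p + o for p, o in zip(obj_pos, offset)))
    data = capture()                                # second sensor at that pose
    return classifiers[obj_type](data)              # property of the object
```

Injecting `detect`, `move_sensor`, and `capture` as callables keeps the control flow testable without robot hardware.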

Robot system

A robot system includes: at least one supporting unit on which a person can be placed; a robot including a plurality of multi-jointed arms, each of which has a plurality of degrees of freedom, and at least one base provided with the plurality of multi-jointed arms; and at least one type of equipment mountable to the plurality of multi-jointed arms. The robot performs at least two nursing/medical actions on the person by operating the at least one type of equipment.

Robot simulation apparatus and robot simulation method
10913157 · 2021-02-09

The robot simulation apparatus includes: a workpiece model setting unit 11 that sets a workpiece model; a posture condition setting unit 16 that sets a condition related to a posture taken by the workpiece model, as a posture condition, when the workpiece model set by the workpiece model setting unit 11 is disposed in a virtual work space as a virtually formed work space; a bulk pile data generating unit 20 that generates bulk pile data of the plurality of workpiece models piled up in the virtual work space, in accordance with the posture condition set by the posture condition setting unit 16; and a picking motion simulating unit 30 that executes a picking motion simulation for verifying the picking motion of the workpiece models in the virtual work space, with respect to the bulk pile data generated by the bulk pile data generating unit 20.
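The generate-then-verify flow in this abstract — sample workpiece poses subject to a posture condition, then run a picking simulation over the pile — can be sketched in a few lines. This is a hypothetical illustration of the flow, not the patented simulation; the pose fields and conditions are assumptions:

```python
import random

def generate_bulk_pile(n, posture_condition, seed=0):
    """Sample n workpiece poses in a virtual work space, keeping only poses
    that satisfy the posture condition (e.g. a limit on tilt angle)."""
    rng = random.Random(seed)
    pile = []
    while len(pile) < n:
        candidate = {"x": rng.uniform(0.0, 1.0), "y": rng.uniform(0.0, 1.0),
                     "tilt_deg": rng.uniform(0.0, 90.0)}
        if posture_condition(candidate):
            pile.append(candidate)
    return pile

def simulate_picking(pile, graspable):
    """Count how many workpieces the simulated picking motion can retrieve."""
    return sum(1 for p in pile if graspable(p))
```

Rejection sampling is the simplest way to honor a posture condition; the actual apparatus would also model collisions, gripper geometry, and the sensor's view of the pile.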

End effector, hand device and robot equipped with the same
10933538 · 2021-03-02

A hand section 40 includes: a hand base section 40a attached to a link section; a first finger section 40b extending from the tip end portion of the hand base section 40a in a direction non-parallel to the direction from the base end portion toward the tip end portion of the hand base section 40a; and a camera 40f provided on a side surface of the hand base section 40a and capable of imaging the area to the side of the hand base section 40a.

Holding device, handling apparatus, and detection device

According to one embodiment, a holding device includes a first holder and a first sensor. The first holder includes a first member and a second member. The first holder is configured to hold an object by interposing the object between the first member and the second member. The first sensor is in the first member. The first sensor is configured to detect a vibration.
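A vibration sensor embedded in a gripper finger is typically used to flag events such as contact or incipient slip. As a purely illustrative sketch (the detection rule and threshold are assumptions, not from the abstract), a sliding-window RMS check over the vibration signal might look like:

```python
def slip_detected(samples, window=4, threshold=0.5):
    """Flag a vibration event when the RMS of the signal over any sliding
    window of the given length exceeds the threshold."""
    for i in range(len(samples) - window + 1):
        w = samples[i:i + window]
        rms = (sum(s * s for s in w) / window) ** 0.5
        if rms > threshold:
            return True
    return False
```

A windowed statistic is preferred over a single-sample threshold because isolated spikes from normal motion would otherwise trigger false detections.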

Robotic gripper camera

An unmanned ground vehicle includes a main body, a drive system supported by the main body, and a manipulator arm pivotally coupled to the main body. The drive system comprises right and left driven track assemblies mounted on the right and left sides of the main body. The manipulator arm includes a gripper, a wrist motor configured to rotate the gripper, and an inline camera in a palm of the gripper. The inline camera is mechanically configured to remain stationary with respect to the manipulator arm while the wrist motor rotates the gripper.
