Patent classifications
G05B2219/36412
TEACHING APPARATUS, ROBOT SYSTEM, AND TEACHING METHOD
A teaching apparatus includes circuitry. The circuitry is configured to obtain result information corresponding to a position of a worked region on a workpiece. The circuitry is configured to generate first teaching information based on the result information. The first teaching information specifies a motion of an examination robot configured to examine the workpiece that has undergone work.
Systems and Methods for Distributed Autonomous Robot Interfacing Using Live Image Feeds
Described in detail herein is an autonomous fulfillment system. The system includes a first computing system with an interactive display. The first computing system can transmit a request for physical objects from a facility. A second computing system can transmit instructions to autonomous robot devices to retrieve the physical objects from the facility. The second computing system can control the image capturing device of an autonomous robot device to capture a live image feed of the at least one physical object picked up by the at least one autonomous robot device. The second computing system can switch an input feed of the first computing system to display the live image feed on the display of the first computing system. The second computing system can instruct the autonomous robot device to discard the physical objects picked up by the at least one autonomous robot device and to pick up a replacement physical object.
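The abstract above describes a control flow: dispatch a robot, switch the requester's display input to the robot's live feed, and issue a discard-and-replace instruction when the picked object is wrong. A minimal sketch of that flow, with all class and method names (`RobotDevice`, `FulfillmentCoordinator`, `live_feed`, etc.) being illustrative assumptions rather than anything named in the patent:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RobotDevice:
    """Hypothetical autonomous robot device with an image capturing device."""
    device_id: str
    held_object: Optional[str] = None

    def pick_up(self, obj: str) -> None:
        self.held_object = obj

    def discard_and_replace(self, replacement: str) -> None:
        # Discard the currently held object and pick up the replacement.
        self.held_object = replacement

    def live_feed(self) -> str:
        # Stand-in for a live image feed of the held object.
        return f"feed://{self.device_id}/{self.held_object}"

@dataclass
class FulfillmentCoordinator:
    """Sketch of the 'second computing system': dispatches a robot and
    switches the first computing system's input to the robot's live feed."""
    display_input: str = "default"

    def fulfill(self, robot: RobotDevice, requested: str, available: str) -> str:
        robot.pick_up(available)
        # Switch the first computing system's display to the live image feed.
        self.display_input = robot.live_feed()
        # If the picked object does not match the request, instruct the
        # robot to discard it and pick up a replacement.
        if robot.held_object != requested:
            robot.discard_and_replace(requested)
            self.display_input = robot.live_feed()
        return self.display_input
```

The verification-by-feed step is the distinctive part: the requester watches the same feed the coordinator uses to decide on replacement, so a mismatch is visible at both ends before the replacement instruction is issued.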
CONTROL SYSTEM AND PROGRAMMABLE LOGIC CONTROLLER
A control system includes a camera including a first synchronous counter, a PLC including a second synchronous counter, and a control network connecting the camera and the PLC. The control network repeatedly synchronizes the first synchronous counter and the second synchronous counter with high precision by measuring durations of a plurality of transmission delays between the camera and the PLC. The camera adds, to a captured image obtained by image capturing, a count value of the first synchronous counter at the image capturing, and transmits the captured image to the PLC through the control network. The PLC stores, into a storage device included in the PLC, a count value of the second synchronous counter, internal data or data of a connected control target, the count value from the first synchronous counter, and image data.
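The synchronization scheme above pairs a delay-measured counter offset with counter-tagged images. A toy sketch of the two halves, assuming an NTP-style symmetric-delay estimate over round-trip samples; the function names, sample tuple layout, and record fields are illustrative, not taken from the patent:

```python
import statistics

def estimate_offset(samples):
    """Estimate the camera-to-PLC counter offset from round-trip samples
    (t_send, t_remote, t_back), all in counter ticks. Assumes the
    transmission delay is symmetric, NTP-style."""
    offsets = []
    for t_send, t_remote, t_back in samples:
        delay = (t_back - t_send) / 2            # one-way delay estimate
        offsets.append(t_remote - (t_send + delay))
    return statistics.median(offsets)            # robust to outlier delays

def tag_image(image_bytes, camera_counter):
    """Camera side: attach the first synchronous counter's value at
    image capturing to the captured image."""
    return {"image": image_bytes, "camera_count": camera_counter}

def store_on_plc(storage, tagged, plc_counter, internal_data):
    """PLC side: store its own counter value, internal/control-target
    data, the camera's counter value, and the image data together."""
    storage.append({
        "plc_count": plc_counter,
        "internal": internal_data,
        "camera_count": tagged["camera_count"],
        "image": tagged["image"],
    })
```

Because every stored record carries both counter values, image data and PLC-side control data can later be aligned on a common timebase even though they were sampled by different devices.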
Information processing apparatus and information processing method
An information processing apparatus includes a first sensor, a second sensor, and a central processing unit (CPU). The first sensor detects an object present in a first direction with respect to an autonomous mobile body. The second sensor detects the object present in a second direction with respect to the autonomous mobile body, by a system different from a system of the first sensor. The CPU controls an operation of the autonomous mobile body based on a detection result acquired by the first sensor and a detection result acquired by the second sensor.
INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD
An information processing apparatus includes a first sensor, a second sensor, an output device, and circuitry. The first sensor has a detection range in a first direction with respect to an autonomous mobile body, and the second sensor has a detection range in a second direction with respect to the autonomous mobile body, where the detection range of the first sensor in the first direction overlaps with the detection range of the second sensor in the second direction. Further, the output device presents information, and the circuitry controls an operation of the autonomous mobile body based on a first detection result of the first sensor and a second detection result of the second sensor.
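The two abstracts above describe controlling a mobile body from two sensors with different systems and overlapping detection ranges. A toy fusion rule illustrating the idea, assuming each sensor reports a nearest-obstacle distance in metres (`None` meaning nothing detected); the function name and threshold are illustrative assumptions:

```python
from typing import Optional

def control_action(first_distance: Optional[float],
                   second_distance: Optional[float],
                   stop_threshold: float = 0.5) -> str:
    """Combine the first and second detection results: stop if either
    sensor reports an obstacle inside the threshold. The overlapping
    detection ranges let one sensor confirm, or cover for, the other."""
    readings = [d for d in (first_distance, second_distance) if d is not None]
    if readings and min(readings) < stop_threshold:
        return "stop"
    return "proceed"
```

A real apparatus would weight the two detection results by each sensor's reliability in the overlap region rather than taking a bare minimum, but the control decision still reduces to a function of both results, as the abstracts state.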
Human-robot collaborative flexible manufacturing system and method
An exemplary method and system are disclosed for flexibly and adaptably manufacturing and assembling a workpiece by using recordings of a user in machine-learning/artificial-intelligence algorithms to train a robot for subsequent automated manufacture. Machine-learning and artificial-intelligence methods can generate libraries of generalized dynamic motion primitives (DMPs) that can subsequently be combined for any type of manufacturing or assembling activity. The exemplary method and system can flexibly generate a model of an existing workpiece as a template, or primer workpiece, that can then be used in conjunction with the DMP operations to fabricate subsequent workpieces.
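Dynamic motion primitives are a standard learning-from-demonstration formulation: a stable spring-damper system plus a learned forcing term that reshapes the motion. A minimal one-dimensional sketch of the general technique (not the patent's specific implementation; gains, basis counts, and class names are illustrative defaults):

```python
import numpy as np

class SimpleDMP:
    """Minimal 1-D dynamic motion primitive: learns a forcing term from
    one demonstrated trajectory, then replays it toward a goal."""

    def __init__(self, n_basis=20, alpha=25.0):
        self.n_basis = n_basis
        self.alpha = alpha            # spring gain of the transformation system
        self.beta = alpha / 4.0       # critical damping
        self.ax = 1.0                 # canonical-system decay rate
        self.centers = np.exp(-self.ax * np.linspace(0, 1, n_basis))
        self.widths = n_basis ** 1.5 / self.centers
        self.weights = np.zeros(n_basis)

    def _features(self, x):
        # Gaussian basis functions over the canonical phase x, gated by x.
        psi = np.exp(-self.widths * (x - self.centers) ** 2)
        return psi * x / psi.sum()

    def fit(self, y, dt):
        """Learn basis weights from a demonstrated trajectory y sampled at dt."""
        self.y0, self.goal = y[0], y[-1]
        yd = np.gradient(y, dt)
        ydd = np.gradient(yd, dt)
        t = np.arange(len(y)) * dt
        self.tau = t[-1]
        x = np.exp(-self.ax * t / self.tau)
        # Target forcing term implied by the transformation-system dynamics.
        f = ydd - self.alpha * (self.beta * (self.goal - y) - yd)
        feats = np.array([self._features(xi) for xi in x])
        self.weights = np.linalg.lstsq(feats, f, rcond=None)[0]

    def rollout(self, dt, goal=None):
        """Replay the learned primitive with Euler integration."""
        goal = self.goal if goal is None else goal
        y, yd, x = self.y0, 0.0, 1.0
        out = [y]
        for _ in range(int(self.tau / dt)):
            f = self._features(x) @ self.weights
            ydd = self.alpha * (self.beta * (goal - y) - yd) + f
            yd += ydd * dt
            y += yd * dt
            x += -self.ax * x * dt / self.tau
            out.append(y)
        return np.array(out)
```

A library of such primitives, one per demonstrated skill and degree of freedom, can be sequenced and retargeted to new goals, which is what makes the DMP-library approach suit flexible manufacturing.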