B25J19/04

SHOOTING SYSTEM, SHOOTING METHOD THEREFOR, AND PROGRAM
20220193935 · 2022-06-23

To be able to favorably shoot an object-to-be-shot displayed on a display unit of a remote control-type robot. A shooting system includes a remote control-type robot that is operated remotely and includes a display unit for displaying an image of an object-to-be-shot; a first image capturing device for shooting the remote control-type robot; and an image control device. When the display unit of the remote control-type robot falls within the shooting range of the first image capturing device, the image control device performs at least one of the following so that the image of the object-to-be-shot displayed on the display unit is made clearer: replacing the image of the object-to-be-shot displayed on the display unit as shot by the first image capturing device; adjusting the image of the object-to-be-shot displayed on the display unit; changing the position of the display unit; and changing the posture of the display unit.
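The abstract's trigger condition (display unit falls within the camera's shooting range) and its four alternative corrective actions can be sketched as follows. This is an illustrative sketch only; `Rect`, `display_in_view`, and `choose_action` are hypothetical names, and the patent does not specify how the containment test or the choice among actions is implemented.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned rectangle in image coordinates (hypothetical geometry model)."""
    x: float
    y: float
    w: float
    h: float

def display_in_view(camera_fov: Rect, display: Rect) -> bool:
    """True when the display unit lies fully within the camera's shooting range."""
    return (display.x >= camera_fov.x
            and display.y >= camera_fov.y
            and display.x + display.w <= camera_fov.x + camera_fov.w
            and display.y + display.h <= camera_fov.y + camera_fov.h)

def choose_action(camera_fov: Rect, display: Rect) -> str:
    """Select a corrective action once the display is in view.

    The abstract lists four alternatives (replace the displayed image,
    adjust it, move the display, or reorient the display); this sketch
    simply returns the first as a placeholder.
    """
    if not display_in_view(camera_fov, display):
        return "none"
    return "replace_image"  # or "adjust_image", "move_display", "reorient_display"
```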

Cleaning robot

A cleaning robot includes an arm including a distal end portion to which a brush is attached, the arm extending in a first direction parallel to a horizontal direction; a driver connected to the arm, the driver including a first mechanism that moves the arm in the first direction, a second mechanism that moves the arm in a second direction parallel to a vertical direction perpendicular to the first direction, and a third mechanism that moves the arm in a third direction perpendicular to both the first direction and the second direction; and a controller configured to switch the orientation of the distal end portion between an orientation for cleaning a first target face of an object and an orientation for cleaning a second target face of the object, the first target face facing the first direction and the second target face facing the second direction.
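The three orthogonal drive mechanisms and the controller's orientation switch could be modeled as below. All names (`Orientation`, `select_orientation`, `move_arm`) and the tuple-based position model are assumptions for illustration, not taken from the patent.

```python
from enum import Enum

class Orientation(Enum):
    # Brush orientation for each target face named in the abstract.
    FACE_FIRST_DIRECTION = "horizontal"   # face facing the first (horizontal) direction
    FACE_SECOND_DIRECTION = "vertical"    # face facing the second (vertical) direction

def select_orientation(target_face: str) -> Orientation:
    """Controller rule: pick the distal-end orientation for the face being cleaned."""
    if target_face == "first":
        return Orientation.FACE_FIRST_DIRECTION
    if target_face == "second":
        return Orientation.FACE_SECOND_DIRECTION
    raise ValueError(f"unknown target face: {target_face}")

def move_arm(position, axis: str, delta: float):
    """Apply one of the three orthogonal drive mechanisms to an (x, y, z) position."""
    x, y, z = position
    if axis == "first":    # horizontal, along the arm
        return (x + delta, y, z)
    if axis == "second":   # vertical
        return (x, y + delta, z)
    if axis == "third":    # horizontal, perpendicular to the first two
        return (x, y, z + delta)
    raise ValueError(f"unknown axis: {axis}")
```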

HIGH ACCURACY MOLDED NAVIGATION ARRAYS, SYSTEMS, AND METHODS
20220184824 · 2022-06-16

Systems, methods, and devices are described for high accuracy molded navigation arrays. In example embodiments, a navigation array may be formed by molding, as a single component, an array having a plurality of marker regions that may include a reflective layer disposed thereon. In other embodiments, a navigation array may be formed by molding over a frame having a plurality of marker elements. In still other embodiments, a navigation array may be formed by molding over individual marker elements. In certain embodiments, a navigation array may be formed by molding a frame with a plurality of voids and subsequently molding marker elements into each void where the marker elements may include a reflective layer disposed thereon. In some embodiments, a navigation array may be formed by molding a plurality of marker elements on a frame and disposing a reflective layer on the marker elements.

Robot system and method of operating the same

A robot system includes a user interface configured to receive an operational instruction from an operator; a robot installed in a workspace and configured to perform a series of operations including a plurality of processes; a sensor installed in the workspace; a transparent display unit configured so that the physical real world is visible to the operator and configured to display information detected by the sensor as an image; and a control device. When the robot is operated via the user interface, the control device displays, on the transparent display unit, first information, namely information detected by the sensor, as an image.
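The control device's display rule reduces to a simple conditional: render the sensor information only while the robot is being operated through the user interface. A minimal sketch, assuming dict-based sensor readings; `overlay_for_display` and the payload keys are illustrative names, not the patent's API.

```python
def overlay_for_display(robot_operated: bool, sensor_reading: dict):
    """Return the first information to render on the transparent display,
    or None when the robot is not being operated via the user interface."""
    if not robot_operated:
        return None
    return {"type": "sensor_overlay", "data": sensor_reading}
```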

Robotic multi-gripper assemblies and methods for gripping and holding objects
11345029 · 2022-05-31

A method for operating a transport robot includes receiving image data representative of a group of objects. One or more target objects are identified in the group based on the received image data. Addressable vacuum regions are selected based on the identified target objects. The transport robot is commanded to cause the selected addressable vacuum regions to hold and transport the identified target objects. The transport robot includes a multi-gripper assembly having an array of addressable vacuum regions, each configured to independently provide a vacuum. A vision sensor device can capture the image data, which is representative of the target objects adjacent to or held by the multi-gripper assembly.
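The claimed pipeline (identify targets from image data, select overlapping vacuum regions, command the gripper) can be sketched as below. This is a simplified one-dimensional model under stated assumptions: `detect_objects` is a stand-in for real vision processing, and region footprints are given as x-spans, neither of which reflects the patent's actual implementation.

```python
def detect_objects(image_data):
    """Stand-in detector: image_data is already a list of object dicts,
    each with an 'x' position and a 'graspable' flag."""
    return [obj for obj in image_data if obj.get("graspable")]

def select_vacuum_regions(targets, regions):
    """Pick the addressable vacuum regions whose footprint overlaps a target.

    `regions` maps region name -> (lo, hi) x-span of that region's footprint.
    """
    selected = set()
    for target in targets:
        for name, (lo, hi) in regions.items():
            if lo <= target["x"] <= hi:
                selected.add(name)
    return selected

def command_robot(targets, regions):
    """Build the command: each addressable region independently applies
    vacuum only if it was selected for a target object."""
    selected = select_vacuum_regions(targets, regions)
    return {name: (name in selected) for name in regions}
```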
