Autonomously acting robot that stares at companion
A robot includes an operation control unit that selects a motion of the robot, a drive mechanism that executes the motion selected by the operation control unit, an eye control unit that causes an eye image to be displayed on a monitor installed in the robot, and a recognizing unit that detects a user. The eye control unit changes a pupil region included in the eye image in accordance with the position of the user relative to the robot. The configuration may be such that the eye control unit changes the pupil region when the sight line direction of the user is detected, or when the user is within a predetermined range.
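As a rough illustration of the pupil behavior described in this abstract, the sketch below shifts a pupil region toward a detected user only while the user is within range; the coordinate frame, range, and gain values are illustrative assumptions, not details from the disclosure.

```python
# Hypothetical sketch: shift the pupil region of the eye image toward a
# detected user, scaled by how close the user is to the robot.
import math

def pupil_offset(user_xy, max_offset_px=12.0, detection_range_m=3.0):
    """Return a (dx, dy) pixel offset for the pupil region.

    user_xy: the user's position in the robot's body frame, in metres.
    The pupil stays centred when no user is within detection_range_m.
    """
    x, y = user_xy
    dist = math.hypot(x, y)
    if dist == 0.0 or dist > detection_range_m:
        return (0.0, 0.0)                          # no user, or user out of range
    gain = max_offset_px * min(1.0, 1.0 / dist)    # nearer user -> larger shift
    return (gain * x / dist, gain * y / dist)      # unit direction toward the user

print(pupil_offset((1.0, 0.5)))
```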
Method for operating a picking robot and related devices
A method for operating a picking robot comprising an end effector assembly and a vision assembly, and a related controller device, are disclosed. The method comprises: picking a subject with the end effector assembly from a bin comprising a plurality of subjects; moving the subject to a delivery station; and releasing the subject on the delivery station. The method further comprises locking a joint connection of the end effector assembly prior to and/or during the act of moving the subject to the delivery station.
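The pick-lock-move-release ordering in this abstract can be sketched as a simple controller sequence; the class and method names below are assumptions for illustration, not the patented interface.

```python
# Minimal sketch of the pick-and-place sequence with a lockable joint.
class PickingController:
    def __init__(self, arm, end_effector):
        self.arm = arm                      # motion interface (assumed)
        self.end_effector = end_effector    # gripper with a lockable joint connection (assumed)

    def pick_and_place(self, subject_pose, delivery_pose):
        self.arm.move_to(subject_pose)      # approach a subject in the bin
        self.end_effector.grasp()           # pick the subject
        self.end_effector.lock_joint()      # lock the joint connection before moving
        self.arm.move_to(delivery_pose)     # the joint stays locked during the transfer
        self.end_effector.unlock_joint()
        self.end_effector.release()         # release the subject on the delivery station
```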
System and method for a robotic manipulator system
A robotic arm control system including a robotic arm configured to deploy one or more tools in an operating space, one or more sensors, and a control system operably configured to: scan the operating space with the one or more sensors, identify a surface of the operating space based at least in part upon information sensed by the one or more sensors, establish a virtual barrier offset from the surface, and limit movement of the robotic arm based at least in part upon the virtual barrier.
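As a concrete, hedged sketch of the virtual-barrier limit, the snippet below clamps a commanded end-effector position against a plane offset from a sensed surface; the plane representation, offset distance, and function names are assumptions, not details taken from the disclosure.

```python
# Illustrative sketch: limit commanded positions using a virtual barrier
# that is offset from a sensed surface plane.
import numpy as np

def clamp_to_barrier(target, surface_point, surface_normal, offset=0.05):
    """Project `target` back onto the barrier plane if it would cross it."""
    n = np.asarray(surface_normal, dtype=float)
    n /= np.linalg.norm(n)
    barrier_point = np.asarray(surface_point, dtype=float) + offset * n
    target = np.asarray(target, dtype=float)
    signed_dist = float(np.dot(target - barrier_point, n))
    if signed_dist >= 0.0:
        return target                         # already on the allowed side
    return target - signed_dist * n           # pull the command back onto the barrier

# A command 1 cm above the surface is held at the 5 cm barrier.
print(clamp_to_barrier([0.0, 0.0, 0.01], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]))
```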
Robotic livery printing system
The present disclosure provides a robotic printing system for printing images on the surface of an object. One exemplary system includes a printing module carried by a motion platform to directly eject printing materials on a surface. One aspect of this disclosure provides methods for accurately controlling the motion of the motion platform, generating accurate triggering signals for printing heads, and properly aligning adjacent swaths of an image.
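One way to picture the triggering aspect is position-based firing: the head is triggered each time the platform's measured position crosses the next raster column. The sketch below assumes an encoder read-out and a fire callback; these names, and the 600 dpi figure, are illustrative assumptions.

```python
# Rough sketch of position-based trigger generation for a print head.
def trigger_positions(swath_start_mm, swath_end_mm, dpi=600):
    """Yield the platform positions (mm) at which the head should fire."""
    pitch_mm = 25.4 / dpi                    # spacing between raster columns
    pos = swath_start_mm
    while pos <= swath_end_mm:
        yield pos
        pos += pitch_mm

def run_swath(encoder_read, fire_head, start_mm, end_mm):
    """Fire the head whenever the measured position passes the next trigger point."""
    for next_trigger in trigger_positions(start_mm, end_mm):
        while encoder_read() < next_trigger:
            pass                             # wait for the platform to reach the column
        fire_head()
```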
Systems and methods for managing a semantic map in a mobile robot
Described herein are systems, devices, and methods for maintaining a valid semantic map of an environment for a mobile robot. A mobile robot comprises a drive system, a sensor circuit to sense occupancy information, a memory, a controller circuit, and a communication system. The controller circuit can generate a first semantic map corresponding to a first robot mission using first occupancy information and first semantic annotations, and transfer the first semantic annotations to a second semantic map corresponding to a subsequent second robot mission. The controller circuit can generate the second semantic map such that it includes second semantic annotations generated based on the transferred first semantic annotations. User feedback on the first or the second semantic map can be received via the communication system. The controller circuit can update the first semantic map and use it to navigate the mobile robot in a future mission.
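A much-simplified sketch of the annotation transfer is given below, under the assumption that each annotation is a labelled grid cell and that only annotations whose cells remain free in the new map are kept; this is an illustration, not the patent's algorithm.

```python
# Simplified sketch: carry semantic annotations from one mission's map into
# the next, dropping annotations whose locations are no longer free space.
def transfer_annotations(first_annotations, second_occupancy):
    """first_annotations: {label: (x, y)}; second_occupancy: {(x, y): bool} free-space flags."""
    transferred = {}
    for label, cell in first_annotations.items():
        if second_occupancy.get(cell, False):    # location still free in the new map
            transferred[label] = cell
    return transferred

first = {"kitchen": (3, 4), "dock": (0, 0), "rug": (7, 2)}
second_map_free = {(3, 4): True, (0, 0): True, (7, 2): False}
print(transfer_annotations(first, second_map_free))   # 'rug' is dropped
```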
Grip device and robot device comprising same
A grip device is provided. A grip device according to an embodiment of the present disclosure includes: a first finger; a second finger facing the first finger; a first link part including a first guide slot and supporting the first finger; a second link part supporting the second finger, including a second guide slot, and intersecting the first link part; a hinge configured to move inside the first guide slot and the second guide slot and connecting the first link part and the second link part at an intersection point of the first link part and the second link part; a first actuator configured to adjust a distance between the first finger and the second finger by moving the first link part and/or the second link part; and a second actuator configured to move the hinge inside the first guide slot and the second guide slot.
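The crossed-link arrangement can be read as a scissor pair: the first actuator sets the spread on the driven side, and moving the hinge along the guide slots changes the transmission ratio to the finger side. The small geometric sketch below illustrates that relationship; the symmetric-link model and the dimensions are assumptions, not values from the disclosure.

```python
# Hedged geometric sketch of a symmetric crossed-link ("scissor") pair.
def finger_spread(actuator_spread, link_length, hinge_from_finger_end):
    """Spread between fingertips for a given spread on the actuator side.

    Moving the hinge along the guide slots (second actuator) changes the
    ratio a/b; the first actuator changes actuator_spread directly.
    """
    a = hinge_from_finger_end                # hinge distance from the finger end
    b = link_length - a                      # hinge distance from the driven end
    return actuator_spread * a / b

print(finger_spread(actuator_spread=0.04, link_length=0.10, hinge_from_finger_end=0.05))  # 1:1 ratio
print(finger_spread(actuator_spread=0.04, link_length=0.10, hinge_from_finger_end=0.03))  # reduced opening
```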
Devices, systems, and methods for robotic pipe handling
The present disclosure relates to systems and methods for automated drill pipe handling operations, such as trip in, trip out, and stand building operations. A pipe handling system of the present disclosure may include a lifting system for handling a load of a pipe stand, a pipe handling robot configured for engaging with the pipe stand and manipulating a position of the pipe stand, and a feedback device configured to provide information about a condition of the pipe stand, the lifting system, or the pipe handling robot. In some embodiments, the pipe handling robot may be a first robot configured for engaging with and manipulating a first end of the pipe stand, and the system may include a second pipe handling robot configured for engaging with and manipulating a second end of the pipe stand.
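As a loose illustration of how the feedback device might gate a handling step, the sketch below lifts one stand while two robots steady its ends and aborts on a reported fault; the interfaces and the "fault" flag are assumptions for illustration only.

```python
# Illustrative sketch: feedback-gated lift of a pipe stand handled at both ends.
def hoist_stand(lifting_system, upper_robot, lower_robot, feedback):
    """Lift one pipe stand while both robots steady its ends; abort on a fault."""
    upper_robot.engage()                     # grip the upper end of the stand
    lower_robot.engage()                     # grip the lower end of the stand
    lifting_system.begin_lift()
    while not lifting_system.lift_complete():
        status = feedback.read()             # condition of the stand, lift, or robots
        if status.get("fault"):
            lifting_system.stop()
            return False
    upper_robot.release()
    lower_robot.release()
    return True
```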
Intelligent robotic system for autonomous airport trolley collection
A robotic trolley collection system and methods for automatically collecting baggage/luggage trolleys are provided. The system includes a differential-driven mobile base; a manipulator mounted on the differential-driven mobile base for forking a trolley, having a structure the same as a head portion of the trolley; a sensory and measurement assembly for providing sensing and measurement dataflow; and a main processing case for processing the sensing and measurement dataflow provided by the sensory and measurement assembly and for controlling the differential-driven mobile base, the manipulator, and the sensory and measurement assembly. The method includes localizing and mapping the robotic trolley collection system; detecting an idle trolley to be collected and estimating a pose of the idle trolley; performing visual servoing control of the robotic trolley collection system; and issuing motion control commands to the robotic trolley collection system for automatically collecting the idle trolley.
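The collection cycle in this abstract can be outlined as a detect-servo-fork loop; the class and method names below are placeholders for illustration, not the system's actual interfaces.

```python
# High-level sketch of one trolley-collection cycle.
def collect_idle_trolley(localizer, detector, servo, base, manipulator):
    pose_in_map = localizer.update()                  # localize and map the robot
    trolley = detector.find_idle_trolley()            # detect an idle trolley to collect
    if trolley is None:
        return False
    trolley_pose = detector.estimate_pose(trolley)    # estimate the trolley's pose
    while not servo.aligned(pose_in_map, trolley_pose):
        cmd = servo.step(pose_in_map, trolley_pose)   # visual servoing control law
        base.send_velocity(cmd)                       # motion command to the mobile base
        pose_in_map = localizer.update()
    manipulator.fork(trolley_pose)                    # fork the trolley at its head portion
    return True
```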
Robot system
A robot system including a robot that is controlled by a robot controller and a wireless communication device that is worn or carried by a person present in the periphery of the robot. The wireless communication device has a sensor capable of detecting an acceleration and is configured to transmit information related to the acceleration to the robot controller in a state in which the wireless communication device is not operated by the person. The robot controller restricts operation of the robot when the acceleration exceeds a threshold.
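The threshold check itself is simple; the sketch below shows one way it might look on the controller side, with the 3 g threshold and the interfaces chosen as illustrative assumptions.

```python
# Minimal sketch: restrict robot operation when the wearable device reports
# an acceleration above a threshold.
import math

def on_acceleration_report(robot_controller, accel_xyz, threshold=3.0 * 9.81):
    """Handle an acceleration report from the worn wireless communication device."""
    magnitude = math.sqrt(sum(a * a for a in accel_xyz))
    if magnitude > threshold:
        robot_controller.restrict_operation()    # e.g. slow or stop the robot
        return True
    return False
```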
Method of detecting human and/or animal motion and performing mobile disinfection
Implementations of the disclosed subject matter provide a method of moving a mobile robot within an area. The movement of the mobile robot and the emission of ultraviolet (UV) light may be stopped when a human and/or animal is determined to be within the area. Using at least one sensor, the method may determine whether there is at least one of human identification, animal identification, motion, heat, and/or sound within the area for a predetermined period of time. When there is no human identification, animal identification, motion, heat, and/or sound within the predetermined period of time, UV light may be emitted and the drive system may be controlled to move the mobile robot within the area. When there is at least one of human identification, animal identification, motion, heat, and/or sound within the predetermined period of time, a light source may be controlled to prohibit the emission of UV light.
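A simplified sketch of that gating logic is given below: UV emission and motion resume only after the area has stayed clear for the full dwell period; the sensor and drive interfaces and the 30-second value are assumptions for illustration.

```python
# Simplified sketch: emit UV light and move only after the area has stayed
# clear of people/animals, motion, heat, and sound for a set period.
import time

def disinfection_step(sensors, uv_source, drive, clear_period_s=30.0):
    """Run one decision cycle of the mobile disinfection robot."""
    deadline = time.monotonic() + clear_period_s
    while time.monotonic() < deadline:
        if sensors.detects_presence():        # human/animal ID, motion, heat, or sound
            uv_source.off()                   # prohibit UV emission
            drive.stop()                      # stop moving within the area
            return False
        time.sleep(0.1)
    uv_source.on()                            # area stayed clear for the full period
    drive.resume_path()                       # continue moving the robot within the area
    return True
```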