G05B2219/39451

Augmented Reality System and Method for Conveying To a Human Operator Information Associated With Human-Imperceptible Indicia within an Operating Environment of a Robot
20220001538 · 2022-01-06

A robotic system comprising a robot, and human-imperceptible indicia associated with an object within an environment in which the robot operates, the human-imperceptible indicia comprising or linking to interaction information pertaining to a predetermined intended interaction of the robot with the object, the interaction information being operable to facilitate interaction with the object by the robot in accordance with the predetermined intended interaction. The system can further comprise at least one sensor operable to sense the human-imperceptible indicia and the interaction information, and an augmented reality system comprising a computer for conveying to a human operator human-understandable information associated with at least one of the interaction information or the linking information. The machine-readable indicia can comprise symbols that can be sensed by a sensor of the robot or the augmented reality system and interpreted by the robot or the augmented reality system. The robot can utilize a camera to transmit a real-world view of the operating environment to the augmented reality system, which can be combined with the human-understandable information to provide augmented reality operation of the robot.

Virtual Pipetting
20230333134 · 2023-10-19

A method for generating a control program for a laboratory automation device includes: receiving configuration data of the laboratory automation device; generating a three-dimensional model of the components of the laboratory automation device from the configuration data, the three-dimensional model additionally including a virtual pipette; displaying the three-dimensional model with a virtual reality headset; receiving movement data of a motion sensing controller controlled by a user wearing the virtual reality headset, the movement data indicating a three-dimensional movement of the motion sensing controller in space; determining a movement of the virtual pipette in the three-dimensional model from the movement data and updating the three-dimensional model according to the movement of the virtual pipette; and generating a control program for the laboratory automation device from the movement data.
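The claimed pipeline (recorded controller movement → control program) can be sketched in miniature. The `MovementSample` fields, the command tuples, and the trigger-to-aspirate mapping below are illustrative assumptions, not the patented format:

```python
from dataclasses import dataclass

@dataclass
class MovementSample:
    """One motion-sensing-controller reading: position in metres, trigger state."""
    x: float
    y: float
    z: float
    trigger: bool  # assumed aspirate/dispense button on the controller

def generate_control_program(samples, z_pick=0.02):
    """Translate recorded 3D controller movement into pipetting-arm commands.

    Hypothetical command format: ("MOVE", x, y, z) plus ("ASPIRATE",) or
    ("DISPENSE",) whenever the trigger changes state near the deck
    (z at or below z_pick metres).
    """
    program = []
    prev_trigger = False
    for s in samples:
        program.append(("MOVE", round(s.x, 4), round(s.y, 4), round(s.z, 4)))
        if s.trigger != prev_trigger and s.z <= z_pick:
            program.append(("ASPIRATE",) if s.trigger else ("DISPENSE",))
        prev_trigger = s.trigger
    return program
```

A real implementation would additionally map deck positions to labware wells from the configuration data; here the raw coordinates stand in for that step.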

Systems and Methods for Robotic Manipulation Using Extended Reality
20230286161 · 2023-09-14

A method of controlling a robot includes: receiving, by a computing device, from one or more sensors, sensor data reflecting an environment of the robot, the one or more sensors configured to have a field of view that spans at least 150 degrees with respect to a ground plane of the robot; providing, by the computing device, video output to an extended reality (XR) display usable by an operator of the robot, the video output reflecting the environment of the robot; receiving, by the computing device, movement information reflecting movement by the operator of the robot; and controlling, by the computing device, the robot to move based on the movement information.
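The final step, controlling the robot to move based on operator movement, might reduce to a clamped velocity mapping; the `(dx, dy, dyaw)` input and the command keys below are assumptions for illustration, not the claimed interface:

```python
def movement_to_command(operator_delta, max_speed=0.5):
    """Map operator movement (dx, dy, dyaw) captured via the XR system to a
    robot velocity command, clamped to a safe maximum speed.

    Names and scaling are illustrative; a real controller would also filter
    noise and apply the robot's kinematic limits.
    """
    dx, dy, dyaw = operator_delta
    clamp = lambda v: max(-max_speed, min(max_speed, v))
    return {"vx": clamp(dx), "vy": clamp(dy), "wz": clamp(dyaw)}
```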

System(s) and method(s) of using imitation learning in training and refining robotic control policies

Implementations described herein relate to training and refining robotic control policies using imitation learning techniques. A robotic control policy can be initially trained based on human demonstrations of various robotic tasks. Further, the robotic control policy can be refined based on human interventions while a robot is performing a robotic task. In some implementations, the robotic control policy may determine whether the robot will fail in performance of the robotic task, and prompt a human to intervene in performance of the robotic task. In additional or alternative implementations, a representation of the sequence of actions can be visually rendered for presentation to the human so that the human can proactively intervene in performance of the robotic task.
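A minimal sketch of the intervention loop described above, in the spirit of DAgger-style imitation learning (not the patented method itself); the stub `policy`, `env_step`, and `human_action` interfaces are hypothetical:

```python
def run_episode(policy, env_step, human_action, fail_threshold=0.5, horizon=10):
    """Intervention-driven refinement loop: when the policy's own failure
    estimate exceeds a threshold, a human-provided action is executed
    instead and logged for later retraining of the policy.
    """
    demo_buffer = []  # (state, corrective_action) pairs for refinement
    state = 0.0
    for _ in range(horizon):
        action, fail_prob = policy(state)     # policy also predicts failure
        if fail_prob > fail_threshold:
            action = human_action(state)       # human intervenes
            demo_buffer.append((state, action))  # store the correction
        state = env_step(state, action)
    return state, demo_buffer
```

Refinement then consists of retraining `policy` on `demo_buffer` mixed with the original demonstrations.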

Virtual pipetting
11747357 · 2023-09-05

A method for generating a control program (54) for a laboratory automation device (12) comprises: receiving configuration data (46) of the laboratory automation device (12), the configuration data (46) encoding positions of components (22) in the laboratory automation device (12); generating a three-dimensional model (58) of the components (22) of the laboratory automation device (12) from the configuration data (46), the three-dimensional model (58) additionally including a virtual pipette (60); displaying the three-dimensional model (58) with a virtual reality headset (14); receiving movement data (50) of a motion sensing controller (16) controlled by a user wearing the virtual reality headset (14), the movement data (50) indicating a three-dimensional movement of the motion sensing controller (16) in space; determining a movement of the virtual pipette (60) from the movement data (50) in the three-dimensional model (58) and updating the three-dimensional model (58) according to the movement of the virtual pipette (60); and generating a control program (54) for the laboratory automation device (12) from the movement data (50), wherein the control program (54) is adapted for moving a pipetting arm (30) with a pipette (32) of the laboratory automation device (12) with respect to the components (22) according to the movement of the virtual pipette (60) in the three-dimensional model (58).

SYSTEM AND METHOD FOR ASSISTING OPERATOR ENGAGEMENT WITH INPUT DEVICES
20230019316 · 2023-01-19

Systems and methods of assisting operator engagement with input devices include an input device configured to be operated by a hand of an operator, a repositionable structure coupled to the input device, a hand detection system, and a control unit. The control unit is configured to detect a position and an orientation of the hand using the hand detection system, determine, based on the position of the hand, a target position for the input device, wherein moving the input device from a current position of the input device to the target position moves the input device closer to a grasping position for the hand, and in response to determining that an orientation difference between the orientation of the hand and a current orientation of the input device is not greater than a threshold orientation difference, cause one or more actuators to move the input device toward the target position.
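One control-unit update of the kind described might look as follows; this is a simplified sketch (yaw-only orientation, linear step toward the target) with illustrative parameter values, not the claimed control law:

```python
import math

def reposition_step(hand_pos, hand_yaw, dev_pos, dev_yaw,
                    max_angle=math.radians(30), step=0.2):
    """One update of the input-device repositioning controller (illustrative):
    if the hand/device orientation difference is within the threshold, move
    the device a fraction of the way toward the hand's grasping position;
    otherwise hold still.
    """
    # wrap the yaw difference into [-pi, pi] and take its magnitude
    diff = abs((hand_yaw - dev_yaw + math.pi) % (2 * math.pi) - math.pi)
    if diff > max_angle:
        return dev_pos  # orientation mismatch exceeds threshold: do not move
    return tuple(d + step * (h - d) for h, d in zip(hand_pos, dev_pos))
```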

ROBOTIC CONTROL VIA A VIRTUAL WORLD SIMULATION
20230367289 · 2023-11-16

A system has a virtual-world (VW) controller and a physical-world (PW) controller. The pairing of a PW element with a VW element establishes them as corresponding physical and virtual twins. The VW controller and/or the PW controller receives measurements from one or more sensors characterizing aspects of the physical world, the VW controller generates the virtual twin, and the VW controller and/or the PW controller generates commands for one or more actuators affecting aspects of the physical world. To coordinate the corresponding virtual and physical twins, (i) the VW controller controls the virtual twin based on the physical twin or (ii) the PW controller controls the physical twin based on the virtual twin. Depending on the operating mode, one of the VW and PW controllers is a master controller, and the other is a slave controller, where the virtual and physical twins are both controlled based on one of VW or PW forces.
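The twin coordination can be illustrated in one degree of freedom: whichever controller is master applies the command to its twin, and the other twin is updated to match. This is a toy sketch of the pairing idea, not the patented architecture:

```python
class Twin:
    """A paired element with one scalar state (e.g. a joint position)."""
    def __init__(self, position=0.0):
        self.position = position

def synchronize(master, slave, command):
    """Apply a motion command to the master twin, then mirror the resulting
    state onto the slave twin so the physical and virtual elements stay
    paired. Pass the VW twin as master or the PW twin as master depending
    on the operating mode.
    """
    master.position += command
    slave.position = master.position
    return master.position
```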

System and method for robot teaching based on RGB-D images and teach pendant

A system for robot teaching based on RGB-D images and a teach pendant, including an RGB-D camera, a host computer, a posture teach pendant, and an AR teaching system which includes an AR registration card, an AR module, a virtual robot model, a path planning unit and a posture teaching unit. The RGB-D camera collects RGB images and depth images of a physical working environment in real time. In the path planning unit, path points of a robot end effector are selected, and the 3D coordinates of the path points in the base coordinate system of the virtual robot model are calculated; the posture teaching unit records the received posture data as the postures of the path points where the virtual robot model is located, so that the virtual robot model is driven to move according to the postures and positions of the path points, thereby completing the robot teaching.
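Computing 3D coordinates of selected path points from RGB-D pixels typically uses pinhole back-projection; a minimal version, assuming known camera intrinsics (focal lengths fx, fy and principal point cx, cy) and leaving out the subsequent transform into the robot base frame:

```python
def pixel_to_3d(u, v, depth, fx, fy, cx, cy):
    """Back-project an RGB-D pixel (u, v) with depth (metres) into
    camera-frame 3D coordinates using the standard pinhole camera model.
    """
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

In the described system, the result would then be transformed into the virtual robot model's base coordinate system via the AR registration.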

Collaborative operation support device
11458618 · 2022-10-04

The collaborative operation support device includes a display device including a display area; and a processor configured to detect, based on an image in which the operator or the robot is represented, a position of a section of the robot in the display area when the operator looks at the robot through the display area, the section associated with an operation mode of the robot specified by means of an input device; select, in accordance with the specified operation mode of the robot, display data corresponding to the specified mode among display data stored in a memory; and display the selected display data in the display area of the display device in such a way that the selected display data is displayed at a position that satisfies a certain positional relationship with the position of the section of the robot in the display area.
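The "certain positional relationship" between the detected robot section and the displayed data could be as simple as a fixed offset clamped to the display area; the offset and resolution below are illustrative assumptions:

```python
def overlay_position(section_xy, offset=(0.0, -40.0), bounds=(1920, 1080)):
    """Place display data at a fixed offset above the detected robot section,
    clamped so it stays inside the display area (illustrative positional
    relationship; units are display pixels).
    """
    x = min(max(section_xy[0] + offset[0], 0), bounds[0])
    y = min(max(section_xy[1] + offset[1], 0), bounds[1])
    return (x, y)
```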

Robot system
11400602 · 2022-08-02

The present invention provides a robot system capable of precisely measuring the positional relationship between an AR device and markers regardless of the type of the robot, and of relatively easily and precisely recognizing the position or orientation of the robot with the AR device. The robot system includes a marker detecting unit that simultaneously detects a reference marker and a robot coordinate system identification marker in one detection operation; a robot system information receiving unit that receives information regarding the robot system; a robot coordinate system identifying unit that identifies a coordinate system of the robot from the position of the robot coordinate system identification marker and coordinate system information; an AR device that displays the information regarding the robot system, based on the coordinate system of the robot; a coordinate system setting unit that sets an origin by moving the robot to a designated position; and a coordinate system information transmission unit that transmits the coordinate system information set by the coordinate system setting unit to the AR device.
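Locating the robot in AR device coordinates from the two marker detections amounts to composing rigid transforms. The claimed system operates in 3D; the sketch below uses 2D homogeneous transforms to keep the composition T_ar_robot = T_ar_marker · T_robot_marker⁻¹ readable, and all function names are illustrative:

```python
import math

def make_T(theta, tx, ty):
    """2D homogeneous transform: rotation by theta, then translation (tx, ty)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, tx], [s, c, ty], [0.0, 0.0, 1.0]]

def matmul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def invert(T):
    """Invert a rigid 2D transform: R^T and -R^T * t."""
    c, s, tx, ty = T[0][0], T[1][0], T[0][2], T[1][2]
    return [[c, s, -(c * tx + s * ty)],
            [-s, c, s * tx - c * ty],
            [0.0, 0.0, 1.0]]

def robot_in_ar(T_ar_marker, T_robot_marker):
    """Pose of the robot base in AR device coordinates, composed from the
    AR device's view of the marker and the robot's known pose of the same
    marker: T_ar_robot = T_ar_marker @ inverse(T_robot_marker).
    """
    return matmul(T_ar_marker, invert(T_robot_marker))
```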