Patent classifications
G05B2219/39014
User interface for a teleoperated robot
A teleoperated robotic system that utilizes a graphical user interface (GUI) to perform work on one or more workpieces using a robot. A coordinate system of the GUI is correlated to the tool center point (TCP) of the robot and to the TCP or workspace of a teleoperated member, such as a haptic joystick. Manipulation of the teleoperated member is correlated to movement at a particular location in the robot station, such as movement of the TCP of the robot. The GUI can also provide digital representations of the workpiece, which are based on inputted and/or scanned information relating to a reference workpiece and/or the particular workpiece on which the robot is performing work. The GUI can further indicate the various stages of assembly of the workpiece, as well as work already performed, or yet to be performed, on the workpiece.
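The correlation between the teleoperated member's workspace and the robot's TCP can be sketched as a frame rotation plus a motion-scaling factor. This is an illustrative assumption, not the patent's actual mapping; the function name, rotation, and scale are hypothetical.

```python
import numpy as np

# Hypothetical sketch: map a haptic-joystick displacement (expressed in the
# GUI / joystick workspace frame) to a TCP displacement in the robot station
# frame. 'workspace_to_station' and 'scale' are illustrative parameters.

def correlate_tcp_motion(joystick_delta, workspace_to_station, scale=0.5):
    """Return the robot TCP displacement for a teleoperated-member motion."""
    joystick_delta = np.asarray(joystick_delta, dtype=float)
    return scale * workspace_to_station @ joystick_delta

# Example: the joystick frame is rotated 90 degrees about z relative to the
# robot station frame, and motion is scaled down by half.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
tcp_delta = correlate_tcp_motion([0.02, 0.0, 0.0], R)  # 2 cm joystick push
```

A 2 cm push along the joystick's x axis thus becomes a 1 cm TCP move along the station's y axis.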
DEVICE CONTROL SYSTEM
A sensor unit outputs a real sensor signal corresponding to a measured physical quantity. A controller controls a control target device in an environmental domain on the basis of the real sensor signal. The controller (a) performs a simulation of a specific phenomenon in the environmental domain under a simulation condition, and virtually measures the specific physical quantity on the basis of a simulation result, thereby generating a virtual sensor signal; (b) changes the simulation condition on the basis of the real sensor signal and the virtual sensor signal; and (c) generates a control signal for the control target device on the basis of the simulation condition and/or the simulation result. The simulation, the generation of the virtual sensor signal, and the change of the simulation condition are performed iteratively until the sensor error between the real sensor signal and the virtual sensor signal converges.
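The iterate-until-converged loop of steps (a) and (b) can be sketched as follows. The stand-in simulation (a single condition producing twice its value as the virtual signal) and the proportional update rule are assumptions for illustration, not the patent's actual model or algorithm.

```python
# Toy simulation: one unknown condition 'c' yields virtual signal c * 2.0.

def simulate(condition):
    return condition * 2.0  # virtual sensor signal from the simulation result

def converge_condition(real_signal, condition=0.0, gain=0.25, tol=1e-6,
                       max_iter=1000):
    """Iterate simulation and condition update until the sensor error
    between real and virtual signals converges below 'tol'."""
    virtual_signal = simulate(condition)
    for _ in range(max_iter):
        virtual_signal = simulate(condition)          # step (a)
        error = real_signal - virtual_signal
        if abs(error) < tol:
            break
        condition += gain * error                     # step (b)
    return condition, virtual_signal

cond, virt = converge_condition(real_signal=3.0)
```

With this toy model the error shrinks geometrically each iteration, so the condition settles at the value whose virtual signal matches the real one.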
TEACHING DEVICE, TEACHING METHOD, AND ROBOT SYSTEM
A teaching device constructs, in a virtual space, a virtual robot system in which a virtual 3D model of a robot and a virtual 3D model of a peripheral structure of the robot are arranged, and teaches a moving path of the robot. The teaching device includes an acquisition unit configured to acquire information about a geometric error between the virtual 3D models, and a correction unit configured to correct the moving path of the robot in accordance with the information acquired by the acquisition unit.
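A minimal sketch of the correction unit's job, reducing the geometric error between the virtual 3D models to a constant translational offset applied to every taught waypoint. A real system would also handle rotational error; the function name and sign convention are assumptions.

```python
# Correct each waypoint of the taught moving path by the measured geometric
# error between the virtual 3D models (translation-only toy model).

def correct_path(path, geometric_error):
    """Shift every (x, y, z) waypoint to compensate the model error."""
    dx, dy, dz = geometric_error
    return [(x - dx, y - dy, z - dz) for (x, y, z) in path]

corrected = correct_path([(100.0, 0.0, 50.0), (120.0, 10.0, 50.0)],
                         geometric_error=(2.0, -1.0, 0.0))
```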
METHOD AND SYSTEM FOR PROGRAMMING A COBOT FOR A PLURALITY OF INDUSTRIAL CELLS
Systems and a method for programming a cobot for a plurality of cells of an industrial environment. A physical cobot is provided within a lab cell comprising lab physical objects. A virtual simulation system with a user interface is provided. The virtual simulation system receives information inputs on a virtual cobot, on the virtual lab cell comprising lab virtual objects, and on a plurality of virtual industrial cells comprising virtual industrial objects. The virtual cobot and the physical cobot are connected together. A superimposed meta-cell is generated by superimposing the plurality of virtual industrial cells and the virtual lab cell so as to obtain a single superimposed meta-cell including a set of superimposed virtual objects. The virtual cobot is positioned in the superimposed meta-cell. During teaching, inputs are received from the physical cobot's movement as the physical cobot is moved in the lab cell to the desired position(s), while the user interface provides a visualization of the virtual cobot's movement within the superimposed meta-cell so that collisions with any object are minimized. A robotic program is generated based on the received inputs of the physical cobot's movement.
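The meta-cell idea can be sketched as a union of obstacle sets: a taught position that clears the superimposed meta-cell clears every individual cell. Objects are reduced to axis-aligned bounding boxes here, and all names are illustrative assumptions.

```python
# Superimpose several cells' objects into one meta-cell obstacle set.

def superimpose(*cells):
    meta = []
    for cell in cells:
        meta.extend(cell)
    return meta

def collides(point, box):
    """True if 'point' lies inside an axis-aligned (lo, hi) bounding box."""
    lo, hi = box
    return all(l <= p <= h for p, l, h in zip(point, lo, hi))

lab_cell = [((0, 0, 0), (1, 1, 1))]          # one lab object
industrial_cell = [((2, 2, 0), (3, 3, 1))]   # one industrial object
meta_cell = superimpose(lab_cell, industrial_cell)

# A taught position is safe only if it avoids every superimposed object.
safe = not any(collides((1.5, 1.5, 0.5), box) for box in meta_cell)
```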
ROBOT CALIBRATION FOR AR AND DIGITAL TWIN
A method and system for calibration of an augmented reality (AR) device's position and orientation based on a robot's positional configuration. A conventional visual calibration target is not required for AR device calibration. Instead, the robot itself, in any pose, is used as a three dimensional (3D) calibration target. The AR system is provided with a CAD model of the entire robot to use as a reference frame, and 3D models of the individual robot arms are combined into a single object model based on joint positions known from the robot controller. The 3D surface model of the entire robot in the current pose is then used for visual calibration of the AR system by analyzing images from the AR device camera in comparison to the surface model of the robot in the current pose. The technique is applicable to initial AR device calibration and to ongoing device tracking.
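Combining the individual arm models into a single object model from known joint positions amounts to forward kinematics: each link's geometry is placed by the accumulated joint transforms. The planar two-joint toy below is an illustrative assumption, not the patent's CAD pipeline.

```python
import math

# Accumulate planar joint rotations to pose each link of a 2-DOF arm, the
# way per-link 3D models would be posed into one combined model.

def link_transforms(joint_angles, link_lengths):
    """Return each link base's (accumulated angle, origin) and the TCP."""
    poses, angle, x, y = [], 0.0, 0.0, 0.0
    for theta, length in zip(joint_angles, link_lengths):
        angle += theta
        poses.append((angle, (x, y)))
        x += length * math.cos(angle)
        y += length * math.sin(angle)
    return poses, (x, y)

# Elbow arm: first joint at +90 deg, second at -90 deg, unit-length links.
poses, tcp = link_transforms([math.pi / 2, -math.pi / 2], [1.0, 1.0])
```

Posing every link model with its accumulated transform yields the single surface model of the robot in its current pose, which then serves as the 3D calibration target.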
Methods, apparatus, computer programs and non-transitory computer readable storage mediums for controlling a robot within a volume
A method of controlling a robot within a volume, the method comprising: receiving a three dimensional model including a model of the robot and a model of the volume in which the robot is configured to move; defining a plurality of positions, identified by an operator, within the model of the volume to which the robot is moveable; receiving scanned three dimensional data of the robot and at least a part of the volume; determining a transformation algorithm using the three dimensional model and the scanned three dimensional data; applying the transformation algorithm to one or more positions of the plurality of positions to provide one or more transformed positions; and controlling movement of the robot using one or more of the transformed positions.
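One common way to determine such a transformation from a model and scanned data is rigid registration over corresponding points (the Kabsch/Procrustes method), then applying the recovered rotation and translation to the taught positions. The method choice and all names below are assumptions for illustration, not the patent's specified algorithm.

```python
import numpy as np

def estimate_rigid_transform(model_pts, scan_pts):
    """Kabsch: least-squares rotation R and translation t mapping
    model points onto corresponding scanned points."""
    P, Q = np.asarray(model_pts, float), np.asarray(scan_pts, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

def transform_positions(positions, R, t):
    return [tuple(R @ np.asarray(p, float) + t) for p in positions]

model = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
# Scanned data: the same geometry translated by (5, 0, 0), no rotation.
scan = [(5, 0, 0), (6, 0, 0), (5, 1, 0), (5, 0, 1)]
R, t = estimate_rigid_transform(model, scan)
moved = transform_positions([(0.5, 0.5, 0.0)], R, t)
```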
PROCESS IMAGE WITHIN CONTROLLERS ENABLING VISIBILITY AND ACCESSIBILITY OF REAL WORLD OBJECTS
A system for using digital twins to interact with physical objects in an automation system includes a plurality of controller devices, a process image backbone, and a registry comprising a plurality of digital twins. Each respective controller device comprises a volatile computer-readable storage medium comprising a process image area. The process image backbone provides the controller devices with uniform access to the process image area of each controller. Each digital twin in the registry corresponds to a physical device controllable via one of the controller devices through a corresponding process image area.
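The registry pattern can be sketched as twins that read and write their device's state through a shared process image. The class and attribute names below are illustrative assumptions, not the patent's interfaces.

```python
# Minimal sketch: a process image area plus a registry of digital twins
# that reach physical devices through addresses in that area.

class ProcessImage:
    def __init__(self):
        self._area = {}
    def read(self, addr):
        return self._area.get(addr, 0)
    def write(self, addr, value):
        self._area[addr] = value

class DigitalTwin:
    def __init__(self, name, process_image, addr):
        self.name, self._pi, self._addr = name, process_image, addr
    @property
    def state(self):                  # visibility of the real-world object
        return self._pi.read(self._addr)
    def command(self, value):         # accessibility of the real-world object
        self._pi.write(self._addr, value)

pi = ProcessImage()
registry = {"conveyor_1": DigitalTwin("conveyor_1", pi, addr=0x10)}
registry["conveyor_1"].command(1)     # e.g. start the conveyor
state = registry["conveyor_1"].state
```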
INTERACTION WITH PHYSICAL OBJECTS AS PROXY OBJECTS REPRESENTING VIRTUAL OBJECTS
Systems and techniques for enabling interaction with physical objects as proxy objects representing virtual objects are provided herein. Virtual reality application data associated with a virtual reality application executed on a virtual reality device, a first virtual reality object data associated with a first virtual reality object from the virtual reality application, and virtual reality event data associated with one or more events from the virtual reality application may be received. Robotic arms including a robotic hand may grasp a first physical object which corresponds to the first virtual reality object of the virtual reality application. Sensors may detect a user interaction with the first physical object. Force feedback instructions commanding the robotic arms to move while maintaining grasp of the first physical object may be generated and executed based on detecting the user interaction with the first physical object and based on the virtual reality event data.
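The interaction-to-feedback step can be sketched as an event handler: a detected user interaction with the grasped proxy, combined with the current virtual reality event data, yields an arm instruction. The dictionary schema and event names are purely illustrative assumptions.

```python
# Toy handler: turn a detected user interaction plus a VR event into a
# force-feedback instruction for the robotic arms holding the proxy object.

def force_feedback_instructions(interaction, vr_event):
    if interaction["type"] == "push" and vr_event == "object_movable":
        # Yield to the user's push: move along the interaction direction
        # while keeping the grasp on the physical proxy object.
        return {"move": interaction["direction"], "maintain_grasp": True}
    # Otherwise hold position so the proxy feels rigid to the user.
    return {"move": (0.0, 0.0, 0.0), "maintain_grasp": True}

cmd = force_feedback_instructions(
    {"type": "push", "direction": (0.0, 1.0, 0.0)},
    vr_event="object_movable",
)
```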
VIRTUAL PIPETTING
A method for generating a control program (54) for a laboratory automation device (12) comprises: receiving configuration data (46) of the laboratory automation device (12), the configuration data (46) encoding positions of components (22) in the laboratory automation device (12); generating a three-dimensional model (58) of the components (22) of the laboratory automation device (12) from the configuration data (46), the three-dimensional model (58) additionally including a virtual pipette (60); displaying the three-dimensional model (58) with a virtual reality headset (14); receiving movement data (50) of a motion sensing controller (16) controlled by a user wearing the virtual reality headset (14), the movement data (50) indicating a three-dimensional movement of the motion sensing controller (16) in space; determining a movement of the virtual pipette (60) from the movement data (50) in the three-dimensional model (58) and updating the three-dimensional model (58) according to the movement of the virtual pipette (60); and generating a control program (54) for the laboratory automation device (12) from the movement data (50), wherein the control program (54) is adapted for moving a pipetting arm (30) with a pipette (32) of the laboratory automation device (12) with respect to the components (22) according to the movement of the virtual pipette (60) in the three-dimensional model (58).
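The final step, turning controller movement data into a control program, can be sketched as converting successive controller positions into relative move commands for the pipetting arm. The command tuple format and scaling are illustrative assumptions, not the patent's program representation.

```python
# Convert a sequence of motion-sensing-controller positions into relative
# MOVE commands for the pipetting arm (toy command format).

def generate_control_program(movement_data, scale=1.0):
    """Each consecutive pair of positions becomes one relative move."""
    program = []
    prev = movement_data[0]
    for pos in movement_data[1:]:
        delta = tuple(scale * (b - a) for a, b in zip(prev, pos))
        program.append(("MOVE_REL", delta))
        prev = pos
    return program

prog = generate_control_program([(0.0, 0.0, 0.0),
                                 (0.0, 0.0, -0.01),   # lower toward a well
                                 (0.1, 0.0, -0.01)])  # slide to next position
```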