G05B2219/39014

Process image within controllers enabling visibility and accessibility of real world objects

A system for using digital twins to interact with physical objects in an automation system includes a plurality of controller devices, a process image backbone, and a registry comprising a plurality of digital twins. Each respective controller device comprises a volatile computer-readable storage medium comprising a process image area. The process image backbone provides the controller devices with uniform access to the process image area of each controller. Each digital twin in the registry corresponds to a physical device controllable by one of the controller devices through its corresponding process image area.
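
The relationship the abstract describes, a backbone giving uniform access to per-controller process image areas, with a registry mapping twins onto them, can be sketched as follows. This is a minimal illustrative model; all class and method names are hypothetical, not from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessImageArea:
    """Volatile memory region holding a controller's I/O snapshot."""
    values: dict = field(default_factory=dict)

@dataclass
class Controller:
    name: str
    process_image: ProcessImageArea = field(default_factory=ProcessImageArea)

class ProcessImageBackbone:
    """Gives every controller uniform access to every process image area."""
    def __init__(self, controllers):
        self._areas = {c.name: c.process_image for c in controllers}

    def read(self, controller_name, tag):
        return self._areas[controller_name].values[tag]

    def write(self, controller_name, tag, value):
        self._areas[controller_name].values[tag] = value

class DigitalTwinRegistry:
    """Maps each digital twin to the controller whose process image area
    exposes the state of the corresponding physical device."""
    def __init__(self, backbone):
        self._backbone = backbone
        self._twins = {}

    def register(self, twin_id, controller_name):
        self._twins[twin_id] = controller_name

    def state_of(self, twin_id, tag):
        return self._backbone.read(self._twins[twin_id], tag)
```

In this sketch, reading a twin's state never touches the physical device directly; it goes through the backbone to the owning controller's process image area, mirroring the uniform-access idea in the abstract.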

Automated calibration system and method for a workpiece coordinate frame of a robot

An automated calibration system for a workpiece coordinate frame of a robot includes a physical image sensor having a first image central axis, and a controller that rotates the physical image sensor, mounted on the robot, by an angle to set up a virtual image sensor having a second image central axis. The first and second image central axes intersect at an intersection point. The controller moves the robot so that a characteristic point on the workpiece travels back and forth between the two axes until the characteristic point overlaps the intersection point, then records a calibration point comprising the coordinates of the robot's joints. The controller repeats this procedure for further characteristic points to generate several more calibration points. From the calibration points, the controller calculates the coordinates of a virtual tool center point and of the workpiece relative to the robot.
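
The core iterative step, nudging a characteristic point until it overlaps the intersection of the two image central axes, can be sketched as a simple servoing loop. The callback names and the assumption that the offset is directly observable in robot coordinates are illustrative, not the patent's method.

```python
import numpy as np

def center_on_intersection(observe_offset, move_robot, tol=1e-3, max_iter=100):
    """Drive a characteristic point onto the axes' intersection.

    observe_offset() -> 2-D offset of the point from the intersection
    (assumed expressed in robot coordinates); move_robot(delta) applies a
    corrective motion. Returns True once the point overlaps the
    intersection within tol, at which point a calibration point
    (the robot's joint coordinates) would be recorded.
    """
    for _ in range(max_iter):
        offset = observe_offset()
        if np.linalg.norm(offset) < tol:
            return True  # overlap reached: record a calibration point here
        move_robot(-offset)  # step back toward the intersection
    return False
```

Repeating this loop for several characteristic points yields the set of calibration points from which the relative coordinates are computed.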

Method and system for programming a cobot for a plurality of industrial cells

Systems and a method are provided for programming a cobot for a plurality of cells of an industrial environment. A physical cobot is provided within a lab cell comprising physical lab objects. A virtual simulation system receives inputs on a virtual cobot representing the physical cobot, on a virtual lab cell comprising virtual lab objects, and on a plurality of virtual industrial cells comprising virtual industrial objects. During teaching, the physical cobot is moved in the lab cell to the desired position(s), and the resulting movement inputs are received while a user interface visualizes the virtual cobot's movement within a meta cell, generated by superimposing the plurality of virtual industrial cells with the virtual lab cell, so that collisions with any object are minimized. A robotic program is generated based on the received inputs of the physical cobot's movement.
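
The "meta cell" idea, superimposing every virtual industrial cell with the virtual lab cell so a taught pose can be checked against all objects at once, can be sketched like this. Geometry is reduced to spheres (center, radius) purely for illustration; the names are hypothetical.

```python
import numpy as np

def build_meta_cell(lab_objects, industrial_cells):
    """Superimpose the lab cell and all industrial cells into one
    obstacle list of (center, radius) pairs."""
    meta = list(lab_objects)
    for cell in industrial_cells:
        meta.extend(cell)
    return meta

def collides(point, meta_cell, clearance=0.0):
    """True if the taught point intrudes on any object in the meta cell."""
    return any(np.linalg.norm(np.asarray(point) - np.asarray(c)) < r + clearance
               for c, r in meta_cell)
```

A pose that is collision-free in the meta cell is collision-free in every individual cell, which is what lets one teaching session serve a plurality of industrial cells.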

Interaction with physical objects as proxy objects representing virtual objects

Systems and techniques for enabling interaction with physical objects as proxy objects representing virtual objects are provided herein. Virtual reality application data associated with a virtual reality application executed on a virtual reality device, first virtual reality object data associated with a first virtual reality object from the virtual reality application, and virtual reality event data associated with one or more events from the virtual reality application may be received. Robotic arms including a robotic hand may grasp a first physical object corresponding to the first virtual reality object of the virtual reality application. Sensors may detect a user interaction with the first physical object. Based on the detected user interaction and on the virtual reality event data, force feedback instructions commanding the robotic arms to move while maintaining their grasp of the first physical object may be generated and executed.

IN-HAND OBJECT POSE TRACKING
20210122045 · 2021-04-29

Apparatuses, systems, and techniques are described that estimate the pose of an object while the object is being manipulated by a robotic appendage. In at least one embodiment, a sample-based optimization algorithm tracks in-hand object poses during manipulation using contact feedback and a GPU-accelerated robotic simulation. In at least one embodiment, parallel simulations concurrently model object pose changes that may be caused by complex contact dynamics. In at least one embodiment, the optimization algorithm tunes simulation parameters during object pose tracking to further improve tracking performance. In various embodiments, real-world contact sensing may be improved by utilizing vision in the loop.
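
A sample-based tracking step of this general flavor can be sketched as follows: many candidate poses are perturbed, each is scored against the observed contact signal via a simulation callback, and the best-matching candidates are resampled. The toy one-dimensional "pose", the scoring rule, and all names are illustrative assumptions, not the patent's algorithm.

```python
import numpy as np

def track_step(samples, observed_contact, simulate, rng, noise=0.05):
    """One tracking step over a population of candidate poses.

    samples: 1-D array of candidate poses; simulate(pose) -> predicted
    contact signal for that pose (stand-in for a parallel simulation).
    """
    # Perturb every candidate, as parallel simulations would explore
    # nearby pose hypotheses.
    candidates = samples + rng.normal(0.0, noise, samples.shape)
    predicted = np.array([simulate(c) for c in candidates])
    # Score candidates by agreement with the observed contact feedback.
    errors = np.abs(predicted - observed_contact)
    weights = np.exp(-errors / (errors.mean() + 1e-9))
    weights /= weights.sum()
    # Resample in proportion to the scores.
    idx = rng.choice(len(candidates), size=len(candidates), p=weights)
    return candidates[idx]
```

Iterating this step concentrates the sample population around poses whose simulated contacts agree with the real sensor readings.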

CONTROL DEVICE, CONTROL METHOD, AND PROGRAM

Provided is a control device including a control unit that outputs information for controlling the output of a scent on the basis of a result of recognizing a hand.

SYSTEM AND METHOD FOR ROBOT TEACHING BASED ON RGB-D IMAGES AND TEACH PENDANT
20210023694 · 2021-01-28

A system for robot teaching based on RGB-D images and a teach pendant, including an RGB-D camera, a host computer, a posture teach pendant, and an AR teaching system comprising an AR registration card, an AR module, a virtual robot model, a path planning unit, and a posture teaching unit. The RGB-D camera collects RGB images and depth images of a physical working environment in real time. In the path planning unit, path points of a robot end effector are selected, and the 3D coordinates of the path points in the base coordinate system of the virtual robot model are calculated. The posture teaching unit records the received posture data as the posture of the path point where the virtual robot model is located, so that the virtual robot model is driven to move according to the postures and positions of the path points, thereby completing the robot teaching.
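
Turning a selected RGB-D path point (pixel u, v with a depth reading) into a 3D coordinate is standard pinhole back-projection, sketched below with hypothetical intrinsic parameters. The patent's pipeline would additionally transform the result into the virtual robot model's base coordinate system via the AR registration, which is omitted here.

```python
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Back-project a pixel with depth into camera coordinates (pinhole model).

    fx, fy: focal lengths in pixels; cx, cy: principal point.
    """
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])
```

A pixel at the principal point maps straight down the optical axis; off-center pixels fan out in proportion to depth.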

Control System and Control Method
20200398435 · 2020-12-24

A control device estimates a position and pose of an imaging device relative to a robot based on an image of the robot captured by the imaging device. A simulation device arranges a robot model at a teaching point and generates a simulation image of the robot model captured by a virtual camera, which is arranged so that its position and pose relative to the robot model in the virtual space coincide with the estimated position and pose of the imaging device. The control device determines an amount of correction of the position and pose of the robot for the teaching point so that the position and pose of the robot in the actual image, captured after the robot has been driven to the teaching point according to a movement command, approximate the position and pose of the robot model in the simulation image.
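
The correction step amounts to shifting the teaching point by the discrepancy between the robot's pose in the simulation image and its pose in the actual image. A minimal sketch, with hypothetical names and pose vectors reduced to translation only:

```python
import numpy as np

def correct_teaching_point(teaching_point, pose_on_sim_image,
                           pose_on_actual_image, gain=1.0):
    """Shift the teaching point by the sim-vs-actual pose discrepancy.

    A gain below 1.0 would apply the correction gradually over
    repeated drive-and-observe cycles.
    """
    correction = gain * (np.asarray(pose_on_sim_image)
                         - np.asarray(pose_on_actual_image))
    return np.asarray(teaching_point) + correction
```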

ROBOTIC CONTROL VIA A VIRTUAL WORLD SIMULATION
20200249654 · 2020-08-06

A system has a virtual-world (VW) controller and a physical-world (PW) controller. The pairing of a PW element with a VW element establishes them as corresponding physical and virtual twins. The VW controller and/or the PW controller receives measurements from one or more sensors characterizing aspects of the physical world, the VW controller generates the virtual twin, and the VW controller and/or the PW controller generates commands for one or more actuators affecting aspects of the physical world. To coordinate the corresponding virtual and physical twins, (i) the VW controller controls the virtual twin based on the physical twin or (ii) the PW controller controls the physical twin based on the virtual twin. Depending on the operating mode, one of the VW and PW controllers is a master controller and the other is a slave controller, and the virtual and physical twins are both controlled based on either VW or PW forces.
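
The coordination rule, both twins following the forces of whichever world is currently master, can be sketched as below. The enum, class, and the scalar "force" are illustrative assumptions, not the patent's design.

```python
from enum import Enum

class Mode(Enum):
    VW_MASTER = "virtual-world master"
    PW_MASTER = "physical-world master"

class TwinPair:
    """A physical twin and its virtual twin, coordinated by one master."""
    def __init__(self, mode):
        self.mode = mode
        self.virtual_state = 0.0
        self.physical_state = 0.0

    def apply_forces(self, vw_force, pw_force):
        """Both twins follow the force from whichever world is master;
        the slave world's force is ignored."""
        force = vw_force if self.mode is Mode.VW_MASTER else pw_force
        self.virtual_state += force
        self.physical_state += force
```

Switching the mode flips which world drives the pair without changing the twins themselves.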