Patent classifications
G05B2219/40131
VIRTUAL PRESENCE FOR TELEROBOTICS IN A DYNAMIC SCENE
Described herein are methods and systems for providing virtual presence for telerobotics in a dynamic scene. A sensor captures frames of a scene comprising one or more objects. A computing device generates a set of feature points corresponding to objects in the scene and matches the set of feature points to 3D points in a map of the scene. The computing device generates a dense mesh of the scene and the objects using the matched feature points and transmits the dense mesh and the frame to a remote viewing device. The remote viewing device generates a 3D representation of the scene and the objects for display to a user and receives commands from the user corresponding to interaction with the 3D representation of the scene. The remote viewing device transmits the commands to a robot device that executes the commands to perform operations on the objects in the scene.
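The feature-matching and meshing steps above can be sketched in miniature. This is a hedged toy illustration, not the patented method: the classes `MapPoint` and `FeaturePoint`, the id-based matching (standing in for real descriptor matching), and the one-vertex-per-point "mesh" are all assumptions made for brevity.

```python
from dataclasses import dataclass

@dataclass
class MapPoint:
    id: int
    xyz: tuple  # 3D position in the scene map

@dataclass
class FeaturePoint:
    id: int
    uv: tuple   # 2D image coordinates in the captured frame

def match_features(features, scene_map):
    """Match 2D feature points to known 3D map points by id
    (a stand-in for descriptor-based matching)."""
    index = {p.id: p for p in scene_map}
    return [(f, index[f.id]) for f in features if f.id in index]

def build_dense_mesh(matches):
    """Toy densification: one mesh vertex per matched 3D map point."""
    return [m.xyz for _, m in matches]

scene_map = [MapPoint(0, (0, 0, 1)), MapPoint(1, (1, 0, 1))]
frame_features = [FeaturePoint(0, (10, 20)),
                  FeaturePoint(1, (30, 40)),
                  FeaturePoint(2, (50, 60))]  # id 2 has no map match
matches = match_features(frame_features, scene_map)
mesh = build_dense_mesh(matches)  # vertices transmitted to the remote viewer
```

In the actual system the mesh would be densified from many matched points and streamed alongside the frame; here the unmatched feature (id 2) is simply dropped.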
REMOTE CONTROL DEVICE
A remote control device enables an operator to remotely control a real machine, and includes: a limitation unit that sets a movable range of the real machine on the basis of limitation information from the operator; a control unit that generates first command information, which is input information for operating the real machine, on the basis of control information from the operator and the movable range; and a video display unit that displays a real machine video, which is a video of the real machine, and a movable information video, which is a video of the movable range corresponding to the real machine.
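The limitation unit's role can be illustrated as clamping operator commands to the configured movable range. A minimal sketch under assumed conventions (per-axis `(min, max)` limits, tuple-valued commands); the function name `clamp_command` is hypothetical.

```python
def clamp_command(command, movable_range):
    """Limit each axis of an operator command to the configured movable range."""
    lo, hi = movable_range
    return tuple(max(l, min(h, c)) for c, l, h in zip(command, lo, hi))

limits = ((-1.0, -1.0), (1.0, 1.0))     # per-axis (min), per-axis (max)
limited = clamp_command((2.0, 0.5), limits)  # → (1.0, 0.5)
```

The first axis exceeds the range and is clamped; the second passes through unchanged, mirroring how the control unit generates command information from both the operator's input and the movable range.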
Virtual assistant factory computing platform
Various aspects of the disclosure relate to a virtual assistant factory that supports back office operations within a computing system. Various internal services (e.g., banking services, user management services, and the like) may be accessible to external applications via an application programming interface. In some cases, a virtual assistant factory computing platform may expose services that call on and/or are tied to various back office computing operations. The virtual assistant factory platform may spin up and host robots, controls, and/or processes that may supplement computing operations and other associated back office operations. The external application may call API functions that cause the virtual assistant factory to spin up a virtual assistant to perform various functions and interact with both the user and the appropriate back office operations and/or third-party computing systems.
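The spin-up-on-request pattern can be sketched as a small factory object. This is an illustrative sketch only: the class `VirtualAssistantFactory`, its methods, and the `"account-lookup"` service name are invented for the example and do not come from the disclosure.

```python
import uuid

class VirtualAssistantFactory:
    """Hypothetical factory that spins up assistants in response to API calls."""
    def __init__(self):
        self.assistants = {}

    def spin_up(self, service_name):
        """Create and host a new assistant tied to a back office service."""
        assistant_id = str(uuid.uuid4())
        self.assistants[assistant_id] = {"service": service_name,
                                         "state": "running"}
        return assistant_id

    def handle(self, assistant_id, request):
        """Route a user request through the assistant to its service."""
        assistant = self.assistants[assistant_id]
        return f"{assistant['service']}: processed {request}"

factory = VirtualAssistantFactory()
aid = factory.spin_up("account-lookup")
response = factory.handle(aid, "balance for user 42")
```

The external application would hold only the returned `assistant_id` and interact through the API, never with the back office service directly.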
VIRTUAL DESIGN ENVIRONMENT
An industrial integrated development environment (IDE) supports a virtual design environment that allows an automation system designer to perform project development via interaction with a virtual reality presentation of the plant facility. The industrial design environment can generate system project data for an automation project (including but not limited to device selections, industrial control programming, device configurations, visualizations, engineering drawings, etc.) based on the developer's manual interactions with the virtual reality presentation. These interactions can include, for example, placing and moving machines or other industrial assets within the virtualized environment, defining trajectories of motion devices or robots using manual gestures, or other such interactive input. The IDE system interprets the developer's interactions as design specifications for the automation system being designed and translates these interactions into control code, visualizations, device configurations, and other system aspects that satisfy the design specifications.
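The gesture-to-control-code translation can be sketched as mapping gesture waypoints to motion commands. A hedged toy sketch: the function name, the `MOVE_LINEAR` command syntax, and the `speed` parameter are assumptions, not the IDE's actual output format.

```python
def gestures_to_trajectory(gesture_points, speed=0.5):
    """Translate gesture waypoints (x, y, z) captured in VR into
    simple linear-move commands for a motion device."""
    return [f"MOVE_LINEAR x={x} y={y} z={z} v={speed}"
            for x, y, z in gesture_points]

# Two waypoints traced by the designer's hand in the virtual plant:
program = gestures_to_trajectory([(0, 0, 0.2), (0.3, 0, 0.2)])
```

In the described IDE, such interactions would also yield device configurations and visualizations; this sketch covers only the trajectory aspect.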
Robot control, training and collaboration in an immersive virtual reality environment
System and methods to create an immersive virtual environment using a virtual reality system that receives parameters corresponding to a real-world robot. The real-world robot may be simulated to create a virtual robot based on the received parameters. The immersive virtual environment may be transmitted to a user. The user may supply input and interact with the virtual robot. Feedback such as the current state of the virtual robot or the real-world robot may be provided to the user. The user may train the virtual robot. The real-world robot may be programmed based on the virtual robot training.
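The train-in-simulation, program-the-real-robot flow can be sketched as teaching poses to a virtual twin and exporting them as a program. The class `VirtualRobot`, the `MOVEJ` command name, and the joint-tuple representation are illustrative assumptions.

```python
class VirtualRobot:
    """Simulated twin created from parameters of the real-world robot."""
    def __init__(self, joint_count):
        self.joint_count = joint_count
        self.taught_poses = []

    def teach(self, pose):
        """Record a joint pose supplied by the user in the virtual environment."""
        assert len(pose) == self.joint_count
        self.taught_poses.append(pose)

def export_program(virtual_robot):
    """Turn the virtual training into a command list for the real robot."""
    return [("MOVEJ", pose) for pose in virtual_robot.taught_poses]

vr = VirtualRobot(joint_count=3)
vr.teach((0.0, 0.5, 1.0))
vr.teach((0.2, 0.4, 0.9))
program = export_program(vr)
```

Feedback to the user (the current state of either robot) would close the loop during training; here only the export step is shown.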
Systems and methods for distributed training and management of AI-powered robots using teleoperation via virtual spaces
In some aspects, a system comprises a computer hardware processor and a non-transitory computer-readable storage medium storing processor-executable instructions for receiving, from one or more sensors, sensor data relating to a robot; generating, using a statistical model, based on the sensor data, first control information for the robot to accomplish a task; transmitting, to the robot, the first control information for execution of the task; and receiving, from the robot, a result of execution of the task.
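One iteration of the sense-plan-act cycle described above can be sketched as a function over three pluggable callables. The stub sensor readings, the policy, and the result dictionary are placeholders for real hardware and a trained statistical model.

```python
def control_loop(read_sensors, policy, execute):
    """One iteration: receive sensor data, generate control
    information with a model, send it for execution, return the result."""
    sensor_data = read_sensors()
    control = policy(sensor_data)
    return execute(control)

# Stubs standing in for the robot's sensors, a trained model, and the robot:
result = control_loop(
    read_sensors=lambda: {"gripper_open": True, "object_seen": True},
    policy=lambda s: "close_gripper" if s["object_seen"] else "search",
    execute=lambda cmd: {"command": cmd, "status": "success"},
)
```

In the teleoperation setting of the claim, `policy` could also defer to a remote human operator in a virtual space, with the operator's actions logged as training data for the model.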
Robot system
A robot system includes a robot, a robot controller, a video acquisition device configured to acquire a real video of a work space, and a head-mounted video display device provided with a visual line tracking section configured to acquire visual line information. The robot controller includes an information storage section configured to store information used for controlling the robot while associating the information with a type of object, a gaze target identification section configured to identify, in the video, a gaze target viewed by the wearer based on the visual line information, and a display processing section configured to cause the video display device to display the information associated with the object corresponding to the identified gaze target, side by side with the gaze target, as one image through which the wearer can visually grasp, select, or set the contents of the information.
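The gaze-target identification step can be sketched as a point-in-box lookup followed by retrieving the information stored for that object type. The bounding-box format, the object types, and the stored strings are invented for illustration.

```python
def identify_gaze_target(gaze_xy, detected_objects):
    """Return the first detected object whose bounding box
    contains the wearer's gaze point, or None."""
    gx, gy = gaze_xy
    for obj in detected_objects:
        x0, y0, x1, y1 = obj["bbox"]
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            return obj
    return None

# Information associated with each object type (the storage section):
info_store = {"workpiece": "grip force: 5 N", "conveyor": "speed: 0.3 m/s"}

objects = [{"type": "workpiece", "bbox": (100, 100, 200, 200)},
           {"type": "conveyor", "bbox": (0, 300, 640, 400)}]
target = identify_gaze_target((150, 150), objects)
overlay = info_store[target["type"]]  # displayed beside the gaze target
```

The display processing section would then composite `overlay` next to the gaze target in the head-mounted view.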
Apparatus and Method for Immersive Computer Interaction
Methods and an arrangement for immersive human-computer interaction with a virtual mechanical operator of an industrial automation arrangement in virtual reality, wherein input information is transmitted to a component of the arrangement through interaction with the virtual operator modelled in a simulation device for rigid-body simulation. The virtual operator is replicated in the virtual reality; an interaction with the represented virtual operator is detected by the virtual reality environment; second parameters calculated from first parameters of the detected virtual interaction are transmitted to the simulation device and used, via the modelled virtual operator, to simulate movement of a part of the operator; whether a switching state change of the virtual operator is produced by the simulated movement is decided; and the switching state or the switching state change is reported as the input information to the component at least when a switching state change occurs.
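The decide-and-report step can be sketched as follows: simulated travel of a part of the operator determines a switching state, and a notification is sent only when that state changes. The travel threshold and function names are assumptions standing in for the rigid-body simulation.

```python
def simulate_press(travel_mm, threshold_mm=2.0):
    """Rigid-body-simulation stand-in: the virtual switch closes
    once simulated travel exceeds a threshold."""
    return travel_mm >= threshold_mm

def report_if_changed(prev_state, travel_mm, notify):
    """Decide the switching state; report it to the component on change."""
    state = simulate_press(travel_mm)
    if state != prev_state:
        notify(state)  # the input information sent to the component
    return state

events = []
state = False
state = report_if_changed(state, 2.5, events.append)  # press: change reported
state = report_if_changed(state, 3.0, events.append)  # still pressed: no report
```

Only the transition is reported, matching the claim's "at least when a switching state change occurs".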
ROBOTICS FOR THREE-DIMENSIONAL PRINTING
The present disclosure provides systems and methods for training a robot. The systems and methods may provide a robotic system. The robotic system may comprise a trainable robot and a sensor. The sensor may be attached to at least one physical tool. The method may comprise using the sensor to capture movement from a user operating the at least one physical tool. The method may include using at least the movement captured to train the robot, such that upon training, the robot may be trained to perform at least the movement.
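The capture-then-train flow can be sketched as recording tool poses from the sensor and replaying them as a learned path. This is a toy "training" (pure replay); the function names and the pose tuples are illustrative assumptions, not the disclosed method.

```python
def record_demonstration(samples):
    """Collect sensor poses captured while the user moves the physical tool."""
    return [{"t": t, "pose": pose} for t, pose in enumerate(samples)]

def train_from_demonstration(recording):
    """Toy 'training': the robot learns to replay the demonstrated tool path."""
    return [frame["pose"] for frame in recording]

# Three (x, y, z) poses captured by the tool-mounted sensor:
demo = record_demonstration([(0.0, 0.0, 0.5),
                             (0.1, 0.0, 0.5),
                             (0.2, 0.0, 0.45)])
learned_path = train_from_demonstration(demo)
```

A real system would generalize beyond replay (e.g., smoothing or fitting a policy), but the essential data flow from sensor capture to trained movement is the same.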