Patent classifications
G05B2219/39449
Laser Processing Robot System for Performing Laser Processing Using Robot
A laser processing robot system is provided in which augmented reality processing enables a processing laser beam and its irradiation position to be seen safely and easily. The system includes an image processing device having an augmented reality image processing unit that performs augmented reality image processing on an actual image, captured by an imaging device, that includes an image of a robot. The augmented reality image processing unit superimposes, onto the actual image, a virtual image representing at least one of the laser beam that would be emitted from a laser irradiation device toward a workpiece and the irradiation position of that beam, and displays the superimposed image on the display device.
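The core step in superimposing a virtual beam onto the camera image is projecting 3D points along the assumed beam path into pixel coordinates. A minimal sketch of that projection, assuming a standard pinhole camera model with hypothetical intrinsics fx, fy, cx, cy (none of these names come from the abstract):

```python
def project_point(p_cam, fx, fy, cx, cy):
    """Pinhole projection of a 3D point, given in camera coordinates,
    to pixel coordinates (u, v). Sampling points along the virtual
    laser beam and projecting each one yields the overlay to draw
    on the actual image."""
    X, Y, Z = p_cam
    u = fx * X / Z + cx
    v = fy * Y / Z + cy
    return (u, v)
```

A point on the optical axis projects to the principal point (cx, cy), which is a quick sanity check for the chosen intrinsics.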
METHOD AND SYSTEM FOR PROGRAMMING A ROBOT
A method comprising: identifying a robotic device and a calibration fixture in the vicinity of the robotic device; referencing the calibration fixture to a base of the robotic device to determine a first pose of the robotic device; receiving, from a sensor, a 3D image of the environment that includes the calibration fixture; determining a second pose of the calibration fixture relative to the sensor; determining a third pose of the robotic device relative to the sensor based on the first pose and the second pose; receiving a plurality of trajectory points; determining a plurality of virtual trajectory points corresponding to the trajectory points based on the 3D image and the third pose; providing the virtual trajectory points for display; and providing an interface for manipulating the virtual trajectory points.
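The pose chain in this method is a transform composition: the fixture-to-base relationship (first pose) and the sensor's view of the fixture (second pose) combine to give the robot's pose relative to the sensor (third pose). A sketch using 4x4 homogeneous transforms; the frame names and matrix conventions are assumptions, not taken from the abstract:

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R
    and a translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def robot_pose_in_sensor(T_fixture_base, T_sensor_fixture):
    """Compose sensor<-fixture (from the 3D image) with
    fixture<-base (from referencing the fixture to the robot base)
    to obtain sensor<-base, the robot's pose relative to the sensor."""
    return T_sensor_fixture @ T_fixture_base
```

With identity rotations, the composed translation is simply the sum of the two offsets, which makes the chaining easy to verify by hand.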
Display device and display program
A display device includes an augmented reality display part that displays a virtual robot that operates in accordance with a predetermined program together with objects in real space, a position detection part that detects the position of a work target in real space by measuring a distance to the work target from the augmented reality display part, and a control part that causes the virtual robot displayed on the augmented reality display part to operate based on the position of the work target detected by the position detection part to perform predetermined work on the work target.
METHOD AND SYSTEM FOR REMOTE COLLABORATION
A method for providing augmented reality (AR)-based remote collaboration between a robot located at a worksite, a field worker terminal, and a remote administrator terminal located outside the worksite. The method includes acquiring a captured image comprising either a field image captured by the robot at the worksite or a user image captured by the field worker terminal; displaying the captured image of the worksite; generating virtual content based on input from the remote administrator and the field worker with respect to the displayed image; and displaying an AR image in which the virtual content is augmented onto the displayed captured image.
Remote control manipulator system and control device
A remote control manipulator system includes a manipulator controlled remotely by an operator; a camera to capture an image including the manipulator; a posture sensor to detect posture data of the manipulator; an action instruction inputter with which the operator inputs an action instruction to move or stop the manipulator; a control device including a structural data storage to store manipulator structural data representing a structure of the manipulator, a model image generator to generate a model image by referring to the structural data storage and the posture data, and a presentation image generator to generate a presentation image by superimposing the model image on the captured image; and a display to display the presentation image.
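Generating a model image from structural data and posture data is, at its core, forward kinematics: link lengths come from the structural data storage and joint angles from the posture sensor. A minimal planar sketch, assuming a serial chain with rotary joints (the abstract does not specify the manipulator's kinematics):

```python
import math

def link_points(link_lengths, joint_angles):
    """Planar forward kinematics: accumulate joint angles along the
    chain and return the (x, y) position of each joint, with the base
    at the origin. Connecting consecutive points with line segments
    would give the model image to superimpose on the captured image."""
    x = y = 0.0
    theta = 0.0
    points = [(x, y)]
    for length, angle in zip(link_lengths, joint_angles):
        theta += angle
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        points.append((x, y))
    return points
```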
Technologies for fault related visual content
This technology receives a plurality of readings from a sensor monitoring a piece of equipment in a building, identifies a present fault or a projected fault in the equipment from those readings, and generates augmented reality content related to the fault. The augmented reality content is sent to a mobile device when the mobile device is in proximity to the piece of equipment.
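The gating logic described here combines two conditions: a fault (present or projected) must exist, and the device must be near the equipment. A sketch of that decision, where the Euclidean proximity test and the radius value are assumptions; the abstract does not define how proximity is measured:

```python
def should_send(fault_detected, device_pos, equipment_pos, radius=10.0):
    """Send the fault-related AR content only when a fault exists and
    the mobile device lies within `radius` distance units of the
    equipment. Positions are (x, y) coordinates."""
    dx = device_pos[0] - equipment_pos[0]
    dy = device_pos[1] - equipment_pos[1]
    return fault_detected and (dx * dx + dy * dy) ** 0.5 <= radius
```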
OPERATION SYSTEM FOR INDUSTRIAL MACHINERY
An operation system for industrial machinery comprises: an information acquisition unit which acquires machine identification information corresponding to an industrial machine; a machine identification unit which identifies the industrial machine on the basis of the acquired machine identification information; a model projection unit which projects a model corresponding to the identified industrial machine into a virtual space; a distance/direction calculation unit which calculates the distance and direction of a user observing the model with respect to the projected model; a gesture observation unit which observes a gesture of the user as an instruction from the user to the identified industrial machine; an instruction determination unit which determines whether or not the user can give an instruction; and an operation instruction unit which operates the identified industrial machine on the basis of the observed gesture when the determination result is positive.
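The distance/direction calculation unit reduces to simple vector geometry between the user and the projected model. A sketch in 2D, assuming positions in the virtual space's ground plane (the abstract does not fix a coordinate convention):

```python
import math

def distance_and_direction(user_pos, model_pos):
    """Distance and bearing (radians, measured from the +x axis) of
    the projected model as seen from the user. Both positions are
    (x, y) coordinates in the virtual space."""
    dx = model_pos[0] - user_pos[0]
    dy = model_pos[1] - user_pos[1]
    return math.hypot(dx, dy), math.atan2(dy, dx)
```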
Off-line programming apparatus, robot controller, and augmented reality system
An off-line programming apparatus includes a model creation unit that creates three-dimensional models of a robot and a load, a storage unit that stores a dynamic parameter of the load, a graphic creation unit that creates a three-dimensional graphic representing the dynamic parameter based on the dynamic parameter, and a display unit that displays the three-dimensional models of the robot and the load and the three-dimensional graphic. The dynamic parameter includes inertia around three axes that are orthogonal to one another at a centroid of the load. The three-dimensional graphic is a solid defined by dimensions in three directions orthogonal to one another. The graphic creation unit sets a ratio of the dimensions in the three directions of the three-dimensional graphic to a ratio corresponding to a ratio of the inertia around the three axes.
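The graphic creation unit's rule can be sketched directly: the box's edge lengths are set in the same ratio as the inertias around the three orthogonal axes. The scaling convention below (largest edge pinned to a reference size) is an assumption for illustration; the abstract only fixes the ratio, not the absolute scale:

```python
def box_dimensions(inertia_xyz, base_size=1.0):
    """Return edge lengths of the solid in the same ratio as the
    load's inertias (Ixx, Iyy, Izz) about three orthogonal axes
    through its centroid, scaled so the largest edge equals
    base_size."""
    largest = max(inertia_xyz)
    return tuple(base_size * i / largest for i in inertia_xyz)
```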
SYNTHETIC REPRESENTATION OF A SURGICAL ROBOT
A synthetic representation of a robot tool for display on a user interface of a robotic system. The synthetic representation may be used to show the position of a view volume of an image capture device with respect to the robot. The synthetic representation may also be used to find a tool that is outside of the field of view, to display range of motion limits for a tool, to remotely communicate information about the robot, and to detect collisions.
SYNTHETIC REPRESENTATION OF A SURGICAL INSTRUMENT
A synthetic representation of a tool for display on a user interface of a robotic system. The synthetic representation may be used to show force on the tool, an actual position of the tool, or to show the location of the tool when out of a field of view. A three-dimensional pointer is also provided for a viewer in the surgeon console of a telesurgical system.