Patent classifications
G05B2219/39449
Terminal for Processing Facilities
A terminal for processing facilities, having: a communication device arranged to exchange information about the processing facility with a communication partner; a screen arranged to display the information about the processing facility; a position sensor arranged to collect alignment information relating to the alignment of the terminal; and a processor arranged to control the communication device, the screen, and the position sensor. The functionality of the terminal is extended in that the processor sets a function of the terminal depending on the alignment information from the position sensor.
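The alignment-dependent function selection described above might be sketched as follows; the pitch thresholds and function names are invented for illustration and are not taken from the patent.

```python
# Hypothetical sketch: select a terminal function from position-sensor data.
# Thresholds and function names are illustrative assumptions.

def select_function(pitch_deg: float) -> str:
    """Map the terminal's pitch angle to a display function."""
    if pitch_deg > 60:          # held roughly upright: show live process values
        return "process-view"
    if pitch_deg > 20:          # tilted: show a parameter-entry form
        return "parameter-entry"
    return "camera-overlay"     # held flat: show a camera-based overlay

print(select_function(75))   # process-view
print(select_function(30))   # parameter-entry
print(select_function(5))    # camera-overlay
```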
SYSTEMS AND METHODS FOR GENERATING AUGMENTED REALITY ENVIRONMENTS
This disclosure relates to systems and methods for generating augmented reality environments. An augmented reality environment may be generated using physical tags placed throughout a real-world environment, for example, on real-world objects and/or surfaces. Identifying features of the tags may be detected. The presence of one or more particular tags may be determined by virtue of the detected identifying features. A computing platform may present views of the real world and may overlay images of virtual objects at locations that correspond to the tags present in the real-world environment.
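The tag-to-overlay step could be sketched as a lookup from detected tag identifiers to virtual-object placements; the tag IDs, object names, and coordinates below are invented.

```python
# Hypothetical sketch: match detected identifying features (here, tag IDs)
# against known tags, and place a virtual object at each matched tag's
# screen location. All IDs and names are illustrative assumptions.

KNOWN_TAGS = {
    "tag-01": "virtual-lamp",
    "tag-02": "virtual-plant",
}

def place_overlays(detections):
    """detections: list of (tag_id, (x, y)) pairs from a feature detector."""
    placements = []
    for tag_id, screen_pos in detections:
        if tag_id in KNOWN_TAGS:          # only overlay tags we recognize
            placements.append((KNOWN_TAGS[tag_id], screen_pos))
    return placements

print(place_overlays([("tag-01", (120, 80)), ("tag-99", (10, 10))]))
```

Unrecognized tags are simply ignored, so only tags whose presence is confirmed produce an overlay.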
Image generation apparatus, image generation method, and non-transitory recording medium recording program
An image generation apparatus includes a memory and one or more processors. The one or more processors are configured to: receive moving-object-related information from a moving object that carries identification information recognizable from outside the moving object; receive, from a usage terminal configured to image the moving object, usage terminal information and imaging data that includes an image of the moving object; specify the moving object based on the identification information included in the imaging data; and generate superimposition image data based on the moving-object-related information of the specified moving object and the usage terminal information of the usage terminal that output the imaging data. The superimposition image data is image data to be displayed on a display unit of the usage terminal, superimposed on the imaging data.
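The specify-and-generate step might be sketched as below: the identification information extracted from the image is matched against registered moving objects, and the matched object's information is bundled with the reporting terminal's ID into the superimposition data. All field names and values are invented.

```python
# Hypothetical sketch: look up a moving object by the identification
# information recognized in the captured image, then build the data
# to be superimposed on the terminal's display. Names are illustrative.

MOVING_OBJECTS = {
    "BUS-42": {"route": "Airport Express", "next_stop": "Terminal 2"},
}

def build_superimposition(image_id: str, terminal_id: str):
    info = MOVING_OBJECTS.get(image_id)
    if info is None:
        return None                       # moving object not recognized
    return {"terminal": terminal_id, "overlay": info}

print(build_superimposition("BUS-42", "user-7"))
```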
METHOD, ELECTRONIC DEVICE, AND COMPUTER PROGRAM PRODUCT FOR MONITORING FIELD DEVICE
The present disclosure relates to a method, an electronic device, and a computer program product for monitoring a field device. For example, a method for monitoring a field device is provided. The method may include receiving facility information data associated with the locations of a group of field devices, and a sensing data set acquired by a sensing apparatus arranged near the group of field devices. The method may further include determining, upon a determination that the sensing data associated with at least one field device in the group is abnormal, a target location of that field device based on the facility information data. In addition, the method may further include generating navigation information from a source location where a user is located to the target location.
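The monitoring flow could be sketched as follows: readings outside a nominal band are flagged as abnormal, the device's location is looked up in the facility data, and a source-to-target route entry is emitted. The thresholds, device names, and coordinates are invented.

```python
# Hypothetical sketch of the monitoring flow: flag abnormal sensing data,
# resolve the device's target location from facility data, and produce a
# simple (device, source, target) navigation entry. Values are illustrative.

FACILITY = {"pump-1": (3, 4), "valve-2": (10, 0)}   # device -> location

def monitor(readings, lo=10.0, hi=90.0, source=(0, 0)):
    routes = []
    for device, value in readings.items():
        if not lo <= value <= hi:                  # abnormal sensing data
            target = FACILITY[device]              # target-location lookup
            routes.append((device, source, target))
    return routes

print(monitor({"pump-1": 95.0, "valve-2": 50.0}))
# -> [('pump-1', (0, 0), (3, 4))]
```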
Laser machining system including laser machining head and imaging device
A machining system includes a laser irradiation device, a camera that captures an image of a workpiece, and a display that displays the image captured by the camera. The machining system includes a robot control device that controls the camera and the display. The camera captures the image of the workpiece before machining. The robot control device calculates a laser-beam irradiation position on the workpiece and virtually displays the laser-beam irradiation position on the display so as to overlap the workpiece image captured by the camera.
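Virtually displaying the irradiation position over the camera image implies mapping calculated workpiece coordinates to pixel coordinates; a minimal sketch under an assumed linear calibration (the scale and offset values are invented) might look like this.

```python
# Hypothetical sketch: project a calculated laser irradiation point given in
# workpiece coordinates (mm) to a pixel on the displayed camera image, so it
# can be drawn over the workpiece image. Calibration values are illustrative.

def to_pixel(point_mm, scale=4.0, offset=(320, 240)):
    """Map a workpiece (x, y) in mm to an image pixel (origin at image center)."""
    x_mm, y_mm = point_mm
    return (int(offset[0] + scale * x_mm), int(offset[1] - scale * y_mm))

irradiation_path = [(0.0, 0.0), (10.0, 0.0), (10.0, 5.0)]
print([to_pixel(p) for p in irradiation_path])
# -> [(320, 240), (360, 240), (360, 220)]
```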
Simulation device for robot
Provided is a simulation device for simulating a cooperative task carried out jointly by a cooperative robot and a person. The simulation device includes: a head-mounted display device to be worn by an operator who simulatively carries out the cooperative task; a detecting section configured to detect the position of the operator in real space; a three-dimensional model display section configured to display, on the head-mounted display device, an image in which a robot system model including a cooperative robot model is arranged in a three-dimensional virtual space; and a simulation execution section configured to simulatively operate the cooperative robot model in the three-dimensional virtual space, based on an operation program of the cooperative robot and the detected position of the operator, so as to carry out the cooperative task.
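One way the detected operator position might feed into the simulated robot operation is a speed-scaling rule; the safety distance and scaling below are illustrative assumptions, not behavior stated in the abstract.

```python
# Hypothetical sketch of one simulation step: the cooperative robot model runs
# its operation program, but its commanded speed is scaled down when the
# detected operator position is close. Threshold and scaling are invented.

import math

def step_speed(robot_pos, operator_pos, base_speed=1.0, safe_dist=1.5):
    dist = math.dist(robot_pos, operator_pos)
    if dist < safe_dist:                 # operator nearby: slow the model down
        return base_speed * dist / safe_dist
    return base_speed

print(step_speed((0.0, 0.0), (3.0, 0.0)))   # far away -> full speed
print(step_speed((0.0, 0.0), (0.75, 0.0)))  # close -> reduced speed
```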
Sensing system, work system, augmented-reality-image displaying method, augmented-reality-image storing method, and program
A sensing system includes a detecting device used to detect a position of a target, and a controller. For display on a display device or projection by a projection apparatus, the controller creates an augmented-reality image that shows at least one of: a setting related to detection of the target by the detecting device; a setting of a moving apparatus; a setting of a robot that performs work on the target; the position of the target as recognized by the controller; a result of the detection of the target; a work plan of the moving apparatus; a work plan of the robot; a determination result of the controller; and a parameter related to the target.
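Assembling such an augmented-reality image from whichever of the listed items are available could be sketched as below; the item names and values are invented.

```python
# Hypothetical sketch: the controller composes one augmented-reality frame
# from the items it can currently provide, omitting unavailable ones.
# Keys and values are illustrative assumptions.

def compose_ar_image(**items):
    """Keep only the items that are available this cycle."""
    return {k: v for k, v in items.items() if v is not None}

frame = compose_ar_image(
    detection_setting="2D pattern match",
    target_position=(150, 32),
    robot_work_plan=None,          # not available this cycle -> omitted
)
print(frame)
```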
Robot controller and display device using augmented reality and mixed reality
A robot controller configured to assist an operation of a user by effectively utilizing both augmented-reality and mixed-reality techniques. The robot controller includes: a display device configured to display computer-generated information overlapped with an actual environment; a position-and-orientation obtaining section configured to obtain the relative position and orientation between the display device and a robot included in the actual environment; a display controlling section configured to display a virtual model of the robot on the display device; an operation controlling section configured to operate the virtual model displayed on the display device; and a position-and-orientation determining section configured to determine the position and/or orientation of the robot by using the position and/or orientation of the operated virtual model together with the relative position and orientation between the robot and the display device.
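The final determining step combines the manipulated virtual model's pose with the display-to-robot relative pose; a minimal 2D translation-only sketch (a real implementation would use full 3D transforms, and all values here are invented) might be:

```python
# Hypothetical 2D sketch: combine the pose of the operated virtual model
# (display coordinates) with the display-to-robot offset to obtain the
# commanded robot position. Translation-only; values are illustrative.

def determine_robot_position(model_pos, display_to_robot_offset):
    mx, my = model_pos
    ox, oy = display_to_robot_offset
    return (mx + ox, my + oy)

print(determine_robot_position((0.5, 0.5), (1.0, -0.25)))
# -> (1.5, 0.25)
```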
SYNTHETIC REPRESENTATION OF A SURGICAL ROBOT
A system comprises a first robotic arm adapted to support and move a tool and a second robotic arm adapted to support and move a camera configured to capture an image of a camera field of view. The system further comprises an input device, a display, and a processor. The processor is configured to display a first synthetic view including a first synthetic image of the tool. The first synthetic image of the tool includes a portion of the tool outside of the camera field of view. The processor is also configured to receive a user input at the input device and, responsive to the user input, change the display from the first synthetic view to a second synthetic view including a second synthetic image of the tool that is different from the first synthetic image of the tool.
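The input-driven switch between the two synthetic views might be sketched as a simple toggle; the view names and input handling are invented for illustration.

```python
# Hypothetical sketch: a user input toggles the displayed synthetic view of
# the tool between two renderings. Names are illustrative assumptions.

class SyntheticDisplay:
    def __init__(self):
        self.views = ["synthetic-view-1", "synthetic-view-2"]
        self.current = 0                       # start on the first view

    def on_user_input(self):
        """Switch to the other synthetic view in response to user input."""
        self.current = 1 - self.current
        return self.views[self.current]

d = SyntheticDisplay()
print(d.on_user_input())   # -> synthetic-view-2
print(d.on_user_input())   # -> synthetic-view-1
```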
DISPLAY DEVICE FOR INDUSTRIAL MACHINE
Provided is a display device for an industrial machine capable of automatically switching between an augmented-reality display and a 3D computer-graphics display. A robot display device 2 comprises: a camera 2b; a display unit 20 capable of switching between a first display, using 3D computer graphics, of a robot 3 or of the robot 3 and its peripherals 4, and a second display, using augmented reality, of the robot 3 and its peripherals 4 as captured by the camera 2b; and a selection unit 21 that activates the first display on the display unit 20 when the camera 2b does not face the robot 3 and its peripherals 4, and activates the second display on the display unit 20 when the camera 2b faces the robot 3 and its peripherals 4.
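Deciding whether the camera "faces" the robot can be framed as an angle test between the camera's viewing direction and the direction to the robot; the 30-degree threshold and 2D vectors below are illustrative assumptions.

```python
# Hypothetical sketch of the selection unit: choose the augmented-reality
# display when the camera's viewing direction points at the robot (small
# angle), else fall back to 3D computer graphics. Threshold is invented.

import math

def select_display(camera_dir, to_robot, threshold_deg=30.0):
    dot = sum(a * b for a, b in zip(camera_dir, to_robot))
    norm = math.hypot(*camera_dir) * math.hypot(*to_robot)
    angle = math.degrees(math.acos(dot / norm))
    return "augmented-reality" if angle < threshold_deg else "3d-graphics"

print(select_display((1.0, 0.0), (1.0, 0.1)))   # nearly aligned -> AR
print(select_display((1.0, 0.0), (0.0, 1.0)))   # perpendicular -> 3D CG
```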