Patent classifications
G05B2219/35482
System and method for using virtual/augmented reality for interaction with collaborative robots in manufacturing or industrial environment
A method includes determining a movement of an industrial robot in a manufacturing environment from a first position to a second position. The method also includes displaying an image showing a trajectory of the movement of the robot on a wearable headset. The displaying of the image comprises at least one of: displaying an augmented reality (AR) graphical image or video of the trajectory superimposed on a real-time actual image of the robot, or displaying a virtual reality (VR) graphical image or video showing a graphical representation of the robot together with the trajectory.
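The abstract above describes computing a trajectory between two robot positions and choosing between an AR overlay and a full VR rendering. A minimal sketch of that idea follows; the linear interpolation, the `DisplayMode` enum, and the `build_trajectory_view` helper are all illustrative assumptions, since the patent does not specify how the trajectory is computed.

```python
from dataclasses import dataclass
from enum import Enum, auto

class DisplayMode(Enum):
    AR = auto()  # trajectory superimposed on a real-time image of the robot
    VR = auto()  # graphical representation of the robot shown with the trajectory

@dataclass
class TrajectoryView:
    mode: DisplayMode
    waypoints: list  # (x, y, z) positions from the first to the second position

def build_trajectory_view(start, end, mode, steps=10):
    """Linearly interpolate a trajectory between two robot positions
    (hypothetical helper; the actual planning method is unspecified)."""
    waypoints = [
        tuple(s + (e - s) * t / steps for s, e in zip(start, end))
        for t in range(steps + 1)
    ]
    return TrajectoryView(mode=mode, waypoints=waypoints)
```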
Control system
A control system controls an industrial machine. Each of the controllers includes a screen generation unit which generates a controller screen that is displayed on a controller display unit, and which generates a glasses screen that is displayed on a glasses-type display device based on a variation in an internal state of the controller screen. The glasses-type display device includes: a transmissive glasses display unit which is arranged so as to correspond to the positions of the eyes of a wearer and which can display the generated glasses screen; a glasses-side transmission/reception unit which acquires specific information for specifying the controller that is connected; and a display control unit which displays the glasses screen and the specific information on the glasses display unit.
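The mechanism above — regenerating and pushing a glasses screen whenever the controller's internal state varies, tagged with the controller's identifying information — can be sketched as a simple observer pattern. All class and attribute names here are illustrative assumptions, not from the patent.

```python
class Glasses:
    """Sketch of the glasses-type display device: stores the last screen
    and the "specific information" identifying its source controller."""
    def __init__(self):
        self.screen = None
        self.source = None

    def show(self, screen, controller_id):
        self.screen, self.source = screen, controller_id

class Controller:
    """Sketch of one controller: regenerates the glasses screen only when
    its internal state actually varies, then pushes it to connected glasses."""
    def __init__(self, controller_id):
        self.controller_id = controller_id  # identifies this controller to the glasses
        self._state = None
        self._glasses = []

    def connect(self, glasses):
        self._glasses.append(glasses)

    def set_state(self, state):
        if state != self._state:            # regenerate only on variation
            self._state = state
            screen = f"[{self.controller_id}] {state}"
            for g in self._glasses:
                g.show(screen, self.controller_id)
```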
WEARABLE ROBOT DATA COLLECTION SYSTEM WITH HUMAN-MACHINE OPERATION INTERFACE
A data collection system that performs data collection of human-driven robot actions for robot learning. The data collection system includes: i) a wearable computation subsystem that is worn by a human data collector and that controls the data collection process and ii) a human-machine operation interface subsystem that allows the human data collector to use the human-machine operation interface to operate an attached robotic gripper to perform one or more actions. A user interface subsystem receives instructions from the wearable computation subsystem that direct the human data collector to perform the one or more actions using the human-machine operation interface subsystem. A visual sensing subsystem includes one or more cameras that collect raw visual data related to the pose and movement of the robotic gripper while performing the one or more actions. A data collection subsystem receives collected data related to the one or more actions.
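The data collection subsystem described above records visual data about the gripper's pose and movement during each human-driven action. A minimal sketch, assuming a timestamped sample format (the `GripperSample` fields and camera naming are hypothetical):

```python
import time
from dataclasses import dataclass, field

@dataclass
class GripperSample:
    timestamp: float
    pose: tuple   # assumed (x, y, z, roll, pitch, yaw) of the robotic gripper
    frames: dict  # camera name -> raw image payload from the visual sensing subsystem

@dataclass
class DataCollector:
    """Sketch of the data collection subsystem: timestamps each
    human-driven gripper action together with the raw camera frames."""
    samples: list = field(default_factory=list)

    def record(self, pose, frames):
        sample = GripperSample(time.time(), pose, frames)
        self.samples.append(sample)
        return sample
```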
Robot control device
A robot control device includes a camera configured to be attached to a display device carried by or put on an operator and capture an environment surrounding the operator to generate an image of the environment; and a processor configured to slow down or stop motion of a predetermined robot included in the environment when the predetermined robot is not displayed on the display device, when only a portion of the predetermined robot is displayed, or when a ratio of a region representing the predetermined robot to a display area of the display device is equal to or lower than a predetermined threshold.
Operating system for a machine of the food industry
The present disclosure relates to an operating system for a machine of a food industry. The operating system includes eyeglasses for a user of the operating system and a transceiver for exchanging data between the operating system and the machine. The eyeglasses include a display system configured to display a control element and information of a human machine interface (HMI). Furthermore, the operating system includes at least one input module which receives user input from the user with respect to the control element. The operating system additionally includes a processing module. The processing module converts the received user input into an input signal for controlling at least one of the machine or the HMI.
ROBOT REMOTE OPERATION CONTROL DEVICE, ROBOT REMOTE OPERATION CONTROL SYSTEM, ROBOT REMOTE OPERATION CONTROL METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM
In a robot remote operation which recognizes a movement of an operator and transmits that movement to a robot to operate the robot, a robot remote operation control device includes: an information acquisition part, which acquires an environment sensor value from an environment sensor provided in the robot or in the surrounding environment of the robot, and an operator sensor value, which is detected information indicating the movement of the operator; and an intention estimation part, which estimates a motion of the operator, corresponding to a motion instruction for the robot, by applying a trained model to the operator sensor value.
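The intention estimation part maps an operator sensor value to a motion instruction via a trained model. As a stand-in for that model (whose architecture the abstract does not describe), the sketch below uses a nearest-centroid classifier over hypothetical motion classes:

```python
import math

# Hypothetical stand-in for the trained model: centroids of operator
# sensor feature vectors for each motion instruction class.
MOTION_CENTROIDS = {
    "reach":   (1.0, 0.0, 0.0),
    "grasp":   (0.0, 1.0, 0.0),
    "release": (0.0, 0.0, 1.0),
}

def estimate_intended_motion(operator_sensor_value):
    """Return the motion instruction whose centroid is nearest to the
    operator sensor value (illustrative classifier, not the patent's model)."""
    return min(
        MOTION_CENTROIDS,
        key=lambda motion: math.dist(operator_sensor_value, MOTION_CENTROIDS[motion]),
    )
```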
SYSTEMS AND METHODS FOR LIFESAVING TRAUMA STABILIZATION MEDICAL TELEPRESENCE OF A REMOTE USER
Methods and systems for providing a lifesaving trauma stabilization medical telepresence to a remote user are presented. A data connection between a lifesaving trauma stabilization helmet associated with a first user in a first location and a display device in a second location is established. First video data and first audio data are collected and transmitted to the display device. The first video data and the first audio data are output on the display device to the remote user. Contextual information for the first user is collected from the remote user. The contextual information collected from the remote user is transmitted to the lifesaving trauma stabilization helmet. The contextual information is presented to the first user as haptic feedback using a haptic output device comprising two groups of three vibrating elements with unique tones, integrated into opposing lateral sides of the lifesaving trauma stabilization helmet.
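The haptic layout above (two groups of three vibrating elements with unique tones, one group per lateral side of the helmet) suggests a simple encoding of contextual cues into per-element activations. The element indices, tone frequencies, and intensity scale below are all assumptions for illustration:

```python
# Assumed element layout: indices 0-2 on the left side, 3-5 on the right,
# each element with its own tone (frequencies are illustrative).
LEFT_ELEMENTS = [0, 1, 2]
RIGHT_ELEMENTS = [3, 4, 5]
TONES_HZ = {0: 150, 1: 200, 2: 250, 3: 150, 4: 200, 5: 250}

def haptic_pattern(side, intensity):
    """Return (element, tone_hz, intensity) triples activating one lateral
    group of vibrating elements to convey a contextual cue."""
    elements = LEFT_ELEMENTS if side == "left" else RIGHT_ELEMENTS
    return [(e, TONES_HZ[e], intensity) for e in elements]
```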
Robot and housing
Convenience and usefulness of a tele-existence system are enhanced by exploiting the collaboration between tele-existence and a head-mounted display apparatus. A movable member is supported for pivotal motion on a housing. In the housing, a driving motor and a transmission member for transmitting rotation of the driving motor to the movable member are provided. A state information acquisition unit acquires facial expression information and/or emotion information of a user who wears a head-mounted display apparatus, and a driving controlling unit controls rotation of the driving motor on the basis of the facial expression information and/or the emotion information.