Patent classifications
G05B2219/40131
INFORMATION PROCESSING DEVICE, ROBOT MANIPULATING SYSTEM AND ROBOT MANIPULATING METHOD
A robot manipulating system includes a game terminal having a game computer, a game controller, and a display configured to display a virtual space; a robot configured to perform work in a real space based on robot control data; and an information processing device configured to mediate between the game terminal and the robot. The information processing device supplies game data associated with the content of the work to the game terminal, acquires game manipulation data including a history of the manipulation inputs accepted by the game controller while a game program reflecting the game data is executed, converts the game manipulation data into the robot control data based on a given conversion rule, and supplies the robot control data to the robot.
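The mediation step could be sketched as a simple mapping from controller input history to robot commands; the event shapes, command names, and the conversion rule below are illustrative assumptions, not the patented implementation.

```python
# Illustrative conversion rule: controller events -> robot commands.
# Each entry maps a controller input to a (command, scale) pair.
CONVERSION_RULE = {
    "stick_left_x": ("joint1_velocity", 0.5),  # axis value scaled to rad/s
    "stick_left_y": ("joint2_velocity", 0.5),
    "button_a": ("gripper_close", 1.0),
}

def convert_game_manipulation(history):
    """Convert a list of (timestamp, control, value) game manipulation
    events into timestamped robot control data via the conversion rule."""
    robot_control_data = []
    for timestamp, control, value in history:
        if control in CONVERSION_RULE:
            command, scale = CONVERSION_RULE[control]
            robot_control_data.append((timestamp, command, value * scale))
    return robot_control_data

history = [(0.0, "stick_left_x", 0.8), (0.1, "button_a", 1.0)]
robot_control_data = convert_game_manipulation(history)
```

Inputs with no entry in the rule are simply dropped, so only mapped manipulations reach the robot.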
Streaming media transmission method and client applied to virtual reality technology
Embodiments of the present invention describe streaming media transmission methods and apparatus applied to virtual reality technology. A method for streaming media transmission may include sending a media information obtaining request to a server, where the media information obtaining request includes client capability information and auxiliary information, the client capability information indicates that the client supports reception of data pushed by the server, and the auxiliary information indicates attributes of the virtual reality presentation that the client supports. The method may also include receiving a media presentation description and media data, both sent by the server in response to the media information obtaining request. With the streaming media transmission methods and apparatus of these embodiments, transmission delay can be reduced and transmission efficiency improved.
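A minimal sketch of the client-side exchange, with assumed message shapes (the field names below are not from the actual protocol): the request carries capability and auxiliary information, and the server answers with a media presentation description (MPD) plus, because push is supported, media data in the same exchange.

```python
def build_media_info_request(supports_push, vr_attributes):
    """Assemble a media information obtaining request carrying client
    capability information and VR auxiliary information."""
    return {
        "type": "media_information_obtaining_request",
        "client_capability": {"server_push_supported": supports_push},
        "auxiliary_information": {"vr_presentation": vr_attributes},
    }

def handle_response(response):
    """Unpack the server's reply: the media presentation description and
    any media data pushed alongside it in one round trip."""
    return response["mpd"], response.get("media_data", [])

request = build_media_info_request(True, {"projection": "equirectangular"})
```

Bundling the MPD and initial media data into a single response is what saves a request/response round trip relative to fetch-after-describe.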
SYSTEM AND METHOD FOR USING VIRTUAL/AUGMENTED REALITY FOR INTERACTION WITH COLLABORATIVE ROBOTS IN MANUFACTURING OR INDUSTRIAL ENVIRONMENT
A method includes determining a movement of an industrial robot in a manufacturing environment from a first position to a second position. The method also includes displaying an image showing a trajectory of the movement of the robot on a wearable headset. The displaying of the image comprises at least one of: displaying an augmented reality (AR) graphical image or video of the trajectory superimposed on a real-time actual image of the robot, or displaying a virtual reality (VR) graphical image or video showing a graphical representation of the robot together with the trajectory.
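The displayed trajectory could be produced by interpolating the robot's movement from the first position to the second into renderable waypoints; the linear interpolation and the AR/VR mode selection below are illustrative assumptions.

```python
def trajectory_waypoints(first_pos, second_pos, steps):
    """Return steps + 1 evenly spaced 3-D waypoints along the movement
    from first_pos to second_pos, for rendering as an overlay."""
    return [
        tuple(a + (b - a) * i / steps for a, b in zip(first_pos, second_pos))
        for i in range(steps + 1)
    ]

def display_mode(headset_supports_passthrough):
    """AR superimposes the path on a real-time image of the robot;
    VR renders a graphical robot together with the trajectory."""
    return "AR" if headset_supports_passthrough else "VR"
```

A real trajectory would follow the robot's joint-space motion rather than a straight line; the waypoint list is just the data a headset renderer would consume.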
METHOD FOR USING A MULTI-LINK ACTUATED MECHANISM, PREFERABLY A ROBOT, PARTICULARLY PREFERABLY AN ARTICULATED ROBOT, BY A USER BY MEANS OF A MOBILE DISPLAY APPARATUS
A method at least including the steps of: aligning an image capturing element of a mobile display apparatus on a multi-link actuated mechanism by a user; capturing at least the multi-link actuated mechanism by means of the image capturing element of the mobile display apparatus; identifying the multi-link actuated mechanism in the captured image data of the image capturing element; representing the multi-link actuated mechanism in three dimensions on the basis of the captured image data together with depth information; and overlaying the resulting virtual representation of the multi-link actuated mechanism on the multi-link actuated mechanism in the display element of the mobile display apparatus, wherein the overlay is implemented taking account of the geometric relationships of the multi-link actuated mechanism.
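A sketch, under assumed geometry, of what "taking account of the geometric relationships" can mean for the overlay step: the virtual representation is aligned with the captured mechanism by applying the mechanism's own kinematics (here, planar multi-link forward kinematics) before drawing it in the display element.

```python
import math

def link_positions(base, link_lengths, joint_angles):
    """Forward kinematics for a planar multi-link mechanism: return the
    2-D position of the base and each successive joint, so the overlaid
    virtual links line up with the real ones."""
    x, y = base
    angle = 0.0
    positions = [(x, y)]
    for length, joint in zip(link_lengths, joint_angles):
        angle += joint  # each joint angle is relative to the previous link
        x += length * math.cos(angle)
        y += length * math.sin(angle)
        positions.append((x, y))
    return positions
```

The identified pose of the real mechanism supplies `base` and `joint_angles`; the model of the mechanism supplies `link_lengths`.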
VIRTUAL DESIGN ENVIRONMENT
An industrial integrated development environment (IDE) supports a virtual design environment that allows an automation system designer to perform project development via interaction with a virtual reality presentation of the plant facility. The industrial design environment can generate system project data for an automation project—including but not limited to device selections, industrial control programming, device configurations, visualizations, engineering drawings, etc.—based on the developer's manual interactions with the virtual reality presentation. These interactions can include, for example, placing and moving machines or other industrial assets within the virtualized environment, defining trajectories of motion devices or robots using manual gestures, or other such interactive input. The IDE system interprets the developer's interactions as design specifications for the automation system being designed and translates these interactions into control code, visualizations, device configurations, and other system aspects that satisfy the design specifications.
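A hypothetical sketch of how such an IDE might translate two kinds of virtual-reality interaction (placing an asset, gesturing a robot trajectory) into system project data; the event shapes and the `MOVE` pseudo-instruction are assumptions for illustration, not the patented implementation.

```python
def interactions_to_project_data(interactions):
    """Interpret VR interaction events as design specifications and
    translate them into project data (device list, motion programs)."""
    project = {"devices": [], "motion_programs": []}
    for event in interactions:
        if event["kind"] == "place_asset":
            # Placing a machine in the virtual plant becomes a device entry.
            project["devices"].append(
                {"asset": event["asset"], "position": event["position"]}
            )
        elif event["kind"] == "gesture_trajectory":
            # A manual gesture becomes control code: a move-through program.
            moves = [f"MOVE {x:.2f} {y:.2f} {z:.2f}" for x, y, z in event["path"]]
            project["motion_programs"].append(moves)
    return project
```

A production IDE would also emit device configurations, visualizations, and drawings from the same event stream; the dispatch-per-interaction-kind structure stays the same.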
Systems and method for robotic learning of industrial tasks based on human demonstration
A system for performing industrial tasks includes a robot and a computing device. The robot includes one or more sensors that collect data corresponding to the robot and an environment surrounding the robot. The computing device includes a user interface, a processor, and a memory. The memory includes instructions that, when executed by the processor, cause the processor to receive the collected data from the robot, generate a virtual recreation of the robot and the environment surrounding the robot, and receive inputs from a human operator controlling the robot to demonstrate an industrial task. The system is configured to learn how to perform the industrial task based on the human operator's demonstration, and to perform, via the robot, the industrial task autonomously or semi-autonomously.
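A minimal sketch of the demonstrate-then-repeat loop, with assumed data shapes: operator inputs recorded against the virtual recreation become stored (state, action) pairs, and a nearest-neighbor lookup stands in for the learned policy (a deliberate simplification of real learning from demonstration).

```python
class DemonstrationLearner:
    """Stores human demonstrations and replays the nearest recorded action."""

    def __init__(self):
        self.demonstrations = []

    def record(self, operator_inputs):
        """Store one demonstration: a list of (state, action) pairs
        captured while the operator controls the robot."""
        self.demonstrations.append(list(operator_inputs))

    def policy(self, state):
        """1-nearest-neighbor policy: return the action whose recorded
        state is closest to the queried state."""
        pairs = [p for demo in self.demonstrations for p in demo]
        return min(pairs, key=lambda p: abs(p[0] - state))[1]
```

Semi-autonomous operation would interleave `policy` lookups with fresh operator inputs, feeding the new pairs back through `record`.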
METHODS AND SYSTEMS FOR ASSIGNING FORCE VECTORS TO ROBOTIC TASKS
A system is disclosed and includes an electronic controller configured to generate a virtual reality representation of an environment. The electronic controller is configured to generate, within the virtual reality representation of the environment, a menu comprising at least one task user interface element, and to determine when an option for configuring a force parameter is selected from the at least one task user interface element in the menu. The electronic controller is configured to prompt a user to configure the force parameter for a virtual robot manipulation task and assign at least one of a force magnitude or a force direction to the virtual robot manipulation task in response to an input received from the prompt to configure the force parameter.
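An illustrative sketch (task shape, field names, and units all assumed) of the assignment step after the user answers the prompt: at least one of a magnitude or a direction is attached to the virtual manipulation task.

```python
def assign_force_parameter(task, magnitude=None, direction=None):
    """Attach at least one of a force magnitude or a force direction to a
    virtual robot manipulation task, per the user's configuration input."""
    if magnitude is None and direction is None:
        raise ValueError("configure at least a magnitude or a direction")
    task["force"] = {}
    if magnitude is not None:
        task["force"]["magnitude_newtons"] = magnitude
    if direction is not None:
        task["force"]["direction"] = direction  # unit vector in task frame
    return task

wipe_task = assign_force_parameter(
    {"name": "wipe_surface"}, magnitude=5.0, direction=(0.0, 0.0, -1.0)
)
```

Keeping magnitude and direction independent matches the "at least one of" language: a task may constrain only how hard to press, only which way, or both.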
Autonomous robot telerobotic interface
An indication of a task to be performed in a network data center is received. A robotic manipulator of an autonomous robot is controlled to autonomously perform at least a portion of the task. It is determined that assistance is required in performing an identified limited portion of the task. A notification of a request for the assistance is provided. Remote assistance from an operator in performing the identified limited portion of the task is received. Autonomous performance of the task is resumed after completion of the remote assistance for the identified limited portion of the task.
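The control flow could be sketched as below, with assumed task and callback shapes: the robot performs steps autonomously, raises a notification when a step needs help, and resumes autonomous performance once the operator completes the identified limited portion.

```python
def perform_task(steps, autonomous_handler, request_assistance):
    """Run each step autonomously when possible; otherwise notify an
    operator, wait for remote assistance, then resume with the next step."""
    log = []
    for step in steps:
        if autonomous_handler(step):
            log.append(("autonomous", step))
        else:
            # Assistance is required for this identified limited portion:
            # notify the operator and hand over just this step.
            request_assistance(step)
            log.append(("assisted", step))
    return log
```

The key property is that assistance is scoped to single steps, so the operator's involvement stays limited and autonomy resumes immediately afterwards.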
System and method for flexible human-machine collaboration
Methods and systems for enabling human-machine collaboration include a generalizable framework that supports dynamic adaptation and reuse of robotic capability representations and human-machine collaborative behaviors. Specifically, a method of feedback-enabled user-robot collaboration includes obtaining a robot capability that models a robot's functionality for performing task actions, specializing the robot capability with an information kernel that encapsulates task-related parameters associated with the task actions, and providing an instance of the specialized robot capability as a robot capability element that controls the robot's functionality based on the task-related parameters. The method also includes obtaining, based on the robot capability element's user interaction requirements, user interaction capability elements via which the robot capability element receives user input and provides user feedback; controlling, based on the task-related parameters, the robot's functionality to perform the task actions in collaboration with the user input; and providing the user feedback, including task-related information generated by the robot capability element in association with the task actions.
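A hypothetical sketch of the capability/kernel pattern described above: a generic robot capability is specialized by an information kernel of task-related parameters, yielding a robot capability element that can be invoked directly. All class and parameter names are assumptions for illustration.

```python
class RobotCapability:
    """Models a robot functionality for performing task actions."""

    def __init__(self, name, action):
        self.name = name
        self.action = action  # callable(parameters) -> task result

def specialize(capability, information_kernel):
    """Bind task-related parameters (the information kernel) to the
    capability, producing a robot capability element: a closure that
    drives the robot's functionality with those parameters."""
    def element():
        return capability.action(information_kernel)
    return element

grasp = RobotCapability("grasp", lambda params: f"grasp with force {params['force']}")
grasp_egg = specialize(grasp, {"force": 0.2})
```

Reuse falls out of the separation: the same `grasp` capability can be specialized with different kernels for different tasks, which is the dynamic adaptation the framework targets.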
Development of control applications in augmented reality environment
A system and method are disclosed for development of a control application for a controller of an automation system. The controller receives sensor signals associated with perception of a first real component during execution of the control application program. Activity of a virtual component, including interaction with the first real component, is simulated, the virtual component being a digital twin of a second real component designed for the work environment but absent from it. Virtual data is produced in response to the simulated activity of the virtual component. A control application module determines parameters for development of the control application program using the sensor signals for the first real component and the virtual data. An AR display signal for the work environment is rendered and displayed based on a digital representation of the virtual data during execution of the control application program.
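A sketch, under assumed signal formats, of the hybrid development loop: each control cycle merges real sensor signals from the first component with virtual data simulated for the absent second component's digital twin, and the combined stream is what parameterizes the control application under development.

```python
def simulate_virtual_component(step):
    """Digital twin of the absent second component: here, an illustrative
    conveyor model advancing 0.1 m per control cycle."""
    return {"conveyor_position_m": round(0.1 * step, 3)}

def development_cycle(real_sensor_signals):
    """Merge per-cycle real sensor signals with the twin's virtual data,
    producing the combined records the control application module uses."""
    combined = []
    for step, real in enumerate(real_sensor_signals):
        virtual = simulate_virtual_component(step)
        combined.append({**real, **virtual})
    return combined

cycles = development_cycle([{"part_detected": False}, {"part_detected": True}])
```

The same merged records could also feed the AR rendering, since the display is based on a digital representation of the virtual data.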