Patent classifications
Y10S901/05
Robot control, training and collaboration in an immersive virtual reality environment
System and methods to create an immersive virtual environment using a virtual reality system that receives parameters corresponding to a real-world robot. The real-world robot may be simulated to create a virtual robot based on the received parameters. The immersive virtual environment may be transmitted to a user. The user may supply input and interact with the virtual robot. Feedback such as the current state of the virtual robot or the real-world robot may be provided to the user. The user may train the virtual robot. The real-world robot may be programmed based on the virtual robot training.
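The training-and-transfer flow described above can be sketched in code: a virtual robot is built from the real robot's parameters, user input is clamped to those parameters and recorded as training, and the recording is exported as a program for the real robot. This is a minimal illustrative sketch; all class and function names are assumptions, not taken from the patent.

```python
# Hypothetical sketch of the claimed flow: simulate a virtual robot from
# real-robot parameters, let a user "train" it, then export the training
# as a program for the real robot. All names are illustrative.
from dataclasses import dataclass, field

@dataclass
class RobotParameters:
    joint_count: int
    joint_limits: list  # (min, max) per joint, in radians

@dataclass
class VirtualRobot:
    params: RobotParameters
    joint_angles: list = field(default_factory=list)
    trained_waypoints: list = field(default_factory=list)

    def __post_init__(self):
        if not self.joint_angles:
            self.joint_angles = [0.0] * self.params.joint_count

    def apply_user_input(self, target_angles):
        """Clamp user-supplied joint targets to the real robot's limits."""
        clamped = [max(lo, min(hi, a))
                   for a, (lo, hi) in zip(target_angles, self.params.joint_limits)]
        self.joint_angles = clamped
        self.trained_waypoints.append(clamped)  # record this training step

    def state_feedback(self):
        """Current virtual-robot state, reported back to the user."""
        return {"joint_angles": self.joint_angles}

def export_program(vr: VirtualRobot):
    """Turn recorded waypoints into commands for the real robot."""
    return [("move_joints", wp) for wp in vr.trained_waypoints]

params = RobotParameters(joint_count=2, joint_limits=[(-1.57, 1.57)] * 2)
robot = VirtualRobot(params)
robot.apply_user_input([0.5, 2.0])  # 2.0 exceeds the joint limit, clamped to 1.57
program = export_program(robot)
```

Clamping user input against the received parameters is one way the virtual robot stays "based on" the real one, so the exported program remains executable on the physical arm.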
Robot and device having multi-axis motion sensor, and method of use thereof
A device including a housing configured to attach to a robot arm, and a multi-axis motion sensor provided within the housing. The multi-axis motion sensor is configured to detect movement of the housing, and is configured to communicate with a controller of the robot arm. The device further includes a user interface configured to operate in conjunction with the multi-axis motion sensor, and a connection port provided on the housing. The connection port is configured to connect to an external device.
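The sensor-to-controller relationship in this abstract can be modeled as follows: the housing-mounted multi-axis sensor detects movement against a threshold and communicates readings to the arm controller. The class names and the movement threshold are assumptions for illustration, not details from the patent.

```python
# Illustrative model of the claimed device: a housing-mounted multi-axis
# motion sensor that detects movement and reports it to the robot-arm
# controller. Names and the threshold value are assumptions.
class ArmController:
    def __init__(self):
        self.received = []  # readings communicated by the sensor

    def on_motion(self, reading):
        self.received.append(reading)

class MultiAxisMotionSensor:
    def __init__(self, controller, threshold=0.05):
        self.controller = controller
        self.threshold = threshold  # minimum per-axis delta counted as movement
        self.last = (0.0, 0.0, 0.0)

    def sample(self, xyz):
        # Detect movement of the housing along any axis.
        moved = any(abs(a - b) > self.threshold for a, b in zip(xyz, self.last))
        self.last = xyz
        if moved:
            self.controller.on_motion(xyz)  # communicate with the arm controller
        return moved

ctrl = ArmController()
sensor = MultiAxisMotionSensor(ctrl)
still = sensor.sample((0.0, 0.0, 0.0))   # no movement detected
moved = sensor.sample((0.2, 0.0, 0.0))   # x-axis movement, reported to controller
```

The user interface and external-device connection port named in the claim would sit alongside this loop; only the motion-detection and communication path is sketched here.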
Apparatus and methods for removal of learned behaviors in robots
Computerized appliances may be operated by users remotely. In one implementation, a learning controller apparatus may be operated to determine an association between a user indication and an action by the appliance. User indications, e.g., gestures, posture changes, or audio signals, may trigger an event associated with the controller. The event may be linked to a plurality of instructions configured to communicate a command to the appliance. The learning apparatus may receive sensory input conveying information about the robot's state and environment (context). The sensory input may be used to determine the user indications. During operation, upon determining the indication from the sensory input, the controller may cause execution of the respective instructions in order to trigger an action by the appliance. Device animation methodology may enable users to operate computerized appliances using gestures, voice commands, posture changes, and/or other customized control elements.
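The learn/trigger/remove cycle described above can be sketched as a small controller that maps recognized indications to appliance instructions, executes them on recognition, and supports removing a learned behavior (the subject of the title). This is a hedged sketch under the assumption that indications arrive as pre-classified labels; every name here is illustrative.

```python
# Minimal sketch of the described learning controller: it learns an
# association between a user indication (e.g. a gesture label derived
# from sensory input) and appliance instructions, executes them when the
# indication is recognized, and can remove a learned behavior.
class Appliance:
    def __init__(self):
        self.log = []  # commands actually executed

    def execute(self, cmd):
        self.log.append(cmd)

class LearningController:
    def __init__(self, appliance):
        self.appliance = appliance
        self.associations = {}  # indication -> list of instructions

    def learn(self, indication, instructions):
        """Associate a user indication with appliance commands."""
        self.associations[indication] = list(instructions)

    def remove_behavior(self, indication):
        """Remove a previously learned behavior."""
        self.associations.pop(indication, None)

    def on_sensory_input(self, indication):
        """On recognizing an indication, trigger the linked instructions."""
        for cmd in self.associations.get(indication, []):
            self.appliance.execute(cmd)

vacuum = Appliance()
ctrl = LearningController(vacuum)
ctrl.learn("wave_gesture", ["start_cleaning"])
ctrl.on_sensory_input("wave_gesture")   # triggers start_cleaning
ctrl.remove_behavior("wave_gesture")
ctrl.on_sensory_input("wave_gesture")   # no longer triggers anything
```

In a real system the `indication` label would come from a classifier over the sensory stream; here that recognition step is assumed away to keep the association logic visible.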