Patent classifications
G05B2219/35444
CONTROL DEVICE, CONTROL METHOD AND COMPUTER-READABLE STORAGE MEDIUM
A control device 1B includes a preprocessor 21B, a translator 22B and an intention detector 23B. The preprocessor 21B is configured to generate movement signals of a target human 10C subjected to assistance by processing a detection signal Sd outputted by a first sensor which senses the target human 10C. The translator 22B is configured to identify a gesture of the target human 10C by use of the movement signals, the gesture being expressed by a pose and/or movement of the target human 10C. The intention detector 23B is configured to detect an intention of the target human 10C based on a history of an event and the identified gesture, the event relating to the assistance.
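The three-stage pipeline in this abstract (preprocessor → translator → intention detector) can be sketched as follows. All class names, signal values and gesture labels are illustrative assumptions, not taken from the patent:

```python
# Minimal sketch of the preprocessor -> translator -> intention detector
# pipeline. The smoothing, the gesture rule and the intention rule are
# invented stand-ins for the unspecified processing in the abstract.

class Preprocessor:
    def process(self, detection_signal):
        # Turn raw sensor samples into movement signals
        # (here: a simple 3-sample moving average).
        n = 3
        return [sum(detection_signal[i:i + n]) / n
                for i in range(len(detection_signal) - n + 1)]

class Translator:
    def identify(self, movement_signals):
        # Map movement signals to a named gesture (toy threshold rule).
        return "raise_hand" if max(movement_signals) > 0.5 else "rest"

class IntentionDetector:
    def detect(self, event_history, gesture):
        # Combine the assistance event history with the identified gesture.
        if gesture == "raise_hand" and "meal_served" in event_history:
            return "wants_to_eat"
        return "no_intention"

gesture = Translator().identify(
    Preprocessor().process([0.1, 0.2, 0.9, 0.8, 0.1]))
intention = IntentionDetector().detect(["meal_served"], gesture)
```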
ROBOT CONTROL USING GESTURES
A method and a device for operating a robot are provided. According to an example of the method, information of a first gesture is acquired from a group of gestures of an operator, each gesture from the group of gestures corresponding to an operation instruction from a group of operation instructions. A first operation instruction from the group of operation instructions is obtained based on the acquired information of the first gesture, the first operation instruction corresponding to the first gesture. The first operation instruction is executed.
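The one-to-one correspondence between gestures and operation instructions described above amounts to a lookup followed by execution. A minimal sketch, with invented gesture and instruction names:

```python
# Each gesture in the group corresponds to one operation instruction;
# the names below are illustrative assumptions, not from the patent.

GESTURE_INSTRUCTIONS = {
    "open_palm":  "stop",
    "thumbs_up":  "resume",
    "point_left": "move_left",
}

def execute(first_gesture: str) -> str:
    # Obtain the operation instruction corresponding to the acquired
    # gesture, then execute it (here: simply return its name).
    instruction = GESTURE_INSTRUCTIONS[first_gesture]
    return instruction

result = execute("open_palm")
```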
SYSTEMS AND METHODS FOR ADAPTIVE GESTURE RECOGNITION
Systems and methods are described for adaptively recognizing gestures indicated by user inputs received from a touchpad, touchscreen, directional pad, mouse or other multi-directional input device. If a user's movement does not indicate a gesture using current gesture recognition parameters, additional processing can be performed to recognize the gesture using other factors. The gesture recognition parameters can then be adapted so that subsequent user inputs that are similar to the previously-rejected inputs will appropriately trigger gesture commands as desired by the user. Gestural data or parameters may be locally or remotely stored for further processing.
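The adaptation loop described above (reject under current parameters → re-check with other factors → relax parameters for similar future inputs) can be illustrated with a toy swipe recognizer. The thresholds and the speed-based secondary check are assumptions for the sketch, not the patented algorithm:

```python
# Sketch of adaptive gesture recognition: a swipe that fails the current
# minimum-distance parameter can still be accepted using a secondary
# factor (speed), and the parameter then adapts so similar inputs
# trigger the gesture directly next time.

def recognize_swipe(distance, speed, params):
    if distance >= params["min_distance"]:
        return True
    # Additional processing with another factor: a fast but slightly
    # short movement is accepted, and the parameter is adapted.
    if speed > params["fast_speed"] and distance >= 0.8 * params["min_distance"]:
        params["min_distance"] = distance  # adapt for future inputs
        return True
    return False

params = {"min_distance": 100.0, "fast_speed": 500.0}
first = recognize_swipe(85.0, 600.0, params)   # rescued by the speed factor
second = recognize_swipe(85.0, 100.0, params)  # passes the adapted threshold
```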
Gestural control of an industrial robot
A robot system is configured to identify gestures performed by an end-user proximate to a work piece. The robot system then determines a set of modifications to be made to the work piece based on the gestures. A projector coupled to the robot system projects images onto the work piece that represent the modifications to be made and/or a CAD model of the work piece. The robot system then performs the modifications.
Brain-computer interface based robotic arm self-assisting system and method
Disclosed are a brain-computer interface based robotic arm self-assisting system and method. The system comprises a sensing layer, a decision-making layer and an execution layer. The sensing layer comprises an electroencephalogram acquisition and detection module and a visual identification and positioning module, and is used for analyzing and identifying the intent of a user and, based on that intent, identifying and locating the positions of a corresponding cup and the user's mouth. The execution layer comprises a robotic arm control module that performs trajectory planning and control for a robotic arm based on an execution instruction received from a decision-making module. The decision-making layer comprises the decision-making module, which is connected to the electroencephalogram acquisition and detection module, the visual identification and positioning module and the robotic arm control module to implement the acquisition and transmission of data on the electroencephalogram signal, the located positions and the robotic arm status, and the sending of the execution instruction for the robotic arm. The system combines visual identification and positioning technology, a brain-computer interface and a robotic arm to enable paralyzed patients to drink water by themselves, improving their quality of life.
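The decision-making layer's role, collecting the EEG-decoded intent, the vision module's located positions and the arm status, then issuing an execution instruction, can be sketched as below. The intent labels, position tuples and message format are assumptions for illustration:

```python
# Toy sketch of the decision-making module: it fuses the three inputs
# named in the abstract and emits an execution instruction for the arm.

def decision_layer(eeg_intent, cup_position, mouth_position, arm_status):
    # Only command the arm when the user intends to drink, both targets
    # have been located, and the arm is idle.
    if (eeg_intent == "drink" and arm_status == "idle"
            and cup_position is not None and mouth_position is not None):
        return {"action": "fetch_and_serve",
                "pick": cup_position,
                "deliver": mouth_position}
    return {"action": "wait"}

instruction = decision_layer("drink", (0.4, 0.1, 0.0), (0.5, 0.3, 0.4), "idle")
```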
METHOD AND OPERATING SYSTEM FOR SETTING UP A MACHINING DEVICE, MACHINING DEVICE, AND COMPUTER PROGRAM FOR SETTING UP A MACHINING DEVICE
The invention relates to a method for setting up a machining device (10), in particular a machining device (10) for machining workpieces (11) which are at least partially made of wood, wood materials, synthetic material, composite materials or the like, comprising the steps of optically detecting the machining device (10) at least in some areas by means of an image capture device (13); detecting, by means of the image capture device (13), an individual first gesture (21) of an operator of the machining device (10) that is performed in the optically detected area; and triggering at least one function of the machining device (10) depending on the detected individual first gesture (21), and to an operating system (12) for setting up a machining device (10), a machining device (10) and a computer program for setting up a machining device (10).
OPERATION SYSTEM FOR INDUSTRIAL MACHINERY
An operation system for industrial machinery comprises: an information acquisition unit which acquires machine identification information corresponding to an industrial machine; a machine identification unit which identifies the industrial machine on the basis of the acquired machine identification information; a model projection unit which projects a model corresponding to the identified industrial machine into a virtual space; a distance/direction calculation unit which calculates the distance and direction of a user observing the model with respect to the projected model; a gesture observation unit which observes the gesture of the user as an instruction from the user to the identified industrial machine; an instruction determination unit which determines whether or not the user can give an instruction; and an operation instruction unit which operates the identified industrial machine on the basis of the observed gesture of the user when the determination result is positive.
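The gating step above, where the observed gesture only operates the machine when the instruction determination is positive, can be sketched as follows. The abstract does not say what the determination criteria are, so a distance limit relative to the projected model is assumed here purely for illustration:

```python
# Sketch of distance/direction calculation plus instruction gating.
# The 2-metre permission radius is an invented stand-in for the
# unspecified determination criteria.
import math

def distance_direction(user_pos, model_pos):
    dx = model_pos[0] - user_pos[0]
    dy = model_pos[1] - user_pos[1]
    return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx))

def operate(machine_id, gesture, user_pos, model_pos):
    dist, _direction = distance_direction(user_pos, model_pos)
    # Instruction determination: positive only within the radius.
    if dist <= 2.0:
        return f"{machine_id}:{gesture}"
    return None

cmd = operate("press_01", "start", (0.0, 0.0), (1.0, 1.0))
```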
SYSTEM AND METHOD FOR FACTORY AUTOMATION AND ENHANCING MACHINE CAPABILITIES
The present invention relates to a system and method for factory automation and for enhancing machine capabilities. The system is configured to extract data from a machine through a data capturing module, a data extraction module and a data conversion module, to jointly analyze the extracted data with data from other equipment and sensors through a data analysis engine, and to transmit the result to factory systems in a user-configured protocol through a protocol conversion module. It also accepts machine control commands from the factory systems and executes them through command processor and mouse & keyboard simulator modules. It involves combining GUIs from multiple pieces of equipment/sensors and sending them to a single display device through GUI manager and video output modules, and mapping required mouse-click and keyboard-entry actions from new to old user interfaces. The GUI manager module detects GUI elements from captured images and applies the user configuration to automatically
FACILITY MANAGEMENT SYSTEM
A facility management system, FMS (1), and method for planning and/or controlling a facility (2), in particular a fabrication facility, including a plurality of components, C. The facility management system, FMS (1), includes at least one apparatus (3) adapted to load component data object cubes, CDOCs, from a data cube library, DCL, stored in a database (4) of the facility management system (1), and to link the component data object cubes, CDOCs, to the component data objects, CDOs, representing components, C, of the facility (2). The loaded component data object cubes, CDOCs, linked to the component data objects, CDOs, include parameters, P, of the respective components, C, and are editable in a data edit mode, DEM, of said facility management system (1).
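The load-and-link relationship between CDOCs, the data cube library and the CDOs can be sketched with plain dictionaries. The component type, parameter names and values are invented for the example:

```python
# Rough sketch: load a component data object cube (CDOC) from the data
# cube library (DCL) and link it to the component data object (CDO)
# representing a facility component. All data here is illustrative.

data_cube_library = {
    "pump": {"max_flow_l_min": 120, "power_kw": 4.0},
}

components = [{"id": "C1", "type": "pump", "cdoc": None}]

def link_cdocs(cdos, dcl):
    for cdo in cdos:
        # Load the CDOC for this component type and link it to the CDO;
        # its parameters then become editable in the data edit mode.
        cdo["cdoc"] = dict(dcl[cdo["type"]])
    return cdos

linked = link_cdocs(components, data_cube_library)
```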
NUMERICAL CONTROLLER
A numerical controller for controlling a processing machine makes it possible to manipulate the machine just as an operator intends, without visually observing a screen. The numerical controller is provided with: a touch-type pointing device capable of detecting a plurality of touch manipulations performed at the same time; a manipulation analyzing portion which analyzes and extracts, from among the manipulations detected by the touch-type pointing device, a first manipulation which is a touch manipulation by at least one touch and a second manipulation which is performed while the touch state of the first manipulation is maintained; and an operation deciding portion which decides the function to be operated based on the first manipulation and the second manipulation, and gives an instruction to perform the operation.
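The two-phase scheme above, a held first touch qualified by a second manipulation performed while that touch is maintained, reduces to a lookup keyed on the pair. The manipulation names and the function table are invented for the sketch:

```python
# Sketch of the operation-deciding portion: the (first manipulation,
# second manipulation) pair selects the function, and the second
# manipulation only counts while the first touch state is maintained.

FUNCTION_TABLE = {
    ("one_finger_hold", "swipe_up"):   "jog_plus",
    ("one_finger_hold", "swipe_down"): "jog_minus",
    ("two_finger_hold", "swipe_up"):   "spindle_faster",
}

def decide_operation(first_manipulation, second_manipulation, touch_held):
    if not touch_held:
        # The first touch was released, so no second manipulation applies.
        return None
    return FUNCTION_TABLE.get((first_manipulation, second_manipulation))

op = decide_operation("one_finger_hold", "swipe_up", touch_held=True)
```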