Patent classifications
G05B2219/35444
SYSTEM AND METHOD UTILIZING AN INTEGRATED CAMERA WITH A FLUID INJECTOR
A fluid injector system configured for use in administering at least one fluid to a patient, the fluid injector system including at least one image capture device configured for capturing image data in an environment surrounding the fluid injector system; and a control device comprising at least one processor programmed or configured to receive, with the at least one processor, the image data captured by the at least one image capture device; determine, with the at least one processor, whether the received image data comprises at least one predetermined characteristic; and perform, with the at least one processor, at least one action in response to determining whether the received image data comprises at least one predetermined characteristic.
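The receive/determine/perform loop in this abstract could be sketched as follows. All names here (`ImageData`, `has_predetermined_characteristic`, the motion-based rule, the action strings) are hypothetical illustrations, not terms from the patent:

```python
from dataclasses import dataclass

# Hypothetical stand-in for image data captured around the injector.
@dataclass
class ImageData:
    brightness: float
    motion_detected: bool

def has_predetermined_characteristic(img: ImageData) -> bool:
    # Example rule: treat detected motion near the injector as the
    # predetermined characteristic. A real system would run image analysis.
    return img.motion_detected

def perform_action(img: ImageData) -> str:
    # Perform an action depending on whether the characteristic is present.
    return "pause_injection" if has_predetermined_characteristic(img) else "continue"
```

The key structural point is the three-stage pipeline: acquire image data, test it against a predetermined characteristic, and branch the action on the result.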
Systems and methods for adaptive gesture recognition
Systems and methods are described for adaptively recognizing gestures indicated by user inputs received from a touchpad, touchscreen, directional pad, mouse or other multi-directional input device. If a user's movement does not indicate a gesture using current gesture recognition parameters, additional processing can be performed to recognize the gesture using other factors. The gesture recognition parameters can then be adapted so that subsequent user inputs that are similar to the previously-rejected inputs will appropriately trigger gesture commands as desired by the user. Gestural data or parameters may be locally or remotely stored for further processing.
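The adaptation loop described above (reject under current parameters, re-check with additional factors, then relax the parameters so similar inputs succeed next time) could be sketched like this. The class, the single distance threshold, and the `secondary_evidence` flag are all hypothetical simplifications:

```python
class AdaptiveGestureRecognizer:
    """Illustrative sketch of adaptive gesture recognition; names are hypothetical."""

    def __init__(self, min_distance: float = 50.0):
        # Current gesture recognition parameter: minimum swipe distance.
        self.min_distance = min_distance

    def recognize(self, distance: float, secondary_evidence: bool = False) -> bool:
        if distance >= self.min_distance:
            return True  # gesture recognized under current parameters
        if secondary_evidence:
            # Additional processing succeeded, so relax the parameter: similar
            # future inputs will now trigger the gesture command directly.
            self.min_distance = min(self.min_distance, distance)
            return True
        return False
```

A previously rejected 30-pixel swipe that passes the secondary check lowers the threshold, so a later 35-pixel swipe is accepted immediately.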
METHOD, SYSTEM AND NON-TRANSITORY COMPUTER-READABLE RECORDING MEDIUM FOR SUPPORTING OBJECT CONTROL
A method of assisting object control is provided. The method includes: determining an instruction vector with reference to at least one of whether or not a trigger event relating to a movement of a control means is generated and a distance between a motion coordinate of the control means and a control object region; and determining, when a position of the control means is changed, a control position in the control object region with reference to a vector that connects a virtual reference point, specified based on an extension line of the instruction vector before the position of the control means is changed, and the motion coordinate of the control means after the position of the control means is changed.
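The geometric core of this claim (cast a ray from a virtual reference point through the new motion coordinate, and intersect it with the control object region) could be sketched in 2-D. The function name, the choice of a horizontal line `y == plane_y` as the control object region, and the coordinate values are all hypothetical:

```python
def control_position(ref_point, new_coord, plane_y=0.0):
    """Project the ray from the virtual reference point through the new motion
    coordinate onto the control object region (here modeled as the line y == plane_y)."""
    rx, ry = ref_point
    nx, ny = new_coord
    dx, dy = nx - rx, ny - ry
    if dy == 0:
        raise ValueError("ray is parallel to the control plane")
    t = (plane_y - ry) / dy        # parameter where the ray meets the plane
    return (rx + t * dx, ry + t * dy)
```

For a reference point at (0, 2) and a new motion coordinate at (1, 1), the ray meets the region at (2, 0): the control position moves twice as far as the hand did, which is the kind of indirect mapping the claim describes.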
ROBOT CONTROL SYSTEM
A posture estimating section acquires image data in which images of a person are recorded, to estimate a posture of the person. A motion control section controls the motion of a robot device on the basis of an estimation result of the posture estimating section. A synchronization control section synchronizes a posture of the robot device with the posture of the person estimated by the posture estimating section. A correction processing section corrects the synchronized motion of the robot device made by the synchronization control section.
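The estimate/synchronize/correct pipeline in this abstract could be sketched as three small stages. All names, the joint-angle representation, and the clamping rule are hypothetical stand-ins for the patent's sections:

```python
# Illustrative pipeline (names hypothetical): estimate, synchronize, correct.
def estimate_posture(image_data: dict) -> dict:
    # Stand-in for the posture estimating section: returns joint angles in degrees.
    return {"elbow": image_data.get("elbow_angle", 90.0)}

def synchronize(robot_joints: dict, person_posture: dict) -> dict:
    # Synchronization control section: mirror the person's posture on the robot.
    robot_joints.update(person_posture)
    return robot_joints

def correct(robot_joints: dict, limits=(0.0, 150.0)) -> dict:
    # Correction processing section: clamp synchronized joints to mechanical limits.
    lo, hi = limits
    return {joint: min(max(angle, lo), hi) for joint, angle in robot_joints.items()}
```

A person's 170° elbow bend is mirrored onto the robot but corrected down to the robot's 150° limit, illustrating why the correction stage follows synchronization.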
REDUNDANT TOUCHLESS INPUTS FOR AUTOMATION SYSTEM
A human machine interface for an industrial automation control system includes at least one touchless input device that is adapted to be in a first state in which said human machine interface provides a first input to said industrial automation control system or a second state in which said human machine interface provides a second input to said industrial automation control system. The at least one touchless input device includes first and second touchless input sensors each configured to detect hand gestures of an operator's hand to provide input to said human machine interface based upon said gestures. The first and second touchless input sensors can be identical with respect to each other or different. In one example, both sensors are time-of-flight sensors; in another, one of the sensors is an electric field proximity sensor. A method of providing a human machine interface with at least one touchless input device is provided. In one embodiment, the touchless input device provides an emergency stop (Estop) switch device.
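The two-state, two-sensor redundancy could be sketched as follows. The OR-combination (either sensor asserting a stop gesture trips the E-stop) is one plausible reading of the redundancy; the function name and state strings are hypothetical:

```python
# Hypothetical redundancy logic: either touchless sensor detecting the stop
# gesture switches the interface from the run state to the E-stop state.
def hmi_state(sensor_a_stop: bool, sensor_b_stop: bool) -> str:
    return "ESTOP" if (sensor_a_stop or sensor_b_stop) else "RUN"
```

Using two independent sensors (possibly of different technologies, such as time-of-flight plus electric field proximity) means a single sensor failure or occlusion does not prevent the emergency stop from triggering.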
GESTURES AND TOUCH IN OPERATOR INTERFACE
Techniques for adjusting process variables in a process plant using gesture-based input on a user interface device include presenting a graphic representation associated with a process plant entity in the process plant and an indication of a process variable value corresponding to the process plant entity as measured in the actual process plant, and providing a user control for receiving gesture-based input at a location on the user interface device corresponding to the graphic representation associated with the process plant entity to adjust the process variable value. In response to receiving gesture-based input from an operator to adjust the process variable value for the corresponding process plant entity, the user interface device presents an adjusted process variable value and sets the process variable in the actual process plant to the adjusted process variable value.
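The gesture-to-setpoint flow could be sketched as below. The swipe-to-delta mapping, the gain, and the class name are all hypothetical choices, not details from the patent:

```python
# Hypothetical gesture-to-setpoint mapping: a vertical swipe on the graphic of
# a plant entity nudges its process variable setpoint.
def adjust_setpoint(current: float, swipe_pixels: int, gain: float = 0.1) -> float:
    return current + gain * swipe_pixels

class ProcessPlantEntity:
    def __init__(self, setpoint: float):
        self.setpoint = setpoint

    def on_gesture(self, swipe_pixels: int) -> float:
        # Present the adjusted value and write it back to the actual plant.
        self.setpoint = adjust_setpoint(self.setpoint, swipe_pixels)
        return self.setpoint
```

The design point is that the same gesture handler both updates the on-screen indication and writes the adjusted value back to the plant, keeping the display and the process consistent.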
DETERMINING AND EVALUATING DATA REPRESENTING AN ACTION TO BE PERFORMED BY A ROBOT
In one embodiment, a processor accesses sensor input data received from one or more sensors. The sensor input data represents one or more gestures. The processor determines, based on the sensor input data representing the one or more gestures, action data representing an action to be performed by a robot. The action includes physical movements of the robot. The processor evaluates the action data representing the action to be performed by the robot in light of evaluation data.
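The determine-then-evaluate structure could be sketched as a lookup followed by a safety check. The gesture names, action dictionaries, and speed-limit rule are all hypothetical examples of "action data" and "evaluation data":

```python
# Illustrative sketch: map a recognized gesture to robot action data, then
# evaluate that data against limits before execution (all names hypothetical).
GESTURE_TO_ACTION = {
    "wave":  {"move": "arm_up", "speed": 0.2},
    "point": {"move": "reach",  "speed": 0.8},
}

def determine_action(gesture: str) -> dict:
    # Action data representing physical movements of the robot.
    return GESTURE_TO_ACTION.get(gesture, {"move": "hold", "speed": 0.0})

def evaluate(action: dict, max_speed: float = 0.5) -> bool:
    # Evaluation data here is a simple speed limit; real systems would check
    # workspace bounds, collision risk, and similar constraints.
    return action["speed"] <= max_speed
```

Separating determination from evaluation lets the same gesture mapping be reused under different evaluation data (for example, stricter limits when a person is nearby).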
Robot system and method for controlling the same
Provided is a robot system. The robot system includes a manipulator configured to perform a preset operation on a plurality of objects, a transparent cover configured to define a chamber in which the plurality of objects and the manipulator are accommodated, the transparent cover being provided with a touch panel, a camera installed to face an internal region of the chamber, a projector configured to emit light to one area within the chamber, and a controller configured to control the projector so that the projector emits the light to a target area corresponding to a touch point of the touch panel, recognize a target object disposed in the target area based on image information of the camera, and control the manipulator so that an operation is performed on the target object.
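The touch-to-projector-to-manipulator control flow could be sketched as below. The grid-cell mapping of the touch point, the string commands, and the dictionary standing in for camera-based recognition are all hypothetical:

```python
# Hypothetical control flow for the touch-driven robot cell described above.
def touch_to_target_area(touch_xy, cell_size=10.0):
    # Map a touch point on the transparent cover to a grid cell in the chamber.
    x, y = touch_xy
    return (int(x // cell_size), int(y // cell_size))

def run_cycle(touch_xy, objects_by_cell):
    area = touch_to_target_area(touch_xy)
    illuminate = f"project_to:{area}"       # projector highlights the target area
    target = objects_by_cell.get(area)      # stand-in for camera-based recognition
    command = f"pick:{target}" if target else "idle"
    return illuminate, command
```

The ordering matters: the projector marks the area first so the operator sees which region was selected, then recognition and manipulation operate only within that area.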
Method and Device for Creating a Robot Control Program
The present disclosure relates to a method for creating a robot control program for operating a machine tool, in particular a bending machine, having the steps of generating image material of a machining operation of a workpiece on the machine tool by means of at least one optical sensor; extracting at least one part of the workpiece and/or at least one part of a hand of an operator handling the workpiece from the image material; generating a trajectory and/or a sequence of movement points of at least one part of the workpiece and/or at least one part of a hand of an operator from the extracted image material; and creating a robot control program by reverse transformation of the trajectory and/or the sequence of movement points.
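The final step (reverse transformation of an extracted trajectory into a robot control program) could be sketched as a coordinate transform applied point-by-point. The scale/offset model and the `MOVE_LINEAR` command name are hypothetical simplifications of a real camera-to-robot calibration:

```python
# Hypothetical reverse transformation: turn an image-space trajectory extracted
# from the optical sensor into robot movement commands.
def image_to_robot(point, scale=0.01, offset=(0.0, 0.0)):
    # Map pixel coordinates to robot workspace coordinates (meters).
    px, py = point
    return (px * scale + offset[0], py * scale + offset[1])

def create_program(image_trajectory):
    # One movement command per extracted trajectory point.
    return [("MOVE_LINEAR", image_to_robot(p)) for p in image_trajectory]
```

In practice the transform would come from a full camera calibration rather than a fixed scale, but the structure (trajectory in, command sequence out) is the same.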
Systems and methods for optical performance captured animated figure with real-time reactive projected media
A reactive media system includes a motion control system having an animated figure with a body and actuators configured to adjust a figure portion of the body in response to interactive data received from one or more interactive data sources. The reactive media system includes a media control system having a tracking camera configured to generate signals indicative of a current position and orientation of the figure portion based on a set of trackers coupled to the figure portion. The media control system includes a media controller configured to receive the signals, determine the current position and orientation based on the signals, and generate data indicative of images to be projected onto an external surface of the figure portion having the current position and orientation. The media control system also includes a projector configured to receive the data from the media controller and project the images onto the external surface.
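The media-control loop (trackers give a pose, the controller shifts the projected imagery to follow the figure) could be sketched as below. Using the tracker centroid as the pose and a simple displacement offset are hypothetical simplifications of full 6-DOF pose estimation and projection mapping:

```python
# Illustrative sketch of the media-control loop: trackers yield a pose, and the
# controller offsets the projected frame so imagery stays locked to the figure.
def pose_from_trackers(trackers):
    # Centroid of tracker positions as a stand-in for full pose estimation.
    n = len(trackers)
    return (sum(t[0] for t in trackers) / n, sum(t[1] for t in trackers) / n)

def projection_offset(pose, nominal=(0.0, 0.0)):
    # Shift the projected image by the figure's displacement from its nominal pose.
    return (pose[0] - nominal[0], pose[1] - nominal[1])
```

Running this per frame is what makes the projection "real-time reactive": as the actuators move the figure portion, the projected images track it rather than smearing across the moving surface.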