Patent classifications
G05B19/423
Method for controlling an industrial robot during lead-through programming of the robot and an industrial robot
An industrial robot having a manipulator and a robot controller configured to control the motions of the manipulator. During lead-through programming of the robot, the robot controller is configured to compare a robot position or robot orientation (TCP) with at least one virtual position or virtual orientation defined in space, and to actively control the motions of the robot in relation to the at least one virtual position or virtual orientation when the difference between the robot position or robot orientation and the at least one virtual position or virtual orientation is smaller than an offset value.
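The core comparison described in this abstract can be sketched as follows. This is a minimal illustration, not the patented controller: the function name, the Euclidean distance metric, and the simple snap-to-point behavior are all assumptions.

```python
import math

def constrain_to_virtual(tcp, virtual, offset):
    """Return the position the controller should command during lead-through.

    If the TCP is closer to the virtual position than `offset`, actively
    constrain the motion to the virtual position; otherwise pass the
    hand-guided position through unchanged.
    """
    if math.dist(tcp, virtual) < offset:
        return virtual  # actively control toward the virtual position
    return tcp          # free lead-through motion

# TCP 5 mm from a virtual point with a 10 mm offset: snapped to the point.
snapped = constrain_to_virtual((0.005, 0.0, 0.0), (0.0, 0.0, 0.0), 0.010)
# TCP 50 mm away: left untouched.
free = constrain_to_virtual((0.050, 0.0, 0.0), (0.0, 0.0, 0.0), 0.010)
```

A real implementation would apply the same test per axis or per orientation angle, but the threshold logic is the same.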
Robot Teaching System Based On Image Segmentation And Surface Electromyography And Robot Teaching Method Thereof
The present invention relates to a robot teaching system based on image segmentation and surface electromyography, and a robot teaching method thereof, comprising an RGB-D camera, a surface electromyography sensor, a robot, and a computer. The RGB-D camera collects video information of the robot teaching scene and sends it to the computer; the surface electromyography sensor acquires surface electromyography signals and inertial acceleration signals from the robot teacher and sends them to the computer; the computer recognizes an articulated arm and a human joint, detects the contact position between the articulated arm and the human joint, calculates the strength and direction of the force applied at the contact position after the human joint contacts the articulated arm, and sends a signal controlling the contacted articulated arm to move according to that force strength and direction, whereby the robot teaching is accomplished.
METHOD, SYSTEM AND NONVOLATILE STORAGE MEDIUM
Disclosed herein is a method, system, and non-volatile storage medium for simplifying the automation of a process flow. The method may include determining a machine-independent process model based on data representing a handling of a work tool for performing a process flow. The process flow may include a plurality of sub-processes and the process model may link a process activity with spatial information for each sub-process. The method may also include mapping the machine-independent process model to a machine-specific control model of a machine using a model of the machine. The machine-specific control model may define an operating point of the machine for each sub-process, and the operating point may correspond to the process activity and to the spatial information.
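The two-stage mapping this abstract describes can be sketched as data transformations. All names here are illustrative assumptions; the toy "machine model" below is a stand-in for whatever kinematic model a real machine would supply.

```python
def build_process_model(recorded_steps):
    """Machine-independent model: one (activity, spatial info) entry per sub-process."""
    return [{"activity": s["activity"], "pose": s["pose"]} for s in recorded_steps]

def map_to_control_model(process_model, machine_model):
    """Machine-specific control model: an operating point per sub-process,
    produced by applying the machine's own model to the spatial information."""
    return [
        {
            "operating_point": machine_model["to_operating_point"](entry["pose"]),
            "activity": entry["activity"],
        }
        for entry in process_model
    ]

# Toy machine model: the "operating point" is just the pose scaled by 2.
machine = {"to_operating_point": lambda pose: tuple(2 * c for c in pose)}
steps = [{"activity": "grip", "pose": (1.0, 0.0, 0.5)}]
control = map_to_control_model(build_process_model(steps), machine)
```

The point of the split is that `build_process_model` never sees the machine: the same process model can be mapped onto different machines by swapping the machine model.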
Orientation Angle Display During the Manual Guidance of a Robot Manipulator
A robot system with a robot manipulator and with a visual output unit, wherein the robot manipulator includes a robot link and the robot link includes an inertial measuring unit, wherein the inertial measuring unit is designed to determine a direction of a gravity vector when the robot link is immobile, and to determine, over a plurality of points in time, a current orientation of the robot link in relation to the gravity vector using attitude gyros, and to transmit, to the visual output unit, the current orientation of the robot link in relation to the gravity vector, and wherein the visual output unit is designed to display the current orientation of the robot link in relation to the gravity vector.
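The quantity the visual output unit displays is the orientation of the link relative to the measured gravity vector. A minimal sketch of that computation, assuming the standard angle-between-vectors formula (the vector names are illustrative):

```python
import math

def angle_to_gravity(link_axis, gravity):
    """Angle in degrees between a link's axis and the measured gravity vector."""
    dot = sum(a * g for a, g in zip(link_axis, gravity))
    na = math.sqrt(sum(a * a for a in link_axis))
    ng = math.sqrt(sum(g * g for g in gravity))
    # Clamp against floating-point overshoot before acos.
    cos_angle = max(-1.0, min(1.0, dot / (na * ng)))
    return math.degrees(math.acos(cos_angle))

# A link aligned with gravity reads close to 0 degrees;
# a horizontal link reads close to 90 degrees.
vertical = angle_to_gravity((0.0, 0.0, -1.0), (0.0, 0.0, -9.81))
horizontal = angle_to_gravity((1.0, 0.0, 0.0), (0.0, 0.0, -9.81))
```

The IMU in the abstract supplies the gravity direction while the link is immobile and tracks the link axis over time with attitude gyros; this sketch only covers the final angle computation.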
Control device and robot system
In teaching of a robot, a control device controls a movable unit in a first control mode, in which the movable unit continuously moves according to a force detected by a force detector, and a second control mode, in which the movable unit moves by a predetermined movement amount according to the force detected by the force detector. A controller selects the first control mode or the second control mode according to a temporal change in the force detected by the force detector and the magnitude of the force.
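The selection rule described above can be sketched with a simple threshold test. The thresholds and the specific rule below are assumptions for illustration, not the patented logic:

```python
def select_control_mode(force_now, force_prev, dt,
                        rate_threshold=5.0, magnitude_threshold=2.0):
    """Pick 'continuous' or 'step' from the force magnitude and its rate of change.

    A steady, firm push selects the continuous mode (follow the hand);
    a brief or light touch selects the step mode (move a fixed increment).
    """
    rate = abs(force_now - force_prev) / dt  # temporal change of the force [N/s]
    if force_now > magnitude_threshold and rate < rate_threshold:
        return "continuous"
    return "step"

steady_push = select_control_mode(4.0, 3.9, 0.1)   # large, slowly changing force
quick_tap = select_control_mode(1.0, 0.0, 0.1)     # small, rapidly changing force
```

Using both the magnitude and the temporal change, as the abstract specifies, lets the operator switch modes by how they touch the robot rather than via a separate switch.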
FEEDBACK CONTINUOUS POSITIONING CONTROL OF END-EFFECTORS
A positioning controller (50) including an imaging predictive model (80) and an inverse control predictive model (70). In operation, the controller (50) applies the imaging predictive model (80) to imaging data generated by an imaging device (40) to render a predicted navigated pose of the imaging device (40), and applies the control predictive model (70) to positioning-error data, derived from the difference between a target pose of the imaging device (40) and the predicted navigated pose of the imaging device (40), to render a predicted corrective positioning motion of the imaging device (40) (or of a portion of the interventional device associated with the imaging device) to the target pose. From these predictions, the controller (50) further generates continuous positioning commands controlling a corrective positioning, by the interventional device (30), of the imaging device (40) (or said portion of the interventional device) to the target pose based on the predicted corrective positioning motion.
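One iteration of the feedback loop described above can be sketched as: predict the current pose from imaging data, form the error to the target pose, and map that error to a corrective motion. Both "models" below are stand-in functions; the controller in the abstract would use trained predictive models in their place.

```python
def positioning_step(imaging_data, target_pose, imaging_model, control_model):
    """One feedback iteration: imaging model -> pose error -> corrective motion."""
    predicted_pose = imaging_model(imaging_data)              # predicted navigated pose
    error = [t - p for t, p in zip(target_pose, predicted_pose)]
    return control_model(error)                               # predicted corrective motion

# Stand-ins (assumptions): the imaging model passes its input through,
# and the control model applies a proportional gain to the pose error.
imaging_model = lambda data: data
control_model = lambda err: [0.5 * e for e in err]

command = positioning_step([1.0, 2.0], [3.0, 2.0], imaging_model, control_model)
```

Running this step repeatedly, with each command moving the device and fresh imaging data closing the loop, gives the continuous corrective positioning the abstract describes.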
DEVICE FOR AUTOMATICALLY DETECTING COUPLING BETWEEN ELECTRONIC DEVICES
A method for automatically detecting a sensor coupled to an electronic computer including steps of detecting the sensor and steps of configuring a hardware interface.
Systems and Hybrid Position Force Control Processes of an Industrial Robot
The present process of controlling an industrial robot includes steps consisting of: calculating, in modules implemented by the central unit, a time-dependent composite setpoint defining articular forces and velocities, according to a target trajectory and to an operating mode; calculating, in modules implemented by the central unit, a behavior matrix which describes a desired behavior of the robot arm, defining the directions along which the calculated composite setpoint is to be applied; calculating, in a module implemented by the auxiliary unit, an articular force setpoint for controlling the axis controller module; and calculating, in the axis controller module implemented by the auxiliary unit, the control setpoints for the power units according to the articular force setpoint.
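The role of the behavior matrix can be illustrated with a diagonal selection matrix, as in classical hybrid position/force control: each diagonal entry chooses whether the force or the velocity component of the composite setpoint is applied along that axis. The values and the simple per-axis blend below are assumptions for illustration:

```python
def articular_force_setpoint(behavior_diag, force_setpoint, velocity_setpoint, gain=1.0):
    """Blend force and velocity components per axis into an articular force command.

    behavior_diag[i] = 1.0 -> axis i is force-controlled,
    behavior_diag[i] = 0.0 -> axis i is velocity-controlled
    (intermediate values blend the two).
    """
    return [
        b * f + (1.0 - b) * gain * v
        for b, f, v in zip(behavior_diag, force_setpoint, velocity_setpoint)
    ]

# Axis 0 force-controlled, axis 1 velocity-controlled.
cmd = articular_force_setpoint([1.0, 0.0], [10.0, 10.0], [0.2, 0.2])
```

In the process above this computation would sit in the auxiliary unit's module, consuming the composite setpoint and behavior matrix calculated by the central unit and feeding the axis controller module.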