Patent classifications
Y10S901/05
Apparatus and methods for remotely controlling robotic devices
Computerized appliances may be operated by users remotely. A learning controller apparatus may be operated to determine an association between a user indication and an action by the appliance. The user indications, e.g., gestures, posture changes, or audio signals, may trigger an event associated with the controller. The event may be linked to a plurality of instructions configured to communicate a command to the appliance. The learning apparatus may receive sensory input conveying information about the robot's state and environment (context). The sensory input may be used to determine the user indications. During operation, upon determining the indication from the sensory input, the controller may cause execution of the respective instructions in order to trigger an action by the appliance. Device animation methodology may enable users to operate computerized appliances using gestures, voice commands, posture changes, and/or other customized control elements.
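The indication→event→instruction chain described above can be sketched as a simple lookup; the mapping and the names `COMMANDS` and `handle_indication` are hypothetical illustrations, not taken from the patent:

```python
# Hypothetical mapping from recognized user indications (gestures, audio
# cues, posture changes) to the instructions linked to each event.
COMMANDS = {
    "wave": ["power_on"],
    "point_left": ["rotate", "left"],
}

def handle_indication(indication: str) -> list:
    """Return the instruction list linked to a recognized indication,
    or an empty list when no event matches."""
    return COMMANDS.get(indication, [])
```

In a full learning controller, the keys of such a table would be produced by a classifier over the sensory input rather than given as strings.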
Calibration and programming of robots
Methods include calibrating robots without the use of external measurement equipment and copying working programs between un-calibrated robots. Both methods exploit the properties of a closed chain and the relative positions of the links in the chain in order to update the kinematic models of the robots.
Robot system and method for manufacturing component
After a forward end of a workpiece is inserted into a through-hole and fitting is started, a follow operation of moving the workpiece to follow the shape of the through-hole is performed during the movement of the workpiece in a fitting direction. At this time, the workpiece is fitted into the through-hole while a control point of a robot is changed in a direction opposite to the fitting direction according to the amount of movement of the workpiece in the fitting direction.
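The compensation described above, shifting the control point opposite to the fitting direction by an amount tied to the workpiece's travel, can be sketched in one dimension along the fitting axis; the proportional `gain` parameter is an assumption for illustration:

```python
def control_point_offset(start: float, moved: float, gain: float = 1.0) -> float:
    """Position of the robot control point along the fitting axis.

    The control point is shifted in the direction opposite to the fitting
    direction, in proportion to how far the workpiece has already moved
    into the through-hole (1-D sketch; `gain` is an assumed parameter).
    """
    return start - gain * moved
```

With a unit gain, a workpiece that has advanced 5 mm into the hole pulls the control point back 5 mm, which keeps the compliant follow operation centered on the contact region.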
ROBOT CONTROL, TRAINING AND COLLABORATION IN AN IMMERSIVE VIRTUAL REALITY ENVIRONMENT
System and methods to create an immersive virtual environment using a virtual reality system that receives parameters corresponding to a real-world robot. The real-world robot may be simulated to create a virtual robot based on the received parameters. The immersive virtual environment may be transmitted to a user. The user may supply input and interact with the virtual robot. Feedback such as the current state of the virtual robot or the real-world robot may be provided to the user. The user may train the virtual robot. The real-world robot may be programmed based on the virtual robot training.
Horizontal articulated robot
A horizontal articulated robot includes a support unit, a movable unit provided in the support unit, from which an end effector is detachable, and a control unit that controls the movable unit, wherein the control unit is provided in the support unit and the end effector is connected to the control unit.
Method for programming robot in vision base coordinate
A method for programming a robot in a vision base coordinate system is provided. The method includes the following steps. The robot is led to an operation point. The coordinates of the operation point in a photo operation are set as a new point. A teaching image is captured and a vision base coordinate system is established. A new point is added according to the newly established vision base coordinate system. During operation, the robot is controlled to capture an image from the photo operation point. The captured image is compared with the teaching image, and the region matching the teaching image is searched for according to the comparison result. Whether the vision base coordinate system maintains the same correspondence as in the teaching process is then checked. Thus, the robot can be precisely controlled.
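The correspondence between points taught in the vision base coordinate system and robot coordinates can be illustrated with a 2-D rigid transform; the frame origin and rotation angle are assumed here to have been recovered by matching the captured image against the teaching image (names are illustrative, not from the patent):

```python
import math

def vision_to_robot(point, origin, theta_deg):
    """Map a point taught in the vision base coordinate system into robot
    coordinates, given the frame origin and rotation recovered from the
    image comparison (illustrative 2-D rigid transform)."""
    th = math.radians(theta_deg)
    x, y = point
    return (origin[0] + x * math.cos(th) - y * math.sin(th),
            origin[1] + x * math.sin(th) + y * math.cos(th))
```

If the transform computed at run time differs from the one established during teaching, the correspondence check fails and the taught points must not be reused as-is.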
Teaching apparatus used for operation of industrial robot
A teaching apparatus that is connected to a control apparatus controlling a welding robot and outputs commands intended for the welding robot to the control apparatus is provided. The teaching apparatus includes a welding-parameter-manipulating portion provided on a front face and accepting an operation of changing a setting that defines a movement of the welding robot, an enabling switch provided on a back face and accepting an operation of changing a state of electrification of the welding robot, and a robot-manipulating portion provided on the back face and accepting an operation of executing the movement of the welding robot. The welding-parameter-manipulating portion is positioned on the left side, in front view, with respect to the center, while the enabling switch and the robot-manipulating portion are positioned on the right side, in rear view, with respect to the center.
Inspection of drilled features in objects
Disclosed is a method and apparatus for determining a depth of a feature (4) formed in an object (2), the feature (4) having been formed in the object (2) by a cutting tool (38). The apparatus comprises: a camera (42) configured to capture an image of the feature (4) and a portion of the object (2) proximate to the feature (4); and one or more processors operatively coupled to the camera (42) and configured to: detect, in the image, an edge (72) of the feature (4) between the feature (4) and a surface of the object (2); using the detected edge (72), calculate a diameter for a circle (74, 76, 78); acquire a point angle of the cutting tool (38); and, using the calculated diameter and the acquired point angle, calculate a depth value for the feature (4).
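The depth calculation can be illustrated with the cone geometry of a drill tip: an entry circle of diameter D left by a tool with point angle θ corresponds to a depth of (D/2)/tan(θ/2). This is an illustrative reconstruction of the geometry, not necessarily the exact computation claimed in the patent:

```python
import math

def feature_depth(diameter: float, point_angle_deg: float) -> float:
    """Depth of the conical impression left by a drill tip, inferred from
    the measured entry-circle diameter and the tool's point angle.

    Illustrative geometry: at depth h from the tip, the cone radius is
    h * tan(theta/2), so h = (D/2) / tan(theta/2).
    """
    half_angle = math.radians(point_angle_deg) / 2.0
    return (diameter / 2.0) / math.tan(half_angle)
```

For a common 118° twist drill and a measured 6 mm circle, this gives roughly 1.8 mm of tip depth, which is the kind of value the camera-based inspection would report.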