Patent classifications
G05B2219/40425
METHOD AND ELECTRONIC DEVICE, SYSTEM AND COMPUTER READABLE MEDIUM FOR TIME CALIBRATION
Devices, systems, and methods for time calibration. The method comprises determining a first reference position of a robot in a robot coordinate system based on first feedback information received from the robot; determining an association between the first reference position and first sensing information received from a sensor; receiving, from the robot, second feedback information associated with a second motion of the robot and, from the sensor, second sensing information associated with the second motion; and determining a time delay between a sensing time point, when a sensing position of the robot in the second motion is sensed by the sensor, and a recording time point, when the sensing position is recorded by the robot in the second motion.
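One common way to recover such a sensor-vs-controller time delay is to cross-correlate the two position traces of the same motion and take the lag at the correlation peak. The sketch below is a minimal illustration of that idea, not the patented method itself; it assumes both traces are uniformly sampled at the same rate.

```python
import numpy as np

def estimate_time_delay(robot_positions, sensor_positions, sample_dt):
    """Estimate the lag (seconds) between a position trace recorded by the
    robot controller and the same motion as seen by an external sensor.
    Assumes both traces are uniformly sampled at sample_dt and cover the
    same motion. A positive result means the sensor trace lags the robot's."""
    r = np.asarray(robot_positions, dtype=float)
    s = np.asarray(sensor_positions, dtype=float)
    # Remove the mean so the correlation peak reflects motion, not offset.
    r = r - r.mean()
    s = s - s.mean()
    corr = np.correlate(s, r, mode="full")
    lag_samples = np.argmax(corr) - (len(r) - 1)
    return lag_samples * sample_dt
```

With a point-to-point move (here a Gaussian position profile) sampled at 10 ms, a sensor trace shifted by 50 ms is recovered exactly, because the correlation peak falls on the shifted sample.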
METHOD OF ROBOTIC SYSTEM DYNAMIC VELOCITY MODIFICATION
A method and system for robotic motion planning which perform dynamic velocity attenuation to avoid robot collision with static or dynamic objects. The technique maintains the planned robot tool path even when speed reduction is necessary, by providing feedback of a computed slowdown ratio to a tracking controller so that the path computation is always synchronized with current robot speed. The technique uses both robot-obstacle distance and relative velocity to determine when to apply velocity attenuation, and computes a joint speed limit vector based on a robot-obstacle distance, a maximum obstacle speed, and a computed stopping time as a function of the joint speed. Two different control structure implementations are disclosed, both of which provide feedback of the slowdown ratio to the motion planner as needed for faithful path following. A method of establishing velocity attenuation priority in multi-robot systems is also provided.
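The distance-and-velocity attenuation rule described above can be sketched as follows. This is an illustrative stand-in, not the disclosed control structure: the thresholds, the linear ramp, and the approach-speed factor are all assumptions, and the same slowdown ratio is applied uniformly to the joint-velocity vector so the path shape is preserved while the speed along it drops.

```python
import numpy as np

def slowdown_ratio(distance, approach_speed, d_stop=0.10, d_slow=0.50, v_ref=1.0):
    """Hypothetical velocity-attenuation rule: full stop inside d_stop,
    full speed beyond d_slow, linear ramp in between, tightened further
    when the obstacle is closing in. All parameters are illustrative."""
    if distance <= d_stop:
        return 0.0
    ratio = min(1.0, (distance - d_stop) / (d_slow - d_stop))
    if approach_speed > 0.0:  # obstacle moving toward the robot
        ratio *= v_ref / (v_ref + approach_speed)
    return ratio

def attenuate_joint_speeds(q_dot, ratio):
    # Scale the whole joint-velocity vector uniformly: the Cartesian path
    # is unchanged, only the speed along it is reduced.
    return np.asarray(q_dot, dtype=float) * ratio
```

Feeding `ratio` back to the trajectory generator (as the abstract requires) is what keeps the path computation synchronized with the reduced speed.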
Robot Teaching System
A robot teaching system includes: a photographing unit that photographs an image including a welding target and a marker installed on an industrial robot; a camera coordinate system setting unit that sets a camera coordinate system on the basis of the marker included in the image; an operation path setting unit that sets an operation path of the industrial robot, in the camera coordinate system, on the basis of a welding position of the welding target included in the image; and a program generation unit that generates a working program, converting the set operation path from the camera coordinate system into a robot coordinate system set in a robot control apparatus on the basis of the position of the marker installed on the industrial robot. The robot teaching system thereby generates a working program that allows appropriate welding at the welding position.
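The camera-to-robot conversion step hinges on the marker's pose being known in both frames. A minimal sketch of that chained transform, using homogeneous 4x4 matrices (frame names and the two-pose setup are assumptions for illustration):

```python
import numpy as np

def make_transform(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation R and a
    translation vector t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def camera_to_robot(p_cam, T_robot_marker, T_camera_marker):
    """Map a point from camera coordinates into robot coordinates through a
    marker whose pose is known in both frames:
    robot <- marker <- camera."""
    T_robot_camera = T_robot_marker @ np.linalg.inv(T_camera_marker)
    p = np.append(np.asarray(p_cam, dtype=float), 1.0)
    return (T_robot_camera @ p)[:3]
```

For example, a welding position detected at the marker's location in the camera frame maps to the marker's location in the robot frame.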
SYSTEMS AND METHODS FOR OBJECT GUIDANCE AND COLLISION AVOIDANCE
Systems and methods for object guidance and collision avoidance are provided. One system includes a location sensor disposed on a movable crane and a plurality of sensors disposed on a plurality of objects within a facility. The system further includes a controller having a receiver for monitoring signals transmitted from the location sensor and from the plurality of sensors. The controller is configured to determine one or more intersection regions from the monitored signals, generate a travel path for the movable crane to move an object coupled with the crane based on the one or more intersection regions, and generate an output signal to an alarm device to provide an alert when at least one object of the plurality of objects is within a predetermined proximity of the object being moved by the crane.
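The proximity-alert condition at the end of the abstract reduces to a distance test between the moving load and each tracked object. A minimal sketch (positions, units, and the flat threshold are assumptions, not details from the patent):

```python
import math

def proximity_alerts(moving_object_pos, object_positions, threshold):
    """Return the indices of tracked objects that lie within `threshold`
    of the load being moved by the crane."""
    alerts = []
    for i, pos in enumerate(object_positions):
        if math.dist(moving_object_pos, pos) <= threshold:
            alerts.append(i)
    return alerts
```

A real system would run this check continuously against live sensor positions and route any non-empty result to the alarm device.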
Vision guided robot arm and method for operating the same
A method for operating a vision guided robot arm system comprising a robot arm provided with an end effector at a distal end thereof, a display, an image sensor and a controller, the method comprising: receiving from the image sensor an initial image of an area comprising at least one object and displaying the initial image on the display; determining an object of interest amongst the at least one object and identifying the object of interest within the initial image; determining a potential action related to the object of interest and providing a user with an identification of the potential action; receiving a confirmation of the object of interest and the potential action from the user; and automatically moving the robot arm so as to position the end effector of the robot arm at a predefined position relative to the object of interest.
METHODS FOR DISPENSING A LIQUID OR VISCOUS MATERIAL ONTO A SUBSTRATE
Systems and methods for dispensing a liquid or viscous material onto a substrate are disclosed herein. One exemplary method of positioning an applicator of a dispensing system to apply a liquid or viscous material to an electronic substrate includes generating a two-dimensional image of the electronic substrate using a camera communicatively connected to the dispensing system. Based on the two-dimensional image of the electronic substrate, a first set of one or more sub-regions of the electronic substrate having one or more components that protrude above the surface of the electronic substrate is identified. The method further includes using height information relating to the one or more sub-regions having the one or more components to determine a control program for the dispensing system to position the applicator relative to the electronic substrate and dispense the liquid or viscous material onto the electronic substrate.
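The height-aware positioning step can be illustrated with a simple per-sub-region rule: travel at a default height, but lift the applicator wherever a protruding component plus a safety clearance would otherwise be struck. All names, units, and the clearance policy below are assumptions for illustration, not the disclosed control program.

```python
def applicator_heights(subregions, base_height=1.0, clearance=0.5):
    """For each sub-region (name, component_height), compute the z at which
    the applicator should travel so that it clears protruding components.
    Heights are in arbitrary consistent units."""
    plan = {}
    for name, component_height in subregions:
        plan[name] = max(base_height, component_height + clearance)
    return plan
```

A sub-region whose tallest component already fits under the default travel height keeps that height; only the taller sub-regions force the applicator up.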
System, Method and Product for Utilizing Prediction Models of an Environment
A first method comprising: predicting a scene of an environment using a model of the environment and based on a first scene of the environment obtained from sensors observing scenes of the environment; comparing the predicted scene with an observed scene from the sensors; and performing an action based on differences determined between the predicted scene and the observed scene. A second method comprising applying a vibration stimulus on an object via a computer-controlled component; and obtaining a plurality of images depicting the object from a same viewpoint, captured during the application of the vibration stimulus. The second method further comprising comparing the plurality of images to detect changes occurring in response to the application of the vibration stimulus, which changes are attributed to a change in the location of a boundary of the object; and determining the boundary of the object based on the comparison.
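The comparison step in the first method amounts to locating where the predicted scene and the observed scene disagree. A minimal pixelwise sketch (the threshold and the dense-image representation are assumptions; the patent does not specify the comparison operator):

```python
import numpy as np

def changed_regions(predicted, observed, threshold=0.1):
    """Pixelwise comparison of a predicted scene against an observed one.
    Returns a boolean mask marking pixels whose absolute difference
    exceeds the threshold."""
    diff = np.abs(np.asarray(predicted, dtype=float)
                  - np.asarray(observed, dtype=float))
    return diff > threshold
```

Downstream logic would then trigger an action only where the mask is non-empty, i.e. where the environment deviated from the model's prediction.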
Location calibration for automated production manufacturing
Systems and methods for calibrating the location of an end effector-carrying apparatus relative to successive workpieces before the start of a production manufacturing operation. The location calibration is performed using a positioning system. These disclosed methodologies allow an operator to program (or teach) the robot motion path once and reuse that path for subsequent structures by using relative location feedback from a measurement system to adjust the position and orientation offset of the robot relative to the workpiece. When each subsequent workpiece comes into the robotic workcell, its location (i.e., position and orientation) relative to the robot may be different than the first workpiece that was used when developing the initial program. The disclosed systems and methods can also be used to compensate for structural differences between workpieces intended to have identical structures.
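The "teach once, reuse with an offset" idea can be sketched as estimating a rigid transform between reference features measured on the original workpiece and the same features on a new one, then applying that transform to the taught path. The planar, two-point version below is a minimal illustration; a production system would use more points, 3D poses, and a least-squares fit.

```python
import numpy as np

def workpiece_offset(ref_old, ref_new):
    """Estimate a planar rigid transform (rotation R, translation t) that
    maps two measured reference points on the taught workpiece onto the
    same features measured on a new workpiece."""
    a0, a1 = (np.asarray(p, dtype=float) for p in ref_old)
    b0, b1 = (np.asarray(p, dtype=float) for p in ref_new)
    d_old = a1 - a0
    d_new = b1 - b0
    theta = np.arctan2(d_new[1], d_new[0]) - np.arctan2(d_old[1], d_old[0])
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    t = b0 - R @ a0
    return R, t

def adjust_path(path, R, t):
    # Reuse the taught path on the new workpiece by applying the offset.
    return [tuple(R @ np.asarray(p, dtype=float) + t) for p in path]
```

For a workpiece rotated 90 degrees and translated, every taught waypoint lands on the corresponding feature of the new part without re-teaching.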
Imager for detecting visual light and projected patterns
Methods and systems for depth sensing are provided. A system includes first and second optical sensors, each including a first plurality of photodetectors configured to capture visible light interspersed with a second plurality of photodetectors configured to capture infrared light within a particular infrared band. The system also includes a computing device configured to (i) identify first corresponding features of the environment between a first visible light image captured by the first optical sensor and a second visible light image captured by the second optical sensor; (ii) identify second corresponding features of the environment between a first infrared light image captured by the first optical sensor and a second infrared light image captured by the second optical sensor; and (iii) determine a depth estimate for at least one surface in the environment based on the first corresponding features and the second corresponding features.
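Once corresponding features are matched between the two sensors, the depth of each feature follows from the standard pinhole stereo relation Z = f·B/d, where f is the focal length in pixels, B the baseline between the sensors, and d the disparity between the matched image coordinates. A minimal sketch (rectified images and positive disparity are assumed):

```python
def depth_from_disparity(x_left, x_right, focal_length_px, baseline_m):
    """Classic pinhole stereo depth: Z = f * B / d, with disparity
    d = x_left - x_right for a rectified pair."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("matched feature must have positive disparity")
    return focal_length_px * baseline_m / disparity
```

The same relation applies to both the visible-light and infrared feature matches; fusing the two sets of estimates is what makes the sensor robust across lighting conditions.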
ROBOT PLANNING
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for controlling robotic movements. One of the methods includes receiving, for a robot, an initial plan specifying a path and a local trajectory; receiving an updated observation of an environment of the robot; generating an initial modified local trajectory for the robot based on the updated observation of the environment of the robot; repeatedly following the initial modified local trajectory for the robot while generating a modified global path for the robot, comprising: obtaining data representing a workspace footprint for the robot, the workspace footprint defining a volume for a workspace of the robot, and generating the modified global path so as to avoid causing the robot to cross a boundary of the volume defined by the workspace footprint; and causing the robot to follow the modified global path for the robot.
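The footprint constraint on the modified global path reduces to a containment test: every waypoint of a candidate path must stay inside the volume the footprint defines. The sketch below models that volume as an axis-aligned box, which is an assumption for illustration; an actual footprint could be any volumetric representation.

```python
def inside_footprint(point, lo, hi):
    """Check whether a waypoint lies inside an axis-aligned workspace
    volume given by its lower and upper corners."""
    return all(l <= p <= h for p, l, h in zip(point, lo, hi))

def path_respects_footprint(path, lo, hi):
    """A candidate global path is admissible only if no waypoint crosses
    the footprint boundary."""
    return all(inside_footprint(p, lo, hi) for p in path)
```

A planner would reject (or re-plan) any modified global path for which this predicate is false.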