Patent classifications
B25J9/1692
INSPECTION ROBOTS WITH SWAPPABLE DRIVE MODULES
Inspection robots with swappable drive modules are described. An example inspection robot may include a first removable interface plate on a first side of a robot chassis. The first removable interface plate may couple a first drive module to an electronic board within the chassis, where the electronic board includes a drive module interface circuit communicatively coupled to the first drive module. The example inspection robot may also include a second removable interface plate on a second side of the robot chassis. The second removable interface plate may couple a second drive module to the electronic board within the chassis, where the electronic board includes a drive module interface circuit communicatively coupled to the second drive module.
Method and control system for updating camera calibration for robot control
A robot control system and a method for updating camera calibration are presented. The method comprises the robot control system performing a first camera calibration to determine camera calibration information, and outputting a first movement command based on the camera calibration information for a robot operation. The method further comprises outputting, after the first camera calibration, a second movement command to move a calibration pattern within a camera field of view, receiving one or more calibration images, and adding the one or more calibration images to a captured image set. The method further comprises performing a second camera calibration based on the calibration images in the captured image set to determine updated camera calibration information, determining whether a deviation between the camera calibration information and the updated camera calibration information exceeds a defined threshold, and outputting a notification signal if the deviation exceeds the defined threshold.
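The deviation check described above can be sketched as follows. This is a minimal illustration, not the patent's method: the function names and the choice of Euclidean distance over flat parameter vectors are assumptions.

```python
import math

def calibration_deviation(initial, updated):
    # Euclidean distance between two flat calibration parameter vectors
    # (e.g. focal lengths, principal point, distortion coefficients).
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(initial, updated)))

def check_calibration(initial, updated, threshold):
    # Return True when a notification signal should be output,
    # i.e. the deviation exceeds the defined threshold.
    return calibration_deviation(initial, updated) > threshold
```

In practice the deviation metric could also be evaluated per parameter or in reprojection-error terms; the abstract leaves the metric unspecified.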
Work Program Production System and Work Program Production Method
The work program production system includes: a photographing unit that photographs an image including an object to be welded; a coordinate system setting unit that sets a user coordinate system based on a marker included in the image photographed by the photographing unit; a point-group-data plotting unit that detects a specific position of the marker on the basis of the image, sets the detected specific position on point group data acquired by a distance measurement sensor that measures a distance to the object to be welded, and plots, in the user coordinate system, the point group data to which coordinates in the user coordinate system using the set specific position as an origin are given; and a program production unit that produces a welding program allowing a welding robot virtually placed in the user coordinate system to perform a welding operation on the basis of the point group data plotted in the user coordinate system, while avoiding interference with the point group data.
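The core coordinate operation above, re-expressing sensor point group data in a user coordinate system whose origin is the detected marker position, can be sketched as below. The function name and the rigid-transform formulation are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def to_user_frame(points, marker_origin, rotation):
    # points: (N, 3) point group data in the sensor frame.
    # marker_origin: detected marker position (the new origin).
    # rotation: 3x3 sensor-to-user rotation matrix.
    # Returns the points expressed in the user coordinate system:
    # p_user = R @ (p_sensor - origin).
    shifted = np.asarray(points, float) - np.asarray(marker_origin, float)
    return shifted @ np.asarray(rotation, float).T
```

With the marker-aligned rotation equal to the identity, this reduces to a pure translation of the point cloud onto the marker origin.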
AUTOMATIC ROBOTIC ARM SYSTEM AND COORDINATING METHOD FOR ROBOTIC ARM AND COMPUTER VISION THEREOF
An automatic robotic arm system and a coordinating method for its robotic arm and computer vision are disclosed. A beam-splitting mirror splits incident light into visible light and ranging light and guides them, respectively, to an image capturing device and an optical ranging device arranged on different reference axes. In a calibration mode, a transformation relation is computed based on a plurality of calibration postures and corresponding calibration images. In an operation mode, a mechanical space coordinate is determined based on an operation image and the transformation relation, and the robotic arm is controlled to move based on that mechanical space coordinate.
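The two modes can be illustrated with a least-squares sketch: fit a transformation from image coordinates to mechanical space coordinates using calibration posture/image pairs, then apply it in operation. The affine model and function names are assumptions for illustration; the patent does not specify the form of the transformation relation.

```python
import numpy as np

def fit_transform(image_pts, arm_pts):
    # Calibration mode: least-squares affine map from image coordinates
    # to mechanical space coordinates.
    # image_pts: (N, 2) image coordinates; arm_pts: (N, 2) arm coordinates.
    # Returns a 3x2 matrix A such that [x, y, 1] @ A ~= arm point.
    n = len(image_pts)
    X = np.hstack([np.asarray(image_pts, float), np.ones((n, 1))])
    A, *_ = np.linalg.lstsq(X, np.asarray(arm_pts, float), rcond=None)
    return A

def image_to_mechanical(A, pt):
    # Operation mode: map a point found in the operation image
    # into a mechanical space coordinate.
    return np.array([pt[0], pt[1], 1.0]) @ A
```

A full hand-eye calibration would use homogeneous 3D transforms; the 2D affine fit here only conveys the calibrate-then-apply structure.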
Component-Inventory-Based Robot Fleet Management in Value Chain Networks
A robot fleet management platform includes a resources data store that maintains a robot inventory indicating robots that can be assigned to a robot fleet and, for each respective robot, a set of baseline features of the robot and a respective status. The resources data store also maintains a components inventory indicating different components that can be provisioned to one or more multi-purpose robots and, for each component, a respective set of extended capabilities corresponding to the component and a respective status. The robot fleet management platform receives a request for a robotic fleet to perform a job, determines a job definition data structure based on the request, determines a robot fleet configuration data structure corresponding to the job based on the job definition's set of tasks and the robot inventory, determines a respective configuration for each assigned robot based on the components inventory, configures the assigned robots, and deploys the robotic fleet.
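The inventories and the assignment step can be sketched with simple data structures. The dataclass fields mirror the abstract (baseline features, extended capabilities, status), but the greedy assignment policy below is purely an illustrative assumption.

```python
from dataclasses import dataclass

@dataclass
class Robot:
    robot_id: str
    baseline_features: set
    status: str = "available"

@dataclass
class Component:
    component_id: str
    extended_capabilities: set
    status: str = "available"

def configure_fleet(tasks, robots, components):
    # Greedy sketch: for each required task capability, assign an available
    # robot, provisioning a component when its baseline features fall short.
    fleet = []
    for cap in tasks:
        robot = next((r for r in robots if r.status == "available"), None)
        if robot is None:
            break
        fitted = []
        if cap not in robot.baseline_features:
            comp = next((c for c in components
                         if c.status == "available"
                         and cap in c.extended_capabilities), None)
            if comp is None:
                continue  # no robot/component combination covers this task
            comp.status = "provisioned"
            fitted.append(comp.component_id)
        robot.status = "assigned"
        fleet.append((robot.robot_id, cap, fitted))
    return fleet
```

A production platform would of course optimize assignment globally rather than greedily; the sketch only shows how the two inventories interact.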
IMAGE PROCESSING SYSTEM
The present invention addresses the problem of providing an image processing system that can accurately and efficiently detect a target object even when the positional relationship between an image capturing device and the target object differs between the time of teaching and the time of detection. An image processing system 1 includes: a control unit 52 that obtains, on the basis of position information of a robot 2 for specifying the position of a visual sensor 4 in a robot coordinate system and on the basis of position information showing the position of a target object W in an image coordinate system, the positional relationship between the visual sensor 4 and the target object W; and a storage unit 51 that stores, on the basis of a model pattern consisting of feature points extracted from a teaching image and on the basis of the positional relationship between the visual sensor 4 and the target object W when capturing the teaching image, the model pattern in the form of three-dimensional position information. The control unit 52 performs detection processing, in which the target object W is detected from a detection image on the basis of a result obtained by matching the model pattern with the feature points extracted from the detection image including the target object W.
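The matching step in the detection processing above can be loosely illustrated as scoring how many stored model feature points find a nearby feature point in the detection image. This sketch is an assumption for illustration only; it uses a trivial orthographic projection in place of the patent's robot-pose-aware 3D handling.

```python
import numpy as np

def match_score(model_pts, detected_pts, tol=1.0):
    # model_pts: (N, 3) stored 3D feature points of the model pattern.
    # detected_pts: (M, 2) feature points extracted from the detection image.
    # Returns the fraction of model points whose (simplified) projection has
    # a detected point within `tol` pixels.
    hits = 0
    for p in np.asarray(model_pts, float):
        proj = p[:2]  # placeholder projection; the patent uses the sensor pose
        d = np.linalg.norm(np.asarray(detected_pts, float) - proj, axis=1).min()
        hits += d <= tol
    return hits / len(model_pts)
```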
ROBOT SYSTEM
A robot system is provided that can suitably correct robot movement. The robot system is provided with: a visual sensor which captures a first image of a target with the robot in a prescribed position, and which captures a second image of the target with the robot in the position resulting from moving the robot a prescribed distance from the aforementioned prescribed position; a calibration data storage unit which stores calibration data that associates the robot coordinate system of the robot with the image coordinate system of the visual sensor; a first acquisition unit which, on the basis of the first image and the calibration data, acquires a first position of the target in the robot coordinate system; a second acquisition unit which, on the basis of the first image and the second image, acquires a second position of the target in the robot coordinate system; and a determination unit which determines whether or not the difference between the first position and the second position is within a prescribed range.
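The determination unit's check reduces to comparing two estimates of the same target position. A minimal sketch, with illustrative names and Euclidean distance assumed as the difference measure:

```python
import math

def within_range(first_pos, second_pos, tolerance):
    # first_pos: target position from one image plus calibration data.
    # second_pos: target position from the two-image (stereo-like) method.
    # Returns True when the two estimates agree within the prescribed range,
    # indicating the calibration is still valid for movement correction.
    return math.dist(first_pos, second_pos) <= tolerance
```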
PROCESS ASSEMBLY LINE WITH ROBOTIC PROCESS AUTOMATION
In an example embodiment, a novel "process assembly line" solution is provided that organizes software robots in a manner that, once configured, allows them to be duplicated and cloned into multiple scenarios, including organizational structures where one software robot is triggered or called by another software robot. Instead of designing each software robot with multiple functionalities to handle complex scenarios, the process assembly line can utilize single-function software robots. This aids developers in building robust software robots and reduces potential errors.
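The assembly-line pattern above can be sketched as a chain of single-function callables, where each robot's output triggers the next. The function name and payload-passing style are illustrative assumptions.

```python
def run_assembly_line(robots, payload):
    # Each "software robot" performs one function; its output is handed to
    # the next robot in the line, like stations on an assembly line.
    for robot in robots:
        payload = robot(payload)
    return payload
```

For example, a line of two single-function robots, one that trims whitespace and one that normalizes case, composes into a cleanup workflow without either robot knowing about the other.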
TOOL POSITION DETERMINATION IN A ROBOTIC APPENDAGE
System and techniques for tool position determination in a robotic appendage are described herein. A robotic appendage is put through a rotational movement to induce acceleration in a tool mounted to the appendage. A model for acceleration is created from positional kinematics of the appendage. A measurement of acceleration is taken at the tool and fit to the model to determine distance from the axis of rotation to the tool. The distance is provided for use in control or modeling of the robotic appendage.
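The fit described above can be illustrated with the centripetal relation a = omega^2 * r: given measured accelerations at several angular velocities, a least-squares fit recovers the distance r from the rotation axis to the tool. The closed-form estimator below is an illustrative assumption (gravity and tangential terms are ignored), not the patent's model.

```python
def fit_tool_radius(samples):
    # samples: list of (angular_velocity, measured_acceleration) pairs taken
    # while the appendage rotates the tool about a known axis.
    # Least-squares fit of a = w^2 * r for r: r = sum(a * w^2) / sum(w^4).
    num = sum(a * w * w for w, a in samples)
    den = sum(w ** 4 for w, _ in samples)
    return num / den
```

The recovered distance can then feed the appendage's kinematic model, as the abstract describes.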
CALIBRATING A VIRTUAL FORCE SENSOR OF A ROBOT MANIPULATOR
The invention relates to a method for calibrating a virtual force sensor of a robot manipulator, wherein the following steps are carried out in a plurality of poses: applying an external wrench to the robot manipulator, ascertaining an estimate of the external wrench, ascertaining a first calibration matrix based on the ascertained estimate and a specified external wrench, ascertaining a second calibration matrix by inverting the first calibration matrix, and storing the respective second calibration matrix in a data set of all of the second calibration matrices, thereby assigning each second calibration matrix to the respective pose for which each second calibration matrix was ascertained.