Patent classifications
G05B2219/40613
Robot, analog-to-digital converter, and solid-state imaging device
An analog-to-digital converter includes: first to (m+1)-th capacitive elements, each of which has a first end connected to a first terminal of a comparison circuit, the capacitive elements having a predetermined capacitance ratio; and selection circuits connected to second ends of the capacitive elements, respectively. Each of the capacitive elements includes: a first electrode disposed in a semiconductor substrate and electrically connected to the second end; a third electrode disposed above the semiconductor substrate so as to oppose the first electrode and electrically connected to the second end; a second electrode disposed between the first and third electrodes, above the semiconductor substrate, and electrically connected to the first end; a first insulation film disposed between the first and second electrodes; and a second insulation film disposed between the second and third electrodes.
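A capacitor ladder with a fixed capacitance ratio is the usual DAC inside a successive-approximation ADC. As an illustration only, the sketch below models the conversion loop such a converter typically runs, assuming a binary (powers-of-two) weighting; the abstract itself specifies only a "predetermined capacitance ratio", so the weighting here is an assumption.

```python
def sar_adc_convert(v_in, v_ref, n_bits):
    """Successive-approximation conversion as a binary-weighted
    capacitive DAC would perform it (hypothetical model; the abstract
    does not state the actual ratio)."""
    code = 0
    for bit in range(n_bits - 1, -1, -1):
        trial = code | (1 << bit)             # tentatively set this bit
        # comparator decision: DAC output still at or below the input?
        if trial * v_ref / (1 << n_bits) <= v_in:
            code = trial                      # keep the bit
    return code
```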
Eye in-hand robot
A robot includes a gripping member configured to move and pick up an object within an environment, a camera affixed to the gripping member such that movement of the gripping member causes movement of the camera, the camera being configured to measure and store data related to the intensity and direction of light rays within the environment, an image processing module configured to process the data to generate a probabilistic model defining a location of the object within the environment, and an operation module configured to move the gripping member to the location and pick up the object.
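The abstract's "probabilistic model defining a location of the object" can be pictured as a belief distribution over candidate locations, refined by measurements. The sketch below is a minimal Bayesian grid update under an assumed per-cell likelihood; the patent does not specify the actual model.

```python
def update_location_belief(prior, likelihood):
    """Multiply a prior belief over grid cells by a measurement
    likelihood and renormalize (hypothetical measurement model)."""
    posterior = [p * l for p, l in zip(prior, likelihood)]
    total = sum(posterior)
    return [p / total for p in posterior]

def most_likely_cell(belief):
    """Cell index the operation module would move the gripper toward."""
    return max(range(len(belief)), key=lambda i: belief[i])
```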
Object detection device, control device, and object detection computer program
An object detection device detects a position of a target object on an image generated by a camera. When the camera and the target object do not satisfy a predetermined positional relationship, the device detects the position by inputting the image to a classifier. When the camera and the target object do satisfy the predetermined positional relationship, the device detects the position by comparing, with the image, a template representing a feature of the appearance of the target object as viewed from a predetermined direction.
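The detection route thus depends on camera-object geometry. The sketch below models the "predetermined positional relationship" as a simple distance threshold, which is an assumption on my part; the abstract leaves the relationship unspecified, and the classifier and template matcher are stand-in callables.

```python
def detect_position(image, cam_obj_distance, classifier, template_matcher,
                    near_threshold=0.3):
    """Near the object, compare against a stored appearance template;
    otherwise run the learned classifier on the whole image.
    (Threshold and interfaces are illustrative, not from the patent.)"""
    if cam_obj_distance <= near_threshold:
        return template_matcher(image)   # positional relationship satisfied
    return classifier(image)             # relationship not satisfied
```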
Sensorized Robotic Gripping Device
A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
Robot system
A robot system includes a light source, an image capture device, a robot mechanism unit having a target site of position control at which the light source is provided, and a robot controller that controls the position of the robot mechanism unit based on a position command, position feedback, and a position compensation value. The robot controller includes: a path acquisition unit that makes the image capture device capture an image of light from the light source continuously during a predetermined operation, thereby acquiring a path of the light source; a positional error estimation unit that estimates the positional error of the acquired path relative to the position command; and a compensation value generation unit that generates the position compensation value based on the estimated positional error.
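The error-estimation and compensation steps can be sketched as follows. This is a simplified model assuming the commanded and observed paths are already time-aligned point lists and that a single constant offset is compensated; the patent does not commit to either simplification.

```python
def estimate_positional_error(commanded_path, observed_path):
    """Per-sample deviation of the observed light-source path from the
    position command (paths assumed time-aligned)."""
    return [(ox - cx, oy - cy)
            for (cx, cy), (ox, oy) in zip(commanded_path, observed_path)]

def generate_compensation(errors):
    """Negate the mean error so it pre-corrects the next position command."""
    n = len(errors)
    mean_x = sum(e[0] for e in errors) / n
    mean_y = sum(e[1] for e in errors) / n
    return (-mean_x, -mean_y)
```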
Object conveying system
Provided is an object conveying system including: a conveying apparatus that conveys an object; one or more cameras that capture images of feature points of the object; a position measuring portion that measures positions of the feature points from the captured images; a detecting portion that detects a position or a movement velocity of the object; a position correcting portion that corrects the positions of the feature points so as to achieve positions at which the feature points are disposed at the same time; a line-of-sight calculating portion that calculates lines of sight that pass through the feature points on the basis of the corrected positions of the feature points and the positions of the cameras; and a position calculating portion that calculates a three-dimensional position of the object by applying a polygon having a known shape to the calculated lines of sight.
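Two of the listed steps lend themselves to a short sketch: shifting each measured feature point along the conveyor's velocity so all points refer to one instant, then forming the line of sight from a camera through a corrected point. The 2-D vector form and the assumption of constant velocity are mine, not the patent's.

```python
def correct_to_common_time(points, timestamps, velocity, t_ref):
    """Shift each measured feature point along the conveyor velocity so
    all points refer to the same instant t_ref (constant velocity assumed)."""
    return [(x + velocity[0] * (t_ref - t), y + velocity[1] * (t_ref - t))
            for (x, y), t in zip(points, timestamps)]

def line_of_sight(camera, point):
    """Unit direction of the ray from a camera position through a
    (corrected) feature point."""
    dx, dy = point[0] - camera[0], point[1] - camera[1]
    norm = (dx * dx + dy * dy) ** 0.5
    return (dx / norm, dy / norm)
```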
Arrangement and method for the model-based calibration of a robot in a working space
An arrangement for the model-based calibration of a mechanism in a workspace uses calibration objects that are either directed laser radiation patterns, together with an associated laser radiation-pattern generator, or radiation-pattern position sensors. Functional operation groups, each made up of at least one laser radiation pattern and at least one position sensor, interact such that when a radiation pattern impinges on a sensor, measured sensor position values are passed along to computing devices that determine the parameters of a mathematical model of the mechanism with the aid of these measured values. At least two different functional operation groups are used to calibrate the mechanism, and at least two calibration objects from different functional operation groups are rigidly connected to one another.
Control system and method for applying force to grasp a target object
Systems and methods are provided for an automation system. The systems and methods calculate a motion trajectory of a manipulator and an end-effector, the end-effector being configured to grasp a target object. The motion trajectory defines successive positions of the manipulator and the end-effector along a plurality of via-points toward the target object. The systems and methods further acquire force/torque (F/T) data from an F/T sensor associated with the end-effector, and adjust the motion trajectory based on the F/T data.
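One plausible reading of "adjust the motion trajectory based on the F/T data" is a contact-reaction rule: when the sensed force exceeds a limit, offset the remaining via-points along the sensed (reaction) force direction to back off. The limit, step size, and rule itself are illustrative assumptions.

```python
def adjust_trajectory(via_points, ft_reading, force_limit=5.0, step=0.01):
    """Offset remaining via-points along the normalized sensed-force
    direction when the force magnitude exceeds force_limit; otherwise
    keep the planned trajectory. (Thresholds are not from the patent.)"""
    fx, fy, fz = ft_reading
    mag = (fx * fx + fy * fy + fz * fz) ** 0.5
    if mag <= force_limit:
        return via_points
    ux, uy, uz = fx / mag, fy / mag, fz / mag
    return [(x + step * ux, y + step * uy, z + step * uz)
            for x, y, z in via_points]
```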
Robot arm apparatus, robot arm apparatus control method, and program
A robot arm apparatus according to the present disclosure includes: one or more joint units that join a plurality of links constituting a multi-link structure; an acquisition unit that acquires an on-screen enlargement factor of a subject imaged by an imaging unit attached to the multi-link structure; and a driving control unit that controls driving of the joint units based on a state of the joint units and the enlargement factor.
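A common reason to feed an on-screen enlargement factor into joint control is to keep the subject's apparent motion constant as zoom increases. The one-liner below sketches that assumed control law; the abstract says only that driving is controlled "based on" the factor, so this specific scaling is a guess.

```python
def scaled_joint_velocity(commanded_velocity, enlargement_factor):
    """Slow the joint as the on-screen zoom grows so the subject's
    apparent on-screen motion stays roughly constant (assumed law;
    factors below 1x are clamped to leave the command unchanged)."""
    return commanded_velocity / max(enlargement_factor, 1.0)
```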