Patent classifications
G05B2219/39391
ROBOTIC DEVICE, ROBOTIC DEVICE CONTROLLING SYSTEM, AND ROBOTIC DEVICE CONTROLLING METHOD
A robotic device performs work defined by a series of unit jobs. The robotic device includes a first notice section and a controller. The controller includes a first calculation section that calculates an end time of the work based on the time required for the work. The controller controls the first notice section so that the first notice section issues a notice of ending the work before the end time. The controller further includes a first determination section that determines, based on the end time, a time to issue the notice; the time to issue the notice precedes the end time.
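The scheduling described above can be sketched as follows. This is a minimal illustration under assumptions: the function name `plan_notice`, the per-unit-job duration list, and the fixed 30-second lead time are illustrative choices, not terms from the abstract.

```python
from datetime import datetime, timedelta

def plan_notice(start, unit_job_durations_s, lead_s=30.0):
    """Sketch of the two sections described in the abstract:
    the calculation section sums the time required for the unit jobs
    to get the work's end time, and the determination section picks
    a notice time before that end time (here: a fixed lead)."""
    total_s = sum(unit_job_durations_s)
    end_time = start + timedelta(seconds=total_s)        # first calculation section
    notice_time = end_time - timedelta(seconds=lead_s)   # first determination section
    return end_time, notice_time
```

For example, three unit jobs of 60 s, 120 s, and 30 s starting at 09:00 yield an end time of 09:03:30 and a notice at 09:03:00.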
METHOD FOR CONTROLLING A ROBOT ARM
A method for visually controlling a robot arm which is displaceable in a plurality of degrees of freedom, the robot arm carrying at least one displaceable reference point, includes the steps of: a) placing at least one camera so that a target point where the reference point is to be placed is contained in an image output by the at least one camera; b) displacing the robot arm so that the reference point is within the image; c) determining a vector which, in the image, connects the reference point to the target point; d) choosing one of the plurality of degrees of freedom, moving the robot arm by a predetermined standard distance in the one degree of freedom, and recording a standard displacement of the reference point within the image resulting from the movement of the robot arm; e) repeating step d) at least until the vector can be decomposed.
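Step e) ends once the recorded standard displacements span the image plane, so the vector from step c) can be decomposed. A sketch of that decomposition for the two-degree-of-freedom case, using Cramer's rule (the function name and 2-D planar restriction are assumptions for illustration):

```python
def decompose(vector, d1, d2):
    """Sketch: d1 and d2 are the in-image standard displacements
    recorded in step d) for two chosen degrees of freedom; `vector`
    connects the reference point to the target point (step c).
    Solving  a*d1 + b*d2 = vector  gives how many standard distances
    to move in each degree of freedom."""
    det = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(det) < 1e-12:
        # Displacements are collinear: the vector cannot yet be
        # decomposed, so step d) must be repeated with another DOF.
        raise ValueError("standard displacements are collinear")
    a = (vector[0] * d2[1] - vector[1] * d2[0]) / det
    b = (d1[0] * vector[1] - d1[1] * vector[0]) / det
    return a, b
```

With more than two degrees of freedom the same idea becomes a least-squares solve over all recorded standard displacements.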
Robot device and method of controlling movement of robot device
The robot device includes a spatial information recording unit in which first spatial information is recorded, the first spatial information concerning an area, associated with a structure, in which the self-propelled robot device is expected to move with respect to the structure; a spatial information acquisition section that is mounted on the robot device and acquires second spatial information about a peripheral area of the robot device as the robot device moves; and a spatial information updating unit that updates the first spatial information recorded in the spatial information recording unit with the second spatial information acquired by the spatial information acquisition section. The first spatial information recorded in the spatial information recording unit is used when the robot device is moved with respect to the structure.
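The updating unit can be sketched as an overwrite of recorded map cells with freshly acquired ones. This is only an illustration under assumptions: representing spatial information as a cell-to-occupancy mapping, and the function name, are not taken from the abstract.

```python
def update_spatial_info(first_info, second_info):
    """Sketch: `first_info` is the recorded first spatial information
    (cell -> occupancy value); `second_info` is the second spatial
    information acquired around the robot as it moves. Cells observed
    during movement overwrite the recorded values; unseen cells keep
    their recorded values for the next traversal of the structure."""
    updated = dict(first_info)
    updated.update(second_info)
    return updated
```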
VISUAL SERVO SYSTEM
A visual servo system includes a robot that handles an object, an irradiation device that irradiates light onto the object, and a camera that captures an image of the object and outputs a current image. The visual servo system reads, from a storage medium, a target image that is assumed to be captured by the camera when the object is in a target position and attitude and the light irradiated from the irradiation device is striking the object. The visual servo system calculates a control input to be inputted to the robot based on a difference in luminance value between the current image and the target image, and inputs the control input to the robot. The light irradiated by the irradiation device has a luminance distribution based on a reference image in which a luminance value changes along a predetermined direction.
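The luminance-difference control law can be sketched as follows. The abstract leaves the exact mapping from image difference to robot input open, so the proportional law, flat pixel lists, and the gain value here are assumptions for illustration only.

```python
def luminance_error(current, target):
    """Per-pixel luminance difference between the current image output
    by the camera and the stored target image (images as flat lists)."""
    return [c - t for c, t in zip(current, target)]

def control_input(current, target, gain=0.01):
    # Assumed control law: drive the robot proportionally to the mean
    # luminance error, so the input vanishes when the current image
    # matches the target image (object in target position and attitude).
    err = luminance_error(current, target)
    return -gain * sum(err) / len(err)
```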
ROBOTIC SYSTEM WITH ERROR DETECTION AND DYNAMIC PACKING MECHANISM
A method for operating a robotic system includes determining a discretized object model based on source sensor data; comparing the discretized object model to a packing plan or to master data; determining a discretized platform model based on destination sensor data; determining height measures based on the destination sensor data; comparing the discretized platform model and/or the height measures to an expected platform model and/or expected height measures; and determining one or more errors by (i) determining at least one source matching error by identifying one or more disparities between (a) the discretized object model and (b) the packing plan or the master data or (ii) determining at least one destination matching error by identifying one or more disparities between (a) the discretized platform model or the height measures and (b) the expected platform model or the expected height measures, respectively.
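The matching-error steps can be sketched as a cell-wise comparison of a discretized model against its expectation. The dictionary representation (cell coordinates mapping to values or height measures) and the function name are illustrative assumptions; the abstract does not fix a data structure.

```python
def matching_errors(observed, expected):
    """Sketch of the disparity check: compare a discretized model
    (e.g. the discretized platform model or height measures) against
    the expected model cell by cell, and return the cells where the
    two disagree -- the disparities that constitute a matching error."""
    cells = set(observed) | set(expected)
    return sorted(cell for cell in cells
                  if observed.get(cell) != expected.get(cell))
```

A non-empty result from the source-side comparison would correspond to a source matching error, and from the destination-side comparison to a destination matching error.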
ASSEMBLING PARTS IN AN ASSEMBLY LINE
A method for assembling parts in an assembly line, such as an automotive final assembly line, is disclosed. The method includes advancing a part along the assembly line with an Automated Guided Vehicle (AGV), arranging a first real time vision system to monitor the position of the AGV in at least two directions, and providing the readings of the first real time vision system to a controller arranged to control an assembly unit of the assembly line to perform an automated operation on the part that is advanced or supported by the AGV. An assembly line is also disclosed.
Integrated robotic system and method for autonomous vehicle maintenance
A robotic system includes a controller configured to obtain image data from one or more optical sensors and to determine one or more of a location and/or pose of a vehicle component based on the image data. The controller also is configured to determine a model of an external environment of the robotic system based on the image data and to determine tasks to be performed by components of the robotic system to perform maintenance on the vehicle component. The controller also is configured to assign the tasks to the components of the robotic system and to communicate control signals to the components of the robotic system to autonomously control the robotic system to perform the maintenance on the vehicle component.
COMMUNICATION SYSTEM AND METHOD FOR CONTROLLING COMMUNICATION SYSTEM
A communication system according to the present disclosure includes a camera configured to photograph a user who is a communication partner, a microphone configured to form a beam in a specific direction, and a control unit. The control unit identifies the position of the user's mouth using an image of the user taken by the camera and controls the position of a head part so that the identified position of the mouth is included in the region of the beamforming.
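The head-position control can be sketched as mapping the mouth's horizontal position in the camera image to a pan correction that steers the beam region onto it. The pinhole-style linear mapping and the 60-degree field of view are assumptions for illustration, not values from the disclosure.

```python
def head_pan_correction(mouth_x, image_width, fov_deg=60.0):
    """Sketch: convert the identified mouth position (pixel column in
    the camera image) into the pan angle, in degrees, by which the head
    part should turn so the mouth falls inside the beamforming region.
    Assumes the beam axis is aligned with the image center and that
    angle varies linearly across an assumed horizontal field of view."""
    offset = (mouth_x - image_width / 2) / image_width  # -0.5 .. 0.5
    return offset * fov_deg
```

A mouth detected at the image center needs no correction; one detected a quarter-image to the right needs a 15-degree pan under these assumptions.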
ROBOT AND METHOD FOR ESTIMATING DIRECTION ON BASIS OF VANISHING POINT IN LOW LIGHT IMAGE
The present invention relates to a robot and a method for estimating an orientation on the basis of a vanishing point in a low-luminance image. The robot according to an embodiment of the present invention includes a camera unit configured to capture an image of at least one of a forward area and an upward area of the robot, and an image processor configured to extract line segments from a first image captured by the camera unit by applying histogram equalization and a rolling guidance filter to the first image, calculate a vanishing point on the basis of the line segments, and estimate a global angle of the robot corresponding to the vanishing point.
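The vanishing-point calculation can be sketched as a least-squares intersection of the extracted line segments' supporting lines. The (a, b, c) line representation and the normal-equations solve are a standard technique substituted here for illustration; the abstract does not specify the estimator.

```python
def vanishing_point(lines):
    """Sketch: each line is (a, b, c) with a*x + b*y = c, standing in
    for a segment extracted after histogram equalization and rolling
    guidance filtering. The vanishing point is the least-squares
    intersection, found via the 2x2 normal equations."""
    saa = sum(a * a for a, b, c in lines)
    sab = sum(a * b for a, b, c in lines)
    sbb = sum(b * b for a, b, c in lines)
    sac = sum(a * c for a, b, c in lines)
    sbc = sum(b * c for a, b, c in lines)
    det = saa * sbb - sab * sab
    x = (sac * sbb - sbc * sab) / det
    y = (saa * sbc - sab * sac) / det
    return x, y
```

The global angle of the robot would then follow from the direction of the estimated vanishing point in the image.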
Techniques For Detecting Errors Or Loss Of Accuracy In A Surgical Robotic System
Systems and methods for operating a robotic surgical system are provided. The system includes a surgical tool, a manipulator comprising links for controlling the tool, a navigation system comprising a tracker coupled to the manipulator or the tool and a localizer to monitor a state of the tracker. Controller(s) determine a raw (or lightly filtered) relationship between one or more components of the manipulator and one or more components of the navigation system by utilizing one or more of raw kinematic measurement data from the manipulator and raw navigation data from the navigation system. The controller(s) utilize the raw (or lightly filtered) relationship to determine whether an error has occurred relating to at least one of the manipulator and the navigation system.
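The error check on the raw relationship can be sketched as comparing the tool position implied by raw kinematic measurement data against the position reported by the navigation localizer. Summarizing the relationship as a Euclidean distance, and the millimeter tolerance, are simplifying assumptions for illustration.

```python
def detect_error(kinematic_pos, navigation_pos, tol=2.0):
    """Sketch: `kinematic_pos` is the tool position predicted from raw
    manipulator kinematics; `navigation_pos` is the position the
    localizer reports for the tracker. If the two disagree by more than
    `tol` (assumed mm), flag an error relating to the manipulator or
    the navigation system. Returns (error_flag, distance)."""
    d = sum((k - n) ** 2 for k, n in zip(kinematic_pos, navigation_pos)) ** 0.5
    return d > tol, d
```

Using raw (or only lightly filtered) data keeps the comparison responsive, so a loss of accuracy is not masked by heavy filtering.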