Patent classifications
G05B2219/39391
POSITION CORRECTION SYSTEM, POSITION CORRECTION METHOD, AND POSITION CORRECTION PROGRAM
A position correction system includes: one or more conveyer devices; an imaging device fixed at different installation positions of the one or more conveyer devices and configured to image a robot to generate a plurality of captured images in a state in which the conveyer device has stopped at a predetermined stop position in a predetermined range from the robot; a position calculating device configured to calculate position coordinates of an actual reference point of the robot using the generated captured images; a correction value calculating device configured to calculate a correction value based on a difference between the calculated position coordinates of the actual reference point of the robot and position coordinates of a target reference point of the robot which is determined by simulation; and a position correcting device configured to correct the position of the actual reference point of the robot based on the calculated correction value.
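The correction step described above reduces to computing the difference between the simulated target coordinates and the vision-measured actual coordinates, then applying it. A minimal sketch of that arithmetic (function names are illustrative, not from the patent):

```python
import numpy as np

def correction_value(actual_ref, target_ref):
    """Correction value: difference between the target reference-point
    coordinates (from simulation) and the actual coordinates (from the
    captured images)."""
    return np.asarray(target_ref, float) - np.asarray(actual_ref, float)

def correct_position(actual_ref, correction):
    """Shift the actual reference point by the computed correction value."""
    return np.asarray(actual_ref, float) + correction
```

Applying the correction to the actual point recovers the target point by construction.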
Vision-based sensor system and control method for robot arms
A method for determining the joint positions of a kinematic chain uses only an imaging sensor and a computing unit. Characteristic features on the links and joints of the kinematic chain are identified, and the joint positions are calculated from these visual measurements. The robot can be controlled without the use of joint encoders. A sensor system for monitoring the status of a kinematic chain includes a computing unit and an imaging sensor. The imaging sensor may be mounted to the kinematic chain or in the surroundings of the kinematic chain, and monitors the kinematic chain and/or its surroundings. The computing unit determines a pose and/or movement parameters of at least one element of the kinematic chain by analyzing an output signal of the imaging sensor, in particular by analyzing characteristic features, and determines a rotational joint position by analyzing the characteristic features.
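One simple way a rotational joint position can be recovered from tracked features is as the signed angle between the two link directions defined by three feature points (a point on the preceding link, the joint centre, a point on the following link). This is a sketch under that assumption; the patent does not specify this particular geometry:

```python
import math

def joint_angle(p_prev, p_joint, p_next):
    """Rotational joint position (radians) from three tracked 2-D features:
    signed angle between the incoming link direction (p_prev -> p_joint)
    and the outgoing link direction (p_joint -> p_next)."""
    v1 = (p_joint[0] - p_prev[0], p_joint[1] - p_prev[1])
    v2 = (p_next[0] - p_joint[0], p_next[1] - p_joint[1])
    # atan2 of (cross, dot) gives the signed angle between the two vectors
    return math.atan2(v1[0] * v2[1] - v1[1] * v2[0],
                      v1[0] * v2[0] + v1[1] * v2[1])
```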
HEAD MOUNTED DISPLAY FOR REMOTE OPERATION OF MACHINERY
A system and method for providing real-time, sensory information associated with a remote location using a remote capture device and a head-mounted display. In some embodiments, the system comprises a fiber-optic cable to transmit a signal comprising sensory information collected by the remote capture device to the head-mounted display. Further, the remote capture device may be secured onto a boom of an aerial device.
Techniques For Detecting Errors Or Loss Of Accuracy In A Surgical Robotic System
Systems and methods for operating a robotic surgical system are provided. The system includes a surgical tool, a manipulator comprising links for controlling the tool, and a navigation system including a tracker and a localizer to monitor a state of the tracker. Controller(s) determine a relationship between one or more components of the manipulator and one or more components of the navigation system by utilizing kinematic measurement data from the manipulator and navigation data from the navigation system. The controller(s) utilize the relationship to determine whether an error has occurred relating to at least one of the manipulator and the navigation system. The error is at least one of undesired movement of the manipulator, undesired movement of the localizer, failure of any one or more components of the manipulator or the localizer, and/or improper calibration data.
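The core check described in the abstract is a consistency test between two independent estimates of the same state: one from manipulator kinematics, one from the navigation system. A minimal sketch of such a test (the threshold and function name are illustrative assumptions):

```python
import numpy as np

def detect_error(kinematic_pos, navigation_pos, tol_mm=1.0):
    """Flag an error when the tool position predicted from the manipulator's
    kinematic measurement data diverges from the position observed by the
    navigation system's localizer beyond a tolerance."""
    deviation = np.linalg.norm(np.asarray(kinematic_pos, float)
                               - np.asarray(navigation_pos, float))
    return deviation > tol_mm
```

A persistent discrepancy cannot by itself distinguish which subsystem failed; the abstract's "relationship" between the components is what allows the controllers to attribute the error.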
Method for controlling a robot arm
A method for visually controlling a robot arm which is displaceable in a plurality of degrees of freedom, the robot arm carrying at least one displaceable reference point, includes the steps of: a) placing at least one camera so that a target point where the reference point is to be placed is contained in an image output by the at least one camera; b) displacing the robot arm so that the reference point is within the image; c) determining a vector which, in the image, connects the reference point to the target point; d) choosing one of the plurality of degrees of freedom, moving the robot arm by a predetermined standard distance in the one degree of freedom, and recording a standard displacement of the reference point within the image resulting from the movement of the robot arm; e) repeating step d) at least until the vector can be decomposed.
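Steps d) and e) above amount to estimating the columns of an image Jacobian by perturbation: each recorded standard displacement is the image-space response to a unit motion in one degree of freedom. Once enough columns are recorded, the reference-to-target vector can be decomposed in that basis to obtain the joint motions that cancel it. A sketch of that decomposition (least squares, to tolerate noisy and non-orthogonal columns; names are illustrative):

```python
import numpy as np

def servo_step(recorded_displacements, vector):
    """Decompose the image-space vector (reference point -> target point)
    in the basis of the per-DOF standard displacements recorded in step d).
    Returns, per degree of freedom, the multiple of the standard distance
    to move so that the reference point lands on the target point."""
    J = np.column_stack(recorded_displacements)  # estimated image Jacobian
    coeffs, *_ = np.linalg.lstsq(J, np.asarray(vector, float), rcond=None)
    return coeffs
```

With two well-chosen degrees of freedom the 2-D image vector is exactly decomposable; extra recorded displacements simply give a least-squares fit.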
INTEGRATED ROBOTIC SYSTEM AND METHOD FOR AUTONOMOUS VEHICLE MAINTENANCE
A robotic system includes a controller configured to obtain image data from one or more optical sensors and to determine one or more of a location and/or pose of a vehicle component based on the image data. The controller also is configured to determine a model of an external environment of the robotic system based on the image data and to determine tasks to be performed by components of the robotic system to perform maintenance on the vehicle component. The controller also is configured to assign the tasks to the components of the robotic system and to communicate control signals to the components of the robotic system to autonomously control the robotic system to perform the maintenance on the vehicle component.
ROBOTIC SYSTEM WITH ERROR DETECTION AND DYNAMIC PACKING MECHANISM
A method for operating a robotic system includes determining a discretized object model based on source sensor data; comparing the discretized object model to a packing plan or to master data; determining a discretized platform model based on destination sensor data; determining height measures based on the destination sensor data; comparing the discretized platform model and/or the height measures to an expected platform model and/or expected height measures; and determining one or more errors by (i) determining at least one source matching error by identifying one or more disparities between (a) the discretized object model and (b) the packing plan or the master data or (ii) determining at least one destination matching error by identifying one or more disparities between (a) the discretized platform model or the height measures and (b) the expected platform model or the expected height measures, respectively.
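Both error types reduce to comparing a sensed discretized model against an expected one: exact mismatch for the object model, tolerance-based mismatch for the height measures. A minimal sketch of those two comparisons (function names and the tolerance are illustrative assumptions):

```python
import numpy as np

def source_matching_error(discretized_object_model, expected_model):
    """Source matching error: any disparity between the discretized object
    model (from source sensor data) and the packing plan / master data."""
    return not np.array_equal(np.asarray(discretized_object_model),
                              np.asarray(expected_model))

def destination_matching_error(height_measures, expected_heights, tol=0.01):
    """Destination matching error: any height measure (from destination
    sensor data) deviating from the expected platform heights beyond tol."""
    diff = np.abs(np.asarray(height_measures, float)
                  - np.asarray(expected_heights, float))
    return bool(np.any(diff > tol))
```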
Control of a robot assembly
A method for the control of a robot assembly having at least one robot. The method includes acquiring pose data from an object arrangement having at least one object, which data has a first time interval; determining modified pose data from the object arrangement, which data has a second time interval that is larger or smaller than the first time interval, or is equal to the first time interval, on the basis of the acquired pose data; and controlling the robot assembly on the basis of said modified pose data.
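Determining modified pose data at a second time interval from pose data acquired at a first interval can be done, for example, by resampling with interpolation. A sketch under that assumption (linear interpolation per pose component; the patent abstract does not commit to a particular method):

```python
import numpy as np

def resample_poses(times, poses, new_interval):
    """Resample pose samples acquired at the times given (first time
    interval) onto a grid with the second time interval, interpolating
    each pose component linearly."""
    times = np.asarray(times, float)
    poses = np.asarray(poses, float)           # shape: (n_samples, n_components)
    new_times = np.arange(times[0], times[-1] + 1e-9, new_interval)
    resampled = np.stack(
        [np.interp(new_times, times, poses[:, k]) for k in range(poses.shape[1])],
        axis=1)
    return new_times, resampled
```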
Determining a Virtual Representation of an Environment By Projecting Texture Patterns
Example methods and systems for determining 3D scene geometry by projecting patterns of light onto a scene are provided. In an example method, a first projector may project a first random texture pattern having a first wavelength and a second projector may project a second random texture pattern having a second wavelength. A computing device may receive sensor data that is indicative of an environment as perceived from a first viewpoint of a first optical sensor and a second viewpoint of a second optical sensor. Based on the received sensor data, the computing device may determine corresponding features between sensor data associated with the first viewpoint and sensor data associated with the second viewpoint. And based on the determined corresponding features, the computing device may determine an output including a virtual representation of the environment that includes depth measurements indicative of distances to at least one object.
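Once corresponding features between the two viewpoints are found, the depth measurements follow from standard stereo triangulation: for a rectified pinhole pair, depth = focal length × baseline / disparity. A sketch of that final step (the abstract's correspondence matching itself is not shown):

```python
def depth_from_disparity(x_first, x_second, focal_px, baseline_m):
    """Distance to a feature matched between the first and second viewpoints
    of a rectified stereo pair: depth = f * b / disparity."""
    disparity = x_first - x_second  # horizontal pixel offset of the feature
    if disparity <= 0:
        raise ValueError("matched feature must have positive disparity")
    return focal_px * baseline_m / disparity
```

The projected random texture patterns serve to guarantee that even featureless surfaces yield matchable correspondences, so this formula can be evaluated densely across the scene.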
SYSTEMS, APPARATUSES, AND METHODS FOR DYNAMIC FILTERING OF HIGH INTENSITY BROADBAND ELECTROMAGNETIC WAVES FROM IMAGE DATA FROM A SENSOR COUPLED TO A ROBOT
Systems, apparatuses, and methods for dynamic filtering of high intensity broadband electromagnetic waves in image data from a sensor of a robot are disclosed herein. According to at least one non-limiting exemplary embodiment, sunlight or light emitted from nearby fluorescent lamps may cause a robot to generate false positives of objects nearby the robot as the light may be of high intensity and large bandwidth. These false positives may cause a robot to get stuck or navigate without use of a camera sensor, which may be unsafe.
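The abstract describes the problem (high-intensity broadband light producing false positive detections) rather than the filter itself. One simple form such filtering could take, purely as an illustrative assumption, is masking saturated pixels so they are excluded from object detection instead of being treated as obstacles:

```python
import numpy as np

def filter_high_intensity(image, saturation=250):
    """Mask pixels saturated by high-intensity broadband light (e.g. direct
    sunlight or fluorescent glare) so downstream object detection ignores
    them rather than reporting false positives. Returns the filtered image
    (invalid pixels set to NaN) and the invalid-pixel mask.

    This is an illustrative sketch, not the patent's actual filter."""
    img = np.asarray(image, float)
    mask = img >= saturation
    filtered = img.copy()
    filtered[mask] = np.nan  # mark as "no data", not as a detected object
    return filtered, mask
```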