Determining a Virtual Representation of an Environment By Projecting Texture Patterns
20180093377 · 2018-04-05

Example methods and systems for determining 3D scene geometry by projecting patterns of light onto a scene are provided. In an example method, a first projector may project a first random texture pattern having a first wavelength and a second projector may project a second random texture pattern having a second wavelength. A computing device may receive sensor data that is indicative of an environment as perceived from a first viewpoint of a first optical sensor and a second viewpoint of a second optical sensor. Based on the received sensor data, the computing device may determine corresponding features between sensor data associated with the first viewpoint and sensor data associated with the second viewpoint. Based on the determined corresponding features, the computing device may then determine an output including a virtual representation of the environment that includes depth measurements indicative of distances to at least one object.
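The abstract above turns feature correspondences between two viewpoints into depth measurements. A minimal sketch of that triangulation step, assuming rectified pinhole cameras (the function name and parameters are illustrative, not from the patent):

```python
# Hypothetical sketch of depth-from-correspondence: given the pixel columns of
# the same projected-texture feature seen from two calibrated, rectified
# viewpoints, depth follows directly from the disparity.

def depth_from_disparity(x_left: float, x_right: float,
                         focal_px: float, baseline_m: float) -> float:
    """Depth (meters) of a feature matched between two rectified views."""
    disparity = x_left - x_right  # pixels; larger disparity means a closer object
    if disparity <= 0:
        raise ValueError("matched feature must have positive disparity")
    return focal_px * baseline_m / disparity

# Example: 600 px focal length, 10 cm baseline, 30 px disparity -> 2.0 m
print(depth_from_disparity(340.0, 310.0, focal_px=600.0, baseline_m=0.10))
```

The random texture patterns in the abstract exist precisely to make such correspondences findable on otherwise featureless surfaces.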

ROBOT DEVICE AND METHOD OF CONTROLLING MOVEMENT OF ROBOT DEVICE
20180074500 · 2018-03-15

The robot device includes a spatial information recording unit that records first spatial information, associated with a structure, about an area in which the self-propelled robot device is expected to move with respect to that structure; a spatial information acquisition section, mounted on the robot device, that acquires second spatial information about the area surrounding the robot device as it moves; and a spatial information updating unit that updates the first spatial information in the recording unit with the second spatial information acquired by the acquisition section. The first spatial information recorded in the spatial information recording unit is used when the robot device moves with respect to the structure.
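The update step described above can be sketched as a grid-map merge: the recorded map is overwritten cell by cell wherever the onboard sensing has fresh observations. The grid representation and cell encoding are assumptions for illustration, not from the patent:

```python
# Illustrative sketch: the prior map of the area ("first spatial information")
# is updated with the robot's local observations ("second spatial
# information"). Cell values: 0 = free, 1 = occupied, None = unobserved.

def update_map(recorded, acquired):
    """Return the recorded map with acquired (local) observations merged in."""
    return [
        [new if new is not None else old
         for old, new in zip(old_row, new_row)]
        for old_row, new_row in zip(recorded, acquired)
    ]

prior = [[0, 1], [1, 0]]
local = [[None, 0], [None, None]]   # robot re-observed one cell as free
print(update_map(prior, local))     # -> [[0, 0], [1, 0]]
```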

ROBOT CONTROL DEVICE
20180050452 · 2018-02-22

A robot control device includes a feature-point detecting unit that detects, from an image of an object acquired by a visual sensor, the positions of a plurality of feature points on the object in a predetermined cycle; a position/orientation calculating unit that updates, in the predetermined cycle, respective equations of motion of the feature points on the basis of their detected positions, and that calculates the position or orientation of the object from the feature-point positions given by the updated equations of motion; and a robot-arm-movement control unit that controls the movement of a robot arm so as to follow the object, on the basis of the calculated position or orientation.
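The per-feature "equation of motion" updated every cycle can be sketched with a simple constant-velocity filter per feature point; the alpha-beta gains and the averaging of feature positions into an object position are illustrative choices, not specified by the patent:

```python
# Hedged sketch: each tracked feature keeps a constant-velocity motion model
# refreshed each detection cycle (alpha-beta filter); the object position is
# taken from the filtered feature positions.

class FeatureTracker:
    def __init__(self, x0: float, alpha: float = 0.5, beta: float = 0.1):
        self.x, self.v = x0, 0.0          # state of the equation of motion
        self.alpha, self.beta = alpha, beta

    def update(self, measured_x: float, dt: float) -> float:
        predicted = self.x + self.v * dt  # predict from the motion model
        residual = measured_x - predicted # innovation from the new detection
        self.x = predicted + self.alpha * residual
        self.v += self.beta * residual / dt
        return self.x

def object_position(trackers, measurements, dt):
    """Object position as the mean of the filtered feature positions."""
    xs = [t.update(m, dt) for t, m in zip(trackers, measurements)]
    return sum(xs) / len(xs)
```

Filtering each feature before combining them is what lets the arm follow the object smoothly despite noisy per-cycle detections.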

Systems and methods for control of robotic manipulation

A robot system and method are provided that move an articulable arm relative to a target object. Perception information corresponding to the position of the arm relative to the target object is acquired at an acquisition rate. Movement of the arm is controlled at a control rate that is faster than the acquisition rate, unsynchronized with it, or both. Predicted position information representative of a predicted positioning of the arm is generated using the perception information. The arm is controlled using both the perception information and the predicted position information.
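The faster-than-perception control idea above can be sketched as a loop that runs several control ticks per perception sample and extrapolates the pose in between; the linear predictor and the rate ratio are assumptions for illustration:

```python
# Sketch: perception delivers arm poses at a slow rate; the controller ticks
# several times per sample and fills the gaps by extrapolating with the last
# observed velocity (predicted position information).

def control_ticks(perception_samples, ticks_per_sample):
    """Yield a predicted pose for every control tick."""
    prev = None
    for sample in perception_samples:
        velocity = 0.0 if prev is None else (sample - prev)
        for tick in range(ticks_per_sample):
            # between (slow) perception updates, extrapolate linearly
            yield sample + velocity * tick / ticks_per_sample
        prev = sample

poses = list(control_ticks([0.0, 1.0], ticks_per_sample=2))
print(poses)  # -> [0.0, 0.0, 1.0, 1.5]
```

Running the control loop off predictions rather than waiting for perception is what decouples the two rates.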

Dish Manipulation Systems And Methods
20180036889 · 2018-02-08

Example dish manipulation systems and methods are described. In one implementation, a robotic actuator includes at least one magnet. The robotic actuator is configured to manipulate, using magnetic attraction, an article of magnetic dishware. A processing system electrically coupled to the robotic actuator is configured to generate commands for positioning the robotic actuator in three-dimensional space.

Determining a virtual representation of an environment by projecting texture patterns
09862093 · 2018-01-09

Example methods and systems for determining 3D scene geometry by projecting patterns of light onto a scene are provided. In an example method, a first projector may project a first random texture pattern having a first wavelength and a second projector may project a second random texture pattern having a second wavelength. A computing device may receive sensor data that is indicative of an environment as perceived from a first viewpoint of a first optical sensor and a second viewpoint of a second optical sensor. Based on the received sensor data, the computing device may determine corresponding features between sensor data associated with the first viewpoint and sensor data associated with the second viewpoint. Based on the determined corresponding features, the computing device may then determine an output including a virtual representation of the environment that includes depth measurements indicative of distances to at least one object.

Techniques for detecting errors or loss of accuracy in a surgical robotic system
12171510 · 2024-12-24

A surgical system and method involve a robotic system with a base and a localizer that monitors a tracker supported by the robotic system. Sensor(s) are coupled to the robotic system and/or the localizer. Controller(s) determine a relationship between the base and the localizer. The controller(s) monitor the relationship to detect an error related to one or both of the robotic system and the localizer. The controller(s) utilize the sensor(s) to determine a cause of the error.
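The monitored base-to-localizer relationship can be sketched as a drift check against the registered offset; the vector representation, tolerance, and function names are illustrative assumptions, not the patent's implementation:

```python
# Hedged sketch: the controller repeatedly measures the base->localizer offset
# and flags an error when it drifts beyond a tolerance from the value
# registered at setup (e.g. a bumped camera or base).

import math

def relationship_error(registered, measured, tol):
    """True if the measured base->localizer offset drifted past tol (meters)."""
    drift = math.dist(registered, measured)  # Euclidean distance, Python 3.8+
    return drift > tol

print(relationship_error((0.0, 0.0, 1.0), (0.0, 0.0, 1.004), tol=0.002))  # -> True
```

Once an error is flagged, the abstract's sensor readings would be consulted to attribute the drift to the robotic system or to the localizer.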

Robotic system with error detection and dynamic packing mechanism
12162166 · 2024-12-10

A method for operating a robotic system includes determining a discretized object model based on source sensor data; comparing the discretized object model to a packing plan or to master data; determining a discretized platform model based on destination sensor data; determining height measures based on the destination sensor data; comparing the discretized platform model and/or the height measures to an expected platform model and/or expected height measures; and determining one or more errors by (i) determining at least one source matching error by identifying one or more disparities between (a) the discretized object model and (b) the packing plan or the master data or (ii) determining at least one destination matching error by identifying one or more disparities between (a) the discretized platform model or the height measures and (b) the expected platform model or the expected height measures, respectively.
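The source- and destination-matching errors above amount to cell-wise comparisons of discretized models. A minimal sketch, assuming the models are 2D grids and treating any disagreeing cell as a disparity (the grid encoding and function names are illustrative):

```python
# Illustrative sketch of the matching-error step: a disparity is any cell
# where the observed discretized model differs from the expected one.

def count_disparities(observed, expected):
    """Number of cells where two discretized models disagree."""
    return sum(
        o != e
        for o_row, e_row in zip(observed, expected)
        for o, e in zip(o_row, e_row)
    )

def matching_errors(object_model, packing_plan, platform_heights, expected_heights):
    """Source error: object vs. plan. Destination error: platform vs. expected."""
    return {
        "source_error": count_disparities(object_model, packing_plan) > 0,
        "destination_error": count_disparities(platform_heights, expected_heights) > 0,
    }
```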

Techniques For Detecting Errors Or Loss Of Accuracy In A Surgical Robotic System
20250041004 · 2025-02-06

A surgical system and method involve a robotic system with a base and a localizer that monitors a tracker supported by the robotic system. Controller(s) determine a relationship between the base and the localizer. The controller(s) monitor the relationship to detect an error related to one or both of the robotic system and the localizer. In response to detection of the error, the controller(s) modify operation of the robotic system.

Methods and systems for recognizing machine-readable information on three-dimensional objects
09707682 · 2017-07-18

Methods and systems for recognizing machine-readable information on three-dimensional (3D) objects are described. A robotic manipulator may move at least one physical object through a designated area in space. As the at least one physical object is being moved through the designated area, one or more optical sensors may determine a location of a machine-readable code on the at least one physical object and, based on the determined location, scan the machine-readable code so as to determine information associated with the at least one physical object encoded in the machine-readable code. Based on the information associated with the at least one physical object, a computing device may then determine a respective location in a physical environment of the robotic manipulator at which to place the at least one physical object. The robotic manipulator may then be directed to place the at least one physical object at the respective location.
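The scan-and-place flow above can be sketched as a lookup from the decoded code to a placement target; the SKU payloads, bin names, and coordinates below are hypothetical stand-ins for whatever the computing device actually consults:

```python
# Sketch: decode the machine-readable code on the moving object, look up
# where that object type belongs, and return the placement target for the
# robotic manipulator. All entries are illustrative.

PLACEMENT_BY_SKU = {
    "SKU-1001": ("bin_a", (0.4, 0.2, 0.1)),
    "SKU-2002": ("bin_b", (0.8, 0.2, 0.1)),
}

def placement_for(code_payload):
    """Map a decoded code to (destination name, x/y/z placement in meters)."""
    if code_payload not in PLACEMENT_BY_SKU:
        # unknown objects are set aside rather than mis-placed
        return ("reject_chute", (1.2, 0.0, 0.1))
    return PLACEMENT_BY_SKU[code_payload]

print(placement_for("SKU-1001"))  # -> ('bin_a', (0.4, 0.2, 0.1))
```

Scanning while the object is already in motion, as the abstract describes, avoids a separate stop-and-scan station in the pick-and-place cycle.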