G05B2219/40039

METHODS, APPARATUS, COMPUTER PROGRAMS, AND NON-TRANSITORY COMPUTER READABLE STORAGE MEDIUMS FOR CONTROLLING AT LEAST ONE OF A FIRST ROBOT AND A SECOND ROBOT TO COLLABORATE WITHIN A SYSTEM
20180009109 · 2018-01-11 ·

A method of controlling at least one of a first robot and a second robot to collaborate within a system, the first robot and the second robot being physically separate from one another, the method including: receiving sensed data associated with the second robot; determining a position and/or orientation of the second robot using the received sensed data; determining an action for the second robot using the determined position and/or orientation of the second robot; and providing a control signal to the second robot to cause the second robot to perform the determined action to collaborate with the first robot.
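The claimed loop (receive sensed data, estimate pose, choose an action, emit a control signal) can be sketched as follows. This is a minimal illustration, not the patented method: the function names, the flat 2D pose representation, and the simple move-to-standoff action rule are all assumptions.

```python
import math

def estimate_pose(sensed_data):
    """Determine position and orientation of the second robot from sensed data."""
    return (sensed_data["x"], sensed_data["y"]), sensed_data["heading"]

def determine_action(pose, first_robot_pose, standoff=0.5):
    """Pick a move bringing the second robot to a standoff distance from the first."""
    (x, y), _heading = pose
    fx, fy = first_robot_pose
    dx, dy = fx - x, fy - y
    dist = math.hypot(dx, dy)
    if dist <= standoff:
        return ("hold", 0.0, 0.0)
    scale = (dist - standoff) / dist  # stop short of the first robot
    return ("move", dx * scale, dy * scale)

def control_step(sensed_data, first_robot_pose):
    """Receive sensed data, determine pose and action, provide a control signal."""
    pose = estimate_pose(sensed_data)
    command, dx, dy = determine_action(pose, first_robot_pose)
    return {"command": command, "dx": dx, "dy": dy}

# Second robot at the origin, first robot 2 m away along x.
signal = control_step({"x": 0.0, "y": 0.0, "heading": 0.0},
                      first_robot_pose=(2.0, 0.0))
```

With the first robot 2 m away and a 0.5 m standoff, the signal commands a 1.5 m move along x.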

MACHINE LEARNING BASED DECISION MAKING FOR ROBOTIC ITEM HANDLING

A method for controlling a robotic item handler is described. The method includes obtaining first point cloud data related to a first three-dimensional (3D) image and second point cloud data related to a second 3D image, captured by a first sensor device and a second sensor device, respectively. Further, the method can include transforming the first point cloud data and the second point cloud data into combined point cloud data that is used as an input to a convolutional neural network to construct a machine learning model. The machine learning model can output a decision classification indicative of a first probability associated with a first operating mode and a second probability associated with a second operating mode. Furthermore, the method can include operating the robotic item handler according to the first operating mode or the second operating mode based on a comparison of the first probability and the second probability.
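The pipeline above (combine two point clouds, classify into two mode probabilities, pick the likelier mode) can be sketched as below. A tiny linear classifier stands in for the convolutional neural network, and every name and parameter here is an illustrative assumption rather than anything from the patent.

```python
import numpy as np

def combine_point_clouds(cloud_a, cloud_b, transform_b):
    """Transform the second cloud into the first sensor's frame and concatenate."""
    rotation, translation = transform_b          # 3x3 rotation, length-3 translation
    cloud_b_in_a = cloud_b @ rotation.T + translation
    return np.vstack([cloud_a, cloud_b_in_a])

def decide_mode(combined, weights, bias):
    """Output probabilities for two operating modes and pick the likelier one."""
    features = combined.mean(axis=0)             # crude pooling in place of the CNN
    logit = float(features @ weights + bias)
    p_mode1 = 1.0 / (1.0 + np.exp(-logit))
    p_mode2 = 1.0 - p_mode1
    return ("mode1", p_mode1) if p_mode1 >= p_mode2 else ("mode2", p_mode2)

cloud_a = np.array([[0.0, 0.0, 1.0], [0.1, 0.0, 1.0]])   # from first sensor
cloud_b = np.array([[0.0, 0.1, 1.0]])                    # from second sensor
identity = (np.eye(3), np.zeros(3))                      # assumed calibration
mode, prob = decide_mode(combine_point_clouds(cloud_a, cloud_b, identity),
                         weights=np.array([0.0, 0.0, 2.0]), bias=-1.0)
```

The robotic item handler would then be operated according to whichever mode wins the probability comparison.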

ACTIVE DAMPING SYSTEM
20210291362 · 2021-09-23 ·

The present disclosure provides a system for performing interactions within a physical environment, the system including: (a) a robot base; (b) a robot base actuator that moves the robot base relative to the environment; (c) a robot arm mounted to the robot base, the robot arm including an end effector mounted thereon; (d) a tracking system that measures at least one of: (i) a robot base position indicative of a position of the robot base relative to the environment; and, (ii) a robot base movement indicative of a movement of the robot base relative to the environment; (e) an active damping system that actively damps movement of the robot base relative to the environment; and, (f) a control system that: (i) determines a movement correction in accordance with signals from the tracking system; and, (ii) controls the active damping system at least partially in accordance with the movement correction.
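The control-system portion of the claim, in which a movement correction is derived from tracking-system signals and used to drive the active damping system, might look roughly like this. The PD-style correction rule, the gains, and the force saturation limit are all assumptions for illustration, not the disclosed controller.

```python
def movement_correction(measured_pos, reference_pos, measured_vel,
                        kp=4.0, kd=1.5):
    """Derive a correction from tracked robot base position and movement."""
    error = reference_pos - measured_pos
    return kp * error - kd * measured_vel

def damping_command(correction, max_force=50.0):
    """Saturate the correction into a force command for the active damper."""
    return max(-max_force, min(max_force, correction))

# Robot base has drifted +0.2 m from its reference and is moving at +0.1 m/s.
force = damping_command(movement_correction(measured_pos=0.2,
                                            reference_pos=0.0,
                                            measured_vel=0.1))
```

Here the damper is commanded to push back (negative force) against both the position error and the residual base motion.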

Proximity detection in assembly environments having machinery
10795342 · 2020-10-06 ·

Systems and methods are provided for proximity detection in a fabrication environment. One embodiment is a method for reporting proximity in an assembly environment. The method includes inserting an arm of a bracket into an interior of a part that is held by a cradle and worked upon by a robot, placing indexing features at the bracket into contact with indexing features of the cradle, operating sensors at the bracket to directly detect a location of a first proximity detector worn by a technician and a location of a second proximity detector at the robot, and directing the first proximity detector to provide a warning to the technician if a distance between the first proximity detector and the second proximity detector is less than a threshold.
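The warning condition at the end of the claim reduces to a distance-versus-threshold check between the two detectors. A minimal sketch, assuming the bracket sensors already yield 3D positions for both detectors; the names and the 2 m threshold are illustrative, not from the patent.

```python
import math

def proximity_warning(technician_pos, robot_pos, threshold=2.0):
    """Return True when the technician's detector should issue a warning,
    i.e. when the detector-to-detector distance is below the threshold."""
    return math.dist(technician_pos, robot_pos) < threshold

# Technician's detector ~1.41 m from the robot's detector: warn.
warn = proximity_warning(technician_pos=(0.0, 0.0, 0.0),
                         robot_pos=(1.0, 1.0, 0.0))
```

At 1.41 m against a 2 m threshold the check fires; at 5 m it would not.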

PROXIMITY DETECTION IN ASSEMBLY ENVIRONMENTS HAVING MACHINERY
20200310387 · 2020-10-01 ·

Systems and methods are provided for proximity detection in a fabrication environment. One embodiment is a method for reporting proximity in an assembly environment. The method includes inserting an arm of a bracket into an interior of a part that is held by a cradle and worked upon by a robot, placing indexing features at the bracket into contact with indexing features of the cradle, operating sensors at the bracket to directly detect a location of a first proximity detector worn by a technician and a location of a second proximity detector at the robot, and directing the first proximity detector to provide a warning to the technician if a distance between the first proximity detector and the second proximity detector is less than a threshold.

Control system and method for applying force to grasp a target object

Systems and methods are provided for an automation system. The systems and methods calculate a motion trajectory of a manipulator and an end-effector. The end-effector is configured to grasp a target object. The motion trajectory defines successive positions of the manipulator and the end-effector along a plurality of via-points toward the target object. The systems and methods further acquire force/torque (F/T) data from an F/T sensor associated with the end-effector and adjust the motion trajectory based on the F/T data.
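The adjust-on-contact idea can be sketched as follows: via-points define the nominal trajectory, and a force/torque reading above a limit backs the remaining via-points off along the approach axis. The names, gains, 5 N limit, and the choice of -z as the approach axis are illustrative assumptions rather than the patented method.

```python
def adjust_trajectory(via_points, ft_reading, force_limit=5.0, backoff=0.01):
    """Shift remaining via-points away from contact when measured force
    exceeds the limit; otherwise keep the nominal trajectory."""
    fx, fy, fz = ft_reading
    force = (fx ** 2 + fy ** 2 + fz ** 2) ** 0.5
    if force <= force_limit:
        return via_points                       # nominal trajectory unchanged
    # Retreat along +z (away from the assumed -z approach) in proportion
    # to the excess force.
    dz = backoff * (force - force_limit)
    return [(x, y, z + dz) for (x, y, z) in via_points]

nominal = [(0.0, 0.0, 0.10), (0.0, 0.0, 0.05), (0.0, 0.0, 0.02)]
adjusted = adjust_trajectory(nominal, ft_reading=(0.0, 0.0, 8.0))
```

An 8 N reading against a 5 N limit lifts every remaining via-point by 3 cm, while a sub-limit reading leaves the nominal trajectory untouched.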

System and Method for Providing In-Cockpit Actuation of Aircraft Controls
20190321981 · 2019-10-24 ·

An actuation system to manipulate a preexisting interface in an aircraft, the system having an actuation controller, a vision system, a robotic arm assembly, and a housing. Each of the vision system and the robotic arm assembly may be operatively coupled to the actuation controller. The vision system may be configured to optically image a display device of the preexisting interface, while the robotic arm assembly may be configured to engage a user-actuable device of the preexisting interface. The housing can be configured to affix to a surface adjacent the preexisting interface, where each of the vision system and the robotic arm assembly is coupled to the housing. In operation, the actuation controller may be configured to instruct the robotic arm assembly based at least in part on data from the vision system.
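The vision-to-actuation path can be sketched as: read a value off the preexisting display, then have the controller decide which user-actuable device the arm engages. Everything here is hypothetical, including the string stand-in for optical imaging and the simple altitude-hold rule; none of it comes from the patent.

```python
def read_display(image_text):
    """Stand-in for optically imaging the display and extracting a reading."""
    label, value = image_text.split(":")
    return label.strip(), float(value)

def actuation_decision(label, value, target_altitude=10000.0):
    """Actuation controller: decide which device the arm assembly engages."""
    if label != "ALT":
        return ("no_op", 0.0)
    error = target_altitude - value
    if abs(error) < 50.0:
        return ("hold", 0.0)                  # close enough, do not actuate
    return ("adjust_altitude_knob", error)    # engage the knob by this amount

# Display shows 9500 ft against a 10000 ft target.
command = actuation_decision(*read_display("ALT: 9500"))
```

With a 500 ft shortfall, the controller instructs the arm assembly to engage the altitude knob.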