B25J13/08

SHEAR STUD WELDING SYSTEM
20230051339 · 2023-02-16 ·

A shear stud welding system is disclosed. The system comprises a shear stud holder, a robotic arm, and a microcontroller. The shear stud holder comprises a turret holding a plurality of shear studs and a first motor coupled to the turret and configured to rotate the turret about an axis to a predetermined angle such that a shear stud among the plurality of shear studs is at a dispensing position. The robotic arm is configured to transfer the shear stud from the shear stud holder to a workpiece. The microcontroller is configured to control the movement of the robotic arm to pick up the shear stud from the dispensing position and transfer the shear stud to the workpiece at a welding position, and to cause the first motor to rotate the turret to a predetermined angle so that a shear stud among the plurality of shear studs assumes the dispensing position.
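
The turret-indexing step can be sketched as follows; the slot count and function names are illustrative assumptions, not details taken from the patent.

```python
NUM_SLOTS = 12  # assumed number of stud slots on the turret

def dispensing_angle(slot_index: int, num_slots: int = NUM_SLOTS) -> float:
    """Turret angle (degrees) at which `slot_index` sits at the fixed
    dispensing position, assuming evenly spaced slots."""
    if not 0 <= slot_index < num_slots:
        raise ValueError("slot index out of range")
    return (360.0 / num_slots) * slot_index

def next_rotation(current_angle: float, slot_index: int) -> float:
    """Signed shortest rotation from the current turret angle to the
    target slot's dispensing angle."""
    delta = (dispensing_angle(slot_index) - current_angle) % 360.0
    return delta - 360.0 if delta > 180.0 else delta
```

A microcontroller loop would command the first motor with `next_rotation(...)` after each pick so the following stud assumes the dispensing position.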

SYSTEM AND METHOD FOR CONTROLLING THE ROBOT, ELECTRONIC DEVICE AND COMPUTER READABLE MEDIUM
20230047834 · 2023-02-16 ·

Systems, devices, and methods for controlling a robot. Some methods include, in response to determining that an object enters a reachable area of the robot, triggering a first sensor to sense a movement of the object; determining first position information of the object based on data received from the first sensor; determining second position information of the object based on second data received from a second sensor; and generating a first prediction of a target position at which the object is operated on by the robot. In this way, the robot can complete an operation on the object on the automated guided vehicle (AGV) within the limited operation time during which the AGV passes through the reachable area of the robot. Meanwhile, by collecting sensing data from different sensor groups, the target position at which the object is handled by the robot may be predicted more accurately.
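
A minimal sketch of combining the two sensors' position information, assuming each sensor reports a 1-D position with a variance; the inverse-variance fusion rule and the constant-velocity prediction are standard choices used here for illustration, not the patent's specific method.

```python
def fuse(pos1: float, var1: float, pos2: float, var2: float) -> tuple[float, float]:
    """Inverse-variance weighted fusion of two position estimates.
    Returns the fused position and its (smaller) fused variance."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * pos1 + w2 * pos2) / (w1 + w2)
    return fused, 1.0 / (w1 + w2)

def predict_target(pos: float, velocity: float, lead_time: float) -> float:
    """Constant-velocity prediction of where the object on the AGV
    will be after `lead_time` seconds."""
    return pos + velocity * lead_time
```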

ROBOT CONTROL APPARATUS, ROBOT CONTROL SYSTEM, ROBOT CONTROL METHOD, AND COMPUTER-READABLE STORAGE MEDIUM STORING A ROBOT CONTROL PROGRAM
20230046793 · 2023-02-16 ·

A robot control apparatus according to one or more embodiments may include: a calculating unit configured to calculate an interference range of a robot based on a model of the robot in a state in which an object is gripped by a gripper with which the robot is equipped; and a planning unit configured to plan a motion of the robot based on the model and the interference range.
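
One simple way to realize such an interference range, sketched here as an assumption rather than the patent's actual calculation, is the axis-aligned bounding box of the robot-plus-gripped-object model, inflated by a safety margin:

```python
def interference_range(points, margin=0.05):
    """Axis-aligned bounding box over model points (x, y, z) of the robot
    with the gripped object, grown by `margin` on each side."""
    xs, ys, zs = zip(*points)
    lo = (min(xs) - margin, min(ys) - margin, min(zs) - margin)
    hi = (max(xs) + margin, max(ys) + margin, max(zs) + margin)
    return lo, hi
```

A motion planner could then reject any waypoint whose swept volume intersects this box.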

CONTINUUM ARM ROBOT SYSTEM

A continuum arm robot system comprising at least a first continuum arm robot and a second continuum arm robot, each continuum arm robot being controlled by its own actuator pack, and each actuator pack being coupled to a single control computer, wherein at least the second continuum arm robot comprises a releasable connection mechanism to engage in gripping the first continuum arm robot in a workspace, so as to link the at least two continuum arm robots into a single redundant robotic system with at least the second continuum arm robot providing support for the first continuum arm robot.

METHOD AND APPARATUS FOR THE AUTOMATED TRANSFER OF AN INTRAOCULAR LENS
20230048115 · 2023-02-16 ·

Disclosed is a method for the automated transfer of an intraocular lens (1) comprising an optical lens body (10) and two haptics (11) attached to a peripheral edge of the optical lens body (10) and extending outwardly from the peripheral edge of the optical lens body (10). The method comprises the steps of: picking the intraocular lens (1) up at a start location; moving the intraocular lens (1) to a destination location; releasing the intraocular lens (1) at the destination location,
wherein picking the intraocular lens (1) up at the start location comprises gripping the intraocular lens (1) only at the haptics (11) of the intraocular lens (1).

Gravity and Inertial Compensation of Force/Torque Sensors
20230049155 · 2023-02-16 ·

Force and torque measurements from a robotic F/T sensor are compensated for the effects of gravity, and optionally also for the effects of robot motion. The weight of an attached tool, W_tool, and a vector r_CG from the origin of the F/T sensor body coordinate frame (CF) to the center of gravity of the tool are obtained, such as from user input or by parameter identification. During a robotic operation, a rotation matrix R from the F/T sensor body CF to an inertial reference frame is obtained, such as from an internal inertial measurement unit (IMU) or from forward kinematics data from the robot. The force and torque measurements resolved by the F/T sensor from transducer outputs are compensated for gravity based on W_tool, r_CG, and the instantaneous value of R. For inertial compensation, additional information is obtained: the mass m of the attached tool; the angular velocity ω of the F/T sensor body CF; the angular acceleration ω̇ of the F/T sensor body CF; the linear acceleration a of the F/T sensor body CF; and the inertia tensor I, defined in the F/T sensor body CF, which contains all moments and products of inertia. The force and torque measurements resolved by the F/T sensor from transducer outputs are then compensated for inertial effects based on m, ω, ω̇, r_CG, a, and I.
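
The gravity-compensation step can be sketched as follows, assuming the measured wrench is expressed in the sensor body frame; since R maps body-frame vectors into the inertial frame, R.T brings the tool's weight into the body frame. This is a generic sketch of the technique, not the patent's exact formulation.

```python
import numpy as np

G = np.array([0.0, 0.0, -9.81])  # gravity in the inertial frame (m/s^2)

def gravity_compensate(f_meas, t_meas, R, m_tool, r_cg):
    """Subtract the tool's weight and the torque it induces from raw
    F/T readings. f_meas, t_meas: measured force/torque in the body frame;
    R: rotation from body frame to inertial frame; r_cg: vector from the
    sensor frame origin to the tool's center of gravity (body frame)."""
    f_gravity_body = R.T @ (m_tool * G)          # tool weight in body frame
    f_comp = f_meas - f_gravity_body             # gravity-free force
    t_comp = t_meas - np.cross(r_cg, f_gravity_body)  # gravity-free torque
    return f_comp, t_comp
```

With the sensor level and unloaded apart from the tool, the compensated wrench should read near zero.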

Visual annotations in robot control interfaces

Methods, apparatus, systems, and computer-readable media are provided for visually annotating rendered multi-dimensional representations of robot environments. In various implementations, an entity may be identified that is present with a telepresence robot in an environment. A measure of potential interest of a user in the entity may be calculated based on a record of one or more interactions between the user and one or more computing devices. In some implementations, the one or more interactions may be for purposes other than directly operating the telepresence robot. In various implementations, a multi-dimensional representation of the environment may be rendered as part of a graphical user interface operable by the user to control the telepresence robot. In various implementations, a visual annotation may be selectively rendered within the multi-dimensional representation of the environment in association with the entity based on the measure of potential interest.
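
Calculating such a measure of potential interest might look like the following sketch; the interaction types, weights, and threshold are illustrative assumptions, not values from the disclosure.

```python
# Assumed weights per interaction type recorded for the user
WEIGHTS = {"search": 2.0, "calendar": 3.0, "email": 1.0}

def interest_score(interactions, entity):
    """Sum weighted interactions whose text mentions the entity."""
    return sum(WEIGHTS.get(i["type"], 0.0)
               for i in interactions if entity in i["text"])

def should_annotate(interactions, entity, threshold=2.5):
    """Render a visual annotation only when interest exceeds the threshold."""
    return interest_score(interactions, entity) >= threshold
```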

Manipulator system with input device for force reduction
11576741 · 2023-02-14 ·

A manipulator system includes a manipulator configured for guiding an instrument. The system furthermore includes a controller configured to actuate the manipulator such that the instrument is pressed with a pressing force against a human body. A force reduction input device is provided separately from the manipulator and is operable by an operator to reduce the pressing force.

Teaching apparatus, robot system, and teaching program
11577381 · 2023-02-14 ·

A teaching apparatus includes a display unit that displays a command display area in which a plurality of input motion commands of a robot are displayed, an extraction display area in which at least one motion command extracted from the plurality of motion commands displayed in the command display area is displayed, and a settings input area in which details of the extracted motion command are set, and a display control unit that controls actuation of the display unit. The display control unit extracts, from the plurality of motion commands displayed in the command display area, a motion command related to one of position information, velocity information, and acceleration information of the robot, and displays it in the extraction display area.
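
The extraction step, filtering the displayed motion commands down to those related to position, velocity, or acceleration, can be sketched as follows; the command names and tagging scheme are illustrative assumptions.

```python
COMMANDS = [
    {"name": "MoveL",    "kind": "position"},
    {"name": "SetSpeed", "kind": "velocity"},
    {"name": "WaitDI",   "kind": "io"},
    {"name": "SetAccel", "kind": "acceleration"},
]

def extract(commands, kind):
    """Return the names of commands matching the selected category,
    as would be shown in the extraction display area."""
    return [c["name"] for c in commands if c["kind"] == kind]
```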