G05B2219/40617

ROBOT AND HOUSING

Convenience and usefulness of a tele-existence system are enhanced by exploiting the collaboration between tele-existence and a head-mounted display apparatus. A movable member is supported for pivotal motion on a housing (20). In the housing, a driving motor and a transmission member for transmitting rotation of the driving motor to the movable member are provided. A state information acquisition unit acquires facial expression information and/or emotion information of a user who wears a head-mounted display apparatus (100). A driving controlling unit controls rotation of the driving motor on the basis of the facial expression information and/or the emotion information.
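A minimal sketch of the kind of mapping the abstract describes: expression intensities reported by the HMD drive the pivot angle commanded to the housing's motor. All names (`ExpressionState`, `motor_command`) and the blend weights are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class ExpressionState:
    smile: float     # 0.0 .. 1.0, as reported by the HMD's expression sensor
    surprise: float  # 0.0 .. 1.0

def motor_command(state: ExpressionState, max_angle_deg: float = 45.0) -> float:
    """Map expression intensities to a target pivot angle for the movable member.

    A real driving controlling unit would add rate limiting and smoothing;
    this only shows the expression-to-rotation mapping.
    """
    intensity = 0.7 * state.smile + 0.3 * state.surprise  # assumed weights
    return max(0.0, min(1.0, intensity)) * max_angle_deg
```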

Active camera movement determination for object position and extent in three-dimensional space

A method of motion planning includes observing an object from a first pose of an agent having a controllable camera. The method also includes determining one or more subsequent control inputs to move the agent and the camera to observe the object from at least one subsequent pose. The subsequent control input(s) are determined so as to minimize an expected enclosing measure of the object based on visual data collected from the camera. The method further includes controlling the agent and the camera based on the subsequent control input(s).
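The core idea, choosing the next camera pose that minimizes an expected enclosing measure, can be sketched as a greedy one-step search: score each candidate pose by the enclosing volume of the points it would observe and pick the smallest. Using an axis-aligned bounding box as the enclosing measure, and a dictionary of precomputed candidate observations, are simplifying assumptions.

```python
import numpy as np

def enclosing_volume(points: np.ndarray) -> float:
    """Axis-aligned bounding-box volume of an (N, 3) point cloud,
    used here as a simple stand-in for the enclosing measure."""
    extent = points.max(axis=0) - points.min(axis=0)
    return float(np.prod(extent))

def select_next_pose(candidate_observations: dict) -> str:
    """Greedy one-step choice: the candidate pose whose predicted
    observation yields the smallest enclosing measure."""
    return min(candidate_observations,
               key=lambda pose: enclosing_volume(candidate_observations[pose]))
```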

ROBOT SYSTEM, CONTROL APPARATUS OF ROBOT SYSTEM, CONTROL METHOD OF ROBOT SYSTEM, IMAGING APPARATUS, AND STORAGE MEDIUM
20240269857 · 2024-08-15

A robot system including a robot apparatus and an imaging apparatus includes a control apparatus configured to control the robot apparatus and the imaging apparatus, and the control apparatus controls, based on the path along which a predetermined part of the robot apparatus is moved, a movement of the imaging apparatus so that the predetermined part continues to be imaged even while the robot apparatus moves.
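One way to realize "move the imaging apparatus based on the part's path" is to aim the camera at each waypoint the part will pass through. The pan/tilt computation below is an illustrative assumption; the abstract does not specify the camera kinematics.

```python
import math

def aim_camera(cam_pos, part_pos):
    """Pan/tilt angles (radians) pointing the camera's optical axis at the part."""
    dx, dy, dz = (p - c for p, c in zip(part_pos, cam_pos))
    pan = math.atan2(dy, dx)
    tilt = math.atan2(dz, math.hypot(dx, dy))
    return pan, tilt

def track_path(cam_pos, path):
    """Camera angle schedule that keeps the predetermined part in view
    along its programmed path."""
    return [aim_camera(cam_pos, p) for p in path]
```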

DEBURRING APPARATUS
20180161952 · 2018-06-14

A deburring apparatus including: a robot that uses a deburring tool to deburr an object supported by a support in a machine tool; a visual sensor; a relative movement mechanism for causing relative movement between the visual sensor and the object supported by the support; and a controller, wherein the controller is configured to conduct: an operation process that operates the relative movement mechanism based on a visual sensor relative movement program for controlling an operation of the relative movement mechanism so that a ridge of the object supported by the support is detected by the visual sensor during the relative movement; and a deburring operation program generation process that generates a deburring operation program by using the detected ridge obtained by the visual sensor when the relative movement mechanism is operated based on the visual sensor relative movement program.
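The program-generation step can be sketched as turning the visually detected ridge into tool waypoints. The fixed standoff offset and the point format are assumptions; a real generator would also set tool orientation, feed rate, and approach moves.

```python
def generate_deburr_program(ridge_points, standoff=0.002):
    """Convert detected ridge points (x, y, z) in the workpiece frame into
    deburring-tool waypoints offset along +z by the tool standoff (metres)."""
    return [(x, y, z + standoff) for (x, y, z) in ridge_points]
```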

Control apparatus and method, and motor control system

A control apparatus includes a control unit that controls a first motor and a second motor. The control unit controls a rotation angle of a driven unit using one of the first motor and the second motor based on a direction of disturbance and controls, using the other of the first motor and the second motor, backlash removal from a decelerating unit configured to transmit an output of the first motor or the second motor to the driven unit.
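A sketch of the dual-motor backlash-removal idea: one motor carries the drive torque while the other applies a small opposing preload so the gear teeth stay engaged, with the roles chosen by the disturbance direction. The torque split and preload value are illustrative, not the patented control law.

```python
def motor_torques(drive_torque: float, disturbance_dir: int, preload: float = 0.2):
    """Return (motor1, motor2) torque commands. The driving motor carries the
    command plus preload; the other pulls against it so the decelerating
    unit's gears never unload through the backlash band.

    disturbance_dir is +1 or -1, the direction of the external disturbance."""
    if disturbance_dir >= 0:
        return drive_torque + preload, -preload
    return -preload, drive_torque + preload
```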

Drone assisted adaptive robot control
09855658 · 2018-01-02

A method, a drone device, and an adaptive robot control system (ARCS) for adaptively controlling a programmable robot are provided. The ARCS receives environmental parameters of a work environment where the drone device operates and geometrical information of a target object to be operated on by the programmable robot. The ARCS dynamically receives a calibrated spatial location of the target object in the work environment based on the environmental parameters and a discernment of the target object from the drone device. The ARCS determines control information including parts geometry of the target object, a task trajectory of a task to be performed on the target object, and a collision-free robotic motion trajectory for the programmable robot, and dynamically transmits the control information to the programmable robot via a communication network to adaptively control the programmable robot while accounting for misalignments of the target object in the work environment.
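The trajectory-correction part can be sketched as shifting the planned waypoints by the misalignment the drone observes between the object's nominal and calibrated locations. Translation-only correction is a simplifying assumption; the described system would also account for rotation and collision-free motion.

```python
def correct_waypoints(planned, nominal_pose, observed_pose):
    """Shift each planned (x, y, z) waypoint by the translation offset between
    the object's nominal pose and the drone-calibrated pose."""
    offset = tuple(o - n for o, n in zip(observed_pose, nominal_pose))
    return [tuple(w + d for w, d in zip(wp, offset)) for wp in planned]
```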

Technologies for pan tilt unit calibration

Technologies for calibrating a pan tilt unit with a robot include a robot controller to move a camera of the pan tilt unit about a first rotational axis of the pan tilt unit to at least three different first axis positions. The robot controller records a first set of positions of a monitored component of the robot in a frame of reference of the robot and a position of the camera in a frame of reference of the pan tilt unit during a period in which the monitored component is within a field of view of the camera for each of the at least three different first axis positions. Further, the robot controller moves the camera about a second rotational axis of the pan tilt unit to at least three different second axis positions and records a second set of positions of the monitored component in the frame of reference of the robot and a position of the camera in the frame of reference of the pan tilt unit during a period in which the monitored component is within a field of view of the camera for each of the at least three different second axis positions. Further, the robot controller determines a transformation from the frame of reference of the robot to the frame of reference of the pan tilt unit based on the first set of recorded positions and the second set of recorded positions.
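The final step, determining a transformation between the robot frame and the PTU frame from the recorded position pairs, is a classic rigid-registration problem. A least-squares solution via the Kabsch algorithm is sketched below; the abstract does not say which estimator is used, and at least three non-collinear pairs are needed.

```python
import numpy as np

def rigid_transform(robot_pts: np.ndarray, ptu_pts: np.ndarray):
    """Least-squares rigid transform (R, t) with ptu ~= R @ robot + t,
    estimated from corresponding (N, 3) point sets (Kabsch algorithm)."""
    ca, cb = robot_pts.mean(axis=0), ptu_pts.mean(axis=0)
    H = (robot_pts - ca).T @ (ptu_pts - cb)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t
```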

Eye-on-Hand Reinforcement Learner for Dynamic Grasping with Active Pose Estimation

A controller is provided for performing dynamic grasping of a target object using visual sensory inputs. The controller includes a robotic interface connected to a robotic arm including links connected by joints having actuators and encoders, a gripper at the end effector of the robotic arm configured to grasp the target object in response to robot control signals, and a vision sensor configured to continuously provide visual observations for tracking poses of the target object in a workspace and computing grasp poses, wherein the vision sensor is mounted on a distal end of the robotic arm adjacent to the gripper. The controller trains an Eye-on-Hand reinforcement-learner policy, tracks the poses of the target object, and generates robot control signals to follow the target object while keeping it in the field of view of the vision sensor and to grasp the target object in the workspace.
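"Follow the target while keeping it in the field of view" reduces, in its simplest form, to a visual-servo step on the target's pixel error. The proportional law below is a toy stand-in for the learned Eye-on-Hand policy; the gain and image size are assumptions.

```python
def servo_step(target_px, image_center=(320, 240), gain=0.002):
    """One proportional visual-servo step: convert the tracked target's pixel
    offset from the image center into a small lateral end-effector velocity
    (vx, vy) that re-centers the target in the eye-in-hand camera."""
    ex = target_px[0] - image_center[0]
    ey = target_px[1] - image_center[1]
    return gain * ex, gain * ey
```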

Robot and method for controlling a robot
09579793 · 2017-02-28

The invention relates to a robot and a method for controlling a robot. The distance between an object and the robot, and/or the derivative thereof, or a first motion of the object is detected by means of a non-contact distance sensor arranged in or on a robot arm of the robot and/or on or in an end effector fastened to the robot arm. The robot arm is moved based on the first motion detected by means of the distance sensor; a target force or a target torque to be applied by the robot is determined based on the detected distance between the object and the robot; and/or a function of the robot, or a parameterization of a function of the robot, is triggered based on the detected first motion and/or a target distance between the object and the robot and/or the derivative thereof detected by means of the distance sensor.
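The "target force determined from the detected distance" idea admits many rules; one illustrative choice is to scale the commanded force down linearly as the distance sensor reports the object getting closer. The linear ramp and its parameters are assumptions, not the patented determination.

```python
def target_force(distance_m: float, f_max: float = 20.0, d_safe: float = 0.5):
    """Target contact force (N) as a function of sensed distance: full force
    beyond d_safe, linearly reduced toward zero as the object approaches."""
    if distance_m >= d_safe:
        return f_max
    return f_max * max(distance_m, 0.0) / d_safe
```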