G05B2219/39523

Tactile information estimation apparatus, tactile information estimation method, and program

According to some embodiments, a tactile information estimation apparatus may include one or more memories and one or more processors. The one or more processors are configured to input at least first visual information of an object acquired by a visual sensor to a model. The model is generated based on visual information and tactile information linked to the visual information. The one or more processors are configured to extract, based on the model, a feature amount relating to tactile information of the object.
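The visual-to-tactile mapping described above can be sketched as a learned model that takes a visual feature vector and outputs a tactile feature amount. The linear form, the dimensions, and all weights below are illustrative stand-ins, not values from the patent.

```python
# Hypothetical sketch: a model trained on paired visual/tactile data
# maps a visual feature vector to a tactile feature amount.
# All names and weights here are illustrative, not from the patent.

def extract_tactile_features(visual_features, weights, bias):
    """Linear stand-in for the learned visual-to-tactile model."""
    return [
        sum(w * v for w, v in zip(row, visual_features)) + b
        for row, b in zip(weights, bias)
    ]

# Toy "trained" parameters: 2 tactile dimensions from 3 visual dimensions.
W = [[0.5, 0.1, 0.0],
     [0.0, 0.2, 0.7]]
b = [0.1, -0.05]

tactile = extract_tactile_features([1.0, 2.0, 0.5], W, b)
```

In practice the model would be a trained network rather than a fixed linear map; the point is only the interface: visual input in, tactile feature amount out.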

Method and apparatus for providing food to user
11524408 · 2022-12-13

Provided is a method of providing food to a user, the method including: determining to provide first food among the food to the user; moving a first gripper to a container that contains the first food; determining whether the first gripper reciprocates in the container; calculating a weight difference value indicating an amount of change in a total weight of the food before and after the reciprocation, in response to a determination that the first gripper reciprocates in the container; and determining that the first food is provided to the user based on the weight difference value. In addition, an apparatus for providing food to a user to perform the food providing method is provided. Also, a non-transitory computer-readable storage medium storing programs to perform the food providing method is provided.
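The weight-difference check above can be sketched as follows. The threshold value and the gram figures are illustrative assumptions; the patent only requires that provision be determined from the change in total weight across the reciprocation.

```python
def served_amount(weight_before, weight_after, min_serving=5.0):
    """Return (weight_difference, served?) after a gripper reciprocation.

    Food is judged to have been provided when the total weight drops
    across the reciprocation; min_serving is an illustrative threshold,
    not a value from the patent.
    """
    diff = weight_before - weight_after
    return diff, diff >= min_serving

diff, served = served_amount(412.0, 396.5)  # grams, illustrative
```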

METHOD FOR AUTOMATIC LOAD COMPENSATION FOR A COBOT OR AN UPPER LIMB EXOSKELETON

A control method for controlling an actuator (11) connected to a load (50) for handling, the method comprising the steps of: detecting an intention to handle the load (50); applying an increasing command to the actuator (11) until detecting a movement of the actuator (11); storing the value reached by the command when a movement of the actuator (11) is detected; using the stored value reached by the command to determine an estimate of the opposing force exerted by the load (50) for handling; and controlling the actuator by means of a force servocontrol relationship using the estimate of the opposing force exerted by the load (50) for handling in order to establish the commands to be applied to the actuator (11).

A cobot (1) arranged to perform the method.
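The ramp-until-motion step of this load compensation method can be sketched as below. The step size, command range, and the movement-detection callback are illustrative assumptions; the patent leaves the detector and units unspecified.

```python
def estimate_load_force(actuator_moving, step=0.05, max_cmd=10.0):
    """Apply an increasing command until motion is detected.

    The last command value reached is stored and used as an estimate of
    the opposing force exerted by the load. `actuator_moving(cmd)`
    abstracts the movement detector; all numbers are illustrative.
    """
    cmd = 0.0
    while cmd < max_cmd:
        cmd += step
        if actuator_moving(cmd):
            return cmd  # stored value ~ opposing force of the load
    return None  # no motion detected within the command range

# Simulated load that starts moving once the command exceeds 2.33 units.
estimate = estimate_load_force(lambda c: c >= 2.33)
```

The stored estimate would then feed the force servo-control relationship as a compensation term when establishing subsequent actuator commands.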

ESTIMATION APPARATUS, ESTIMATION METHOD, AND ESTIMATION PROGRAM

An estimation apparatus includes: an acquisition section that acquires a measurement result of a measurement unit that measures, in a contactless manner, an object to be an estimation target of a contact sense; a determination section that makes a determination as to an aspect of the object or a measurement condition of the object on the basis of the measurement result of the measurement unit; a selection section that selects, on the basis of a result of the determination, an estimation scheme to be used for estimation of the contact sense of the object from among a plurality of estimation schemes; and an estimation section that estimates the contact sense of the object using the selected estimation scheme.
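The determine-then-select flow can be sketched as a dispatch over measurement conditions. The scheme names, the glossiness rule, and the softness formulas are all illustrative assumptions, not schemes from the patent.

```python
def estimate_contact_sense(measurement):
    """Select an estimation scheme from a contactless measurement,
    then estimate the contact sense with it.

    Illustrative dispatch: a spectral scheme for glossy objects,
    a geometric scheme otherwise.
    """
    if measurement.get("glossiness", 0.0) > 0.5:
        scheme = "spectral"
        softness = 1.0 - measurement["glossiness"]
    else:
        scheme = "geometric"
        softness = measurement.get("surface_roughness", 0.0)
    return scheme, softness

scheme, softness = estimate_contact_sense({"glossiness": 0.8})
```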

System and method for controlling the contact pressure applied by an articulated robotic arm to a working surface
11312015 · 2022-04-26

A system and method for moving an object against a working surface of a finishing machine that is set in a fixed position. The object is moved in a precise movement pattern while following a precise contact pressure pattern. The object is moved against the working surface of the finishing machine using a robot with an articulating arm. Other movement is provided by a dynamic platform upon which the robot rests. The dynamic platform includes a linear slide that enables the robot to reciprocally move. The dynamic platform also includes an active contact flange that acts upon the linear slide. The active contact flange is programmable and imparts the contact pressure pattern to the object through the linear slide and the robot. A rotary table can also be provided that selectively rotates the robot, the linear slide and the active contact flange.

Control method and control system of manipulator

A control method of a manipulator is provided. The method includes photographing a target using a camera and detecting the target using the photographed data. A holding motion for the target is set based on the detected target, and a robot is operated to hold the target based on the set holding motion.

Autonomous unknown object pick and place

A set of one or more potentially graspable features for one or more objects present in a workspace area are determined based on visual data received from a plurality of cameras. For each of at least a subset of the one or more potentially graspable features, one or more corresponding grasp strategies are determined to grasp the feature with a robotic arm and end effector. A score associated with a probability of a successful grasp of a corresponding feature is determined with respect to each of at least a subset of said grasp strategies. A first feature of the one or more potentially graspable features is selected to be grasped using a selected grasp strategy based at least in part on a corresponding score associated with the selected grasp strategy with respect to the first feature. The robotic arm and the end effector are controlled to attempt to grasp the first feature using the selected grasp strategy.
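The score-based selection step can be sketched as picking the (feature, strategy) pair with the highest success probability. The feature names and scores below are illustrative; the scoring model itself is outside this sketch.

```python
def select_grasp(features):
    """Pick the (feature, strategy) pair with the highest success score.

    `features` maps a feature id to {strategy: score}; how the scores
    are produced is not modeled here.
    """
    best = None
    for feature, strategies in features.items():
        for strategy, score in strategies.items():
            if best is None or score > best[2]:
                best = (feature, strategy, score)
    return best

# Illustrative candidates from the vision pipeline.
candidates = {
    "handle": {"pinch": 0.72, "suction": 0.40},
    "rim":    {"pinch": 0.55},
}
feature, strategy, score = select_grasp(candidates)
```

The robotic arm and end effector would then be commanded to attempt the winning pair; on failure, the next-best pair could be tried.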

COMPUTER CONTROLLED POSITIONING OF DELICATE OBJECTS WITH LOW-CONTACT FORCE INTERACTION USING A ROBOT
20220314453 · 2022-10-06

A computer positions an object using a computer-controlled positioning device. The computer is operatively associated with the positioning device via a control interface. The positioning device has a substantially-hollow interior chamber. The computer identifies a selected object located at a primary location within the interior chamber and having a primary orientation with respect thereto. The computer identifies a first array of elements constructed and arranged to generate contact-free support forces sufficient to maintain the selected object at the primary location. The computer identifies a second array of elements constructed and arranged to provide contact-free interaction forces sufficient to move the selected object within the interior chamber. The computer interacts with the selected object, using the control interface to adjust at least one of the support forces or the interaction forces, to place the selected object into at least one of a secondary location or a secondary orientation.

Control device and machine learning device
10864630 · 2020-12-15

A control device and a machine learning device enable control for gripping an object having a small reaction force. The machine learning device included in the control device includes a state observation unit that observes gripping object shape data related to a shape of the gripping object as a state variable representing a current state of an environment, a label data acquisition unit that acquires gripping width data, which represents a width of the hand of the robot in gripping the gripping object, as label data, and a learning unit that performs learning by using the state variable and the label data in a manner to associate the gripping object shape data with the gripping width data.
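The learning unit's shape-to-width association can be sketched with a simple supervised stand-in. The 1-nearest-neighbour predictor, the scalar shape descriptor, and the training pairs are all illustrative assumptions; the patent does not fix a particular learning algorithm in this abstract.

```python
def learn_gripping_width(examples):
    """Associate object-shape data with gripping-width labels.

    A 1-nearest-neighbour stand-in for the learning unit; the shape
    descriptor (a single diameter-like scalar here) and the training
    data are illustrative.
    """
    data = list(examples)

    def predict(shape):
        nearest = min(data, key=lambda ex: abs(ex[0] - shape))
        return nearest[1]

    return predict

# (shape descriptor, commanded hand width) pairs from past grips.
predict_width = learn_gripping_width([(10.0, 9.2), (20.0, 18.5), (30.0, 27.9)])
width = predict_width(19.0)
```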
