Patent classifications
B25J9/161
ROBOT CONTROLLER, ROBOT CONTROL METHOD, AND STORAGE MEDIUM STORING ROBOT CONTROL PROGRAM
A robot controller includes: axis motor control units that control motors for driving axes of a robot; and an action command generation unit that generates a first action command having the shortest action time when the robot is moved from an action start point to an action goal point without considering an obstacle, and selects, from among the axes, a major axis having the longest action time when the action is performed in accordance with the first action command. The first action command includes a major axis command and another axis command. The action command generation unit adjusts the other axis command so as to reduce an action time according to the other axis command and, when determining that a first trajectory avoids a collision between the robot and the obstacle, outputs a second action command that includes the major axis command and the adjusted other axis command and corresponds to the first trajectory.
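The major-axis selection step described in this abstract can be sketched as follows. This is a minimal illustration, assuming trapezoid-free constant-speed moves; the joint names, distances, and speed limits are invented for the example and the synchronization rule (stretching the other axes to the major axis's time) is one common interpretation, not taken verbatim from the patent.

```python
def plan_commands(distances, max_speeds):
    """Pick the axis with the longest move time as the 'major' axis and
    slow the remaining axes so that all axes finish together.

    distances  -- dict of per-axis travel (e.g. degrees)
    max_speeds -- dict of per-axis maximum speed (same units per second)
    """
    # Per-axis action time at full speed
    times = {axis: distances[axis] / max_speeds[axis] for axis in distances}
    major_axis = max(times, key=times.get)   # axis with the longest action time
    major_time = times[major_axis]
    # Adjust the other-axis commands: stretch each move over the major time
    adjusted_speeds = {axis: distances[axis] / major_time for axis in distances}
    return major_axis, adjusted_speeds

major, speeds = plan_commands({"J1": 90.0, "J2": 30.0},
                              {"J1": 180.0, "J2": 30.0})
```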
DEVICE AND METHOD FOR CONTROLLING A ROBOT
A method for controlling a robot device. The method includes acquiring one or more images of objects in a workspace of the robot device; determining, by a neural network, object hierarchy information specifying stacking relations of the objects with respect to each other in the workspace of the robot device and confidence information for the object hierarchy information from the image(s); if the confidence information indicates a confidence above a confidence threshold, manipulating an object of the objects; and, if the confidence information indicates a confidence lower than the confidence threshold, acquiring an additional image of the objects, determining, by the neural network, additional object hierarchy information specifying stacking relations of the objects with respect to each other in the workspace of the robot device and additional confidence information for the additional object hierarchy information from the additional image, and controlling the robot using the additional object hierarchy information.
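The confidence-gated control loop in this abstract can be sketched as below. The threshold value, the retry cap, and the three callables are illustrative assumptions; the patent itself does not specify them.

```python
def control_with_hierarchy(acquire_image, infer, manipulate,
                           threshold=0.8, max_retries=3):
    """Re-image the workspace until the stacking-hierarchy confidence
    clears the threshold, then manipulate using the accepted hierarchy.

    acquire_image -- callable returning a camera image
    infer         -- callable (image) -> (hierarchy, confidence)
    manipulate    -- callable acting on the accepted hierarchy
    """
    image = acquire_image()
    hierarchy, confidence = infer(image)
    retries = 0
    while confidence < threshold and retries < max_retries:
        image = acquire_image()              # acquire an additional image
        hierarchy, confidence = infer(image) # re-estimate stacking relations
        retries += 1
    return manipulate(hierarchy)
```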
Teleoperation system, method, apparatus, and computer-readable medium
Embodiments of the present disclosure provide a system, method, apparatus and computer-readable medium for teleoperation. An exemplary system includes a robot machine having a machine body, at least one sensor, and at least one robot processor, and at least one user processor operable to maintain a user simulation model of the robot machine and the environment surrounding the robot machine, the at least one user processor being remote from the robot machine. The system further includes at least one user interface comprising a haptic user interface operable to receive user commands and to transmit the user commands to the user simulation model, and a display operable to display a virtual representation of the user simulation model.
ROBOT AND METHOD FOR CONTROLLING THEREOF
A robot is provided. The robot includes a microphone, a camera, a communication interface including a circuit, a memory storing at least one instruction, and a processor, wherein the processor is configured to acquire a user voice through the microphone, identify a task corresponding to the user voice, determine whether the robot can perform the identified task, and control the communication interface to transmit information on the identified task to an external robot based on the determination result.
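The capability check and hand-off described in this abstract reduce to a simple dispatch rule. The sketch below is an assumption-laden minimal form: the three callables stand in for the robot's task execution, its self-capability test, and the communication-interface transmission to the external robot.

```python
def dispatch_task(task, can_perform, perform, send_to_external):
    """Perform the identified task locally when the robot is able to;
    otherwise transmit the task information to an external robot."""
    if can_perform(task):
        return perform(task)
    return send_to_external(task)
```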
METHOD OF DETERMINING VALUE OF PARAMETER FOR CONTROLLING WEARABLE DEVICE AND ELECTRONIC DEVICE PERFORMING THE METHOD
An electronic device may receive, from a wearable device, log information regarding a motion of the wearable device; determine, based on the log information, a value of at least one of one or more mobile parameters to be applied to a robot parameter algorithm that calculates a value of a robot parameter used to control the wearable device; and determine the value of the robot parameter based on the robot parameter algorithm and the determined value of the at least one mobile parameter.
Object handling control device, object handling device, object handling method, and computer program product
An object handling control device includes one or more processors configured to acquire at least object information and status information representing an initial position and a destination of an object; set, when a grasper grasping the object moves from the initial position to the destination, a first region, a second region, and a third region in accordance with the object information and the status information; and calculate a moving route along which the object is moved from the initial position to the destination with reference to the first region, the second region, and the third region.
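One way to calculate a route "with reference to" regions, as this abstract describes, is to assign each region a traversal cost and search over waypoints. The sketch below uses Dijkstra's algorithm; the grid, the cost function, and the neighbor relation are illustrative assumptions, since the patent does not define how the three regions weight the route.

```python
import heapq

def route(start, goal, region_cost, neighbors):
    """Lowest-cost route over waypoints, where each waypoint's cost is
    given by the region it falls in (region_cost) and neighbors(node)
    yields reachable adjacent waypoints."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:                       # reconstruct route back to start
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return path[::-1]
        if d > dist.get(node, float("inf")):
            continue
        for nxt in neighbors(node):
            nd = d + region_cost(nxt)
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(heap, (nd, nxt))
    return None
```

On a small grid with one high-cost region cell, the returned route detours around that cell while still reaching the destination.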
User controller with user presence detection and related systems and methods
The present invention relates to a user controller having a thumb sheath with an open side defined in the thumb sheath. Further embodiments relate to thumb presence sensors and sensory feedback components associated with the thumb sheath. Additional embodiments relate to an adjustable thumb sheath. Still other embodiments relate to systems comprising such user controllers.
Moving robot
Disclosed is a moving robot including: a voice input unit configured to receive a voice input of a user; a first display capable of receiving a touch input; a second display larger than the first display; and a controller configured to perform control such that a screen to be displayed in response to the voice input or the touch input is displayed on at least one of the first display or the second display based on a type and an amount of information included in the screen. The two displays accordingly make it possible to provide information and services more effectively.
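The controller's routing decision could look like the sketch below. The category names, the capacity threshold, and the specific rules are invented for illustration; the abstract only states that the choice depends on the type and amount of information in the screen.

```python
def choose_displays(info_type, info_amount, small_capacity=5):
    """Route a response screen to the small touch display, the larger
    display, or both, by content type and amount (item count)."""
    if info_type == "status" and info_amount <= small_capacity:
        return ["first"]                 # brief feedback fits the touch display
    if info_amount > small_capacity:
        return ["second"]                # detailed content needs the larger display
    return ["first", "second"]           # otherwise mirror on both
```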
Grasp generation using a variational autoencoder
In at least one embodiment, a system determines a set of possible grasp poses that allow a robot to successfully grasp an object by generating a set of potential grasp poses, and then evaluating the performance of each potential grasp pose. In at least one embodiment, the system performs a refinement operation on the grasp poses, and based on an evaluation of the poses, creates an improved set of possible grasps for the object.
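The generate-refine-evaluate pipeline in this abstract can be sketched generically as below. The sampler (which in the patent's setting would be, e.g., a variational autoencoder's decoder), the scoring function, and the refinement step are passed in as assumptions; the keep-fraction is likewise illustrative.

```python
def generate_grasps(sample_pose, score_pose, refine_pose, n=64, keep=0.5):
    """Sample candidate grasp poses, refine each candidate, evaluate all
    of them, and keep the best-scoring fraction as the improved set."""
    candidates = [sample_pose() for _ in range(n)]   # e.g. decoder samples
    refined = [refine_pose(p) for p in candidates]   # per-pose refinement step
    refined.sort(key=score_pose, reverse=True)       # evaluate and rank
    return refined[: max(1, int(n * keep))]
```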
SYSTEM AND APPARATUS FOR ANATOMY STATE CONFIRMATION IN SURGICAL ROBOTIC ARM
A surgical robotic system includes a surgical console having a display and a user input device configured to generate a user input and a surgical robotic arm, which includes a surgical instrument configured to treat tissue and being actuatable in response to the user input and a video camera configured to capture video data that is displayed on the display. The system also includes a control tower coupled to the surgical console and the surgical robotic arm. The control tower is configured to process the user input to control the surgical instrument and to record the user input as input data; communicate the input data and the video data to at least one machine learning system configured to generate a surgical process evaluator; and execute the surgical process evaluator to determine whether the surgical instrument is properly positioned relative to the tissue.