SYSTEM AND METHOD FOR REINFORCING PROGRAMMING EDUCATION THROUGH ROBOTIC FEEDBACK
20240351192 · 2024-10-24

A method for toy robot programming, the toy robot including a set of sensors, the method including, at a user device remote from the toy robot: receiving sensor measurements from the toy robot during physical robot manipulation; in response to detecting a programming trigger event, automatically converting the sensor measurements into a set of puppeted programming inputs; and displaying graphical representations of the set of puppeted programming inputs on a programming interface application on the user device.
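The conversion step above can be sketched as collapsing a stream of timestamped sensor readings into discrete programming inputs; this is a minimal illustration under assumed data shapes, and the names (`ProgrammingInput`, `measurements_to_inputs`, the gesture labels) are hypothetical, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class ProgrammingInput:
    action: str        # hypothetical gesture label, e.g. "drive_forward"
    duration_s: float  # how long the physical manipulation lasted

def measurements_to_inputs(samples):
    """Collapse (timestamp, gesture) sensor samples into discrete puppeted
    programming inputs, merging consecutive samples with the same gesture."""
    inputs = []
    start = 0.0
    for t, gesture in samples:
        if inputs and inputs[-1].action == gesture:
            inputs[-1].duration_s = t - start  # extend the current input
        else:
            start = t
            inputs.append(ProgrammingInput(gesture, 0.0))
    return inputs
```

The graphical representation step would then render each `ProgrammingInput` as a block on the programming interface.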

System and method for flexible human-machine collaboration

Methods and systems for enabling human-machine collaborations include a generalizable framework that supports dynamic adaptation and reuse of robotic capability representations and human-machine collaborative behaviors. Specifically, a method of feedback-enabled user-robot collaboration includes obtaining a robot capability that models a robot's functionality for performing task actions, specializing the robot capability with an information kernel that encapsulates task-related parameters associated with the task actions, and providing an instance of the specialized robot capability as a robot capability element that controls the robot's functionality based on the task-related parameters. The method also includes obtaining, based on the robot capability element's user interaction requirements, user interaction capability elements, via which the robot capability element receives user input and provides user feedback; controlling, based on the task-related parameters, the robot's functionality to perform the task actions in collaboration with the user input; and providing the user feedback including task-related information generated by the robot capability element in association with the task actions.
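The specialization pattern described above — a generic capability bound to a task-specific "information kernel" of parameters — can be sketched as follows; the class and parameter names are hypothetical illustrations, not the patent's API.

```python
class RobotCapability:
    """Models a robot functionality (e.g. 'grasp') independent of any task."""
    def __init__(self, name, action):
        self.name = name
        self._action = action  # callable taking a parameter dict

    def specialize(self, kernel):
        """Bind task-related parameters (the 'information kernel') to this
        capability, yielding a capability element that can run the task."""
        return lambda: self._action(kernel)

# Hypothetical grasp capability specialized for one task instance.
grasp = RobotCapability(
    "grasp", lambda p: f"grasp at {p['pose']} with force {p['force']}")
element = grasp.specialize({"pose": (0.1, 0.2, 0.3), "force": 5.0})
```

A user interaction element would sit alongside `element`, feeding user input in and surfacing the task-related feedback the element generates.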

SYSTEM AND CALIBRATION, REGISTRATION, AND TRAINING METHODS
20180126547 · 2018-05-10

One variation of a method for manipulating a multi-link robotic arm includes: accessing a virtual model of the target object; extracting an object feature representing the target object from the virtual model; at the robotic arm, scanning a field of view of an optical sensor for the object feature, the optical sensor arranged on a distal end of the robotic arm proximal an end effector; in response to detecting the object feature in the field of view of the optical sensor, calculating a physical offset between the target object and the end effector based on a position of the object feature in the field of view of the optical sensor and a known offset between the optical sensor and the end effector; and driving a set of actuators in the robotic arm to reduce the physical offset.
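The offset calculation above can be sketched in two dimensions: the object feature's displacement from the image center, scaled to meters, plus the known sensor-to-effector offset, gives the physical offset that the actuators then reduce. This is a simplified constant-depth sketch; the function names and the proportional servo law are assumptions, not the patent's method.

```python
import numpy as np

def physical_offset(feature_px, image_center_px, meters_per_px, cam_to_effector_m):
    """Estimate the target-to-end-effector offset in the image plane.

    feature_px: detected object-feature position in the image (x, y), pixels
    meters_per_px: approximate scale at the working distance (assumes a
        calibrated, roughly constant depth)
    cam_to_effector_m: known fixed offset between optical sensor and effector
    """
    feature_m = (np.asarray(feature_px, float) - image_center_px) * meters_per_px
    return feature_m + cam_to_effector_m

def servo_step(offset_m, gain=0.5):
    """Proportional command that drives the actuators to reduce the offset."""
    return -gain * offset_m
```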

SYSTEM AND CALIBRATION, REGISTRATION, AND TRAINING METHODS
20180126553 · 2018-05-10

A method for manipulating a multi-link robotic arm includes: at a first time, recording a first optical image through an optical sensor arranged proximal a distal end of the robotic arm proximal an end effector; detecting a global reference feature in a first position in the first optical image; virtually locating a global reference frame based on the first position of the global reference feature in the first optical image; calculating a first pose of the end effector within the global reference frame at approximately the first time based on the first position of the global reference feature in the first optical image; and driving a set of actuators within the robotic arm to move the end effector from the first pose toward an object keypoint, the object keypoint defined within the global reference frame and representing an estimated location of a target object within range of the end effector.
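The localization step above can be sketched in two dimensions: the reference feature's displacement from the image center places the end effector within the global frame, after which the actuators step the effector toward the object keypoint. The functions and the coincident camera/effector assumption are illustrative simplifications, not the patent's method.

```python
import numpy as np

def effector_pose_in_global(feature_px, image_center_px, meters_per_px,
                            feature_global_m):
    """Locate the end effector in the global frame: the reference feature's
    offset from the image center tells us where the camera (assumed here to
    coincide with the effector) sits relative to the feature."""
    offset_m = (np.asarray(feature_px, float) - image_center_px) * meters_per_px
    return np.asarray(feature_global_m) - offset_m

def step_toward(pose_m, keypoint_m, step=0.01):
    """One bounded actuator step from the current pose toward the keypoint."""
    delta = np.asarray(keypoint_m) - pose_m
    dist = np.linalg.norm(delta)
    return pose_m + (delta if dist <= step else delta / dist * step)
```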

Generating a trained robot path based on physical manipulation of the robot and based on training user interface input(s) associated with the physical manipulation

Methods, apparatus, systems, and computer-readable media are provided for training a path of a robot by physically moving the robot, wherein the particular trained path and/or particular robot component movements to achieve the trained path are dependent on which of a plurality of available user interface inputs are selected for the training. The trained path defines a path to be traversed by a reference point of the robot, such as a path to be traversed by a reference point of an end effector of the robot. The particular robot component movements to achieve the trained path include, for example, the orientations of various robot components at each of a plurality of positions along the path, the velocity of various components at each of a plurality of positions along the path, etc.
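The dependence of the trained path on the selected user interface input can be sketched as follows; the `ui_mode` values and the sample format are hypothetical stand-ins for the plurality of available inputs, not the patent's interface.

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    position: tuple  # reference-point position along the path
    velocity: float  # traversal velocity at this position

def record_path(samples, ui_mode):
    """Build a trained path from physically-demonstrated (position, velocity)
    samples. The selected UI input decides how the path is realized:
    'precise' keeps every sample; 'fast' keeps every other sample and doubles
    the playback velocity (hypothetical modes for illustration)."""
    if ui_mode == "precise":
        return [Waypoint(p, v) for p, v in samples]
    if ui_mode == "fast":
        return [Waypoint(p, 2 * v) for p, v in samples[::2]]
    raise ValueError(f"unknown ui_mode: {ui_mode}")
```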

Methods and systems for providing feedback during teach mode
09919416 · 2018-03-20

Example implementations may relate to a robotic system that provides feedback. The robotic system is configured to receive information related to a path in an environment of the robotic system. The robotic system is also configured to initiate a recording process for storing data related to motion of a component in the environment. The robotic system is additionally configured to detect, during the recording process, movement of the component along the path in the environment, where the movement results from application of an external force to the robotic system. The robotic system is further configured to determine, during the recording process, deviation of the movement away from the path by at least a threshold amount and responsively provide feedback including one or more of (i) resisting the deviation of the movement away from the path and (ii) guiding the component back towards the path.
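The deviation-feedback behavior above can be sketched as a thresholded corrective force: no feedback while the externally-forced motion stays near the path, and a restoring force (which both resists further deviation and guides the component back) once the threshold is crossed. The function name, gain, and threshold are hypothetical.

```python
import numpy as np

def teach_mode_feedback(current, nearest_on_path, threshold=0.02, gain=10.0):
    """Return a corrective force during teach-mode recording.

    current: measured component position
    nearest_on_path: closest point on the taught path
    Within the threshold, no feedback; beyond it, a spring-like restoring
    force toward the path (a minimal sketch, not the patent's controller).
    """
    deviation = np.asarray(current, float) - np.asarray(nearest_on_path, float)
    if np.linalg.norm(deviation) < threshold:
        return np.zeros_like(deviation)
    return -gain * deviation
```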

TEACHING APPARATUS FOR MANIPULATOR
20180056504 · 2018-03-01

A teaching apparatus for a manipulator is provided, including: a fixing member for fixing to the manipulator; a main body connected to the fixing member; and a handle connected to the main body. The main body includes a plurality of sensors and an operation display member, wherein at least one of the sensors is disposed on one side of the main body adjacent the fixing member and configured for sensing a force, a stress or a torque applied to the main body, and the operation display member is disposed on the other side of the main body and includes a plurality of function keys and a display screen disposed thereon. The teaching apparatus allows the path-teaching process for a manipulator to be completed quickly and easily.

Robot teaching apparatus, method, and robot system
09902059 · 2018-02-27

A robot teaching apparatus for teaching an operation of a robot measures a state of an action of a mechanism on a target object while the mechanism is acting on the target object. The mechanism has a shape or a function corresponding to a hand unit of the robot. The robot teaching apparatus generates an operation instruction for the robot based on the measured state, and records the generated operation instruction.
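The measure-then-record flow above can be sketched as turning a sequence of measured mechanism states into a recorded instruction list; the state format (position plus grip force) and instruction vocabulary are hypothetical illustrations.

```python
def instructions_from_states(states, grip_threshold=1.0):
    """Generate recorded robot operation instructions from measured states of
    the hand-like mechanism: a move to each observed position, plus a
    gripper instruction whenever the measured grip force crosses the
    threshold (hypothetical instruction names)."""
    program, gripping = [], False
    for pos, force in states:
        program.append(("move_to", pos))
        if force >= grip_threshold and not gripping:
            program.append(("close_gripper",))
            gripping = True
        elif force < grip_threshold and gripping:
            program.append(("open_gripper",))
            gripping = False
    return program
```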

Interface for use with trainable modular robotic apparatus

Apparatus and methods for a modular robotic device with artificial intelligence that is receptive to training controls. In one implementation, a modular robotic device architecture may be used to provide all or most high-cost components in an autonomy module that is separate from the robotic body. The autonomy module may comprise a controller, power supply, and actuators that may be connected to controllable elements of the robotic body. The controller may position limbs of the toy in a target position. A user may utilize a haptic training approach in order to enable the robotic toy to perform target action(s). The modular configuration of the disclosure enables users to replace one toy body (e.g., the bear) with another (e.g., a giraffe) while using hardware provided by the autonomy module. The modular architecture may enable users to purchase a single autonomy module for use with multiple robotic bodies, thereby reducing the overall cost of ownership.

Method of teaching robot and robot system

A robot system includes a robot, a vision sensor, and a controller. The vision sensor is configured to be detachably attached to the robot. The controller is configured to measure a reference object by using the vision sensor and calibrate a relative relationship between a sensor portion of the vision sensor and an engagement portion of the vision sensor, and teach the robot by referring to the relative relationship and by using the vision sensor, after the vision sensor is attached to the robot.
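The calibrate-then-teach flow above can be sketched with homogeneous transforms: calibration recovers the fixed sensor-portion-to-engagement-portion transform, and teaching maps sensor observations into the robot frame by composing it with the robot-to-engagement transform. This 2-D sketch uses hypothetical function names and frames, not the patent's calibration procedure.

```python
import numpy as np

def transform(angle_rad, tx, ty):
    """2-D homogeneous transform (rotation by angle_rad, then translation)."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0,  0,  1]])

def sensor_point_in_robot_frame(T_robot_engage, T_engage_sensor, p_sensor):
    """Teaching step: map a point observed by the vision sensor into the
    robot frame via the calibrated engagement->sensor relative transform."""
    p = np.array([p_sensor[0], p_sensor[1], 1.0])
    return (T_robot_engage @ T_engage_sensor @ p)[:2]

# Calibration would estimate T_engage_sensor by measuring a reference object;
# here we assume it is already known.
T_robot_engage = transform(0.0, 1.0, 0.0)   # engagement portion at x = 1 m
T_engage_sensor = transform(0.0, 0.0, 1.0)  # sensor 1 m along y from it
```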