Patent classifications
G05B2219/45108
ENHANCING ROBOT LEARNING
Methods, systems, and apparatus, including computer-readable media storing executable instructions, for enhancing robot learning. In some implementations, a robot stores, in a cache, first embeddings generated using a first machine learning model, and the first embeddings include one or more first private embeddings that are not shared with other robots. The robot receives a second machine learning model from a server system over a communication network. The robot generates a second private embedding for each of the one or more first private embeddings using the second machine learning model. The robot adds the second private embeddings to the cache of the robot and removes the one or more first private embeddings from the cache of the robot.
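The cache-migration step described in this abstract can be sketched as follows. This is a minimal illustration, not the patented implementation: the entry fields, the assumption that source features are retained alongside each private embedding, and the toy "models" are all invented for the example.

```python
def migrate_cache(cache, new_model):
    """Re-embed each private entry with the newly received model and
    drop the old private embeddings, leaving shared entries untouched."""
    migrated = []
    for entry in cache:
        if entry["private"]:
            migrated.append({"features": entry["features"],
                             "embedding": new_model(entry["features"]),
                             "label": entry["label"],
                             "private": True})
        else:
            migrated.append(entry)
    return migrated

# Toy stand-ins for the first and second machine learning models.
old_model = lambda feats: [x * 1.0 for x in feats]
new_model = lambda feats: [x * 2.0 for x in feats]

cache = [
    {"features": [1.0, 2.0], "embedding": old_model([1.0, 2.0]),
     "label": "mug", "private": True},       # private: regenerated
    {"features": [3.0, 4.0], "embedding": old_model([3.0, 4.0]),
     "label": "chair", "private": False},    # shared: kept as-is
]
cache = migrate_cache(cache, new_model)
```

After migration, only the private entry carries an embedding from the new model, so the cache never mixes un-regenerated private embeddings with the new model's embedding space.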
Exoskeleton robot and controlling method for exoskeleton robot
The present disclosure provides a method for controlling an exoskeleton robot. The method comprises checking that a first signal is triggered by a first button, checking a tilt angle after the first signal is triggered, setting an action based on the tilt angle, and executing the action to move the exoskeleton robot. The first signal indicates a change of the exoskeleton robot from a standing posture to another posture, and the tilt angle is the leaning-forward angle of a waist assembly of the exoskeleton robot relative to a line perpendicular to the ground. The method uses the tilt angle to determine the intent of the user, and thus can reduce the controls to one or two buttons. Further, the control method also monitors the tilt angle to choose a suitable action.
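The tilt-angle-to-action mapping can be illustrated with a short sketch. The thresholds and action names below are invented for the example; the patent does not specify them.

```python
def choose_action(tilt_angle_deg):
    """Map the leaning-forward waist angle (degrees from vertical),
    sampled after the button signal, to a posture-change action."""
    if tilt_angle_deg < 10:
        return "hold_posture"   # nearly upright: no posture change intended
    elif tilt_angle_deg < 30:
        return "sit_down"       # moderate lean: user intends to sit
    else:
        return "walk_forward"   # strong lean: user intends to step forward
```

Because the user's intent is inferred from the measured angle rather than from a dedicated button per action, a single trigger button suffices.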
Performance evaluation apparatus and performance evaluation method for wearable motion assistance device
A performance evaluation apparatus and performance evaluation method capable of efficiently evaluating the performance of a wearable motion assistance device, which assists motions of a wearer's lower back, are provided. With the wearable motion assistance device secured and mounted on both femur links and a trunk link, the torque acting on the pitch-direction axis of each hip joint relative to the trunk link is detected while the driving forces of first and second driving sources are controlled so that the posture of the trunk link and the rotation angles of each hip joint and each knee joint virtually match the motions of the wearer's lower back. The performance of the assist force provided by the wearable motion assistance device is then evaluated based on the detected torque under the drive control of the first and second driving sources.
Data Processing Method for Care-Giving Robot and Apparatus
A data processing method and apparatus for a care-giving robot comprise: receiving data from a target object, the data comprising a capability parameter of the target object; generating a growing model capability parameter matrix of the target object that includes the capability parameter, a capability parameter adjustment value, and a comprehensive capability parameter calculated based on the capability parameter; adjusting the capability parameter adjustment value in the growing model capability parameter matrix to determine an adjusted capability parameter adjustment value; determining whether the adjusted capability parameter adjustment value exceeds a preset threshold; and sending the adjusted capability parameter adjustment value to a machine learning engine when it is within the range of the preset threshold.
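The threshold gate at the end of this flow can be sketched as follows. This is an illustrative reading, not the patented method: the numeric values and the callback-style `send` interface are assumptions.

```python
def maybe_send_to_engine(adjusted_value, threshold, send):
    """Forward the adjusted capability parameter adjustment value to the
    machine learning engine only when it is within the preset threshold."""
    if abs(adjusted_value) <= threshold:
        send(adjusted_value)
        return True
    return False

sent = []
maybe_send_to_engine(0.4, 0.5, sent.append)   # within range: forwarded
maybe_send_to_engine(0.9, 0.5, sent.append)   # exceeds threshold: dropped
```

Gating out-of-range adjustments keeps implausible capability jumps from contaminating the learning engine's training signal.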
Remote control system and remote control method
A remote control system includes: an imaging unit that captures an image of the environment in which a device to be operated, including an end effector, is located; a recognition unit that recognizes objects that can be grasped by the end effector based on the captured image of the environment; an operation terminal that displays the captured image and receives handwritten input information entered on the displayed image; and an estimation unit that, based on the graspable objects and the handwritten input information, estimates which of the graspable objects has been requested to be grasped by the end effector and estimates the way of performing the requested grasping motion on that object.
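One simple heuristic for the estimation step is to pick the recognized object whose image-plane center lies closest to the centroid of the handwritten stroke. This is only a sketch of a plausible matching rule; the patent's actual estimation method is not specified here.

```python
def estimate_target(stroke_points, object_centers):
    """Pick the graspable object whose center is nearest to the centroid
    of the handwritten stroke drawn on the displayed image."""
    cx = sum(p[0] for p in stroke_points) / len(stroke_points)
    cy = sum(p[1] for p in stroke_points) / len(stroke_points)

    def sq_dist(name):
        ox, oy = object_centers[name]
        return (ox - cx) ** 2 + (oy - cy) ** 2

    return min(object_centers, key=sq_dist)

# Recognized graspable objects and their pixel centers (made-up values).
objects = {"cup": (10, 10), "pen": (50, 50)}
target = estimate_target([(47, 51), (49, 53)], objects)
```

A real system would also interpret the stroke's shape (e.g., a circle vs. an arrow) to infer the requested grasping motion, not just the target object.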
Robot apparatus, method for controlling the same, and computer program
A robot apparatus includes an output unit that displays an image including an object on a screen, an input unit that receives an operation performed by a user for specifying information relating to an approximate range including the object in the image, an object extraction unit that extracts information regarding a two-dimensional contour of the object on the basis of the specification received by the input unit, and a position and attitude estimation unit that estimates information regarding a three-dimensional position and attitude of the object on the basis of the information regarding the two-dimensional contour.
Intelligent alerting method, terminal, wearable device, and system
An intelligent alerting method and a terminal obtain human body status information and determine an alerting occasion and mode according to that information. The method includes: obtaining, by a terminal, human body status information of a user, where the human body status information is obtained by combining basic human body parameter data and represents the body status of the user, and the basic human body parameter data is vital sign data detected by the terminal or by a wearable device that has a communications connection with the terminal; determining, by the terminal, an alerting mode according to the human body status information and a predefined alerting policy; and providing, by the terminal, a corresponding alert according to the alerting mode.
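The predefined alerting policy can be pictured as a lookup from body status to alert mode. The status names, modes, and default below are invented for illustration; the patent does not enumerate them.

```python
# Hypothetical policy table: body status -> alert mode.
ALERT_POLICY = {
    "sleeping":   "silent",   # do not disturb the user
    "exercising": "vibrate",  # an audible alert would likely be missed
    "resting":    "ring",
}

def choose_alert_mode(status, policy=ALERT_POLICY):
    """Look up the alert mode for the user's current body status,
    falling back to a default mode for unrecognized statuses."""
    return policy.get(status, "ring")
```

Driving the mode from fused vital-sign status rather than a fixed setting is what lets the terminal pick an alerting occasion appropriate to what the user is doing.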
SHARING LEARNED INFORMATION AMONG ROBOTS
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for sharing learned information among robots. In some implementations, a robot obtains sensor data indicating characteristics of an object. The robot determines a classification for the object and generates an embedding for the object using a machine learning model stored by the robot. The robot stores the generated embedding and data indicating the classification for the object. The robot sends the generated embedding and the data indicating the classification to a server system. The robot receives, from the server system, an embedding generated by a second robot and a corresponding classification. The robot stores the received embedding and the corresponding classification in the local cache of the robot. The robot may then use the information in the cache to identify objects.
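The last step, using the shared cache to identify objects, is commonly done with a nearest-neighbor lookup over embeddings. The sketch below assumes Euclidean distance and a flat list cache; both are illustrative choices, not details from the patent.

```python
import math

def classify(query, cache):
    """Return the label of the cached embedding nearest to the query
    embedding; the cache mixes locally generated entries with entries
    received from other robots via the server."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    best = min(cache, key=lambda entry: dist(query, entry["embedding"]))
    return best["label"]

cache = [
    {"embedding": [0.0, 1.0], "label": "mug"},    # generated locally
    {"embedding": [1.0, 0.0], "label": "chair"},  # received from another robot
]
```

Because both robots' embeddings come from the same shared model, an observation one robot never made can still be recognized via an embedding contributed by another.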