Patent classifications
G05B2219/40099
SYSTEM AND METHOD FOR SEQUENCING ASSEMBLY TASKS
One embodiment can provide a method and system for configuring a robotic system. During operation, the system can present to a user, on a graphical user interface, an image of a work scene comprising a plurality of components and receive, from the user, a sequence of operation commands. A respective operation command can correspond to a pixel location in the image. For each operation command, the system can determine, based on the image, a task to be performed at a corresponding location in the work scene and generate a directed graph based on the received sequence of operation commands. Each node in the directed graph can correspond to a task, and each directed edge can correspond to a task-performing order, thereby enabling the robotic system to perform a sequence of tasks based on the sequence of operation commands.
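The abstract above describes mapping an ordered list of user commands (each tied to a pixel location) to a directed task graph, where nodes are tasks and edges encode execution order. A minimal sketch of that mapping, assuming illustrative names such as `OperationCommand` and `classify_task` (none of these are the patent's actual identifiers):

```python
# Hypothetical sketch: build a directed task graph from a sequence of
# operation commands, each associated with a pixel location in an image.
from dataclasses import dataclass, field

@dataclass
class OperationCommand:
    pixel: tuple  # (x, y) location the user selected in the work-scene image

@dataclass
class TaskGraph:
    nodes: list = field(default_factory=list)  # one task per operation command
    edges: list = field(default_factory=list)  # (i, j) means task i precedes task j

def classify_task(pixel):
    # Stand-in for image-based task inference at the selected location.
    return f"task_at_{pixel}"

def build_task_graph(commands):
    graph = TaskGraph()
    for cmd in commands:
        graph.nodes.append(classify_task(cmd.pixel))
    # Each consecutive pair of commands yields one directed edge
    # encoding the task-performing order.
    for i in range(len(commands) - 1):
        graph.edges.append((i, i + 1))
    return graph

g = build_task_graph([OperationCommand((10, 20)), OperationCommand((30, 40))])
print(g.nodes)  # ['task_at_(10, 20)', 'task_at_(30, 40)']
print(g.edges)  # [(0, 1)]
```

A chain of edges is the simplest realization of "a task-performing order"; the claim language would equally cover richer graph topologies.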
ROBOTIC KITCHEN HUB SYSTEMS AND METHODS FOR MINIMANIPULATION LIBRARY ADJUSTMENTS AND CALIBRATIONS OF MULTI-FUNCTIONAL ROBOTIC PLATFORMS FOR COMMERCIAL AND RESIDENTIAL ENVIRONMENTS WITH ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING
The present disclosure is directed to methods, computer program products, and computer systems of a robotic kitchen hub for calibrations of multi-functional robotic platforms for commercial and residential environments with artificial intelligence and machine learning. The multi-functional robotic platform includes a robotic kitchen that can be calibrated with either a joint-state trajectory or in a coordinate system, such as a Cartesian coordinate system, for mass installation of robotic kitchens. Calibration verifications and minimanipulation library adaptation and adjustment of any serial model or different models provide scalability in the mass manufacturing of a robotic kitchen system. A robotic kitchen with multiple modes provides a robot mode, a collaboration mode, and a user mode, in which a particular food dish can be prepared by the robot alone, by the robot and a user sharing tasks, or by the user with the robot serving as an aid.
Generation of robotic user interface responsive to connection of peripherals to robot
Methods and systems for connection-driven generation of robotic user interfaces and modification of robotic properties include detecting a connection of a robotic peripheral to a robot; obtaining a peripheral property set corresponding to the robotic peripheral, wherein the peripheral property set includes one or more properties of the robotic peripheral; modifying, based on the peripheral property set, a robotic property set that includes one or more properties of the robot to provide a modified robotic property set; generating, during runtime, a robotic graphical user interface (“RGUI”) dynamically based on the peripheral property set, wherein the RGUI provides at least one user-accessible interface to control the robot and the robotic peripheral; and controlling, based on the modified robotic property set, the robot and the robotic peripheral in response to user input received via the RGUI.
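The flow above is: detect a peripheral connection, merge the peripheral's property set into the robot's property set, and generate a runtime UI exposing both. A hedged sketch under assumed names (`connect_peripheral`, dict-based property sets, and a list-of-widgets "RGUI" stand-in; none of this is the patent's actual API):

```python
# Illustrative sketch of connection-driven UI generation: when a
# peripheral connects, its properties are merged into the robot's
# property set and one control entry per property is emitted.

def connect_peripheral(robot_props, peripheral_props):
    # Modify the robot property set based on the peripheral property set.
    modified = dict(robot_props)
    modified.update({f"peripheral.{k}": v for k, v in peripheral_props.items()})
    # Generate a simple "RGUI" description at runtime: one user-accessible
    # control per property now exposed by robot + peripheral.
    rgui = [{"control": name, "value": value}
            for name, value in sorted(modified.items())]
    return modified, rgui

robot = {"speed": 0.5}
gripper = {"grip_force": 20}       # hypothetical peripheral property
props, ui = connect_peripheral(robot, gripper)
print(props)  # {'speed': 0.5, 'peripheral.grip_force': 20}
```

The key idea the claim captures is that the UI is derived from the property sets at runtime rather than hard-coded per peripheral.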
ROBOT PROGRAMMING
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for robot programming. One of the methods comprises generating an interactive user interface that includes an illustration of a first virtual robot, the first virtual robot having an initial pose that defines respective joint angles of one or more joints of the first virtual robot; receiving user input data specifying a target pose of the first virtual robot; and generating an animation of the first virtual robot transitioning between the initial pose and the target pose.
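The animation step described above, transitioning a virtual robot from an initial pose (a set of joint angles) to a user-specified target pose, can be sketched as per-joint linear interpolation over a fixed frame count. This is an assumed implementation, not necessarily the patent's; the joint names and `animate` function are invented for illustration:

```python
# Minimal sketch: animate a virtual robot between two poses by linearly
# interpolating each joint angle across a fixed number of frames.

def animate(initial_pose, target_pose, frames=5):
    """Yield one joint-angle dict per frame, from initial pose to target pose."""
    for f in range(frames + 1):
        t = f / frames  # interpolation parameter in [0, 1]
        yield {joint: (1 - t) * angle + t * target_pose[joint]
               for joint, angle in initial_pose.items()}

poses = list(animate({"elbow": 0.0, "wrist": 90.0},
                     {"elbow": 45.0, "wrist": 0.0}, frames=3))
print(poses[0])   # {'elbow': 0.0, 'wrist': 90.0}
print(poses[-1])  # {'elbow': 45.0, 'wrist': 0.0}
```

A production system would typically interpolate with velocity/acceleration limits or spline easing rather than a straight line, but the linear case shows the pose-to-pose structure.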
ROBOT CONTROL DEVICE, AND ROBOT SYSTEM
Provided is a robot control device capable of simplifying the work of setting a control center for controlling the operation of a robot. The robot control device controls a robot manipulator equipped with an end effector. The robot control device includes: an image processing unit that, by using a feature extraction model for detecting images of the robot manipulator and position/posture information of the robot manipulator, detects, from images (M1, M2) in which at least part of the robot manipulator is captured, a position in three-dimensional space corresponding to positions (P1, P2) designated on the images, expressed as a position relative to the robot manipulator; and a coordinate system determination unit that sets the control center for controlling the operation of the robot manipulator to the position detected in the three-dimensional space.
Integrating robotic process automations into operating and software systems
Disclosed herein is a computing system that includes a memory and a processor coupled to the memory. The memory stores processor-executable instructions for an interface engine that integrates robotic processes into a graphic user interface of the computing system. The processor executes the interface engine to cause the computing system to receive inputs via a menu of the graphic user interface and to automatically determine the robotic processes for display in response to the inputs. The interface engine further generates a list including selectable links corresponding to the robotic processes and displays the list in association with the menu.
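The interface-engine behavior above, receiving menu input, determining matching robotic processes, and building a list of selectable links, can be sketched as a filter over a process registry. The process names and `rpa://` link format below are invented for illustration:

```python
# Hedged sketch of the interface engine: filter a registry of robotic
# processes by the user's menu input and build selectable link entries.

PROCESSES = ["invoice-extraction", "invoice-approval", "payroll-sync"]

def suggest_processes(menu_input):
    # Automatically determine which robotic processes match the input.
    matches = [p for p in PROCESSES if menu_input.lower() in p]
    # Each match becomes a selectable link displayed beside the menu.
    return [{"label": p, "link": f"rpa://run/{p}"} for p in matches]

print(suggest_processes("invoice"))
# [{'label': 'invoice-extraction', 'link': 'rpa://run/invoice-extraction'},
#  {'label': 'invoice-approval', 'link': 'rpa://run/invoice-approval'}]
```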
SYSTEMS AND METHODS FOR REAL-TIME CONTROL OF A ROBOT USING A ROBOT ANIMATION SYSTEM
Various systems and methods for controlling a target device are disclosed. For example, a system includes a user computing device including a user interface, a user motion database communicatively coupled to the user computing device, a controller, a controller motion database communicatively coupled to the controller, and a target device communicatively coupled to the controller. The user computing device can be configured to connect the user motion database and the controller motion database to share corresponding sets of motion instructions in real-time in response to receiving a sync indication from the user computing device. The target device can be configured to implement the corresponding sets of motion instructions in real-time on the target device.
Input device, method for providing movement commands to an actuator, and actuator system
An input control for providing motion commands to an actuator, including an input screen on which a plurality of movement symbols are arranged, each of which is associated with a motion command for the actuator, and which includes a sequence track for lining up copies of the movement symbols along an alignment direction, having a processor which is configured to interrogate the sequence track in order to determine a sequence of motion commands and to output the sequence of motion commands and/or an actuator control signal sequence which is dependent on the sequence of motion commands, and wherein a sensor signal track is arranged in parallel with the sequence track, which is configured for displaying a sensor signal sequence of at least one sensor signal of a sensor system assigned to the actuator.
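Interrogating the sequence track, reading the lined-up movement-symbol copies in order and mapping each to its associated motion command, can be sketched as follows. The symbol set and command names are assumptions for illustration, not part of the claimed device:

```python
# Sketch: interrogate a sequence track (an ordered row of movement-symbol
# copies) to determine the corresponding sequence of motion commands.

SYMBOL_TO_COMMAND = {"UP": "move_up", "DOWN": "move_down", "RIGHT": "move_right"}

def interrogate(sequence_track):
    """Map each symbol copy on the track to its motion command, in order."""
    return [SYMBOL_TO_COMMAND[symbol] for symbol in sequence_track]

print(interrogate(["UP", "RIGHT", "DOWN"]))
# ['move_up', 'move_right', 'move_down']
```

The parallel sensor signal track described in the abstract would be a second, read-only row aligned with this one, displaying one sensor sample per track position.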