Patent classifications
G05B19/427
DETERMINATION OF EXTENTS OF A VIRTUAL REALITY (VR) ENVIRONMENT TO DISPLAY ON A VR DEVICE
A computer-implemented method, according to one approach, includes identifying machines involved in performance of a manufacturing process at a manufacturing location, and identifying a workflow sequence of execution of the machines. Conditions associated with remote operators using virtual reality (VR) devices to remotely control the machines to perform the workflow sequence of execution at the manufacturing location are received. The method further includes determining, for each of the VR devices, an extent of a VR collaborative environment to display. The extents are determined based on the conditions, thereby reducing latency in performance of the workflow sequence of execution at the manufacturing location. The method further includes outputting the extents to the VR devices.
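The extent-determination step described above can be illustrated with a minimal sketch. All names here (`Condition`, `determine_extents`, the 10 Mbps cutoff) are hypothetical, chosen only to show one way per-device conditions could map to a trimmed view of the collaborative environment; the patent does not specify this logic.

```python
# Hypothetical sketch of per-device extent determination.
# Condition fields and the bandwidth threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Condition:
    device_id: str
    machines_controlled: set   # machines this operator remotely drives
    bandwidth_mbps: float      # link quality to the VR device

def determine_extents(conditions, all_machines):
    """Return, per VR device, the subset of the collaborative
    environment (machines) to render; constrained links get a
    trimmed view to reduce latency."""
    extents = {}
    for c in conditions:
        if c.bandwidth_mbps < 10.0:
            # constrained link: render only the operator's own machines
            extents[c.device_id] = set(c.machines_controlled)
        else:
            # healthy link: render the full environment
            extents[c.device_id] = set(all_machines)
    return extents
```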
Robot system
A robot system according to the present disclosure includes a robot installed in a work area; a manipulator configured to be gripped by an operator to manipulate the robot; a sensor disposed in a manipulation area and configured to wirelessly detect position and posture information on the manipulator; and a control device which calculates a locus of the manipulator based on the detected position and posture information and operates the robot in real time.
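One piece of the locus calculation above can be sketched as accumulating path length over the wirelessly sensed position samples. This is a simplification for illustration only: a real controller would also map the posture (orientation) samples onto robot motion targets in real time.

```python
# Illustrative locus (path-length) computation over sensed positions.
import math

def manipulator_locus(samples):
    """Given sensed (x, y, z) positions of the manipulator over time,
    return the cumulative path length traversed so far."""
    length = 0.0
    for p0, p1 in zip(samples, samples[1:]):
        length += math.dist(p0, p1)  # Euclidean distance between consecutive samples
    return length
```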
Controller including means for confirmation in preparation of synchronous operation teaching data
A controller teaches a teaching point of a slave axis corresponding to a master axis so as to perform a synchronous operation. From one moving speed pattern, selected from a plurality of preliminarily registered moving speed patterns of the master axis, and a preliminarily set allowable speed of the slave axis, the controller calculates a teaching range of a following teaching point within which teaching can be performed, and displays the teaching range on a display device.
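The range calculation described above can be sketched roughly as follows. This is a guess at one plausible formulation, not the patented procedure: the time the master axis needs to traverse its segment at the selected speed bounds how far the slave's next teaching point may lie from the previous one without exceeding the slave's allowable speed.

```python
# Hypothetical teaching-range check; all parameters are illustrative.
def teaching_range(prev_slave_pos, master_segment_len, master_speed, allow_slave_speed):
    """Return the interval of slave positions reachable for the next
    teaching point without exceeding the allowable slave speed."""
    dt = master_segment_len / master_speed  # segment duration from the selected speed pattern
    reach = allow_slave_speed * dt          # max slave travel in that duration
    return (prev_slave_pos - reach, prev_slave_pos + reach)
```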
TELE-OPERATIVE SURGICAL SYSTEMS AND METHODS OF CONTROL AT JOINT LIMITS USING INVERSE KINEMATICS
Devices, systems, and methods for controlling manipulator movements include a manipulator arm coupled to a proximal base. The manipulator arm is configured to support an end effector and robotically move the end effector relative to the proximal base, and includes a plurality of joints between the end effector and the proximal base. A processor is configured to calculate joint movements of the plurality of joints that provide a desired position of the end effector using inverse kinematics of the manipulator arm. When a first set of one or more of the joints is at its corresponding joint range-of-motion limits, the processor determines a constraint based on a relationship between joint movement of the first set and of a second set of one or more of the joints, and applies the constraint within the inverse kinematics to provide haptic feedback.
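The general limit-handling idea can be sketched as follows; this is a standard damped-least-squares step with limited joints pinned, not the constraint relationship claimed in the patent. For a single task coordinate, the joints flagged at their range-of-motion limit contribute no motion, and the remaining free joints absorb the commanded end-effector displacement.

```python
# Illustrative one-step IK with joints pinned at their limits.
def ik_step(J_row, dx, at_limit, lam=1e-6):
    """Damped-least-squares step for one task coordinate.
    J_row[i] is the Jacobian entry of joint i; joints flagged in
    at_limit are pinned, so their columns are dropped and the
    free joints take up the motion dx."""
    free = [i for i, pinned in enumerate(at_limit) if not pinned]
    denom = sum(J_row[i] ** 2 for i in free) + lam  # damped normal equation
    dq = [0.0] * len(J_row)
    for i in free:
        dq[i] = J_row[i] * dx / denom
    return dq
```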
Programming of a robotic arm using a motion capture system
An example method includes receiving position data indicative of the position of a demonstration tool. Based on the received position data, the method further includes determining a motion path of the demonstration tool, where the motion path comprises a sequence of positions of the demonstration tool. The method additionally includes determining a replication control path for a robotic device, where the replication control path includes one or more robot movements that cause the robotic device to move a robot tool through a motion path that corresponds to the motion path of the demonstration tool. The method also includes providing for display a visual simulation of the one or more robot movements within the replication control path.
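One plausible reduction from raw captured positions to a sparser replication path can be sketched as distance-based downsampling; the `min_step` threshold and the function below are illustrative assumptions, not the method claimed above.

```python
# Illustrative downsampling of demonstration-tool positions into waypoints.
import math

def replication_path(positions, min_step=0.01):
    """Emit a new waypoint only once the demonstration tool has
    moved at least min_step from the last emitted waypoint."""
    path = [positions[0]]
    for p in positions[1:]:
        if math.dist(p, path[-1]) >= min_step:
            path.append(p)
    return path
```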
Robot control device, robot system, and robot control method
A robot control device includes: a trained model built by being trained on work data; a control data acquisition section which acquires control data of the robot based on data from the trained model; base trained models built for each of a plurality of simple operations by being trained on work data; an operation label storage section which stores operation labels corresponding to the base trained models; a base trained model combination information acquisition section which acquires combination information when the trained model is represented by a combination of a plurality of the base trained models, by acquiring a similarity between the trained model and the respective base trained models; and an information output section which outputs the operation label corresponding to each of the base trained models which represent the trained model.
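The similarity-based combination step can be illustrated with a minimal sketch. The representation of each trained model as a flat feature vector, the cosine-similarity score, and the threshold are all assumptions made for illustration; the patent does not commit to this scheme.

```python
# Hypothetical sketch: score a trained model against base trained models
# and output the operation labels of the sufficiently similar bases.
import math

def combination_labels(model_vec, base_models, threshold=0.5):
    """base_models maps an operation label to that base model's
    feature vector; return labels whose cosine similarity with
    the work model's vector clears the threshold."""
    def cos(a, b):
        num = sum(x * y for x, y in zip(a, b))
        den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return num / den
    return [label for label, vec in base_models.items()
            if cos(model_vec, vec) >= threshold]
```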
Haptic virtual fixture tools
Apparatus and methods for defining and utilizing virtual fixtures for haptic navigation within real-world environments, including underwater environments, are provided. A computing device can determine a real-world object within a real-world environment. The computing device can receive an indication of the real-world object. The computing device can determine a virtual fixture that corresponds to the real-world object based on the indication, where aspects of the virtual fixture are configured to align with aspects of the real-world object. The computing device can provide a virtual environment for manipulating a robotic tool to operate on the real-world object utilizing the virtual fixture. The virtual fixture is configured to provide haptic feedback based on a position of a virtual robotic tool in the virtual environment that corresponds to the robotic tool in the real-world environment.
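A common way to realize position-based fixture feedback of this kind, sketched here purely for illustration (the plane model and stiffness constant are assumptions, not the patented design), is a forbidden-region fixture: when the virtual tool penetrates the fixture surface, a restoring force proportional to the penetration depth is returned along the surface normal.

```python
# Illustrative forbidden-region virtual fixture (planar surface).
def fixture_feedback(tool_pos, plane_point, plane_normal, stiffness=200.0):
    """Return a restoring force on the virtual tool: zero on the
    allowed side of the plane, spring-like pushback on penetration."""
    # signed distance of the tool from the plane along its unit normal
    depth = sum((t - p) * n for t, p, n in zip(tool_pos, plane_point, plane_normal))
    if depth >= 0.0:
        return (0.0, 0.0, 0.0)  # tool on the allowed side: no feedback
    return tuple(-stiffness * depth * n for n in plane_normal)
```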