Patent classifications
B25J9/1656
PICKING SYSTEM AND METHOD
Provided is a picking system which can suitably extract a workpiece by machine learning. The picking system is provided with: a robot which has a hand; an acquisition unit which acquires a two-dimensional camera image of an area where a plurality of workpieces are present; a teaching unit which can display the two-dimensional camera image and teach a picking position of a target workpiece to be extracted by the hand from among the plurality of workpieces; a training unit which generates a trained model on the basis of the two-dimensional camera image and the taught picking position; an inference unit which infers the picking position of the target workpiece on the basis of the trained model and the two-dimensional camera image; and a control unit which controls the robot to extract the target workpiece by means of the hand on the basis of the inferred picking position.
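The teach-train-infer flow this abstract describes can be sketched as follows. This is purely an illustrative toy, not the patented system: the class names, the list-of-pairs "trained model", and the pixel-difference similarity are all assumptions made for brevity.

```python
# Hypothetical sketch of the teach -> train -> infer picking flow.
# All names and the toy nearest-neighbour "model" are illustrative.

class TeachingUnit:
    """Collects (image, picking position) pairs taught by an operator."""
    def __init__(self):
        self.samples = []

    def teach(self, image, position):
        self.samples.append((image, position))

class TrainingUnit:
    """Builds a trained 'model' from the taught samples."""
    def train(self, samples):
        return list(samples)  # here the 'model' is just the stored pairs

class InferenceUnit:
    """Infers a picking position for a new camera image."""
    def infer(self, model, image):
        # toy similarity: fewest differing pixels against each taught image
        best = min(model, key=lambda s: sum(a != b for a, b in zip(s[0], image)))
        return best[1]

teach = TeachingUnit()
teach.teach(image=[0, 0, 1, 1], position=(120, 45))
teach.teach(image=[1, 1, 0, 0], position=(30, 210))
model = TrainingUnit().train(teach.samples)
pos = InferenceUnit().infer(model, [0, 1, 1, 1])
print(pos)  # nearest taught image is [0, 0, 1, 1] -> (120, 45)
```

In the actual system the training unit would fit a learned model (e.g. a neural network) rather than memorize samples, but the division of labor between the four units is the same.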
ROBOT PROGRAM GENERATION METHOD FROM HUMAN DEMONSTRATION
A method for teaching a robot to perform an operation based on human demonstration using force and vision sensors. The method uses a vision sensor to detect the position and pose of both the human's hand and, optionally, a workpiece during teaching of an operation such as pick, move and place. The force sensor, located either beneath the workpiece or on a tool, is used to detect force information. Data from the vision and force sensors, along with other optional inputs, are used to teach both motions and state change logic for the operation being taught. Several techniques are disclosed for determining state change logic, such as the transition from approaching to grasping. Techniques for improving motion programming by removing extraneous motions of the hand are also disclosed. Robot programming commands are then generated from the hand position and orientation data, along with the state transitions.
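State change logic such as the approach-to-grasp transition mentioned above can be pictured as a small state machine driven by sensor samples. The thresholds, signal names, and transition conditions below are assumptions for illustration, not values from the patent.

```python
# Illustrative state-change logic for a taught pick-move-place operation.
# Thresholds and sensor signal names are assumed, not from the patent.

APPROACH, GRASP, MOVE, PLACE = "approach", "grasp", "move", "place"

def next_state(state, hand_speed, grip_force, workpiece_moving):
    """Advance the operation state from one sensor sample."""
    if state == APPROACH and hand_speed < 0.01 and grip_force > 1.0:
        return GRASP   # hand has stopped and the force sensor sees contact
    if state == GRASP and workpiece_moving:
        return MOVE    # vision sensor sees the workpiece move with the hand
    if state == MOVE and hand_speed < 0.01 and grip_force < 0.2:
        return PLACE   # hand stopped and the force has been released
    return state

state = APPROACH
state = next_state(state, hand_speed=0.005, grip_force=2.5, workpiece_moving=False)
print(state)  # grasp
```

Replaying the recorded demonstration through such a machine yields the state transitions from which robot programming commands are generated.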
OPERATION SYSTEM FOR INDUSTRIAL MACHINERY
An operation system for industrial machinery comprises: an information acquisition unit which acquires machine identification information corresponding to an industrial machine; a machine identification unit which identifies the industrial machine on the basis of the acquired machine identification information; a model projection unit which projects a model corresponding to the identified industrial machine into a virtual space; a distance/direction calculation unit which calculates the distance and direction of a user observing the model with respect to the projected model; a gesture observation unit which observes a gesture of the user as an instruction from the user to the identified industrial machine; an instruction determination unit which determines whether or not the user can give an instruction; and an operation instruction unit which operates the identified industrial machine on the basis of the observed gesture of the user when the determination result is positive.
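The gating behavior described above, where a gesture only operates the machine if the determination step is positive, can be sketched as below. The distance criterion, the gesture vocabulary, and all names are hypothetical; the patent does not specify how the determination is made.

```python
# Sketch of the permission gate above: a gesture is acted on only when the
# instruction-determination step (here: a simple distance check against the
# projected model) is positive. The threshold and gestures are assumed.
import math

def may_instruct(user_pos, model_pos, max_distance=2.0):
    """User may instruct only if close enough to the projected model."""
    return math.dist(user_pos, model_pos) <= max_distance

def handle_gesture(gesture, user_pos, model_pos):
    if not may_instruct(user_pos, model_pos):
        return "ignored"
    return {"swipe_right": "start", "fist": "stop"}.get(gesture, "ignored")

print(handle_gesture("fist", user_pos=(1.0, 0.5), model_pos=(0.0, 0.0)))  # stop
print(handle_gesture("fist", user_pos=(5.0, 0.0), model_pos=(0.0, 0.0)))  # ignored
```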
METHOD FOR CONTROLLING ROBOT, ROBOT, AND RECORDING MEDIUM
A robot detects, through a sensor, the location and movement direction of a user and an object near the user, sets a nearby ground area in front of the user's feet according to the detected location and movement direction of the user, controls an illumination device in the robot to irradiate the nearby ground area with light while driving at least one pair of legs or wheels of the robot to cause the robot to accompany the user, specifies the type and the location of the detected object, and, if the object is a dangerous object and is located ahead of the user, controls the illumination device to irradiate a danger area including at least a portion of the dangerous object with light in addition to irradiating the nearby ground area with light.
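The illumination decision above can be reduced to a small rule: always light the ground ahead of the user, and additionally light any dangerous object lying ahead. The geometry here (2D positions, a dot product to decide "ahead") and all names are simplifications for illustration.

```python
# A minimal sketch of the accompanying-illumination decision. The helper
# names and the dot-product notion of "ahead" are illustrative assumptions.

def light_targets(user_pos, user_dir, objects):
    """Return the areas the robot's illumination device should irradiate."""
    # the nearby ground area is always lit, one step along the movement direction
    targets = [("nearby_ground", (user_pos[0] + user_dir[0],
                                  user_pos[1] + user_dir[1]))]
    for kind, pos in objects:
        ahead = (pos[0] - user_pos[0]) * user_dir[0] + \
                (pos[1] - user_pos[1]) * user_dir[1] > 0
        if kind == "dangerous" and ahead:
            targets.append(("danger_area", pos))  # also light the danger area
    return targets

areas = light_targets(user_pos=(0, 0), user_dir=(1, 0),
                      objects=[("dangerous", (3, 1)), ("benign", (2, 0)),
                               ("dangerous", (-2, 0))])
print(areas)  # nearby ground plus the dangerous object ahead at (3, 1)
```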
Registration System and Method for robot-oriented augmented reality teaching system
A registration system for a robot-oriented augmented reality teaching system, comprising: a physical robot unit, a registration unit, a virtual robot generation unit and a computer; the physical robot unit comprising a physical robot, a physical robot controller and a robot point-to-point intermittent movement control program; the physical robot provided thereon with a physical robot base coordinate system; the physical robot controller connected with the physical robot and the computer respectively; the robot point-to-point intermittent movement control program installed in the computer; the registration unit comprising a registration marker, a camera and a conversion calculation unit; the registration marker arranged on the physical robot body; the camera fixed in the physical environment at a location other than the physical robot; the camera connected with the computer, and the conversion calculation unit arranged in the computer; the virtual robot generation unit arranged in the computer and used for generating a virtual robot model.
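The conversion calculation this registration unit performs amounts to chaining poses: the marker's known pose on the robot body, combined with the marker pose the camera observes, gives the robot base frame in camera coordinates, which is what the virtual robot model is aligned to. The sketch below uses 2D translation-only poses purely for brevity; a real implementation would use full rigid-body transforms.

```python
# Illustrative registration computation: the marker's pose seen by the
# camera, combined with its known pose on the robot body, yields the
# camera-to-robot-base relation. Translation-only 2D poses for brevity.

def compose(a, b):
    """Compose two translation-only transforms (tx, ty)."""
    return (a[0] + b[0], a[1] + b[1])

def invert(t):
    return (-t[0], -t[1])

marker_in_base = (0.1, 0.0)     # known: where the marker sits on the robot body
marker_in_camera = (1.5, -0.4)  # measured: camera detects the registration marker

# base origin in camera frame = marker_in_camera composed with base->marker inverse
base_in_camera = compose(marker_in_camera, invert(marker_in_base))
print(base_in_camera)  # (1.4, -0.4)
```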
Control method and robot system
A control method for a robot system having a robot arm and executing an operation mode of the robot arm having an execution mode in which a motion program is executed and a teaching mode in which the motion program is taught, includes setting an upper limit velocity of a motion velocity of the robot arm to a first velocity when the operation mode is the execution mode, and setting the upper limit velocity to a second velocity lower than the first velocity when the operation mode is the teaching mode.
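The mode-dependent velocity cap above is simple enough to state directly in code. The numeric limits are invented for illustration; the patent only requires the teaching-mode limit to be lower than the execution-mode limit.

```python
# Hedged sketch of the mode-dependent upper velocity limit. The limit
# values are assumed; only second < first is required by the method.

EXECUTION, TEACHING = "execution", "teaching"
FIRST_VELOCITY = 1.0    # m/s, execution-mode cap (assumed value)
SECOND_VELOCITY = 0.25  # m/s, teaching-mode cap (assumed, lower value)

def upper_limit(mode):
    return FIRST_VELOCITY if mode == EXECUTION else SECOND_VELOCITY

def clamp_velocity(mode, commanded):
    """Clamp a commanded arm velocity to the current mode's upper limit."""
    return min(commanded, upper_limit(mode))

print(clamp_velocity(EXECUTION, 0.8))  # 0.8
print(clamp_velocity(TEACHING, 0.8))   # 0.25
```

The same commanded motion is thus slowed automatically whenever the operator switches the robot into teaching mode.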
Systems and Methods for Doubles Detection and Mitigation
The technology is directed to training a system to generate pick instructions. A teleoperator system may receive data corresponding to a robot attempting a picking task, including picking an item of an identified product type from a container. The data may include imagery of an end effector of the robot after the attempted picking task. The teleoperator system may display the imagery on a display and receive an input indicating whether the picking task was performed successfully or unsuccessfully by the robot. The data may be labeled based on the input and transmitted to a processor for training a learning algorithm for use in generating future pick instructions.
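The labeling loop described here can be sketched as below: an attempt record with end-effector imagery is shown to an operator, their judgment is attached as a label, and the labeled record is queued for training. All field names are assumptions for illustration.

```python
# Illustrative teleoperator labeling flow. Record field names are assumed.

def label_attempt(attempt, operator_judgment):
    """Attach the operator's success/failure judgment to an attempt record."""
    return {**attempt, "label": "success" if operator_judgment else "failure"}

training_queue = []  # labeled records destined for the learning algorithm
attempt = {"product": "SKU-1234", "end_effector_image": "frame_0042.png"}

# operator views the imagery and judges the pick a failure (e.g. a double pick)
training_queue.append(label_attempt(attempt, operator_judgment=False))
print(training_queue[0]["label"])  # failure
```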
User-assisted robotic control systems
Exemplary embodiments relate to user-assisted robotic control systems, user interfaces for remote control of robotic systems, vision systems in robotic control systems, and modular grippers for use by robotic systems. The systems, methods, apparatuses and computer-readable media instructions described interact with and control robotic systems, in particular pick and place systems using soft robotic actuators to grasp, move and release target objects.
Modular robot system
A modular robot system can be formed by assembling a plurality of cube type unit robots. The modular robot system includes N cube type unit robots, wherein: one of the N cube type unit robots serves as a central control terminal, and the cube type unit robot serving as the central control terminal assigns a distinguishable ID number to each of the N cube type unit robots; each unit robot includes a cube-shaped housing, a step motor and a control unit installed inside the housing; one surface of the housing includes a mounting groove in which a rotational body rotated by a rotation shaft of the step motor is mounted, and another surface includes a connection groove of the same shape as the mounting groove; and different cube type unit robots can be connected to each other by means of a connection body mounted in the connection groove.
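The ID-assignment step, where one unit acts as the central control terminal and gives every unit a distinguishable number, can be sketched as follows. The class layout and the choice of sequential integer IDs are illustrative only.

```python
# Minimal sketch of the central-terminal ID assignment described above.
# Sequential integer IDs and the class layout are assumptions.

class CubeUnitRobot:
    def __init__(self):
        self.unit_id = None  # assigned later by the central control terminal

def assign_ids(units, terminal_index=0):
    """The unit at terminal_index acts as the central control terminal."""
    terminal = units[terminal_index]
    for i, unit in enumerate(units):
        unit.unit_id = i  # a distinguishable ID per unit, including the terminal
    return terminal

units = [CubeUnitRobot() for _ in range(4)]
terminal = assign_ids(units)
print([u.unit_id for u in units])  # [0, 1, 2, 3]
```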
METHOD OF GENERATING CONTROL PROGRAM FOR ROBOT, STORAGE MEDIUM, AND TEACHING APPARATUS
A method of generating a control program for a robot includes generating a trajectory in which a robot arm moves between a plurality of teaching points based on a first constraint condition with respect to a movement time of the robot arm and a second constraint condition with respect to a drive condition for driving the robot arm by a processor, displaying the trajectory generated by the processor and accumulated power consumption when the robot arm moves along the trajectory by a display unit, and, when receiving an instruction to employ the trajectory, generating a control program for the robot based on the trajectory by the processor.
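The selection logic above, where candidate trajectories are checked against a movement-time constraint and a drive-condition constraint and presented with their accumulated power consumption, can be sketched like this. The candidate data, the limits, and the interpretation of the drive condition as a joint-velocity cap are all invented for illustration.

```python
# Toy sketch of constraint-based trajectory selection. Candidate values,
# both limits, and the joint-velocity drive condition are assumptions.

candidates = [
    {"name": "A", "move_time_s": 1.2, "peak_joint_vel": 2.8, "energy_J": 140},
    {"name": "B", "move_time_s": 0.9, "peak_joint_vel": 3.6, "energy_J": 180},
    {"name": "C", "move_time_s": 1.4, "peak_joint_vel": 2.2, "energy_J": 120},
]

MAX_MOVE_TIME = 1.3   # first constraint: movement time of the robot arm
MAX_JOINT_VEL = 3.0   # second constraint: drive condition for the arm

# keep only trajectories satisfying both constraints
feasible = [c for c in candidates
            if c["move_time_s"] <= MAX_MOVE_TIME
            and c["peak_joint_vel"] <= MAX_JOINT_VEL]

# present the feasible trajectory with the lowest accumulated power consumption
best = min(feasible, key=lambda c: c["energy_J"])
print(best["name"], best["energy_J"])  # A 140
```

In the described method the displayed trajectory and its accumulated power consumption are shown to the user, and the control program is generated only after the user instructs the apparatus to employ that trajectory.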