Patent classifications
G05B2219/37425
Laser rangefinder having common optical path
The present invention relates to a laser rangefinder having a common optical path, comprising a housing assembly, a control and display assembly mounted in the housing assembly, a mounting device mounted in the housing assembly, an imaging module mounted on the mounting device, and a laser module movably mounted on the mounting device longitudinally next to the imaging module, the laser module being electrically connected with the control and display assembly. The laser module comprises a laser emitting apparatus, a laser receiving apparatus, and a reference point indicating apparatus, wherein the light paths of the laser emitting apparatus and the laser receiving apparatus are configured to be independent of the light path of the reference point indicating apparatus, and the optical axis of the laser emitting apparatus's light path directed at the target object is collinear with the optical axis of the reference point indicating apparatus.
NUMERICAL CONTROL DEVICE, NUMERICAL CONTROL SYSTEM, PROGRAM, AND NUMERICAL CONTROL METHOD
This numerical control device comprises a distance control unit, a filter unit, a determination unit, and a setting unit. The distance control unit controls the distance between a first object and a second object to be close to a target distance. The filter unit filters a signal indicating the distance. The determination unit determines, on the basis of the target distance, a time constant obtained from the relationship between the distance and an output signal from a distance sensor that measures the distance. The setting unit sets a time constant of the filter to the time constant determined by the determination unit.
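The abstract above describes choosing a filter time constant as a function of the target distance and applying that filter to the distance signal. A minimal sketch in Python, assuming a simple threshold lookup table and a discrete first-order low-pass filter; the numeric values and the table-based selection are illustrative placeholders, not the patent's method:

```python
class FirstOrderFilter:
    """Discrete first-order low-pass filter: y += alpha * (x - y)."""

    def __init__(self, tau, dt):
        self.alpha = dt / (tau + dt)  # smoothing factor from time constant
        self.y = None

    def update(self, x):
        if self.y is None:
            self.y = x  # initialize on the first sample
        else:
            self.y += self.alpha * (x - self.y)
        return self.y


def time_constant_for(target_distance,
                      table=((0.001, 0.002), (0.01, 0.005), (0.1, 0.02))):
    """Pick a time constant from a (distance-threshold, tau) table.

    All numbers are placeholders; the patent determines tau from the
    relationship between the distance and the sensor output, which the
    abstract does not specify.
    """
    for threshold, tau in table:
        if target_distance <= threshold:
            return tau
    return table[-1][1]


# Usage: filter a distance signal with a tau chosen for the target distance.
tau = time_constant_for(0.005)       # -> 0.005 under this placeholder table
f = FirstOrderFilter(tau, dt=0.001)
```

The point of the sketch is the coupling between the two pieces: the setting unit re-parameterizes the filter whenever the target distance changes, rather than using one fixed time constant.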
Sensorized Robotic Gripping Device
A robotic gripping device is provided. The robotic gripping device includes a palm and a plurality of digits coupled to the palm. The robotic gripping device also includes a time-of-flight sensor arranged on the palm such that the time-of-flight sensor is configured to generate time-of-flight distance data in a direction between the plurality of digits. The robotic gripping device additionally includes an infrared camera, including an infrared illumination source, where the infrared camera is arranged on the palm such that the infrared camera is configured to generate grayscale image data in the direction between the plurality of digits.
Control device, robot, and robot system
A control device includes a processor that is configured to execute computer-executable instructions so as to control an arm included in a robot. The processor controls the arm to perform work on a target using a tool. A distance meter measures a measurement value according to a relative distance between a first position on the target and the tool. The first position includes a portion overlapping with the tool when viewed from the tool toward the target.
Robot path planning method with static and dynamic collision avoidance in an uncertain environment
The present disclosure relates to robot path planning. Depth information of a plurality of obstacles in the environment of a robot is obtained at a first time instant. A static distance map is generated based on the depth information, and a path for the robot is computed based on the static distance map. At a second time instant, depth information of one or more obstacles is obtained, and a dynamic distance map is generated based on the one or more obstacles, wherein for each obstacle that satisfies a condition: a vibration range of the obstacle is computed based on the position of the obstacle and the static distance map, and the obstacle is classified as a dynamic obstacle or a static obstacle based on a criterion associated with the vibration range. A repulsive speed of the robot is computed based on the dynamic distance map to avoid the dynamic obstacles.
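The vibration-range classification and repulsive-speed steps above can be sketched as follows. This is a hedged illustration only: it assumes a 1-D vibration range (spread of an obstacle's observed positions) and an artificial-potential-field-style repulsion; the patent's actual criterion and distance-map representation are not specified in the abstract.

```python
def vibration_range(samples):
    """Spread of an obstacle's observed positions over time (1-D for brevity)."""
    return max(samples) - min(samples)


def classify(samples, threshold=0.05):
    """Dynamic if the position spread exceeds the threshold, else static."""
    return "dynamic" if vibration_range(samples) > threshold else "static"


def repulsive_speed(distance, influence_radius=1.0, gain=0.5):
    """Potential-field-style repulsion (assumes distance > 0): zero outside
    the influence radius, growing as the robot nears the obstacle."""
    if distance >= influence_radius:
        return 0.0
    return gain * (1.0 / distance - 1.0 / influence_radius)
```

A sensor-noise-only obstacle yields a small spread and stays static, so no repulsion is applied to it; a moving obstacle crosses the threshold and contributes a repulsive speed that increases sharply as the robot approaches.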
LEARNING DEVICE, LEARNING METHOD, LEARNING MODEL, DETECTION DEVICE AND GRASPING SYSTEM
An estimation device includes a memory and at least one processor. The at least one processor is configured to acquire information regarding a target object. The at least one processor is configured to estimate information regarding a location and a posture of a gripper relating to where the gripper is able to grasp the target object. The estimation is based on an output of a neural model having as an input the information regarding the target object. The estimated information regarding the posture includes information capable of expressing a rotation angle around a plurality of axes.
Tool posture control apparatus
A tool posture control apparatus includes a robot which supports a tool for performing a predetermined task on a target object, the robot being capable of changing the posture of the tool; a sensor supported by the robot; and a control device which changes the posture of the tool by controlling the robot. The sensor measures a distance between the target object and each of a plurality of measurement reference positions around the tool, and the control device performs a posture control process of controlling the robot in such a way that a measured-distance difference, that is, the difference between the distances measured by the sensor, comes close to a target value.
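The control loop above, which drives a measured-distance difference toward a target value, can be sketched with a proportional controller. The two-sensor setup and the gain are illustrative assumptions, not the patent's control law:

```python
def posture_correction(d_left, d_right, target_diff=0.0, gain=0.1):
    """Proportional correction: tilt the tool so the difference between the
    two measured distances approaches target_diff."""
    error = (d_left - d_right) - target_diff
    return -gain * error


# Toy simulation: assume the distance difference equals the tilt angle, so
# each correction shrinks the difference geometrically toward zero.
tilt = 1.0
for _ in range(50):
    tilt += posture_correction(tilt, 0.0)
```

With `target_diff = 0.0` the loop levels the tool against the surface; a nonzero target difference would hold the tool at a deliberate angle.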
Motion trajectory generation apparatus
An operation processor of the motion trajectory generation apparatus specifies the target object by extracting first point cloud data that corresponds to the target object from a depth image of the vicinity of the target object acquired by a depth image sensor. It excludes the first point cloud data from second point cloud data, which is the point cloud data of the vicinity of the target object in the depth image. Using the second point cloud data after this exclusion, it estimates third point cloud data, which corresponds to an obstacle present in the spatial area from which the first point cloud data was excluded, supplements the estimated third point cloud data into that spatial area, and generates the plan of the motion trajectory.
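The exclude-then-supplement step above can be sketched as follows, treating point clouds as lists of coordinate tuples. `fill_fn` is a caller-supplied placeholder standing in for the patent's obstacle-estimation step, which the abstract does not specify:

```python
def exclude_and_fill(scene, target, fill_fn):
    """Drop the target's points from the scene cloud, then let fill_fn
    estimate obstacle points for the vacated spatial area (here a
    hypothetical heuristic supplied by the caller)."""
    target_set = set(map(tuple, target))
    remainder = [p for p in scene if tuple(p) not in target_set]
    return remainder + fill_fn(target)


# Usage: the target point is removed and replaced by an estimated obstacle
# point behind it (the fill heuristic here is purely illustrative).
scene = [(0, 0, 0), (1, 0, 0), (2, 0, 0)]
target = [(1, 0, 0)]
cloud = exclude_and_fill(scene, target, lambda t: [(1.0, 0.0, 0.5)])
```

The motivation, per the abstract, is that removing the grasp target leaves an unobserved volume; filling that volume with estimated obstacle points keeps the trajectory planner from routing the arm through space it cannot actually verify is free.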