G05B2219/39393

THREE-DIMENSIONAL IMAGE-CAPTURING DEVICE AND IMAGE-CAPTURING CONDITION ADJUSTING METHOD
20230040615 · 2023-02-09

A 3D image-capturing device that includes at least one camera that acquires a 2D image and distance information of an object, a monitor that displays the 2D image acquired by the camera, and at least one processor including hardware. The processor acquires a first area for which the distance information is not required in the 2D image displayed on the monitor, and sets an image-capturing condition so that the amount of distance information acquired by the camera in the acquired first area is less than or equal to a prescribed first threshold and the amount of distance information acquired by the camera in a second area, which is at least part of an area other than the first area, is greater than a prescribed second threshold that is larger than the first threshold.
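
As a rough illustration of the thresholding logic this abstract describes (not the patented implementation), the condition check could compare the count of valid distance samples inside and outside the first area against the two thresholds. The function and parameter names below are assumptions:

```python
def check_condition(depth_valid, first_mask, t1, t2):
    """Return True when the image-capturing condition is satisfied.

    depth_valid: 2D list of booleans, True where the camera returned
                 distance information for that pixel.
    first_mask:  2D list of booleans, True inside the first area
                 (where distance information is not required).
    t1, t2:      first and second thresholds from the abstract (t2 > t1).
    """
    in_first = out_first = 0
    for dv_row, m_row in zip(depth_valid, first_mask):
        for dv, m in zip(dv_row, m_row):
            if dv and m:
                in_first += 1      # distance sample inside the first area
            elif dv:
                out_first += 1     # distance sample in the second area
    # Satisfied when the first area yields at most t1 samples and the
    # second area yields more than t2 samples.
    return in_first <= t1 and out_first > t2
```

In a full system this check would sit inside a loop that adjusts the image-capturing condition (e.g. exposure or projector power) until it returns True.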

Robot device controller for controlling position of robot
11679508 · 2023-06-20

A first characteristic portion of a first workpiece and a second characteristic portion of a second workpiece are previously determined. A characteristic amount detection unit detects a first characteristic amount related to the position of the first characteristic portion and a second characteristic amount related to the position of the second characteristic portion in an image captured by a camera. A calculation unit calculates, as a relative position amount, the difference between the first characteristic amount and the second characteristic amount. A command generation unit generates a movement command for operating a robot based on a relative position amount in the image captured by the camera and a relative position amount in a predetermined reference image.
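
A minimal sketch of the relative-position control described above, assuming 2-D image features and a proportional command (the gain and all names are illustrative, not from the patent):

```python
def movement_command(feat1, feat2, ref_feat1, ref_feat2, gain=0.5):
    """Generate a movement command from two feature positions.

    feat1, feat2:         positions of the first and second characteristic
                          portions in the current camera image.
    ref_feat1, ref_feat2: the same positions in the reference image.
    """
    # Relative position amount in the current image
    rel = [a - b for a, b in zip(feat1, feat2)]
    # Relative position amount in the reference image
    ref = [a - b for a, b in zip(ref_feat1, ref_feat2)]
    # Drive the robot so the current relative amount approaches the
    # reference relative amount.
    return [gain * (r - c) for c, r in zip(rel, ref)]
```

Because the command depends only on the *difference* of the two feature positions, a common shift of both workpieces in the image produces no spurious motion, which is the point of using a relative amount.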

Machine learning device, robot control device and robot vision system using machine learning device, and machine learning method

A machine learning device includes a state observation unit for observing, as state variables, an image of a workpiece captured by a vision sensor, and a movement amount of an arm end portion from an arbitrary position, the movement amount being calculated so as to bring the image close to a target image; a determination data retrieval unit for retrieving the target image as determination data; and a learning unit for learning the movement amount to move the arm end portion or the workpiece from the arbitrary position to a target position. The target position is a position in which the vision sensor and the workpiece have a predetermined relative positional relationship. The target image is an image of the workpiece captured by the vision sensor when the arm end portion or the workpiece is disposed in the target position.
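
As a toy stand-in for the learning unit (the patent does not specify the model; scalar features and a least-mean-squares fit are assumptions for brevity), learning a movement amount from an image-derived feature could look like:

```python
def train_movement_model(samples, lr=0.1, epochs=200):
    """Fit movement = w * feature + b by stochastic gradient descent.

    samples: list of (feature, movement_amount) pairs, where the
             feature would come from the vision-sensor image and the
             movement amount is the label observed during training.
    """
    w = b = 0.0
    for _ in range(epochs):
        for x, y in samples:
            err = (w * x + b) - y   # prediction error on this sample
            w -= lr * err * x       # gradient step on the weight
            b -= lr * err           # gradient step on the bias
    return w, b
```

A trained model of this kind would predict the movement amount needed to bring the current image close to the target image, which is the mapping the learning unit acquires.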

Automated 3-D modeling of shoe parts

Manufacturing of a shoe is enhanced by creating 3-D models of shoe parts. For example, a laser beam may be projected onto a shoe-part surface, such that a projected laser line appears on the shoe part. An image of the projected laser line may be analyzed to determine coordinate information, which may be converted into geometric coordinate values usable to create a 3-D model of the shoe part. Once a 3-D model is known and is converted to a coordinate system recognized by shoe-manufacturing tools, certain manufacturing steps may be automated.
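
The conversion from a projected-laser-line image to geometric coordinates is classically done by intersecting the camera ray through each detected pixel with the known laser plane. A hedged sketch under an assumed pinhole camera model (focal length in pixels, principal point, laser sheet offset and tilt are all illustrative parameters, not values from the patent):

```python
import math

def laser_point_to_3d(u, v, f, cx, cy, theta, baseline):
    """Triangulate the 3-D point where the laser line hits the surface.

    (u, v):    pixel on the detected laser line.
    f, cx, cy: pinhole intrinsics (focal length in px, principal point).
    theta:     tilt of the laser sheet about the camera Y axis (rad).
    baseline:  X offset of the laser sheet from the camera origin.
    """
    # Camera ray through the pixel (z = 1 normalized image plane)
    d = ((u - cx) / f, (v - cy) / f, 1.0)
    # Laser plane: passes through (baseline, 0, 0), normal tilted by theta
    n = (math.cos(theta), 0.0, -math.sin(theta))
    p0_dot_n = baseline * math.cos(theta)
    t = p0_dot_n / (n[0] * d[0] + n[1] * d[1] + n[2] * d[2])
    return tuple(t * di for di in d)   # point in camera coordinates
```

Repeating this for every pixel of every captured laser line yields the cloud of geometric coordinate values from which the 3-D shoe-part model is built.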

Generation of tool paths for shoe assembly

A tool path for treating a shoe upper may be generated to treat substantially only the surface of the shoe bounded by a bite line. The bite line may be defined to correspond to the junction of the shoe upper and a shoe bottom unit. Bite line data and three-dimensional profile data representing at least a portion of a surface of a shoe upper bounded by a bite line may be utilized in combination to generate a tool path for processing the surface of the upper, such as automated application of adhesive to the surface of a lasted upper bounded by a bite line.
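
The core operation here is restricting 3-D profile data to the region bounded by the bite line before ordering it into a path. A minimal sketch, assuming the bite line is given as a height function of x and points are (x, y, z) tuples (both layout and ordering are illustrative assumptions):

```python
def tool_path_below_bite_line(surface_points, bite_line_z):
    """Keep only surface points at or below the bite line, ordered
    into a simple pass for the tool.

    surface_points: iterable of (x, y, z) profile points on the upper.
    bite_line_z:    callable mapping x to the bite-line height there.
    """
    kept = [p for p in surface_points if p[2] <= bite_line_z(p[0])]
    # Order by (x, y) so the tool sweeps the bounded surface column
    # by column; a production system would use a proper path planner.
    return sorted(kept, key=lambda p: (p[0], p[1]))
```

The returned sequence would then be handed to the tool (e.g. an adhesive applicator) so that substantially only the surface bounded by the bite line is treated.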

ROBOT DEVICE CONTROLLER FOR CONTROLLING POSITION OF ROBOT
20230256615 · 2023-08-17

A first characteristic portion of a first workpiece and a second characteristic portion of a second workpiece are previously determined. A characteristic amount detection unit detects a first characteristic amount related to the position of the first characteristic portion and a second characteristic amount related to the position of the second characteristic portion in an image captured by a camera. A calculation unit calculates, as a relative position amount, the difference between the first characteristic amount and the second characteristic amount. A command generation unit generates a movement command for operating a robot based on a relative position amount in the image captured by the camera and a relative position amount in a predetermined reference image.

Work coordinate generation device

A work coordinate generation device includes a shape register section configured to register shape information about a shape of a work region optically defined on a target which is a work target of a work robot; a first recognition section configured to acquire first image data; a first coordinate generation section configured to generate a first work coordinate which represents the work region of the first target based on a result of recognition of the first recognition section; a second recognition section configured to acquire second image data; and a second coordinate generation section configured to generate a second work coordinate which represents the work region of the second target based on the first work coordinate and a result of recognition of the second recognition section.
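
A translation-only sketch of the second coordinate generation section: the first target's work coordinate is reused, shifted by the displacement between a feature recognized on each target (rotation handling is omitted, and all names are assumptions):

```python
def second_work_coordinate(first_coord, feature1, feature2):
    """Generate the second work coordinate from the first one.

    first_coord: (x, y) work coordinate generated for the first target.
    feature1:    (x, y) position of a feature recognized on the first
                 target (from the first image data).
    feature2:    (x, y) position of the same feature on the second
                 target (from the second image data).
    """
    dx = feature2[0] - feature1[0]
    dy = feature2[1] - feature1[1]
    # The second target's work region is assumed to sit at the same
    # offset from its feature as the first target's did.
    return (first_coord[0] + dx, first_coord[1] + dy)
```

This reflects the abstract's structure: the second coordinate is generated from the first work coordinate plus the second recognition result, rather than from scratch.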

Information processing apparatus, information processing method, and information processing system

There is provided an information processing apparatus that estimates a position of a distal end of a movable unit with a reduced processing load, the information processing apparatus including a position computer that computes, on the basis of first positional information obtained from reading of a projected marker by a first visual sensor and second positional information obtained from reading of the marker by a second visual sensor that moves relative to the first visual sensor, a position of a movable unit in which the second visual sensor is disposed. This makes it possible to estimate the position of the distal end of the movable unit with a reduced processing load.
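
In the simplest translation-only reading of this scheme (rotation between the sensor frames is ignored for brevity; names are illustrative), the position computer reduces to a coordinate difference:

```python
def movable_unit_position(marker_world, marker_in_sensor2):
    """Estimate where the moving sensor (on the movable unit) is.

    marker_world:     marker position fixed in world coordinates by
                      the first (stationary) visual sensor.
    marker_in_sensor2: the same marker as observed in the second
                      (moving) sensor's own frame.
    """
    # With frames aligned, the second sensor's world position is the
    # marker's world position minus its offset in the sensor frame.
    return tuple(w - s for w, s in zip(marker_world, marker_in_sensor2))
```

Because only one marker reading per sensor is needed, rather than a full model of the movable unit's kinematics, the processing load stays small, which matches the abstract's stated benefit.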

Work-implement external-shape measurement system, work-implement external-shape display system, work-implement control system and work machine

A measurement controller (20): computes the position of a plane (S1) representing a side surface of a work implement (1A) in an image-capturing-device coordinate system (Co1) on the basis of an image of the side surface of the work implement captured by an image-capturing device (19) and an internal parameter of the image-capturing device; computes the coordinate values of a point on the work implement in the image-capturing-device coordinate system (Co1), the point corresponding to any pixel constituting the work implement on the captured image, on the basis of positional information on the pixel on the captured image and the position of the plane (S1); and converts the coordinate values of the point on the work implement in the image-capturing-device coordinate system, the point corresponding to the pixel, to coordinate values in a work-implement coordinate system (Co3) to output the coordinate values in the work-implement coordinate system (Co3) to a work-machine controller (50) of a hydraulic excavator (1).
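
The two geometric steps the abstract chains together, computing a 3-D point on plane S1 from a pixel and then converting it from the image-capturing-device frame Co1 to the work-implement frame Co3, can be sketched as a ray-plane intersection followed by a rigid transform. The pinhole model and parameter names are assumptions, not values from the patent:

```python
def pixel_to_work_implement(u, v, f, cx, cy, plane_n, plane_d, R, t):
    """Map a pixel on the work implement to work-implement coordinates.

    (u, v):          pixel constituting the work implement in the image.
    f, cx, cy:       camera intrinsics (focal length in px, principal point).
    plane_n, plane_d: plane S1 in frame Co1, as n . p = d.
    R, t:            rotation (3x3 row-major lists) and translation
                     taking Co1 coordinates to Co3 coordinates.
    """
    # Camera ray through the pixel in frame Co1
    d = ((u - cx) / f, (v - cy) / f, 1.0)
    # Intersect the ray with plane S1: find s with n . (s * d) = plane_d
    s = plane_d / sum(n * di for n, di in zip(plane_n, d))
    p_cam = [s * di for di in d]
    # Convert Co1 -> Co3: p_work = R @ p_cam + t
    return tuple(sum(R[i][j] * p_cam[j] for j in range(3)) + t[i]
                 for i in range(3))
```

The side-surface plane constraint is what makes a single camera sufficient here: without S1, one pixel fixes only a ray, not a 3-D point.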