Patent classifications
G05B2219/39393
AUTOMATED 3-D MODELING OF SHOE PARTS
Manufacturing of a shoe is enhanced by creating 3-D models of shoe parts. For example, a laser beam may be projected onto a shoe-part surface, such that a projected laser line appears on the shoe part. An image of the projected laser line may be analyzed to determine coordinate information, which may be converted into geometric coordinate values usable to create a 3-D model of the shoe part. Once a 3-D model is known and is converted to a coordinate system recognized by shoe-manufacturing tools, certain manufacturing steps may be automated.
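The conversion from a laser-line pixel to a geometric coordinate can be illustrated by intersecting the pixel's viewing ray with the known laser plane. This is a minimal sketch, not the patented method; the intrinsics and plane parameters are invented illustrative values.

```python
import numpy as np

# Assumed camera intrinsics (illustrative values).
FX, FY = 800.0, 800.0      # focal lengths in pixels
CX, CY = 320.0, 240.0      # principal point
# Laser sheet modelled as a plane n . p = d in camera coordinates.
PLANE_N = np.array([1.0, 0.0, 0.2])   # plane normal (assumed)
PLANE_D = 0.15                         # plane offset in metres (assumed)

def pixel_to_point(u, v):
    """Back-project a laser-line pixel (u, v) onto the laser plane,
    giving a 3-D point on the shoe-part surface in camera coordinates."""
    ray = np.array([(u - CX) / FX, (v - CY) / FY, 1.0])  # viewing ray
    t = PLANE_D / PLANE_N.dot(ray)                       # ray-plane intersection
    return t * ray
```

Repeating this for every pixel along the projected line, over many laser positions, yields the point cloud from which a 3-D model can be assembled.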
GENERATION OF TOOL PATHS FOR SHOE ASSEMBLY
A tool path for treating a shoe upper may be generated to treat substantially only the surface of the shoe bounded by a bite line. The bite line may be defined to correspond to the junction of the shoe upper and a shoe bottom unit. Bite line data and three-dimensional profile data representing at least a portion of a surface of a shoe upper bounded by a bite line may be utilized in combination to generate a tool path for processing the surface of the upper, such as automated application of adhesive to the surface of a lasted upper bounded by a bite line.
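The combination of profile data and bite line data can be sketched as a filter-and-order step: keep only surface points bounded by the bite line, then sequence them into a sweep path. This is an illustrative simplification that treats the bite line as a constant height, which a real lasted upper would not have.

```python
def tool_path(surface_points, bite_line_z):
    """Keep only surface points at or below the bite line, then order
    them into a sweep path for the tool head (sorted along x, then y).
    surface_points: iterable of (x, y, z) tuples."""
    bounded = [p for p in surface_points if p[2] <= bite_line_z]
    return sorted(bounded, key=lambda p: (p[0], p[1]))
```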
WORK COORDINATE GENERATION DEVICE
A work coordinate generation device includes a shape register section configured to register shape information about a shape of a work region optically defined on a target which is a work target of a work robot; a first recognition section configured to acquire first image data; a first coordinate generation section configured to generate a first work coordinate which represents the work region of the first target based on a result of recognition of the first recognition section; a second recognition section configured to acquire second image data; and a second coordinate generation section configured to generate a second work coordinate which represents the work region of the second target based on the first work coordinate and a result of recognition of the second recognition section.
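The two-stage idea, reusing the first work coordinate for subsequent targets, can be sketched as applying the pose offset recognized on the second target to the coordinates generated for the first. The planar transform and the example coordinates below are invented for illustration.

```python
import math

def transform(points, dx, dy, theta):
    """Apply a planar rigid transform: rotate by theta, then translate."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + dx, s * x + c * y + dy) for x, y in points]

# First work coordinates, generated from full recognition of the first target.
first_coords = [(10.0, 5.0), (12.0, 5.0)]
# For the second target, recognition need only supply its pose offset
# relative to the first target; here an assumed shift of (1, -2), no rotation.
second_coords = transform(first_coords, dx=1.0, dy=-2.0, theta=0.0)
```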
ROBOTIC SYSTEM WITH ERROR DETECTION AND DYNAMIC PACKING MECHANISM
A method for operating a robotic system includes determining a discretized object model based on source sensor data; comparing the discretized object model to a packing plan or to master data; determining a discretized platform model based on destination sensor data; determining height measures based on the destination sensor data; comparing the discretized platform model and/or the height measures to an expected platform model and/or expected height measures; and determining one or more errors by (i) determining at least one source matching error by identifying one or more disparities between (a) the discretized object model and (b) the packing plan or the master data or (ii) determining at least one destination matching error by identifying one or more disparities between (a) the discretized platform model or the height measures and (b) the expected platform model or the expected height measures, respectively.
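The disparity-identification step can be illustrated as a cell-wise comparison of two discretized models. This is a minimal sketch assuming both models are 2-D grids of equal shape, which the abstract does not specify.

```python
def matching_errors(observed, expected):
    """Return the (row, col) indices where a discretized model disagrees
    with its expected counterpart; a non-empty result signals an error."""
    return [(r, c)
            for r, row in enumerate(observed)
            for c, cell in enumerate(row)
            if cell != expected[r][c]]
```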
UTILIZING OPTICAL DATA TO CONTROL OPERATION OF A SNAKE-ARM ROBOT
The present disclosure is related to methods and systems for controlling a snake-arm robot. The method includes receiving real-time image data associated with an operating environment or a location of a workpiece from optical sensor(s) mounted on a robot head of the robot; receiving input data describing a desired pose of the robot head; computing and translating a desired displacement of the robot head; computing a position of each of the links of the snake-arm robot to follow motion of the robot head, a current position of each of the links, and data required to move joints connecting the links to move the robot to the desired pose; generating movement instructions; and transmitting the movement instructions to a drive motor associated with an introduction device or controllers associated with servo-motors operably connected to joints connecting the links of the snake-arm, causing the robot head to move to the desired pose.
Utilizing optical data to dynamically control operation of a snake-arm robot
Methods and systems for controlling a snake-arm robot. In an embodiment, a server computer receives real-time image data associated with at least one of an operating environment and a location of a workpiece from an optical sensor mounted on a robot head of a snake-arm robot, and receives input data describing a desired pose of the robot head from a user device. The server computer then computes a desired velocity of the robot head using an image Jacobian, translates the desired velocity of the robot head into incremental displacement data and rotation data within a control cycle, computes a position of each of a plurality of links comprising a snake-arm of the snake-arm robot to follow motion of the robot head, computes a current position of each of the plurality of links utilizing a forward dynamics model, and computes force and torque data required to move at least one of a plurality of joints connecting the links to move the snake-arm robot to the desired pose. The method also includes generating movement instructions based on the force and torque data, and transmitting the movement instructions to at least one of a drive motor associated with an introduction device and a plurality of controllers associated with servo-motors operably connected to joints connecting the links of the snake arm, causing the robot head to move to the desired pose.
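The step that computes a desired velocity with an image Jacobian and translates it into an incremental displacement per control cycle can be sketched with the standard image-based visual-servoing law; the gain, cycle length, and a square invertible Jacobian are assumptions for illustration.

```python
import numpy as np

def head_increment(jacobian, feature_error, gain=0.5, dt=0.01):
    """Map an image-feature error to an incremental head displacement:
    v = -gain * pinv(J) @ e (classic image-based visual servoing),
    integrated over one control cycle of length dt seconds."""
    v = -gain * np.linalg.pinv(jacobian) @ feature_error  # desired velocity
    return v * dt                                         # displacement per cycle
```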
WORK-IMPLEMENT EXTERNAL-SHAPE MEASUREMENT SYSTEM, WORK-IMPLEMENT EXTERNAL-SHAPE DISPLAY SYSTEM, WORK-IMPLEMENT CONTROL SYSTEM AND WORK MACHINE
A measurement controller (20): computes the position of a plane (S1) representing a side surface of a work implement (1A) in an image-capturing-device coordinate system (Co1) on the basis of an image of the side surface of the work implement captured by an image-capturing device (19) and an internal parameter of the image-capturing device; computes the coordinate values of a point on the work implement in the image-capturing-device coordinate system (Co1), the point corresponding to any pixel constituting the work implement on the captured image, on the basis of positional information on the pixel on the captured image and the position of the plane (S1); and converts the coordinate values of the point on the work implement in the image-capturing-device coordinate system, the point corresponding to the pixel, to coordinate values in a work-implement coordinate system (Co3) to output the coordinate values in the work-implement coordinate system (Co3) to a work-machine controller (50) of a hydraulic excavator (1).
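The pixel-to-coordinate pipeline described above, intersecting a pixel's viewing ray with the computed side-surface plane in the image-capturing-device frame and then converting to the work-implement frame, can be sketched as follows. The intrinsics, plane, and camera-to-implement transform are invented illustrative values.

```python
import numpy as np

FX, FY, CX, CY = 900.0, 900.0, 640.0, 360.0   # assumed camera intrinsics
N, D = np.array([0.0, 1.0, 0.5]), 2.0          # side-surface plane n . p = d

# Assumed rigid transform from the image-capturing-device frame (Co1)
# to the work-implement frame (Co3): rotation R, translation T.
R = np.eye(3)
T = np.array([0.0, -1.0, 0.0])

def pixel_to_implement(u, v):
    """Intersect the pixel's viewing ray with the side-surface plane
    (camera frame), then convert the point to the work-implement frame."""
    ray = np.array([(u - CX) / FX, (v - CY) / FY, 1.0])
    p_cam = (D / N.dot(ray)) * ray   # point on the plane, camera frame
    return R @ p_cam + T             # same point, work-implement frame
```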
Direct client initiated CNC tool setting
Computer numerical control (CNC) machines execute a process automatically unless a condition occurs that triggers one or more alarms that terminate the process. Accordingly, CNC laser cutting post-process inspection is usually non-existent or minimal. With CNC laser welding, however, it is more common for a visual or automated inspection to be performed to verify that the process was completed. Similar issues occur when single piece parts are required; in addition, executing an offline inspection adds complexity to re-working any piece part. Accordingly, embodiments of the invention provide enterprises and facilities employing CNC laser cutting/welding systems with a means to overcome these limitations. Further, providing intuitive user interfaces allows the user to perform tasks directly through the same touch screen interface upon which they view the work piece/piece-parts.
Method and robotic system for manipulating instruments
An approach relates to manipulation of tools or instruments in the performance of a task by a robot. In accordance with this approach, sensor data is acquired and processed to identify a subset of instruments initially susceptible to manipulation. The instruments are then manipulated in the performance of the task based on the processed sensor data.