METHOD AND SYSTEM FOR PROGRAMMING A ROBOT
20220410394 · 2022-12-29
Inventors
CPC classification
G06T7/246
PHYSICS
B25J9/1664
PERFORMING OPERATIONS; TRANSPORTING
G05B19/42
PHYSICS
B25J9/1671
PERFORMING OPERATIONS; TRANSPORTING
G05B2219/39449
PHYSICS
G05B2219/39443
PHYSICS
G05B2219/40607
PHYSICS
B25J13/08
PERFORMING OPERATIONS; TRANSPORTING
International classification
B25J13/08
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A method comprising identifying a robotic device and a calibration fixture in a vicinity of the robotic device; referencing the calibration fixture to a base of the robotic device to determine a first pose of the robotic device; receiving, from a sensor, a 3D image of the environment, wherein the 3D image includes the calibration fixture; determining a second pose of the calibration fixture relative to the sensor; determining a third pose of the robotic device relative to the sensor based on the first pose and the second pose; receiving a plurality of trajectory points; determining a plurality of virtual trajectory points corresponding to the plurality of trajectory points based on the 3D image and the third pose; providing for display of the plurality of virtual trajectory points; and providing an interface for manipulating the virtual trajectory points.
Claims
1. A computer-implemented method comprising: identifying, within an environment, a robotic device and a calibration fixture in a vicinity of the robotic device; referencing the calibration fixture to a predetermined portion of the robotic device to determine a first pose of the robotic device relative to the calibration fixture; receiving, from a sensor, a 3D image of the environment, wherein the 3D image includes the calibration fixture; determining, based on the 3D image, a second pose of the calibration fixture relative to the sensor; determining a third pose of the robotic device relative to the sensor based on the first pose and the second pose; receiving a plurality of trajectory points from a display interface or a device interface; and determining a plurality of virtual trajectory points corresponding to the plurality of trajectory points based on the 3D image and the third pose.
2. The method according to claim 1, wherein the determining of the second pose of the calibration fixture relative to the sensor is based on recognizing three-dimensional features of the calibration fixture in the 3D image.
3. The method according to claim 1, further comprising: providing for display of an overlaid virtual representation of a trajectory for the robotic device, wherein the trajectory comprises a sequence of at least some of the plurality of trajectory points.
4. The method according to claim 1, further comprising: generating and displaying a graphical representation of a trajectory orientation at one or a plurality of the virtual trajectory points; and generating and displaying a graphical representation of a tool performing a process along the trajectory.
5. The method according to claim 1, further comprising: receiving, from the display interface, input data indicating one or more adjustments to one or more properties of one or more of the plurality of the virtual trajectory points, the properties being one or more selected from the group consisting of a trajectory position, a trajectory orientation, an end-effector state, a trajectory speed, an electronic signal input, and an electronic signal output; receiving, from the display interface, input data to create, duplicate or delete one or more of the virtual trajectory points; and determining, based on the input data received on the display interface, one or more adjusted properties of one or more of the trajectory points that correspond to the one or more of the plurality of virtual trajectory points.
6. The method according to claim 1, further comprising: receiving, from the display interface or the device interface, input data that provides instructions for the robotic device to move in accordance with one or more of the trajectory points or virtual trajectory points and their respective properties; and transmitting the aforementioned instructions to the robotic device.
7. The method according to claim 6, wherein transmitting the aforementioned instructions to the robotic device includes: translating the virtual trajectory points to physical trajectory points and sending the physical trajectory points to the robotic device.
8. The method according to claim 1, further comprising: providing the display interface for manipulating the virtual trajectory points; receiving, from the display interface, input data indicating creation of and/or adjustment to a trajectory pattern that comprises the plurality of virtual trajectory points arranged in a specified pattern; based on the trajectory pattern created or adjusted, converting the two-dimensional trajectory pattern into a three-dimensional trajectory pattern on the display interface; receiving, from the display interface, input data to translate and/or rotate the three-dimensional trajectory pattern on the display interface; and receiving, from the display interface, input data to project the three-dimensional trajectory pattern onto a portion of the 3D image shown on the display interface.
9. The method according to claim 1, wherein the trajectory points are received in the form of user entries via the display interface or in the form of robotic scripts via the device interface.
10. The method according to claim 1, further comprising: interpolating or extrapolating, based on the 3D image, one or a plurality of trajectory points along a surface of an object or of the environment.
11. The method according to claim 1, further comprising: transmitting robot information, such as robot joint angles, a robot status and an end-effector state, from the robotic device to a mobile device, a laptop or a desktop computer; and generating and displaying the received robot information on a virtual robot that is overlaid onto a representation of the real robotic device on the display interface.
12. The method according to claim 1, further comprising: receiving a 2D image of the environment; providing for display of the 2D image of the environment overlaid with the virtual trajectory points.
13. The method according to claim 12, further comprising: providing for display of the 2D image of the environment superimposed with the 3D image received from the sensor, wherein the 3D image superimposed on the 2D image of the environment can be rendered visible or invisible on the display interface; and in response to changes in orientation of the display device, updating the display to show at least a portion of the 2D image superimposed with the 3D image from a corresponding point of view.
14. A system for programming a robotic device, the system comprising: a sensor; and a computing system communicably coupled to the sensor and configured to: identify, within an environment, a robotic device and a calibration fixture in a vicinity of the robotic device; reference the calibration fixture to a predetermined portion of the robotic device to determine a first pose of the robotic device relative to the calibration fixture; receive, from the sensor, a 3D image of the environment, wherein the 3D image includes the calibration fixture; determine, based on the 3D image, a second pose of the calibration fixture relative to the sensor; determine a third pose of the robotic device relative to the sensor based on the first pose and the second pose; receive a plurality of trajectory points from a display interface or a device interface; and determine a plurality of virtual trajectory points corresponding to the plurality of trajectory points based on the 3D image and the third pose.
15. The system according to claim 14, wherein the computing system is further configured to: determine, based on recognition of 3D features of the calibration fixture in the 3D image, the second pose of the calibration fixture relative to the sensor.
16. The system according to claim 14, wherein the computing system is further configured to: provide the display interface to adjust one or more properties of one or more of the plurality of virtual trajectory points, the properties being one or more selected from the group consisting of a trajectory position, a trajectory orientation, an end-effector state, a trajectory speed, an electronic signal input, and an electronic signal output; provide the display interface to create, duplicate or delete the virtual trajectory points; and determine one or more adjusted properties of one or more of the trajectory points that correspond to the one or more of the plurality of virtual trajectory points.
17. The system according to claim 14, wherein the computing system is further configured to: generate and display a graphical representation of a trajectory orientation at one or a plurality of the virtual trajectory points; and generate and display a graphical representation of a tool performing a process along the trajectory.
18. The system according to claim 14, wherein the computing system is further configured to: provide the display interface for manipulating the virtual trajectory points; receive, from the display interface, input data indicating creation of and/or adjustment to a trajectory pattern that comprises the plurality of virtual trajectory points arranged in a specified pattern; based on the trajectory pattern created or adjusted, convert the two-dimensional trajectory pattern into a three-dimensional trajectory pattern on the display interface; receive, from the display interface, input data to translate and/or rotate the three-dimensional trajectory pattern on the display interface; and receive, from the display interface, input data to project the three-dimensional trajectory pattern onto a portion of the 3D image shown on the display interface.
19. The system according to claim 14, wherein the computing system is further configured to: receive, from the display interface or the device interface, input data that provides instructions for the robotic device to move in accordance with one or more of the trajectory points or virtual trajectory points and their respective properties; and transmit the instructions to the robotic device, including translating the virtual trajectory points to physical trajectory points and sending the physical trajectory points to the robotic device.
20. A non-transitory computer-readable medium having stored therein instructions that, when executed by a computing system, cause the computing system to perform the method of claim 1.
21. A method comprising: identifying, within an environment, a robotic device and a calibration fixture in a vicinity of the robotic device; referencing the calibration fixture to a predetermined portion of the robotic device to determine a first pose of the robotic device relative to the calibration fixture; receiving, from a sensor, a 3D image of the environment, wherein the 3D image includes the calibration fixture; determining, based on the 3D image, a second pose of the calibration fixture relative to the sensor; determining a third pose of the robotic device relative to the sensor based on the first pose and the second pose; providing a display interface for creating a virtual trajectory point; and determining a trajectory point corresponding to the virtual trajectory point based on the 3D image and the third pose.
Description
[0037] For a better understanding of the embodiments, together with other and further features and advantages thereof, reference is made to the following description, taken in conjunction with the accompanying drawings. The scope of the invention will be pointed out in the appended claims.
[0042] Referring to the figures,
[0043] As shown by block 102 of
[0044] Referring to
[0045] Method 100 optionally further comprises determining poses of trajectory points for the robotic device relative to the sensor, as shown in block 106, if an augmented reality presentation is desired. Once the pose of the robotic device relative to the sensor is determined, the robot may be used as a reference point for determining where to virtually overlay trajectory points as part of the optional augmented reality presentation. Because the sensor is coupled to the display device with a known relative displacement, the relative pose between the robot and the visual camera can be derived from the relative pose between the robot and the sensor through pose compensation based on the known relative displacement between the sensor and the visual camera on the display device. The pose of the trajectory points relative to the visual camera of the display device can thereby be determined from the relative pose between the robotic device and the sensor, and these poses may be used to virtually overlay the trajectory points onto the 2D image captured by the visual camera of the display device.
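As an illustration only (not taken from the patent), the following sketch shows how such pose compensation and overlay might be computed: a trajectory point expressed in the robot frame is chained through the robot-to-sensor and sensor-to-camera transforms, then projected into the 2D image with a pinhole camera model. All numeric values, the transform names, and the intrinsics matrix K are assumed example data.

```python
# Minimal sketch of pose compensation and 2D overlay (hypothetical example data).
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from a rotation matrix and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Pose of the robot in the 3D sensor frame (e.g. from calibration-fixture recognition).
T_sensor_robot = make_pose(np.eye(3), np.array([0.5, 0.0, 1.2]))

# Known, fixed displacement between the visual camera and the 3D sensor on the
# display device (here: a 3 cm baseline along x, no rotation).
T_camera_sensor = make_pose(np.eye(3), np.array([0.03, 0.0, 0.0]))

# Pose compensation: the robot's pose in the visual-camera frame.
T_camera_robot = T_camera_sensor @ T_sensor_robot

# A trajectory point expressed in the robot's base frame (homogeneous coordinates).
p_robot = np.array([0.2, 0.1, 0.3, 1.0])

# The same point in the visual-camera frame.
p_camera = T_camera_robot @ p_robot

# Pinhole projection with assumed intrinsics (focal lengths and principal point).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
uv = K @ p_camera[:3]
u, v = uv[0] / uv[2], uv[1] / uv[2]  # pixel coordinates for the overlay
print(f"overlay pixel: ({u:.1f}, {v:.1f})")
```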
[0046] In a step not depicted in
[0047] As shown in block 108, the method 100 comprises a step of providing for display of the 3D image which was received in block 102. Optionally, the 3D image may be superimposed on the 2D image of the environment.
[0048] In a step not depicted in
[0049] As shown in block 110, the method 100 further comprises providing for display of the 3D image, overlaid with the determined plurality of virtual trajectory points. Optionally, the method 100 further comprises providing for display of the 2D image of the environment, overlaid with the determined plurality of virtual trajectory points.
[0050] As shown by block 112 of
[0051] As shown by block 112 of
[0052] The functionality described in connection with the flowcharts herein can be implemented as special-function hardware modules and/or configured general-function hardware modules, or as portions of program code for achieving the specific logical functions, determinations, and/or steps described in connection with the flowchart.
[0053] Functions in the flowchart shown in
[0055] The robotic device 202 includes a predetermined portion, e.g. a base 204, which may be a stationary base or a mobile base. The robotic device may be controlled to operate and move along a trajectory 220 that includes trajectory points 210-218. Additionally, the robotic device may include an end-effector 226 that may take the form of a gripper, such as a finger gripper, or a different type of gripper, such as a suction gripper. The end-effector may take the form of a tool such as a drill, brush or paint gun, and may include sensors such as force sensors, proximity sensors or cameras. Other examples are also possible.
[0056] The display device 222 may be a device that includes an interface and optionally a visual camera that captures a 2D image of the environment. For instance, the display device may be a tablet computer, a handheld smartphone, or part of a mobile device, laptop, notebook, or desktop computer.
[0057] The sensor 224 may be a depth sensor and/or 3D sensor that acquires a 3D image of the environment. The 3D image may be a composition of a series of infrared images, a series of structured-light images, a series of still images, and/or a video stream, or it may be a single still infrared and/or structured-light image. The sensor 224 may be physically secured to the display device 222 through a fixture or an adhesive medium; the fixture coupling the sensor to the display device may possess a detachable or a non-detachable mechanism. For the display device 222 to receive the 3D image from the sensor 224, the sensor 224 may be connected to the display device 222 through a cable (wired) or a wireless connection.
[0058] The calibration fixture 228 is an object with three-dimensional features that is placed in the environment of the robotic device.
[0059] By recognizing the asymmetrical geometric features 208 of the calibration fixture 228 in the 3D image acquired by the sensor 224, the pose of the calibration fixture 228 relative to the sensor 224 may be determined. As the pose of the robotic device 202 relative to the calibration fixture 228 may be known from the referencing step, the pose of the robotic device 202 relative to the sensor 224 may thereby be determined by recognizing the calibration fixture 228 in the 3D image.
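A purely illustrative sketch of one way to realize this two-step determination follows: the fixture's pose in the sensor frame is estimated by registering a known 3D model of the fixture against the captured point cloud (here with Open3D's ICP refinement), and the robot's pose in the sensor frame is then obtained by composing transforms. The file names, the initial guess T_init, and the transform T_fixture_robot are assumptions for the example, not values from the patent.

```python
# Sketch: fixture pose via point-cloud registration, then pose composition.
# Assumes fixture_model.ply (known fixture geometry), scene.ply (sensor cloud),
# and a coarse initial alignment T_init -- all hypothetical inputs.
import numpy as np
import open3d as o3d

fixture_model = o3d.io.read_point_cloud("fixture_model.ply")
scene = o3d.io.read_point_cloud("scene.ply")

# Refine a coarse initial guess with point-to-point ICP to estimate the pose of
# the calibration fixture in the sensor frame (T_sensor_fixture).
T_init = np.eye(4)  # e.g. from a global-registration step, omitted here
result = o3d.pipelines.registration.registration_icp(
    fixture_model, scene,
    max_correspondence_distance=0.01,  # 1 cm
    init=T_init,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
T_sensor_fixture = result.transformation

# First pose from the referencing step: robot base relative to the fixture,
# known by construction of the fixture mounting (example values).
T_fixture_robot = np.array([
    [1.0, 0.0, 0.0, 0.30],
    [0.0, 1.0, 0.0, 0.00],
    [0.0, 0.0, 1.0, 0.00],
    [0.0, 0.0, 0.0, 1.00]])

# Third pose: robot relative to the sensor, by composing the two known poses.
T_sensor_robot = T_sensor_fixture @ T_fixture_robot
print(T_sensor_robot)
```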
[0060] In an alternative embodiment, the calibration fixture may have a base and 3D symmetrical geometric features attached to the base; examples of symmetrical geometric features include the aforementioned shapes or features applicable to asymmetrical geometric features. In another alternative embodiment, the calibration fixture may have a base and 3D non-geometric or irregularly shaped features attached to the base; such non-geometric features may be symmetrical or asymmetrical. In a further alternative embodiment, the number of 3D features may be one or more than one.
[0062] With reference to
[0063] With reference to block 108 of
[0064] With reference to
[0066] The trajectory pattern 404 may undergo a process 406 that converts the two-dimensional trajectory pattern 404 on the user interface 402 into a three-dimensional trajectory pattern 408. The three-dimensional trajectory pattern 408 may be translated and rotated along its three-dimensional coordinate frame upon input data received on the display interface 430. The three-dimensional trajectory pattern 408 may be projected onto the 3D image of object 412, resulting in a projected trajectory pattern 414 that conforms to the surface of object 412. Each virtual trajectory point on the three-dimensional pattern 408 may be projected onto the surface of object 412 via a respective projection path 410, wherein each projection path 410 originates at the respective position on the three-dimensional trajectory pattern 408 and has a projection orientation that may be orthogonal to the plane of the three-dimensional trajectory pattern 408. The projection of the three-dimensional trajectory pattern 408 is not limited to the object 412 shown in the example embodiment. For instance, the three-dimensional trajectory pattern 408 may be projected onto other objects not shown in the example embodiment, such as a turbine blade, an aerofoil, a metal sheet or another manufacturing component. The three-dimensional trajectory pattern 408 may also be projected onto parts of the environment that do not include object 412 or the robotic device, for example a table, floor, wall, fixture or conveyor system.
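As a purely illustrative sketch (not the patent's implementation), such a projection along paths orthogonal to the pattern plane can be realized with ray-surface intersection. Below, Moller-Trumbore ray-triangle intersection projects example pattern points onto a single triangle standing in for the object's surface; the pattern coordinates, the normal, and the triangle are made-up example data.

```python
# Sketch: project a planar 3D trajectory pattern onto a surface along the
# pattern plane's normal, via Moller-Trumbore ray-triangle intersection.
import numpy as np

def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Return the hit point of a ray with triangle (v0, v1, v2), or None."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:           # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    s = origin - v0
    u = np.dot(s, p) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = np.dot(direction, q) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(e2, q) * inv_det
    return origin + t * direction if t >= 0.0 else None

# Planar pattern of virtual trajectory points hovering above the surface,
# and the projection direction orthogonal to the pattern plane.
pattern = [np.array([x, 0.1 * i, 0.5]) for i, x in enumerate(np.linspace(0.0, 0.4, 5))]
normal = np.array([0.0, 0.0, -1.0])

# A single surface triangle standing in for the object's mesh.
tri = (np.array([-1.0, -1.0, 0.0]),
       np.array([ 2.0, -1.0, 0.0]),
       np.array([ 0.0,  2.0, 0.0]))

projected = [ray_triangle(p, normal, *tri) for p in pattern]
for p in projected:
    print(p)  # projected trajectory points conforming to the surface
```

In practice the same intersection test would be run against every triangle of the object's mesh (or against the 3D image's reconstructed surface), keeping the nearest hit per projection path.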
[0067] The present disclosure is not to be limited in terms of particular embodiments described in this application, which are intended as illustrations of various aspects. Many modifications and variations can be made without departing from its spirit and scope, as will be apparent to those skilled in the art. Functionally equivalent methods and apparatuses within the scope of the disclosure, in addition to those enumerated herein, will be apparent to those skilled in the art from foregoing descriptions. Such modifications and variations are intended to fall within the scope of the appended claims.
[0068] The above detailed description describes various features and functions of the disclosed system and method with reference to the accompanying figures. In the figures, similar symbols typically identify similar components, unless context dictates otherwise. The example embodiments described herein and in the figures are not meant to be limiting. Other embodiments can be utilized, and other changes can be made, without departing from the spirit or scope of the subject matter presented herein. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the figures, can be arranged, substituted, combined, separated, and designed in a variety of different configurations, all of which are explicitly contemplated herein.
[0069] A block that represents a processing of information, such as a block of the method described above, may correspond to circuitry that can be configured to perform the specific logical functions of a herein-described method or technique. Alternatively, or additionally, a block that represents a processing of information may correspond to a module, a segment, or a portion of program code (including related data). The program code may include one or more instructions executable by a processor for implementing specific logical functions or actions in the method or technique. The program code and/or related data may be stored on any type of computer readable medium, such as a storage device including a disk or hard drive or another storage medium.
[0070] A block that represents one or more information transmissions may correspond to information transmission between software and/or hardware modules in the same physical device. However other information transmissions may be between software modules and/or hardware modules in different physical devices.
[0071] The computer readable medium may also include non-transitory computer readable media such as computer readable media that stores data for short periods of time like register memory, processor cache, and random-access memory (RAM). The computer readable media may also include non-transitory computer readable media that stores program code and/or data for longer periods of time, such as secondary or persistent long-term storage, like read only memory (ROM), optical or magnetic disks, compact-disc read only memory (CD-ROM). The computer readable medium may be considered a computer readable storage medium or a tangible storage device.
[0072] The particular arrangements shown in the figures should not be viewed as limiting. It should be understood that other embodiments can include more or fewer of each element shown in a given figure. Some of the illustrated elements can be combined or omitted.
[0073] It should be understood that the articles “a”, “an” and “the” as used with regard to a feature or element include a reference to one or more of the features or elements. The term “and/or” includes any and all combinations of one or more of the associated feature or element. The terms “comprising”, “including”, “involving”, and “having” are intended to be open-ended and mean that there may be additional features or elements other than the listed ones. Identifiers such as “first”, “second” and “third” are used merely as labels, and are not intended to impose numerical requirements on their objects, nor construed in a manner imposing any relative position or time sequence between limitations. The term “coupled” may refer to physically coupling, electrically coupling, and/or communicably coupling. The term “coupled” when applied to two objects may refer to the two objects being coupled directly or indirectly through a third object.
[0074] While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope being indicated by the following claims.
REFERENCES
[0075] 100 method
[0076] 102 block
[0077] 104 block
[0078] 106 block
[0079] 108 block
[0080] 110 block
[0081] 112 block
[0082] 202 robotic device
[0083] 204 base
[0084] 206 base
[0085] 208 asymmetrical geometric features
[0086] 210-218 trajectory points
[0087] 220 trajectory
[0088] 222 display device
[0089] 224 sensor
[0090] 226 end-effector
[0091] 228 calibration fixture
[0092] 302 robotic device
[0093] 306 base
[0094] 308 asymmetrical geometric features
[0095] 310-328 virtual trajectory points
[0096] 330 display interface
[0097] 332 object
[0098] 334 calibration fixture
[0099] 402 user interface
[0100] 404 two-dimensional trajectory pattern
[0101] 406 process
[0102] 408 three-dimensional trajectory pattern
[0103] 410 projection path
[0104] 412 object
[0105] 414 projected trajectory pattern
[0106] 430 display interface