Object handling control device, object handling device, object handling method, and computer program product
11559894 · 2023-01-24
Assignee
- Kabushiki Kaisha Toshiba (Tokyo, JP)
- TOSHIBA INFRASTRUCTURE SYSTEMS & SOLUTIONS CORPORATION (Kawasaki, JP)
Inventors
CPC classification
B25J9/1633
PERFORMING OPERATIONS; TRANSPORTING
G05B2219/40517
PHYSICS
B25J9/161
PERFORMING OPERATIONS; TRANSPORTING
B25J9/1666
PERFORMING OPERATIONS; TRANSPORTING
International classification
Abstract
An object handling control device includes one or more processors configured to acquire at least object information and status information representing an initial position and a destination of an object; set, when a grasper grasping the object moves from the initial position to the destination, a first region, a second region, and a third region in accordance with the object information and the status information; and calculate a moving route along which the object is moved from the initial position to the destination with reference to the first region, the second region, and the third region.
Claims
1. An object handling control device, comprising: one or more processors configured to: acquire at least object information and status information, the object information representing an object grasped by a grasper, the status information representing an initial position and a destination of the object; set, when the grasper grasping the object moves from the initial position to the destination, a first region, a second region, and a third region in accordance with the object information and the status information, the first region being a region in which the grasper is allowed to move without being restricted by an obstacle present in a space between the initial position and the destination, the second region being a region in which the grasper is restricted from moving due to the obstacle, the third region at least part of which is set below the second region, the third region being a region in which the grasper is operated under force control; and calculate a moving route along which the object is moved from the initial position to the destination with reference to the first region, the second region, and the third region, wherein the third region includes: a first control region in which a pressing control is performed as the force control, and a second control region in which a repulsive control is performed as the force control, and in the pressing control, the object is pressed against a surface of the obstacle substantially orthogonally under a first pressing condition, and in the pressing control, the one or more processors change the destination and recalculate the moving route when a contact area between a pressing surface of the object and a pressing surface of the obstacle is smaller than a contact threshold.
2. The object handling control device according to claim 1, wherein in the second region, at least the grasper is restricted from approaching the obstacle from upward to downward.
3. An object handling device comprising: the object handling control device according to claim 1; a grasper; and a motion controller that causes the grasper grasping the object to move along the calculated moving route.
4. An object handling method, comprising: acquiring at least object information and status information, the object information representing an object grasped by a grasper, the status information representing an initial position and a destination of the object; setting, when the grasper grasping the object moves from the initial position to the destination, a first region, a second region, and a third region in accordance with the object information and the status information, the first region being a region in which the grasper is allowed to move without being restricted by an obstacle present in a space between the initial position and the destination, the second region being a region in which the grasper is restricted from moving due to the obstacle, the third region at least part of which is set below the second region, the third region being a region in which the grasper is operated under force control; calculating a moving route along which the object is moved from the initial position to the destination with reference to the first region, the second region, and the third region; and moving the grasper grasping the object along the moving route, wherein the third region includes: a first control region in which a pressing control is performed as the force control, and a second control region in which a repulsive control is performed as the force control, and in the pressing control, the object is pressed against a surface of the obstacle substantially orthogonally under a first pressing condition, and in the pressing control, further changing the destination and recalculating the moving route when a contact area between a pressing surface of the object and a pressing surface of the obstacle is smaller than a contact threshold.
5. A computer program product including programmed object-handling instructions embodied in and stored on a non-transitory computer readable medium, wherein the instructions, when executed by a computer, cause the computer to perform: acquiring at least object information and status information, the object information representing an object grasped by a grasper, the status information representing an initial position and a destination of the object; setting, when the grasper grasping the object moves from the initial position to the destination, a first region, a second region, and a third region in accordance with the object information and the status information, the first region being a region in which the grasper is allowed to move without being restricted by an obstacle present in a space between the initial position and the destination, the second region being a region in which the grasper is restricted from moving due to the obstacle, the third region at least part of which is set below the second region, the third region being a region in which the grasper is operated under force control; calculating a moving route along which the object is moved from the initial position to the destination with reference to the first region, the second region, and the third region; and moving the grasper grasping the object along the moving route, wherein the third region includes: a first control region in which a pressing control is performed as the force control, and a second control region in which a repulsive control is performed as the force control, and in the pressing control, the object is pressed against a surface of the obstacle substantially orthogonally under a first pressing condition, and in the pressing control, further changing the destination and recalculating the moving route when a contact area between a pressing surface of the object and a pressing surface of the obstacle is smaller than a contact threshold.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
(38) According to one embodiment, in general, an object handling control device includes one or more processors. The one or more processors are configured to acquire at least object information and status information, the object information representing an object grasped by a grasper, the status information representing an initial position and a destination of the object; set, when the grasper grasping the object moves from the initial position to the destination, a first region, a second region, and a third region in accordance with the object information and the status information, the first region being a region in which the grasper is allowed to move without being restricted by an obstacle present in a space between the initial position and the destination, the second region being a region in which the grasper is restricted from moving due to the obstacle, the third region at least part of which is set below the second region, the third region being a region in which the grasper is operated under force control; and calculate a moving route along which the object is moved from the initial position to the destination with reference to the first region, the second region, and the third region.
(39) The following will describe embodiments with reference to the accompanying drawings. Features of embodiments and functions and results or effects attained by the features are presented by way of example only, and are not intended to limit the scope of the embodiments.
(40) Overall System Configuration
(42) The manipulator 20 includes an arm 21 and a hand 22, i.e., an end effector or a grasper. For example, the arm 21 serves as a multi-joint robot including rotational parts 21a to 21f that are rotated by a plurality of (for example, six) servomotors. The manipulator 20 may include a replaceable hand 22. In this case, the hand 22 may be replaced using a mechanism such as a tool changer (not illustrated). The arm 21 is not limited to the multi-joint robot, and may be a SCARA robot, a linear motion robot, or a combination of at least two of a multi-joint robot, a SCARA robot, and a linear motion robot.
(43) In the example of
(44) The grasp mechanism for the object OBJ is not limited to the suction mechanism, and may include a jamming mechanism, a pinching mechanism, or a multi-finger mechanism, for example. The hand 22 may include at least two of a suction mechanism, a jamming mechanism, a pinching mechanism, and a multi-finger mechanism.
(45) The hand 22 includes a bend 22b. The hand 22 can further include a movable part such as a rotational part or an extendable part in addition to the bend 22b, for example.
(46) As the sensor 30, the hand 22 includes a six-axis force sensor 31, that is, a sensor detecting three-axis translational force and three-axis moment. The position of the force sensor 31 is not limited to the one illustrated in the drawing. The hand 22 may include various sensors other than the force sensor 31. As described later, in the present embodiment, to fill a given space with objects OBJ without a gap, the hand 22 can feed back a detected value of the force sensor 31 when placing the object OBJ, and press the object OBJ against an obstacle at a moving destination under force control. Such force control, which presses the object OBJ against the obstacle at the moving destination, is referred to as pressing control. Additionally, in the embodiment, if the hand 22 or the object OBJ interferes with an obstacle, the hand 22 and the object OBJ can be moved away from the interfering obstacle under force control based on the detected value of the force sensor 31. Such force control, which moves the hand 22 and the object OBJ away from the interfering obstacle, is referred to as repulsive control.
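The pressing control and repulsive control described above can be illustrated as two simple force-feedback laws. The following sketch is not taken from the patent; the function names, gains, and thresholds are all hypothetical, and a real implementation would act on the full six-axis wrench rather than a scalar force.

```python
# Illustrative sketch of the two force-control modes described above.
# All names, gains, and thresholds are hypothetical assumptions.

def pressing_step(measured_force, target_force, gain=0.001):
    """Pressing control: advance the hand along the pressing direction until
    the force sensed against the obstacle reaches the target value.

    Returns a position increment along the pressing axis (metres):
    positive -> keep pressing; ~0 -> target force reached.
    """
    return gain * (target_force - measured_force)

def repulsive_step(measured_force, contact_threshold=2.0, gain=0.002):
    """Repulsive control: if an unexpected contact force is detected,
    return a retreat increment that moves the hand away from the
    interfering obstacle; otherwise stay on the planned route."""
    if abs(measured_force) <= contact_threshold:
        return 0.0  # no significant contact
    return -gain * measured_force  # back off in proportion to the contact force
```

For example, `pressing_step(3.0, 5.0)` returns a positive increment (keep pressing), while `repulsive_step(10.0)` returns a negative one (retreat).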
(47) The sensor 30 may be a torque sensor, a rotation sensor, or a current sensor of a motor included in each of the rotational parts 21a to 21f in addition to the force sensor 31. In this case, the control device 10 may find force acting on the hand 22 from the detected value of the force sensor 31, or may calculate the force from a current value or a rotation value (detected value of the rotation sensor) of each motor.
(48) The sensor 30 of the object handling system 1 includes a plurality of cameras 32a and 32b and a plurality of laser range scanners 33a and 33b, for example. The cameras 32a and 32b are, for example, RGB image cameras. The laser range scanners 33a and 33b are two-dimensional scanning optical ranging sensors that measure a distance to the object OBJ with scanning light, and may also be called laser rangefinders (LRFs) or light detection and ranging (LIDAR) sensors.
(49) The camera 32a is located at the initial position HP of the object OBJ to be grasped and moved or conveyed, and images the object OBJ and the surroundings thereof from above to acquire object information (such as a shape or a size) and status information (such as a stationary pose) on the object OBJ. At the initial position HP, the object OBJ is housed in a container 14a such as a stowage or a palette. In such a case, the camera 32a generates an image of all or part of the inside of the container 14a. The initial position HP may also be referred to as a motion start position or a departure position of the object OBJ. In
(50) The camera 32b is located at the moving destination RP, where the object OBJ is released, arranged, or placed, and images the moving destination RP and the surroundings thereof to acquire the object information (such as a shape or a size of a previously set object) and status information (such as a pose of a previously set object). At the moving destination RP, the object OBJ is housed in a release-purpose container 14b such as a stowage or a palette, for example. In such a case, the camera 32b generates an image of all or part of the inside of the container 14b. The moving destination RP can also be referred to as a moving stop position or an arrival position of the object OBJ. In
(51) The laser range scanners 33a and 33b acquire the object information, such as the size of the object OBJ (including a size of a surface that cannot be captured by the camera 32a), and the status information, such as a grasping pose of the object OBJ by the hand 22, while the arm 21 moves the object OBJ grasped with the hand 22 from the initial position HP to the moving destination RP. The object information acquired by the cameras 32a and 32b and the laser range scanners 33a and 33b can be used in positioning control for dense placement of the objects OBJ in the container 14b at the moving destination RP.
(52) The housing 40 may house, for example, various components or elements and devices of the object handling system 1 such as a power supply for driving an electric actuator such as a motor, a cylinder for driving a fluid actuator, a tank, a compressor, and various safety mechanisms. The housing 40 can house the control device 10.
(53) Control Device
(55) The integrator 51 generates, executes, and manages a work plan of the object handling system 1 on the basis of user input information from an external interface (I/F) 71, the state of the object handling system 1, and the detected value of the sensor 30.
(56) The image processor 52 receives and processes images from the cameras 32a and 32b serving as the sensor 30 to generate information used in motion planning, motion control, error detection, and learning.
(57) The signal processor 53 receives and processes information, i.e., detected values, from the laser range scanners 33a and 33b serving as the sensor 30, to generate information used in motion planning, motion control, and error detection. The image processor 52 and the signal processor 53 are examples of an information acquirer.
(58) The grasp plan generator 54 calculates a grasping method and a grasping pose of the object OBJ at the initial position HP, and a moving route and via points along which the manipulator 20 or hand 22 is moved to the initial position HP. The grasp plan generator 54 also calculates a moving route and via points of the hand 22 to grasp a next intended object OBJ after releasing the object OBJ at the moving destination RP. In these cases, the object information acquired by the camera 32a is utilized in calculation of the moving route and via points to move the hand 22 without interfering with surrounding obstacles such as wall surfaces of the containers 14a and 14b or an object or objects other than the currently moved object OBJ.
(59) To move the hand 22 grasping the object OBJ from the initial position HP to the moving destination RP, the region setter 55 sets regions in a space between the initial position HP and the moving destination RP with reference to the object information and the status information acquired by the cameras 32a and 32b and the laser range scanners 33a and 33b. The region setter 55 sets a first region, a second region, and a third region, for example. In the first region, the hand 22 is allowed to move without being restricted by obstacles such as the containers 14a and 14b and a previously set object in the space between the initial position HP and the moving destination RP. In the second region, the hand 22 is restricted from moving due to the presence of obstacles. At least part of the third region is set below the second region, and in the third region the hand 22 is moved under force control. In the first region, the hand 22 is movable at a higher speed, for example. In the second region, the hand 22 is restricted or prohibited from passing. In the third region, the force sensor 31 detects force, allowing the hand 22 to correct the moving route under repulsive control if the object OBJ or the hand 22 interferes with an obstacle. Additionally, in the third region, the moving speed of the hand 22 (object OBJ) may be lowered, or the sensing accuracy of the force sensor 31 may be temporarily increased, for example.
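The per-region behavior described above (free motion in the first region, restricted passage in the second, force control at reduced speed in the third) could be sketched as a lookup of per-region motion parameters. The labels and numeric values below are illustrative assumptions, not values from the patent.

```python
# Hypothetical mapping from the three regions to motion constraints.
# Labels, speeds, and entry rules are assumptions for illustration.

REGION_PARAMS = {
    "first":  {"max_speed": 1.0, "entry": "free",       "control": "position"},
    "second": {"max_speed": 0.0, "entry": "prohibited", "control": None},
    # Third region: reduced speed, force control (pressing/repulsive) active.
    "third":  {"max_speed": 0.1, "entry": "lateral",    "control": "force"},
}

def motion_params(region):
    """Return the motion constraints for a waypoint lying in the given region."""
    return REGION_PARAMS[region]
```

A route planner could then query `motion_params` per waypoint to decide the commanded speed and whether to hand control over to the force controller.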
(60) The above force control includes pressing control and repulsive control. The region setter 55 can set part of the third region as a subject of the pressing control of the force control and set the rest thereof as a subject of the repulsive control of the force control. In this case, in the region subjected to the pressing control, the hand 22 may be generally restricted or prohibited as in the second region, and the object OBJ and the hand 22 may be allowed to pass therethrough under exceptional pressing control.
(61) The route calculator 56 calculates the moving route of the object OBJ from the initial position HP to the moving destination RP with reference to the first region, the second region, and the third region. For example, the route calculator 56 calculates the moving route including a moving method and a speed of the hand 22 to smoothly move the grasped object OBJ from the initial position HP to the moving destination RP. Setting the first region, the second region, and the third region and switching the behavior of the hand 22 moving along the moving route among the regions makes it possible to efficiently move or convey the object OBJ at a higher speed while lowering a risk of interfering with the obstacles. Setting the first region, the second region, and the third region, and moving the hand 22 or object OBJ in the regions will be described in detail later.
(62) The robot controller 57 serves to control speed, orientation, and pose of motion of the manipulator 20 including the hand 22 so as to move the hand 22 grasping the object OBJ along the moving route.
(63) The peripheral and I/O controller 58 controls inputs and outputs for controlling various conveyance devices and a peripheral device 70 such as a safety door, acquiring various kinds of sensor information, and controlling lighting.
(64) The learning controller 59 controls a learning function of the object handling device, including robot model learning for improving the accuracy of motion of the manipulator 20 such as vibration suppression, grasping control parameter learning and grasp database learning for improving the capability to grasp the object OBJ, and error detection learning for improving the feasibility of a work plan. In the embodiment, the learning controller 59 selects an optimum parameter value in the force control depending on the situation. The operator may set such values empirically; however, applying results of machine learning enables more efficient processing with less labor.
(65) The error detector 60 monitors the state of the object handling system 1, the progress of a work plan, the controlled state of the system 1, and the state of the object OBJ while it is grasped, moved, or conveyed, for error detection. The error detector 60 can implement this error detection, for example, by monitoring a fingertip coordinate value of the hand 22 and determining an error when the value exceeds a predefined value. The fingertip coordinate value is obtained by converting the output of the force sensor 31 and passing it through a low-pass filter. As a result, the object handling system 1 can interrupt the work in progress and proceed to a recovery operation.
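The filter-then-threshold idea above can be sketched with a first-order low-pass filter (an exponential moving average) over a force-derived signal. The filter constant and threshold here are assumptions; the patent does not specify the filter design.

```python
# Minimal sketch of the error-detection scheme described above:
# low-pass filter a force-derived signal and flag an error when the
# filtered value exceeds a predefined threshold.

def detect_error(samples, threshold, alpha=0.2):
    """First-order low-pass (exponential moving average) over the signal;
    returns True as soon as the filtered value exceeds the threshold.

    A brief spike is attenuated by the filter, while a sustained
    excessive force trips the detector."""
    filtered = 0.0
    for x in samples:
        filtered = alpha * x + (1.0 - alpha) * filtered
        if abs(filtered) > threshold:
            return True
    return False
```

With `alpha=0.2` and `threshold=5.0`, a single spike of 10 in an otherwise quiet signal is filtered down to 2.0 and does not trip the detector, whereas a sustained force of 10 does.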
(66) The internal database (DB) 61 contains, for example, a robot database, a hand database, an article database, a grasp database, and an environment database, none of which are illustrated.
(67) The robot database (DB) stores, for example, the structure of the object handling system 1, the dimensions, weights, and moments of inertia of the respective elements, an operational range, and the speed and torque performance of each driver.
(68) The hand database (DB) stores, for example, functions of the hand 22 and information as to grasping characteristics of the hand 22.
(69) The article database (DB) stores, for example, various kinds of information on the object OBJ such as a name, an identification number, a category, image information of the entire surface, CAD model information, weight information, and grasping characteristic information (e.g., soft, fragile, or deformable). In the embodiment, the objects OBJ have different shapes and sizes, and are introduced into the initial position HP in a mixed state. Densely arranging the mixed objects OBJ at the moving destination RP can increase the amount of the objects OBJ to be accommodated in the container 14b at a time, and enhance the efficiency of physical distribution.
(70) The grasp database (DB) stores, for the object OBJ, score information such as a graspable position and pose and grasping capability, a possible pressing amount at the time of grasping, a grasp determination threshold, and a determination threshold for error detection, for each grasping method of the hand 22, for example. Examples of the grasping method include suction, a parallel two-finger method, a parallel four-finger method, and a multi-joint method.
(71) The environment database (DB) stores, for example, workbench information compatible with the object handling system 1 and surrounding information representing an operational range of the object handling system 1 and surrounding obstacles.
(72) The external I/F 71 serves to transmit and receive data between the integrator 51 (control device 10) and an external apparatus (not illustrated).
(73) Schematic Control and Operation of Object Handling System or Device
(74) The object handling system 1 operates in accordance with a moving work plan for all of the objects OBJ provided from a high-order system via the external I/F 71.
(76) First, the integrator 51 performs an order check with reference to a moving work plan instructed by the high-order system, as the planning operation. In this case, the camera 32a recognizes the objects OBJ in the container 14a at the initial position HP, and the camera 32b recognizes previously set objects in the container 14b at the moving destination RP to acquire the object information and the status information. The grasp plan generator 54 then generates a grasp plan including the order of grasping the objects OBJ. The grasp plan generator 54 also generates a plan of a moving or grasping route to the object OBJ. As the robot arm operation, the robot controller 57 moves the hand 22 from a grasp standby position (home position) to the initial position HP along the generated grasp route. The robot controller 57 causes the hand 22 or the suction pad 22a to grasp the object OBJ at the grasp position by the suction pad operation. The robot controller 57 causes the hand 22 grasping the object OBJ to move to a moving standby position (for example, in the measuring regions of the laser range scanners 33a and 33b), and causes, while the hand is moving, the laser range scanners 33a and 33b (for example, LRFs) to estimate the pose of the object OBJ including a size and a grasping pose, as the recognition operation.
(77) As the planning operation, the region setter 55 sets the first region, the second region, and the third region on the basis of the object information and the status information acquired by the recognition operation, and the route calculator 56 generates a moving-route plan for the hand 22 grasping the object OBJ to place the object OBJ.
(78) As the robot arm operation, the robot controller 57 then causes the hand 22 grasping the object OBJ at the moving standby position to move to the moving destination RP along the moving route. As the suction pad operation, the robot controller 57 releases the grasp of the suction pad 22a at the moving destination RP, and sets the object OBJ at the moving destination RP. Then, the robot controller 57 causes the hand 22 having released the object OBJ to return to the grasp standby position, completing a series of motions for moving, transferring, or placing the object OBJ, and repeatedly performs these operations to move the next object OBJ.
(79) Outline of Object Placement Plan
(81) To create the placement plan, for example, the route calculator 56 acquires, from the camera 32a and the laser range scanners 33a and 33b, input information including the grasping pose and the size of the object OBJ currently grasped by the hand 22 or suction pad 22a and to be moved and placed at the moving destination RP (container 14b) (S100). Subsequently, the route calculator 56 calculates the pose of the currently grasped object OBJ to place the object OBJ in the container 14b in accordance with the grasping pose and the size of the object OBJ (S102). The route calculator 56 calculates a placeable position and a pose of the grasped object OBJ, that is, a candidate for the moving destination RP, on the basis of the status information of previously set objects OBJs in the container 14b (S104). The route calculator 56 calculates a plurality of patterns of position and pose candidates of a fingertip TCP of the hand 22 in placing the object OBJ, on the basis of information on the previously set objects OBJs in the container 14b (S106), and selects an optimal position and pose candidate from the patterns (S108). The route calculator 56 also sets information including a target force value, a position of a pressing surface, and a pressing direction used in the force control by the manipulator 20 (hand 22) during movement or placement (S110). The route calculator 56 calculates via points (via position and pose RAP) on the moving route of the object OBJ from the size and the grasping pose of the object OBJ, a state of the previously set objects OBJs, and the position and pose of the fingertip TCP at the time of placing the object OBJ (S112). The moving destination RP and the via position and pose RAP calculated as the candidates are associated with scores such as preset priority, and the route calculator 56 selects an optimum moving destination RP and via position and pose RAP according to the scores.
After the route calculator 56 succeeds in generating the moving route not to interfere with obstacles such as the previously set objects OBJs or the container 14b, the robot controller 57 causes the manipulator 20 to operate.
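The final selection step above (candidates associated with scores such as preset priority, from which an optimum is chosen) might look like the following. The candidate structure and field names are hypothetical; the patent does not define how candidates are represented.

```python
# Illustrative sketch of the candidate-selection step: each candidate
# destination / via-pose pair carries a precomputed priority score, and the
# planner picks the best-scoring one. Field names are assumptions.

def select_candidate(candidates):
    """candidates: list of dicts with at least a numeric 'score' entry.
    Returns the candidate with the highest score, or None if the list
    is empty (i.e., no feasible placement was found)."""
    if not candidates:
        return None
    return max(candidates, key=lambda c: c["score"])
```

If `select_candidate` returns `None`, the planner would have to fall back, for example by relaxing constraints or reporting that the container is full.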
(82) In the embodiment, the position and pose of the object OBJ or the hand 22 is represented by a homogeneous transformation matrix T of four rows and four columns. For example, a position and pose .sup.WT.sub.RP of the moving destination RP is represented by the following Expression 1:
(83) .sup.WT.sub.RP=[R t; 0 0 0 1] (Expression 1)
where the upper-left superscript represents a coordinate system before transformation, the lower-right subscript represents a coordinate system after transformation, R represents a three-row, three-column rotation matrix of the pose, t represents a three-row, one-column translation vector of the position, and W represents the world coordinate system.
(84) In coordinate transformation, the following Expression 2 holds:
.sup.UT.sub.V=.sup.UT.sub.S.sup.ST.sub.V. (Expression 2)
where S, U, and V each represent an optional coordinate system.
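Expression 2 can be worked through numerically with plain 4x4 homogeneous transformation matrices. The sample transforms below (two pure translations) are hypothetical and chosen only so that the composed result is easy to verify by hand.

```python
# Worked example of Expression 2 (the composition  ^U T_V = ^U T_S * ^S T_V)
# using nested lists as row-major 4x4 matrices, with no external libraries.

def mat4_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists (row-major)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(tx, ty, tz):
    """Homogeneous transform with identity rotation R and translation t."""
    return [[1, 0, 0, tx],
            [0, 1, 0, ty],
            [0, 0, 1, tz],
            [0, 0, 0, 1]]

u_T_s = translation(1.0, 0.0, 0.0)   # frame S expressed in U
s_T_v = translation(0.0, 2.0, 0.0)   # frame V expressed in S
u_T_v = mat4_mul(u_T_s, s_T_v)       # translations compose: t = (1, 2, 0)
```

The translation part of `u_T_v` (its fourth column) is (1, 2, 0), as expected from composing the two translations.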
(85) Expression 3 also holds as follows:
(86) A(i,j)=a.sub.ij,A(i,:)=[a.sub.i1 . . . a.sub.in],A(:,j)=[a.sub.1j . . . a.sub.mj].sup.T (Expression 3)
where A represents a matrix, A(i, j) represents an operation of extracting an element of the i-th row and the j-th column, and A(i, :) and A(:, j) represent operations of extracting an i-th row vector and a j-th column vector, respectively.
(87) Similarly, the following Expression 4 holds:
u=[u.sub.1 . . . u.sub.n].sup.T,u(i)=u.sub.i (Expression 4)
where u represents a vector and u(i) represents an operation of extracting the i-th element from the vector.
(88) In the embodiment, a rectangular parallelepiped region is represented by Expression 5, similarly to the coordinate system and size of the object:
(89) Area=(.sup.UT.sub.Area,.sup.US.sub.Area) (Expression 5)
where .sup.UT.sub.Area represents a coordinate system of the centroid position and a vector .sup.US.sub.Area represents a size thereof.
(90) The object handling system 1 or object handling device is intended to move and arrange as many objects OBJ as possible in the container 14b being a moving destination. That is, the packing density is to be improved. With an improved packing density, the object handling system 1 can reduce the number of containers 14b for shipment, contributing to a reduction in transportation cost. The object handling system 1 is likely to increase the packing density by arranging the objects OBJ in a small space So, as illustrated in
(91) To avoid such a collision, for example, the size of the object OBJ can be set, at the time of calculating the placement position or the moving route, to a value larger than the actual size, that is, a value including a margin. Setting a uniformly large margin, however, may make it difficult to generate or set up a placement plan for the object OBJ in a small space in the container 14b.
(92) In view of this, in the embodiment, the object handling system 1 or object handling device performs the pressing control and the repulsive control under the force control to effectively utilize a small space. As illustrated in
(93) To efficiently calculate the moving route for moving and placing the object OBJ, the object handling system 1 of the embodiment sorts possible interference with the obstacle, such as the wall 14W of the container 14b or the previously set objects OBJs, into vertical interference and horizontal interference. Vertical interference refers to interference between the bottom of the object OBJ and the top surface of the previously set objects OBJs or the top end of the container 14b as a moving destination in the case of moving the object OBJ in the direction of arrow R, referring to the upper drawing of
(94) Thus, to move the hand 22 grasping the object OBJ from the initial position HP to the moving destination RP, the region setter 55 of the object handling system 1 of the embodiment classifies the space between the initial position HP and the moving destination RP into the first region, the second region, and the third region with different characteristics.
(95) In the first region, the hand 22 is allowed to move without being restricted by obstacles such as the wall 14W of the container 14b or a previously set object in the space between the initial position HP and the moving destination RP.
(96) In the second region, the hand 22 is restricted from moving due to the presence of obstacles. For example, the hand 22 is restricted from entering or moving from above the obstacle to below it. To avoid interference, the fingertip TCP of the hand 22 is prohibited from passing through the second region while moving downward along a planned moving route. In the second region, the hand 22 may additionally be restricted from moving upward from below or moving laterally. The second region may be set as a motion-prohibited region in which the hand 22 or the object OBJ is prohibited from entering or moving.
(97) At least part of the third region is set below and adjacent to the second region, and includes a first control region and a second control region, for example. In the third region, the hand 22 is operated under the force control as described later. When the third region is set below the second region, at least part of the first region may be set between the second region and the third region. The first control region of the third region is set below the second region along the obstacle, and includes a smaller margin with respect to the obstacle than the second region, that is, has a smaller lateral width than the second region. In the lateral direction, the first control region is set adjacent to the entire bottom of the second region. The hand 22 enters the first control region from the lateral direction with respect to the obstacle. In other words, setting the first control region below the second region prevents the object OBJ or the hand 22 from approaching the first control region from above through the second region; the object OBJ or the hand 22 enters and approaches the inside of the first control region only from the lateral direction. The first control region is laterally adjacent to the obstacle below the second region, and the hand 22 is operated therein under the pressing control of the force control. As in the second region 100, the hand 22 may be prohibited in principle from moving in or entering the first control region, for example from above. In this case, if such a motion or entry prohibition makes it difficult to create the route, or if a created route is inefficient, the hand 22 may exceptionally be allowed to move in or enter the first control region under the condition that the hand 22 is operated under the pressing control.
(98) The second control region of the third region is adjacent to the side of the first control region opposite to the obstacle, and serves as an entry route to the first control region. At least the part of the second control region on the first control region side is located below the second region; preferably, the entire second control region is located below the second region in the lateral direction. In the second control region, the hand 22 is operated under the repulsive control of the force control. That is, in the second control region, the hand 22 is repulsively operated to move away from an interfering obstacle, where interference may occur because the actual size of the object OBJ grasped by the hand 22 differs from the expected size.
(99) In the second control region, the behavior of at least the hand 22 may be limited so as to allow the pressing control in the first control region. In this case, the limitation on the behavior of the hand 22 in the second control region is, for example, a limitation on its moving speed. In the second control region, the moving speed of the hand 22 is set lower than the passing speed in the first region, to allow switching to the pressing control of the force control at appropriate timing, that is, at the time of entering the first control region. Alternatively, the third region may exclude the second control region in which the hand 22 is operated under the repulsive control, and may be entirely set to the first control region in which the hand 22 is operated under the pressing control.
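The per-region speed limitation described above can be illustrated with a small sketch. The function name and the specific speed ratios below are hypothetical examples, not values from the embodiment:

```python
def speed_limit_for(region_kind: str, free_speed: float) -> float:
    """Return a per-region speed limit for the hand: full speed in the
    first region, a reduced speed in the second control region (so the
    switch to pressing control happens at a controllable pace), an even
    lower speed in the first control region, and no motion in the second
    region. The ratios 1/2 and 1/3 are illustrative only."""
    limits = {
        "first": free_speed,          # first region: unrestricted passage
        "repulse": free_speed / 2.0,  # second control region (repulsive control)
        "press": free_speed / 3.0,    # first control region (pressing control)
        "restricted": 0.0,            # second region: entry prohibited
    }
    return limits[region_kind]
```

A motion controller could query this limit each time the planned route crosses a region boundary.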
(100) The region setter 55 sets the first region, the second region, and the third region including the first control region and the second control region according to the object information and the status information acquired by the cameras 32a and 32b and the laser range scanners 33a and 33b. If there is an overlapping region between the second region and the third region, the region setter 55 gives priority to the third region and handles the overlapping region as the third region. Likewise, if an overlapping region is found between the first control region of the third region and the second region, the region setter 55 gives priority to the first control region and handles the overlapping region as the first control region.
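The overlap-priority rule of the region setter 55 can be sketched as follows. The region labels and priority values are hypothetical; only the ordering (third region over second region, first control region first) reflects the rule above:

```python
# Higher value wins when two regions overlap: the third region (pressing or
# repulsive control) takes priority over the second region, and within the
# third region the first control region (pressing) takes priority.
PRIORITY = {"press": 3, "repulse": 2, "restricted": 1, "free": 0}

def resolve_overlap(kind_a: str, kind_b: str) -> str:
    """Return the region kind that governs an overlapping volume."""
    return kind_a if PRIORITY[kind_a] >= PRIORITY[kind_b] else kind_b
```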
(101) For example, when the object OBJ approaches the previously set object OBJs along the wall 14W of the container 14b from above as illustrated in the upper drawing of
(102) With reference to
(103) First, with reference to
(104) Specifically, for avoiding vertical interference with each previously set object OBJs, the position and pose .sup.TOTET.sub.KeepoutZ,i and size .sup.TOTES.sub.KeepoutZ,i of the fingertip TCP in the passage restricted region are calculated by the following Expression 6 and Expression 7:
(105)
where .sup.TOTET.sub.Obstacle,i represents the position and pose of the previously set object OBJs (i=1, . . . , n: n is the number of the objects OBJs), and .sup.TOTES.sub.Obstacle,i represents the size thereof.
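Expression 6 and Expression 7 themselves are not reproduced here, but the vertical passage-restricted region can be illustrated as an obstacle box inflated by per-axis margins. The following sketch is a plausible axis-aligned approximation; the function name and the exact inflation rule are assumptions, not the patent's formulas:

```python
def keepout_z(obstacle_center, obstacle_size, margin):
    """Inflate an obstacle's axis-aligned box by per-axis margins
    (dx, dy, dz) to form a vertical passage-restricted box for the
    fingertip TCP, growing upward so entry from above is blocked.
    This is an illustrative approximation, not Expression 6/7 itself."""
    cx, cy, cz = obstacle_center
    sx, sy, sz = obstacle_size
    dx, dy, dz = margin
    size = (sx + 2 * dx, sy + 2 * dy, sz + dz)
    center = (cx, cy, cz + dz / 2)
    return center, size
```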
(106) With reference to
(107) Specifically, for avoiding horizontal interference with each previously set object OBJs, the position and pose .sup.TOTET.sub.KeepoutXY,i,j and the size .sup.TOTES.sub.KeepoutXY,i,j of the fingertip TCP in the passage restricted region are calculated by the following Expression 8 and Expression 9:
(108)
where .sup.TOTET.sub.Obstacle,i represents the position and pose of the previously set object OBJs (i=1, . . . , n: n is the number of the objects OBJs), .sup.TOTES.sub.Obstacle,i represents the size thereof, and j=1, 2, 3, and 4, where the value of j indicates the ±X-directions and ±Y-directions.
(109) As described above, the object OBJ horizontally approaching can be regarded as being less likely to interfere with the obstacle than the object OBJ vertically approaching. Thus, the respective margin values can be set such that δ.sub.x>ε.sub.x, δ.sub.y>ε.sub.y, and δ.sub.z>ε.sub.z.
(110) Subsequently, the second region 100 is calculated, considering the position and the size of the grasped object OBJ.
(111) First, as illustrated in
(112) In this case, as to vertical interference, what matters is the bottom of the grasped object OBJ interfering with the obstacle. Regarding horizontal interference, when the bottom of the object OBJ is not interfering with the obstacle, the side surfaces can be regarded as not interfering with the obstacle. That is, the second region 100 can be set considering the position and the size of the bottom of the object OBJ alone. Thus, as the second region 100 considering the position and the size of the grasped object OBJ, a second region 100d.sub.1 can be calculated in the X and Y directions by widening the margin of the second region 100d.sub.0, which does not reflect the sizes of the grasped object OBJ and the hand 22, by Δx.sub.Item− and Δx.sub.Item+ in the X-direction and by Δy.sub.Item− and Δy.sub.Item+ in the Y-direction in the top view of
(113) The position and pose .sup.TOTET.sub.KeepoutItem,i and the size .sup.TOTES.sub.KeepoutItem,i of the second region 100, reflecting the grasped object OBJ, through which the fingertip TCP is prohibited from passing, are calculated by the following Expression 10 and Expression 11:
(114)
where .sup.TOTET.sub.Keepout,i represents the position and pose of the second region 100 calculated by Expression 6 to Expression 9 (i=1, . . . , n: n is the number of the objects OBJs), and .sup.TOTES.sub.Keepout,i represents the size thereof. The displacement amounts of the side surfaces of the object OBJ from the fingertip TCP, Δx.sub.Item− and Δx.sub.Item+ in the X-direction and Δy.sub.Item− and Δy.sub.Item+ in the Y-direction, are calculated by the following Expression 12:
(115)
where .sup.TOTET.sub.RP represents the position and pose of the fingertip TCP when placed, .sup.TOTET.sub.Item represents the position and pose of the object OBJs, and .sup.TOTES.sub.Item represents the size of the object OBJs.
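Expression 12 is not reproduced here, but under the assumption that the grasped object is an axis-aligned box, the displacement amounts from the fingertip TCP to the object's side surfaces can be illustrated as follows (the helper name is hypothetical):

```python
def item_offsets(tcp_xy, item_center_xy, item_size_xy):
    """Distances from the fingertip TCP to the four side surfaces of the
    grasped object along X and Y, used to widen the second region.
    An illustrative form of the displacement amounts, not Expression 12
    itself."""
    tx, ty = tcp_xy
    cx, cy = item_center_xy
    sx, sy = item_size_xy
    dx_minus = tx - (cx - sx / 2)  # TCP to the -X side surface
    dx_plus = (cx + sx / 2) - tx   # TCP to the +X side surface
    dy_minus = ty - (cy - sy / 2)  # TCP to the -Y side surface
    dy_plus = (cy + sy / 2) - ty   # TCP to the +Y side surface
    return dx_minus, dx_plus, dy_minus, dy_plus
```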
(116) Subsequently, the second region 100 is calculated, considering the position and the size of the hand 22 grasping the object OBJ.
(117) First, as illustrated in
(118) As with vertical interference of the object OBJ, vertical interference of the bottom surface of the hand 22 matters. Regarding horizontal interference, when the bottom surface of the hand 22 does not interfere with any obstacle, the side surfaces thereof can be regarded as not interfering with any obstacle. That is, the second region 100 can be set considering the position and the size of the bottom of the hand 22 alone. Thus, in the case of setting the second region 100 considering the position and the size of the hand 22, the second region 100 (second region 100d.sub.2) can be calculated in the X and Y directions by widening the second region 100d.sub.0, reflecting the position and the size of the object OBJ, by margins Δx.sub.hand− and Δx.sub.hand+ in the X-direction and Δy.sub.hand− and Δy.sub.hand+ in the Y-direction, as illustrated in
(119) In this case, the position and pose .sup.TOTET.sub.KeepoutHand,i and the size .sup.TOTES.sub.KeepoutHand,i of the second region 100, set considering the hand 22, through which the fingertip TCP is prohibited from passing, are calculated by the following Expression 13 and Expression 14.
(120)
where .sup.TOTET.sub.Keepout,i represents the position and pose of the second region 100 through which the fingertip TCP is prohibited from passing, set by Expression 6 to Expression 9 (i=1, . . . , n: n is the number of the objects OBJs), and .sup.TOTES.sub.Keepout,i represents the size thereof.
(121) Positions of side surfaces of a circumscribing rectangular parallelepiped of the hand 22 from the fingertip TCP are represented by Δx.sub.Hand− and Δx.sub.Hand+ in the X-direction and Δy.sub.Hand− and Δy.sub.Hand+ in the Y-direction and calculated by the following Expression 15:
(122)
where the circumscribing rectangular parallelepiped is perpendicular to and parallel with the coordinate system of the container 14b being a moving destination, .sup.TOTET.sub.RP represents the position and pose of the fingertip TCP at the placement position, .sup.TOTET.sub.Hand represents the position and pose of the circumscribing rectangular parallelepiped of the hand 22, and .sup.TOTES.sub.Hand represents the size thereof.
(123) The second region 100 (second region 100d.sub.1) reflecting the object OBJ grasped by the hand 22 and the second region 100 (second region 100d.sub.2) reflecting the hand 22 can be combined together to set or calculate a definitive second region 100.
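Combining the object-widened region and the hand-widened region into a definitive second region can be sketched as a union of axis-aligned boxes (helper names are illustrative):

```python
def in_box(point, center, size):
    """True if a point lies inside an axis-aligned box."""
    return all(abs(p - c) <= s / 2 for p, c, s in zip(point, center, size))

def in_second_region(point, boxes):
    """The definitive second region is taken here as the union of the
    region widened for the grasped object (100d1) and the region widened
    for the hand (100d2); the fingertip TCP may not pass through any of
    the boxes in this union."""
    return any(in_box(point, c, s) for c, s in boxes)
```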
(124) The following describes setting the second control region 102b of the third region 102. As described above, because the second region 100 is above and adjacent to the first control region 102a, the object OBJ and the hand 22 do not approach the first control region 102a of the third region 102 vertically from above, but approach it horizontally, that is, laterally. In the second control region 102b of the third region 102, if the object OBJ or the hand 22 interferes with or contacts the obstacle, the object OBJ and the hand 22 can be prevented from being damaged by the repulsive control of the force control. In the repulsive control, the force acting on the object OBJ and the hand 22 is accurately controlled, and the measured value of the force sensor 31 is fed back so that no abrupt external force is applied, thereby causing the object OBJ and the hand 22 to approach the obstacle at a lower speed.
(125) Meanwhile, the pressing control of the force control in the first control region 102a includes first pressing control and second pressing control. The first pressing control is performed against the wall 14W or the side surface of a previously set object OBJs while the object OBJ is moving. The second pressing control is performed on the object OBJ for placement at the moving destination RP. The first pressing control enables the object OBJ to pass through the first control region 102a by setting the margin of the first control region 102a to substantially zero when generating the moving route in a small space as described above. Thus, in the first pressing control, the object OBJ is pressed against the surface of the wall 14W in a substantially orthogonal direction, for example, with a pressing force of a given threshold or less at a moving speed of a given threshold or less, which are defined as a first pressing condition. The pressing force threshold and the moving speed threshold can be predetermined through testing in accordance with the type, shape, material, or resistance of the object OBJ.
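The first pressing condition can be checked with a trivial predicate. The function name is hypothetical, and the threshold values would, as stated above, be determined through testing:

```python
def first_pressing_allowed(pressing_force, moving_speed,
                           force_max, speed_max):
    """First pressing condition sketch: the object may be pressed
    substantially orthogonally against the wall only while both the
    pressing force and the moving speed stay at or below their given
    thresholds."""
    return pressing_force <= force_max and moving_speed <= speed_max
```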
(126) The second pressing control causes the hand 22 to slowly approach the first control region 102a from obliquely above to set the object OBJ at the moving destination RP, with the margin of the first control region 102a set to substantially zero as described above. Thus, in the second pressing control, the object OBJ is pressed against the side surface of the object OBJs or the surface of the wall 14W, for example, at a moving speed of a given threshold or less and at a given approach angle, which are defined as a second pressing condition. The given approach angle and the moving speed threshold can be predetermined through testing in accordance with the type, shape, material, or resistance of the object OBJ.
(127) As described above, the second control region 102b is set laterally adjacent to the first control region 102a on the side opposite to the obstacle. The second control region 102b is the part of the third region 102 in which the repulsive control of the force control is performed. In the second control region 102b, the object OBJ and the hand 22 are allowed to keep moving under the repulsive control even after interference with the obstacle is detected. In the second control region 102b, the behavior of at least the hand 22 may be limited to prepare for the pressing control in the first control region 102a. In this case, the behavioral limitation on the hand 22 is, for example, limiting the moving speed to one lower than that in the first region 104 (for example, ½ or ⅓ thereof).
(128) In the repulsive control of the force control, the hand 22 grasping the object OBJ moves while the measured value of the force sensor 31 is fed back to the control of the object OBJ and the hand 22. Under the repulsive control, the hand 22 is controlled to approach the wall 14W or the side surface of the object OBJs at a moving speed of the given threshold or less and to move apart from the contact surface before substantial reaction force occurs, that is, upon detecting contact. For example, even along a calculated moving route that does not interfere with the obstacle, the hand 22 may come into contact with the wall 14W or the side surface of the object OBJs due to measurement error of the sensor 30. In such a case, the repulsive control absorbs the measurement error, making it possible to prevent the hand 22 or the object OBJ from being damaged by a collision with the wall 14W or the side surface of the object OBJs.
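The repulsive behavior can be sketched as a single one-dimensional control step: advance toward the target at a capped speed, and retreat as soon as the force sensor reports contact. All names and numeric values are illustrative, not the embodiment's control law:

```python
def repulsive_step(pos, target, speed_cap, contact_force,
                   force_eps=0.5, retreat=0.01):
    """One 1-D control step of the repulsive behavior: move toward the
    target at a capped speed, but as soon as the measured contact force
    exceeds a small epsilon, step back away from the contact surface
    before substantial reaction force builds up."""
    if contact_force > force_eps:  # contact detected: move apart
        return pos - retreat
    step = min(speed_cap, abs(target - pos))
    return pos + step if target >= pos else pos - step
```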
(129) With reference to
(130) In
(131) In the example of
(132)
(133) To set the moving route R as illustrated in
(134) Specifically, the height list for interference check .sup.TOTEZ.sub.Check,j(j=1 . . . , 2n) can be calculated by the following Expression 16:
(135)
where .sup.TOTET.sub.Keepout,i represents the position and pose of the second region 100 through which the fingertip TCP is prohibited from passing (i=1 . . . , n), and .sup.TOTES.sub.Keepout,i represents the size thereof.
(136) The height list for interference check .sup.TOTEZ.sub.Check,j contains the heights in ascending order from the lowest height, such as H1 to H4.
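An illustrative form of this height list, assuming axis-aligned keepout boxes (the exact form of Expression 16 is not reproduced here):

```python
def height_list(keepout_boxes):
    """Collect the bottom and top Z of every passage-restricted box and
    return them in ascending order (H1, H2, ...), giving the heights at
    which interference must be checked."""
    heights = []
    for (_, _, cz), (_, _, sz) in keepout_boxes:
        heights += [cz - sz / 2, cz + sz / 2]
    return sorted(heights)
```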
(137) Next, the following describes calculation of the fingertip position TCP for avoiding the second region 100 through which the hand 22 is restricted from passing. The hand 22 grasping the object OBJ follows the moving route R to enter the container 14b from above, for example. At the time of planning the moving route R, however, the via points on the moving route R are set by back calculation from the moving destination RP to an entry position.
(138) Next, a next via point is set from RAP[0] as the initial via position or pose. In this case, the route calculator 56 extracts the second region 100 at each of heights for interference check, described with reference to
(139) First, among the apexes of the second region 100, apexes not included in another second region 100 are defined to be candidates for avoidance position Ps (in
(140)
where .sup.TOTET.sub.Keepout,i represents the position and pose of the second region 100 through which the fingertip TCP is prohibited from passing (i=1 . . . , n), and .sup.TOTES.sub.Keepout,i represents the size thereof.
(141) Subsequently, among intersections between the respective sides of the second region 100 and straight lines extending in the X-direction and the Y-direction from a pre-avoidance position, i.e., .sup.TOTEt.sub.RAP[0] in
(142)
(143) .sup.TOTEt.sub.AvoidCorner,i,j calculated by Expression 17 and .sup.TOTEt.sub.AvoidParallel,i,j calculated by Expression 18 are combined to set an avoidance-position candidate list .sup.TOTEt.sub.AvoidAll,i. In this candidate list, the point closest to the position .sup.TOTEt.sub.RAP[0] (pre-avoidance position) of RAP[0] is defined to be an avoidance position .sup.TOTEt.sub.Avoid. The avoidance position .sup.TOTEt.sub.Avoid can be calculated by the following Expression 19:
(144)
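The selection rule of Expression 19 — pick the candidate nearest the pre-avoidance position — can be illustrated as follows (the function name is hypothetical):

```python
import math

def choose_avoidance(pre_avoid, candidates):
    """From the combined avoidance-position candidate list (corner
    candidates and axis-parallel intersection candidates), return the
    candidate closest to the pre-avoidance position."""
    return min(candidates, key=lambda c: math.dist(pre_avoid, c))
```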
(145) Subsequently, the route calculator 56 checks whether the linear moving route connecting .sup.TOTEt.sub.Avoid calculated in
(146) In
(147) The following describes the above processing in more detail. A determination may be made on whether there is any intersection between a line segment L.sub.RAP[0],Avoid connecting the position .sup.TOTEt.sub.RAP[0] of RAP[0] and the avoidance position .sup.TOTEt.sub.Avoid and the top and bottom of each second region 100. Z-coordinates .sup.TOTEZ.sub.KeepoutTop,i and .sup.TOTEZ.sub.KeepoutBottom,i of the top and bottom of each second region 100 can be calculated by Expression 20:
(148)
where .sup.TOTET.sub.Keepout,i represents the position and pose of the second region 100 through which the fingertip TCP is prohibited from passing (i=1 . . . , n), and .sup.TOTES.sub.Keepout,i represents the size thereof.
(149) To cause the line segment L.sub.RAP[0],Avoid to intersect with .sup.TOTEZ.sub.KeepoutTop,i and .sup.TOTEZ.sub.KeepoutBottom,i, at least the following Expression 21 is to be satisfied:
(150)
(151) The coordinates .sup.TOTEt.sub.CrossTop,i and .sup.TOTEt.sub.CrossBottom,i of the line segment L.sub.RAP[0],Avoid at .sup.TOTEZ.sub.KeepoutTop,i and .sup.TOTEZ.sub.KeepoutBottom,i can be calculated by the following Expression 22:
(152)
(153) These intersections .sup.TOTEt.sub.CrossTop,i and .sup.TOTEt.sub.CrossBottom,i are included in the second region 100 when they satisfy the condition represented by the following Expression 23:
(154)
(155) If the line segment L.sub.RAP[0],Avoid is found to intersect with the second region 100, the coordinates of the line segment L.sub.RAP[0],Avoid at the minimum of the heights .sup.TOTEZ.sub.KeepoutTop,i and .sup.TOTEZ.sub.KeepoutBottom,i of the top or the bottom of the intersecting second region 100 are set as a new pre-avoidance position. The avoidance position is calculated through the procedure described with reference to
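The interference check of Expressions 21 to 23 can be illustrated as a segment-versus-box test against the top and bottom faces of an axis-aligned keepout box (the function name and the exact formulation are assumptions):

```python
def segment_crosses_box(p0, p1, center, size):
    """Test whether the segment from the pre-avoidance position to the
    avoidance position pierces the top or bottom face of an axis-aligned
    keepout box: the segment must span the face's height (Expression 21),
    the crossing point is interpolated (Expression 22), and it must lie
    within the face's X-Y extent (Expression 23)."""
    (x0, y0, z0), (x1, y1, z1) = p0, p1
    cx, cy, cz = center
    sx, sy, sz = size
    if z0 == z1:
        return False  # a horizontal segment cannot pierce a horizontal face
    for z_face in (cz + sz / 2, cz - sz / 2):  # top, then bottom
        if not (min(z0, z1) <= z_face <= max(z0, z1)):
            continue  # the segment never reaches this face's height
        t = (z_face - z0) / (z1 - z0)
        x = x0 + t * (x1 - x0)
        y = y0 + t * (y1 - y0)
        if abs(x - cx) <= sx / 2 and abs(y - cy) <= sy / 2:
            return True  # the crossing point lies within the face
    return False
```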
(156) When the moving route is calculated through the above-described procedure in a small space around the moving destination RP, it may be impossible to generate a via point using the margins set in consideration of sensing error of the sensor (the camera 32b and the laser range scanners 33a and 33b). In this case, in the embodiment, as illustrated in
(157)
(158) After failing to calculate the avoidance position at S208 (No at S208), the route calculator 56 generates a passage restricted region, ignoring the X- and Y-directional margins (S212). After succeeding in calculating the avoidance position with the X- and Y-directional margins ignored (Yes at S214), the route calculator 56 sets the X- and Y-directions as pressing directions (S216), temporarily ending the procedure. After failing to calculate the avoidance position at S214 (No at S214), the route calculator 56 excludes the moving destination RP at which the object OBJ is to be placed from the location candidates for releasing the object OBJ (S218). In this case, the route calculator 56 sets the moving destination RP at another position, and recalculates the moving route of the object OBJ thereto.
(159) To generate the passage restricted region (second region 100), ignoring the X- or Y-directional margin, the equation δ.sub.x=σ.sub.x=0 or δ.sub.y=σ.sub.y=0 may be established at the time of setting the second region 100 described with reference to
(160) First, as illustrated in the left-side drawing of
(161)
(162) From the calculated X-coordinates, the X-coordinate of a plane included in a region having the same Y- and Z-coordinates as the pre-avoidance position Pb (=.sup.TOTEt.sub.RAP[0]) is extracted and combined with the Y- and Z-coordinates of .sup.TOTEt.sub.RAP[0] to be set to a possible pressing position PP (=.sup.TOTEt.sub.PressX,i).
(163) Subsequently, as illustrated in the middle drawing of
(164) From among the found possible pressing positions PP .sup.TOTEt.sub.PressX,i in the left-side and middle drawings of
(165) Next, as illustrated in the right-side drawing of
(166) In the pressing start position Pstart, the hand 22 continues pressing until reaching a force target value, and the pressing start position can be set to the top end .sup.TOTEt.sub.PressX+ of the pressing surface calculated in the middle drawing of
(167) The pressing standby position Pr is located apart from the pressing surface with a margin in order to allow the hand 22 to move to the pressing start position Pstart. The X-direction of the pressing standby position Pr can be from the pressing start position Pstart toward the pre-avoidance position Pb (=.sup.TOTEt.sub.RAP[0]). The Z-direction thereof can be offset vertically upward by the margin. In this case, the pressing standby position Pr=.sup.TOTEt.sub.PressXApp can be calculated by Expression 25:
(168)
where .sup.TOTEX.sub.PressMargin represents positive margin values in X-direction and .sup.TOTEZ.sub.PressMargin represents positive margin values in Z-direction.
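Expression 25 is not reproduced here, but the pressing standby position can be illustrated under the assumption of axis-aligned X-direction pressing: offset the pressing start position back toward the pre-avoidance position in X and vertically upward in Z by the positive margins (the function name is hypothetical):

```python
def standby_position(press_start, pre_avoid, x_margin, z_margin):
    """Compute the pressing standby position Pr from the pressing start
    position Pstart: step back along X toward the pre-avoidance position
    by a positive margin, and offset vertically upward in Z by a positive
    margin, so the hand can approach the pressing surface cleanly."""
    x, y, z = press_start
    direction = 1.0 if pre_avoid[0] >= x else -1.0  # back toward Pb
    return (x + direction * x_margin, y, z + z_margin)
```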
(169) When the area of the pressing surface PT is equal to or smaller than a given threshold with respect to the pressing surface of the object OBJ, that is, when the contact area between the pressing surface of the object OBJ and the pressing surface of the obstacle is smaller than a contact threshold, the pressing control (force control) is not performed. In that situation, while pressing, the hand 22 may lose its grasping pose of the object OBJ due to an insufficient contact surface, or the object OBJ may be damaged or deformed by the applied external force. In such a case, the route calculator 56 changes the moving destination RP and recalculates the moving route R so as not to forcibly perform the pressing control.
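This decision can be illustrated with a trivial predicate. The threshold, expressed here as a hypothetical fraction of the object's pressing surface, is an assumption; the embodiment only requires a contact threshold of some form:

```python
def pressing_feasible(contact_area, object_press_area, ratio=0.5):
    """Decide whether to perform the pressing control: if the contact
    area between the object's pressing surface and the obstacle's
    pressing surface falls below the contact threshold, skip pressing
    and let the route calculator change the moving destination RP and
    recalculate the moving route R instead."""
    return contact_area >= ratio * object_press_area
```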
(170) The following describes an example of generating the via points on the moving route of the object OBJ or the hand 22 after the region setter 55 sets the regions in the object handling system 1 or object handling device configured as above, referring to the flowchart in
(171) First, the region setter 55 calculates the passage restricted region for the fingertip TCP, that is, the second region 100 (S300). For example, as illustrated in (a) of
(172) Subsequently, in (b) of
(173) The route calculator 56 repeatedly performs the following operations to successively set the via points. That is, the route calculator 56 successively decides the via points following the procedures described with reference to
(174) After failing to calculate new via points (No at S312), the route calculator 56 calculates the avoidance via points in the pressing control (S314), as described with reference to
(175) (f) of
(176) Although not illustrated in the flowchart in
(177) In the example above, the second region 100 is set on the top surface of the object OBJs. However, if the objects can be placed on top of each other, the top surface of the previously set object OBJs may be set to another region different from the second region 100 to generate a moving route R which enables stacked placement. The route calculator 56 may appropriately change the region to set depending on a placement condition at the moving destination RP.
(178) As described above, the object handling system 1 or object handling device of the embodiment includes the hand 22, information acquirers such as the image processor 52 and the signal processor 53, the region setter 55, the route calculator 56, and the robot controller 57 (motion controller). The hand 22 can grasp the object OBJ. The information acquirers such as the image processor 52 and the signal processor 53 acquire at least the object information representing the object OBJ grasped by the hand 22, and the status information representing the initial position HP and the moving destination RP of the object OBJ. The region setter 55 sets the first region 104, the second region 100, and the third region 102 in accordance with the object information and the status information. In the first region 104 the hand 22 grasping the object OBJ is allowed to move without being restricted by the obstacle (the container 14b or the previously set object OBJs) present in the space between the initial position HP and the moving destination RP. In the second region 100 the hand 22 is restricted from moving due to presence of the obstacle. The third region 102 is at least partially set below the second region 100, and in the third region 102 the hand 22 is operated under the force control. The route calculator 56 calculates the moving route R along which the object OBJ is moved from the initial position HP to the moving destination RP with reference to the first region 104, the second region 100, and the third region 102. The robot controller 57 controls the manipulator 20 to move the hand 22 grasping the object OBJ along the moving route R.
(179) As configured above, the object handling system 1 or object handling device can avoid a collision with the obstacle, which would otherwise inflict damage to the hand 22 or the object OBJ grasped by the hand 22, and generate the moving route R for a small space, to efficiently and safely move the object OBJ. That is, the object handling system 1 or object handling device can easily maximize the packing density of the objects OBJ in the container 14b, reducing transportation cost. Further, the object handling system 1 or object handling device can move the object OBJ at a higher speed, utilizing the first region 104 and safely move the object OBJ without interfering with the second region 100, utilizing the third region 102, thereby improving transfer or placement efficiency (transfer cycle) of the object OBJ.
(180) The region setter 55 of the object handling system 1 or object handling device may set, as the second region 100, a region in which the hand 22 is restricted from moving from above to below the obstacle, and may set the first control region 102a with a smaller margin with respect to the obstacle than the second region 100. The first control region 102a is set along the obstacle below the second region 100, and therein the hand 22 is restricted to approaching the obstacle from the lateral direction. Owing to this configuration, the second region 100 can be easily set in accordance with the object OBJ or the operation of the hand 22.
(181) The region setter 55 of the object handling system 1 or object handling device may set the second control region 102b adjacent to the first control region 102a on the opposite side of the obstacle. Owing to this configuration, the second control region 102b can be easily set, allowing the hand 22 to appropriately operate under the pressing control in the first control region 102a of the third region 102.
(182) As the force control, the object handling system 1 or object handling device may perform the first pressing control under the first pressing condition, to press the object OBJ against the surface of the obstacle substantially orthogonally, when the hand 22 is to pass through the first control region 102a along the moving route R calculated by the route calculator 56. In this case, the object handling system 1 or object handling device can generate another moving route R along which the object OBJ can be moved without being damaged, even after failing to generate a moving route due to the second region 100.
(183) As the force control, the object handling system 1 or object handling device may perform the second pressing control in the first control region 102a under the second pressing condition, to press the object OBJ against the edge of a placement region including the moving destination RP and land the object OBJ therein along the moving route R calculated by the route calculator 56. Owing to this configuration, the object handling system 1 or object handling device can set the object OBJ in tight contact with the wall 14W of the container 14b or the previously set object OBJs. Further, the object OBJ or the hand 22 can be prevented from being damaged.
(184) As the force control, the object handling system 1 or object handling device may perform the repulsive control over the hand 22 grasping the object OBJ in contact with the obstacle in the second control region 102b along the moving route R, to control the hand 22 to repulsively move the object OBJ apart from the surface of the obstacle and pass therethrough. In this case, the object handling system 1 or object handling device can eliminate measurement error in the sensor 30, if it occurs, and prevent the object OBJ and the hand 22 from colliding with the wall 14W or the side surface of the object OBJs and being damaged.
(185) The embodiments have described an example of dividing the third region 102 into the first control region 102a and the second control region 102b; however, the third region 102 may not be divided into the first control region 102a and the second control region 102b, and may instead be set as one region. In this case, in the third region 102 where the force control is performed, either the pressing control or the repulsive control of the force control is performed. Alternatively, in the entire third region 102, a combination of the pressing control and the repulsive control may be performed, such that the pressing control and the repulsive control are switched in accordance with a detected value of the force sensor 31, for example.
(186) In the first pressing control, the route calculator 56 of the object handling system 1 or object handling device may not perform the force control but change the moving destination RP and recalculate the moving route R when the contact area between the pressing surface of the object OBJ and the pressing surface of the obstacle is smaller than a contact threshold. Due to insufficient contact surface, the hand 22 grasping the object OBJ may change in grasping pose, or the object OBJ may receive external force and be damaged or deformed. In view of this, the route calculator 56 changes the moving destination RP and recalculates the moving route R not to forcibly perform the pressing control.
(187) The object handling method of the embodiment includes acquiring at least the object information representing the object OBJ grasped by the hand 22 and the status information representing the initial position HP and the moving destination RP of the object OBJ. The object handling method further includes setting, when the hand 22 grasping the object OBJ moves from the initial position HP to the moving destination RP, the first region 104 in which the hand 22 is allowed to move without being restricted by the obstacle present in the space between the initial position HP and the moving destination RP, the second region 100 in which the hand 22 is restricted from moving due to the presence of the obstacle, and the third region 102 in which the hand 22 is operated under the force control, in accordance with the object information and the status information. The object handling method also includes calculating the moving route R along which the object OBJ is moved from the initial position HP to the moving destination RP with reference to the first region 104, the second region 100, and the third region 102. The object handling method further includes moving the hand 22 grasping the object OBJ along the moving route R.
(188) According to such a method, the moving route R in a small space can be generated while a collision with the obstacle is avoided, which would otherwise damage the hand 22 or the object OBJ grasped by the hand 22, making it possible to efficiently and safely move the object OBJ. This can lead to easily maximizing the packing density of the objects OBJ in the container 14b, lowering transportation cost. Additionally, the hand 22 can be moved at a higher speed utilizing the first region 104 and safely moved without interfering with the second region 100 utilizing the third region 102, enabling improvement in motion and placement efficiency (transfer cycle) of the objects OBJ.
(189) A handling program executed by the object handling device of the embodiment is recorded and provided in an installable or executable file format on a semiconductor memory device such as a USB memory or an SSD, or on a computer-readable recording medium such as a CD-ROM or a digital versatile disc (DVD).
(190) The handling program executed by the object handling device of the embodiment may be stored and provided in a computer connected to a network such as the Internet by being downloaded via the network. The handling program executed by the object handling device of the embodiment may be provided or distributed via a network such as the Internet. Furthermore, the handling program executed by the object handling device according to the embodiment may be incorporated in a ROM.
(191) While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.