Selective sorting method
09789517 · 2017-10-17
Inventors
- Jérémy Doublet (Ayguesvives, FR)
- Christophe Gambier (Blagnac, FR)
- Hervé Henry (Saint-Cyr-Les-Vignes, FR)
- Julien Reynard (Montmorin, FR)
CPC classification
- B07C7/005 (Performing operations; transporting)
- B07C5/34 (Performing operations; transporting)
- B65G47/90 (Performing operations; transporting)
International classification
- B07C5/34 (Performing operations; transporting)
- B07C5/02 (Performing operations; transporting)
Abstract
A selective sorting method for identifying and sorting material objects of different natures, sizes and shapes presented in the form of a pile is described. The method is characterized in that the choice of a gripping zone, associated with an object contained in the pile to be sorted, is carried out automatically, and in that the nature of the object associated with this gripping zone is defined using at least one sensor that measures at least one electromagnetic radiation emitted by this object. A device able to implement such a method is also described.
Claims
1. A selective sorting method in order to identify and sort material objects of the waste type, of different natures, shapes and sizes and having the form of a pile, the method comprising the following steps:
a. supplying a flow of objects in the form of a pile to a zone of vision comprising at least two sensors for measuring electromagnetic radiation, the zone of vision being located in a zone of action of a robot provided with one or several gripping members, a transponder associated with a type of nature being affixed to or built into each object;
b. capturing at least two two-dimensional images of the pile contained in the zone of vision using the sensors for measuring electromagnetic radiation, in order to reconstruct a virtual or electronic image of the pile of objects in the zone of vision;
c. processing the information resulting from the two-dimensional images, and identifying all possible gripping zones associated with objects present in the pile for the gripping member or members of the robot;
d. locating, in position and orientation, the possible gripping zones;
e. choosing one of the gripping zones;
f. defining automatically, for a given gripping member, a trajectory for gripping a unitary object corresponding to the chosen gripping zone;
g. grasping the corresponding unitary object according to the defined trajectory;
h. displacing the corresponding unitary object that is grasped to a receiving zone;
h′. defining a nature of the unitary object located in the receiving zone using at least one sensor that measures at least one electromagnetic radiation emitted by the transponder in the receiving zone;
i. displacing the unitary object located in the receiving zone to an outlet according to the nature that has been attributed to the unitary object,
wherein the step (e) of choosing a gripping zone is carried out automatically, and wherein the step (h′) of defining the nature of the object is carried out by using a radio frequency interrogator to recover a radio frequency signal transmitted by the transponder, with the signal being compared to a set of data recorded in a memory, by using a computer program.
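The claimed steps (a) through (i) can be sketched, purely for illustration, as a control loop. Every function and class name below is a hypothetical stand-in for the corresponding means in the claim, not an implementation taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class GrippingZone:
    object_id: int
    position: tuple      # (x, y) in the zone of vision, step (d)
    orientation: float   # orientation of the zone, step (d)
    probability: float   # estimated probability of a successful grip

def identify_gripping_zones(pile):
    # Placeholder for the image-processing stage, steps (b)-(d): one
    # candidate zone per visible object, with a dummy success probability.
    return [GrippingZone(oid, (0.0, 0.0), 0.0, 1.0) for oid in pile]

def sort_pile(pile, read_transponder, outlets):
    """One pass of the claimed method over a pile of tagged objects.

    `pile` maps object_id -> object; `read_transponder` stands in for
    the radio frequency interrogator of step (h'); `outlets` maps a
    nature (e.g. "PET") to a list acting as the outlet of step (i).
    """
    while pile:                                         # steps (a)-(b): pile in the zone of vision
        zones = identify_gripping_zones(pile)           # steps (c)-(d)
        zone = max(zones, key=lambda z: z.probability)  # step (e): automatic choice
        obj = pile.pop(zone.object_id)                  # steps (f)-(h): grip and displace
        nature = read_transponder(obj)                  # step (h'): identification
        outlets.setdefault(nature, []).append(obj)      # step (i): route by nature
```

A pile of two tagged objects would be emptied into per-nature outlets by a single call to `sort_pile`.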
2. The selective sorting method according to claim 1, wherein the step (h′) is carried out by a recovery of measurements captured by at least one sensor that measures at least one electromagnetic radiation of the visible or non-visible spectrum, and by comparing these measurements with a set of data recorded in a memory, by using a computer program.
3. The selective sorting method according to claim 1, wherein the step (h′) is carried out by recovery of images of the unitary object by at least one camera used as a sensor and by comparing images of the unitary object obtained with a set of data recorded in a memory, by using a computer program.
4. The selective sorting method according to claim 1, wherein the automatic choosing of one of the gripping zones of the step (e) is defined by the use of an algorithm.
5. The selective sorting method according to claim 4, wherein the algorithm calculates the probability of success of the gripping of each gripping zone by one of the gripping members of the robot.
6. The selective sorting method according to claim 5, wherein the grasping is carried out, with priority, on the gripping zone associated with the highest probability of success for gripping.
7. The method according to claim 1 wherein a device able to implement the method comprises: means for supplying a flow of objects having the form of a pile; sensors for measuring electromagnetic radiation in the zone of vision; image processing and calculating software for processing the information resulting from the captured images in the zone of vision and for identifying and for locating gripping zones of the objects of the pile; a mechanical robot provided with at least one gripping member to grip an object defined by one or several gripping zones in the pile and displace the object from a zone of vision to a receiving zone; means for identifying the nature of the unitary object in the receiving zone; processing and calculating software in order to process the information resulting from the identifying means; means for removing the object placed in the receiving zone.
8. The method according to claim 7, wherein the device comprises means for processing and for calculating in order to automatically define a gripping trajectory of the object by the robot.
9. The method according to claim 8, wherein the sensors for measuring electromagnetic radiation in the zone of vision are cameras or laser scanning distance sensors.
10. The method according to claim 7, wherein the identifying means is chosen from among sensors of the spectrometric or spectroscopic type that are able to reconstruct a spectral or multispectral or hyperspectral image, optical sensors, radio electric or radio frequency antennas, cameras or sensors for measuring electromagnetic radiation in the visible or non-visible spectrum.
11. The method according to claim 10, wherein the means for removing the unitary object from the receiving zone are chosen from among a belt conveyor, a mechanical robot, a vibrating table, or the same mechanical robot used for displacing a unitary object from the zone of vision towards the receiving zone.
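The components of the device recited in claim 7 can be grouped, as a sketch only, into a single structure. Each field name below is a hypothetical stand-in for the corresponding "means"; a real device would bind actual hardware drivers and software here:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class SortingDevice:
    """Illustrative grouping of the components of the claimed device."""
    supply_conveyor: Callable       # means for supplying the flow of objects
    vision_sensors: List[Callable]  # sensors for measuring electromagnetic radiation
    locate_zones: Callable          # image processing and calculating software
    robot_grip: Callable            # robot with at least one gripping member
    identify_nature: Callable       # identifying means in the receiving zone
    remove_object: Callable         # means for removing the sorted object
```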
Description
BRIEF DESCRIPTION OF THE DRAWING
(1) Other characteristics and advantages of this invention shall appear more clearly upon reading the following description, given by way of a non-limiting example and made in reference to the annexed figure.
(2)
DESCRIPTION
(3) The example described hereinafter, given for the purposes of information and in a non-limiting manner, relates to a device according to the invention shown in the annexed figure.
(4) However, preferably, the robot can include at least two gripping members, with the first using a technology referred to as “suction” and the other a technology referred to as “clamp”.
(5) The robot shown in the annexed figure is a poly-articulated robot 14 provided with one or several gripping members 18.
(6)
(7) The pile of objects comprises a bulk volume of heterogeneous objects placed randomly in such a way that the objects are entangled.
(8) As shown in the annexed figure, the device comprises a first belt conveyor 11.
(9) This first belt conveyor 11 is able to supply a zone, called the zone of vision 12, with a pile of material objects.
(10) The device of the annexed figure comprises at least one sensor for measuring electromagnetic radiation, such as a camera 19a, directed towards the zone of vision 12.
(11) In these conditions, the camera 19a is configured to acquire successive two-dimensional images of the pile located in the zone of vision 12. Note that the images captured cover the entire pile of objects.
(12) One or several of said captured images of the pile of objects may then be processed and analyzed in order to allow for the identification and the locating of each possible gripping zone for a gripping member 18 of the poly-articulated robot 14.
(13) To do this, the camera 19a is, for example, coupled to processing means, such as computers running dedicated software, configured to process the images coming from said sensors.
(14) The combined use of calculating software and of image-processing software makes it possible to weight each gripping zone identified beforehand with a coefficient based on the probability of success of the gripping by a corresponding gripping member, and to choose the gripping zone that has the highest probability of success for gripping.
(15) The weighting coefficient can depend on any type of parameter. It can, for example, depend on the distance that separates the localized candidate gripping zone from the corresponding gripping member 18. The weighting coefficient can also be calculated according to the orientation of the zone to be gripped. The probabilities of success further depend on the gripping member 18 used.
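The weighting described in paragraphs (14) and (15) can be illustrated as follows. The functional form, the weights `w_dist` and `w_angle`, and the per-gripper `base_success_rate` are assumptions made for this sketch, not values taken from the patent:

```python
import math

def grip_score(zone_pos, zone_angle, gripper_pos, base_success_rate,
               w_dist=0.5, w_angle=0.5):
    """Illustrative weighting of one gripping zone for one gripping member.

    The score decreases with the distance between the zone and the
    gripping member and with the tilt of the zone, scaled by an assumed
    per-gripper prior success rate (e.g. suction vs. clamp).
    """
    distance = math.dist(zone_pos, gripper_pos)
    dist_term = 1.0 / (1.0 + distance)               # closer zones score higher
    angle_term = math.cos(math.radians(zone_angle))  # flatter zones score higher
    return base_success_rate * (w_dist * dist_term + w_angle * max(angle_term, 0.0))

def choose_zone(zones, gripper_pos, base_success_rate):
    """Return the candidate zone with the highest probability-of-success score."""
    return max(zones, key=lambda z: grip_score(z["pos"], z["angle"],
                                               gripper_pos, base_success_rate))
```

With this scoring, a flat zone directly under the gripping member beats a distant, steeply tilted one.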
(16)
(17)
(18) Note that, in order to obtain images that represent reality, the flow of objects directed to the zone of vision 12 by the belt conveyor 11 shown in the annexed figure is stopped while the images are captured.
(19) One of the gripping members 18 of a robot 14 can therefore take care of gripping a particular object associated with the specific zone considered as having priority, as explained hereinabove.
(20) After each gripping, the sensors capture new images of the pile of objects. In this way, the next object to be gripped, which may have been displaced by the gripping of a preceding object, will nevertheless be located and gripped.
(21) The steps described hereinabove are repeated until the pile contains no more objects.
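The grip-and-reimage loop of paragraphs (20) and (21) can be sketched as below. All four callables are hypothetical stand-ins: `capture_images` reads the sensors, `locate_zones` runs the image processing, `grip` removes one object, and `pile_is_empty` tests the termination condition:

```python
def empty_pile(capture_images, locate_zones, grip, pile_is_empty):
    """Repeat capture, locate, grip until the pile contains no more objects.

    Re-imaging after every grip accounts for objects that may have been
    displaced by the gripping of a preceding object.
    """
    while not pile_is_empty():
        images = capture_images()   # fresh images: positions may have shifted
        zones = locate_zones(images)
        if not zones:
            break                   # nothing grippable in this pass
        grip(zones[0])              # grip the highest-priority zone
```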
(22) When there are no more objects in the zone of vision 12, or on the order of an operator, the first belt conveyor 11 resumes operation in order to convey into this zone of vision 12 a new pile of objects to be sorted.
(23) However, the belt conveyor 11 could very well operate continuously, at a speed low enough to allow the robot 14 to grip a particular object.
(24) Then, in the device 10 of the invention shown in the annexed figure, the gripped unitary object is displaced by the robot 14 towards a receiving zone.
(25) Note that here, no operator intervenes. The choosing of any one of the gripping zones is carried out automatically according to the various coefficients that have been associated with each of the candidate gripping zones.
(26) Furthermore, a removable abutment is optionally arranged at the end of the belt conveyor 11 in order to prevent a portion of the pile of objects to be sorted from being projected outside of the sorting device when this first belt conveyor 11 is operating.
(27) Then, each object gripped by any gripping member 18, as described hereinabove, is placed, according to the annexed figure, in a receiving zone.
(28) The operation of attributing a nature to a unitary object gripped beforehand by said robot 14 is carried out in this receiving zone. This attribution does not require the intervention of an operator and can therefore be entirely automatic. For example, the operation of attributing a nature to a particular object can be carried out using sensors of the spectrometric or spectroscopic type, referenced as 19b in the annexed figure.
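In the claimed radio frequency variant, the signal recovered by the interrogator is compared with a set of data recorded in a memory. A minimal table-lookup sketch follows; the tag identifiers and natures are invented for illustration and stand in for the recorded data:

```python
# Hypothetical mapping from transponder identifiers to natures, standing
# in for the "set of data recorded in a memory" of claim 1.
NATURE_TABLE = {
    "0xA1": "PET plastic",
    "0xB2": "aluminium",
    "0xC3": "cardboard",
}

def attribute_nature(transponder_id, table=NATURE_TABLE):
    """Compare the recovered transponder identifier with the recorded data."""
    nature = table.get(transponder_id)
    if nature is None:
        return "unknown"  # unrecognised tag: route to a reject outlet
    return nature
```

The returned nature then determines the outlet towards which the object is displaced in step (i).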
(29)
(30)
(31) Furthermore, means can be used to record and follow the movements and positions of a particular object, between the gripping member 18 of a robot 14 and an outlet 16, over time.