Handheld device using a light guide and method for use thereof for determining a plant status
11486869 · 2022-11-01
Cpc classification: G06T7/80 (PHYSICS); G01J3/0205 (PHYSICS); G01J3/0291 (PHYSICS); G01J3/10 (PHYSICS)
International classification: G06T7/80 (PHYSICS)
Abstract
The invention relates to a handheld device and method for determining a status of a plant. The device includes a multi pixel digital colour sensor, a light source arranged for providing broadband illumination, wherein the light source and the multi pixel digital colour sensor are arranged in substantially the same plane, a light guide for guiding the light from said light source in the direction of the multi pixel digital colour sensor, a sample space, provided between the multi pixel digital colour sensor and the light source, for insertion of at least a part of the plant therein, and a processing unit configured for controlling at least the multi pixel digital colour sensor and the light source.
Claims
1. A handheld device for determining a status of a plant comprising: a multi pixel digital colour sensor, configured for obtaining a colour image of a part of the plant of which the status is to be determined, comprising at least a red (R), green (G) and blue (B) colour component, together forming a set of colour components; a light source arranged for providing broadband illumination, wherein the light source and the multi pixel digital colour sensor are arranged in substantially a same plane; a light guide for guiding the light from said light source in the direction of the multi pixel digital colour sensor; a sample space, provided between the multi pixel digital colour sensor and a portion of the light guide, for insertion therein of at least a part of the plant of which the status is to be determined; a processing unit configured for controlling at least the multi pixel digital colour sensor and the light source, characterised in that the processing unit is configured for: switching the light source on and off, obtaining a first image of the part of the plant with the multi pixel digital colour sensor with the light source switched on and transmitting broadband illumination using the light guide through the part of the plant into the multi pixel digital colour sensor, obtaining a second image of the part of the plant with the multi pixel digital colour sensor, with the light source switched off and not illuminating the part of the plant, (i) determining a first colour value representative of a difference in intensity values in the first and the second image for a first of the colour components, (ii) determining a second colour value representative of a difference in intensity values in the first and the second image for a second of the colour components, (iii) calculating a value representative of the status of the plant using the first colour value and the second colour value.
2. The handheld device according to claim 1, wherein the device is a smartphone, a laptop or a tablet.
3. The handheld device according to claim 1, wherein the sample space is a slit.
4. The handheld device according to claim 1, wherein the sample space allows the insertion of an unprocessed part of the plant of which the status is to be determined.
5. The handheld device according to claim 1, wherein the multi pixel digital colour sensor is selected from a CMOS image sensor and a CCD image sensor.
6. The handheld device according to claim 1, wherein the light source is a flash light of the handheld device.
7. The handheld device according to claim 1, wherein the light guide includes a light diffusor.
8. The handheld device according to claim 1, wherein the light guide is detachably attached to the handheld device.
9. The handheld device according to claim 1, wherein the light guide is part of a cover of the handheld device.
10. The handheld device according to claim 1, wherein the processing unit is further configured for determining a third colour value representative of a difference in intensity values in the first and the second image for a third of the colour components; and calculating the value representative of the status of the plant using the first colour value, the second colour value, and the third colour value.
11. The handheld device according to claim 1, wherein the processing unit is configured for, in step (iii), calculating said value representative of the status of the plant based on a ratio of the first colour value and the second colour value.
12. The handheld device according to claim 1, wherein the processing unit is configured for calibrating the first colour value and the second colour value.
13. The handheld device according to claim 1, wherein the processing unit is configured to use at least one colour component that is less sensitive to changes in the status of the plant than the other of the colour components.
14. The handheld device according to claim 1, wherein the processing unit is configured for obtaining the first and second image in response to a single user command.
15. The handheld device according to claim 1, the device further comprising a communications unit configured for communicating the determined status of the plant, or a parameter derived therefrom, to an applicator system.
16. A method for determining the status of a plant using the handheld device as defined in claim 1, the method including: obtaining, using the multi pixel digital colour sensor, the first image of the part of the plant while the light source is switched on and transmits the broadband illumination through the part of the plant onto the multi pixel digital colour sensor, obtaining, using the multi pixel digital colour sensor, the second image of the part of the plant while the light source is switched off and does not illuminate the part of the plant, and calculating the status of the plant by having the processing unit: (i) determine the first colour value representative of the difference in intensity values in the first and the second image for the first of the colour components; (ii) determine the second colour value representative of the difference in intensity values in the first and the second image for the second of the colour components; and (iii) calculate the value representative of the status of the plant using the first colour value and the second colour value.
17. The method according to claim 16, including taking an action, the action being one of watering, fertilizing, harvesting, shielding, ventilating, or heating, on the basis of the value calculated in step (iii).
18. The method according to claim 16, wherein the part of the plant is a leaf or a part thereof.
19. The handheld device according to claim 7, wherein the light diffusor is manufactured from a diffusively translucent material.
20. The handheld device according to claim 15, wherein the applicator system is one of a fertilization, fertigation, or watering system.
21. The handheld device according to claim 1, wherein the sample space is an open sample space in contact with ambient air and ambient light.
Description
BRIEF DESCRIPTION OF THE DRAWING
(1) The invention will further be elucidated on the basis of exemplary embodiments which are represented in a drawing. The exemplary embodiments are given by way of non-limitative illustration. It is noted that the figures are only schematic representations of embodiments of the invention that are given by way of non-limiting example.
(2) In the drawing:
DETAILED DESCRIPTION
(10) In this example the light source 6 is a white light emitting diode, LED. Here, the light source 6 is placed below the camera 2.
(11) In this example, the device 1 includes a light guide 8. The light guide 8 is positioned to receive light emitted by the light source 6. The light guide 8 guides light emitted by the light source 6 to be directed into the field of view of camera 2, i.e. the multi pixel digital colour sensor. In this example, a portion 10 of the light guide 8 is positioned opposite the camera 2. Between the portion 10 and the camera 2, a sample space 12 is formed. Into the sample space 12, a part of a plant, e.g. a part of a leaf, can be inserted. Here, the sample space forms a slit. The distance between the camera 2 and the portion 10 can be approximately 3 mm or less, e.g. about 2 mm or less.
(12) As shown in
(13) In this example, the device 1 includes a processing unit 18. The processing unit is communicatively connected to the camera 2, the light source 6 and the user interface 14. Here, the device includes a communications unit 20. The communications unit 20 is communicatively connected to the processing unit 18. In this example, the device 1 includes a position determination unit 22, here a global positioning system, GPS, unit. The position determination unit 22 is communicatively connected to the processing unit 18. In this example, the device 1 includes a memory 24, suitable for storing a computer program, images, etc.
(14) The device 1 as described so far, can be used as follows.
(15) Before the device 1 is used for determining a plant status, the device 1 can be calibrated. Thereto, the sample space 12 is simply left empty. Alternatively, a reference object of known and preferably spectrally uniform transmittance is inserted into the sample space 12. Via the user interface 14 a calibration measurement sequence can be started, e.g. by pressing a “calibrate” button on the touch screen. After activation of the calibration measurement sequence, the processing unit 18 instructs the light source 6 to switch on (if it is off) and instructs the camera 2 to take a first reference image. Next, the processing unit 18 instructs the light source 6 to switch off and instructs the camera 2 to take a second reference image. The first and second reference images can be, e.g. temporarily, stored in the memory 24 of the device 1. It will be appreciated that it is also possible that the second reference image (with light source off) is obtained before the first reference image is obtained (with light source on).
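The calibration sequence above can be sketched as follows. This is a minimal illustration in Python in which `StubLightSource`, `StubCamera`, and the `memory` dictionary are hypothetical stand-ins for the light source 6, the camera 2, and the memory 24; they are not an API disclosed in the patent, and the frame values are invented:

```python
class StubLightSource:
    """Hypothetical light source with an on/off state (stands in for light source 6)."""
    def __init__(self):
        self.is_on = False
    def on(self):
        self.is_on = True
    def off(self):
        self.is_on = False

class StubCamera:
    """Hypothetical camera returning a bright frame while the light source is on."""
    def __init__(self, light_source):
        self.light_source = light_source
    def capture(self):
        return 200 if self.light_source.is_on else 15

def run_calibration(camera, light_source, memory):
    """Calibration measurement sequence: reference image with the light
    source on, then a reference image with the light source off."""
    light_source.on()
    first_ref = camera.capture()    # first reference image (light on)
    light_source.off()
    second_ref = camera.capture()   # second reference image (light off)
    memory["reference"] = (first_ref, second_ref)  # e.g. temporarily stored
    return first_ref, second_ref

led = StubLightSource()
cam = StubCamera(led)
store = {}
print(run_calibration(cam, led, store))  # → (200, 15)
```

As the description notes, the order of the two reference images may equally be reversed.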
(16) In this example, the processing unit 18 determines an average intensity value Gr.sub.1 for all green pixel intensity values of the first reference image and an average intensity value Gr.sub.0 for all green pixel intensity values of the second reference image, and corresponding average intensity values for the red pixel intensity values.
(17) Then, a calibration constant C is calculated using equation EQ15:
(18)
(19) Referring now to
(20) Here, the processing unit 18 automatically causes the device to take the two images in response to a single user command. The processing unit 18 causes the two images to be taken in fast succession. In this example the images are taken with an exposure time of 1/5000 s (200 μs) and a delay time between the images of 100 ms. The light source 6 is activated to be on during a period that is equal to or longer than the exposure time.
(21) In this example, the processing unit 18 determines an average intensity value G.sub.1 for all green pixel intensity values of the first image. In this example, the processing unit 18 determines an average intensity value G.sub.0 for all green pixel intensity values of the second image. In this example, the processing unit 18 determines an average intensity value R.sub.1 for all red pixel intensity values of the first image. In this example, the processing unit 18 determines an average intensity value R.sub.0 for all red pixel intensity values of the second image.
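The per-channel averaging of paragraph (21) can be sketched as below (a NumPy illustration; the pixel values are invented for demonstration):

```python
import numpy as np

def channel_means(image):
    """Average the red, green and blue pixel intensities of an
    H x W x 3 RGB image, as in paragraph (21)."""
    return (float(image[..., 0].mean()),
            float(image[..., 1].mean()),
            float(image[..., 2].mean()))

# first image: light source on; second image: light source off (dark)
first_image = np.array([[[200, 120, 30], [180, 110, 20]]], dtype=float)
second_image = np.array([[[40, 30, 10], [20, 10, 10]]], dtype=float)

R1, G1, _ = channel_means(first_image)
R0, G0, _ = channel_means(second_image)
print(R1, G1, R0, G0)  # → 190.0 115.0 30.0 20.0
```

On a real Bayer-pattern sensor the averages would be taken over the pixels of each colour filter; the dense RGB array here is a simplification.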
(22) Then, a value representative of a status of the plant is calculated. In this example, a spectral index, herein called FCCI (Flash Cam Chlorophyll Index), is calculated. Thereto a measured value, FCCI′, is determined using equation EQ16:
(23)
(24) The measured FCCI′ value can be corrected with correction factor K.sub.D. The correction factor can be device dependent. The correction factor can e.g. be representative of an empirically determined relationship between the FCCI′ value determined by the processing unit 18 and an FCCI determined using a reference device. If no correction is needed, the correction factor can be equal to one (1).
FCCI=FCCI′·K.sub.D EQ17
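Because equation EQ16 is not reproduced in this text, the sketch below uses an assumed ratio of dark-corrected green and red averages; that form is consistent with paragraph (41), which describes the value as a ratio of green and red image pixel intensities, but it is not necessarily the exact form of EQ16:

```python
def fcci(G1, G0, R1, R0, K_D=1.0):
    """Chlorophyll-type index from dark-corrected channel averages.

    The ratio below is an assumed placeholder for EQ16; K_D is the
    device-dependent correction factor of EQ17 (1.0 if no correction)."""
    fcci_measured = (G1 - G0) / (R1 - R0)   # assumed form of EQ16
    return fcci_measured * K_D              # EQ17: FCCI = FCCI' * K_D

print(fcci(G1=115.0, G0=20.0, R1=190.0, R0=30.0))  # → 0.59375
```

Subtracting the dark-image averages (G.sub.0, R.sub.0) removes the ambient-light contribution before the ratio is formed.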
(25) This FCCI value is representative of the average greenness of the plant part 26. The device 1 can show the determined value to the user, e.g. on the user interface.
(26) It is also possible that the device 1 indicates information representative of the value on the user interface. It is also possible that the processing unit performs an agronomic calibration on the basis of the value representative of the status of the plant. In this example, the processing unit 18 determines a recommended amount of nitrogen (N) to be supplied to the plants as a function of the determined FCCI (e.g. as kg N per ha). The N-recommendation can e.g. be displayed in a field 30 of the user interface.
(27) It will be appreciated that the FCCI value calculated according to equation EQ17 is only one example of a plant status. More generally, the device can determine a plant status, such as a plant nutritional status, on the basis of various mathematical combinations of the available (i.e. R, G and B) colour values. For example, the processing unit 18 can determine a hue value on the basis of all three colour values.
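As one illustration of such a combination of all three colour values, a hue value can be computed with the standard RGB-to-HSV conversion (the patent does not prescribe this particular implementation; the sample intensities are invented):

```python
import colorsys

def hue_degrees(R, G, B, max_value=255.0):
    """Hue in degrees [0, 360) from average R, G, B intensities."""
    hue, _, _ = colorsys.rgb_to_hsv(R / max_value, G / max_value, B / max_value)
    return hue * 360.0

# a green-dominated (dark-corrected) sample
print(round(hue_degrees(30.0, 95.0, 10.0), 1))  # → 105.9
```

A hue value has the practical advantage of being largely independent of the overall brightness of the transmitted light.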
(28) In this example, the device 1 stores information representative of the determined FCCI value in a record in the memory 24.
(29) In this example, the geographical position determination unit 22 determines a geographical position of the device 1 during the measurement. Information representative of the geographical position is stored in the record with the information representative of the determined FCCI value. The record can be stored for access and analysis.
(30) Alternatively, or additionally, the device 1 can transmit the determined status of the plants, or a parameter derived therefrom, e.g. in combination with the location information, to an applicator system, e.g. a variable rate applicator system, such as a variable rate fertilizer system, using the communications unit 20. The applicator system can then adjust the rate of fertilizer application to the received status information. Hence, the use of fertilizers may be optimized, e.g. reduced, by precisely applying agricultural products to individual plants or locations to be treated.
(31) In the example of
(32) In the example of
(33) Optionally, the casing or light guide is provided in combination with a token. The token allows the dedicated software to be installed and/or used on the smartphone. The token can e.g. include an indication of a location, such as a URL, where the dedicated software can be downloaded. The location can be a secure location. The token can e.g. include an authentication allowing the location to be reached and/or allowing the dedicated software to be downloaded and/or installed, and/or allowing the software to be executed on the smartphone.
(35) The light guide body 8a can be transparent. In this example, the light guide body 8a is translucent. Here, the light guide body 8a is constructed of a diffusively translucent material, such as an opal glass or opal plastic. The diffusively translucent material ensures that the light emitted at the light output window can be homogeneous. Thus, the light guide body 8a acts as a diffusor. It is also possible that, alternatively or additionally, the light guide includes one or more diffusively translucent surfaces, e.g. at the light input window 8c and/or at the light output window 8d, to act as diffusor.
EXPERIMENTAL
(36) The performance of the device according to the invention was compared with the performance of a commercially available chlorophyll meter (Yara N-Tester, Yara International ASA, Norway). In the experiment, maize plants were grown at 5 different nitrogen levels. At 5 growth stages, individual leaves were picked and measured with both the chlorophyll meter and the device according to the invention. The results are shown in
(37) Herein, the invention is described with reference to specific examples of embodiments of the invention. It will, however, be evident that various modifications and changes may be made therein, without departing from the essence of the invention. For the purpose of clarity and a concise description, features are described herein as part of the same or separate embodiments; however, alternative embodiments having combinations of all or some of the features described in these separate embodiments are also envisaged.
(38) In the example, the processing unit controls the multi pixel digital colour sensor and the light source to take two consecutive images in response to a user activation. It will be appreciated that it is also possible that the processing unit controls the multi pixel digital colour sensor and the light source to take more than two images in response to a user activation. For example, the device can consecutively take images without-with-without the light source active. Starting a sequence with an image without illumination may help in synchronizing the multi pixel digital colour sensor and the light source for the image with light source illumination, e.g. in devices that may have difficulties in synchronizing, such as certain smartphones. The initial image without illumination may be discarded in the determination of the status of the plants. It is also possible that the processing unit controls the multi pixel digital colour sensor and the light source to take a plurality of pairs of images in response to a single user command. For each pair of images, the status of the plants can be determined. The statuses of the plants for the consecutive pairs of images can e.g. be stored and/or averaged.
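The without-with-without sequence can be sketched as below, again with hypothetical `StubLightSource` and `StubCamera` stand-ins (not an API from the patent) and invented frame values:

```python
class StubLightSource:
    def __init__(self):
        self.is_on = False
    def on(self):
        self.is_on = True
    def off(self):
        self.is_on = False

class StubCamera:
    def __init__(self, light_source):
        self.light_source = light_source
    def capture(self):
        # bright frame while the light source is on, dark frame otherwise
        return 200 if self.light_source.is_on else 15

def capture_without_with_without(camera, light_source):
    """Discardable dark frame first (may help synchronisation), then the
    illuminated first image, then the dark second image actually used."""
    light_source.off()
    camera.capture()               # initial dark frame, discarded
    light_source.on()
    first = camera.capture()       # first image: broadband illumination
    light_source.off()
    second = camera.capture()      # second image: dark
    return first, second

led = StubLightSource()
print(capture_without_with_without(StubCamera(led), led))  # → (200, 15)
```

For the plurality-of-pairs variant, this function would simply be called repeatedly and the resulting statuses stored and/or averaged.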
(39) In the examples, the device is designed as a smartphone. It will be appreciated that the device can also be a dedicated handheld device. It is also possible that the device is designed as another handheld device such as a tablet, laptop, etc.
(40) In the examples, the processing unit determines the value representative of a status of the plants for the entire image. It will be appreciated that the processing unit can also determine a value representative of a status of the plants for one or more parts of the image.
(41) In the examples, the processing unit determines the value representative of a status of the plants as a ratio of green and red image pixel intensities. It will be appreciated that other mathematical combinations of the available pixel intensities can also be used.
(42) In the examples, the processing unit determines a colour value representative of a difference in intensity values in the first and the second image for one or more colour components. It will be appreciated that it is also possible that the sample space is shielded from ambient light, e.g. by a skirt, clamp, or the like. When the sample space is sufficiently shielded from ambient light, obtaining the second image with the light emitter switched off may be omitted. Hence, then the colour value can be determined from the first image with the light emitter switched on only. However, other modifications, variations, and alternatives are also possible.
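The shielded-sample-space variant reduces the colour-value computation to a single illuminated image; a minimal sketch (the function name and scalar intensities are illustrative only):

```python
def colour_value(first, second=None):
    """Colour value per paragraph (42): subtract the dark image when
    available; with a sample space shielded from ambient light (e.g. by
    a skirt or clamp) the second image may be omitted and the first
    image used directly."""
    if second is None:
        return first          # shielded: illuminated image alone suffices
    return first - second     # default: difference of the two images

print(colour_value(115.0, 20.0))  # → 95.0
print(colour_value(115.0))        # → 115.0
```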
(43) To conclude, the specifications, drawings and examples are, accordingly, to be regarded in an illustrative sense rather than in a restrictive sense.
(44) For the purpose of clarity and a concise description, features are described herein as part of the same or separate embodiments; however, it will be appreciated that the scope of the invention may include embodiments having combinations of all or some of the features described.
(45) In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word ‘comprising’ does not exclude the presence of other features or steps than those listed in a claim. Furthermore, the words ‘a’ and ‘an’ shall not be construed as limited to ‘only one’, but instead are used to mean ‘at least one’, and do not exclude a plurality. The mere fact that certain measures are recited in mutually different claims does not indicate that a combination of these measures cannot be used to an advantage.