WO2019244156A1 - System for in-situ imaging of plant tissue
- Publication number: WO2019244156A1 (PCT/IL2019/050692)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- plant
- images
- camera
- spacer
- imaging
- Prior art date: 2018-06-20
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/31—Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
- G01N21/35—Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry using infrared light
- G01N21/359—Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry using infrared light using near infrared light
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M31/00—Hunting appliances
- A01M31/002—Detecting animals in a given area
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B13/00—Optical objectives specially designed for the purposes specified below
- G02B13/24—Optical objectives specially designed for the purposes specified below for reproducing or copying at short object distances
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/02—Bodies
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N2021/8466—Investigation of vegetal material, e.g. leaves, plants, fruits
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/02—Bodies
- G03B17/17—Bodies with reflectors arranged in beam forming the photographic image, e.g. for reducing dimensions of camera
Definitions
- the present invention relates to a system for in-situ imaging of plant tissue and, more particularly, to an imaging system for monitoring crop plants, identifying pest infestations and diseases, as well as assessing the general state of a plant and crop.
- Pest infestation and plant diseases can lead to significant damage to crop plants and loss of crop yield. Insects that infest plant species are particularly problematic in agriculture since major crop plants such as rice, cotton, soybean, potato and corn are particularly susceptible to insect infestations.
- Pest infestation of crop plants is traditionally controlled through the use of chemical pesticides.
- However, these chemicals can be toxic to other species and can cause significant environmental damage, especially when overused.
- One approach for reducing use of pesticides in crops involves monitoring plants for pest infestation or disease and applying pesticides only when needed.
- Such monitoring is typically carried out by physically examining selected plants in a field and applying pesticides only when pests are identified. Such monitoring is time-consuming and laborious; as such, approaches for identifying pest infestations via plant or pest-trap imaging have been developed.
- a system for in situ imaging of plant tissue comprising: a camera having a macro lens for near field imaging; a spacer configured for setting a focal distance between the macro lens and a portion of a plant; and a device for positioning a distal surface of the spacer against the portion of the plant.
- the device is a manually operated boom.
- the device is an autonomous vehicle.
- the macro lens has a depth of field of 1-10 mm.
- the macro lens has an imaging area of 100-2000 mm².
- the spacer is attached to a housing of the macro lens.
- the spacer is a frame having a cylindrical shape.
- the camera is a video camera.
- the video camera has an imaging sensor of at least 5 MP and a frame rate of at least 25 FPS.
- the spacer includes a mirror.
- the spacer is adjustable for setting the focal distance between the macro lens and the portion of a plant and/or an imaging area of the macro lens.
- the system further comprises a processing unit for processing images captured by the camera and identifying out-of-focus images.
- the system further comprises a light source.
- the system further comprises a pesticide reservoir fluidly connected to a nozzle positioned near the distal surface of the spacer.
- the system further comprises a motorized stage for moving a focal plane of the macro lens.
- the system further comprises an auto focus algorithm for actuating the motorized stage.
- the camera is a spectral imaging camera.
- the spacer includes a cover.
- the cover is a net.
- the distal surface of the spacer includes contact or proximity sensors.
- a method of identifying pests on plant tissue comprising positioning a camera having a macro lens for near field imaging at a focal distance between the macro lens and a portion of a plant using a spacer; and capturing a series of images of the portion of the plant via the camera.
- the method further comprises discarding out-of-focus images in the series of images to obtain in-focus images of the portion of the plant.
- the method further comprises analyzing the in-focus images to identify the pests on the plant tissue.
- the portion of the plant is a leaf and further wherein the series of images are of an underside of the leaf.
- a method of assessing a state of plant tissue comprising positioning a camera having a macro lens for near field imaging at a focal distance between the macro lens and a portion of a plant using a spacer; and capturing a series of images of the portion of the plant via the camera.
- the method further comprises discarding out-of-focus images in the series of images to obtain in-focus images of the portion of the plant.
- the method further comprises analyzing the in-focus images to identify the state of the plant tissue.
- the state of plant tissue is a healthy state or a disease/stress state.
- Implementation of the method and system of the present invention involves performing or completing selected tasks or steps manually, automatically, or a combination thereof.
- several selected steps could be implemented by hardware or by software on any operating system of any firmware or a combination thereof.
- selected steps of the invention could be implemented as a chip or a circuit.
- selected steps of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system.
- selected steps of the method and system of the invention such as, for example, image processing could be described as being performed by a data processor, such as a computing platform for executing a plurality of instructions.
- FIG. 1 illustrates one embodiment of the system of the present invention.
- FIGs. 2A-C illustrate scanning of a top surface of plant leaves using the system of the present invention.
- FIGs. 3A-C illustrate scanning of a bottom surface of plant leaves using the system of the present invention.
- FIG. 4 illustrates an embodiment of the present system having a spacer fitted with a mirror.
- FIG. 5 illustrates scanning of a bottom surface of plant leaves using the system of Figure 4.
- FIGs. 6A-B illustrate an embodiment of the present system for manual (Figure 6A) and automatic (Figure 6B) scanning of plants.
- FIG. 7 is a flow chart outlining image capture and analysis using the present system.
- FIG. 8 is a flow chart outlining field scanning using the present system.
- FIG. 9A illustrates a manual embodiment of the present system utilized for image capture from leaves of a greenhouse plant.
- FIGs. 9B-F are images of plant leaves and pests as captured using the present system.
- FIGs. 10A-I are distribution maps of pests (Figures 10A-F) and pest eggs (Figures 10G-I).
- FIGs. 11A-C illustrate sensors, data collection and feature extraction according to one embodiment of the present invention.
- FIG. 12 is a flowchart illustrating performance evaluation according to one embodiment of the present invention.
- the present invention is of a system which can be used to identify pest infestations in crop plants as well as identify diseased or stressed plants. Specifically, the present invention can be used to image plant tissue in-situ in order to identify plant pests, plant diseases or assess the general state of a plant.
- the present inventors devised a system that includes an imaging/scanning spacer that maintains the imaged object within a preset focal plane of the imaging lens while the system is moved with respect to the object.
- Such a spacer enables rapid scanning of a plant, including the underside of leaves, and ensures that a large proportion of the images acquired during scanning of plant tissue are in-focus and thus optimized for pest and disease detection using object/pattern detection algorithms.
- in-situ imaging of plant tissue refers to in-field/in-greenhouse imaging of any portion of a whole plant. In other words, leaves, shoots, branches, fruit, flowers etc. forming a part of a whole plant are imaged directly in the field or greenhouse.
- Imaging of plant tissue is effected in order to identify plant pests such as Diaspididae (armored scales), Coccidae (soft scales), Pseudococcidae (mealybugs), Aleyrodidae (Whitefly), Aphidoidea (Aphids), Margarodidae (Icerya), Thysanoptera (Thrips), Lepidoptera, Tetranychidae (spider mites), leafminer, Hemiptera, Cecidomyiidae, Coleoptera and Gastropoda (snails & slugs), and plant diseases caused thereby or by microorganisms, or to assess a plant state (e.g. hydration state, nutrient state, etc.).
- the system of the present invention includes a housing for supporting a camera having a macro lens for near field imaging and a spacer configured for setting a focal distance between the imaging sensor and a portion of a plant.
- the camera can be a still or video camera suitable for capturing black and white, 3-color-channel or hyperspectral images in the visible range (400-800 nm) or shortwave infrared (0.8-1.8 µm), or 3D images.
- the camera can utilize a CMOS or CCD image sensor having a resolution of at least 5 MP and a frame rate of at least 25 FPS (in the case of a color video camera).
- the macro lens can include an imaging area of 100-2000 mm², an F# larger than 5, a focal length between 10-15 mm, a depth of field of 1-10 mm and a working distance between 2-5 cm.
- the camera lens can include a motorized stage and auto-focusing algorithms to correct/adjust focus if necessary.
- the system can include a memory device for storing captured still, time lapse and/or video images.
- the system can further include a device for positioning the distal surface of the spacer against the portion of the plant.
- such a device can be a manually operated boom or a remotely/autonomously operated vehicle (land or air).
- the spacer can be a box, cylindrical/conical or X-shaped frame having a proximal end attachable to the housing of the system or the housing of the lens and a distal end configured for contacting and moving against the plant tissue without disrupting or otherwise damaging the plant tissue.
- the spacer can be adjustable for length (setting focal length of lens) as well as field of view therethrough. While the area of imaging is set by the lens, the spacer can be configured for limiting the area of imaging if needed.
- Imaging through the spacer can be effected directly or indirectly via use of a mirror.
- the spacer can include a mirror that can be angled with respect to the plane of imaging of the lens and/or moved out of the plane of imaging to enable direct imaging of plant tissue.
- the mirror can be manually positioned (angled/moved) or it can be positioned via a motor/servo operatively attached thereto. In the latter configuration, positioning of the mirror can be effected locally (by actuating an on-board motor/servo controller) or remotely through wireless communication. Auto-focusing algorithms can also be used to automatically actuate the mirror.
- a mirror can, for example, facilitate imaging of a bottom surface of leaves, flowers or fruit.
- a spacer having a removable/positionable mirror can also be used to image a top surface of plant portions.
- the spacer can include a cover on a distal end thereof for optimizing scans with specific target leaves.
- the shape of the cover is designed to allow a smooth motion of the sensor within the foliage. For this purpose, first, a smooth coating is used to reduce the friction with the leaves. Second, to avoid damage to the leaves such as scratches or punctures, the shape of the cover is rounded and includes no corners or sharp edges.
- the present system can include an image processing algorithm for discarding out-of-focus images from a scan.
- the absolute values of the gradients along two orthogonal axes are calculated and summed for each image.
- Focused images present fine details (in the order of a single pixel) and therefore present high gradients. Setting a threshold for the absolute sum of the gradients is therefore an effective approach for separating focused and defocused frames.
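- A minimal sketch of this gradient-sum focus filter, assuming grayscale frames supplied as NumPy arrays; the function names and the threshold value are illustrative assumptions rather than values specified by the invention.

```python
import numpy as np

def focus_score(frame: np.ndarray) -> float:
    """Sum of absolute gradients along two orthogonal axes (sharper frames score higher)."""
    gy, gx = np.gradient(frame.astype(np.float32))
    return float(np.abs(gx).sum() + np.abs(gy).sum())

def keep_focused(frames, threshold=1.5e6):
    """Keep only frames whose gradient sum exceeds an (illustrative) threshold."""
    return [f for f in frames if focus_score(f) >= threshold]
```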
- the distal surface of the spacer is moved along the plant surface in order to obtain a series of images (video, time lapse stills) from the plant.
- the distal surface of the spacer is covered with a smooth surface and has a rounded shape with no sharp edges.
- the cover can also be configured for minimizing transfer of pests and pathogens from one plant to the next.
- the cover can be disposable/sterilizable and be configured for replacement following a pre-defined number of scans.
- the used cover can be sterilized by any standard procedure on the spot (e.g. alcohol) and be prepared for a new scan. A cartridge of sterilized covers assures a continuous scanning procedure. Both cover replacement and sterilization can be performed either manually or autonomously.
- the distal surface of the spacer can be fitted with contact/proximity sensors that can indicate contact (and optimal focus) as well as automatically switch imaging on and off thereby ensuring that only in-focus images are obtained.
- the present system can include a second camera having a high depth of field and zoom and auto-focus capabilities to provide a second image of the boom and plants.
- a camera can be used for imaging of the plant and for positioning, identifying and counting the plant parts imaged by the optical sensor.
- this camera can be used for additional purposes as well. It can provide images at different scales of the plant, leaves, fruits, flowers etc. Those can be used for pest and disease monitoring as well as for plant growth monitoring. For example, parameters such as the plant height and the number of buds, flowers and fruits can be extracted from the overview large-scale images.
- the additional camera can also be used as a vision camera to guide the automatic robotic arm operation. In addition, it will be used to image the inner part of flowers by aiming the camera directly inside the flower and zooming-in. This will allow scanning the flowers without contact and without using the optical sensor.
- the present system can also include a processing unit for processing images captured by one or more cameras and identifying out-of-focus images. Since object/pattern recognition is processor intensive and prone to errors, discarding out-of-focus images can substantially enhance the results obtained.
- the processing unit can be on-board one or both cameras or inside the housing of the system. Alternatively, such a processing unit can be positioned remotely (e.g. cloud server) in which case images can be communicated between the memory device of the camera(s) and the server via wireless communication.
- the present system can further include a light source positioned on the housing of the system or the spacer.
- the light source can include LED lights of one or more wavelengths.
- the light source can be used to illuminate the plant tissue with white or yellow light or with light of a specific wavelength that can enhance the contrast between the plant tissue and pests. For example, using blue or purple light enhances the surface of the leaves since it is strongly absorbed by the chlorophyll that is densely spread within the plant tissues. Green or white light is scattered from the entire cross section of the leaf and, as a result, the details of the surface are less prominent.
- the in-focus images obtained by the present system can be processed to identify objects/patterns representing pests or pathologies.
- the present system can include a processing unit executing object/pattern detection algorithms.
- One of several algorithms based on deep convolutional neural networks can be used by the present system to accomplish pest and pathology detection.
- Saliency Map and CNN - identify the region in the image that contains insects or diseases (for example by means of color contrast), then scale and process the resulting section with a DCNN classifier for the classification itself.
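- A hedged sketch of such a two-stage pipeline: a simple color-contrast saliency step proposes candidate regions, which are then rescaled and classified by a caller-supplied CNN. The OpenCV/PyTorch usage, the contrast heuristic and all names are assumptions for illustration and do not reproduce the specific DCNN used by the system.

```python
import numpy as np
import cv2      # candidate-region extraction
import torch    # CNN classification

def candidate_regions(bgr, min_area=200):
    """Mark pixels whose colour deviates strongly from the mean leaf colour (rough saliency)."""
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    dist = np.linalg.norm(lab - lab.reshape(-1, 3).mean(axis=0), axis=2)
    mask = (dist > dist.mean() + 2 * dist.std()).astype(np.uint8)
    n, _, stats, _ = cv2.connectedComponentsWithStats(mask)
    return [tuple(stats[i, :4]) for i in range(1, n) if stats[i, 4] >= min_area]  # (x, y, w, h)

def classify_regions(bgr, boxes, model, input_size=224):
    """Scale each candidate region and run it through a CNN classifier supplied by the caller."""
    crops = []
    for x, y, w, h in boxes:
        crop = cv2.resize(bgr[y:y + h, x:x + w], (input_size, input_size))
        crops.append(torch.from_numpy(crop[..., ::-1].copy()).permute(2, 0, 1).float() / 255.0)
    if not crops:
        return []
    with torch.no_grad():
        logits = model(torch.stack(crops))
    return logits.argmax(dim=1).tolist()  # one predicted class index per candidate region
```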
- the processing unit executing the above algorithm can be included within the housing of the present system or in a remote server communicating with the local components of the present system (e.g. memory device of the camera).
- imaging of plant tissue for the purpose of identifying pests and diseases can be conducted in order to ascertain the need for pesticides.
- the present system can also be used to selectively apply pesticides to plant regions infested with pests or affected by disease.
- the present system can also include a pesticide reservoir fluidly connected to a nozzle positioned near the distal surface of the spacer.
- a pump, either manual or electrical, keeps the pressure within the reservoir container higher than atmospheric pressure.
- a valve is opened to allow the pesticide to flow through the nozzle.
- the system is scanned along the plant to uniformly spread the pesticide. The total amount of pesticide applied on a plant is pre-determined by the specific crop, pests and pesticide.
- the valve can be opened during the pesticide scan whenever pests are detected.
- the present system can also include one or more sensors for tracking the performance of the system operator.
- sensors can include a boom-mounted camera that images the scanned leaves and various plant parts, a GPS unit and/or a 9-axis accelerometer.
- Such sensors can be used to track operator and boom position and movement to provide real-time feedback to the operator regarding data collection efficiency.
- the sensor data can also be used to tag images with location/position data (e.g., provide 'metadata' regarding images captured, such as position of boom and operator during capture, etc.).
- a smartphone can be used to provide the sensor data.
- the boom can include a smartphone mount and the system can interface with the smartphone sensors and camera to provide the aforementioned data.
- the camera and sensors can also be used to guide the operator through a predetermined scan path.
- the system can use the image and position/movement data to instruct the operator which plant to scan next and when to move on to the next plant.
- the system tracks the speed of the sensor as it moves along the various plant parts. If too fast, the images might be blurry and not enough images will be captured from each part of the plant. In such a case the system provides physical feedback in real-time (for example, turning on a light on the sensor or boom, playing a unique sound, or using an artificial speech engine) to instruct the scanner to slow down.
- the number of scanned leaves and the side (lower or upper side of the leaf) that was scanned is tracked by following the 9-axis sensor orientation. Once all the counters exceed the pre-determined thresholds and enough focused images have been captured from the various plant parts, the system notifies the scanner that he can continue to the next plant in the scanning program.
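- A minimal sketch of this per-plant completeness tracking; the plant-part labels and target counts are illustrative assumptions (only the 5 stem images echo an example given later in the text). In use, each captured frame would be attributed to a part (for example from the orientation heuristic sketched further below) and, once the plant is covered, the operator would be told to move on.

```python
from collections import Counter

# Illustrative per-plant targets; a rough stand-in for the pre-determined thresholds.
TARGETS = {"leaf_upper": 20, "leaf_lower": 20, "stem": 5, "flower": 1}

class PlantScanTracker:
    """Counts focused images per plant part and reports when the current plant is covered."""
    def __init__(self, targets=None):
        self.targets = dict(targets or TARGETS)
        self.counts = Counter()

    def add_image(self, part: str, in_focus: bool) -> None:
        if in_focus:
            self.counts[part] += 1

    def remaining(self) -> dict:
        """Plant parts that still need images, e.g. to drive per-part indicator lights."""
        return {p: t - self.counts[p] for p, t in self.targets.items() if self.counts[p] < t}

    def plant_done(self) -> bool:
        return not self.remaining()
```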
- the system can indicate what type of plant parts still need to be scanned.
- This can be in the form of several light sources located on the boom, each indicating a different plant part.
- in this way the scanner knows that he has scanned enough flowers on this plant and can continue scanning the remaining parts.
- the system can vocally announce (using a loudspeaker or earphones) the exact location of the next plant by indicating a row number and plant number along the row. While walking towards the next plant the 9-axis sensor is used for step counting and motion tracking.
- This information is merged with the GPS data and the map of the greenhouse (manually inserted into the SW tool) to estimate if the scanner indeed reached the correct plant before he starts scanning.
- the indication that the user is in the correct position can be in the form of a change in color of a light source on the sensor or boom stick or with an acoustical signal.
- One embodiment of the present system, referred to herein as system 10, is described below with reference to Figures 1-6B.
- Figure 1 is a side cutaway view of system 10 showing system housing 12, camera 14 including imaging sensor 16 and lens 18 and spacer 20.
- System 10 also includes an arm 22 connectable to a boom ( Figure 6A) or vehicle ( Figure 6B).
- Camera 14 is connected via a communications and power cable 24 to a power source (battery) and a communication module (configured for local and/or remote communication of images acquired by camera 14).
- Imaging sensor 16 can be a rolling shutter CMOS with 2048×3072 pixels and a pixel size of 2.4 µm.
- An example of camera 14 that can be used with the present system includes the IMX178 by Sony Inc.
- Spacer 20 is attached to housing 12 around lens 18 with a distal surface 34 thereof configured for contacting a surface of the plant. Spacer 20 sets a focal region within the DoF of lens 18 with respect to the focal plane (FP) thereof.
- Spacer 20 can be configured as an open box fabricated from finned/ribbed aluminum for dissipating heat from camera 14 (heat sink).
- Housing 12 can include light sources 30 (e.g. 2-10 LEDs) positioned such that beams (B) outputted therefrom converge within the focal region, i.e. light sources 30 are angled inward (at 15-50 degrees) such that the light beams outputted therefrom converge at the focal plane.
- Figures 2A-C illustrate positioning of system 10 against leaves of a plant and a scanning motion that can be employed to capture a series of images from the surfaces of the leaves.
- As shown in Figure 2B, when the distal surface of spacer 20 contacts a leaf surface, the light beams projected from light sources 30 converge within the DoF region of lens 18. This ensures maximal lighting for in-focus imaging of any region on the leaf surface and can also be used to discern in-focus images from out-of-focus images based on a lighting level.
- Figures 3A-C illustrate imaging of an underside (bottom surface) of leaves. Similar to that shown in Figures 2A-C, contacting spacer 20 with the leaf surface ensures that lens 18 is in focus and lighting is maximized.
- Spacer 20 can include one or more contact sensors 36 for indicating contact with a leaf surface.
- such a sensor can be a proximity sensor based on a photo-diode coupled to a light source or a sensitive mechanical pressure sensor.
- Sensor 36 can provide an indication to a user (open loop) or control image acquisition (closed loop) by switching image capture on and off according to contact sensed.
- FIGs 4 and 5 illustrate an embodiment of spacer 20 having a mirror 40.
- Mirror 40 can be a 10×15 mm area mirror fitted on a movable arm 42 connected to spacer 20.
- Mirror 40 can be angled (A) with respect to the imaging plane (IP) of lens 18 and can be completely retracted out of the way (arrow R) when direct imaging is desired.
- mirror 40 facilitates imaging of a bottom surface of plant tissue (leaves, flowers, fruit).
- Figures 6A-B illustrate manual ( Figure 6A) and automatic ( Figure 6B) configurations of system 10.
- the optical scanner is installed on a 0.5-1.5 m long adjustable boom to allow the operator to reach the different parts of the plant from a single position.
- the angle of the sensor with respect to the boom is variable and allows for adaptation to each crop and the angles of its leaves with respect to the horizon.
- the operator scans the plant by performing motions as described in Figures 2A-3C to scan the leaves of each plant.
- the operator can scan other parts of the plant such as flowers, fruits and branches by placing the sensor next to it.
- the scanner can be installed on a robotic arm on either a remote-controlled or an autonomous vehicle. In this configuration the autonomous vehicle drives along the crop lines and stops or slows down near each plant that is to be scanned.
- Scanning of leaves is performed by the robotic arm by mimicking the motions of the human operator as described in Figures 2A-3C.
- the robotic arm should be directed with specific orientation and position.
- a regular and/or a 3D camera is installed on the system and the robotic arm, and algorithms applied to the video feed detect the target part of the plant.
- the orientation and position of the target with respect to the system is estimated and the robotic arm is automatically moved to image the target from a close distance and with an optimized orientation (perpendicular to the target surface).
- a closed feedback loop can be applied between the detection algorithms activated on the robotic arm camera and the positioning of the arm. This improves the accuracy of positioning and orientation in real-time as the arm gets closer to the target.
- system 10 employs local or remotely executed algorithms for processing images captured thereby in order to identify pests, pathologies and assess the state of a plant.
- a sorting algorithm is applied (box B) based on the contents of the image. Only if the image contains a part of a plant and is at a sufficient focus level does the processing continue. Otherwise, the image is deleted (box C).
- Detection algorithms are applied on the focused plant images (box D) to detect pests and diseases. In cases where the detection confidence reaches a pre-defined threshold the outputs are saved to a database (box E) together with the relevant meta-data: capture time, position in the field, position on the plant (based on the current height of the robotic arm) and any other relevant data that is collected by additional sensors in the system (temperature, humidity etc.).
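- As an illustration only, a confident detection might be stored as a small record like the following; the field names are hypothetical and merely mirror the metadata listed above.

```python
import datetime

def detection_record(pest_class, confidence, field_position, arm_height_m, extra=None):
    """Sketch of the metadata saved with each detection that passes the confidence threshold."""
    return {
        "pest_class": pest_class,            # e.g. "whitefly"
        "confidence": confidence,
        "capture_time": datetime.datetime.utcnow().isoformat(),
        "field_position": field_position,    # e.g. (row index, plant index) or a GPS fix
        "plant_position_m": arm_height_m,    # current height of the robotic arm
        **(extra or {}),                     # temperature, humidity, etc.
    }
```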
- the images are saved for post processing analysis (box F).
- the detection outputs are sent to a server over wireless communication in real-time to present the grower the detection report (box G).
- the report can be generated off-line when the scanning is completed and the system returned with the saved data.
- Images that include undetected pests go through a manual detection process by experts. After the detected pests and diseases are marked, the detection results are added to the database and the marked images are added to the image database for the purpose of training the neural-network algorithms, allowing automatic detection of the new pests in future scans.
- a scanning program needs to be defined prior to the scan that includes a density of scanned plants in each row, the speed of the scanning motion (in autonomous systems), the duration of scan of a single plant, and the specific rows to be scanned.
- the scanning process begins with the first plant of the first row (box B).
- the system will skip the pre-defined number of plants to scan the next plant according to a predefined scanning list (box C) until it reaches the end of the line. Next it will move to the next line on the scanning list until it finishes following the entire scanning program (box D).
- at the end of the scan (box E), the detection list and the image database that was collected are copied for post-processing.
- Figures 11A-12 illustrate data collection, feature extraction and performance monitoring according to an embodiment of the present invention.
- the present invention can minimize this dependence on operator performance and enables scanning by untrained users.
- the present invention provides two features - real-time feedback (during the scan) to improve scanning and provide data for post-analysis evaluation and validation of the scanning process, and a comparison between expected and actual performance of the user. Both approaches can utilize various sensors that form a part of, or are utilized by, the present system (Figure 11A) and unique algorithms that are applied to the data in order to extract information and features (Figures 11B-C) that relate to the scan procedure.
- the extracted indications can cover various aspects of the scanning process as required to assure successful results.
- the required spatial resolution should be finer than the distance between adjacent plants, typically less than 1 meter. GPS systems typically do not provide such accuracy and, as such, additional sensors such as those listed in Figure 11A can also be utilized.
- Analysis of sensor information can be completely automatic. Motion-related information is extracted from the 9-axis sensor. Data from the angular velocity vector can provide an accurate measure for step counting. Applying a Fast Fourier Transform (FFT) on the 3D components of the angular velocity provides information about the motion status of the user. In a walking state, large motions take place periodically; this is indicated in the data by an increase of the weights of specific frequencies with respect to the background level. Detecting the frequency components that reach a pre-defined threshold and comparing their values with a look-up table can provide a good estimation of the walking rate. The look-up table can be generated based on data collected from a wide group of scanners while walking at a pre-defined rate.
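- A rough sketch of this FFT-based walking-rate estimate, assuming a window of gyroscope samples from the 9-axis sensor; the cadence band, power threshold and default step length are illustrative assumptions.

```python
import numpy as np

def step_rate_hz(gyro_xyz, fs_hz, min_power=None):
    """Dominant cadence frequency of the angular-velocity magnitude within a walking band."""
    mag = np.linalg.norm(np.asarray(gyro_xyz, dtype=float), axis=1)
    mag -= mag.mean()                              # remove the DC background level
    spectrum = np.abs(np.fft.rfft(mag))
    freqs = np.fft.rfftfreq(len(mag), d=1.0 / fs_hz)
    band = (freqs > 0.5) & (freqs < 3.0)           # plausible stepping frequencies
    if not band.any():
        return 0.0
    if min_power is not None and spectrum[band].max() < min_power:
        return 0.0                                 # below threshold: user assumed stationary
    return float(freqs[band][np.argmax(spectrum[band])])

def walked_distance_m(step_rates_hz, window_s, step_length_m=0.7):
    """Integrate stepping rate over time; step length is measured or calibrated per user."""
    return sum(r * window_s for r in step_rates_hz) * step_length_m
```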
- Another alternative is to calibrate the system per each scanner and provide more accurate step counting values. Integration by time over the stepping rate can provide an estimation for the walking distance between two points.
- the distance calculation requires the average step size of the scanner. This can be entered into the system manually (after a measurement) or automatically by letting the system count the total number of steps along a pre-defined distance.
- the direction of advancement can be estimated from the compass coordinates of the 9-axis sensor. This provides a 2D vector of the scanner's advancement between two points in time and allows a map of his route to be drawn and each scanned plant to be positioned on it. To further increase the accuracy of the distance, a GPS signal is used.
- Since the accuracy of the GPS coordinates is relatively low, it is not used for small movements but for the end points of each part of the scan; for example, it can be used to estimate the vector between the two end points of a given row.
- the 2D coordinates are positioned on the digital map of the crops. The constraints of the map help to further increase the accuracy of the position coordinates. For example, if the crop rows are all in the same length, then the end-to-end distances of all the scanned rows will be similar. This can be used to correct data errors that might result from varying step size between different rows by normalizing the variations.
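- A small sketch of how the known row length from the digital map could be used to normalize step-size errors along a row; the function and its inputs are assumptions for illustration.

```python
def normalize_row_positions(positions_m, mapped_row_length_m):
    """Rescale step-count-derived positions so the row end matches the mapped row length.

    positions_m: cumulative distances (in meters) of scanned plants along one row,
    with the last entry corresponding to the end of the row."""
    if not positions_m or positions_m[-1] <= 0:
        return list(positions_m)
    scale = mapped_row_length_m / positions_m[-1]
    return [p * scale for p in positions_m]

# Example: step counting gave 27.1 m for a 30 m row -> every position is stretched by 30/27.1.
```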
- markers can be installed in pre-defined locations along the crops. These can be a ring around the plant itself attached to a numbered sign, a color-coded sign, etc. Reading the signs can be performed with one of the cameras of the system after pressing a dedicated button that informs the system to save this image separately from the plant scanning data. Automatic detection of the imaged signs by post-processing image analysis, and locating their positions on the map based on their metadata, can be used for accurate normalization of step-size variation along a single row and for correcting orientation errors of compass and GPS coordinates. The more signs that are scattered in the field, the higher the accuracy of the scanning map. In the extreme case of marking every plant to be scanned in the program, this approach provides 100% accuracy in positioning.
- the orientation of the optical head during image capture can be extracted from the 9-axis accelerometer coordinates.
- the dominant contribution to the acceleration vector comes from the gravity force. Therefore, the orientation of the acceleration vector can provide a good estimation of the orientation of the optical sensor.
- This metadata is used in real-time to detect what part of the plant is scanned. For example, when the sensor scans the lower side of leaves it is oriented upwards, and vice versa, while when scanning the stem the sensor is oriented horizontally.
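- A possible reading of this orientation heuristic, assuming the accelerometer reading is gravity-dominated and that one axis is aligned with the optical axis of the sensor; the axis choice, thresholds and labels are assumptions. The returned label could feed the per-plant counters sketched earlier.

```python
import numpy as np

def scanned_part_from_accel(accel_xyz, optical_axis=2):
    """Classify the scanned plant part from the (gravity-dominated) accelerometer vector."""
    g = np.asarray(accel_xyz, dtype=float)
    g = g / np.linalg.norm(g)
    tilt = g[optical_axis]        # +1: optical axis pointing up, -1: down, ~0: horizontal
    if tilt > 0.5:
        return "leaf_lower"       # sensor aimed upwards -> underside of leaves
    if tilt < -0.5:
        return "leaf_upper"       # sensor aimed downwards -> top side of leaves
    return "stem"                 # roughly horizontal -> stem or other side-facing parts
```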
- Applying image analysis algorithms on the optical sensor images can also provide information about the scanned part of the plant.
- Color and reflection-based algorithms can be used to differentiate between both sides of leaves, detect flowers, stem, fruits etc.
- Object detection based on neural-network algorithms can provide an additional independent indication for the various objects. Detection of the scanned objects in real-time can be used to provide feedback to the scanner regarding the remaining objects to scan in a specific plant according to a pre-defined plan.
- Scanned objects detection can also be used in post-processing analysis to compare the expected and actual performance of the scanner and to qualitatively grade the quality of the scanning process and the collected data.
- An optical distance meter placed at the front of the optical sensor provides the position of the scanned object with respect to the camera. This can be used in real-time as a trigger for the camera. Once the object is located within a pre-defined distance range, determined by the position of the focal plane and the depth of field of the camera, the camera is triggered to capture images until the trigger is turned-off or until enough images of the objects are taken.
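- A minimal sketch of this distance-gated trigger; the 40 mm focal plane and 4 mm depth of field follow the example configuration given later in the text, while read_distance_mm() and camera.capture() are hypothetical placeholders.

```python
def in_capture_window(distance_mm, focal_plane_mm=40.0, dof_mm=4.0):
    """True while the measured target distance lies within the depth of field around the focal plane."""
    half = dof_mm / 2.0
    return (focal_plane_mm - half) <= distance_mm <= (focal_plane_mm + half)

# Trigger loop (placeholder APIs, shown as comments):
# while scanning:
#     if in_capture_window(read_distance_mm()):
#         camera.capture()          # keep capturing until the trigger turns off
```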
- the imaging trigger is independent of the distance meter.
- the object position is saved for post processing analysis of the captured images.
- the magnification variation with the distance from the camera can be normalized to increase the scaling accuracy of the imaged objects.
- Defocused images can be automatically deleted based on the distance from the camera to the imaged object, either in real-time or in post-processing.
- Another parameter that determines the quality of the captured images is the velocity of the camera during the exposure.
- Using a high power illumination and short exposure can reduce the sensitivity of the camera to motion up to a certain speed.
- Analyzing the amplitudes of the angular velocity and acceleration vectors can provide an indication for the magnitude of the sensor velocity.
- By setting threshold levels for the vector magnitudes, the system can indicate to the scanner in real-time whether he should slow down the sensor and/or repeat the last measurement.
- Figure 12 is a flowchart illustrating data collection and analysis according to an embodiment of the present invention.
- a scanning program with a uniform distribution and 7% coverage can be used to scan every 3rd row and every 6th plant along the row.
- Each plant scan includes a total of 70 images that include leaves from both sides, at least one flower (in case there are flowers on a specific plant), at least one fruit (in case there are fruits on a specific plant) and 5 images of the stem.
- the system analyzes the performance of the scanner. First, every image that is captured is analyzed in real-time for quality estimation. In case the image is too blurred or contains mainly background, it is immediately deleted. A real-time indication is provided to the scanner every time an image passes the quality filter so he can intuitively understand when he scans correctly with respect to speed and distance from the target.
- the speed of the sensor during image capturing is calculated in real-time and alerts the scanner to slow down once it reaches the threshold. If the speed is too high the system deletes the collected images automatically and the scanner is required to re-scan in order to reach the expected number of images.
- Another indicator for image quality is the distance from the target object. When an image is captured while the target object is not located in the vicinity of the focal plane the image is automatically deleted.
- the sensor orientation analysis allows the system to detect the side of the scanned leaf and to determine when the stem is scanned. In this example it is used to count at least 5 images of the stem. In other cases, it can be used to set a constraint on the required number of images from each side of the leaf.
- a feedback loop based on real-time performance evaluation increases the efficiency of the scan (i.e. the amount of high quality collected data per time unit);
- Image quality analysis in real-time ensures that a pre-defined number of images at sufficient quality is collected from each plant;
- the system can change the scanning plan according to a lookup table. For example, if mites are detected on a plant, increase the number of collected images by 3, reduce the step size between plants to 1 for the whole row and scan adjacent rows. This will ensure a full mapping of a potentially infested zone in real time (without the need to repeat the whole scan).
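- A sketch of such a lookup-table-driven plan adjustment, using the mite example above; the dictionary keys and the plan structure are assumptions for illustration.

```python
# Illustrative lookup table mirroring the mite example in the text.
PLAN_ADJUSTMENTS = {
    "mite": {"images_multiplier": 3, "plant_step": 1, "scan_adjacent_rows": True},
}

def adjust_plan(plan, detected_pest):
    """Return an updated scanning plan for the current row when a listed pest is detected."""
    rule = PLAN_ADJUSTMENTS.get(detected_pest)
    if rule is None:
        return plan
    updated = dict(plan)
    updated["images_per_plant"] = plan["images_per_plant"] * rule["images_multiplier"]
    updated["plant_step"] = rule["plant_step"]
    updated["scan_adjacent_rows"] = rule["scan_adjacent_rows"]
    return updated

# e.g. adjust_plan({"images_per_plant": 70, "plant_step": 6, "scan_adjacent_rows": False}, "mite")
```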
- the present invention provides a disease/pest detection system that addresses the challenges to efficient and rapid pest and disease detection in the field, including:
- Detection - pests are hard to detect and identify in the field with the naked eye or with a magnifying glass (they vary in size between ~30 microns and centimeters and can be found above or under the foliage and inside flowers).
- the present solution enables collection of high quality statistical data from the crops by either untrained employees or by an autonomous robotic system at lower costs and higher confidence than an expert.
- the collected data enables high confidence detection of pests and diseases and additional parameters regarding the physiological status of the crops.
- the collected data is tagged with meta-data (e.g., location in the field, location on the plant, time, weather conditions etc.) and detailed pests distribution maps and statistical data are generated thereby providing the grower and experts with real-time information that supports decision making and does not require a second validation in the field.
- the focal plane and the spacer were set to 40 mm from the sensor and provided a depth-of-field of 4 mm.
- Six white LEDs operating at 3 W each were used to illuminate the imaging plane.
- An edge detection algorithm was applied in real time on grabbed frames and was used to classify and save frames that include focused plant parts.
- the optical sensor was placed on a 1 m boom and was manually scanned over the leaves by a technician (Figure 9A).
- Another technician was used to pull a cart that carried the PC and power sources of the system. Image analysis and detection were performed after the scan was complete. A pepper greenhouse that includes eight 30 m long rows with 40 cm separation between plants was scanned. The scanning was performed on every 3rd plant and every 2nd row, such that 1/6 of the plants were scanned each time. Each plant was scanned for 15-30 s to cover between 5-10 randomly selected leaves.
- An example of a leaf image is shown in Figure 9B.
- the duration of the scan of the entire greenhouse was about 1 hour. After the scan the collected images, about 20K per scan, were classified by technicians to differentiate images that include pests from images that include only leaves. To minimize false detections, classification was performed in parallel by different technicians. Images that include pests were further examined by entomologists to determine the species. In addition, the entomologists visited the greenhouse to examine the pests in-situ and increase the certainty level of their detections. Once all the captured images were processed, a series of pest distribution maps was generated, where each map represents one species that was detected in the greenhouse. Representative images of a number of species that were detected are shown in Figures 9C-F. A whitefly is shown in Figure 9C, and two types of Thrips are shown in Figures 9D-E. Figure 9F shows an egg, potentially of a whitefly.
- Pest and disease monitoring is typically performed by an expert manually inspecting individual plants.
- the main challenge is to detect infestation as early as possible so as to control pest and disease and minimize potential crop damage.
- the performance of the present system was tested by an experiment performed on bell-pepper plants grown in a greenhouse.
- the greenhouse plot was scanned once a week for a period of 10 weeks.
- the scanning area was 2 du, with a coverage of around 7%, where every 6th plant was scanned in every 3rd row.
- Each plant scan included 70 images of flowers, both sides of leaves and fruits.
- the collected images were examined by agronomists who identified the different pests, beneficials and viruses, and marked their findings with image-annotation software.
- the findings were used for two purposes: first, to train the NN for detection of different pests, flowers etc., and second, to generate distribution maps (Figures 10A-I).
- Figures 10A-C present distribution maps of Thrips pests during three successive weeks.
- Figures 10D-F present distribution maps of the beneficial predator Swirskii mite and
- Figures 10G-I present distribution maps of eggs of Swirskii mite.
- Figures 10A-C present distribution maps of a scan during week 1 and Figures 10D-F represent scans during week 2.
- a similar map was generated per each pest that was detected.
Abstract
A system for in situ imaging of plant tissue and methods of using same for identifying plant pests and diseases are provided. The system includes a camera having a macro lens for near field imaging and a spacer configured for setting a focal distance between the macro lens and a portion of a plant.
Description
SYSTEM FOR IN-SITU IMAGING OF PLANT TISSUE
RELATED APPLICATIONS
This application claims the benefit of priority of U.S. Provisional Patent Application No. 62/687,257 filed on June 20, 2018 and U.S. Provisional Patent Application No. 62/834,419 filed on April 16, 2019, the contents of which are incorporated herein by reference in their entirety.
BACKGROUND
The present invention relates to a system for in-situ imaging of plant tissue and, more particularly, to an imaging system for monitoring crop plants, identifying pest infestations and diseases, as well as assessing the general state of a plant and crop.
Pest infestation and plant diseases can lead to significant damage to crop plants and loss of crop yield. Insects that infest plant species are particularly problematic in agriculture since major crop plants such as rice, cotton, soybean, potato and corn are particularly susceptible to insect infestations.
Pest infestation of crop plants is traditionally controlled through the use of chemical pesticides. However, these chemicals can be toxic to other species and can cause significant environmental damage especially when overused.
One approach for reducing use of pesticides in crops involves monitoring plants for pest infestation or disease and applying pesticides only when needed.
Such monitoring is typically carried out by physically examining selected plants in a field and applying pesticides only when pests are identified. Such monitoring is time-consuming and laborious; as such, approaches for identifying pest infestations via plant or pest-trap imaging have been developed.
Monitoring pest infestations using plant imaging requires high resolution, focused images, with large magnification of plant surfaces and monitoring of a significant portion of the field in order to obtain an accurate indication of the state of infestation.
There remains a need for plant imaging systems capable of in-situ imaging of plant tissue in a manner which enables accurate identification of pests infestations using image processing approaches.
SUMMARY
According to one aspect of the present invention there is provided a system for in situ imaging of plant tissue comprising: a camera having a macro lens for near field imaging; a spacer configured for setting a focal distance between the macro lens and a portion of a plant; and a device for positioning a distal surface of the spacer against the portion of the plant.
According to embodiments of the present invention the device is a manually operated boom.
According to embodiments of the present invention the device is an autonomous vehicle.
According to embodiments of the present invention the macro lens has a depth of field of 1-10 mm.
According to embodiments of the present invention the macro lens has an imaging area of 100-2000 mm².
According to embodiments of the present invention the spacer is attached to a housing of the macro lens.
According to embodiments of the present invention the spacer is a frame having a cylindrical shape.
According to embodiments of the present invention the camera is a video camera.
According to embodiments of the present invention the video camera has an imaging sensor of at least 5 MP and a frame rate of at least 25 FPS.
According to embodiments of the present invention the spacer includes a mirror.
According to embodiments of the present invention the spacer is adjustable for setting the focal distance between the macro lens and the portion of a plant and/or an imaging area of the macro lens.
According to embodiments of the present invention the system further comprises a processing unit for processing images captured by the camera and identifying out-of-focus images.
According to embodiments of the present invention the system further comprises a light source.
According to embodiments of the present invention the system further comprises a pesticide reservoir fluidly connected to a nozzle positioned near the distal surface of the spacer.
According to embodiments of the present invention the system further comprises a motorized stage for moving a focal plane of the macro lens.
According to embodiments of the present invention the system further comprises an auto focus algorithm for actuating the motorized stage.
According to embodiments of the present invention the camera is a spectral imaging camera.
According to embodiments of the present invention the spacer includes a cover.
According to embodiments of the present invention the cover is a net.
According to embodiments of the present invention the distal surface of the spacer includes contact or proximity sensors.
According to another aspect of the present invention there is provided a method of identifying pests on plant tissue comprising positioning a camera having a macro lens for near field imaging at a focal distance between the macro lens and a portion of a plant using a spacer; and capturing a series of images of the portion of the plant via the camera.
According to embodiments of the present invention the method further comprises discarding out-of-focus images in the series of images to obtain in-focus images of the portion of the plant.
According to embodiments of the present invention the method further comprises analyzing the in-focus images to identify the pests on the plant tissue.
According to embodiments of the present invention the portion of the plant is a leaf and further wherein the series of images are of an underside of the leaf.
According to yet another aspect of the present invention there is provided a method of assessing a state of plant tissue comprising positioning a camera having a macro lens for near field imaging at a focal distance between the macro lens and a portion of a plant using a spacer; and capturing a series of images of the portion of the plant via the camera.
According to embodiments of the present invention the method further comprises discarding out-of-focus images in the series of images to obtain in-focus images of the portion of the plant.
According to embodiments of the present invention the method further comprises analyzing the in-focus images to identify the state of the plant tissue.
According to embodiments of the present invention the state of plant tissue is a healthy state or a disease/stress state.
Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Although methods and materials similar or equivalent to those described herein can be used in the practice or testing of the present invention, suitable methods and materials are described below. In case of conflict, the patent specification, including definitions, will control.
In addition, the materials, methods, and examples are illustrative only and not intended to be limiting.
Implementation of the method and system of the present invention involves performing or completing selected tasks or steps manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of preferred embodiments of the method and system of the present invention, several selected steps could be implemented by hardware or by software on any operating system of any firmware or a combination thereof. For example, as hardware, selected steps of the invention could be implemented as a chip or a circuit. As software, selected steps of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In any case, selected steps of the method and system of the invention, such as, for example, image processing could be described as being performed by a data processor, such as a computing platform for executing a plurality of instructions.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
The invention is herein described, by way of example only, with reference to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only, and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the invention may be embodied in practice.
In the drawings:
FIG. 1 illustrates one embodiment of the system of the present invention.
FIGs. 2A-C illustrate scanning of a top surface of plant leaves using the system of the present invention.
FIGs. 3A-C illustrate scanning of a bottom surface of plant leaves using the system of the present invention.
FIG. 4 illustrates an embodiment of the present system having a spacer fitted with a mirror.
FIG. 5 illustrates scanning of a bottom surface of plant leaves using the system of Figure 4.
FIGs. 6A-B illustrate an embodiment of the present system for manual (Figure 6A) and automatic (Figure 6B) scanning of plants.
FIG. 7 is a flow chart outlining image capture and analysis using the present system.
FIG. 8 is a flow chart outlining field scanning using the present system.
FIG. 9A illustrates a manual embodiment of the present system utilized for image capture from leaves of a greenhouse plant.
FIGs. 9B-F are images of plant leaves and pests as captured using the present system.
FIGs. 10A-I are distribution maps of pests (Figures 10A-F) and pest eggs (Figures 10G-I).
FIGs. 11A-C illustrate sensors, data collection and feature extraction according to one embodiment of the present invention.
FIG. 12 is a flowchart illustrating performance evaluation according to one embodiment of the present invention.
DETAILED DESCRIPTION
The present invention is of a system which can be used to identify pest infestations in crop plants as well as identify diseased or stressed plants. Specifically, the present invention can be used to image plant tissue in-situ in order to identify plant pests, plant diseases or assess the general state of a plant.
The principles and operation of the present invention may be better understood with reference to the drawings and accompanying descriptions.
Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not limited in its application to the details set forth in the following description or exemplified by the Examples. The invention is capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.
Currently used plant monitoring approaches monitor the effects of pest infestations on plants in order to indirectly detect such infestations. Although approaches for in-situ imaging of plant pests have been proposed in the prior art, such approaches have not been commercially implemented.
While reducing the present invention to practice, the present inventors uncovered that in order to efficiently identify pests and diseases in plant tissue of crop plants a large proportion of the plant surface should be imaged and the images captured should be in-focus and highly magnified and of both sides of the plant leaves. Such functionality cannot be provided by
currently proposed imaging approaches possibly accounting for the lack of commercial implementation thereof.
In order to traverse these limitations of prior art in-situ imaging approaches, the present inventors devised a system that includes an imaging/scanning spacer that maintains the imaged object within a preset focal plane of the imaging lens while the system is moved with respect to the object.
Such a spacer enables rapid scanning of a plant including the underside of leaves and ensures that a large proportion of the images acquired during scanning of plant tissue are in-focus and thus optimized for pest and disease detection using object/pattern detection algorithms.
Thus, according to one aspect of the present invention there is provided a system for in situ imaging of plant tissue. As used herein, the phrase in-situ imaging of plant tissue refers to in-field/in-greenhouse imaging of any portion of a whole plant. In other words, leaves, shoots, branches, fruit, flowers etc. forming a part of a whole plant are imaged directly in the field or greenhouse.
Imaging of plant tissue is effected in order to identify plant pests such as Diaspididae (armored scales), Coccidae (soft scales), Pseudococcidae (mealybugs), Aleyrodidae (Whitefly), Aphidoidea (Aphids), Margarodidae (Icerya), Thysanoptera (Thrips), Lepidoptera, Tetranychidae (spider mites), leafminer, Hemiptera, Cecidomyiidae, Coleoptera and Gastropoda (snails & slugs) and plant diseases caused thereby or by microorganisms, as well as to assess a plant state (e.g. hydration state, nutrient state, etc.).
The system of the present invention includes a housing for supporting a camera having a macro lens for near field imaging and a spacer configured for setting a focal distance between the imaging sensor and a portion of a plant.
The camera can be a still or video camera suitable for capturing black and white, 3 color channel or hyperspectral images in the visible range (400-800 nm), shortwave infrared (0.8-1.8 µm) or 3D images.
The camera can utilize a CMOS or CCD image sensor having a resolution of at least 5 MP and a frame rate of at least 25 FPS (in the case of a color video camera).
The macro lens can include an imaging area of 100-2000 mm2, an F# larger than 5, focal length between 10-15 mm, a depth of field of 1-10 mm and a working distance between 2-5 cm.
The camera lens can include a motorized stage and auto-focusing algorithms to correct/adjust focus if necessary.
The system can include a memory device for storing captured still, time lapse and/or video images.
The system can further include a device for positioning the distal surface of the spacer against the portion of the plant. Such a device can be a manually operated boom or a remotely/autonomously operated vehicle (land or air).
The spacer can be a box, cylindrical/conical or X-shaped frame having a proximal end attachable to the housing of the system or the housing of the lens and a distal end configured for contacting and moving against the plant tissue without disrupting or otherwise damaging the plant tissue. The spacer can be adjustable for length (setting the focal distance of the lens) as well as the field of view therethrough. While the area of imaging is set by the lens, the spacer can be configured for limiting the area of imaging if needed.
Imaging through the spacer can be effected directly or indirectly via use of a mirror. In the latter case, the spacer can include a mirror that can be angled with respect to the plane of imaging of the lens and/or moved out of the plane of imaging to enable direct imaging of plant tissue.
The mirror can be manually positioned (angled/moved) or it can be positioned via a motor/servo operatively attached thereto. In the latter configuration, positioning of the mirror can be effected locally (by actuating an on-board motor/servo controller) or remotely through wireless communication. Auto-focusing algorithms can also be used to automatically actuate the mirror.
Use of a mirror can, for example, facilitate imaging of a bottom surface of leaves, flowers or fruit. A spacer having a removable/positionable mirror can also be used to image a top surface of plant portions.
The spacer can include a cover on a distal end thereof for optimizing scans with specific target leaves. The shape of the cover is designed to allow a smooth motion of the sensor within the foliage. For this purpose, first, a smooth coating is used to reduce the friction with the leaves. Second, to avoid damage to the leaves such as scratches or punches, the shape of the cover is rounded and includes no corners or sharp edges.
Although the use of a spacer can substantially maximize the proportion of in-focus images obtained from a scan, it will be appreciated that since plant surfaces are irregular in as far as distribution and position, a scan using the present system can still include out-of-focus images. Since processing of such images along with in-focus images can negatively affect object/pattern identification, the present system can include an image processing algorithm for discarding out-of-focus images from a scan. For this purpose, the absolute value of the gradients along two orthogonal axes are calculated and summed per each image. When there are no sharp details in the frame as a result of defocusing, gradients in the image will be relatively low. Focused images present fine details (in the order of a single pixel) and therefore present high gradients. Setting a threshold for the absolute sum of the gradients is therefore an effective approach for separating focused and defocused frames.
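The following is a minimal sketch of such a gradient-based focus filter; the threshold value and the function names are illustrative assumptions and would be tuned empirically for a given lens and sensor:

```python
import numpy as np

def focus_score(gray_frame: np.ndarray) -> float:
    """Sum of absolute gradients along the two orthogonal image axes."""
    frame = gray_frame.astype(np.float32)
    gy = np.abs(np.diff(frame, axis=0)).sum()  # vertical gradients
    gx = np.abs(np.diff(frame, axis=1)).sum()  # horizontal gradients
    return float(gx + gy)

def keep_frame(gray_frame: np.ndarray, threshold: float) -> bool:
    # Defocused frames lack single-pixel detail and score low; frames
    # below the empirically set threshold are discarded.
    return focus_score(gray_frame) >= threshold
```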
As is mentioned hereinabove, the distal surface of the spacer is moved along the plant surface in order to obtain a series of images (video, time lapse stills) from the plant. In order to minimize disruption or damage to the tissue when moved thereagainst, the distal surface of the spacer is covered with a smooth surface and has a rounded shape with no sharp edges. Furthermore it can also be configured for minimizing transfer of pests and pathogens from one plant to the next. For this purpose the cover can be disposable/sterilizable and be configured for replacement following a pre-defined number of scans. The used cover can be sterilized in any standard procedure on the spot (e.g. alcohol) and be prepared for a new scan. A cartridge of sterilized covers assures a continuous scanning procedure. Both cover replacement and sterilization can be performed either manually or autonomously.
In addition, the distal surface of the spacer can be fitted with contact/proximity sensors that can indicate contact (and optimal focus) as well as automatically switch imaging on and off thereby ensuring that only in-focus images are obtained.
The present system can include a second camera having a high depth of field and zoom and auto-focus capabilities to provide a second image of the boom and plants. In a manual system such a camera can be used for imaging of the plant and for positioning, identifying and counting the plant parts imaged by the optical sensor. When installed on a robotic arm of an autonomous system, this camera can be used for additional purposes as well. It can provide images at different scales of the plant, leaves, fruits, flowers etc. Those can be used for pest and disease monitoring as well as for plant growth monitoring. For example, parameters such as the plant height and the number of buds, flowers and fruits can be extracted from the overview large-scale images. The additional camera can also be used as a vision camera to guide the automatic robotic arm operation. In addition, it can be used to image the inner part of flowers by aiming the camera directly inside the flower and zooming-in. This allows scanning the flowers without contact and without using the optical sensor.
The present system can also include a processing unit for processing images captured by one or more cameras and identifying out-of-focus images. Since object/pattern recognition is processor intensive and prone to errors, discarding out-of-focus images can substantially enhance the results obtained. The processing unit can be on-board one or both cameras or inside the housing of the system. Alternatively, such a processing unit can be positioned remotely (e.g.
cloud server) in which case images can be communicated between the memory device of the camera(s) and the server via wireless communication.
The present system can further include a light source positioned on the housing of the system or the spacer. The light source can include LED lights of one or more wavelengths. The light source can be used to illuminate the plant tissue with white or yellow light or with light of a specific wavelength that can enhance a contrast between the plant tissue and pests. For example, using blue or purple light enhances the surface of the leaves since it is strongly absorbed by the chlorophyll that is densely spread within the plant tissues. Green or white light is scattered from the entire cross section of the leaf and as a result the details of the surface are less prominent.
The in-focus images obtained by the present system can be processed to identify objects/patterns representing pests or pathologies. To that end, the present system can include a processing unit executing object/pattern detection algorithms. One of several algorithms based on deep convolutional neural networks can be used by the present system to accomplish pest and pathology detection.
(i) Saliency Map and CNN - Identify the regions in the image that contain insects or diseases (for example by means of color contrast). Scale and process the resulting section with a DCNN classifier for the classification itself.
(ii) RCNN, Fast-RCNN - Pass the entire image (or part of it) through a region proposal network and then check each proposed region with a classifier (convolutional neural network).
(iii) SSD, YOLO - Single shot detectors have a set of pre-defined boxes to look for objects.
These algorithms require a large training set and can achieve high accuracy (above 90%). In order to detect the pests and diseases the neural network can be trained with annotated images of each species and inter-species. As a result, the weights of the network filters change and the network "learns" to detect this species, disease or state.
The processing unit executing the above algorithm can be included within the housing of the present system or in a remote server communicating with the local components of the present system (e.g. memory device of the camera).
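As a non-authoritative illustration of approach (i) above, the sketch below proposes candidate regions by simple color contrast against the dominant leaf color; the contrast threshold, the function names and the use of scipy connected-component labelling are assumptions, and the subsequent DCNN classification step is not shown:

```python
import numpy as np
from scipy import ndimage

def salient_regions(rgb: np.ndarray, thresh: float = 2.0):
    """Rough color-contrast saliency: pixels far from the mean leaf color
    (in units of per-channel standard deviation) become candidate regions."""
    img = rgb.astype(np.float32)
    mean = img.reshape(-1, 3).mean(axis=0)
    std = img.reshape(-1, 3).std(axis=0) + 1e-6
    contrast = np.linalg.norm((img - mean) / std, axis=2)
    mask = contrast > thresh
    labels, _ = ndimage.label(mask)          # connected candidate blobs
    return ndimage.find_objects(labels)      # bounding-box slices per blob

# Each returned crop would then be rescaled and passed to the trained
# DCNN classifier for the actual pest/disease decision.
```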
As is mentioned above, imaging of plant tissue for the purpose of identifying pests and diseases can be conducted in order to ascertain the need for pesticides.
Since the present system is utilized in the field it can also be used to selectively apply pesticides to plant regions infested with pests or affected by disease. To that end, the present system can also include a pesticide reservoir fluidly connected to a nozzle positioned near the distal surface of the spacer. A pump, either manual or electrical, keeps the pressure within the reservoir container higher than atmospheric pressure. Once pests are detected on a specific plant, a valve is opened to allow the pesticide to flow through the nozzle. In parallel, the system is scanned along the plant to uniformly spread the pesticide. The total amount of pesticide applied on a plant is pre-determined by the specific crop, pests and pesticide. As an alternative approach, the valve can be opened during the scan whenever pests are detected. This approach can reduce the total amount of pesticide used. On the other hand, since not all the leaves are necessarily scanned, there might be untreated pests left on the plant. However, if the scan is repeated frequently and in each iteration the scanned leaves are randomly selected, the chance that all pests will eventually be detected increases. A different solution is to provide the detection information to an external automatic pesticide system. The pesticide system can work with the detection system in an open or closed loop manner.
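A minimal sketch of the first, per-plant dosing scheme is given below; the valve interface, flow rate and dose values are hypothetical placeholders, since the text only specifies that a pre-determined amount is spread while the head is scanned along the plant:

```python
import time

class SpotSprayer:
    """Per-plant spot-spraying logic; `valve` is any object exposing
    open()/close() (a hypothetical hardware interface)."""

    def __init__(self, valve, flow_rate_ml_per_s: float, dose_ml: float):
        self.valve = valve
        self.flow_rate = flow_rate_ml_per_s  # calibrated nozzle flow rate
        self.dose = dose_ml                  # pre-determined amount per plant

    def treat_plant(self, pests_detected: bool) -> None:
        if not pests_detected:
            return
        self.valve.open()
        time.sleep(self.dose / self.flow_rate)  # spray while the head scans the plant
        self.valve.close()
```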
The present system can also include one or more sensors for tracking the performance of the system operator. Such sensors can include a boom-mounted camera that images the scanned leaves and various plant parts, a GPS unit and/or a 9-axis accelerometer. Such sensors can be used to track operator and boom position and movement to provide real-time feedback to the operator regarding data collection efficiency. The sensor data can also be used to tag images with location/position data (e.g., provide 'metadata' regarding images captured such as position of boom and operator during capture etc.).
Alternatively, a smartphone can be used to provide the sensor data. In such a configuration, the boom can include a smartphone mount and the system can interface with the smartphone sensors and camera to provide the aforementioned data.
The camera and sensors can also be used to guide the operator through a predetermined scan path. The system can use the image and position/movement data to instruct the operator which plant to scan next and when to move on to the next plant.
During a scan of a specific plant the system tracks the speed of the sensor as it moves along the various plant parts. If too fast, the images might be blurry and not enough images will be captured from each part of the plant. In such a case the system provides physical feedback (for example turning on a light on the sensor or boom, playing a unique sound or using an artificial speech engine to tell the scanner to slow down) in real-time to instruct the scanner to slow down. In addition, the number of scanned leaves and the side (lower or upper side of the leaf) that was scanned is tracked by following the 9-axis sensor orientation. Once all the counters exceed the pre-determined thresholds and enough focused images are captured from the various plant parts the system informs the scanner that he can continue to the next plant of the scanning program. Optionally, the system can indicate what type of plant parts still need to be scanned. This can be
in the form of several light sources located on the boom, each indicating a different plant part. For example, when the light indicating flowers is switched off, the scanner knows that he scanned enough flowers on this plant and he can continue scanning the remaining parts. To further help the scan and reduce the rate of human errors the system can vocally announce (using a loudspeaker or earphones) the exact location of the next plant by indicating a row number and plant number along the row. While walking towards the next plant the 9-axis sensor is used for step counting and motion tracking. This information is merged with the GPS data and the map of the greenhouse (manually inserted into the SW tool) to estimate if the scanner indeed reached the correct plant before he starts scanning. The indication that the user is in the correct position can be in the form of a change in color of a light source on the sensor or boom stick or with an acoustical signal.
The performance tracking feature of the present invention is further described hereinbelow with reference to Figures 11 A- 12.
The present system can provide a grower with the following advantages:
(i) Coverage of over 1% of the plants in the field or greenhouse with real-time detection of pest or disease (and feedback) at a fraction of the cost of experts.
(ii) Construction of prediction models and maps from data integrated from multiple sensors and multiple sites (and multiple times).
(iii) Automatic generation and presentation of interactive reports.
(iv) Operational suggestions based on proprietary prediction models.
(v) Methodological scouting (today every farmer and pest control expert scans differently).
(vi) Reduce dependency of the farmer on expert power (pest control specialists are both expensive and hard-to-find).
(vii) Fast integration of the system - an untrained employee can begin scanning following a short introduction to the system.
(viii) No infrastructure installations required.
(ix) Automatic monitoring system will both teach an operator how to scan on-the-go and will ensure that the collected data meets quality requirements.
(x) Automatic data digitization with no requirement for manual inputs.
One embodiment of the present system, referred to herein as system 10, is described below with reference to Figures 1-6B.
Referring now to the drawings, Figure 1 is a side cutaway view of system 10 showing system housing 12, camera 14 including imaging sensor 16 and lens 18 and spacer 20.
System 10 also includes an arm 22 connectable to a boom (Figure 6A) or vehicle (Figure 6B).
Camera 14 is connected via a communications and power cable 24 to a power source (battery) and a communication module (configured for local and/or remote communication of images acquired by camera 14).
Lens 18 can be a 12 mm focal length macro lens with F#=11, with an imaging area (AI) of 15X20 mm and a depth of field (DoF) of 4 mm. Lens 18 can be fitted on a stage 19 for autofocus function. Imaging sensor 16 can be a rolling shutter CMOS with 2048X3072 pixels and a pixel size of 2.4 µm. An example of a sensor suitable for camera 14 is the IMX178 by Sony Inc.
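As a quick back-of-envelope check (not part of the original disclosure), the object-space sampling implied by these example figures can be compared with the ~30 micron size of the smallest pests mentioned further below:

```python
# 15 x 20 mm imaging area mapped onto a 2048 x 3072 pixel sensor
field_mm = (15.0, 20.0)
pixels = (2048, 3072)

um_per_pixel = [1000.0 * f / p for f, p in zip(field_mm, pixels)]
print(um_per_pixel)               # roughly [7.3, 6.5] microns per pixel
print(30.0 / max(um_per_pixel))   # a ~30 micron pest spans about 4 pixels
```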
Spacer 20 is attached to housing 12 around lens 18 with a distal surface 34 thereof configured for contacting a surface of the plant. Spacer 20 sets a focal region within the DoF of lens 18 with respect to the focal plane (FP) thereof.
Spacer 20 can be configured as an open box fabricated from finned/ribbed aluminum for dissipating heat from camera 14 (heat sink). Housing 12 can include light sources 30 (e.g. 2-10 LEDs) positioned such that beams (B) outputted therefrom converge within the focal region, i.e. light sources 30 are angled inward (at 15-50 degrees) such that the light beams outputted therefrom converge at the focal plane.
Figures 2A-C illustrate positioning of system 10 against leaves of a plant and a scanning motion that can be employed to capture a series of images from the surfaces of the leaves. As is shown in Figure 2B, when the distal surface of spacer 20 contacts a leaf surface, the light beams projected from light sources 30 converge within the DoF region of lens 18. This ensures maximal lighting for in-focus imaging of any region on the leaf surface and can also be used to discern in-focus images from out-of-focus images based on a lighting level.
Figures 3A-C illustrate imaging of an underside (bottom surface) of leaves. Similar to that shown in Figures 2A-C, contacting spacer 20 with the leaf surface ensures that lens 18 is in focus and lighting is maximized.
Spacer 20 can include one or more contact sensors 36 for indicating contact with a leaf surface. Such sensors can be a proximity sensor based on a photo-diode coupled to a light source or a sensitive mechanical pressure sensor. Sensor 36 can provide an indication to a user (open loop) or control image acquisition (closed loop) by switching image capture on and off according to contact sensed.
Figures 4 and 5 illustrate an embodiment of spacer 20 having a mirror 40. Mirror 40 can be a 10X15 mm area mirror fitted on a movable arm 42 connected to spacer 20. Mirror 40 can be angled (A) with respect to the imaging plane (IP) of lens 18 and can be completely retracted out of the way (arrow R) when direct imaging is desired. As is shown in Figure 5, mirror 40 facilitates imaging of a bottom surface of plant tissue (leaves, flowers, fruit).
Figures 6A-B illustrate manual (Figure 6A) and automatic (Figure 6B) configurations of system 10.
The optical scanner is installed on a 0.5-1.5 m long adjustable boom to allow the operator to reach the different parts of the plant from a single position. The angle of the sensor with respect to the boom is variable and allows for adaptations per each crop and the angles of its leaves with respect to the horizon. The operator scans the plant by performing motions as described in Figures 2A-3C to scan the leaves of each plant. In addition the operator can scan other parts of the plant such as flowers, fruits and branches by placing the sensor next to them. Alternatively, the scanner can be installed on a robotic arm on either a remote-controlled or an autonomous vehicle. In this configuration the autonomous vehicle drives along the crop lines and stops or slows down near each plant that is to be scanned. Scanning of leaves is performed by the robotic arm by mimicking the motions of the human operator as described in Figures 2A-3C. In order to scan other parts of the plant the robotic arm should be directed with a specific orientation and position. For this purpose a regular and/or a 3D camera is installed on the system and the robotic arm, and algorithms applied to the video feed detect the target part of the plant. Once detected, the orientation and position of the target with respect to the system is estimated and the robotic arm is automatically moved to image the target from a close distance and with an optimized orientation (perpendicular to the target surface). To optimize the positioning of the arm, a closed feedback loop can be applied between the detection algorithms activated on the robotic arm camera and the positioning of the arm. This improves the accuracy of positioning and orientation in real-time as the arm gets closer to the target.
As is mentioned hereabove, system 10 employs local or remotely executed algorithms for processing images captured thereby in order to identify pests, pathologies and assess the state of a plant.
Image acquisition and processing as carried out by system 10 is described hereinbelow with reference to Figures 7 and 8.
While in imaging mode the system continuously captures frames, as illustrated in box A of the flowchart in Figure 7. Once a frame is grabbed a sorting algorithm is applied (box B) based on the contents of the image. Processing continues only if the image contains a part of a plant and is at a sufficient focus level. Otherwise, the image is deleted (box C). Detection algorithms are applied on the focused plant images (box D) to detect pests and diseases. In cases where the detection confidence reaches a pre-defined threshold the outputs are saved to a database (box E) together with the relevant meta-data: capture time, position in the field, position on the plant (based on the current height of the robotic arm) and any other relevant data that is collected by additional sensors in the system (temperature, humidity etc.). In cases where the detection confidence is lower than a pre-defined threshold the images are saved for post-processing analysis (box F). Optionally, the detection outputs are sent to a server over wireless communication in real-time to present the grower with the detection report (box G). Alternatively, the report can be generated off-line when the scanning is completed and the system has returned with the saved data.
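The loop of Figure 7 might be organized as in the sketch below; all callables, dictionary keys and the confidence threshold are illustrative placeholders supplied by the surrounding system (camera driver, focus filter, detector and storage), not names taken from the disclosure:

```python
def process_stream(grab_frame, is_focused_plant, detect,
                   save_detection, save_for_review, confidence_threshold=0.8):
    """Capture, sort, detect and store, following boxes A-F of Figure 7."""
    while True:
        frame, metadata = grab_frame()      # box A: frame + capture time, field position, arm height...
        if frame is None:
            break                           # end of the scan
        if not is_focused_plant(frame):     # box B/C: drop background or defocused frames
            continue
        for detection in detect(frame):     # box D: pest/disease detector
            if detection["confidence"] >= confidence_threshold:
                save_detection(detection, metadata)            # box E: database + meta-data
            else:
                save_for_review(frame, detection, metadata)    # box F: manual post-processing
```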
Images that include undetected pests go through a manual detection process by experts. After the detected pests and diseases are marked, the detection results are added to the database and the marked images are added to the image-database for the purpose of training the neural-network algorithms, allowing automatic detection of the new pests in future scans.
The scanning scheme of an entire field is outlined in Figure 8. A scanning program needs to be defined prior to the scan that includes a density of scanned plants in each row, the speed of the scanning motion (in autonomous systems), the duration of scan of a single plant, and the specific rows to be scanned. Following activation of the system (box A), the scanning process begins with the first plant of the first row (box B). The system will skip the pre-defined number of plants to scan the next plant according to a predefined scanning list (box C) until it reaches the end of the line. Next it will move to the next line on the scanning list until it finishes following the entire scanning program (box D). Last, when the system is returned to its position, the detections list and the image database that was collected are copied for post-processing (box E) and scanning report generation.
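A scanning program of this kind could be captured by a simple data structure such as the sketch below; the field names and the example values (every 3rd row, every 6th plant, matching the example described further below) are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class ScanProgram:
    rows: list              # indices of the rows to be scanned
    plant_step: int         # scan every n-th plant along a row
    seconds_per_plant: float

    def plants(self, plants_per_row: int):
        """Yield (row, plant) pairs in the order they are visited (boxes B-D of Figure 8)."""
        for row in self.rows:
            for plant in range(0, plants_per_row, self.plant_step):
                yield row, plant

# Example: every 3rd row, every 6th plant, ~30 s per plant.
program = ScanProgram(rows=list(range(0, 24, 3)), plant_step=6, seconds_per_plant=30.0)
```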
Figures 11 A- 12 illustrate data collection, feature extraction and performance monitoring according to an embodiment of the present invention.
When scanning a crop a user is typically instructed to follow a pre-defined scanning program that defines both the scanning procedure for each plant and the selected plants to be scanned. As a result, the quality and reliability of the outputs are strongly dependent on the user's performance.
The present invention can minimize this dependency on performance and enables scanning by untrained users.
To minimize the dependency on the user's performance, the present invention provides two features - real-time feedback (during the scan) to improve scanning, and data for post-analysis evaluation and validation of the scanning process, including a comparison between the expected and actual performance of the user. Both approaches can utilize various sensors that form a part of, or are utilized by, the present system (Figure 11A) and unique algorithms that are applied to data in order to extract information and features (Figures 11B-C) that relate to the scan procedure.
Feedback to the user can be simple and intuitive to allow comprehension and correction of mistakes. Data is processed and various indications are extracted during the scan (within 1 second from the first action) or at the end of the scan.
The extracted indications can cover various aspects of the scanning process as required to assure successful results.
The required spatial resolution should be lower than the distance between adjacent plants, typically lower than 1 meter. GPS systems typically do not provide such accuracy and as such, additional sensors such as those listed in Figure 11A can also be utilized.
Analysis of sensor information can be completely automatic. Motion related information is extracted from the 9-axis sensor. Data of the angular velocity vector can provide an accurate measure of step counting. Applying a Fast Fourier Transform (FFT) on the 3D components of the angular velocity provides information about the motion status of the user. In a walking state large motions take place periodically; this is indicated in the data by an increase in the weights of specific frequencies with respect to the background level. Detecting the frequency components that reach a pre-defined threshold and comparing their values with a look-up table can provide a good estimation of the walking rate. The look-up table can be generated based on data collected from a wide group of scanners while walking at a pre-defined rate. Another alternative is to calibrate the system per each scanner and provide more accurate step counting values. Integration of the stepping rate over time can provide an estimation of the walking distance between two points. The distance calculation requires the average step size of the scanner. This can be inserted into the system manually (after a measurement) or automatically by letting the system count the total number of steps along a pre-defined distance. The direction of advancement can be estimated by the compass coordinates of the 9-axis sensor. This provides a 2D vector of the scanner advancement between 2 points in time and makes it possible to draw a map of his route and to position each scanned plant on it. To further increase the accuracy of the distance, a GPS signal is used. Since the accuracy of the GPS coordinates is relatively low it is not used for small movements but for the end points of each part of the scan; for example, it can be used to estimate the vector between the two end points of a given row. In a post-processing analysis, the 2D coordinates are positioned on the digital map of the crops.
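A minimal sketch of the cadence-and-dead-reckoning part of this analysis is given below; the sampling rate, cadence band, power threshold and the use of a single dominant frequency in place of the look-up table are simplifying assumptions:

```python
import numpy as np

def walking_rate_hz(gyro_xyz: np.ndarray, fs_hz: float, power_threshold: float) -> float:
    """Estimate the stepping rate from (N, 3) angular-velocity samples.

    Returns the dominant gait frequency, or 0.0 if no spectral component
    in a plausible walking band exceeds the (assumed) power threshold."""
    magnitude = np.linalg.norm(gyro_xyz, axis=1)
    spectrum = np.abs(np.fft.rfft(magnitude - magnitude.mean()))
    freqs = np.fft.rfftfreq(len(magnitude), d=1.0 / fs_hz)
    band = (freqs > 0.5) & (freqs < 3.0)           # typical walking cadence range
    if not band.any() or spectrum[band].max() < power_threshold:
        return 0.0                                 # user is standing / scanning
    return float(freqs[band][np.argmax(spectrum[band])])

def displacement_m(rate_hz: float, duration_s: float, step_m: float, heading_rad: float):
    """Dead-reckoned 2D advancement vector between two points in time."""
    distance = rate_hz * duration_s * step_m
    return distance * np.cos(heading_rad), distance * np.sin(heading_rad)
```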
The constraints of the map help to further increase the accuracy of the position coordinates. For example, if the crop rows are all in the same length, then the end-to-end distances of all the scanned rows will be similar. This can be used to correct data errors that might result from varying step size between different rows by normalizing the variations.
Another useful tool for improving the position accuracy is adding additional constraints to the map. For example, different markers can be installed in pre-defined locations along the crops. It can be a ring around the plant itself attached to a numbered sign, a color-coded sign etc. Reading the signs can be performed with one of the cameras of the system after pressing a dedicated button that informs the system to save this image separately from the plants scanning data. Automatic detection of the imaged signs by post-processing image analysis and locating their positions on the map based on their meta-data can be used for accurate normalization of step size variation along a single row and for correcting orientation errors of compass and GPS coordinates. The more signs that are scattered on the field, the higher the accuracy of the scanning map. In the extreme case of signing every plant to be scanned in the program this approach provides a 100% accuracy in positioning.
The orientation of the optical head during image capture can be extracted from the 9-axis accelerometer coordinates. When stationary or during smooth motion the dominant contribution to the acceleration vector comes from the gravity force. Therefore, the orientation of the acceleration vector can provide a good estimation of the orientation of the optical sensor. This meta data is used in real-time to detect what part of the plant is scanned. For example, when the sensor scans the lower part of leaves it is oriented upwards and vice versa, while when scanning the stem the sensor is oriented horizontally.
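A sketch of such an orientation-based classification is shown below; the axis convention (optical axis aligned with the accelerometer z-axis), the tilt threshold and the sign convention are assumptions, since the actual mounting of the sensor is not specified:

```python
import numpy as np

def scanned_part(accel_xyz, optical_axis: int = 2, stem_tilt_deg: float = 60.0) -> str:
    """Classify the scanned plant part from the gravity direction."""
    g = np.asarray(accel_xyz, dtype=float)
    g = g / np.linalg.norm(g)
    tilt = np.degrees(np.arccos(abs(g[optical_axis])))  # angle between optical axis and vertical
    if tilt > stem_tilt_deg:
        return "stem"                                    # sensor roughly horizontal
    return "leaf underside" if g[optical_axis] > 0 else "leaf top side"
```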
Applying image analysis algorithms on the optical sensor images can also provide information about the scanned part of the plant. Color and reflection-based algorithms can be used to differentiate between both sides of leaves, detect flowers, stem, fruits etc. Object detection based on neural-network algorithms (Zhao et al. Journal of Latex Class Files, vol. 14, no. 8, March 2017) can provide an additional independent indication for the various objects. Detection of the scanned objects in real-time can be used to provide feedback to the scanner regarding the remaining objects to scan in a specific plant according to a pre-defined plan.
Scanned objects detection can also be used in post-processing analysis to compare the expected and actual performance of the scanner and to qualitatively grade the quality of the scanning process and the collected data. An optical distance meter placed at the front of the optical sensor provides the position of the scanned object with respect to the camera. This can be used in real-time as a trigger for the camera. Once the object is located within a pre-defined
distance range, determined by the position of the focal plane and the depth of field of the camera, the camera is triggered to capture images until the trigger is turned-off or until enough images of the objects are taken.
In an alternative operation scheme, the imaging trigger is independent of the distance meter. The object position is saved for post-processing analysis of the captured images. For example, the magnification variation with the distance from the camera can be normalized to increase the scaling accuracy of the imaged objects. Defocused images can be automatically deleted based on the distance from the camera to the imaged object, either in real-time or in post-processing.
Another parameter that determines the quality of the captured images is the velocity of the camera during the exposure. Using high power illumination and a short exposure can reduce the sensitivity of the camera to motion up to a certain speed. Analyzing the amplitudes of the angular velocity and acceleration vectors can provide an indication of the magnitude of the sensor velocity. By applying threshold levels to the vector magnitudes, the system can indicate to the scanner in real-time whether he should slow down the sensor and/or repeat the last measurement.
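This check reduces to a simple magnitude comparison, as in the sketch below; the two limits are assumed calibration values tied to the chosen illumination and exposure time:

```python
import numpy as np

def motion_ok(gyro_xyz, accel_xyz, gyro_limit: float, accel_limit: float) -> bool:
    """Return False ("slow down / repeat") when the sensor moves too fast for the exposure."""
    return bool(np.linalg.norm(gyro_xyz) < gyro_limit and
                np.linalg.norm(accel_xyz) < accel_limit)
```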
Figure 12 is a flowchart illustrating data collection and analysis according to an embodiment of the present invention.
A scanning program with a uniform distribution and 7% coverage can be used to scan every 3rd row and every 6th plant along the row. Each plant scan includes a total of 70 images that include leaves from both sides, at least one flower (in case there are flowers on a specific plant), at least one fruit (in case there are fruits on a specific plant) and 5 images of the stem. During the scan of a specific plant the system analyzes the performance of the scanner. First, every image that is captured is analyzed in real-time for quality estimation. In case the image is too blurred or contains mainly background it is immediately deleted. A real-time indication is provided to the scanner every time an image passes the quality filter so he can intuitively understand when he scans correctly with respect to speed and distance from the target. In addition, the speed of the sensor during image capturing is calculated in real-time and the system alerts the scanner to slow down once it reaches the threshold. If the speed is too high the system deletes the collected images automatically and the scanner is required to re-scan in order to reach the expected number of images. Another indicator for image quality is the distance from the target object. When an image is captured while the target object is not located in the vicinity of the focal plane the image is automatically deleted. The sensor orientation analysis allows the system to detect the side of the scanned leaf and to determine when the stem is scanned. In this example it is used to count at least 5 images of the stem. In other cases, it can be used to set a constraint on the required number of images from each side of the leaf. Once a plant scan is over the scanner starts walking towards the next plant in the program, as indicated to him by the system. Once he stops and starts scanning, his position is estimated and compared with the expected position according to the program. In case of large deviations in position the scanner is ordered to walk to a reference point, and then walk back to the scanned plant. A reference point can be either a beginning of a row or one of several marked points in the scanned area. After the scanning process the meta-data is analyzed to provide feedback to the scanner regarding his performance. For example, the average speed while scanning different parts of the plant is compared with optimal values. If it is shown that the scanner was scanning relatively fast, the highest-allowable scanning speed threshold can be decreased to force the scanner to slow down in the next scan. The above-described automatic user tracking and guiding system provides the following advantages to both the manual and autonomous configurations of the present system:
(i) Real-time feedback concerning the scanning performance of an employee enables collection of high quality data;
(ii) Post-analysis indications concerning the collection process and image quality improve the efficiency of real time scanning and future scans;
(iii) A feedback loop based on real-time performance evaluation increases the efficiency of the scan (i.e. the amount of high quality collected data per time unit);
(iv) Image quality analysis in real-time ensures that a pre-defined number of images at sufficient quality is collected from each plant;
(v) Based on detected objects in real-time the system can change the scanning plan according to a lookup table. For example, if mites are detected on a plant, increase the number of collected images by 3, reduce the step size between plants to 1 for the whole row and scan adjacent rows. This will ensure a full mapping of a potentially infested zone in real time (without the need to repeat the whole scan).
Thus, the present invention provides a disease/pest detection system that addresses the challenges to efficient and rapid pest and disease detection in the field, including:
(i) Detection - pests are hard to detect and identify in the field with the naked eye or with a magnifying glass (they vary in size between ~30 microns and centimeters and can be found above or under the foliage and inside flowers).
(ii) Data collection - reports filled manually by the employee in the field, either on paper or using a dedicated computer SW, require significant additional time to process.
(iii) Data analysis - experts typically perform sparse sampling of plants in the field and oftentimes base their decisions on data collected by untrained employees.
The present system traverses these limitations of prior art approaches and enables efficient and cost effective sampling of a large and representative number of plants with the ability to detect eggs and nymphs as well as other pest forms in real time while greatly enhancing data collection speed and efficiency.
The present solution enables collection of high quality statistical data from the crops by either untrained employees or by an autonomous robotic system at lower costs and higher confidence than an expert. The collected data enables high confidence detection of pests and diseases and additional parameters regarding the physiological status of the crops. The collected data is tagged with meta-data (e.g., location in the field, location on the plant, time, weather conditions etc.) and detailed pests distribution maps and statistical data are generated thereby providing the grower and experts with real-time information that supports decision making and does not require a second validation in the field.
Early detection and accurate pests distribution maps focus and reduce pesticide dispersion while feedback provided following pesticide treatment can further reduce subsequent pesticide use.
As used herein the term“about” refers to ± 10 %.
Additional objects, advantages, and novel features of the present invention will become apparent to one ordinarily skilled in the art upon examination of the following examples, which are not intended to be limiting.
EXAMPLES
Reference is now made to the following examples, which together with the above descriptions, illustrate the invention in a non-limiting fashion.
EXAMPLE 1
Proof-of-concept testing was performed with a prototype of the present system. A CMOS imaging sensor of 4208(h) x 3120 (v) pixels (AR1335 CMOS by On Semiconductor) was used with a near-field macro-lens of f=12.6 mm and F#=11. The focal plane and the spacer were set to 40 mm from the sensor and provided a depth-of-field of 4 mm. Six white LEDs operating at 3 W each were used to illuminate the imaging plane. An edge detection algorithm was applied in real time on grabbed frames and was used to classify and save frames that include focused plant parts. The optical sensor was placed on a 1 m boom and was manually scanned over the leaves by a technician (Figure 9A). Another technician was used to pull a cart that carried the PC and power sources of the system. Image analysis and detection was performed after the scan was complete.
A pepper greenhouse that includes eight 30 m long rows with 40 cm separation between plants was scanned. The scanning was performed on every 3rd plant and every 2nd row, such that 1/6 of the plants were scanned each time. Each plant was scanned for 15-30 s to cover between 5-10 randomly selected leaves.
An example of a leaf image is shown in Figure 9B. The duration of the scan of the entire greenhouse was about 1 hour. After the scan the collected images, about 20K per scan, were classified by technicians to differentiate images that include pests from images that include only leaves. To minimize false detections, classification was performed in parallel by different technicians. Images that include pests were further examined by entomologists to determine the species. In addition, the entomologists visited the greenhouse to examine the pests in-situ and increase the certainty level of their detections. Once all the captured images were processed, a series of pest distribution maps was generated where each map represents one species that was detected in the greenhouse. Representative images of a number of species that were detected are shown in Figures 9C-F. A whitefly is shown in Figure 9C, and two types of Thrips are shown in Figures 9D-E. Figure 9F shows an egg, potentially of a whitefly.
EXAMPLE 2
Pest and disease monitoring is typically performed by an expert manually inspecting individual plants. The main challenge is to detect infestation as early as possible so as to control pest and disease and minimize potential crop damage.
The performance of the present system was tested by an experiment performed on bell-pepper plants grown in a greenhouse. The greenhouse plot was scanned once a week for a period of 10 weeks. The scanning area was of 2du with a coverage of around 7% where every 6th plant was scanned in every 3rd row. Each plant scan included 70 images of flowers, both sides of leaves and fruits. The collected images were examined by agronomists who identified the different pests, beneficials and viruses, and marked their findings with image-annotation software. The findings annotated by the agronomists were used for two purposes (Figures 10A-I). First, to train the NN for detection of different pests, flowers etc. Second, the findings were collected from the images to generate a different distribution map for each pest or disease on a weekly basis. Figures 10A-C present distribution maps of Thrips pests during three successive weeks. Figures 10D-F present distribution maps of the beneficial predator Swirskii mite and Figures 10G-I present distribution maps of eggs of Swirskii mite. Figures 10A-C present distribution maps of a scan during week 1 and Figures 10D-F represent scans during week 2. A similar map was generated per each pest that was detected.
It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination.
Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims.
All publications, patents and patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual publication, patent or patent application was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention. To the extent that section headings are used, they should not be construed as necessarily limiting. In addition, any priority document(s) of this application is/are hereby incorporated herein by reference in its/their entirety.
Claims
1. A system for in situ imaging of plant tissue comprising:
(a) a camera having a macro lens for near field imaging;
(b) a spacer configured for setting a focal distance between said macro lens and a portion of a plant; and
(c) a device for positioning a distal surface of said spacer against said portion of said plant.
2. The system of claim 1, wherein said device is a manually operated boom.
3. The system of claim 1, wherein said device is an autonomous vehicle.
4. The system of claim 1, wherein said macro lens has a depth of field of 1-10 mm.
5. The system of claim 1, wherein said macro lens has an imaging area of 100-2000 mm2.
6. The system of claim 1, wherein said spacer is attached to a housing of said macro lens.
7. The system of claim 1, wherein said spacer is a frame having a cylindrical shape.
8. The system of claim 1, wherein said camera is a video camera.
9. The system of claim 8, wherein said video camera has an imaging sensor of at least 5 MP and a frame rate of at least 25 FPS.
10. The system of claim 1, wherein said spacer includes a mirror.
11. The system of claim 1, wherein said spacer is adjustable for setting said focal distance between said macro lens and said portion of a plant and/or an imaging area of said macro lens.
12. The system of claim 1, further comprising a processing unit for processing images captured by said camera and identifying out-of-focus images.
13. The system of claim 1, further comprising a light source.
14. The system of claim 1, further comprising a pesticide reservoir fluidly connected to a nozzle positioned near said distal surface of said spacer.
15. The system of claim 1, further comprising a motorized stage for moving a focal plane of said macro lens.
16. The system of claim 15, further comprising an auto-focus algorithm for actuating said motorized stage.
17. The system of claim 1, wherein said camera is a spectral imaging camera.
18. The system of claim 1, wherein said spacer includes a cover.
19. The system of claim 18, wherein said cover is a net.
20. The system of claim 1, wherein said distal surface of said spacer includes contact or proximity sensors.
21. The system of claim 2, further comprising location and/or movement tracking sensors.
22. The system of claim 21, further comprising an algorithm for:
(i) providing a user of the system with a plant scanning plan;
(ii) monitoring a scan procedure of said user.
23. The system of claim 22, wherein said algorithm can modify said plan based on movement of said user and/or pests detected during said scan.
24. A method of identifying pests on plant tissue comprising
(a) positioning a camera having a macro lens for near field imaging at a focal distance between said macro lens and a portion of a plant using a spacer; and
(b) capturing a series of images of said portion of said plant via said camera.
25. The method of claim 24, further comprising discarding out-of-focus images in said series of images to obtain in-focus images of said portion of said plant.
26. The method of claim 24, further comprising analyzing said in-focus images to identify the pests on the plant tissue.
27. The method of claim 24, wherein said portion of said plant is a leaf and further wherein said series of images are of an underside of said leaf.
28. The method of claim 24, further comprising repeating (a)-(b) for a plurality of plants in a field or greenhouse.
29. The method of claim 28, further comprising providing a user of said camera with instructions relating to an image capturing path through said field or greenhouse.
30. The method of claim 28, further comprising monitoring movement of said user and/or camera to determine if said path is correctly followed.
31. The method of claim 28, further comprising changing said path based on a quality of said captured images.
32. A method of assessing a state of plant tissue comprising
(a) positioning a camera having a macro lens for near field imaging at a focal distance between said macro lens and a portion of a plant using a spacer; and
(b) capturing a series of images of said portion of said plant via said camera.
33. The method of claim 31, further comprising discarding out-of-focus images in said series of images to obtain in-focus images of said portion of said plant.
34. The method of claim 33, further comprising analyzing said in-focus images to identify the state of the plant tissue.
35. The method of claim 32, wherein the state of plant tissue is a healthy state or a disease/stress state.
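
Claims 12, 25 and 33 recite identifying and discarding out-of-focus images, but do not fix a particular sharpness measure. A minimal illustrative sketch of one common choice, the variance of the Laplacian (Python with OpenCV; the metric and the threshold value are assumptions for illustration, not part of the disclosure):

```python
import cv2


def laplacian_sharpness(image_bgr) -> float:
    """Scalar sharpness score: variance of the Laplacian of the grayscale frame."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()


def keep_in_focus(frames, threshold: float = 100.0):
    """Discard frames whose sharpness falls below an empirically tuned threshold."""
    return [f for f in frames if laplacian_sharpness(f) >= threshold]
```

In practice the threshold would be tuned per macro lens, spacer length and illumination, so the value shown is only a placeholder.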
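Claims 15-16 recite a motorized stage actuated by an auto-focus algorithm without specifying the algorithm itself. One plausible sketch is a simple focus sweep that settles on the sharpest stage position; `move_stage_mm` and `grab_frame` are hypothetical driver callables introduced only for illustration:

```python
def autofocus(move_stage_mm, grab_frame, sharpness,
              start_mm=0.0, stop_mm=5.0, step_mm=0.25):
    """Sweep the focal plane through a range of stage positions and return
    (and move to) the position with the highest sharpness score.

    move_stage_mm(pos) and grab_frame() wrap the stage and camera drivers;
    sharpness(frame) is any scalar focus metric (e.g. Laplacian variance).
    """
    best_pos, best_score = start_mm, float("-inf")
    pos = start_mm
    while pos <= stop_mm + 1e-9:
        move_stage_mm(pos)
        score = sharpness(grab_frame())
        if score > best_score:
            best_pos, best_score = pos, score
        pos += step_mm
    move_stage_mm(best_pos)  # settle at the sharpest position found
    return best_pos
```

A coarse sweep followed by a finer sweep around the best position is a common refinement, but none of this is mandated by the claims.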
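Claims 22-23 and 29-31 recite providing a scanning plan and monitoring whether the user's movement follows it. A straightforward way to flag deviations, assuming the plan is a list of 2-D waypoints and positions come from the location/movement sensors (the tolerance value is arbitrary):

```python
import numpy as np


def off_path_distance(position_xy, planned_path_xy) -> float:
    """Distance from the current position to the nearest planned waypoint."""
    path = np.asarray(planned_path_xy, dtype=float)   # shape (N, 2)
    pos = np.asarray(position_xy, dtype=float)        # shape (2,)
    return float(np.min(np.linalg.norm(path - pos, axis=1)))


def monitor_scan(positions, planned_path_xy, tolerance=1.0):
    """Yield a warning for every logged position that strays from the plan."""
    for i, pos in enumerate(positions):
        d = off_path_distance(pos, planned_path_xy)
        if d > tolerance:
            yield f"sample {i}: {d:.2f} units off the planned path"
```

The same distance check can feed a re-planning step (e.g. inserting extra waypoints around a detected infestation), which is one way the plan modification of claim 23 might be realized.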
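Claims 26 and 34-35 recite analyzing the in-focus images to identify pests or a healthy versus disease/stress state, without committing to a specific classifier. As one illustrative possibility only (PyTorch; the architecture, input size and class count are assumptions):

```python
import torch
import torch.nn as nn


class TissueClassifier(nn.Module):
    """Tiny convolutional classifier: healthy vs. disease/stress tissue,
    or, with more classes, individual pest species."""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x):  # x: (batch, 3, H, W) in-focus image crops
        return self.head(self.features(x).flatten(1))


# Example: per-class logits for a single 224x224 in-focus crop.
logits = TissueClassifier(num_classes=2)(torch.randn(1, 3, 224, 224))
```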
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862687257P | 2018-06-20 | 2018-06-20 | |
US62/687,257 | 2018-06-20 | | |
US201962834419P | 2019-04-16 | 2019-04-16 | |
US62/834,419 | 2019-04-16 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019244156A1 (en) | 2019-12-26 |
Family
ID=68982953
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2019/050692 (WO2019244156A1) | System for in-situ imaging of plant tissue | 2018-06-20 | 2019-06-20 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2019244156A1 (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6014451A (en) * | 1997-10-17 | 2000-01-11 | Pioneer Hi-Bred International, Inc. | Remote imaging system for plant diagnosis |
US7286300B2 (en) * | 2005-04-25 | 2007-10-23 | Sony Corporation | Zoom lens and image pickup apparatus |
EP2028843B1 (en) * | 2007-08-21 | 2013-10-23 | Ricoh Company, Ltd. | Focusing device and imaging apparatus using the same |
US9235049B1 (en) * | 2012-07-31 | 2016-01-12 | Google Inc. | Fixed focus camera with lateral sharpness transfer |
US20160249951A1 (en) * | 2014-02-21 | 2016-09-01 | Warren R. Hultquist | Skin care methods, systems, and devices |
Non-Patent Citations (1)
Title |
---|
ZHAO ET AL.: "In-Field, In Situ, and In Vivo 3-Dimensional Elemental Mapping for Plant Tissue and Soil Analysis Using Laser-Induced Breakdown Spectroscopy", SENSORS, 2016, pages 1 - 13, XP055664887, Retrieved from the Internet <URL:https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5087548/pdf/sensors-16-01764.pdf> [retrieved on 20190818] *
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12239105B2 (en) | 2020-02-19 | 2025-03-04 | Faunaphotonics Agriculture & Environmental A/S | Method and apparatus for determining an index of insect biodiversity, an insect sensor and a system of insect sensors |
EP4108082A1 (en) * | 2021-06-21 | 2022-12-28 | FaunaPhotonics Agriculture & Environmental A/S | Apparatus and method for measuring insect activity |
WO2022268756A1 (en) * | 2021-06-21 | 2022-12-29 | Faunaphotonics Agriculture & Environmental A/S | Apparatus and method for measuring insect activity |
CN117876879A (en) * | 2024-03-11 | 2024-04-12 | 四川农业大学 | A kiwi flower recognition method based on fusion of spatial and frequency domain features |
CN117876879B (en) * | 2024-03-11 | 2024-05-07 | 四川农业大学 | A kiwi flower recognition method based on fusion of spatial and frequency domain features |
CN119164883A (en) * | 2024-09-24 | 2024-12-20 | 生态环境部南京环境科学研究所 | A mobile data acquisition device and chlorophyll fluorescence detection system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11849207B2 (en) | Inspection system for use in monitoring plants in plant growth areas | |
WO2019244156A1 (en) | System for in-situ imaging of plant tissue | |
Sujaritha et al. | Weed detecting robot in sugarcane fields using fuzzy real time classifier | |
EP2685811B1 (en) | System and method for three dimensional teat modeling for use with a milking system | |
CN111225854A (en) | drone | |
EP3769036B1 (en) | Method and system for extraction of statistical sample of moving fish | |
US11712032B2 (en) | Device to detect and exercise control over weeds applied on agricultural machinery | |
Gharakhani et al. | Integration and preliminary evaluation of a robotic cotton harvester prototype | |
Dobbs et al. | New directions in weed management and research using 3D imaging | |
KR20110115888A (en) | Pest control system and method | |
JP7068747B2 (en) | Computer system, crop growth support method and program | |
EP4108082A1 (en) | Apparatus and method for measuring insect activity | |
RU2695490C2 (en) | Method of agricultural lands monitoring | |
KR102499264B1 (en) | grasping harmful insect and harmful insect information management system | |
CN117841035A (en) | Crop straw grabbing and carrying robot path position control system | |
WO2024069631A1 (en) | Plant phenotyping | |
CN117148890A (en) | Temperature control system and method for applying infrared sensor to plant growth | |
JP7657737B2 (en) | System for determining the effect of active ingredients on diplotene, insects and other organisms in assay plates containing wells - Patents.com | |
WO2023037397A1 (en) | Dead fowl detection method | |
CN118446827B (en) | Pest control method for forestry tending | |
KR20180133610A (en) | Insect pest image acquisition method for insect pest prediction system of cash crops | |
Sun | A visual tracking system for honeybee 3D flight trajectory reconstruction and analysis | |
AU2007201452A1 (en) | Methods and apparatus for measuring geometrical parameters of foliage | |
CN119600570A (en) | Method and equipment for removing toxic grass based on computer vision | |
WO2023222594A1 (en) | Apparatus and method for detecting insects |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19822678; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 19822678; Country of ref document: EP; Kind code of ref document: A1 |