WO2021221704A1 - Crop management method and apparatus with autonomous vehicles - Google Patents
- Publication number
- WO2021221704A1 (PCT/US2020/039685)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- images
- sections
- crop growing
- crops
- farm
Classifications
- G06T7/0014—Biomedical image inspection using an image reference approach (image analysis; inspection of images)
- G06Q50/02—Agriculture; fishing; forestry; mining (ICT specially adapted for business processes of specific sectors)
- B64U2101/30—UAVs specially adapted for imaging, photography or videography
- B64U2101/40—UAVs specially adapted for agriculture or forestry operations
- G06T2207/10032—Satellite or aerial image; remote sensing (image acquisition modality)
- G06T2207/20081—Training; learning (special algorithmic details)
- G06T2207/20084—Artificial neural networks [ANN] (special algorithmic details)
- G06T2207/30181—Earth observation (subject of image)
- G06T2207/30188—Vegetation; agriculture (subject of image)
Definitions
- the present disclosure relates to the field of agriculture. More particularly, the present disclosure relates to a method and apparatus for managing the growing of crops, e.g., grain, fruit, vegetable, vines of a plurality of varietals, with the assistance of autonomous vehicles.
- Owning and operating a vineyard is a tough business. “To take on the challenge of running a winery, you need to be determined, fearless, and passionate about your craft — although owning a vineyard seems romantic, the wine-making business is a tough one.”
- these problems may include, but are not limited to, over or under irrigation, diseases (such as mildews and black rots), or pests (such as berry moth, Japanese beetles, and rose chafers). Further, these problems may vary from one varietal to another. And a vineyard typically grows vines of a plurality of varietals.
- FIG. 1 illustrates an overview of a system for managing growing crops of the present disclosure, in accordance with various embodiments.
- FIGS 2A-2C illustrate taking of aerial images and terrestrial images of crops in the example vineyard of Figure 1, with the assistance of autonomous vehicles, according to various embodiments.
- Figure 3 illustrates a process for managing growing crops with the assistance of autonomous vehicles, and components of the crop growing management system of Figure 1, according to various embodiments.
- Figures 4A-4B illustrate an example visual report to assist in managing growing crops, and an example user interface for interactively viewing the visual report, according to various embodiments.
- Figure 5 illustrates an example process for machine processing the aerial images taken with the assistance of autonomous vehicles, according to various embodiments.
- Figure 6 illustrates an example process for machine analyzing the aerial and/or terrestrial images, according to various embodiments.
- Figure 7 illustrates an example process for machine generating the visual report to assist in managing growing vines of a number of varietals, according to various embodiments.
- Figure 8 illustrates an example process for machine facilitating interactive viewing of the visual report, according to various embodiments.
- Figure 9 illustrates an example neural network suitable for use with present disclosure to analyze aerial and/or terrestrial images for crop growing anomalies, according to various embodiments.
- Figure 10 illustrates an example software component view of the crop growing management system of Figure 1, according to various embodiments.
- Figure 11 illustrates an example hardware component view of the hardware platform for the crop growing management system of Figure 1, according to various embodiments.
- Figure 12 illustrates a storage medium having instructions for practicing methods described with references to Figures 1-9, according to various embodiments.
- Figures 13A-13D illustrate a number of example seasonal NDVI profiles for a number of example crops grown in a number of sections of a number of example crop growing farms.
- Figure 14 illustrates a number of example mean seasonal NDVI profiles for a number of example crops grown in a number of sites of an example geographical region.
- a method for managing growing crops includes operating one or more unmanned aerial vehicles (UAV) to fly over a plurality of sections of a crop growing farm, such as a vineyard.
- the UAVs are fitted with a plurality of cameras equipped to generate images in a plurality of spectrums.
- the plurality of sections of the crop growing farm may grow crops of various types, e.g., a vegetable farm may grow various vegetables, an orchard may grow various fruits, a vineyard may grow vines of a plurality of varietals, and so forth.
- the method further includes taking a plurality of aerial images of the sections of the crop growing farm in the plurality of spectrums, using the plurality of cameras, while the UAVs are flying over the plurality of sections of the crop growing farm; storing the plurality of aerial images of a plurality of spectrums of the crops of various types being grown, e.g., the vines of the plurality of varietals being grown, in a computer readable storage medium (CRSM), and executing an analyzer on a computing system to machine analyze the plurality of aerial images for anomalies associated with growing the crops of various types, e.g., the vines of the plurality of varietals.
- the machine analysis takes into consideration topological information of the crop growing farm, e.g., the vineyard, as well as current planting information of the crop growing farm, e.g., the vineyard.
- the method includes operating one or more terrestrial robots to traverse the plurality of sections of the crop growing farm, such as a vineyard.
- the one or more terrestrial robots are fitted with one or more cameras equipped to generate images in visual spectrum.
- the method further includes taking a plurality of visual spectrum terrestrial images of the crops of various types, e.g., vines of a plurality of varietals, being grown in the plurality of sections of the crop growing farm, e.g., a vineyard, using the one or more cameras fitted on the one or more terrestrial robots, while the one or more terrestrial robots are traversing the plurality of sections of the crop growing farm, e.g., a vineyard; storing the plurality of visual spectrum terrestrial images of the vines of the plurality of varietals being grown, in the CRSM; and executing the analyzer on the computing system to machine analyze the plurality of visual spectrum terrestrial images for additional anomalies associated with growing the crops of various types, e.g., vines of a plurality of varietals, in the plurality of sections of the crop growing farm.
- the phrase “A and/or B” means (A), (B), or (A and B).
- the phrase “A, B, and/or C” means (A), (B), (C), (A and B), (A and C), (B and C), or (A, B and C).
- the description may use the phrases “in an embodiment,” or “in some embodiments,” which may each refer to one or more of the same or different embodiments.
- the terms “comprising,” “including,” “having,” and the like, as used with respect to embodiments of the present disclosure are synonymous.
- module may refer to, be part of, or include an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and/or memory (shared, dedicated, or group) that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
- ASIC Application Specific Integrated Circuit
- processor shared, dedicated, or group
- memory shared, dedicated, or group
- System 100 may be employed to manage growing crops of various types in various crop growing farms.
- crops growing that may be managed include, but are not limited to, growing of Hops, Kiwis, Apples, Berries, Cherries, Citrus Fruit, Avocado, Pears, Plums, Hemp, Cannabis, and so forth.
- crop growing farms include, but are not limited to, vegetable farms, orchards, vineyards, and so forth.
- varietals may include, but are not limited to, cabernet sauvignon, pinot noir, chardonnay, pinot gris, and so forth.
- examples of varietals may include Cannabis indica or Cannabis sativa species.
- system 100 for managing growing crops in a crop growing farm 102 may include one or more UAVs 104, and crop growing management system 110 having crop growing management software 120.
- system 100 may further include one or more terrestrial robots 106.
- one or more UAVs 104 are fitted with a plurality of cameras to capture aerial images in a plurality of spectrums.
- the UAVs 104 are operated to fly over crop growing farm, e.g., a vineyard, 102 and capture aerial images of various sections of crop growing farm, e.g., a vineyard, 102 in the plurality of spectrums.
- Different crop types, e.g., vegetables of different types, fruits of different types, or cannabis or vines of different varietals, may be grown in different sections of crop growing farm 102.
- the plurality of spectrums may include, but are not limited to, visual, red, near infrared, near red, and so forth.
- Crop growing management software 120 is configured to be executed on one or more computer processors of crop growing management system 110, to machine process the aerial images taken, and machine analyze the aerial images for anomalies associated with growing crops of various types, e.g., over or under irrigation, and generate a visual report with indications of the anomalies.
- the machine analysis and reporting take into consideration topological information of the crop growing farm, e.g., natural or man-made geographical features, like a pond, a building, and so forth, as well as current planting information, e.g., the crop types or varietals being grown, and where in crop growing farm 102 they are grown.
- each of the one or more terrestrial robots 106 is equipped with one or more cameras to capture terrestrial images in visual spectrum, in one or more directions.
- the one or more directions may include a left (up/straight ahead/down) outward looking direction, a right (up/straight ahead/down) outward looking direction, a forward (up/straight ahead/down) outward looking direction, and/or a backward (up/straight ahead/down) outward looking direction.
- the one or more terrestrial robots 106 are operated to traverse crop growing farm 102 and capture terrestrial images of the crops of various types or varietals being grown in the various sections of crop growing farm 102, in the visual spectrum.
- the images of the crops may show various aspects of the crops, e.g., for vines, the images may show various aspects of the vines, e.g., its grapes, its leaves, its roots, and so forth.
- Crop growing management software 120 is further configured to be executed on the one or more computer processors of crop growing management system 110, to machine process the terrestrial images taken, and machine analyze the terrestrial images of the crops for further anomalies associated with growing crops of a particular type or varietal, e.g., plant diseases and/or pest infestations, and generate a visual report with indications of the anomalies.
- the machine analysis and reporting additionally take into consideration phenology information of the crop types or varietals, at different growing stages.
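The phenology consideration above can be sketched as a simple baseline comparison. The profile values, varietal names, and tolerance below are illustrative assumptions, not values from the disclosure:

```python
# Illustrative phenology-aware anomaly check: compare a section's observed
# vegetation index with an expected value for that varietal at its current
# growth stage. All profile values are made up for illustration.
EXPECTED_NDVI = {
    ("pinot noir", "flowering"): 0.55,
    ("pinot noir", "veraison"): 0.70,
    ("chardonnay", "flowering"): 0.50,
}

def is_anomalous(varietal, stage, observed_ndvi, tolerance=0.15):
    """Flag a section when its observed index deviates too far from profile."""
    expected = EXPECTED_NDVI.get((varietal, stage))
    if expected is None:
        return False  # no profile for this varietal/stage; cannot judge
    return abs(observed_ndvi - expected) > tolerance
```

For example, a pinot noir section reading 0.30 at veraison would be flagged, while 0.68 would not.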
- a plurality of UAVs 104 are employed. Each UAV 104 is either successively or concurrently fitted with three (3) cameras for capturing aerial images in 3 spectrums: visual spectrum, red or near infrared spectrum, and near red spectrum.
- the 3 cameras include a red/green/blue (RGB) camera configured to capture and generate images in visual spectrum, a Normalized Difference Vegetation Index (NDVI) camera configured to capture and generate images in red or near infrared spectrum, and a Normalized Difference Red Edge (NDRE) camera equipped to capture and generate images in near red spectrum.
- one or more UAVs 104 may be fitted with one or more infrared thermal cameras to capture aerial thermal images of the various sections of vineyard 102.
- the plurality of UAVs 104 are operated to systematically fly over all sections of vineyard 102, or selectively fly over selected sections of vineyard 102, capturing and generating aerial images of all or selected sections of vineyard 102.
- the cameras have sufficient resolution and zoom power to allow each pixel of each aerial image to cover approximately 3-4 cm² of sectional area, with the UAVs 104 operated at 400 ft or below.
- each pixel of each aerial image may cover an area as small as 1 cm².
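The pixel-coverage figures above follow from the usual ground-sample-distance relation, GSD = pixel pitch × altitude / focal length. A minimal sketch, with assumed (illustrative) sensor parameters, not values from the disclosure:

```python
# Ground sample distance (GSD) sketch for the per-pixel coverage figures.
# Pixel pitch and focal length below are assumed, illustrative values.
def gsd_m_per_pixel(pixel_pitch_m, altitude_m, focal_length_m):
    """Ground distance covered by one image pixel, pinhole-camera model."""
    return pixel_pitch_m * altitude_m / focal_length_m

# Example: 2.4 um pixels, 15 mm lens, flying at ~120 m (about 400 ft).
gsd = gsd_m_per_pixel(2.4e-6, 120.0, 0.015)  # ~0.0192 m per pixel
area_cm2 = (gsd * 100.0) ** 2                # ~3.7 cm² per pixel
```

With these assumed parameters the per-pixel ground area lands in the 3-4 cm² range stated above; a longer lens or lower altitude shrinks it toward 1 cm².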
- a swarm of lightweight UAVs with fewer Federal Aviation Administration (FAA) operational restrictions, such as Dragonfly Drones with fewer altitude restrictions, may be employed.
- a Dragonfly Drone is a drone typically weighing less than 1 lb, available in multiple form factors, each less than 9” in diameter.
- the Dragonfly Drone may be incorporated with extremely high resolution near infrared (NIR), short wave infrared (SWIR), thermal, and/or RGB cameras, amongst a large variety of others.
- the Dragonfly drones are used together in coordinated swarms mapping together to rapidly take imagery and collect data from the air over large crop growing farms.
- the small form factor eliminates the current need to acquire an FAA license to fly the drones.
- the Dragonfly drones may also contain a magnetic contact for magnetic charging, and can return to a WIFI enabled base station for recharging, uploading of imagery, and downloading of flight plans.
- the concurrent employment of the near red spectrum images of the NDRE cameras provides certain information that complement the limitations of the red or infrared images of the NDVI cameras.
- the index of NDVI images is derived from reflectance values in red and near-infrared bands of the electromagnetic spectrum. Index values ranging from -1 to 1 indicate the instantaneous rate of photosynthesis of the crop of interest.
- NDVI is commonly thought of as an index of biomass, but a normal NDVI curve will decline towards the end of the growing season even when the amount of biomass is at peak levels. Therefore, NDVI can only be considered an index of photosynthetic rate, not the amount of foliage.
- NDVI is not sensitive to crops with high leaf area index (LAI) and tends to saturate at high LAI values.
- NDRE uses the red-edge portion of the spectrum. Since red-edge light is not absorbed as strongly as red light during photosynthesis, it can penetrate deeper into the crop canopy and thereby solves the issue of NDVI saturating at high LAI values. NDRE is also more sensitive to medium-high chlorophyll levels and can be used to map variability in fertilizer requirements or foliar nitrogen demand. Leaf chlorophyll and nitrogen levels do not necessarily correlate with soil nitrogen availability and should be ground-truthed with soil or tissue samples.
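Both indices discussed above are normalized band differences: NDVI = (NIR − Red)/(NIR + Red) and NDRE = (NIR − RedEdge)/(NIR + RedEdge). A minimal per-pixel sketch, with generic band names not tied to any particular camera's output format:

```python
import numpy as np

# Per-pixel vegetation indices from band reflectances in [0, 1].
# A small epsilon guards against division by zero on dark pixels.
def ndvi(nir, red):
    """(NIR - Red) / (NIR + Red): Normalized Difference Vegetation Index."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-12)

def ndre(nir, red_edge):
    """(NIR - RedEdge) / (NIR + RedEdge): Normalized Difference Red Edge."""
    nir, red_edge = np.asarray(nir, float), np.asarray(red_edge, float)
    return (nir - red_edge) / (nir + red_edge + 1e-12)
```

A healthy, photosynthesizing canopy reflects strongly in NIR and absorbs red, so NDVI approaches 1; bare soil or stressed canopy sits much lower.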
- a plurality of terrestrial robots 106 are employed. Each terrestrial robot 106 is fitted with at least two (2) cameras to capture and generate terrestrial images of the crops of different types or varietals, in visual spectrum, in at least 2 directions along the routes traversed by the terrestrial robots 106. Similarly, the plurality of terrestrial robots 106 are operated to systematically traverse the entire crop growing farm 102 or selectively traverse selected sections of crop growing farm 102, capturing and generating terrestrial images of all crops of all types or varietals or selected crops of selected types or varietals grown in all or selected sections of crop growing farm 102.
- each UAV 104/robot 106 includes computer readable storage medium (CRSM) 124/126 to temporarily store the aerial/terrestrial images captured/generated.
- the CRSM are removable, such as a universal serial bus (USB) drive, allowing the captured/generated aerial/terrestrial images to be removably transferred to CRSM of crop growing management system 110, via a compatible input/output (I/O) port on crop growing management system 110.
- each UAV 104/robot 106 may include an I/O port (not shown), e.g., a USB port, to allow the stored aerial/terrestrial images to be accessed and transferred to crop growing management system 110.
- each UAV 104/robot 106 may include a communication interface, e.g., WiFi or Cellular interface, to allow the stored aerial/terrestrial images to be wirelessly transferred to crop growing management system 110, via e.g., access point/base station 118.
- Access point/base station 118 may be communicatively coupled with crop growing management system 110 via one or more private or public networks 114, including e.g., the Internet.
- UAVs 104 and robots 106 may otherwise be any one of a number of such vehicles/devices known in the art.
- each UAV 104 may be a fixed-wing UAV, a tricopter, a quadcopter, a hexacopter, or a lightweight UAV.
- UAVs 104 may be lightweight UAVs that may operate above 400 ft. Similarly, terrestrial robots 106 may be wheeled, tracked, or screw-propelled.
- system 100 may further include various sensors 108 disposed at various locations in crop growing farm 102 to collect and generate various local environmental data.
- sensors 108 may include, but are not limited to, temperature sensors for sensing local temperature, humidity sensors for sensing local humidity level, moisture sensors for sensing moisture level in the soil, and so forth.
- sensors 108 may be equipped to provide the collected sensor data to crop growing management system 110 wirelessly, via access points/base station 118.
- some of these sensors may be disposed in UAVs 104 and/or terrestrial robots 106 to collect and generate various local environmental data as UAVs 104 and/or terrestrial robots 106 fly over or traverse crop growing farm 102.
- crop growing management software 120 further takes these local environmental data into consideration, when machine processing aerial images (and optionally, terrestrial images) to machine detect anomalies associated with growing crops of various types or varietals.
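One way such local environmental data might be folded into the analysis is sketched below; the thresholds, and the pairing of a weak vegetation index with soil-moisture readings to separate over- from under-irrigation, are illustrative assumptions, not the disclosure's method:

```python
# Hedged sketch: combine a section's vegetation index with its soil-moisture
# sensor reading to classify a likely irrigation anomaly. All thresholds
# are illustrative and would be tuned per crop type and soil.
def irrigation_status(section_ndvi, soil_moisture_pct,
                      low_ndvi=0.4, dry=20.0, wet=60.0):
    if section_ndvi >= low_ndvi:
        return "ok"                        # canopy looks healthy
    if soil_moisture_pct < dry:
        return "under-irrigation"          # stressed canopy, dry soil
    if soil_moisture_pct > wet:
        return "possible over-irrigation"  # stressed canopy, waterlogged soil
    return "investigate"                   # stressed canopy, other cause
```

The same pattern extends to temperature or humidity readings, each narrowing which anomaly a weak index most plausibly indicates.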
- sensors 108 may include imaging devices or cameras mounted at various fixtures of the crop growing farm, e.g., utility poles, or nettings covering the crops being grown, to capture overhead images of the corresponding sections of the crop growing farm (to complement or supplement the aerial images captured with the UAVs).
- netting coverings include netting typically used in farms to reduce wind, sun and bird effects on growing plants.
- the netting may be incorporated with extremely high resolution near infrared (NIR), short wave infrared (SWIR), thermal, and/or RGB cameras, amongst a large variety of others. Multiple cameras can be attached to the underside of nets, providing both real time and video imagery of plant growth.
- system 100 may further include remote servers 112 having repositories of environmental data applicable to crop growing farm 102, e.g., weather or environmental data services, with temperature, precipitation, air pollution data for the geographical region/area where crop growing farm 102 is located.
- Crop growing management system 110 may be communicatively coupled with remote servers 112 via one or more private or public networks 114, including e.g., the Internet.
- crop growing management software 120 further takes these regional environmental data into consideration, when machine processing and analyzing aerial images (and optionally, overhead and/or terrestrial images) to machine detect anomalies associated with growing crops of various types or varietals.
- crop growing management software 120 is further configured to support interactive viewing of the visual report by a user, via a user computing device 116.
- User computing device 116 may be coupled with crop growing management system 110 via one or more public and/or private networks 114, and/or access point/base station 118.
- crop growing management software 120 may facilitate interactive viewing of the visual report, via web services.
- user computing device 116 may access and interact with the visual report via a browser on user computing device 116.
- crop growing management software 120 may provide an agent e.g., an app, to be installed on user computing device 116 to access and interact with the visual report.
- user computing device 116 may be any one of a number of computing devices known in the art. Examples of such computing devices may include, but are not limited to, desktop computers, mobile phones, laptop computers, computing tablets, and so forth.
- access point/base stations 118 and network(s) 114 may be any one of a number of access points/base stations and/or networks known in the art.
- Networks 114 may include one or more private and/or public networks, such as the Internet, local area or wide area, having any number of gateways, routers, switches, and so forth.
- While UAVs 104, terrestrial robots 106, and crop growing management system 110 have been described so far, and will continue to be mainly described, as assisting in the management of growing crops of a plurality of types or varietals in a crop growing farm, the present disclosure is not so limited.
- System 100 may be practiced with UAVs 104, terrestrial robots 106, and crop growing management system 110 configured to service multiple crop growing farms growing crops of multiple types or varietals.
- Shown in Figure 2A is a composite aerial overhead image 202 of a crop growing farm, an example vineyard, 102, generated by combining or stitching together individual aerial overhead images 204 taken by one or more UAVs 104, while the one or more UAVs 104 fly over the crop growing farm, example vineyard, 102.
- the one or more UAVs 104 are operated to systematically fly over all sections of the crop growing farm, example vineyard, 102 at a selected altitude.
- each pixel of each aerial/overhead image covers an area of approximately 3-4 cm² or smaller.
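The per-pixel ground coverage follows from the standard photogrammetric ground-sample-distance relation, which is not stated in this disclosure; the sketch below, with illustrative parameter values (a 50 m altitude and typical small-UAV camera optics), shows how a 3-4 cm² per pixel footprint may arise.

```python
def pixel_footprint_cm2(altitude_m, focal_length_mm, pixel_pitch_um):
    """Approximate ground area covered by one pixel in a nadir aerial image.

    Uses the standard photogrammetric relation GSD = altitude * pixel_pitch /
    focal_length; the ground footprint of a pixel is then GSD squared.
    All parameter names and values here are illustrative assumptions,
    not taken from the disclosure.
    """
    altitude_cm = altitude_m * 100.0
    pitch_cm = pixel_pitch_um * 1e-4
    focal_cm = focal_length_mm * 0.1
    gsd_cm = altitude_cm * pitch_cm / focal_cm
    return gsd_cm * gsd_cm


# Example: ~50 m altitude, 8.8 mm focal length, 3.45 um pixel pitch
area = pixel_footprint_cm2(50, 8.8, 3.45)
```

Lowering the flight altitude or using a longer focal length shrinks the footprint further, consistent with the "or smaller" qualifier above.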
- the number of aerial/overhead images 204 shown in Figure 2A as being combined together to cover the crop growing farm, example vineyard, 102 is significantly less than the number of aerial/overhead images 204 combined/stitched together in real life. The number is reduced for ease of illustration and understanding, and thus is not to be read as limiting on the present disclosure.
- the different levels of grayness in Figure 2A correspond to different colors in real life in the composite NDVI or NDRE images combined/stitched together based on the individual NDVI/NDRE images taken, which in turn correspond to different conditions of the soil and/or different conditions of the crops (depending in part, on the crop types or varietals being grown).
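As a non-limiting illustration of how the per-pixel index values behind such NDVI/NDRE images may be computed from the multi-spectrum captures, a minimal sketch follows; the function names are the editor's, not from the disclosure.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index per pixel: (NIR - Red) / (NIR + Red).

    `eps` guards against division by zero on dark pixels.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

def ndre(nir, red_edge, eps=1e-9):
    """Normalized Difference Red Edge index: (NIR - RedEdge) / (NIR + RedEdge)."""
    nir = np.asarray(nir, dtype=float)
    red_edge = np.asarray(red_edge, dtype=float)
    return (nir - red_edge) / (nir + red_edge + eps)
```

Healthy vegetation reflects strongly in NIR and absorbs red, so higher index values (rendered as different colors in the composite images) generally indicate healthier crop or soil conditions.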
- Shown in Figures 2B-2C are example terrestrial images 220 and 230 of a couple of the example vines of a couple of varietals being grown in a couple of the sections in example vineyard 102, captured by the camera(s) of terrestrial robot(s) 106, while terrestrial robot(s) 106 is (are) operated to traverse various sections of example vineyard 102.
- disease (in this case, downy mildew)
- phenology information provided for various growing stages of the vine varietal
- infestation (in this case, sweetpotato weevil)
- Figures 2A and 2B are non-limiting examples. As those skilled in various crop growing arts would appreciate, there are many possible vine diseases and/or infestations that need to be managed for the various crops of various types/varietals.
- example crop growing management system 350 includes crop growing management software 320 and CRSM 330.
- Crop growing management system 350 may correspond to crop growing management system 110 of Figure 1
- crop growing management software 320 may correspond to crop growing management software 120 of Figure 1.
- Crop growing management software 320 includes image processor 322, analyzer 324, reporter 326 and interactive report reader 328.
- Image processor 322 is configured to process and combine/stitch together individual aerial/overhead images taken in various spectrums to form a plurality of composite aerial/overhead images 332 of the crop growing farm, e.g., a vineyard, in the various spectrums.
- Analyzer 324 is configured to machine process and analyze aerial/overhead images 302 and/or terrestrial images 304 to identify anomalies with growing crops of various types, e.g., vine of the various varietals in the various sections of the crop growing farm, e.g., the vineyard.
- analyzer 324 is configured to apply artificial intelligence, e.g., neural networks, to identify anomalies with growing crops of various types, e.g., vine of the various varietals, in the various sections of the crop growing farm, e.g., the vineyard.
- Reporter 326 is configured to machine generate one or more visual reports 336 of the crop growing farm, e.g., the vineyard, identifying/highlighting the anomalies associated with growing crops detected.
- visual reports 336 may be two dimensional (2D) reports. In other embodiments, visual reports 336 may be three dimensional (3D) reports or holograms.
- Interactive report reader 328 is configured to facilitate a user in interactively viewing the visual report.
- CRSM 330 is configured to store individual aerial/overhead images 302 of various sections of the crop growing farm, e.g., a vineyard, captured in various spectrums, as well as individual terrestrial images 304 of various crops of various types, e.g., vines of various varietal, captured in the visual spectrum.
- CRSM 330 is also configured to store topological information 306 of the crop growing farm, such as ponds, creeks, streams and so forth, as well as current planting information 308, i.e., the crop types/varietals planted, and where.
- CRSM 330 may also be configured to store phenology information 310 of the crop types/varietals, at different growing stages, and environmental data 312 collected by local sensors and/or received from remote servers.
- CRSM 330 may also be configured to store composite aerial/overhead images 332 generated from individual aerial/overhead images 302, analysis results 334 and reports 336.
- phenology information 310 of the crop types/varietals may include various profiles of the various crops types/varietals over a growing season for different growing sections of a crop growing farm.
- Figures 13a-13d illustrate various example profiles of various example crops types/varietals over example growing seasons for different growing sections of a number of example crop growing farms.
- Figure 13a illustrates example NDVI profiles 1300 for various example apples (of the same or different types/varietals) over an example growing season from end of March to end of October for various sections of an example apple orchard.
- Each line represents an example NDVI profile for the example apples grown in a section of the apple orchard.
- the vertical axis depicts the mean NDVI values, while the horizontal axis depicts the dates in the growing season.
- the alphanumeric identifiers of the profiles shown in the bottom of the Figure, identify the sections of the example apple orchard.
- Figure 13b illustrates example NDVI profiles 1310 for various example blueberries (of the same or different types/varietals) over an example growing season from end of March to end of October for various sections of an example blueberry orchard.
- Each line represents an example NDVI profile for the example blueberries grown in a section of the blueberry orchard.
- the vertical axis depicts the mean NDVI values, while the horizontal axis depicts the dates in the growing season.
- the alphanumeric identifiers of the profiles shown in the bottom of the Figure, identify the sections of the example blueberry orchard.
- Figures 13c and 13d respectively illustrate example NDVI profiles 1320 and 1330 for various example low productivity and high productivity hops (of the same or different types/varietals) over an example growing season from end of March to end of October for various sections of a couple of example hop growing farms.
- Each line represents an example mean NDVI profile for the example hops grown in a section of an example hop farm.
- the vertical axis depicts the mean NDVI values, while the horizontal axis depicts the dates in the growing season.
- the alphanumeric identifiers of the profiles shown in the bottom of the Figures, identify the sections of the example hop farms.
- Figure 14 illustrates a number of example mean seasonal NDVI profiles 1400 over an example growing season from mid-October to mid-March for a number of example crops grown in a number of sites of an example geographical region.
- the vertical axis depicts the mean NDVI values, while the horizontal axis depicts the dates in the growing season.
- the example crops include pears, kiwis, cherries, avocados and mandarin oranges.
- the profiles may span different growing seasons, as well as other profiles, such as but not limited to NDRE profiles, may be used instead or additionally.
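The mean seasonal profiles plotted in Figures 13a-14 can be assembled from dated per-section index readings; a minimal sketch follows, with the data layout and names chosen by the editor for illustration.

```python
from collections import defaultdict
from statistics import mean

def seasonal_profiles(observations):
    """Build mean NDVI profiles per farm section from dated per-image readings.

    `observations` is a list of (section_id, date, ndvi_value) tuples; the
    result maps each section to a date-ordered list of (date, mean NDVI)
    points, i.e., one profile line as in Figures 13a-13d. The tuple layout
    is an assumption for this sketch, not from the disclosure.
    """
    per_section = defaultdict(lambda: defaultdict(list))
    for section, date, value in observations:
        per_section[section][date].append(value)
    return {
        section: [(d, mean(vals)) for d, vals in sorted(by_date.items())]
        for section, by_date in per_section.items()
    }
```

Each returned list corresponds to one line in the figures: the horizontal axis is the date, the vertical axis the mean NDVI for that section on that date.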
- CRSM 330 may be any one of a number of CRSM known in the art including, but not limited to, non-volatile or persistent memory, magnetic or solid state disk drives, compact-disk read-only memory (CD-ROM), magnetic tape drives, and so forth.
- process 300 for managing growing of crops includes operations performed at stages A - E.
- the UAVs fitted with a plurality of cameras equipped to capture/generate aerial images 302 in a plurality of spectrums are operated, in succession or concurrently, to fly over all or selected sections of the vineyard.
- individual aerial images 302 of the various sections of the vineyard are taken as the UAVs fly over the sections at a selected altitude.
- the aerial images are taken at high resolution covering a small area of 3-4 cm² per pixel or smaller.
- individual aerial images 302 of the various sections of the vineyard are stored into CRSM 330.
- overhead images of selected sections of the crop growing farm may be taken with imaging devices/cameras mounted on nettings and/or fixtures in the sections.
- the one or more terrestrial robots fitted with one or more cameras equipped to capture/generate terrestrial images 304 in visual spectrum may also be optionally operated, in succession or concurrently, to traverse all or selected sections of the crop growing farm, e.g., a vineyard.
- individual terrestrial images 304 of various crops of various types or varietals being grown in the various sections of the crop growing farm are taken as the terrestrial robots traverse over the sections.
- individual terrestrial images 304 of various crops of various types or varietals being grown in the various sections of the crop growing farm are stored into CRSM 330.
- process 300 may proceed to stages B and C in parallel.
- image processor 322 may be executed on a computer system to machine process and combine/stitch together individual aerial/overhead images taken in various spectrums (taken by the UAVs and/or stationary mounted cameras) to form a plurality of composite aerial/overhead images 332 of the crop growing farm, in the various spectrums.
- composite aerial/overhead images 332 of the crop growing farm, in the various spectrums are stored into CRSM 330.
- analyzer 324 may be executed on the computer system to machine process individual aerial/overhead images 302 and/or individual terrestrial images 304 to machine analyze and detect anomalies associated with growing crops of the various types or varietals, taking into consideration topological information of the crop growing farm, and current planting information of the crop growing farm. Anomalies may include, but are not limited to, whether a section is under irrigated or over irrigated. By taking the topological information into consideration, the analyzer may avoid falsely identifying, e.g., a pond or a creek as over irrigated, or a section growing crops of particular types, e.g., vines of a particular varietal, being stressed as under irrigated.
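The role of the topological information can be sketched as a simple masking step before thresholding: water bodies known from the farm's topology are excluded before any pixel is flagged. The thresholds, grid layout, and names below are illustrative assumptions, not the analyzer's actual implementation.

```python
import numpy as np

# Illustrative thresholds on a hypothetical per-pixel soil-moisture estimate
# derived from a spectral image; these values are not from the disclosure.
DRY, WET = 0.25, 0.75

def irrigation_anomalies(moisture, water_mask):
    """Flag under-/over-irrigated pixels, excluding known water bodies.

    `water_mask` marks ponds/creeks from the farm's topological information;
    without it, a pond would be falsely flagged as over-irrigated.
    Returns (under_irrigated, over_irrigated) boolean grids.
    """
    moisture = np.asarray(moisture, dtype=float)
    water = np.asarray(water_mask, dtype=bool)
    under = (moisture < DRY) & ~water
    over = (moisture > WET) & ~water
    return under, over
```

A production analyzer would learn such decisions rather than threshold them, as the neural-network discussion later in this disclosure suggests; the sketch only shows why the topological mask prevents false identifications.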
- terrestrial images 304 are also being machine processed and analyzed to detect anomalies associated with growing crops of the various types or varietals
- the analysis may take into consideration phenology information of the various crops, at different growing stages.
- Anomalies may include, but are not limited to, various types of plant diseases and/or pest infestations.
- machine analysis of aerial/overhead images 302 as well as terrestrial images 304, to detect anomalies associated with growing crops of various types or varietals may further take into consideration local environmental data 312 collected by local sensors disposed at various locations throughout the vineyard, and/or regional/areal environmental data 312 provided by one or more remote environmental data services, applicable to the crop growing farm.
- process 300 may proceed to stage D.
- reporter 326 may be executed on a computer system to machine generate one or more visual reports 336 of the crop growing farm, with indications of the anomalies detected.
- the visual reports 336 are generated using composite aerial/overhead images 332, and based at least in part on the results of the analysis 334. As described earlier, visual reports 336 may be 2D, 3D or hologram.
- interactive report reader 328 may be executed on a computing system to machine facilitate interactive viewing of visual reports 336, by a user.
- each of image processor 322, analyzer 324, reporter 326 and interactive report reader 328 may be implemented in hardware, software or combination thereof.
- Example hardware implementations may include, but are not limited to, application specific integrated circuits (ASIC) or programmable circuits (such as Field Programmable Gate Arrays (FPGA)) programmed with the operational logic.
- Software implementations may include implementations in instructions of instruction set architectures (ISA) supported by the target processors, or any one of a number of high level programming languages that can be compiled into instruction of the ISA of the target processors.
- analyzer includes at least one neural network
- at least a portion of analyzer 324 may be implemented in an accelerator.
- One example software architecture and an example hardware computing platform will be further described later with references to Figures 10 and 11.
- example visual report 400 includes a composite image 402 of the crop growing farm, e.g., a vineyard, formed by combining or stitching together the individual aerial images taken of the various sections of the crop growing farm, while the UAVs flew over the sections of the crop growing farm (and optionally, individual overhead images taken of the various sections of the crop growing farm by various stationary mounted imaging devices/cameras mounted on fixtures in the various sections).
- the various sections of the crop growing farm may be annotated with various information 404 useful to the user, including but are not limited to, section identification information, crop types/varietals being grown, anomalies detected, and so forth.
- Example composite image 402 of an example vineyard is formed by combining or stitching together the individual aerial/overhead NDRE images taken of the various sections of the example vineyard.
- visual report 400 is in colors.
- the different gray scale levels in the drawing correspond to different colors depicting various conditions as captured by the aerial/overhead images taken in a particular spectrum, e.g., NDVI, NDRE and so forth.
- visual report 400 may further include various legend 406 and auxiliary information 408 to assist a user in comprehending the information provided.
- legend 406 may provide the quantitative scale of a condition metric, such as soil moisture level, corresponding to the different colors.
- Example auxiliary information 408 may include, but are not limited to, time the aerial/overhead images are taken, air temperature at the time, wind speed at the time, wind direction at the time, and other observed conditions at the time.
- Figure 4B illustrates an example user interface for facilitating a user in interactively viewing the visual reports with a user/client computing device.
- user interface 450 includes a number of tabs 452a-452c, one each for viewing a particular visual report of aerial/overhead images taken in a particular spectrum.
- Each tab e.g., tab 452a, includes main display area 454 for displaying the visual report 462 annotated with highlights 464 of anomalies detected.
- each tab may further include section 456 for displaying various information, e.g., file information, and section 458 for displaying various command icons.
- a pop up window 466 may be displayed to provide further information and/or facilitate further interaction.
- example process 500 for machine processing/combining the aerial/overhead images to generate the composite aerial/overhead image includes operations performed at blocks 502-512. The operations may be performed by e.g., image processor 322 of Figure 3.
- Example process 500 starts at block 502.
- a corner aerial/overhead image may be retrieved.
- it may be the upper left corner image, the upper right corner image, the lower right corner image or the lower left corner image.
- the next image in the next column of the same row (or next row, same column) may be retrieved and combined/stitched with the previously processed images, depending on whether the aerial/overhead images are being combined on a column first basis (or a row first basis).
- the next image in the first column of the next row if the aerial/overhead images are being combined or stitched on a column first basis (or first row, next column, if the aerial/overhead images are being combined or stitched on a row first basis) may be retrieved and combined/stitched with the previously processed images.
- next image in the next row, first column is retrieved, if the aerial/overhead images are being combined or stitched on a column first basis (or next column, first row, if the aerial/overhead images are being combined or stitched on a row first basis), and combined/stitched with the previously processed images.
- process 500 continues at block 504, as earlier described.
- process 500 proceeds to block 512.
- the crop growing farm may be in any shape, and may be partitioned into sections in non-rectangular manner.
- the combining/stitching process may start with any aerial/overhead image, and radiates out to combine and stitch the next aerial/overhead image in any number of directions successively.
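The traversal of blocks 502-512 can be sketched as a generator of tile indices; a real stitcher would additionally feature-match and blend each retrieved image against the mosaic built so far. This sketch assumes a rectangular grid starting at the upper-left corner, per the simplest case described above.

```python
def stitch_order(rows, cols, column_first=True):
    """Yield (row, col) tile indices in the traversal order of blocks 502-512.

    Per the text above, a 'column first' pass advances to the next column of
    the same row, then wraps to the first column of the next row; 'row first'
    is the transpose. Any of the four corner images may serve as the start;
    this sketch assumes the upper-left corner for simplicity.
    """
    if column_first:
        for r in range(rows):
            for c in range(cols):
                yield (r, c)
    else:
        for c in range(cols):
            for r in range(rows):
                yield (r, c)
```

For non-rectangular farms, as noted above, the same idea generalizes to starting at any tile and radiating outward to adjacent tiles.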
- example process for machine analyzing the images include operations at blocks 602-612.
- the operations at blocks 602-612 may be machine performed by e.g., analyzer 324 of Figure 3.
- Process 600 starts at block 602. At block 602, an individual aerial/overhead image (and optionally, corresponding terrestrial images of the section) is (are) retrieved. Next, at block 604, topological information of the crop growing farm and current planting information for the section are retrieved. From block 604, process 600 may proceed to one of blocks 606, 608 or 610, depending on whether corresponding terrestrial images are also analyzed, and/or whether local/remote environmental data are considered.
- process 600 proceeds to block 606.
- phenology information of the crop types/varietals being grown, at different stages, are retrieved for the analysis. If environmental data are also being taken into consideration, process 600 also proceeds to block 608.
- the local/remote environmental data are retrieved.
- process 600 eventually proceeds to block 610.
- the aerial/overhead image, and optionally, corresponding terrestrial images are analyzed for anomalies associated with growing vine of the various varietals. As described earlier, the analysis takes into consideration the topological information of the crop growing farm, and current planting information. The analysis may also optionally take into consideration phenology information, at different growing stages, and/or local/remote environmental data.
- if no anomalies are detected, process 600 ends; otherwise the anomalies are noted, before ending process 600.
- Process 600 may be repeated for each section or selected sections of the crop growing farm.
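The per-section loop of process 600 can be summarized as a small driver; the retrieval and analysis callables here are hypothetical stand-ins for blocks 602-612, not the disclosure's actual interfaces.

```python
def analyze_sections(sections, get_images, get_context, analyze):
    """Drive the per-section analysis loop of process 600 (blocks 602-612).

    For each section: retrieve its aerial/overhead (and optional terrestrial)
    images, retrieve its topology/planting/phenology/environment context,
    run the analyzer, and note any anomalies detected. All callables are
    illustrative placeholders for the steps described above.
    """
    noted = {}
    for section in sections:
        anomalies = analyze(get_images(section), get_context(section))
        if anomalies:
            noted[section] = anomalies
    return noted
```

The returned mapping of section to noted anomalies is what the reporting stage (process 700) consumes.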
- example process 700 includes operations performed at blocks 702-710.
- the operations at blocks 702-710 may be performed by e.g., reporter 326 of Figure 3.
- Example process 700 starts at block 702. At block 702, one or more composite aerial/overhead images are outputted. Next at block 704, an area of the crop growing farm, e.g., a vineyard, is selected, and at block 706, the selected area is examined to determine whether anomalies associated with growing crops of various types, such as vine of various varietals, were detected. If a result of the determination indicates that anomalies associated with growing crops of various types or varietals were not detected, process 700 proceeds to block 710, else process 700 proceeds to block 708, before proceeding to block 710. At block 708, the visual report is annotated to highlight the anomaly detected.
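The annotation pass of blocks 702-710 reduces to walking the report's areas and attaching a highlight wherever the analysis flagged an anomaly; the data shapes and names below are the editor's illustrative assumptions.

```python
def annotate_report(areas, analysis_results):
    """Sketch of blocks 702-710: walk each area of the composite image and
    attach a highlight annotation where anomalies were detected.

    `analysis_results` maps an area id to its detected anomalies (possibly
    empty); the returned list is the report's annotation layer. Names and
    structures are illustrative, not from the disclosure.
    """
    annotations = []
    for area in areas:
        anomalies = analysis_results.get(area, [])
        if anomalies:  # block 708: annotate to highlight the anomaly detected
            annotations.append({"area": area, "highlight": anomalies})
    return annotations
```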
- process 800 for facilitating interactive viewing of the visual report comprises operations at blocks 802-806.
- operations at blocks 802-806 may be performed by e.g., interactive report reader 328 of Figure 3.
- Process 800 may start at block 802.
- a user request may be received.
- the user request may be associated with displaying a new visual report, providing further information on a detected anomaly, providing remedial action suggestions for a detected anomaly, and so forth.
- the user request may be processed.
- the processing results e.g., the requested visual report, further explanation of an anomaly of interest, a remedial action suggestion, and so forth, may be outputted/displayed for the user.
- the process results may optionally further include facilities for further interaction by the user.
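The request loop of process 800 (blocks 802-806) amounts to receiving a typed request, dispatching it, and outputting a displayable result; the request types and handler mapping below are hypothetical illustrations only.

```python
def handle_request(request, handlers):
    """Sketch of process 800: receive a user request, process it by type,
    and return the displayable result (blocks 802-806).

    `handlers` maps request types (e.g., 'show_report', 'explain_anomaly',
    'suggest_remedy') to callables; both the type names and the dict-based
    request shape are assumptions for this sketch.
    """
    handler = handlers.get(request["type"])
    if handler is None:
        return {"error": f"unsupported request type: {request['type']}"}
    return {"result": handler(request)}
```

The returned result may itself carry facilities for further interaction, e.g., a pop-up payload like window 466 in Figure 4B.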
- example neural network 900 may be suitable for use e.g., by analyzer 324 of Figure 3, in determining whether the images suggest anomalies associated with growing vines of the varietals.
- Example neural network 900 is a multilayer feedforward neural network (FNN) comprising an input layer 912, one or more hidden layers 914 and an output layer 916.
- Input layer 912 receives data of input variables (xi) 902.
- Hidden layer(s) 914 processes the inputs, and eventually, output layer 916 outputs the determinations or assessments (yi) 904.
- the input variables (xi) 902 of the neural network are set as a vector containing the relevant variable data, while the output determination or assessment (yi) 904 of the neural network is also set as a vector.
- Multilayer feedforward neural network may be expressed through the following equations:

  $ho_i = f\left(\sum_{j=1}^{R} iw_{i,j}\, x_j + hb_i\right)$, for $i = 1, \ldots, N$

  $y_i = f\left(\sum_{k=1}^{N} hw_{i,k}\, ho_k + ob_i\right)$, for $i = 1, \ldots, S$
- hoi and yi are the hidden layer variables and the final outputs, respectively.
- f() is typically a non-linear function, such as the sigmoid function or rectified linear (ReLu) function that mimics the neurons of the human brain.
- R is the number of inputs.
- N is the size of the hidden layer, or the number of neurons.
- S is the number of the outputs.
- the goal of the FNN is to minimize an error function E between the network outputs and the desired targets, by adapting the network variables iw, hw, hb, and ob, via training, as follows:

  $E = \sum_{k=1}^{m} E_k$, where $E_k = \sum_{p=1}^{S} \left(t_{kp} - y_{kp}\right)^2$

  where $y_{kp}$ and $t_{kp}$ are the predicted and the target values of the pth output unit for sample k, respectively, and m is the number of samples.
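The forward pass and error function just described can be sketched directly in NumPy; the sigmoid choice for f and the array shapes are assumptions consistent with the symbol definitions above (R inputs, N hidden neurons, S outputs), not the analyzer's actual configuration.

```python
import numpy as np

def fnn_forward(x, iw, hb, hw, ob, f=lambda z: 1.0 / (1.0 + np.exp(-z))):
    """Forward pass of the multilayer FNN described above (sigmoid f assumed).

    x: (R,) inputs; iw: (N, R) input weights; hb: (N,) hidden biases;
    hw: (S, N) hidden-to-output weights; ob: (S,) output biases.
    Returns (ho, y), matching the ho_i and y_i expressions.
    """
    ho = f(iw @ x + hb)   # hidden layer variables ho_i
    y = f(hw @ ho + ob)   # final outputs y_i
    return ho, y

def error(y_pred, t):
    """Squared error E_k for one sample: sum over output units of (t - y)^2."""
    return float(np.sum((np.asarray(t) - np.asarray(y_pred)) ** 2))
```

Training would adjust iw, hb, hw, and ob (e.g., by backpropagation) to minimize the total E summed over the m samples.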
- analyzer 324 of Figure 3 may include a pre-trained neural network 900 to determine whether an aerial/overhead and/or a terrestrial image suggests one or more anomalies associated with growing vines of the varietals.
- the input variables (x) 902 may include an aerial/overhead image of a section captured in a particular spectrum, one or more corresponding terrestrial images, topological data, current planting data, phenology data, and environmental data.
- the output variables (yi) 904 may include values indicating whether anomalies are detected and/or anomaly types.
- the network variables of the hidden layer(s) for the neural network of analyzer 324 for determining whether an aerial/overhead image of a section captured in a particular spectrum and/or corresponding terrestrial images suggest anomalies with the vines being grown, are determined by the training data.
- the neural network can be in some other types of topology, such as Convolution Neural Network (CNN), Recurrent Neural Network (RNN), and so forth.
- CGM system 1000, which could be CGM system 110, includes hardware 1002 and software 1010.
- Software 1010 includes hypervisor 1012 hosting a number of virtual machines (VMs) 1022 -1028.
- Hypervisor 1012 is configured to host execution of VMs 1022-1028.
- the VMs 1022-1028 include a service VM 1022 and a number of user VMs 1024-1028.
- Service VM 1022 includes a service OS hosting execution of a number of system services and utilities.
- User VMs 1024-1028 may include a first user VM 1024 having a first user OS hosting execution of image processing 1034, a second user VM 1026 having a second user OS hosting execution of analysis and report generation 1036, and a third user VM 1028 having a third user OS hosting execution of interactive report viewing, and so forth.
- elements 1012-1038 of software 1010 may be any one of a number of these elements known in the art.
- hypervisor 1012 may be any one of a number of hypervisors known in the art, such as KVM, an open source hypervisor, Xen, available from Citrix Inc, of Fort Lauderdale, FL., or VMware, available from VMware Inc of Palo Alto, CA, and so forth.
- service OS of service VM 1022 and user OS of user VMs 1024-1028 may be any one of a number of OS known in the art, such as Linux, available e.g., from Red Hat Enterprise of Raleigh, NC, or Android, available from Google of Mountain View, CA.
- each user VM 1024, 1026 and 1028 may be configured to respectively handle image processing, analysis, report generation, and interactive report viewing of one crop growing farm, to provide data isolation between crop growing farms.
- computing platform 1100 which may be hardware 1002 of Figure 10, may include one or more system-on-chips (SoCs) 1102, ROM 1103 and system memory 1104.
- SoCs 1102 may include one or more processor cores (CPUs), one or more graphics processor units (GPUs), one or more accelerators, such as computer vision (CV) and/or deep learning (DL) accelerators.
- ROM 1103 may include basic input/output system services (BIOS) 1105.
- CPUs, GPUs, and CV/DL accelerators may be any one of a number of these elements known in the art.
- ROM 1103 and BIOS 1105 may be any one of a number of ROM and BIOS known in the art.
- system memory 1104 may be any one of a number of volatile storage known in the art.
- computing platform 1100 may include persistent storage devices 1106.
- persistent storage devices 1106 may include, but are not limited to, flash drives, hard drives, compact disc read-only memory (CD-ROM) and so forth.
- computing platform 1100 may include input/output (I/O) device interface to couple I/O devices 1108 (such as display, keyboard, cursor control and so forth) to system 1100, and communication interfaces 1110 (such as network interface cards, modems and so forth).
- I/O devices 1108 may include any number of communication and I/O devices known in the art.
- I/O devices may include, in particular, sensors 1120, which may be some of the sensors 108 of Figure 1. Examples of communication devices may include, but are not limited to, networking interfaces for Bluetooth®, Near Field Communication (NFC), WiFi, Cellular communication (such as LTE 4G/5G) and so forth.
- the elements may be coupled to each other via system bus 1112, which may represent one or more buses. In the case of multiple buses, they may be bridged by one or more bus bridges (not shown).
- Each of these elements may perform its conventional functions known in the art.
- ROM 1103 may include BIOS 1105 having a boot loader.
- System memory 1104 and mass storage devices 1106 may be employed to store a working copy and a permanent copy of the programming instructions implementing the operations associated with hypervisor 1012, service/user OS of service/user VMs 1022-1028, and components of the CGM technology (such as image processor 322, analyzer 324, reporter 326 and interactive report reader 328, and so forth), collectively referred to as computational logic 922.
- the various elements may be implemented by assembler instructions supported by processor core(s) of SoCs 1102 or high-level languages, such as, for example, C, that can be compiled into such instructions.
- the present disclosure may be embodied as methods or computer program products. Accordingly, the present disclosure, in addition to being embodied in hardware as earlier described, may take the form of an entirely software embodiment (including firmware, resident software, micro code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to as a “circuit,” “module” or “system.” Furthermore, the present disclosure may take the form of a computer program product embodied in any tangible or non-transitory medium of expression having computer-usable program code embodied in the medium.
- Figure 12 illustrates an example computer-readable non-transitory storage medium that may be suitable for use to store instructions that cause an apparatus, in response to execution of the instructions by the apparatus, to practice selected aspects of the present disclosure.
- non-transitory computer-readable storage medium 1202 may include a number of programming instructions 1204.
- Programming instructions 1204 may be configured to enable a device, e.g., computing platform 1100, in response to execution of the programming instructions, to implement (aspects of) hypervisor 1012, service/user OS of service/user VMs 1022-1028, and components of CGM technology (such as image processor 322, analyzer 324, reporter 326 and interactive report reader 328, and so forth)
- programming instructions 1204 may be disposed on multiple computer-readable non-transitory storage media 1202 instead.
- programming instructions 1204 may be disposed on computer-readable transitory storage media 1202, such as, signals.
- the computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a non- exhaustive list) of the computer-readable medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a transmission media such as those supporting the Internet or an intranet, or a magnetic storage device.
- the computer-usable or computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.
- a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
- the computer-usable medium may include a propagated data signal with the computer-usable program code embodied therewith, either in baseband or as part of a carrier wave.
- the computer-usable program code may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, and so forth.
- Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
- the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user’s computer and partly on a remote computer or entirely on the remote computer or server.
- the remote computer may be connected to the user’s computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- These computer program instructions may also be stored in a computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
- the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
- Embodiments may be implemented as a computer process, a computing system or as an article of manufacture such as a computer program product of computer readable media.
- the computer program product may be a computer storage medium readable by a computer system and encoding computer program instructions for executing a computer process.
Abstract
In some embodiments, a method for managing growing crops in a crop growing farm includes operating one or more unmanned aerial vehicles (UAV) to fly over a plurality of sections of a crop growing farm. The UAVs are fitted with a plurality of cameras equipped to generate images in a plurality of spectrums. The plurality of sections of the crop growing farm grow crops of one or more types or varietals. The method further includes taking a plurality of aerial images of the sections of the crop growing farm in the plurality of spectrums, using the plurality of cameras, while the UAVs are flying over the plurality of sections of the crop growing farm, and executing an analyzer on a computing system to machine analyze the plurality of aerial images for anomalies associated with growing the crops of the one or more types or varietals.
Description
CROP MANAGEMENT METHOD AND APPARATUS WITH AUTONOMOUS
VEHICLES
Related Application

This application claims priority to U.S. Patent Application No. 16/862,093, entitled “CROP MANAGEMENT METHOD AND APPARATUS WITH AUTONOMOUS VEHICLES,” filed on April 29, 2020.
Technical Field

The present disclosure relates to the field of agriculture. More particularly, the present disclosure relates to methods and apparatus for managing the growing of crops, e.g., grain, fruit, vegetables, and vines of a plurality of varietals, with the assistance of autonomous vehicles.

Background
The background description provided herein is for the purpose of generally presenting the context of the disclosure. Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section. Agriculture is a major industry in the United States, which is a net exporter of food. As of the 2007 census of agriculture, there were 2.2 million farms, covering an area of 922 million acres (3,730,000 km2), an average of 418 acres (169 hectares) per farm. Major crops include corn, soybeans, wheat, cotton, tomatoes, potatoes, grapes, oranges, rice, apples, sorghum, lettuce, sugar beets, and so forth. Agriculture, food, and related industries contributed $1.053 trillion to U.S. gross domestic product (GDP) in 2017, a 5.4-percent share. The output of America’s farms contributed $132.8 billion of this sum — about 1 percent of GDP. The overall contribution of the agriculture sector to GDP is larger than this because sectors related to agriculture — forestry, fishing, and related activities; food, beverages, and tobacco products; textiles, apparel, and leather products; food and beverage stores; and food service, eating and drinking places — rely on agricultural inputs in order to contribute added value to the economy.
However, over the years, farming has become an increasingly difficult business. Today, it is estimated that the American farmer receives just 16 cents for every dollar spent on food by the consumer. That is down about 50 percent from 1980, when farmers were receiving 31 cents for every dollar spent. Margins, especially on smaller farms, are too thin to leave room in operating budgets to purchase new technology and equipment, invest in experimental agricultural practices, or adapt to a new environmental and economic climate, and yet continuous innovation is needed to increase yields.
For the fruit and vegetable segment, with the increasing interest among Americans in healthy living, there has been a steady increase in demand for fresh fruits and vegetables, including organic fruits and vegetables. The U.S. fruit and vegetable market was valued at USD 104.7 billion in 2016, and vegetables and fruits presently reign as the top U.S. snacking items. Like farming in general, owning and operating an orchard or vegetable farm is a tough business. A huge amount of investment in the U.S. is expected in technology to improve the yield and quality of the products, and their efficient transport.
For the wine industry, consumption in America has steadily increased in the last two decades, growing from about 500 million gallons in the year 1996 to about 949 million gallons in 2016 [1]. The value of the total U.S. wine market for the year 2017 is estimated to be $62.7 billion, of which $41.8 billion is domestically produced [2]. For the year 2018, the number of wineries in the U.S. is estimated to be about 9,654 [3]. The total vine growing acreage in the U.S. was estimated to exceed 1,000,000 acres as far back as 2012 [4].
Owning and operating a vineyard is a tough business. “To take on the challenge of running a winery, you need to be determined, fearless, and passionate about your craft — although owning a vineyard seems romantic, the wine-making business is a tough one.” [5] In addition to the upfront financial investments required for the land and the infrastructure (like buildings, bottling and cellar equipment, trucks, and so forth), there is a multitude of potential problems that could arise with growing vines. Examples of these problems may include, but are not limited to, over or under irrigation, diseases (such as mildews and black rots), or pests (such as berry moth, Japanese beetles, and rose chafers). Further, these problems may vary from one varietal to another. And a vineyard typically grows vines of
[1] Source: Wine Institute, DOC, BW166/Gomberg, Fredrikson & Associates estimates. Preliminary. History revised.
[2] Source: Wines & Vines, 2018; BW166, 2018.
[3] Source: Statista - The Statistics Portal.
[4] The World’s Grape Growing (Vineyard) Surface Area 2000-2012, by Per Karlsson, June 6, 2013, Winemaking & Viticulture.
[5] The Economics of Running a Winery, by Caroline Goldstein, Aug 20, 2018.
multiple varietals. It is not uncommon for a vineyard to span over 100 acres, with over 1000 vines planted per acre. And multiple varietals are planted in different sections of the vineyard.
For the craft beer industry, consumption in America has also steadily increased in recent years. 2018 saw 7,346 operating U.S. craft breweries: 4,521 microbreweries, 2,594 brewpubs, and 231 regional breweries. Craft brewers produced 25.9 million barrels of beer, and the retail dollar value of craft beer sold in 2018 was $27.6 billion. Resultantly, there has been a significant increase in interest in the efficient growing and production of hops.
Similarly, with the enactment of the 2018 Farm Bill on December 20, 2018, removing hemp from Schedule I of the Controlled Substances Act, making hemp no longer a controlled substance, and with an increasing number of States legalizing the medical and recreational use of marijuana, there has likewise been a significant increase in interest in the efficient growing and production of cannabis.
Thus, methods and apparatuses that can improve the management of growing crops of various types are desired.
Brief Description of the Drawings
Embodiments will be readily understood by the following detailed description in conjunction with the accompanying drawings. To facilitate this description, like reference numerals designate like structural elements. Embodiments are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings.
Figure 1 illustrates an overview of a system for managing growing crops of the present disclosure, in accordance with various embodiments.
Figures 2A-2C illustrate taking of aerial images and terrestrial images of crops in the example vineyard of Figure 1, with the assistance of autonomous vehicles, according to various embodiments.
Figure 3 illustrates a process for managing growing crops with the assistance of autonomous vehicles, and components of the crop growing management system of Figure 1, according to various embodiments.
Figures 4A-4B illustrate an example visual report to assist in managing growing crops, and an example user interface for interactively viewing the visual report, according to various embodiments.
Figure 5 illustrates an example process for machine processing the aerial images
taken with the assistance of autonomous vehicles, according to various embodiments.
Figure 6 illustrates an example process for machine analyzing the aerial and/or terrestrial images, according to various embodiments.
Figure 7 illustrates an example process for machine generating the visual report to assist in managing growing vines of a number of varietals, according to various embodiments.
Figure 8 illustrates an example process for machine facilitating interactive viewing of the visual report, according to various embodiments.
Figure 9 illustrates an example neural network suitable for use with the present disclosure to analyze aerial and/or terrestrial images for crop growing anomalies, according to various embodiments.
Figure 10 illustrates an example software component view of the crop growing management system of Figure 1, according to various embodiments.
Figure 11 illustrates an example hardware component view of the hardware platform for the crop growing management system of Figure 1, according to various embodiments.
Figure 12 illustrates a storage medium having instructions for practicing methods described with references to Figures 1-9, according to various embodiments.
Figures 13a-13d illustrate a number of example seasonal NDVI profiles for a number of example crops grown in a number of sections of a number of example crop growing farms.
Figure 14 illustrates a number of example mean seasonal NDVI profiles for a number of example crops grown in a number of sites of an example geographical region.
Detailed Description
To address challenges discussed in the background section, apparatuses, methods and storage medium associated with managing growing crops, such as vines in a vineyard, are disclosed herein. In some embodiments, a method for managing growing crops includes operating one or more unmanned aerial vehicles (UAV) to fly over a plurality of sections of a crop growing farm, such as a vineyard. The UAVs are fitted with a plurality of cameras equipped to generate images in a plurality of spectrums. The plurality of sections of the crop growing farm may grow crops of various types, e.g., a vegetable farm may grow various vegetables, an orchard may grow various fruits, a vineyard may grow vines of a plurality of varietals, and so forth. The method further includes taking a
plurality of aerial images of the sections of the crop growing farm in the plurality of spectrums, using the plurality of cameras, while the UAVs are flying over the plurality of sections of the crop growing farm; storing the plurality of aerial images of a plurality of spectrums of the crops of various types being grown, e.g., the vines of the plurality of varietals being grown, in a computer readable storage medium (CRSM), and executing an analyzer on a computing system to machine analyze the plurality of aerial images for anomalies associated with growing the crops of various types, e.g., the vines of the plurality of varietals. The machine analysis takes into consideration topological information of the crop growing farm, e.g., the vineyard, as well as current planting information of the crop growing farm, e.g., the vineyard.
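The disclosure leaves the analyzer’s internals to later sections. Purely as an illustrative stand-in, the aerial analysis step described above might be sketched as a per-section check of a vegetation index against the range expected for the crop planted in that section, using the current planting information. All function names, thresholds, and data shapes below are assumptions for illustration, not taken from the disclosure.

```python
def detect_aerial_anomalies(ndvi_values, expected_range):
    """Flag a section whose mean NDVI falls outside the range expected
    for the crop planted there (a simple stand-in for the analyzer)."""
    mean = sum(ndvi_values) / len(ndvi_values)
    low, high = expected_range
    if mean < low:
        return ["possible under-irrigation or stress"]
    if mean > high:
        return ["unexpectedly dense canopy"]
    return []

def analyze_farm(section_ndvi, planting, expected_ndvi_by_crop):
    """Return a per-section anomaly map: {section_id: [descriptions]}.

    section_ndvi: {section_id: per-image mean NDVI values}
    planting:     {section_id: crop type or varietal grown there}
    """
    report = {}
    for section_id, ndvi_values in section_ndvi.items():
        crop = planting[section_id]
        report[section_id] = detect_aerial_anomalies(
            ndvi_values, expected_ndvi_by_crop[crop])
    return report

# Illustrative data only: two vineyard sections, with section B2 stressed.
sections = {"A1": [0.72, 0.70, 0.74], "B2": [0.35, 0.30, 0.40]}
planting = {"A1": "pinot noir", "B2": "pinot noir"}
expected = {"pinot noir": (0.55, 0.85)}
report = analyze_farm(sections, planting, expected)
```

A production analyzer would of course draw on the full multi-spectrum imagery, topology, and planting data described above; the threshold check here only illustrates the shape of the per-section flow.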
In some embodiments, the method includes operating one or more terrestrial robots to traverse the plurality of sections of the crop growing farm, such as a vineyard. The one or more terrestrial robots are fitted with one or more cameras equipped to generate images in visual spectrum. The method further includes taking a plurality of visual spectrum terrestrial images of the crops of various types, e.g., vines of a plurality of varietals, being grown in the plurality of sections of the crop growing farm, e.g., a vineyard, using the one or more cameras fitted on the one or more terrestrial robots, while the one or more terrestrial robots are traversing the plurality of sections of the crop growing farm, e.g., a vineyard; storing the plurality of visual spectrum terrestrial images of the vines of the plurality of varietals being grown, in the CRSM; and executing the analyzer on the computing system to machine analyze the plurality of visual spectrum terrestrial images for additional anomalies associated with growing the crops of various types, e.g., vines of a plurality of varietals, in the plurality of sections of the crop growing farm, e.g., a vineyard. The machine analysis of the visual spectrum terrestrial images takes into consideration phenology information of the various types of crops being grown, at different growing stages.
These and other aspects of the methods and apparatuses for managing crop growing, in particular managing growing vines in a vineyard, will be further described with references to the Figures. In the following detailed description, reference is made to the accompanying drawings which form a part hereof, wherein like numerals designate like parts throughout, and in which is shown by way of illustration embodiments that may be practiced. It is to be understood that other embodiments may be utilized and structural or logical changes may be made without departing from the scope of the present disclosure. Therefore, the following detailed description is not to be taken in a limiting sense, and the
scope of embodiments is defined by the appended claims and their equivalents.
Various operations may be described as multiple discrete actions or operations in turn, in a manner that is most helpful in understanding the claimed subject matter. However, the order of description should not be construed as to imply that these operations are necessarily order dependent. In particular, these operations may not be performed in the order of presentation. Operations described may be performed in a different order than the described embodiment. Various additional operations may be performed and/or described operations may be omitted in additional embodiments.
For the purposes of the present disclosure, the phrase “A and/or B” means (A), (B), or (A and B). For the purposes of the present disclosure, the phrase “A, B, and/or C” means (A), (B), (C), (A and B), (A and C), (B and C), or (A, B and C). The description may use the phrases “in an embodiment,” or “In some embodiments,” which may each refer to one or more of the same or different embodiments. Furthermore, the terms “comprising,” “including,” “having,” and the like, as used with respect to embodiments of the present disclosure, are synonymous.
As used herein, the term “module” may refer to, be part of, or include an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and/or memory (shared, dedicated, or group) that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
Referring now to Figure 1, wherein an overview of a system for managing growing crops of the present disclosure, in accordance with various embodiments, is illustrated. For ease of understanding, system 100 will be described with reference to managing growing vines of a plurality of varietals in a vineyard; however, the disclosure is not so limited. System 100 may be employed to manage growing crops of various types in various crop growing farms. Examples of crops whose growing may be managed include, but are not limited to, hops, kiwis, apples, berries, cherries, citrus fruit, avocados, pears, plums, hemp, cannabis, and so forth. Examples of crop growing farms include, but are not limited to, vegetable farms, orchards, vineyards, and so forth. In the case of growing vines, examples of varietals may include, but are not limited to, cabernet sauvignon, pinot noir, chardonnay, pinot gris, and so forth. In the case of growing cannabis, examples of varietals may include the cannabis indica or cannabis sativa species.
As shown, system 100 for managing growing crops in a crop growing farm 102, e.g., growing vines in a vineyard, may include one or more UAVs 104, and crop growing
management system 110 having crop growing management software 120. In some embodiments, system 100 may further include one or more terrestrial robots 106. Collectively, one or more UAVs 104 are fitted with a plurality of cameras to capture aerial images in a plurality of spectrums. The UAVs 104 are operated to fly over crop growing farm 102, e.g., a vineyard, and capture aerial images of various sections of crop growing farm 102 in the plurality of spectrums. Different crop types, e.g., fruits or vegetables of different types, or cannabis or vines of different varietals, may be grown in different sections of crop growing farm 102. The plurality of spectrums may include, but are not limited to, visual, red, near infrared, near red, and so forth.
Crop growing management software 120 is configured to be executed on one or more computer processors of crop growing management system 110, to machine process the aerial images taken, machine analyze the aerial images for anomalies associated with growing crops of various types, e.g., over or under irrigation, and generate a visual report with indications of the anomalies. The machine analysis and reporting take into consideration topological information of the crop growing farm, e.g., natural or man-made geographical features, like a pond, a building and so forth, as well as current planting information, e.g., the crop types or varietals being grown, and where in crop growing farm 102 they are grown.
For embodiments where system 100 also includes one or more terrestrial robots 106, each of the one or more terrestrial robots 106 is equipped with one or more cameras to capture terrestrial images in visual spectrum, in one or more directions. The one or more directions may include a left (up/straight ahead/down) outward looking direction, a right (up/straight ahead/down) outward looking direction, a forward (up/straight ahead/down) outward looking direction, and/or a backward (up/straight ahead/down) outward looking direction. The one or more terrestrial robots 106 are operated to traverse crop growing farm 102 and capture terrestrial images of the crops of various types or varietals being grown in the various sections of crop growing farm 102, in the visual spectrum. The images of the crops may show various aspects of the crops, e.g., for vines, the images may show various aspects of the vines, e.g., its grapes, its leaves, its roots, and so forth.
Crop growing management software 120 is further configured to be executed on the one or more computer processors of crop growing management system 110, to machine process the terrestrial images taken, and machine analyze the terrestrial images of the crops for further anomalies associated with growing crops of a particular type or
varietal, e.g., plant diseases and/or pest infestations, and generate a visual report with indications of the anomalies. The machine analysis and reporting additionally take into consideration phenology information of the crop types or varietals, at different growing stages.
In some embodiments, a plurality of UAVs 104 are employed. Each UAV 104 is either successively or concurrently fitted with three (3) cameras for capturing aerial images in 3 spectrums: visual spectrum, red or near infrared spectrum, and near red spectrum. In some embodiments, the 3 cameras include a red/green/blue (RGB) camera configured to capture and generate images in visual spectrum, a Normalized Difference Vegetation Index (NDVI) camera configured to capture and generate images in red or near infrared spectrum, and a Normalized Difference Red Edge (NDRE) camera equipped to capture and generate images in near red spectrum. In some embodiments, one or more UAVs 104 may be fitted with one or more infrared thermal cameras to capture aerial thermal images of the various sections of crop growing farm 102. The plurality of UAVs 104 are operated to systematically fly over all sections of crop growing farm 102, or selectively fly over selected sections, capturing and generating aerial images of all or selected sections of crop growing farm 102. In some embodiments, the cameras have sufficient resolution and zoom power to allow each pixel of each aerial image to cover approximately 3-4 cm2 of sectional area, with the UAVs 104 operated at 400 ft or below. In alternate embodiments, as camera resolution further improves, each pixel of each aerial image may cover an area as small as 1 cm2.
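The per-pixel coverage figures above (roughly 3-4 cm2 per pixel at 400 ft or below) follow from standard ground sample distance (GSD) arithmetic. The sketch below shows the calculation; the sensor width, focal length, and image width are hypothetical camera parameters chosen for illustration, not values from the disclosure.

```python
def ground_sample_distance_cm(altitude_m, focal_length_mm,
                              sensor_width_mm, image_width_px):
    """Ground distance covered by one image pixel, in centimeters.

    GSD = (altitude x sensor width) / (focal length x image width),
    with all lengths in meters before converting the result to cm.
    """
    gsd_m = (altitude_m * sensor_width_mm / 1000.0) / (
        focal_length_mm / 1000.0 * image_width_px)
    return gsd_m * 100.0

# Illustrative camera parameters only: 400 ft (~122 m) altitude,
# a hypothetical 13.2 mm wide sensor, 16 mm lens, 5472 px image width.
gsd = ground_sample_distance_cm(122.0, 16.0, 13.2, 5472)
pixel_area_cm2 = gsd * gsd  # each pixel covers roughly gsd x gsd cm of ground
# with these assumed parameters, pixel_area_cm2 lands in the 3-4 cm2 range
```

Flying lower, using a longer lens, or using a higher-resolution sensor all shrink the GSD, which is how the 1 cm2-per-pixel figure mentioned above becomes reachable as cameras improve.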
In some embodiments, a swarm of lightweight UAVs with fewer Federal Aviation Administration (FAA) operational restrictions, such as Dragonfly Drones with fewer altitude restrictions, is employed. A Dragonfly Drone is a drone typically weighing less than 1 lb, available in multiple form factors but less than 9” in diameter. The Dragonfly Drone may incorporate extremely high resolution near infrared (NIR), short wave infrared (SWIR), thermal, and/or RGB cameras, amongst a large variety of others. The Dragonfly drones are used together in coordinated swarms, mapping together to rapidly take imagery and collect data from the air over large crop growing farms. The small form factor eliminates the current need to acquire an FAA license to fly the drones. The Dragonfly drones may also contain a magnetic contact for magnetic charging, and can return to a WiFi enabled base station for recharging, uploading of imagery, and downloading of flight plans.
The concurrent employment of the near red spectrum images of the NDRE cameras provides certain information that complements the limitations of the red or infrared images of the NDVI cameras. The index of NDVI images is derived from reflectance values in the red and near-infrared bands of the electromagnetic spectrum. Index values ranging from -1 to 1 indicate the instantaneous rate of photosynthesis of the crop of interest. NDVI is commonly thought of as an index of biomass, but a normal NDVI curve will decline towards the end of the growing season even when the amount of biomass is at peak levels. Therefore, NDVI can only be considered an index of photosynthetic rate, not the amount of foliage. Also, NDVI is not sensitive to crops with high leaf area index (LAI) and tends to saturate at high LAI values.
On the other hand, NDRE uses the red-edge portion of the spectrum. Since red-edge light is not absorbed as strongly as red light during photosynthesis, it can penetrate deeper into the crop canopy, and thereby solves the issue of NDVI saturating at high LAI values. NDRE is also more sensitive to medium-to-high chlorophyll levels and can be used to map variability in fertilizer requirements or foliar nitrogen demand. Leaf chlorophyll and nitrogen levels do not necessarily correlate with soil nitrogen availability and should be ground-truthed with soil or tissue samples.
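Both indices discussed above are simple normalized band ratios. A minimal per-pixel sketch, assuming co-registered reflectance values in the red, red-edge, and near-infrared bands are already available (the reflectance values shown are illustrative, not from the disclosure), might look like:

```python
def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red); ranges over [-1, 1]."""
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom

def ndre(nir, red_edge):
    """NDRE = (NIR - RedEdge) / (NIR + RedEdge); red-edge light
    penetrates deeper into the canopy, so NDRE saturates less at
    high leaf area index than NDVI does."""
    denom = nir + red_edge
    return 0.0 if denom == 0 else (nir - red_edge) / denom

# Illustrative reflectance values for one healthy-canopy pixel.
v = ndvi(0.8, 0.1)   # strong photosynthetic activity
e = ndre(0.8, 0.4)   # NDRE values typically run lower than NDVI
```

In practice these formulas would be applied per pixel across whole image arrays; the scalar form here just makes the band arithmetic explicit.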
In some embodiments, a plurality of terrestrial robots 106 are employed. Each terrestrial robot 106 is fitted with at least two (2) cameras to capture and generate terrestrial images of the crops of different types or varietals, in visual spectrum, in at least 2 directions along the routes traversed by the terrestrial robots 106. Similarly, the plurality of terrestrial robots 106 are operated to systematically traverse the entire crop growing farm 102 or selectively traverse selected sections of crop growing farm 102, capturing and generating terrestrial images of all crops of all types or varietals or selected crops of selected types or varietals grown in all or selected sections of crop growing farm 102.
In some embodiments, each UAV 104/robot 106 includes computer readable storage medium (CRSM) 124/126 to temporarily store the aerial/terrestrial images captured/generated. In some embodiments, the CRSM are removable, such as a universal serial bus (USB) drive, allowing the captured/generated aerial/terrestrial images to be removably transferred to CRSM of crop growing management system 110, via a compatible input/output (I/O) port on crop growing management system 110. In some embodiments, each UAV 104/robot 106 may include an I/O port (not shown), e.g., a USB port, to allow the stored aerial/terrestrial images to be accessed and transferred to crop growing management system 110. In still other embodiments, each UAV 104/robot 106
may include a communication interface, e.g., WiFi or Cellular interface, to allow the stored aerial/terrestrial images to be wirelessly transferred to crop growing management system 110, via e.g., access point/base station 118. Access point/base station 118 may be communicatively coupled with crop growing management system 110 via one or more private or public networks 114, including e.g., the Internet.
Except for the cameras fitted to UAVs 104, and the manner in which UAVs 104 and robots 106 are employed, UAVs 104 and robots 106 may otherwise be any one of a number of such vehicles/devices known in the art. For example, except for the cameras fitted to UAVs 104, UAVs 104 may be a fixed wing UAV, a tricopter, a quadcopter, a hexacopter or a lightweight UAV. In particular, as discussed earlier, UAVs 104 may be lightweight UAVs that may operate above 400 ft. Similarly, terrestrial robots 106 may be wheeled, threaded or screw propelled.
Still referring to Figure 1, in some embodiments, system 100 may further include various sensors 108 positioned at various locations in crop growing farm 102 to collect and generate various local environmental data. Examples of sensors 108 may include, but are not limited to, temperature sensors for sensing local temperature, humidity sensors for sensing local humidity level, moisture sensors for sensing moisture level in the soil, and so forth. In various embodiments, sensors 108 may be equipped to provide the collected sensor data to crop growing management system 110 wirelessly, via access points/base station 118. In some embodiments, some of these sensors may be disposed in UAVs 104 and/or terrestrial robots 106 to collect and generate various local environmental data as UAVs 104 and/or terrestrial robots 106 fly over or traverse crop growing farm 102. For these embodiments where sensors 108 are employed, crop growing management software 120 further takes these local environmental data into consideration, when machine processing aerial images (and optionally, terrestrial images) to machine detect anomalies associated with growing crops of various types or varietals. In some embodiments, sensors 108 may include imaging devices or cameras mounted on various fixtures of the crop growing farm, e.g., utility poles, or on nettings covering the crops being grown, to capture overhead images of the corresponding sections of the crop growing farm (to complement or supplement the aerial images captured with the UAVs). Examples of netting coverings include netting typically used in farms to reduce wind, sun and bird effects on growing plants. The netting may incorporate extremely high resolution near infrared (NIR), short wave infrared (SWIR), thermal, and/or RGB cameras, amongst a large variety of others. Multiple cameras can be attached to the underside of
nets, providing both real time and video imagery of plant growth.
In some embodiments, system 100 may further include remote servers 112 having repositories of environmental data applicable to crop growing farm 102, e.g., weather or environmental data services, with temperature, precipitation, air pollution data for the geographical region/area where crop growing farm 102 is located. Crop growing management system 110 may be communicatively coupled with remote servers 112 via one or more private or public networks 114, including e.g., the Internet. For these embodiments, crop growing management software 120 further takes these regional environmental data into consideration, when machine processing and analyzing aerial images (and optionally, overhead and/or terrestrial images) to machine detect anomalies associated with growing crops of various types or varietals.
In some embodiments, crop growing management software 120 is further configured to support interactive viewing of the visual report by a user, via a user computing device 116. User computing device 116 may be coupled with crop growing management system 110 via one or more public and/or private networks 114, and/or access point/base station 118. In some embodiments, crop growing management software 120 may facilitate interactive viewing of the visual report, via web services. For these embodiments, user computing device 116 may access and interact with the visual report via a browser on user computing device 116. In some embodiments, crop growing management software 120 may provide an agent e.g., an app, to be installed on user computing device 116 to access and interact with the visual report. Except for its use to access and interact with the visual report, user computing device 116 may be any one of a number of computing devices known in the art. Examples of such computing devices may include, but are not limited to, desktop computers, mobile phones, laptop computers, computing tablets, and so forth.
Similarly, except for their use to facilitate provision of aerial/overhead/terrestrial images and/or environmental data, as well as accessing the visual reports, access points/base stations 118 and network(s) 114 may be any one of a number of access points/base stations and/or networks known in the art. Networks 114 may include one or more private and/or public networks, such as the Internet, or local or wide area networks, having any number of gateways, routers, switches, and so forth.
Before further describing elements of system 100, it should be noted that while, for ease of understanding, UAVs 104, terrestrial robots 106, and crop growing management system 110 have been described so far, and will continue to be mainly described as assisting in
management of growing crops of a plurality of types or varietals in a crop growing farm, the present disclosure is not so limited. System 100 may be practiced with UAVs 104, terrestrial robots 106, and crop growing management system 110 configured to service multiple crop growing farms growing crops of multiple types or varietals.
Referring now to Figures 2A-2C, wherein the taking of aerial/overhead images of a crop growing farm, an example vineyard, and terrestrial images of crops being grown, vines growing in the example vineyard, with the assistance of autonomous vehicles, according to various embodiments, is illustrated. Shown in Figure 2A is a composite aerial overhead image 202 of a crop growing farm, an example vineyard, 102, generated by combining or stitching together individual aerial overhead images 204 taken by one or more UAVs 104, while the one or more UAVs 104 fly over the crop growing farm, example vineyard, 102. As described earlier, in some embodiments, the one or more UAVs 104 are operated to systematically fly over all sections of the crop growing farm, example vineyard, 102 at a selected altitude. In various embodiments, each pixel of each aerial/overhead image covers an area of approximately 3-4 cm2 or smaller. Thus, the number of aerial/overhead images 204 shown in Figure 2A as being combined together to cover the crop growing farm, example vineyard, 102 is significantly less than the number of aerial/overhead images 204 combined/stitched together in real life. The number is reduced for ease of illustration and understanding, and thus is not to be read as limiting on the present disclosure.
The different levels of grayness in Figure 2A correspond to different colors in real life in the composite NDVI or NDRE images combined/stitched together based on the individual NDVI/NDRE images taken, which in turn correspond to different conditions of the soil and/or different conditions of the crops (depending in part, on the crop types or varietals being grown).
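The NDVI and NDRE values underlying these color maps are standard band-ratio indices, computed per pixel from the reflectances captured in the respective spectrums. A minimal per-pixel sketch follows (the reflectance values shown are illustrative, not from the disclosure):

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / max(nir + red, 1e-6)

def ndre(nir: float, red_edge: float) -> float:
    """Normalized Difference Red Edge index: (NIR - RedEdge) / (NIR + RedEdge)."""
    return (nir - red_edge) / max(nir + red_edge, 1e-6)

# Healthy vegetation reflects strongly in NIR, so NDVI approaches 1;
# bare soil or stressed canopy yields values near 0.
print(ndvi(0.8, 0.1))  # ~0.78 (healthy pixel)
print(ndvi(0.1, 0.1))  # 0.0 (bare/stressed pixel)
```

The composite image then maps each pixel's index value to a color scale, which appears as levels of grayness in the drawing.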
Shown in Figures 2B-2C are example terrestrial images 220 and 230 of a couple of the example vines, of a couple of varietals, being grown in a couple of the sections in example vineyard 102, captured by the camera(s) of terrestrial robot(s) 106, while terrestrial robot(s) 106 is (are) operated to traverse various sections of example vineyard 102. As illustrated by the example picture 220 in Figure 2B, disease (in this case, downy mildew) may be machine detected, using the phenology information provided for various growing stages of the vine varietal. As illustrated by the example picture 230 in Figure 2C, infestation (in this case, sweetpotato weevil) may be machine detected, using the phenology information provided for various growing stages of the vine varietal. Figures 2B and 2C are non-limiting examples. As those skilled in various crop growing arts would appreciate, there are many possible vine diseases and/or infestations that need to be managed for the various crops of various types/varietals.
Referring now to Figure 3, wherein an example process for managing growing crops with the assistance of autonomous vehicles, and components of an example crop growing management system of Figure 1, according to various embodiments, are illustrated. As shown, for the illustrated embodiments, example crop growing management system 350 includes crop growing management software 320 and CRSM 330. Crop growing management system 350 may correspond to crop growing management system 110 of Figure 1, and crop growing management software 320 may correspond to crop growing management software 120 of Figure 1.
Crop growing management software 320 includes image processor 322, analyzer 324, reporter 326 and interactive report reader 328. Image processor 322 is configured to process and combine/stitch together individual aerial/overhead images taken in various spectrums to form a plurality of composite aerial/overhead images 332 of the crop growing farm, e.g., a vineyard, in the various spectrums. Analyzer 324 is configured to machine process and analyze aerial/overhead images 302 and/or terrestrial images 304 to identify anomalies with growing crops of various types, e.g., vines of the various varietals, in the various sections of the crop growing farm, e.g., the vineyard. In some embodiments, analyzer 324 is configured to apply artificial intelligence, e.g., neural networks, to identify anomalies with growing crops of various types, e.g., vines of the various varietals, in the various sections of the crop growing farm, e.g., the vineyard. Reporter 326 is configured to machine generate one or more visual reports 336 of the crop growing farm, e.g., the vineyard, identifying/highlighting the anomalies associated with growing crops detected. In some embodiments, visual reports 336 may be two dimensional (2D) reports. In other embodiments, visual reports 336 may be three dimensional (3D) reports or holograms. Interactive report reader 328 is configured to facilitate a user in interactively viewing the visual report.
CRSM 330 is configured to store individual aerial/overhead images 302 of various sections of the crop growing farm, e.g., a vineyard, captured in various spectrums, as well as individual terrestrial images 304 of various crops of various types, e.g., vines of various varietals, captured in the visual spectrum. CRSM 330 is also configured to store topological information 306 of the crop growing farm, such as ponds, creeks, streams and so forth, as well as current planting information 308, i.e., crop types/varietals planted, and
where. CRSM 330 may also be configured to store phenology information 310 of the crop types/varietals, at different growing stages, and environmental data 312 collected by local sensors and/or received from remote servers. CRSM 330 may also be configured to store composite aerial/overhead images 332 generated from individual aerial/overhead images 302, analysis results 334 and reports 336.
In various embodiments, phenology information 310 of the crop types/varietals may include various profiles of the various crop types/varietals over a growing season for different growing sections of a crop growing farm. Figures 13a-13d illustrate various example profiles of various example crop types/varietals over example growing seasons for different growing sections of a number of example crop growing farms. Specifically, Figure 13a illustrates example NDVI profiles 1300 for various example apples (of the same or different types/varietals) over an example growing season from end of March to end of October for various sections of an example apple orchard. Each line represents an example NDVI profile for the example apples grown in a section of the apple orchard.
The vertical axis depicts the mean NDVI values, while the horizontal axis depicts the dates in the growing season. The alphanumeric identifiers of the profiles, shown in the bottom of the Figure, identify the sections of the example apple orchard.
Figure 13b illustrates example NDVI profiles 1310 for various example blueberries (of the same or different types/varietals) over an example growing season from end of March to end of October for various sections of an example blueberry orchard. Each line represents an example NDVI profile for the example blueberries grown in a section of the blueberry orchard. The vertical axis depicts the mean NDVI values, while the horizontal axis depicts the dates in the growing season. The alphanumeric identifiers of the profiles, shown in the bottom of the Figure, identify the sections of the example blueberry orchard.
Figures 13c and 13d respectively illustrate example NDVI profiles 1320 and 1330 for various example low productivity and high productivity hops (of the same or different types/varietals) over an example growing season from end of March to end of October for various sections of a couple of example hop growing farms. Each line represents an example mean NDVI profile for the example hops grown in a section of an example hop farm. The vertical axis depicts the mean NDVI values, while the horizontal axis depicts the dates in the growing season. The alphanumeric identifiers of the profiles, shown at the bottom of the Figures, identify the sections of the example hop farms.
Figure 14 illustrates a number of example mean seasonal NDVI profiles 1400 over an example growing season from mid-October to mid-March for a number of example
crops grown in a number of sites of an example geographical region. The vertical axis depicts the mean NDVI values, while the horizontal axis depicts the dates in the growing season. The example crops include pears, kiwis, cherries, avocados and mandarin oranges.
In other embodiments, the profiles may span different growing seasons; further, other profiles, such as, but not limited to, NDRE profiles, may be used instead of, or in addition to, the NDVI profiles.
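A per-section seasonal profile such as those in Figures 13a-13d can be built by averaging the index values of all pixels in a section at each capture date. A minimal sketch follows (the data layout and observation values are assumed for illustration, not specified in the disclosure):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical observations: (section_id, ISO date, per-pixel NDVI samples)
observations = [
    ("A1", "2020-03-30", [0.21, 0.24, 0.19]),
    ("A1", "2020-05-15", [0.48, 0.52, 0.50]),
    ("A1", "2020-07-20", [0.71, 0.69, 0.74]),
    ("B2", "2020-03-30", [0.18, 0.20, 0.22]),
    ("B2", "2020-05-15", [0.35, 0.33, 0.36]),
]

def ndvi_profiles(obs):
    """Map each section to its time-ordered mean-NDVI profile."""
    profiles = defaultdict(list)
    for section, date, pixels in obs:
        profiles[section].append((date, round(mean(pixels), 3)))
    for series in profiles.values():
        series.sort()  # ISO dates sort chronologically as strings
    return dict(profiles)

print(ndvi_profiles(observations)["A1"])
# [('2020-03-30', 0.213), ('2020-05-15', 0.5), ('2020-07-20', 0.713)]
```

Each resulting series corresponds to one line in the figures: mean NDVI on the vertical axis against capture date on the horizontal axis.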
In various embodiments, except for its usage, CRSM 330 may be any one of a number of CRSM known in the art including, but not limited to, non-volatile or persistent memory, magnetic or solid state disk drives, compact-disk read-only memory (CD-ROM), magnetic tape drives, and so forth.
Still referring to Figure 3, process 300 for managing growing of crops, e.g., vines, includes operations performed at stages A-E. Starting at stage A, the UAVs fitted with a plurality of cameras equipped to capture/generate aerial images 302 in a plurality of spectrums are operated, in succession or concurrently, to fly over all or selected sections of the vineyard. And individual aerial images 302 of the various sections of the vineyard are taken as the UAVs fly over the sections at a selected altitude. As described, in some embodiments, the aerial images are taken at high resolution covering a small area of 3-4 cm2 per pixel or smaller. On capture/generation, individual aerial images 302 of the various sections of the vineyard are stored into CRSM 330. Optionally, overhead images of selected sections of the crop growing farm may be taken with imaging devices/cameras mounted on nettings and/or fixtures in the sections.
At stage A, the one or more terrestrial robots fitted with one or more cameras equipped to capture/generate terrestrial images 304 in visual spectrum may also be optionally operated, in succession or concurrently, to traverse all or selected sections of the crop growing farm, e.g., a vineyard. And individual terrestrial images 304 of various crops of various types or varietals being grown in the various sections of the crop growing farm are taken as the terrestrial robots traverse over the sections. On capture/generation, individual terrestrial images 304 of various crops of various types or varietals being grown in the various sections of the crop growing farm are stored into CRSM 330.
From stage A, process 300 may proceed to stages B and C in parallel. At stage B, image processor 322 may be executed on a computer system to machine process and combine/stitch together individual aerial/overhead images taken in various spectrums (taken by the UAVs and/or stationary mounted cameras) to form a plurality of composite
aerial/overhead images 332 of the crop growing farm, in the various spectrums. On generation, composite aerial/overhead images 332 of the crop growing farm, in the various spectrums, are stored into CRSM 330.
At stage C, analyzer 324 may be executed on the computer system to machine process individual aerial/overhead images 302 and/or individual terrestrial images 304 to machine analyze and detect anomalies associated with growing crops of the various types or varietals, taking into consideration topological information of the crop growing farm, and current planting information of the crop growing farm. Anomalies may include, but are not limited to, whether a section is under irrigated or over irrigated. By taking the topological information into consideration, the analyzer may avoid falsely identifying, e.g., a pond or a creek as over irrigated, or a section growing crops of particular types, e.g., vines of a particular varietal, being stressed, as under irrigated. In embodiments where terrestrial images 304 are also being machine processed and analyzed to detect anomalies associated with growing crops of the various types or varietals, the analysis may take into consideration phenology information of the various crops, at different growing stages. Anomalies may include, but are not limited to, various types of plant diseases and/or pest infestations.
In some embodiments, machine analysis of aerial/overhead images 302 as well as terrestrial images 304, to detect anomalies associated with growing crops of various types or varietals may further take into consideration local environmental data 312 collected by local sensors disposed at various locations throughout the vineyard, and/or regional/areal environmental data 312 provided by one or more remote environmental data services, applicable to the crop growing farm.
From stages B and C, process 300 may proceed to stage D. At stage D, reporter 326 may be executed on a computer system to machine generate one or more visual reports 336 of the crop growing farm, with indications of the anomalies detected. The visual reports 336 are generated using composite aerial/overhead images 332, and based at least in part on the results of the analysis 334. As described earlier, visual reports 336 may be 2D, 3D, or holographic.
Next, at stage E, on generation of visual reports 336, interactive report reader 328 may be executed on a computing system to machine facilitate interactive viewing of visual reports 336, by a user.
Still referring back to Figure 3, in some embodiments, each of image processor 322, analyzer 324, reporter 326 and interactive report reader 328 may be implemented in hardware, software or a combination thereof. Example hardware implementations may include, but are not limited to, application specific integrated circuits (ASIC) or programmable circuits (such as Field Programmable Gate Arrays (FPGA)) programmed with the operational logic. Software implementations may include implementations in instructions of instruction set architectures (ISA) supported by the target processors, or any one of a number of high level programming languages that can be compiled into instructions of the ISA of the target processors. In some embodiments, especially those embodiments where analyzer 324 includes at least one neural network, at least a portion of analyzer 324 may be implemented in an accelerator. One example software architecture and an example hardware computing platform will be further described later with references to Figures 10 and 11.
Referring now to Figures 4A-4B, wherein an example visual report to assist in managing crop growing, and an example user interface for interactively viewing the visual report, according to various embodiments, are illustrated. As shown in Figure 4A and described earlier, example visual report 400 includes a composite image 402 of the crop growing farm, e.g., a vineyard, formed by combining or stitching together the individual aerial images taken of the various sections of the crop growing farm, while the UAVs flew over the sections of the crop growing farm (and optionally, individual overhead images taken of the various sections of the crop growing farm by various stationary imaging devices/cameras mounted on fixtures in the various sections). For the illustrated embodiments, the various sections of the crop growing farm may be annotated with various information 404 useful to the user, including, but not limited to, section identification information, crop types/varietals being grown, anomalies detected, and so forth. Example composite image 402 of an example vineyard is formed by combining or stitching together the individual aerial/overhead NDRE images taken of the various sections of the example vineyard.
In various embodiments, visual report 400 is in colors. The different gray scale levels in the drawing correspond to different colors depicting various conditions as captured by the aerial/overhead images taken in a particular spectrum, e.g., NDVI, NDRE and so forth.
For the illustrated embodiments, visual report 400 may further include a legend 406 and auxiliary information 408 to assist a user in comprehending the information provided. For example, legend 406 may provide the quantitative scale of a condition metric, such as soil moisture level, corresponding to the different colors.
Example auxiliary information 408 may include, but is not limited to, the time the aerial/overhead images were taken, the air temperature at the time, the wind speed at the time, the wind direction at the time, and other observed conditions at the time.
Figure 4B illustrates an example user interface for facilitating a user in interactively viewing the visual reports with a user/client computing device. As shown, for the embodiments, user interface 450 includes a number of tabs 452a-452c, one each for viewing a particular visual report of aerial/overhead images taken in a particular spectrum. Each tab, e.g., tab 452a, includes main display area 454 for displaying the visual report 462 annotated with highlights 464 of anomalies detected. Additionally, each tab may further include section 456 for displaying various information, e.g., file information, and section 458 for displaying various command icons. In response to a selection of a command, or a selection of an anomaly highlighted, a pop-up window 466 may be displayed to provide further information and/or facilitate further interaction.
Referring now to Figure 5, wherein an example process for machine processing the aerial/overhead images, according to various embodiments, is illustrated. As shown, for the embodiments, example process 500 for machine processing/combining the aerial/overhead images to generate the composite aerial/overhead image includes operations performed at blocks 502-512. The operations may be performed by e.g., image processor 322 of Figure 3.
Example process 500 starts at block 502. At block 502, a corner aerial/overhead image may be retrieved. For example, it may be the upper left corner image, the upper right corner image, the lower right corner image or the lower left corner image. Next at block 504, the next image in the next column of the same row (or next row, same column) may be retrieved and combined/stitched with the previously processed images, depending on whether the aerial/overhead images are being combined on a column first basis (or a row first basis).
At block 506, a determination is made if the last column has been reached, if the aerial/overhead images are being combined or stitched on a column first basis (or the last row has been reached, if the aerial/overhead images are being combined or stitched on a row first basis). If the last column of the row (or last row of the column) has not been reached, process 500 returns to block 504, and continues therefrom as earlier described. If the last column of the row (or last row of the column) has been reached, process 500 proceeds to block 508.
At block 508, the next image in the first column of the next row, if the aerial/overhead images are being combined or stitched on a column first basis (or first row, next column, if the aerial/overhead images are being combined or stitched on a row first basis), may be retrieved and combined/stitched with the previously processed images.
At block 510, a determination is made if the last row has been reached, if the aerial/overhead images are being combined or stitched on a column first basis (or the last column has been reached, if the aerial/overhead images are being combined or stitched on a row first basis). If the last row of the column (or last column of the row) has not been reached, process 500 proceeds to block 512.
At block 512, the next image in the next row, first column is retrieved, if the aerial/overhead images are being combined or stitched on a column first basis (or next column, first row, if the aerial/overhead images are being combined or stitched on a row first basis), and combined/stitched with the previously processed images. Thereafter, process 500 continues at block 504, as earlier described.
Eventually, it is determined at block 510 that the last row of the column (or last column of the row) has been reached. At such time, process 500 ends.
While the combining/stitching process to generate the composite aerial/overhead image has been described with an example process that starts at one of the 4 corners of a substantially rectangular sectional partition of the crop growing farm, it should be noted that the present disclosure is not so limited. The crop growing farm may be in any shape, and may be partitioned into sections in a non-rectangular manner. The combining/stitching process may start with any aerial/overhead image, and radiate out to combine and stitch the next aerial/overhead image in any number of directions successively.
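For a rectangular partition, the column-first traversal of blocks 502-512 can be sketched as follows (a toy model: images are addressed by (row, column), and the `stitch` operation, which in practice would perform feature-based alignment and blending, is reduced here to concatenating identifiers):

```python
def stitch_column_first(images, rows, cols):
    """Combine per-section images into one composite, column-first,
    starting from the upper-left corner image (block 502)."""
    composite = images[(0, 0)]                      # corner image
    for r in range(rows):
        for c in range(cols):
            if (r, c) == (0, 0):
                continue                            # corner already taken
            composite = stitch(composite, images[(r, c)])  # blocks 504/508/512
    return composite

def stitch(a, b):
    # Stand-in for a real image stitcher (e.g., keypoint matching + blending).
    return a + "+" + b

tiles = {(r, c): f"img{r}{c}" for r in range(2) for c in range(2)}
print(stitch_column_first(tiles, 2, 2))  # img00+img01+img10+img11
```

The row-first variant simply swaps the two loops; likewise, a traversal that starts from any image and radiates outward changes only the visiting order, not the stitching itself.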
Referring now to Figure 6, wherein an example process for machine analyzing the images, according to various embodiments, is illustrated. As shown, for the embodiments, example process 600 for machine analyzing the images includes operations at blocks 602-612. The operations at blocks 602-612 may be machine performed by e.g., analyzer 324 of Figure 3.
Process 600 starts at block 602. At block 602, an individual aerial/overhead image (and optionally, corresponding terrestrial images of the section) is (are) retrieved. Next, at block 604, topological information of the crop growing farm and current planting information for the section are retrieved. From block 604, process 600 may proceed to one of blocks 606, 608 or 610, depending on whether corresponding terrestrial images are also analyzed, and/or whether local/remote environmental data are considered.
If corresponding terrestrial images are also analyzed, process 600 proceeds to
block 606. At block 606, phenology information of the crop types/varietals being grown, at different growing stages, is retrieved for the analysis. If environmental data are also being taken into consideration, process 600 also proceeds to block 608. At block 608, the local/remote environmental data are retrieved.
From block 604, 606 or 608, process 600 eventually proceeds to block 610. At block 610, the aerial/overhead image, and optionally, corresponding terrestrial images, are analyzed for anomalies associated with growing vines of the various varietals. As described earlier, the analysis takes into consideration the topological information of the crop growing farm, and current planting information. The analysis may also optionally take into consideration phenology information, at different growing stages, and/or local/remote environmental data.
If anomalies are not detected, process 600 ends, otherwise the anomalies are noted, before ending process 600. Process 600 may be repeated for each section or selected sections of the crop growing farm.
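The per-section flow of process 600 can be sketched as follows. This is a toy illustration only: the field names, thresholds, and detection rules are invented for the sketch, and the disclosure's analyzer may instead apply a trained neural network (per Figure 9):

```python
def analyze_section(aerial_img, topology, planting, terrestrial_imgs=None,
                    phenology=None, env_data=None):
    """Sketch of process 600 for one farm section.

    `planting` and `env_data` are accepted but unused here; a fuller
    analyzer would use them for varietal-specific thresholds."""
    anomalies = []
    # Topology guards against false positives, e.g., a pond read as over irrigation.
    if aerial_img["moisture"] > 0.8 and topology.get("water_body") is None:
        anomalies.append("over irrigated")
    if aerial_img["moisture"] < 0.2:
        anomalies.append("under irrigated")
    if terrestrial_imgs and phenology:
        for img in terrestrial_imgs:
            # Compare observed foliage against the expected growth stage.
            if img["leaf_spotting"] and phenology["stage"] != "senescence":
                anomalies.append("possible disease/infestation")
    return anomalies

result = analyze_section({"moisture": 0.1}, topology={}, planting="pinot noir")
print(result)  # ['under irrigated']
```

The function returns an empty list when no anomaly is detected, matching the "note anomalies, else end" behavior of process 600; it would be called once per section or selected section.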
Referring now to Figure 7, wherein an example process for machine generating a visual report, according to various embodiments, is illustrated. As shown, for the embodiments, process 700 includes various operations performed at blocks 702-710.
The operations at blocks 702-710 may be performed by e.g., reporter 326 of Figure 3.
Example process 700 starts at block 702. At block 702, one or more composite aerial/overhead images are outputted. Next at block 704, an area of the crop growing farm, e.g., a vineyard, is selected, and at block 706, the selected area is examined to determine whether anomalies associated with growing crops of various types, such as vines of various varietals, were detected. If a result of the determination indicates that anomalies associated with growing crops of various types or varietals were not detected, process 700 proceeds to block 710, else process 700 proceeds to block 708, before proceeding to block 710. At block 708, the visual report is annotated to highlight the anomaly detected.
At block 710, a determination is made on whether there are additional areas of the crop growing farm to be analyzed. If a result of the determination indicates there are additional areas of the crop growing farm to be analyzed, process 700 returns to block 704, and proceeds therefrom as earlier described. Otherwise, process 700 ends.
Referring now to Figure 8, wherein an example process for machine facilitating interactive viewing of a visual report, according to various embodiments, is illustrated. As shown, for the illustrated embodiments, process 800 for facilitating interactive viewing of
the visual report comprises operations at blocks 802-806. In some embodiments, operations at blocks 802-806 may be performed by e.g., interactive report reader 328 of Figure 3.
Process 800 may start at block 802. At block 802, a user request may be received. The user request may be associated with displaying a new visual report, providing further information on a detected anomaly, providing remedial action suggestions for a detected anomaly, and so forth. Next at block 804, the user request may be processed. At block 806, the processing results, e.g., the requested visual report, further explanation of an anomaly of interest, a remedial action suggestion, and so forth, may be outputted/displayed for the user. The process results may optionally further include facilities for further interaction by the user.
Referring now to Figure 9, wherein an example neural network, in accordance with various embodiments, is illustrated. As shown, example neural network 900 may be suitable for use e.g., by analyzer 324 of Figure 3, in determining whether the images suggest anomalies associated with growing vines of the varietals. Example neural network 900 is a multilayer feedforward neural network (FNN) comprising an input layer 912, one or more hidden layers 914 and an output layer 916. Input layer 912 receives data of input variables (xi) 902. Hidden layer(s) 914 processes the inputs, and eventually, output layer 916 outputs the determinations or assessments (yi) 904. In one example implementation, the input variables (xi) 902 of the neural network are set as a vector containing the relevant variable data, while the output determination or assessment (yi) 904 of the neural network is also set as a vector.
The multilayer FNN may be expressed through the following equations:

$ho_i = f\left(\sum_{j=1}^{R} (iw_{i,j} \, x_j) + hb_i\right)$, for $i = 1, \ldots, N$

$y_i = f\left(\sum_{k=1}^{N} (hw_{i,k} \, ho_k) + ob_i\right)$, for $i = 1, \ldots, S$

where $ho_i$ and $y_i$ are the hidden layer variables and the final outputs, respectively. $f()$ is typically a non-linear function, such as the sigmoid function or the rectified linear (ReLU) function that mimics the neurons of the human brain. $R$ is the number of inputs. $N$ is the size of the hidden layer, or the number of neurons. $S$ is the number of the outputs.
The goal of the FNN is to minimize an error function E between the network outputs and the desired targets, by adapting the network variables iw, hw, hb, and ob, via training, as follows:
$E = \sum_{k=1}^{m} E_k$, where $E_k = \sum_{p=1}^{S} \left(t_{kp} - y_{kp}\right)^2$

where $y_{kp}$ and $t_{kp}$ are the predicted and the target values of the $p$th output unit for sample $k$, respectively, and $m$ is the number of samples.
In some embodiments, analyzer 324 of Figure 3 may include a pre-trained neural network 900 to determine whether an aerial/overhead and/or a terrestrial image suggests one or more anomalies associated with growing vines of the varietals. The input variables (xi) 902 may include an aerial/overhead image of a section captured in a particular spectrum, one or more corresponding terrestrial images, topological data, current planting data, phenology data, and environmental data. The output variables (yi) 904 may include values indicating whether anomalies are detected and/or anomaly types. The network variables of the hidden layer(s) for the neural network of analyzer 324, for determining whether an aerial/overhead image of a section captured in a particular spectrum and/or corresponding terrestrial images suggest anomalies with the vines being grown, are determined by the training data.
In the example neural network of Figure 9, for simplicity of illustration, there is only one hidden layer in the neural network. In some other embodiments, there can be many hidden layers. Furthermore, the neural network can be of some other type of topology, such as a Convolutional Neural Network (CNN), Recurrent Neural Network (RNN), and so forth.
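The forward pass of the single-hidden-layer FNN described above can be sketched in plain Python. This is a toy example with made-up weights and dimensions (R=2 inputs, N=2 hidden neurons, S=1 output); it is not the trained analyzer:

```python
import math

def sigmoid(v: float) -> float:
    """The non-linear activation f(), here the sigmoid function."""
    return 1.0 / (1.0 + math.exp(-v))

def fnn_forward(x, iw, hb, hw, ob):
    """One-hidden-layer FNN forward pass:
    ho_i = f(sum_j iw[i][j]*x[j] + hb[i]),  i = 1..N
    y_i  = f(sum_k hw[i][k]*ho[k] + ob[i]), i = 1..S
    """
    ho = [sigmoid(sum(w * xj for w, xj in zip(row, x)) + b)
          for row, b in zip(iw, hb)]
    return [sigmoid(sum(w * hk for w, hk in zip(row, ho)) + b)
            for row, b in zip(hw, ob)]

# Illustrative weights only; real values of iw, hb, hw, ob come from training.
y = fnn_forward([0.5, -0.2],
                iw=[[0.1, 0.4], [-0.3, 0.2]], hb=[0.0, 0.1],
                hw=[[0.7, -0.5]], ob=[0.05])
print(y)  # a single value in (0, 1), e.g., an anomaly likelihood
```

Training would then adjust iw, hb, hw and ob to minimize the error function E over the labeled samples, e.g., by backpropagation.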
Referring now to Figure 10, wherein a software component view of the crop growing management (CGM) system, according to various embodiments, is illustrated. As shown, for the embodiments, CGM system 1000, which could be CGM system 110, includes hardware 1002 and software 1010. Software 1010 includes hypervisor 1012 hosting a number of virtual machines (VMs) 1022-1028. Hypervisor 1012 is configured to host execution of VMs 1022-1028. The VMs 1022-1028 include a service VM 1022
and a number of user VMs 1024-1028. Service VM 1022 includes a service OS hosting execution of a number of system services and utilities. User VMs 1024-1028 may include a first user VM 1024 having a first user OS hosting execution of image processing 1034, a second user VM 1026 having a second user OS hosting execution of analysis and report generation 1036, and a third user VM 1028 having a third user OS hosting execution of interactive report viewing, and so forth.
Except for the crop growing management technology of the present disclosure incorporated, elements 1012-1038 of software 1010 may be any one of a number of these elements known in the art. For example, hypervisor 1012 may be any one of a number of hypervisors known in the art, such as KVM, an open source hypervisor, Xen, available from Citrix Inc. of Fort Lauderdale, FL, or VMware, available from VMware Inc. of Palo Alto, CA, and so forth. Similarly, the service OS of service VM 1022 and the user OS of user VMs 1024-1028 may be any one of a number of OS known in the art, such as Linux, available e.g., from Red Hat Enterprise of Raleigh, NC, or Android, available from Google of Mountain View, CA.
In alternate embodiments, where CGM 1000 may be configured to service multiple crop growing farms, each user VM 1024, 1026 and 1028 may be configured to respectively handle image processing, analysis, report generation, and interactive report viewing of one crop growing farm, to provide data isolation between crop growing farms.
Referring now to Figure 11, wherein an example computing platform that may be suitable for use to practice the present disclosure, according to various embodiments, is illustrated. As shown, computing platform 1100, which may be hardware 1002 of Figure 10, may include one or more system-on-chips (SoCs) 1102, ROM 1103 and system memory 1104. Each SoC 1102 may include one or more processor cores (CPUs), one or more graphics processor units (GPUs), and one or more accelerators, such as computer vision (CV) and/or deep learning (DL) accelerators. ROM 1103 may include basic input/output system services (BIOS) 1105. The CPUs, GPUs, and CV/DL accelerators may be any one of a number of these elements known in the art. Similarly, ROM 1103 and BIOS 1105 may be any one of a number of ROM and BIOS known in the art, and system memory 1104 may be any one of a number of volatile storage known in the art.
Additionally, computing platform 1100 may include persistent storage devices 1106. Examples of persistent storage devices 1106 may include, but are not limited to, flash drives, hard drives, compact disc read-only memory (CD-ROM) and so forth.
Further, computing platform 1100 may include input/output (I/O) device interfaces to couple I/O devices 1108 (such as display, keyboard, cursor control and so forth) to system 1100, and communication interfaces 1110 (such as network interface cards, modems and so forth). Communication and I/O devices 1108 may include any number of communication and I/O devices known in the art. I/O devices may include in particular sensors 1120, which may be some of the sensors 108 of Figure 1. Examples of communication devices may include, but are not limited to, networking interfaces for Bluetooth®, Near Field Communication (NFC), WiFi, Cellular communication (such as LTE 4G/5G) and so forth. The elements may be coupled to each other via system bus 1112, which may represent one or more buses. In the case of multiple buses, they may be bridged by one or more bus bridges (not shown).
Each of these elements may perform its conventional functions known in the art.
In particular, ROM 1103 may include BIOS 1105 having a boot loader. System memory 1104 and mass storage devices 1106 may be employed to store a working copy and a permanent copy of the programming instructions implementing the operations associated with hypervisor 1012, the service/user OS of service/user VMs 1022-1028, and components of the CGM technology (such as image processor 322, analyzer 324, reporter 326 and interactive report reader 328, and so forth), collectively referred to as computational logic 922. The various elements may be implemented by assembler instructions supported by processor core(s) of SoCs 1102 or high-level languages, such as, for example, C, that can be compiled into such instructions.
As will be appreciated by one skilled in the art, the present disclosure may be embodied as methods or computer program products. Accordingly, the present disclosure, in addition to being embodied in hardware as earlier described, may take the form of an entirely software embodiment (including firmware, resident software, micro code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to as a “circuit,” “module” or “system.” Furthermore, the present disclosure may take the form of a computer program product embodied in any tangible or non-transitory medium of expression having computer-usable program code embodied in the medium. Figure 12 illustrates an example computer-readable non-transitory storage medium that may be suitable for use to store instructions that cause an apparatus, in response to execution of the instructions by the apparatus, to practice selected aspects of the present disclosure. As shown, non-transitory computer-readable storage medium 1202 may include a number of programming instructions 1204. Programming instructions 1204 may be configured to enable a device, e.g., computing platform 1100, in response to
execution of the programming instructions, to implement (aspects of) hypervisor 1012, the service/user OS of service/user VMs 1022-1028, and components of the CGM technology (such as image processor 322, analyzer 324, reporter 326 and interactive report reader 328, and so forth). In alternate embodiments, programming instructions 1204 may be disposed on multiple computer-readable non-transitory storage media 1202 instead. In still other embodiments, programming instructions 1204 may be disposed on computer-readable transitory storage media 1202, such as signals.
Any combination of one or more computer usable or computer readable medium(s) may be utilized. The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, transmission media such as those supporting the Internet or an intranet, or a magnetic storage device. Note that the computer-usable or computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory. In the context of this document, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-usable medium may include a propagated data signal with the computer-usable program code embodied therewith, either in baseband or as part of a carrier wave. The computer usable program code may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
The present disclosure is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some
alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a,” “an” and “the” are intended to include plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Embodiments may be implemented as a computer process, a computing system, or as an article of manufacture such as a computer program product on computer readable media. The computer program product may be a computer storage medium readable by a computer system and encoding computer program instructions for executing a computer process.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present disclosure has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the disclosure in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the disclosure. The embodiment was chosen and described in order to best explain the principles of the disclosure and the practical application, and to enable others of ordinary skill in the art to understand the disclosure for embodiments with various modifications as are suited to the particular use contemplated.
It will be apparent to those skilled in the art that various modifications and variations can be made in the disclosed embodiments of the disclosed device and
associated methods without departing from the spirit or scope of the disclosure. Thus, it is intended that the present disclosure covers the modifications and variations of the embodiments disclosed above provided that the modifications and variations come within the scope of any claims and their equivalents.
Claims
1. A method for managing growing crops in a crop growing farm, comprising: operating one or more unmanned aerial vehicles (UAVs) to fly over a plurality of sections of the crop growing farm, the one or more UAVs together being fitted with a plurality of cameras equipped to generate images in a plurality of spectrums, wherein the plurality of sections of the crop growing farm grow crops of one or more types or varietals, and the plurality of cameras include a Red-Green-Blue (RGB) camera equipped to generate images in a visual spectrum, a Normalized Difference Vegetation Index (NDVI) camera equipped to generate images in red or near infrared spectrum, and a Normalized Difference Red Edge (NDRE) camera equipped to generate images in near red spectrum; taking a plurality of aerial images of the sections of the crop growing farm in the plurality of spectrums, using the plurality of cameras, while the one or more UAVs are flying over the plurality of sections of the crop growing farm; storing the plurality of aerial images in a computer readable storage medium (CRSM); executing an analyzer on a computing system with access to the CRSM to machine analyze the plurality of aerial images in the visual, red or near infrared, and near red spectrums, and a plurality of overhead or terrestrial images for anomalies associated with growing the crops of the one or more types or varietals in the plurality of sections of the crop growing farm, including potential pest infestations, the machine analysis taking into consideration topological information of the crop growing farm, as well as current planting information of the crop growing farm; and executing a reporter on the computing system to machine produce a visual report with indications of crop growing anomalies in the plurality of sections of the crop growing farm, including potential pest infestations, to assist in managing growing the crops of one or more types or varietals in the plurality of sections of the crop
growing farm, the indication of crop growing anomalies, including potential pest infestations, being based at least in part on results of the analysis of the visual, red or near infrared, and near red spectrum images, and the terrestrial images;
wherein the crops are one or more of Hops, Kiwis, Apples, Berries, Cherries,
Citrus Fruit, Avocado, Pears, Plums, Hemp, or Cannabis, or wherein the crop growing farm is a selected one of a fruit growing farm or a vegetable growing farm.
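The two vegetation indices named in claim 1 have standard band-ratio definitions. The sketch below computes them over small synthetic band arrays standing in for the NDVI/NDRE camera outputs; the band values are invented for illustration:

```python
import numpy as np

def ndvi(nir, red):
    # Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)
    return (nir - red) / (nir + red + 1e-9)   # epsilon guards against divide-by-zero

def ndre(nir, red_edge):
    # Normalized Difference Red Edge: (NIR - RedEdge) / (NIR + RedEdge)
    return (nir - red_edge) / (nir + red_edge + 1e-9)

nir  = np.array([[0.6, 0.7], [0.5, 0.8]])     # synthetic reflectance per pixel
red  = np.array([[0.1, 0.2], [0.3, 0.1]])
edge = np.array([[0.3, 0.4], [0.4, 0.3]])

print(ndvi(nir, red))    # values near 1 suggest dense healthy vegetation
print(ndre(nir, edge))
```

Both indices fall in [-1, 1]; the analyzer's anomaly detection would look for sections whose index maps deviate from what the planting and topology information predicts.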
2. The method of claim 1, wherein operating the one or more UAVs to fly over the plurality of sections of the crop growing farm comprises operating one or more fixed wing or lightweight UAVs to systematically fly over all sections of the crop growing farm.
3. The method of claim 1, wherein operating the one or more UAVs to fly over the plurality of sections of the crop growing farm comprises operating an UAV having the RGB, NDVI and NDRE cameras to generate the visual, red or near infrared, and near red spectrum images.
4. The method of claim 1, wherein the crop growing farm spans over 100 acres, with over 1000 plants planted per acre, and each pixel of each of the plurality of aerial images covers about 3-4 cm2 or smaller.
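As a rough consistency check on the figures in claim 4 (taking 3.5 cm² as the midpoint of "about 3-4 cm²"; all values assumed for illustration):

```python
import math

acres = 100
plants_per_acre = 1000
pixel_area_cm2 = 3.5                       # midpoint of "about 3-4 cm2" per pixel

gsd_cm = math.sqrt(pixel_area_cm2)         # ground sampling distance per pixel side
acre_cm2 = 4046.86 * 100 * 100             # 1 acre = 4046.86 m^2, in cm^2
pixels_per_acre = acre_cm2 / pixel_area_cm2

print(round(gsd_cm, 2))                    # 1.87 -- ~1.87 cm per pixel side
print(int(pixels_per_acre))                # ~11.6 million pixels per acre
```

So at this resolution each of the ~1000 plants per acre is covered by thousands of pixels, which is what makes per-plant anomaly analysis from aerial imagery plausible.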
5. The method of claim 1, wherein to machine analyze the plurality of aerial images and a plurality of overhead or terrestrial images comprises to machine identify one or more of the sections as potentially in an over irrigated state or an under irrigated state for the crop types or varietals being grown in the one or more sections, based at least in part on the topological information of the crop growing farm, as well as the current planting information of the crop growing farm.
6. The method of claim 5, wherein executing the reporter on the computer system comprises executing the reporter on the computer system to machine produce the visual report with indications of the one or more sections identified as potentially in an over irrigated state or an under irrigated state for the crop types or varietals being grown in the one or more sections.
7. The method of claim 1, further comprising executing an image processing program on the computing system to machine process the plurality of aerial images to combine the plurality of aerial images to produce one or more composite images of the crop growing
farm; wherein the reporter machine produces the visual report using the one or more composite images of the crop growing farm.
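The compositing step in claim 7 can be illustrated with a toy grid mosaic. Real pipelines use feature-based stitching and orthorectification rather than known grid positions; this sketch only shows the placement of per-section tiles into one farm-wide image:

```python
import numpy as np

tile_h, tile_w = 4, 4
# Synthetic per-section aerial tiles, keyed by their (row, col) grid position.
tiles = {(r, c): np.full((tile_h, tile_w), r * 2 + c) for r in range(2) for c in range(2)}

# Place each tile into the composite at its grid offset.
mosaic = np.zeros((2 * tile_h, 2 * tile_w))
for (r, c), tile in tiles.items():
    mosaic[r * tile_h:(r + 1) * tile_h, c * tile_w:(c + 1) * tile_w] = tile

print(mosaic.shape)  # (8, 8)
```

The reporter can then overlay anomaly indications directly on this single composite instead of on hundreds of individual frames.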
8. The method of claim 1, further comprising: operating one or more terrestrial robots to traverse the plurality of sections of the crop growing farm, the one or more terrestrial robots being fitted with one or more cameras equipped to generate the plurality of terrestrial images in the visual spectrum; and taking the plurality of visual spectrum terrestrial images of the crops of the one or more types or varietals being grown in the plurality of sections of the crop growing farm, using the one or more cameras fitted on the one or more terrestrial robots, while the one or more terrestrial robots are traversing the plurality of sections of the crop growing farm.
9. The method of claim 8, further comprising storing the plurality of visual spectrum terrestrial images of the crops of the one or more types or varietals being grown, in the CRSM; and wherein executing the analyzer further comprises executing the analyzer on the computing system to machine analyze the plurality of visual spectrum terrestrial images, with the machine analysis of the visual spectrum terrestrial images taking into consideration phenology information of the crop types or varietals being grown, at different growing stages.
10. The method of claim 9, wherein to machine analyze the plurality of visual spectrum terrestrial images for additional anomalies comprises to machine identify potential pest infestation in one or more of the crops of the one or more types or varietals being grown in the one or more of the sections of the crop growing farm, based at least in part on the phenology information of the crop types or varietals being grown, at different growing stages.
11. The method of claim 10, wherein executing the reporter on the computer system comprises executing the reporter on the computer system to machine produce the visual report with indications of the one or more crops of the crops of one or more types or varietals being grown in the one or more of the sections of the crop growing farm as being potentially pest infested.
12. The method of any one of claims 1-11, further comprising sensing with a plurality of sensors disposed in at least selected ones of the plurality of sections of the crop growing farm environmental data at various locations in the selected ones of the sections of the crop growing farm; and wherein machine processing and analyzing of the plurality of aerial and overhead or terrestrial images is further in view of the environmental data sensed at the various locations in the selected ones of the sections of the crop growing farm.
13. The method of any one of claims 1-11, further comprising receiving, from an external source, environmental data applicable to the crop growing farm; and wherein machine processing and analyzing of the plurality of aerial and overhead or terrestrial images is further in view of the environmental data applicable to the crop growing farm.
14. The method of any one of claims 1-11, further comprising executing an interactive report reader on the computer system to facilitate a user in viewing the visual report interactively.
15. A method for managing growing crops in a crop growing farm, comprising: operating one or more terrestrial robots to traverse a plurality of sections of the crop growing farm, the one or more terrestrial robots being fitted with one or more cameras equipped to generate images in visual spectrum, and the plurality of sections of the crop growing farm growing crops of one or more types or varietals; taking a plurality of visual spectrum terrestrial images of the crops of the one or more types or varietals growing in the sections of the crop growing farm, using the one or more cameras, while the one or more terrestrial robots are traversing the plurality of sections of the crop growing farm; storing the plurality of visual spectrum terrestrial images in a computer readable storage medium (CRSM); executing an analyzer on a computing system with access to the CRSM to machine analyze the plurality of visual spectrum terrestrial images and a plurality of aerial or overhead images for anomalies associated with growing the crops of the one or more types or varietals in the plurality of sections of the crop growing farm, where the plurality of aerial or overhead images include visual, red or near infrared, and near red images respectively taken by a Red-Green-Blue (RGB) camera, a Normalized Difference
Vegetation Index (NDVI) camera, and a Normalized Difference Red Edge (NDRE) camera, the anomalies including pest infestation among the crops, and the machine analysis taking into consideration current planting information of the crop growing farm and phenology information of the crops of the one or more types or varietals being grown in the plurality of sections of the crop growing farm, at different growing stages; and executing a reporter on the computer system to machine produce a visual report with indications of growing anomalies of the crops to assist in managing growing the crops of one or more types or varietals in the plurality of sections of the crop growing farm, the indication of growing anomalies being based at least in part on results of the analysis; wherein the crops are one or more of Hops, Kiwis, Apples, Berries, Cherries,
Citrus Fruit, Avocado, Pears, Plums, Hemp, or Cannabis, or wherein the crop growing farm is a selected one of a fruit growing farm or a vegetable growing farm.
16. The method of claim 15, wherein executing the analyzer comprises executing the analyzer on the computing system to process and analyze the plurality of visual spectrum terrestrial images in view of the current planting information of the crop growing farm and the phenology information of the crops of the one or more types or varietals being grown at different growing stages, to identify one or more crops being grown in one or more of the sections as potentially being infested.
17. The method of claim 16, wherein executing the reporter comprises executing the reporter on the computing system to machine produce the visual report with highlights of the identified potential pest infestations.
18. A computing system for managing growing crops in a crop growing farm comprising a processor and memory coupled to the processor and having instructions stored therein to cause the computing system, in response to execution of the instructions by the computing system, to: analyze a plurality of aerial or overhead images in a plurality of spectrums and a plurality of terrestrial images for anomalies associated with growing crops of one or more types or varietals in a plurality of sections of a crop growing farm, taking into consideration topological information of the crop growing farm, as well as current planting
information of the crop growing farm, the plurality of aerial or overhead images include visual, red or near infrared, and near red images respectively taken by an airborne Red-Green-Blue (RGB) camera, an airborne Normalized Difference Vegetation Index (NDVI) camera, and an airborne Normalized Difference Red Edge (NDRE) camera, the plurality of terrestrial images being captured by one or more roving cameras, and the anomalies being analyzed include pest infestations; and produce a visual report with indications of crop growing anomalies in the plurality of sections of the crop growing farm to assist in managing growing the crops in the crop growing farm, the indication of crop growing anomalies, including pest infestations, being based at least in part on results of the analysis; wherein the plurality of aerial images of the sections of the crop growing farm in the plurality of spectrums are taken using the RGB, NDVI and NDRE cameras fitted on one or more unmanned aerial vehicles (UAVs), while the one or more UAVs are flying over the plurality of sections of the crop growing farm; wherein the crops are one or more of Hops, Kiwis, Apples, Berries, Cherries,
Citrus Fruit, Avocado, Pears, Plums, Hemp, or Cannabis, or wherein the crop growing farm is a selected one of a fruit growing farm or a vegetable growing farm.
19. The computing system of claim 18, wherein to analyze the plurality of aerial or overhead images comprises to identify one or more of the sections as potentially in an over irrigated state or an under irrigated state for the crops of the one or more types or varietals being grown in the one or more sections, based at least in part on the topological information of the crop growing farm, as well as the current planting information of the crop growing farm.
20. The computing system of claim 19, wherein to produce the visual report comprises to produce the visual report with indications of the one or more sections identified as potentially in the over irrigated state or the under irrigated state for the crops of the one or more types or varietals being grown in the one or more sections.
21. The computing system of any one of claims 18-20, wherein the analysis of the visual spectrum terrestrial images takes into consideration phenology information of the crops of one or more types or varietals being grown, at different growing stages.
22. The computing system of claim 21, wherein to identify potential pest infestations is based at least in part on the phenology information of the crops of the one or more types or varietals being grown, at different growing stages.
23. The computing system of claim 22, wherein to produce the visual report comprises to produce the visual report with indications of one or more crops of the one or more types or varietals being grown in the one or more of the sections of the crop growing farm as being potentially pest infested.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/862,093 | 2020-04-29 | ||
US16/862,093 US10779476B2 (en) | 2018-09-11 | 2020-04-29 | Crop management method and apparatus with autonomous vehicles |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021221704A1 true WO2021221704A1 (en) | 2021-11-04 |
Family
ID=78373838
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2020/039685 WO2021221704A1 (en) | 2020-04-29 | 2020-06-25 | Crop management method and apparatus with autonomous vehicles |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2021221704A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11532098B2 (en) * | 2020-09-01 | 2022-12-20 | Ford Global Technologies, Llc | Determining multi-degree-of-freedom pose to navigate a vehicle |
CN118052376A (en) * | 2024-04-16 | 2024-05-17 | 四川职业技术学院 | Sensitive data anomaly detection system and method based on agricultural Internet of things |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160063420A1 (en) * | 2014-08-29 | 2016-03-03 | Ricoh Company, Ltd. | Farmland management system and farmland management method |
US20170015416A1 (en) * | 2015-07-17 | 2017-01-19 | Topcon Positioning Systems, Inc. | Agricultural Crop Analysis Drone |
US20170127606A1 (en) * | 2015-11-10 | 2017-05-11 | Digi-Star, Llc | Agricultural Drone for Use in Controlling the Direction of Tillage and Applying Matter to a Field |
KR101793509B1 (en) * | 2016-08-02 | 2017-11-06 | (주)노루기반시스템즈 | Remote observation method and system by calculating automatic route of unmanned aerial vehicle for monitoring crops |
US20170374323A1 (en) * | 2015-01-11 | 2017-12-28 | A.A.A Taranis Visual Ltd | Systems and methods for agricultural monitoring |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20934134 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 20934134 Country of ref document: EP Kind code of ref document: A1 |