CN107426958B - Agricultural monitoring system and method - Google Patents
- Publication number
- CN107426958B (granted from application CN201580073902.0A)
- Authority
- CN
- China
- Prior art keywords
- image data
- imaging sensor
- agricultural
- onboard
- acquiring
- Prior art date
- Legal status
- Active
Classifications
- H04N7/185 — Closed-circuit television [CCTV] systems for receiving images from a single remote source, from a mobile camera, e.g. for remote control
- A01B79/005 — Precision agriculture (methods for working soil)
- A01B79/02 — Methods for working soil combined with other agricultural processing, e.g. fertilising, planting
- G06T7/0004 — Image analysis; inspection of images; industrial image inspection
- H04N23/685 — Control of cameras for stable pick-up of the scene; vibration or motion blur correction performed by mechanical compensation
- B64U2101/30 — UAVs specially adapted for imaging, photography or videography
- B64U2101/40 — UAVs specially adapted for agriculture or forestry operations
- G06T2207/10032 — Image acquisition modality: satellite or aerial image; remote sensing
- G06T2207/30188 — Subject of image: vegetation; agriculture
- G06T2207/30232 — Subject of image: surveillance
- H04N23/68 — Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
Abstract
An agricultural monitoring system, comprising: an imaging sensor configured and operable to acquire image data, at sub-millimeter image resolution, of a portion of an agricultural area where crops are growing, while the imaging sensor is airborne; a communication module configured and operable to transmit image data content, based on the image data acquired by the onboard imaging sensor, to an external system; and a connector for connecting the imaging sensor and the communication module to an airborne platform.
Description
Technical Field
The present invention relates to systems, methods and computer program products for agricultural monitoring, and in particular to systems, methods and computer program products for agricultural monitoring based on image data acquired by onboard imaging sensors.
Background
Chinese utility model publication No. CN203528823, entitled "unmanned aerial vehicle with colored rice disease image recognition instrument for preventing and treating bacterial leaf blight of rice", belongs to the technical field of agricultural aviation plant protection. The intelligent UAV flies above a rice field to detect occurrences of bacterial leaf blight. A camera and a video camera mounted in an optoelectronic pod below the UAV feed color images of the rice into an image storage system; the stored images are then passed to a color rice disease image recognition instrument, which compares them with stored standard disease images to identify and confirm the disease type and the extent of the damage. The blight damage information is input into a computerized spraying-instruction system, which formulates a spraying instruction; according to that instruction, a pressure pump pressurizes the chemical pesticide liquid, and a sprayer applies the pressurized liquid to the rice field to treat the blight.
Publication No. CN203528822, entitled "unmanned aerial vehicle with colored rice disease image recognition instrument for preventing and treating rice sheath blight disease", likewise belongs to the field of agricultural aviation plant protection. A camera and a video camera in an optoelectronic pod below the intelligent UAV feed sensed color images of the rice field into an image storage system; a recognition instrument compares them with stored standard disease images to assess the damage caused by sheath blight. The damage information is input into a computerized spraying-instruction system, which, via a transmission line, regulates the pressure that a pump applies to the disease-treating chemical pesticide liquid and controls a sprayer that applies the pressurized liquid to the rice field.
Publication No. CN103523226A, also entitled "unmanned aerial vehicle with colored rice disease image recognition instrument for preventing and treating rice sheath blight disease", describes a similar arrangement: a camera and a video camera in an optoelectronic pod below the UAV feed color rice disease images into a storage system, a recognition instrument compares them with stored standard disease images to assess the sheath blight damage, and a computerized spraying-instruction system regulates, via a transmission line, the pressure applied by a pump to the disease-treating chemical pesticide liquid and the spraying of the pressurized liquid onto the rice field to prevent and treat the disease.
Japanese patent application laid-open No. JPH11235124A, entitled "precision agriculture", discusses a method of precision agriculture that prevents excessive or insufficient application of fertilizer and pesticide, improves application efficiency, and increases crop yield by monitoring the growth state of the crops, automatically forming a growth map of the crops, and applying fertilizer, pesticide, etc. based on the growth map data. The method comprises monitoring the growth status of a field crop, for example by aerial photography with a helicopter-mounted camera 70, detecting the chlorophyll content of the crop from pictures taken by the camera 70 (which is configured with a color sensor), and then forming a field crop growth map.
US patent application No. US11/353,351 entitled "remote irrigation sensing system" discusses a data collection device associated with an agricultural irrigation system that includes at least one camera operatively connected to the irrigation system.
Disclosure of Invention
According to one aspect of the invention, an agricultural monitoring method is disclosed, the method comprising: (a) flying an airborne imaging sensor along a flight path over an agricultural area where crops are growing; (b) acquiring image data of a portion of the agricultural area by the onboard imaging sensor, wherein the acquiring of the image data is performed at a set of imaging locations along the flight path, the imaging locations being such that the image data can be acquired at sub-millimeter image resolution; and (c) sending image data content based on the image data acquired by the onboard imaging sensor to an external system.
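The fly-acquire-transmit sequence of this aspect can be summarized as a simple control loop. The following is a minimal sketch only; all names (`ImagingLocation`, `acquire_frame`, `transmit`) are illustrative and not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class ImagingLocation:
    lat: float
    lon: float
    alt_above_crop_m: float  # low altitude enables sub-millimeter resolution

def monitor(flight_path, acquire_frame, transmit):
    """Acquire image data at each planned imaging location along the
    flight path and transmit content derived from it to an external system."""
    results = []
    for loc in flight_path:
        frame = acquire_frame(loc)        # sub-millimeter image data
        results.append(transmit(frame))   # image data content sent out
    return results

# Dry run with stub acquisition/transmission callables:
path = [ImagingLocation(32.0, 34.8, 15.0), ImagingLocation(32.001, 34.8, 15.0)]
result = monitor(path, acquire_frame=lambda loc: {"loc": loc}, transmit=lambda f: "sent")
```

The real acquisition and transmission steps would of course involve the imaging sensor and communication module; the loop merely shows how the acquisition-location plan drives both.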
According to another aspect of the invention, the method may include sending the image data content to the external system to display, to an agriculturist at a remote location, agronomic image data based on the image data content, thereby enabling the agriculturist to remotely analyze the agricultural area.
According to another aspect of the invention, the flight path is a terrain following flight path.
According to another aspect of the invention, the acquiring includes acquiring image data at the set of imaging locations while the onboard imaging sensor is flying along the imaging locations at a speed that is no less than 50% of an average speed of the onboard platform along the flight path.
According to another aspect of the invention, the acquiring includes mechanically moving at least one component of the onboard imaging sensor relative to the airborne platform carrying it, to compensate for motion of the onboard imaging sensor relative to the crops during the acquiring.
According to another aspect of the invention, the acquiring comprises: (a) mechanically rotating at least one optical component of the onboard imaging sensor relative to the airborne platform carrying it, to compensate for motion of the onboard imaging sensor relative to the crops during acquisition; and (b) while the at least one optical component is rotating, for each frame of a plurality of frames of image data: initiating a focusing process of the imaging sensor when the angle between the acquisition optical axis and the vertical axis is greater than 20°, and acquiring image data using vertical imaging when that angle is less than 20°.
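The per-frame decision in this aspect is a simple angle threshold on the acquisition optical axis. A minimal sketch (Python; the function name and the sweep values are illustrative, only the 20° threshold comes from the text):

```python
def frame_action(axis_angle_deg: float) -> str:
    """Decide, per frame, what to do while the optical component rotates:
    refocus while the acquisition optical axis is tilted more than 20 degrees
    from vertical, and capture with vertical imaging once within 20 degrees."""
    if axis_angle_deg > 20.0:
        return "focus"    # start the imaging sensor's focusing process
    return "capture"      # acquire image data using vertical imaging

# As the rotating component sweeps from tilted toward near-vertical:
sweep = [35.0, 25.0, 21.0, 19.0, 5.0]
actions = [frame_action(a) for a in sweep]
```

The effect is that focusing happens during the tilted part of each rotation, so the sensor is already in focus by the time the axis passes through vertical and the frame is actually captured.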
According to another aspect of the invention, the acquiring includes illuminating the crop during the acquiring to compensate for movement of the onboard imaging sensor relative to the crop during the acquiring.
According to another aspect of the invention, the flying includes flying the onboard imaging sensor along a flight path that extends over at least a first agricultural asset of a first owner and a second agricultural asset of a second owner different from the first owner, wherein the method includes acquiring first image data of a portion of the first agricultural asset and second image data of a portion of the second agricultural asset; generating first image data content based on the first image data and second image data content based on the second image data; and providing the first image data content to a first entity in a first message and the second image data content to a second entity in a second message.
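One flight can thus serve several independent owners, with the image content split into per-owner messages afterwards. A minimal sketch of that routing step (all names and the dict-based message format are illustrative assumptions):

```python
def route_image_content(frames, assets):
    """Group acquired frames into per-owner messages.
    `assets` maps an asset id to its owner; each frame records which
    agricultural asset it covers and the content derived from it."""
    messages = {}
    for frame in frames:
        owner = assets[frame["asset"]]
        messages.setdefault(owner, []).append(frame["content"])
    return messages

assets = {"field-A": "owner-1", "field-B": "owner-2"}
frames = [
    {"asset": "field-A", "content": "img-1"},
    {"asset": "field-B", "content": "img-2"},
    {"asset": "field-A", "content": "img-3"},
]
messages = route_image_content(frames, assets)
```

Each owner then receives only the content covering their own asset, even though all frames were acquired on a single flight path.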
According to another aspect of the invention, the acquiring includes acquiring image data of a portion of the agricultural area that is unreachable by ground vehicles.
According to another aspect of the invention, the acquiring includes acquiring image data of portions of the agricultural area that cannot be reached on foot.
According to another aspect of the invention, the flying comprises flying the imaging sensor with an agricultural aircraft configured for aerial application of crop protection products.
According to another aspect of the invention, the method further comprises selecting aerial application parameters for aerial application of crop protection products by the agricultural aircraft based on the processing of the image data.
According to another aspect of the invention, the set of imaging locations along the flight path is located less than 20 meters above the top of the crop growing in the agricultural area.
According to another aspect of the invention, the acquiring comprises acquiring image data of the agricultural area at a coverage rate of less than 500 square meters per hectare.
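A coverage rate below 500 square meters per hectare means the system samples the area (under 5% of it) rather than exhaustively mapping it, which is what makes sub-millimeter resolution practical. A small illustrative calculation (the footprint and frame count are invented example numbers, not from the patent):

```python
def coverage_per_hectare(frame_footprint_m2: float, frames: int, area_ha: float) -> float:
    """Imaged square meters per hectare of agricultural area."""
    return frame_footprint_m2 * frames / area_ha

# Suppose each frame covers about 2 m x 1.5 m = 3 m^2, and 100 frames
# are acquired over a 1-hectare field:
rate = coverage_per_hectare(3.0, 100, 1.0)   # 300 m^2 per hectare
sampled_fraction = rate / 10_000.0           # a hectare is 10,000 m^2
```

Here only 3% of the field is imaged, comfortably under the claimed 500 m²/ha bound, yet the sampled frames are detailed enough for per-leaf analysis.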
According to another aspect of the invention, the transmitting is followed by a subsequent instance of the flying, acquiring and transmitting, wherein the method further comprises planning a route for the subsequent flying instance based on the image data acquired in the previous instance.
According to another aspect of the invention, the acquiring includes compensating for motion of the imaging sensor during the image data acquisition.
According to another aspect of the invention, the acquiring of the image data comprises acquiring the image data using vertical imaging.
According to another aspect of the invention, the method further comprises applying computerized processing algorithms to the image data content for detecting leaf diseases or indicating parasite effects on leaves in one or more plants in the agricultural field.
According to another aspect of the invention, the flying, acquiring and transmitting are repeated over a period of weeks, wherein the method further comprises processing image data acquired at different times over the period of weeks for determining growth parameters of plants in the agricultural area.
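Because the same locations are revisited over a period of weeks, growth parameters can be derived by comparing image-based measurements across acquisition dates. A minimal sketch (the height values and the simple endpoint-difference estimator are illustrative assumptions, not the patent's method):

```python
def growth_rate_cm_per_day(observations):
    """Average growth rate from (day, height_cm) measurements of the same
    plants, estimated from imagery acquired on different dates."""
    (d0, h0), (d1, h1) = observations[0], observations[-1]
    return (h1 - h0) / (d1 - d0)

# Height estimates extracted from image data on days 0, 14 and 28:
obs = [(0, 12.0), (14, 26.0), (28, 40.0)]
rate = growth_rate_cm_per_day(obs)
```

A practical system might fit a growth curve to all observations rather than using only the endpoints, but the principle is the same: repeated flights turn single snapshots into time-series agronomic data.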
According to another aspect of the invention, the method further includes applying computerized processing algorithms to the image data to identify selected agronomically significant data, and generating agronomic image data, based on the selected agronomically significant data, for transmission to a remote system.
According to another aspect of the invention, the method further comprises applying computerized processing algorithms to the selected agronomically significant data to select a recipient of the agronomic image data from a plurality of possible recipients, based on the agronomic expertise of the possible recipients.
According to another aspect of the invention, a surveillance flight plan is developed for an airborne surveillance system prior to the flight, the surveillance flight plan including an acquisition location plan indicative of a plurality of imaging locations, wherein flying the airborne sensor along the flight path over the agricultural area is part of flying the airborne surveillance system according to the surveillance flight plan.
According to another aspect of the invention, the flight path is a terrain-following flight path; the flying comprises flying the imaging sensor with an agricultural aircraft configured for aerial application of crop protection products; the set of imaging locations along the flight path is located less than 20 meters above the top of the crops growing in the agricultural area; the acquiring comprises (a) acquiring image data at the set of imaging locations while the onboard imaging sensor is flying through the imaging locations at a speed no less than 50% of the average speed of the airborne platform along the flight path, and (b) compensating for movement of the onboard imaging sensor relative to the crops during acquisition by illuminating the crops during acquisition and by mechanically moving at least one component of the onboard imaging sensor relative to the airborne platform carrying it; the sending comprises sending the image data content to an external system to display agronomic image data based on the image data content to an agriculturist at a remote location, enabling the agriculturist to perform remote analysis of the agricultural area; and the method further comprises developing a surveillance flight plan for an airborne surveillance system prior to the flight, the surveillance flight plan including an acquisition location plan indicative of a plurality of imaging locations, wherein flying the airborne sensor along the flight path over the agricultural area according to the surveillance flight plan is part of flying the airborne surveillance system.
According to one aspect of the invention, there is disclosed a method of agricultural monitoring, the method comprising: (a) formulating a surveillance flight plan for an airborne surveillance system, the surveillance flight plan including an acquisition location plan indicative of a plurality of imaging locations; (b) flying the airborne surveillance system along a flight path over an agricultural area where crops are growing, based on the surveillance flight plan; (c) acquiring image data of a portion of the agricultural area at sub-millimeter image resolution during flight of the airborne surveillance system, based on the acquisition location plan; and (d) transmitting image data content based on the image data acquired by the airborne surveillance system to an external system.
According to another aspect of the invention, formulating the surveillance flight plan comprises receiving monitoring requests associated with a plurality of independent entities, and formulating the surveillance flight plan to indicate imaging locations for the crops of each of the plurality of independent entities.
In accordance with another aspect of the invention, the agricultural area includes a plurality of fields in which at least two types of crops are growing, wherein formulating the surveillance flight plan includes determining different acquisition parameters for imaging locations associated with the different crop varieties.
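A flight plan of this kind pairs each imaging location with the acquisition parameters of the crop growing there. The following data-structure sketch is an illustration only; the field names, parameter choices, and crop values are all assumptions, not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class AcquisitionParams:
    altitude_above_crop_m: float
    exposure_us: int

@dataclass
class SurveillanceFlightPlan:
    # acquisition location plan: imaging location id -> crop type grown there
    imaging_locations: dict = field(default_factory=dict)
    # crop type -> acquisition parameters for that variety
    params_by_crop: dict = field(default_factory=dict)

    def params_for(self, location_id: str) -> AcquisitionParams:
        """Acquisition parameters to use at a given imaging location."""
        return self.params_by_crop[self.imaging_locations[location_id]]

plan = SurveillanceFlightPlan(
    imaging_locations={"loc-1": "rice", "loc-2": "potato"},
    params_by_crop={
        "rice": AcquisitionParams(altitude_above_crop_m=10.0, exposure_us=200),
        "potato": AcquisitionParams(altitude_above_crop_m=15.0, exposure_us=400),
    },
)
```

During the flight, the system looks up the parameters per location, so a single pass over mixed fields still images each crop variety with settings suited to it.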
In accordance with one aspect of the present invention, an agricultural monitoring system is disclosed, comprising: (a) an imaging sensor configured and operable to acquire image data, at sub-millimeter image resolution, of a portion of an agricultural area where crops are growing, while the imaging sensor is airborne; (b) a communication module configured and operable to transmit image data content based on the image data acquired by the onboard imaging sensor to an external system; and (c) a connector for connecting the imaging sensor and the communication module to an airborne platform.
In accordance with another aspect of the invention, the agricultural monitoring system further includes an onboard platform that flies the onboard imaging sensor along a flight path over an agricultural area.
In accordance with another aspect of the invention, the agricultural monitoring system further includes a detachable coupling for detachably coupling the onboard imaging sensor to the onboard platform.
According to another aspect of the invention, the imaging sensor is configured and operable to acquire image data less than 20 meters above the top of crops growing in an agricultural area.
According to another aspect of the invention, the imaging sensor is configured and operable to acquire image data when the flight speed exceeds 10 m/s.
In accordance with another aspect of the invention, the agricultural monitoring system further includes at least one mechanical coupler coupling at least one component of the imaging sensor to a motor, the at least one component of the imaging sensor being mechanically moved relative to the airborne platform by movement of the motor while image data is acquired by the imaging sensor.
According to another aspect of the invention, the agricultural monitoring system further comprises a motor operable to mechanically rotate at least one optical component of the imaging sensor relative to the airborne platform, to compensate for movement of the imaging sensor relative to the crops during acquisition; wherein the imaging sensor is configured and operable to: (a) initiate a focusing process while rotating the at least one optical component when the acquisition optical axis is at an angle greater than 20° to the vertical axis, and (b) acquire image data using vertical imaging when the acquisition optical axis is at an angle less than 20° to the vertical axis.
According to another aspect of the invention, the agricultural monitoring system further includes an illumination unit configured and operable to illuminate the crop during the acquisition of image data by the imaging sensor.
According to another aspect of the invention, the imaging sensor is configured and operable to acquire image data using vertical imaging.
According to another aspect of the invention, the agricultural monitoring system further comprises a processor configured and operable to process the image data content for detecting leaf disease or indicating parasite effects on leaves of one or more plants in the agricultural field.
According to another aspect of the invention, the agricultural monitoring system further includes a processor configured and operable to process the image data content to identify selected agronomic significant data and generate agronomic image data for transmission to a remote system based on the selected agronomic significant data.
According to one aspect of the invention, there is disclosed a method of agricultural monitoring, the method comprising: (a) receiving image data content based on agricultural area image data, wherein the image data is sub-millimeter image resolution image data acquired by an onboard imaging sensor at a set of imaging locations along a flight path extending over an agricultural area; (b) processing the image data content to generate agronomic data comprising agronomic image data; and (c) sending the agronomic data to the end user remote system.
According to another aspect of the invention, the processing includes analyzing the image data content to identify selected agronomically significant data within the image data content, and processing the selected agronomically significant data to provide the agronomic data.
According to another aspect of the invention, the processing comprises applying computerized processing algorithms to the image data content for detecting leaf diseases or indicating parasite effects on leaves in one or more plants in the agricultural field.
According to another aspect of the invention, the receiving comprises receiving image data content of the agricultural area acquired on different days, wherein the processing comprises processing the image data content to determine growth parameters of plants in the agricultural area.
According to another aspect of the invention, the method further comprises applying computerized processing algorithms to the agronomic data to select a recipient of the agronomic image data from a plurality of possible recipients based on agronomic expertise of the possible recipients.
According to another aspect of the invention, the image data content includes first image data content of a first agricultural asset of a first owner, and second image data content of a second agricultural asset of a second owner other than the first owner; wherein the sending comprises sending the first image data content in a first message and sending the second image data content in a second message.
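A minimal sketch of this per-owner separation, so that each owner's message contains only imagery of that owner's assets (record field names are hypothetical):

```python
from collections import defaultdict

def split_by_owner(image_records):
    """Group image data content into one message per asset owner, so that
    each owner receives only imagery of their own agricultural assets."""
    messages = defaultdict(list)
    for record in image_records:
        messages[record["owner"]].append(record["image_id"])
    return dict(messages)

# Example: assets of two different owners imaged in a single flight.
records = [
    {"owner": "first_owner", "image_id": "img-001"},
    {"owner": "second_owner", "image_id": "img-002"},
    {"owner": "first_owner", "image_id": "img-003"},
]
```

Splitting by owner before transmission keeps each owner's agronomic data confidential with respect to the other.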
According to another aspect of the invention, the image data content is based on image data acquired at a set of imaging locations along the flight path, the imaging locations being located less than 20 meters above the tops of crops growing in the agricultural area.
Drawings
In order to understand the invention and to see how it may be carried out in practice, embodiments will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:
FIG. 1A is a functional block diagram illustrating an example of a system in an example environment, according to an example of the disclosed subject matter;
FIG. 1B is a functional block diagram illustrating an example of a system in an example environment, according to an example of the disclosed subject matter;
FIG. 1C is a functional block diagram illustrating an example of a system in an example environment, according to an example of the disclosed subject matter;
FIG. 2 is a flow chart illustrating an example of an agricultural monitoring method according to an example of the presently disclosed subject matter;
FIG. 3 is a flow chart illustrating an example of an agricultural monitoring method according to an example of the presently disclosed subject matter;
FIG. 4A illustrates a system, an agricultural area, and a flight path according to an example of the disclosed subject matter;
FIG. 4B illustrates a system, an agricultural area, a flight path, a server, and a plurality of exemplary entities that may receive agronomically significant data based on image data acquired by the system, in accordance with an example of the presently disclosed subject matter;
fig. 5A through 5E illustrate optional stages of an agricultural monitoring method according to examples of the presently disclosed subject matter;
FIG. 6 is a flow chart illustrating an example of an agricultural monitoring method according to an example of the presently disclosed subject matter;
FIG. 7 is a flow chart illustrating an example of an agricultural monitoring method according to an example of the presently disclosed subject matter;
FIG. 8 is a flow chart illustrating an example of an agricultural monitoring method according to an example of the presently disclosed subject matter;
FIG. 9 is a functional block diagram illustrating an example agricultural monitoring system according to an example of the presently disclosed subject matter;
FIG. 10 is a functional block diagram illustrating an example agricultural monitoring system according to an example of the presently disclosed subject matter;
FIGS. 11A, 11B, 11C, and 11D are functional block diagrams illustrating examples of an agricultural monitoring system with a motion compensation mechanism according to an example of the disclosed subject matter;
FIG. 12 is a functional block diagram illustrating an example agricultural monitoring system according to an example of the presently disclosed subject matter;
FIG. 13 illustrates a number of images acquired by an onboard imaging sensor according to one method of agricultural monitoring, in accordance with an example of the presently disclosed subject matter;
FIG. 14 illustrates cropping of a single leaf in image data, according to an example of the disclosed subject matter;
FIG. 15 is a flow chart illustrating an example of an agricultural monitoring method according to an example of the presently disclosed subject matter;
FIG. 16 is a functional block diagram illustrating an example of a server for agricultural monitoring, in accordance with an example of the presently disclosed subject matter;
FIG. 17 is a flow chart illustrating an example of a method of monitoring a surface area in accordance with an example of the disclosed subject matter;
FIG. 18 is a flow chart illustrating an example of a method of monitoring a surface area in accordance with an example of the presently disclosed subject matter;
FIG. 19 is a functional block diagram illustrating an example of a server for ground area monitoring in accordance with an example of the presently disclosed subject matter.
It will be appreciated that for simplicity and clarity of illustration, elements illustrated in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements.
Detailed Description
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be understood by those skilled in the art that the presently disclosed subject matter may be practiced without these specific details. In other instances, well-known methods, procedures, and components have not been described in detail so as not to obscure the presently disclosed subject matter.
In the drawings and description that are shown, like reference numerals designate like parts that are common to those of the different embodiments or configurations.
Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing," "computing," "calculating," "determining," "generating," "setting," "configuring," "selecting," "formulating," or the like, refer to the action and/or processes of a computer that manipulate data and/or transform data into other data, such data represented as physical quantities (e.g., electronic quantities) and/or such data representing physical objects. The terms "computer," "processor," and "controller" should be broadly construed to encompass any type of electronic device having data processing capabilities, including, as non-limiting examples, personal computers, servers, computing systems, communication devices, processors (e.g., Digital Signal Processors (DSPs), microcontrollers, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), and the like), any other electronic computing device, and/or any combination thereof.
Acts in accordance with the teachings herein may be performed by a computer program stored in a computer readable storage medium, by a computer specially constructed for the desired purposes, or by a general purpose computer specially configured for the desired purposes.
As used herein, the phrases "for example," "such as," and variations thereof describe non-limiting embodiments of the presently disclosed subject matter. Reference in the specification to "one instance," "some instances," "other instances," or variations thereof means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the presently disclosed subject matter. Thus, appearances of the phrases "one instance," "some instances," "other instances," or variations thereof may not necessarily all refer to the same embodiment.
It is appreciated that certain features of the disclosed subject matter, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the disclosed subject matter which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination.
In examples of the presently disclosed subject matter, one or more of the stages shown in the figures can be performed in a different order, and/or one or more of the stages can be performed concurrently, or vice versa. The figures show a general schematic of a system architecture according to one example of the presently disclosed subject matter. Each module in the figures may be comprised of any combination of software, hardware, and/or firmware that performs the functions defined and explained herein. The modules in the figures may be centralized in one location or distributed among multiple locations.
FIG. 1A is a functional block diagram illustrating an example of a system 10 in an example environment according to an example of the disclosed subject matter. System 10 is an onboard system that includes an onboard platform 100 carrying an imaging sensor 210. As discussed in more detail below, the imaging sensor 210 is flown over the agricultural area by the airborne platform 100 to enable the imaging sensor 210 to acquire image data of the agricultural area. Image data content based on the acquired image data is then transmitted from the system 10 to a remote location where it can be analyzed for agronomically important data.
Different types of airborne platforms may be used as airborne platform 100. For example, airborne platform 100 may be any of the following airborne platform types: airplanes, helicopters, multi-rotor helicopters (e.g., quadrotors), Unmanned Aerial Vehicles (UAVs), powered parachutes (also known as motorized parachutes, PPCs, and paraplanes), and the like. The type of airborne platform 100 may be determined based on, for example, aerodynamic parameters (e.g., speed, altitude, maneuverability, stability, load capacity, etc.), the degree of manual or automated control, and other usage requirements of the airborne platform.
In addition to imaging sensor 210, system 10 includes a processor 220 and a communication module 230, all of which are coupled to airborne platform 100. The connection of imaging sensor 210, processor 220, and communication module 230 (or any other component of system 10 carried by airborne platform 100) to airborne platform 100 may be, but need not be, a detachable connection. For example, any of the components 210, 220, and/or 230 described above may be designed to be easily installed on and removed from airborne platform 100, which may be used for other purposes when the components of system 10 are not installed thereon.
FIG. 1B is a functional block diagram illustrating an example of a system 10 in an example environment according to an example of the disclosed subject matter. As can be seen from the example of fig. 1B, some of the components of the system 10 (particularly the imaging sensor 210) may be included in a separate detachable pod 280 that is attached and detached from one or more aircraft as needed. Such a self-contained pod 280 may include the agricultural monitoring system 200 as discussed in fig. 9-11C.
FIG. 1C is a functional block diagram illustrating an example of a system 10 in an example environment, according to an example of the disclosed subject matter. In the example of fig. 1C, some of the components of system 10 that enable its agricultural use are located in the external pod 280, while other functions are implemented by components of the airborne platform 100 (in the illustrated example, the communication module 230).
As shown in fig. 1B and 1C, the detachable pod 280 is a pod that is detachable with respect to the airborne platform 100. For example, the detachable pod 280 may be detachably connected to the fuselage of the airborne platform 100 (e.g., to the belly as shown in fig. 1B) or to a wing of the airborne platform 100 (as shown in fig. 1C).
It should be noted that system 10 may include additional components such as altimeters, airspeed indicators, pitch, roll, and/or yaw sensors, interfaces for connecting to avionics and other systems of airborne platform 100, and the like.
Fig. 2 is a flow chart illustrating an example of an agricultural monitoring method 500 according to an example of the presently disclosed subject matter. Referring to the examples of the previous figures, the method 500 may be performed by the system 10. Additional discussion and details related to system 10 are provided following the discussion below regarding method 500.
Stage 510 of method 500 includes flying an onboard imaging sensor along a flight path over an agricultural area where crops are growing. Referring to the example described in the previous figures, the onboard imaging sensor may be imaging sensor 210 and the flight of stage 510 may be performed by onboard platform 100.
It is noted that different types of crops may be grown in the agricultural area, and that a crop may include one or more plant types. For example, the agricultural area may be a field of annual crops (such as grains, cotton, potatoes, vegetables, etc.) or land for growing long-term crops (such as orchards, vineyards, fruit plantations, etc.). It should be noted that the agricultural area may also be a marine (or other water-based) agricultural area, for example, a water surface used for growing algae (algae cultivation). Furthermore, the method 500 may be used for agricultural monitoring of agricultural fields, but may also be used for monitoring of non-agricultural areas (e.g., natural forests, rangelands, grasslands, etc.). In such cases, the plants growing in these areas can be monitored as the crops of these areas. The agricultural area monitored using method 500 may include one or more types of agricultural areas (e.g., any one or more of the examples described above, such as both an orchard and a potato field).
A stage 520 of method 500 includes acquiring image data of a portion of the agricultural area with the on-board imaging sensor, wherein the acquiring of the image data includes the on-board imaging sensor acquiring at least a portion of the image data at a set of imaging locations along the flight path, the imaging locations capable of acquiring the image data at a sub-millimeter image resolution. The acquisition of stage 520 may be performed by imaging sensor 210, with reference to the examples described in the previous figures.
The image data acquired in stage 520 may include one or more individual images, one or more video sequences, and combinations thereof, and may also include any other type of image data known in the art. The acquisition of image data in stage 520 may include acquiring visible light or other electromagnetic radiation (e.g., Ultraviolet (UV), Infrared (IR), or other portions of the electromagnetic spectrum). Other image acquisition techniques may also be used in addition to or instead of light acquisition. For example, stage 520 may include acquiring image data by a Synthetic Aperture Radar (SAR) sensor.
Acquiring image data in stage 520 includes acquiring at least a portion of the image data at a sub-millimeter resolution. That is, in at least a portion of the image data acquired by the on-board imaging sensor, the imaged portion of the agricultural area is resolved to details finer (i.e., smaller) than one square millimeter (mm²). It should be noted that the resolvable detail of the image data may be significantly smaller than one square millimeter, for example, smaller than 0.01 square millimeter.
It should be noted that stage 520 may include acquiring, by an on-board imaging sensor, image data of a portion of the agricultural area at an image resolution at least one order of magnitude finer than the average leaf size of the imaged crop. That is, in at least a portion of the image data, a plurality of leaves of the crop are imaged at a resolution that is capable of resolving at least ten independently resolvable portions of the leaves. Optionally, stage 520 may include acquiring, by an on-board imaging sensor, image data of the portion of the agricultural area at an image resolution at least two orders of magnitude finer than the average leaf size of the imaged crop. Optionally, stage 520 may include acquiring, by an on-board imaging sensor, image data of the partial agricultural area at an image resolution that is at least three or more orders of magnitude finer than the average leaf size of the imaged crop.
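The "orders of magnitude finer than the average leaf size" criterion above reduces to simple arithmetic. A sketch under the assumption that resolution is characterized by a single ground sample distance (GSD) value (function names are illustrative):

```python
def resolvable_portions_per_leaf(leaf_size_mm: float, gsd_mm: float) -> float:
    """Approximate number of independently resolvable portions across a
    leaf of the given average size, for imagery with the given GSD."""
    return leaf_size_mm / gsd_mm

def resolution_orders_finer(leaf_size_mm: float, gsd_mm: float,
                            orders: int) -> bool:
    """True if the image resolution is at least `orders` orders of magnitude
    finer than the leaf size, i.e. >= 10**orders resolvable portions."""
    return resolvable_portions_per_leaf(leaf_size_mm, gsd_mm) >= 10 ** orders

# Example: a 50 mm leaf imaged at 0.5 mm GSD is resolved into ~100
# portions, i.e. two orders of magnitude finer than the leaf size.
```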
The imaging of individual leaves of a crop in the image data with a plurality of individually resolvable areas (e.g., more than 100 individually resolvable areas) enables the use of these image data to detect leaf conditions of the crop, e.g., to identify different leaf diseases, to identify insects and parasites on the leaves, to identify the effects of parasites on the leaves (e.g., eaten portions of the leaves), etc.
It should be noted that stage 520 may include acquiring image data of a portion of the agricultural area at more than one resolution and/or more than one image acquisition technique. In this case, different images (or videos) of the same portion of the agricultural area taken at different resolutions and/or techniques may be taken at the same time or at different times (e.g., in different portions of the flight path, possibly flying in another direction, at another altitude, etc.). Images of different portions of the electromagnetic spectrum and/or different resolutions may be taken by a single sensor (e.g., at different times, using different lenses, using different optical filters, using different electronic filters, etc.).
Stage 540 of method 500 includes sending image data content based on image data acquired by the onboard imaging sensor to an external system. The transmission of stage 540 may be performed by communications module 230, with reference to the examples described in the previous figures. The image data content transmitted in stage 540 may include some or all of the image data acquired in stage 520. Alternatively (or additionally), the image data sent in stage 540 may include image data content created by processing the image data acquired in stage 520.
The transmitting of stage 540 may include wirelessly transmitting the image data content while the airborne platform carrying the imaging sensor is still airborne. However, this is not necessarily so, and some (or all) of the image data content transmitted in stage 540 may be transmitted after the aircraft lands. The sending of the image data content may include sending the image data content wirelessly (e.g., using radio communication, satellite-based communication, cellular networks, etc.), by wire (particularly if the data is sent after the aircraft lands, such as using Universal Serial Bus (USB) communication), or any combination thereof. The sending of image data content in stage 540 may be performed in real time or near real time (transmitting image data corresponding to one portion of the imaged agricultural area before acquiring image data corresponding to another portion), but this is not necessarily so.
As will be discussed in more detail below, image data content may be sent to different types of entities and to different applications implemented by these entities. For example, the image data content may be sent to an off-site system, reviewed by an expert and/or processed by a computerized system in order to determine agronomically important data for the agricultural area and/or the crops within it. In another example, the image data content may be transmitted to an aerial application system (e.g., an agricultural airplane or a ground control system) for determining aerial application parameters of an aerial application of pesticide (crop dusting) and/or fertilizer (aerial topdressing). It should be noted that aerial application may refer to the application of various materials (fertilizer, pesticide, seeds, etc.) from an aircraft; such an aircraft may be an airplane or a helicopter, but other types of aircraft (e.g., hot air balloons) may also be used. It should be noted that in the context of the present disclosure, agricultural aircraft (particularly aerial application aircraft) may be manned or unmanned.
Fig. 3 is a flow chart illustrating an example of an agricultural monitoring method 600 according to an example of the presently disclosed subject matter. The method 600 may be performed by the system 10 with reference to the examples described in the previous figures. Method 600 is an example of method 500 and the stages of method 600 are numbered with corresponding reference numbers of method 500 (i.e., stage 610 is an example of stage 510, stage 620 is an example of stage 520, etc.). It should be noted that where applicable, variations and examples discussed with reference to method 500 (either above or below referenced in this disclosure) are also relevant to method 600.
The method 500 as implemented in the example of method 600 includes flying an onboard imaging sensor carried by an aircraft at low altitudes, enabling the acquisition of very high resolution images of crops at a high rate (sampling a large part of an agricultural area in relatively little time). Image data content generated on the onboard system is transmitted for processing at a remote off-site analysis server. The image data content is processed by the analysis server and then distributed to a management interface (e.g., personal computer, handheld computer, etc.), which is provided to an agronomist, to a farm manager or other professionals, or to a specialized system for further analysis. The high resolution images obtained in stage 620 enable analysis of individual leaves, which may be used, for example, to detect leaf disease and/or to indicate parasitic effects on the leaves, etc.
As set forth in more detail below, not all of the agricultural area need be imaged, and a representative sample may be selected. It should be noted that an agriculturist who inspects agricultural areas (e.g., fields, orchards) for leaf disease typically samples the agricultural area by walking along a sampling path designed to represent a portion of the agricultural area. Using an onboard imaging sensor, sub-millimeter resolution images of the leaves of an agricultural area are provided at a high rate, which is not only faster than walk-sampling of the agricultural area, but can also image portions of the agricultural area that are inaccessible to pedestrians. For example, foliage on treetops and plants located under dense vegetation or on rough terrain may be imaged.
A stage 610 of method 600 includes flying an onboard imaging sensor over an agricultural area where crops are growing along a flight path that includes a plurality of low-altitude imaging locations capable of acquiring image data at sub-millimeter image resolution. The flight path may include successive low-altitude flight legs (a flight leg is part of the flight plan between two waypoints). Referring to the example described in the previous figures, the onboard imaging sensor may be imaging sensor 210 and the flight of stage 610 may be performed by onboard platform 100.
Optionally, stage 610 may include flying the onboard imaging sensor along a terrain-following flight path (also referred to as "nap-of-the-earth" flying). A terrain-following flight path varies in elevation above the ground (as measured from the earth's surface or the vegetation, as the case may be) according to various considerations (e.g., aerodynamic issues, optical requirements of the imaging sensor, size of the crop, etc.). For example, stage 610 may include flying the imaging sensor over the agricultural area at an elevation below 30 meters (30 m) above the ground. For example, stage 610 may include flying the imaging sensor over the agricultural area at an elevation below 20 m above the ground. For example, stage 610 may include flying the imaging sensor over the agricultural area at an elevation below 10 m above the ground. It should be noted that the elevation along the flight path may be measured with reference to the tops of the plants growing in the agricultural area (e.g., below 10 m, 20 m, or 30 m above the tops of these crops).
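A terrain-following altitude profile of the kind described can be sketched as follows; the function names and the 10 m default clearance are illustrative assumptions, not values from the patent:

```python
def terrain_following_profile(terrain_elev_m, crop_height_m, clearance_m=10.0):
    """Commanded platform altitude (above a common reference) at each sample
    point along the flight path, keeping a fixed clearance above crop tops."""
    return [elev + crop + clearance_m
            for elev, crop in zip(terrain_elev_m, crop_height_m)]

# Example: ground rising from 100 m to 105 m, with wheat (~1 m) followed by
# orchard trees (~4 m), keeping a 10 m clearance above the crop tops.
profile = terrain_following_profile([100.0, 105.0], [1.0, 4.0])
```

Measuring clearance from the crop tops rather than the bare ground keeps the sensor-to-subject distance, and hence the image resolution, roughly constant as crop height changes along the path.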
FIG. 4A illustrates the system 10, an agricultural area 900, and a flight path 910 according to an example of the disclosed subject matter. In the example shown, agricultural area 900 includes two separate areas: a wheat field 901 and an orchard 902.
It is noted that the two portions of agricultural area 900 (i.e., areas 901 and 902) may belong to different entities. For example, the wheat field 901 may belong to an individual farmer, while the orchard 902 may be a research park of an agricultural company. Thus, in a single flight, method 500 (and thus method 600) may include collecting image data of agricultural assets of independent entities.
It is clear that the wheat field 901 and the orchard 902 differ from each other in both agricultural and agronomic respects. Thus, imaging of these two different regions may require different operational parameters of the airborne platform (e.g., speed, height above ground, stability, etc.) and/or of the onboard imaging sensor (e.g., exposure time, f-number, lens focal length, resolution, detector sensitivity, speed compensation, etc.). It should be noted that the image data acquired in stage 520 (and thus also in stage 620) may include image data of different parts of the agricultural area acquired using different acquisition modes (differing from each other in aerodynamic and/or sensor parameters, e.g., as described above).
Returning to fig. 3, stage 620 of method 600 includes acquiring, by the on-board imaging sensor, image data of a portion of the agricultural area at a sub-millimeter resolution. It is noted that portions of the agricultural area may also be imaged at a lower resolution (e.g., for generating an orientation image with which the sub-millimeter image data may be associated). However, most of the agricultural area imaged in stage 620 is preferably imaged at sub-millimeter resolution. As mentioned above with respect to method 500, the imaged portion may optionally be a representative sample of the agricultural area for which an agronomic analysis is obtained in method 600. As described above, the same portion imaged at sub-millimeter resolution may also be imaged at a lower resolution. The acquisition of stage 620 may be performed by imaging sensor 210, with reference to the examples described in the previous figures.
Imaging of the agricultural area at stage 620 includes acquiring image data of representative portions of the agricultural area (e.g., sampled at different sampling locations across the agricultural area) at an image resolution sufficient to analyze individual leaves of the imaged crop (e.g., at least one or two orders of magnitude finer than the average leaf size of the imaged crop). Fig. 13 illustrates a number of images 1000 acquired by an onboard imaging sensor according to method 600, in accordance with an example of the disclosed subject matter. As can be seen from the different illustrations, different types of leaf conditions (e.g., dryness, pests, diseases, etc.) can be analyzed.
Returning to fig. 3, it is noted that the image resolution of the image data acquired by the onboard imaging sensor depends on several factors — some of which depend on the imaging sensor itself (e.g., lenses, pixel density of detectors, etc.), and some depend on the onboard platform (e.g., ground height, speed, stability, etc.).
A ground sample distance (GSD) may be defined for the acquired image data as the distance between pixel centers measured on the ground. For example, in image data (corresponding to a single image or to video data) with a GSD of 0.5 mm, adjacent pixels image locations that are 0.5 mm apart on the ground. It should be noted that the GSD of an image is not equal to its resolution, since resolving the data of neighboring pixels imposes additional requirements (e.g., on the optical resolution quality of the lens used for imaging). GSD is also known as the ground-projected sample interval (GSI) or the ground-projected instantaneous field of view (GIFOV).
As a general consideration, for a given imaging sensor, the GSD grows in proportion to the distance between the imaging sensor and the imaged subject. The low-altitude flight of stage 610 may therefore facilitate acquisition of image data at sub-millimeter resolution. Optionally, the GSD of the image data acquired in stage 620 is below 0.75 mm (i.e., each pixel covers a ground area of less than 0.75 × 0.75 mm²). Optionally, the GSD of the image data acquired in stage 620 is below 0.5 mm (i.e., each pixel covers a ground area of less than 0.5 × 0.5 mm²).
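For a nadir-pointing camera, the relation between flight height, sensor optics, and GSD can be sketched with the usual pinhole-camera approximation GSD ≈ H·p/f (height H, pixel pitch p, focal length f). The numbers below are illustrative assumptions, not parameters from the patent:

```python
def ground_sample_distance_mm(height_m: float, focal_length_mm: float,
                              pixel_pitch_um: float) -> float:
    """GSD in millimeters per pixel on the ground, using the pinhole-camera
    approximation GSD = H * p / f for a nadir-pointing sensor."""
    pixel_pitch_mm = pixel_pitch_um / 1000.0
    height_mm = height_m * 1000.0
    return height_mm * pixel_pitch_mm / focal_length_mm

# Example: flying 20 m above the crop tops with a 100 mm lens and a detector
# with 2.5 micrometer pixels gives a GSD of 0.5 mm, i.e. sub-millimeter.
gsd = ground_sample_distance_mm(20.0, 100.0, 2.5)
```

The same formula shows why low-altitude flight matters: doubling the height to 40 m doubles the GSD to 1 mm, losing the sub-millimeter detail needed for leaf-level analysis.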
Stage 630 of method 600 includes processing, by the on-board processing unit, the image data to provide image data content including high quality images of the crop leaves. The airborne processing unit is carried by the same airborne platform that flies the airborne imaging sensor in the agricultural area. Stage 630 may be performed by processor 220, with reference to the examples described in the previous figures.
The processing of stage 630 may include filtering the image data (e.g., discarding image data of insufficient quality or selecting a representative image for each region), compressing the image data, improving the image data (e.g., applying image enhancement processing algorithms), selecting agronomically important data, or any combination of the above, as well as other possible processing techniques known in the art.
For example, the processing of stage 630 may include: processing the acquired image data to filter out acquired images of insufficient quality; analyzing the remaining images to identify, in some of the acquired images, leaves of the agricultural area's crops (e.g., based on leaf identification parameters preloaded into the processing module); selecting images that include high-quality identifiable leaves representative of the sample; and compressing the selected images to provide the image data content to be sent to an external system.
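The filter-then-select flow just described can be sketched as a minimal onboard selection step. The frame dictionaries, quality scores, and region keys below are illustrative assumptions, not the patent's actual data model.

```python
def select_for_transmission(frames, min_quality, max_frames):
    """Hypothetical onboard filtering: drop frames below a quality
    threshold, keep the best frame per imaged region, and cap the
    number of frames sent over the bandwidth-limited radio link."""
    good = [f for f in frames if f["quality"] >= min_quality]
    best = {}
    for f in good:
        region = f["region"]
        if region not in best or f["quality"] > best[region]["quality"]:
            best[region] = f
    # Highest-quality frames first, truncated to the transmission budget
    return sorted(best.values(), key=lambda f: -f["quality"])[:max_frames]

frames = [
    {"region": "A", "quality": 0.9},
    {"region": "A", "quality": 0.4},   # superseded by the 0.9 frame
    {"region": "B", "quality": 0.7},
    {"region": "C", "quality": 0.2},   # below threshold, discarded
]
picked = select_for_transmission(frames, min_quality=0.5, max_frames=10)
# picked keeps one frame for region A (0.9) and one for region B (0.7)
```

In practice the quality score might combine blur level, exposure, and leaf visibility; compression of the selected frames would follow as a separate step.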
Stage 640 of method 600 comprises wirelessly transmitting image data content to a displaced remote server for distribution to an end user. The transmission of stage 640 may be performed by communications module 230, with reference to the examples described in the previous figures. The wireless transmission of image data content in stage 640 may be performed in different manners (e.g., using radio communications, satellite-based communications, cellular networks, etc.).
Image data content or agronomically important data based on the image data content may be distributed from the server to various entities such as farmers, agriculturists, pilots, on-board systems, etc.
Fig. 4B illustrates, in accordance with an example of the presently disclosed subject matter, the system 10, an agricultural area 900, a flight path 910, the server 300, and a plurality of exemplary entities that may receive agronomically significant data based on image data acquired by the system 10.
Alternatively, the server may apply various computerized processing algorithms to the image data for identifying selected agronomic significant data and generating agronomic image data for transmission to the remote system based on the selected agronomic significant data.
For example, the image data content (whether processed or not) may be provided to the agronomist 992 (in the illustrated example, via the satellite connection 994). The agronomist 992 (e.g., an agronomist in another region who specializes in quinoa) can analyze the provided data and give feedback indicating the next steps that should be taken. Such information may be provided to the farmer 993 or owner of the agricultural area, or directly to another entity (e.g., an aerial-application instruction to spray crop protection products onto the crop may be sent directly to an agricultural aircraft 991 that can apply such products to the agricultural area).
It should be noted that the airborne platform 100 of the system 10 may itself serve as an agricultural aircraft used for aerial application. In this case, the acquisition of image data by the onboard imaging sensor may be performed concurrently with the aerial application, or at other times during the flight. In this way, dedicated onboard imaging sensors may be installed on agricultural aircraft that are in any case intended to fly over agricultural areas, so that such flights provide the added benefit of collecting agriculturally significant image data.
Such directions or recommendations do not necessarily require the participation of an agronomist; optionally, other entities (e.g., the farmer 993 or the server 300 itself) may analyze information based on image data acquired by the system 10 to provide recommendations, instructions, analysis, or information that may be used to improve the condition of the agricultural area and/or the crops planted therein.
Further, information collected about the agricultural area imaged by system 10 may be used to determine how to improve the condition of areas other than the imaged agricultural area. For example, if aphids are identified in the imaging data of the agricultural area, nearby fields may also be sprayed based on this information.
Fig. 5A through 5E illustrate optional stages of an agricultural monitoring method 500 according to an example of the presently disclosed subject matter. Fig. 5A through 5E illustrate additional stages, and variations of previously presented stages, that may be implemented as part of method 500. It should be noted that not all of these stages and variations need to be implemented together in a single instance of the invention. Any combination of the stages and variations discussed with respect to method 500 may be implemented, and such combinations form part of this disclosure.
Referring to stage 510, the flight path is optionally a terrain-following flight path. In other words, stage 510 may include an optional stage 511 of flying the onboard imaging sensor along a terrain-following flight path. During the imaged flight segments, the terrain-following flight path may remain below a predetermined height above the ground, for example 20 m above the ground (or above the height of the crops, as applicable, for example above a dense forest).
It should be noted that stage 510 may include flying the airborne platform at an altitude that reduces the effects of optical aberrations of the imaging sensor and the effects of vibration of the imaging sensor and/or the airborne platform on the imaging data so as to enable the acquisition of imaging data at sub-millimeter resolution.
As discussed below with respect to stage 520, image data is optionally acquired by the onboard imaging sensor while the onboard platform is moving, and deceleration of the onboard platform may not be required. In this way, the system 10 as a whole can image a large portion of an agricultural area in a given time. Stage 510 may include a stage 512 of flying across each imaging location of the aforementioned set of imaging locations at a speed exceeding 10 m/s (while acquiring image data at sub-millimeter image resolution).
Given an average speed of the onboard platform along an imaging leg comprising a plurality of imaging locations as described above, the flying of stage 510 may comprise a stage 513 of flying the onboard imaging sensor through the imaging locations of the imaging leg at a speed no less than 50% of the average speed along that leg.
Stage 510 may include a stage 514 of flying an onboard imaging sensor by an agricultural aircraft configured for aerial application of crop protection products. It is to be noted that the acquisition of stage 520 may in this case be performed in parallel with the aerial application (typically at a very low altitude above the crop, for example at an altitude of 3-5 metres above the crop, or possibly even lower), or at other stages of flight (for example when the agricultural aircraft is switched between two areas). As discussed in more detail below, the application itself may be based on the processing of image data acquired in method 500, or based on the real-time processing of image data acquired by the same onboard system or by the processing of image data acquired in a previous flight.
Referring to the examples described with respect to the preceding figures, each of stages 511, 512, 513, and 514 may be performed by airborne platform 100.
As noted above, agricultural areas may include different areas associated with different entities. Thus, it should be noted that stage 510 may include flying the onboard imaging sensor along a flight path that extends to at least a first agricultural asset of a first owner and a second agricultural asset of a second owner other than the first owner. In this case, the acquiring in stage 520 may include acquiring first image data of a portion of the first agricultural asset and acquiring second image data of a portion of the second agricultural asset, and the method may further include generating first image data content based on the first image data and generating second image data content based on the second image data. This enables the first image data content to be provided to a first entity in a first message and the second data content to be provided to a second entity in a second message. Each of the first message and the second message may include information identifying the respective agricultural asset owner, and/or may be addressed to a system and/or another entity associated with the respective owner. Note that the distinction between the first image data content and the second image data content is not necessarily performed on the onboard system 200, but may also be performed by the server 300.
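The per-owner separation described above can be sketched as a simple routing step: frames are grouped into messages according to which agricultural asset their imaging location belongs to. The location keys, owner names, and dictionary layout are hypothetical, and in practice the assignment would use GIS boundaries rather than a lookup table.

```python
def split_by_asset(images, asset_of_location):
    """Hypothetical split of acquired frames into per-owner message
    payloads, given a mapping from imaging location to asset owner.
    Could run on the onboard system 200 or on the server 300."""
    messages = {}
    for img in images:
        asset = asset_of_location[img["location"]]
        messages.setdefault(asset, []).append(img)
    return messages

images = [{"location": "p1"}, {"location": "p2"}, {"location": "p3"}]
asset_of_location = {"p1": "owner_A", "p2": "owner_B", "p3": "owner_A"}
msgs = split_by_asset(images, asset_of_location)
# msgs["owner_A"] holds two frames, msgs["owner_B"] holds one
```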
Referring now to stage 520, it includes acquiring image data of a portion of the agricultural area by the onboard imaging sensor, wherein the acquiring of the image data includes acquiring at least a portion of the image data by the onboard imaging sensor at a set of imaging locations along the flight path, the imaging locations enabling acquisition of the image data at a sub-millimeter image resolution.
As described above, image data can be acquired without slowing down the flight speed of the onboard platform as it progresses along the flight path at a prescribed speed. Optionally, stage 520 may include a stage 521 of acquiring (a portion or all of) image data at the imaging location while flying the onboard imaging sensor along the imaging location at a speed no less than 50% of an average speed of the onboard platform along the flight path.
It should be noted that the speed of flight need not be slowed at all, and the acquisition of stage 520 may be performed without slowing the speed at which the onboard imaging sensor is flown along the flight path. Optionally, the acquiring of stage 520 may include compensating for movement of the imaging sensor during the image data acquisition. This may be achieved, for example, by using one or more motion compensation techniques.
These motion compensation techniques may be used, for example, to avoid image blurring caused by capturing images while an airborne platform carrying an airborne imaging sensor is flying forward.
One such motion compensation technique that may be used as part of method 500 is to move the onboard imaging sensor (or a portion thereof) during the acquisition of image data. The movement of the on-board imaging sensor (or one or more relevant parts thereof) may be performed when the image data is actually collected (e.g. when the detector of the on-board imaging sensor, e.g. a charge-coupled device, CCD, is collecting light arriving from the agricultural area), but may also be performed during other parts of the image data acquisition process (e.g. during focusing before collecting light).
Such motion compensation may be achieved by moving one or more components of the onboard imaging sensor without rotating the optical axis of the light collection component of the sensor (e.g., moving the sensor in a direction opposite the direction of flight), and/or by moving or rotating components of the onboard imaging sensor to rotate its light collection optical axis (e.g., by rotating a mirror or prism that directs light arriving from an imaging location of the agricultural area onto a light recording component of the sensor, such as a CCD).
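The rotating-optics variant can be quantified with a back-of-the-envelope rate calculation. For near-nadir imaging, holding the imaged spot still requires the line of sight to sweep backward at roughly v / H radians per second; a flat steering mirror deflects the reflected beam by twice its own rotation, so the mirror turns at half that rate. The speeds and heights below are illustrative, not values from the patent.

```python
def los_rate_to_freeze_ground(speed_mps, height_m):
    # Near-nadir approximation: the line of sight must sweep backward
    # at about v / H rad/s so the imaged ground spot stays fixed.
    return speed_mps / height_m

def mirror_rate(speed_mps, height_m):
    # A flat mirror deflects the beam by twice its rotation angle,
    # so the mirror itself rotates at half the line-of-sight rate.
    return los_rate_to_freeze_ground(speed_mps, height_m) / 2.0

# Example: 12 m/s ground speed at 12 m above the crop
omega = los_rate_to_freeze_ground(12.0, 12.0)  # 1.0 rad/s sweep
```

Moving the detector linearly instead of rotating a mirror avoids the factor of two but requires a travel range proportional to the exposure time.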
Accordingly, stage 520 may include a stage 522 of mechanically moving at least one component of the onboard imaging sensor relative to the carrying onboard platform, for compensating for motion of the onboard imaging sensor relative to the crops during acquisition.
The motion compensation in stage 520 may reduce the relative velocity between the imaged location and the optical recording component to zero, or may reduce the effect of the relative motion between the two on image quality to below a predefined threshold.
As described above, motion compensation by rotating a component of the onboard imaging sensor may begin in the focusing phase. It should be noted that focusing may begin while the light-acquisition optical axis is diagonal to the ground, and the actual acquisition of image data may occur when the rotational motion brings the optical axis toward the imaged crop (e.g., the imaged leaf) perpendicular to the ground.
Optionally, the acquiring of stage 520 may include mechanically rotating at least one optical component of the onboard imaging sensor (e.g., rotating mirror 213, prism 212, etc.) relative to the carrying onboard platform, to compensate for motion of the onboard imaging sensor relative to the crops during acquisition, wherein the at least one optical component rotates such that, for each of a plurality of frames of image data: the focusing process starts when the angle between the acquisition optical axis and the vertical axis is greater than 20 degrees, and the image data is acquired using vertical imaging when that angle is smaller than 20 degrees. The acquisition optical axis is the line connecting the center of the imaging location of the agricultural area in a given frame (the area covered by a particular image frame) and the center of the opening (e.g., transparent window 219) through which light enters the imaging system toward the rotating optical component.
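The per-frame focus/acquire cycle described above reduces to a simple threshold on the optical-axis angle. The function and the sweep values below are an illustrative sketch of that scheduling, not the patent's control logic.

```python
def acquisition_phase(axis_angle_deg, threshold_deg=20.0):
    # Focusing runs while the acquisition optical axis is tilted more
    # than the threshold from vertical; exposure happens once the
    # rotating optic swings the axis within the threshold.
    return "focus" if axis_angle_deg > threshold_deg else "acquire"

# As the rotating optic sweeps the axis from 35 degrees toward nadir:
phases = [acquisition_phase(a) for a in (35.0, 25.0, 15.0, 5.0)]
# -> ['focus', 'focus', 'acquire', 'acquire']
```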
In general, whether or not motion compensation is used, the acquisition of image data in stage 520 may include acquiring some or all of the image data using vertical imaging (strictly vertical imaging or near-vertical imaging, e.g., deviating less than 20 degrees from the vertical axis).
Additionally or alternatively, other motion compensation techniques may be applied as part of method 500. For example, stage 520 may include a stage 523 of illuminating the crops during acquisition, for compensating for motion of the onboard imaging sensor relative to the crops during acquisition. The illumination of stage 523 may include flash illumination, steady illumination (which may last at least for the duration of the acquisition, but possibly longer), or other types of illumination. Optionally, the illumination may start when the focusing process preceding the image acquisition starts.
As described above, acquiring image data (particularly sub-millimeter resolution image data) from an onboard platform, as disclosed in method 500, enables the collection of agriculturally and agronomically significant image data in places where access is impossible, slow, dangerous, or expensive, and/or where access would cause injury to the crops. For example, foliage on treetops, and plants located under dense vegetation or on rough terrain, may be imaged. Optionally, stage 520 may include a stage 524 of acquiring image data of portions of the agricultural area that are not accessible to land vehicles. While it may be possible to design and manufacture a land vehicle that can reach the treetops of a tropical rainforest, doing so is complex and expensive and can harm the natural environment. The inaccessibility referred to in stage 524 relates in particular to land vehicles commonly used in agriculture, such as tractors, pickup trucks, center-pivot irrigation equipment, combine harvesters, cotton pickers, and the like. It is noted that stage 520 may include collecting image data of portions of the agricultural area that are inaccessible on foot (e.g., by people walking, hiking, etc.).
As described above, the image data acquired by the imaging sensor in stage 520 need not show the entire agricultural area; it may image a representative sample thereof.
The relative portion of the agricultural area imaged by the imaging sensor may vary between different crop species, and a different minimum coverage area may be defined for each type of crop. A benchmark that may be used to define the coverage area is the coverage achievable by a human inspector walking the field, or a higher percentage. For example, if a walking inspector is expected to inspect a non-random 2-3% of a field, focusing on the areas of the field that the inspector can walk or drive to, the flight path may be planned so that it yields random coverage of at least 3-5% of the field, including interior portions of the field (rather than just exterior coverage).
Alternatively, stage 520 may comprise stage 525 of acquiring image data of the agricultural area at a coverage rate of less than 500 square meters per hectare (i.e. less than 5% of the agricultural area is covered by image data).
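The coverage-rate arithmetic is straightforward and can be made concrete. The frame footprint and field size below are hypothetical values chosen only to illustrate the 500 m² per hectare (5%) figure from stage 525.

```python
import math

def coverage_fraction(covered_m2_per_hectare):
    # 1 hectare = 10,000 m^2, so 500 m^2/ha corresponds to 5% coverage.
    return covered_m2_per_hectare / 10_000.0

def frames_needed(field_ha, coverage_m2_per_ha, frame_footprint_m2):
    # Number of frames required to reach the target sampling coverage.
    return math.ceil(field_ha * coverage_m2_per_ha / frame_footprint_m2)

# Sampling 5% of a 20 ha field with a hypothetical ~6 m^2 frame footprint:
n = frames_needed(20, 500, 6.0)  # 1667 frames
```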
Note that stage 520 may include stage 526 of focusing the imaging sensor before the imaging sensor collects light. It should be noted that the focusing of the imaging sensors may be difficult, especially if performed when the airborne platform is flying at a significant speed (e.g. a speed above 10m/s, a speed not lower than 50% of the average speed of the airborne platform along the flight path, etc., as discussed in stage 521). Focusing may not only be affected by motion from the onboard platform relative to the imaging position, but may also be affected by movement within the imaging system (e.g., as discussed at stage 522). It should be noted that the operating parameters of the imaging system (e.g., system 200) and/or the onboard platform of the load may be selected to enable focusing. For example, a maximum height above the top of the crop may be selected to enable efficient focusing of the imaging sensor during flight.
Stage 530 of method 500 includes processing, by the on-board processing unit, the image data to provide image data content including a high quality image of the crop leaf. The onboard processing unit is carried by the same onboard platform that flies the onboard imaging sensor above the agricultural area. Stage 530 may be performed by processor 220 with reference to the examples described in the previous figures.
The processing of stage 530 may include filtering the image data (e.g., discarding poor quality image data or selecting a representative image for each region), compressing the image data, improving the image data (e.g., applying image enhancement processing algorithms), selecting agronomically important data, or any combination thereof, as well as other possible processing techniques known in the art.
For example, the processing of stage 530 can include: processing the acquired image data to filter out acquired images of poor quality; analyzing the remaining images to identify, in some of the acquired images, leaves of the agricultural area's crops (e.g., based on leaf identification parameters preloaded into the processing module); selecting representative high-quality images of identifiable leaves; and compressing the selected images to provide the image data content to be sent to an external system.
During flight along the flight path and image collection, the onboard system optionally performs an initial image analysis, e.g., assessing photo quality, blur level, and image resolution, in order to exclude images that do not meet the minimum requirements of the remote image analysis server, thereby saving analysis time and the transmission of data to the remote location, whether that is the server or an end-user interface.
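One common way to score blur onboard is the variance of a Laplacian response: sharp frames have strong local contrast and score high, while blurred or featureless frames score near zero. This is a generic sharpness heuristic offered as an illustration, not the patent's specific quality metric; the toy images are tiny grids standing in for real frames.

```python
def laplacian_variance(img):
    """Sharpness score for a grayscale image (list of lists): variance
    of a 4-neighbour Laplacian over interior pixels. Frames scoring
    below a threshold can be dropped before transmission."""
    h, w = len(img), len(img[0])
    vals = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (img[y - 1][x] + img[y + 1][x]
                   + img[y][x - 1] + img[y][x + 1]
                   - 4 * img[y][x])
            vals.append(lap)
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

sharp = [[0, 0, 255, 255],
         [0, 0, 255, 255],
         [0, 0, 255, 255]]          # a crisp vertical edge
flat = [[10, 10, 10, 10],
        [10, 10, 10, 10],
        [10, 10, 10, 10]]           # featureless (or heavily blurred)
# laplacian_variance(sharp) is large; laplacian_variance(flat) is 0.0
```

Production systems typically compute the same measure with a convolution library over full-resolution frames, but the ranking behaviour is identical.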
As described above, stage 540 includes sending image data content based on image data acquired by the onboard imaging sensor to an external system.
Stage 540 may include stage 541, stage 541 transmitting the image data content to an external system for displaying the agronomic image data based on the image data content to an agriculturist at a remote location, thereby enabling the agriculturist to remotely analyze the agricultural area. Note that the image data content can be sent to an external system, either directly or via an intermediate system (e.g., a server), and the external system can display the agronomic image data directly to an agriculturist, or provide information to another system capable of presenting the agronomic image data to an agriculturist (e.g., an agriculturist's handheld computer, such as a smartphone). It should be noted that such agronomic image data (e.g., images of selected infected leaves) may be communicated to one or more agronomists and/or other entities, as discussed in fig. 4B. Additional details regarding optional stage 541 are discussed below.
Several optional stages that may be included in the method 500 are shown in Fig. 5D. It should be noted that the routes shown between stage 530 and the higher-numbered stages are optional, and that some stages may be reached in different ways in different instances of method 500. For example, stage 580 may be executed directly after stage 540, or after an intermediate stage 550 is executed (which may include different combinations of the sub-stages of stage 550). It should be noted that while comprehensive routes between stages (indicating execution order) are shown in Fig. 5D, these routes do not exhaust all possible choices, and other routes may be selected depending on various considerations that will naturally occur to those skilled in the art.
The image data content may be processed and used in various ways. It can serve as a basis for various decisions and actions, such as in what manner the crop should be treated, which steps are required to further monitor the agricultural area, how to treat the crop in an adjacent (or even remote) agricultural area, when the crop is mature, to predict the yield of the crop in the agricultural area and the time of harvest, and so forth.
The method 500 may include processing the image data content (or information based thereon) to provide decision-promoting information. The processing may be performed by an onboard processor carried by the onboard system (denoted 551), by a server (denoted 552), and/or by an end-user device (denoted 553). The process may involve human input (e.g., in an end-user device, an agronomist may enter instructions based on his analysis of the image data content, or mark for the farmer what to look for in order to see whether the proposed treatment works properly).
For example, the processing of stage 550 may include detecting individual leaves and cropping only the leaves from the image data, as shown in Fig. 14. Fig. 14 illustrates the cropping of a single leaf from image data according to an example of the presently disclosed subject matter. Image 1000 is processed to detect leaf edges (image 1010), and then partially imaged leaves are removed to provide an image that includes only single-leaf information (image 1020). Leaf cropping, or other image processing algorithms applied to the image data, may be based on leaf images, parameters, and/or data from multiple databases and multiple seasons.
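The "remove partially imaged leaves" step can be sketched as connected-component labeling on a binary leaf mask, discarding any component that touches the image border (a leaf cut off by the frame edge cannot be fully imaged). This is a simplified stand-in for the Fig. 14 pipeline, assuming an already-segmented mask; real leaf segmentation would precede it.

```python
def whole_leaf_regions(mask):
    """Label 4-connected foreground regions in a binary mask and keep
    only regions that do not touch the image border, as a stand-in for
    discarding partially imaged leaves."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    whole = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                stack, region, touches = [(y, x)], [], False
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    region.append((cy, cx))
                    if cy in (0, h - 1) or cx in (0, w - 1):
                        touches = True  # region is cut off by the frame
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if not touches:
                    whole.append(region)
    return whole

mask = [
    [1, 0, 0, 0, 0],   # border pixel: a cut-off leaf, dropped
    [0, 0, 1, 1, 0],
    [0, 0, 1, 1, 0],   # interior 2x2 blob: kept as a whole leaf
    [0, 0, 0, 0, 0],
]
regions = whole_leaf_regions(mask)
```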
This processing (or that of stage 530) may provide, for example, any one or more of the following: leaf size statistics, leaf density statistics, leaf color and spectral analysis, and morphology statistics.
The image data content may be processed and used in various ways. Alternatively, it may be sent to one or more entities, as described above (e.g., in fig. 4B). The image data content may also be used to determine parameters of the onboard system performing method 500. Such parameters may be parameters related to the acquisition of further image data in another example of method 500, may be aerodynamic parameters or operational parameters of the airborne platform, may be operational parameters of another system carried by the airborne platform (e.g., agricultural spray parameters), and the like.
Optionally, the method 500 may include planning a subsequent flight based on the image data content obtained in the method 500. The plan may be based on the image data content as well as on agricultural and/or other considerations. Method 500 may include another instance of stages 510, 520, 530, and 540 after stage 540. In this case, the method 500 may comprise a stage 561 of planning a route for the subsequent flight based on the image data content obtained from the previous acquisition instance.
Optionally, method 500 may include a stage 562 of selecting aerial application parameters for aerial application of crop protection products by an agricultural vehicle based on processing of the image data.
All of the stages discussed above are performed on the onboard platform carrying the imaging sensor used in the acquisition of stage 520 (except that stage 550 may also be performed partially or completely on a remote system). Other stages may be performed by other entities (not carried by the onboard platform), such as servers or end-user units.
Optional stage 570 includes sending, by a server located remotely from the onboard platform, decision-promoting information based on the image data content to the end-user device. Stage 570 may be performed by server 300, with reference to the examples described in the previous figures. The transmission may be performed wirelessly and/or over a wired communications medium, and may be facilitated by one or more intermediate systems (e.g., an internet router, etc.). Various examples of information that is based on image data content and that can contribute to the decision, and examples of decisions taken thereby, are provided above.
Returning to stage 550, which includes processing the image data content or information based thereon to provide decision-promoting information, it should be noted that the processing may include various processing procedures.
Fig. 6 is a flow chart illustrating an example of an agricultural monitoring method 700 according to an example of the presently disclosed subject matter. Referring to the examples described with respect to the preceding figures, method 700 may be performed by system 10. Method 700 is an example of method 500 and the various stages of method 700 are numbered with reference numbers corresponding to method 500 (i.e., stage 710 is an example of stage 510, stage 720 is an example of stage 520, etc.). It should be noted that where applicable, variations and examples discussed with reference to method 500 (either in the foregoing or in the following of the present disclosure) are also relevant to method 700.
The method 500, as implemented in the example of method 700, includes acquiring very high resolution images of crops at high speed (enabling large-area sampling of agricultural areas in relatively little time) using an onboard imaging sensor carried by an aircraft flying at very low altitude. Image data content generated on the onboard system is transmitted for processing at a remote off-site analysis server. The image data content is processed by the analysis server and then distributed to a management interface (e.g., a personal computer, handheld computer, etc.), where it may be provided to an agronomist, forwarded by a manager to another professional, or sent to a specialized system for further analysis. The high resolution of the images obtained in stage 720 enables analysis at the individual-leaf level, which may be used, for example, to detect leaf disease or to indicate the effect of parasites on the leaves.
Stage 710 of method 700 includes flying an onboard digital camera by an agricultural aircraft (e.g., a spray aircraft as shown in fig. 6) over the potato growth area at a speed of 10 to 15m/s along a flight path including a plurality of low-altitude imaging locations about 40 feet above crop level, enabling acquisition of image data at sub-millimeter image resolution.
Stage 720 of method 700 includes acquiring, by the onboard digital camera, image data of parts of the potato area at a sub-millimeter resolution of approximately 0.4 mm. The ground area covered by the digital camera in a single image is shown by a trapezoid drawn over the field.
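At a uniform 0.4 mm GSD, the ground footprint of one frame follows directly from the sensor's pixel count. The 4000 × 3000 sensor below is a hypothetical example (the patent does not specify the camera's pixel dimensions), and the nadir rectangle is an approximation of the trapezoid shown in the figure.

```python
def frame_footprint_m(gsd_m, sensor_px_w, sensor_px_h):
    # Ground footprint of a single nadir frame, assuming uniform GSD
    # (the trapezoid degenerates to a rectangle at nadir).
    return gsd_m * sensor_px_w, gsd_m * sensor_px_h

# 0.4 mm GSD with a hypothetical 4000 x 3000 pixel sensor:
w_m, h_m = frame_footprint_m(0.0004, 4000, 3000)
# -> 1.6 m x 1.2 m of ground covered per image
```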
Stage 730 of method 700 includes processing the image data by an onboard processing unit carried by the agricultural aircraft to provide image data content including high quality images of the potato leaves. In the illustrated example, a portion of the image acquired in stage 720 is cropped so that only the area around a suspect point detected in the acquired image is transmitted in stage 740. In the example shown, the suspect point is in fact a leaf area exhibiting early blight disease.
Stage 740 of method 700 comprises wirelessly transmitting image data content to a displaced remote server for distribution to an end user, such as an agriculturist.
Fig. 7 is a flow chart illustrating an example of an agricultural monitoring method 800 according to an example of the presently disclosed subject matter. Referring to the examples shown with respect to the previous figures, the method 800 may be performed by the system 10. Additional discussion and details regarding system 10 are provided following the discussion regarding method 800 below.
Stage 805 of method 800 includes developing a surveillance flight plan for the on-board surveillance system, the surveillance flight plan including an acquisition location plan indicative of a plurality of imaging locations.
Referring to the example described with respect to the previous figures, stage 805 may be performed by different entities, such as on-board system 10, server 300, and end-user devices (e.g., agriculturists 992, farmers 993, planning centers not shown in the figures, etc.), or any combination thereof (e.g., an agriculturist may suggest a plan that is then modified by on-board system 10 based on meteorological conditions).
The formulation of stage 805 can be based on various considerations. For example, the monitoring flight path, and possibly additional parameters, can be established so as to enable image acquisition at the required quality. Stage 805 can include, for example, the following sub-stages:
Determining a desired agricultural area (also referred to as a "plot") based on information obtained from the customer;
receiving Geographic Information System (GIS) information of the plot, as well as information about the structure of the plot (e.g., GIS information about irrigation pipes, roads, or other aspects of the plot structure).
Optionally, information about growing crops in the agricultural area is received, such as crop type, crop age (calculated from planting time), variety, and the like.
Mapping of terrain and obstacles, such as field-deployed irrigation systems, tall trees, overhead wires, fixed machinery, and others;
a surveillance flight path plan is formulated using a flight planning tool, for each crop and each plot based on general criteria for each crop (e.g., a potato or other flat crop is targeted at a plot of 5-20Ha, each plot receives a high altitude photograph by a high altitude single shot. These definitions are modified according to the crop species or the specific request of the client. It should be noted that optionally, the same flight path is taken multiple times throughout the season on each plot.
It should be noted that the surveillance flight plan may be updated. For example, on the day of the actual flight (if the surveillance flight plan was preset), the crew and/or local contacts arrive at the agricultural area, verify that obstacles are low enough for flight, and check the wind direction in order to optimize the flight path by flying downwind or upwind (e.g., it may be preferable to take photographs downwind rather than upwind).
Stage 810 of method 800 includes flying an on-board monitoring system along a flight path over an agricultural area where crops are growing according to a monitoring flight plan. Referring to the examples described in the previous figures, the on-board monitoring system may be imaging sensor 210 or the entire on-board system 10, and the flight of stage 810 may be performed by on-board platform 100. It should be noted that all optional variations, implementations, and sub-phases discussed with respect to phase 510 may be adjusted to accommodate phase 810 being performed based on the monitored flight plan.
Stage 820 of method 800 includes acquiring, by the on-board monitoring system during the flight, image data of the portion of the agricultural area at a sub-millimeter image resolution based on the acquisition location plan. Referring to the examples described in the previous figures, the on-board monitoring system may be the imaging sensor 210 or the entire on-board system 10. It should be noted that all optional variations, implementations, and sub-phases discussed with respect to phase 520 may be adjusted to accommodate phase 820 being performed based on the monitored flight plan.
It is noted that method 800 may also include processing of the image data to provide other decision-promoting information, similar to the processing discussed with respect to stage 550 (e.g., with respect to stage 551). Like stage 830, this processing of the image data may be based on monitoring a flight plan, but need not be.
Stage 840 of method 800 includes sending image data content based on image data acquired by the on-board monitoring system to the external system. Referring to the examples described in the previous figures, the transmission of stage 840 may be performed by communication module 230. It should be noted that all optional variations, implementations, and sub-stages discussed with respect to stage 540 may be adjusted to accommodate stage 840 being performed based on the monitoring flight plan.
Fig. 8 is a flow chart illustrating an example of an agricultural monitoring method 800 in accordance with an example of the presently disclosed subject matter. Method 800 optionally includes a stage 801 (prior to stage 805) comprising receiving monitoring requests associated with a plurality of independent entities. In this case stage 805 may include a stage 806 of developing a surveillance flight plan to indicate the crop imaging locations of each of the plurality of independent entities. As described above, such entities may be different agricultural areas (e.g., fields and orchards), agricultural areas of different clients (e.g., a field of one client and another field belonging to another client), and so forth.
As discussed with respect to method 500 above (e.g., as depicted in fig. 4A), more than one type of crop may be growing in the agricultural area. Stage 805 may include a stage 807 of defining different acquisition parameters for imaging locations associated with different types of crops.
Such acquisition parameters may include operating parameters of the onboard platform (e.g., speed, ground height, stability, etc.) and/or characteristics of the onboard monitoring system, particularly its sensors (e.g., exposure time, f-number, lens focal length, resolution, detection sensitivity, speed compensation, etc.).
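As a concrete sketch, the per-crop acquisition parameters of stage 807 could be grouped into a small record per crop type. All names and numeric values below are illustrative assumptions, not values taken from this disclosure:

```python
from dataclasses import dataclass

@dataclass
class AcquisitionParams:
    speed_mps: float        # onboard platform ground speed during imaging
    height_m: float         # flight height above ground (or above crop top)
    exposure_s: float       # sensor exposure time
    f_number: float         # lens aperture
    focal_length_mm: float  # lens focal length
    gsd_mm: float           # target ground sample distance

# Hypothetical mapping from crop type to acquisition parameters (stage 807);
# the crop names and numbers are placeholders for illustration only.
CROP_PARAMS = {
    "potato":  AcquisitionParams(15.0, 10.0, 1 / 4000, 5.6, 85.0, 0.5),
    "orchard": AcquisitionParams(8.0, 20.0, 1 / 2000, 4.0, 135.0, 0.8),
}

def params_for(crop_type: str) -> AcquisitionParams:
    """Look up the acquisition parameters defined for a crop type."""
    return CROP_PARAMS[crop_type]
```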
Fig. 9 is a functional block diagram illustrating an example agricultural monitoring system 200 according to an example of the presently disclosed subject matter. Some components of the agricultural monitoring system 200 (also referred to as system 200 for convenience) may have analogous structure, function and/or effect to components of system 10 (and vice versa), and such analogous components are therefore referred to using the same reference numerals. It should be noted that different components of the system 200 may perform different stages of the methods 500, 600, 700, and 800 (e.g., as shown below), and the system 200 as a whole may perform a process that includes two or more stages of these methods.
The agricultural monitoring system 200 includes at least an imaging sensor 210, a communication module 230, and a connector 290, and may include additional components such as (but not limited to) those discussed below.
The imaging sensor 210 is configured and operable to acquire image data of a portion of the agricultural area 900 where the crop is growing at a sub-millimeter image resolution when the imaging sensor is airborne. Imaging sensor 210 is onboard and may be used to acquire image data while the aircraft is in flight. It should be noted, however, that the imaging sensor 210 may also be used to capture images when it is not being carried by an aircraft. In addition, a standard imaging sensor (e.g., a standard digital camera such as Canon EOS 60D or Nikon D3200) may be used as the imaging sensor 210.
It should be noted that while at least a portion of the image data acquired by the imaging sensor 210 is acquired at sub-millimeter resolution, the imaging sensor 210 may additionally acquire image data of a portion of the agricultural area at a lower resolution (e.g., 2 mm, 1 cm GSD, etc.). Imaging sensor 210 may be configured to acquire image data at a low resolution using the same configuration used for acquiring at sub-millimeter resolution (e.g., if an airborne platform carrying imaging sensor 210 is flying at a higher altitude) or using other configurations. Such other configurations may be used, for example, to acquire orientation images (e.g., with a GSD of 2 cm) that provide spatial context for the high resolution image data.
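The relation between flight configuration and resolution can be sketched with the standard ground-sample-distance formula (GSD = pixel pitch × height / focal length); the specific pixel pitch and focal length below are assumed values for illustration:

```python
def gsd_mm(pixel_pitch_um: float, focal_length_mm: float, height_m: float) -> float:
    """Ground sample distance in mm: pixel pitch scaled by height over focal length."""
    pixel_pitch_mm = pixel_pitch_um * 1e-3
    return pixel_pitch_mm * (height_m * 1e3) / focal_length_mm

# With an assumed 4.3 um pixel pitch and an 85 mm lens, flying at 10 m gives
# a sub-millimeter GSD, while the same sensor much higher up yields the
# coarser (~2 cm) orientation imagery described above.
```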
As discussed above with respect to stage 520 of method 500, imaging sensor 210 may be used to acquire image data of a portion of an agricultural area at an image resolution that is at least one order of magnitude finer than the average leaf size of the imaged crop. That is, in at least a portion of the image data, a plurality of leaves of the crop are imaged at a resolution capable of resolving at least ten independently resolvable portions of the leaves. Different intensity levels may be measured for each of these distinguishable portions of the blade. Optionally, the imaging sensor 210 may be used to acquire image data of a portion of the agricultural area at an image resolution that is at least two orders of magnitude finer (and optionally at least three orders of magnitude finer) than the average leaf size of the imaged crop.
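The "order of magnitude finer than leaf size" criterion can be expressed directly; the 50 mm leaf size used in the comment is an assumed example value:

```python
def max_gsd_for_leaf_mm(avg_leaf_size_mm: float, orders_finer: int = 1) -> float:
    """Coarsest GSD that is `orders_finer` orders of magnitude finer than the leaf."""
    return avg_leaf_size_mm / 10 ** orders_finer

def resolvable_parts_across_leaf(avg_leaf_size_mm: float, gsd_mm: float) -> int:
    """How many independently resolvable portions span one average leaf."""
    return int(avg_leaf_size_mm // gsd_mm)

# For an assumed 50 mm leaf: one order of magnitude finer means a GSD of at
# most 5 mm (ten resolvable portions across the leaf); two orders means
# 0.5 mm, i.e., sub-millimeter resolution.
```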
Different kinds of imaging sensors 210 may be used as part of the system 200. For example, the image sensor 210 may be a semiconductor Charge Coupled Device (CCD) image sensor, a Complementary Metal Oxide Semiconductor (CMOS) image sensor, or an N-type metal oxide semiconductor (NMOS) image sensor. It should be noted that more than one imaging sensor 210 may be included in the system 200. For example, the system 200 may include a first on-board imaging sensor for low-altitude photography of an agricultural area and a second imaging sensor 210 for high-altitude directional photography of the agricultural area (and possibly its surroundings as well). Further, the system 200 may include different types of imaging sensors 210. For example, system 200 may include imaging sensors 210 that are sensitive to different portions of the electromagnetic spectrum.
In addition to any optics that may be incorporated into the imaging sensor 210, the system 200 may also include additional optics (e.g., optional lens 211, and elements 211, 212, and 213 in fig. 11A) for directing light from the agricultural area onto the light collection surface of the imaging sensor 210. Such additional optics may condition the collected light before it is directed to the imaging sensor 210. For example, the additional optics may filter out portions of the electromagnetic spectrum, may filter out and/or alter the polarization of the collected light, and so on.
Optionally, the system 200 may be used to image a portion of an agricultural area in low-altitude flight (e.g., less than 30 m above ground, less than 20 m above ground, or even less than 10 m above ground). Optionally, the imaging sensor 210 may be configured and used to acquire image data at a height of less than 20 meters above the top of crops growing in the agricultural area.
The selection of the fly height of system 200 may depend on several factors. First, the height of the onboard system above the agricultural area determines the amount of light reaching the imaging sensor 210 during image data acquisition, and therefore also determines the exposure time and aperture available for collecting light. Thus, while low flight may limit the field of view of the imaging sensor, it enables the use of short exposure times and small apertures to acquire image data, thereby facilitating the capture of image data by the system 200 while flying at considerable speeds.
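The exposure-time consideration can be quantified as motion blur in image pixels; a minimal sketch, using assumed example values in the comment:

```python
def blur_pixels(speed_mps: float, exposure_s: float, gsd_mm: float) -> float:
    """Ground distance travelled during the exposure, expressed in image pixels."""
    ground_motion_mm = speed_mps * 1e3 * exposure_s
    return ground_motion_mm / gsd_mm

# At an assumed 15 m/s with a 1/4000 s exposure and 0.5 mm GSD, the platform
# moves 7.5 pixels' worth of ground during the exposure -- illustrating why
# short exposures alone may not suffice and motion compensation is also used.
```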
Another consideration in determining the operational flying height is noise and noise cancellation, particularly when image data is acquired at significant flying speeds (e.g., over 10 m/s). As discussed with respect to motion compensation, one method for compensating for airborne platform motion during acquisition may be accomplished by rotating imaging sensor 210 relative to the agricultural area or rotating an optical component (e.g., prism 212 or rotating mirror 213) to direct light from the agricultural area onto imaging sensor 210. In this case, the rotational speed of the rotating optics should compensate for the angular velocity of the onboard system (e.g., the center of the acquired frame of image data) relative to a fixed point on the agricultural area. Given a fixed linear velocity v of the airborne platform (assuming it flies perpendicular to the ground), the angular velocity of the airborne platform relative to the ground is inversely proportional to the height of the airborne platform above the ground.
However, the actual angular velocity of an airborne platform relative to an agricultural area depends not only on its flight speed and altitude, but also on noise and motion (pitch, yaw, roll, vibration, drift, etc.). The angular velocity is therefore composed of a component formed by the flight velocity of the airborne platform and a component caused by this noise. If V is the horizontal flying speed of the airborne platform and R is its height above the ground, then the angular velocity is ω = V/R + ω_noise. Therefore, low-altitude flight reduces the relative influence of the noise on the angular velocity, and improves image quality. It is noted that the angular velocity of the rotating optical component may also be determined using information regarding ω_noise, such as information regarding movement of the onboard platform collected by the IMU 270.
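A minimal sketch of this relation (angular velocity as a flight-speed component V/R plus a noise component), showing why low flight reduces the relative weight of the noise term:

```python
def platform_angular_velocity(v_mps: float, height_m: float,
                              omega_noise: float = 0.0) -> float:
    """omega = V / R + omega_noise, in rad/s."""
    return v_mps / height_m + omega_noise

def noise_fraction(v_mps: float, height_m: float, omega_noise: float) -> float:
    """Fraction of the total angular velocity contributed by noise."""
    return omega_noise / platform_angular_velocity(v_mps, height_m, omega_noise)

# At the same speed and noise level, the noise fraction is smaller when
# flying low (V/R dominates) than when flying high.
```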
The system 200 further includes a communication module 230 configured and operable to transmit image data content based on image data acquired by the onboard imaging sensor to an external system. The external system is not part of system 200, and optionally is a system that is not installed on the aircraft carrying system 200. It should be noted that communication module 230 may be used to transmit image data content directly to an external system remote from the onboard platform carrying system 200. Alternatively, the communication module 230 may transmit the image data content to such a remote system via a communication module of the onboard platform (or a communication system mounted on the onboard platform). For example, if the airborne platform is equipped with a radio link and/or satellite communication to a ground unit, the communication module 230 may send the image data content to the radio unit and/or satellite communication unit, which then transmits it to the ground unit. As described with respect to stage 540 of method 500, communication module 230 may be used to wirelessly transmit image data content to an external system, and to transmit image data content to an external system in real-time (or near real-time).
Different kinds of communication modules 230 may be included as part of system 200. For example, an internet communication module, a fiber optic communication module, or a satellite-based communication module may be used.
The communication module 230 is optionally an onboard communication module, meaning it is used to transmit image data while the aircraft is in flight. It should be noted, however, that the communication module may also transmit image data content after the aircraft returns to the ground. The system 200 may be connected to an onboard platform when the communication module transmits image data content, but this is not necessarily so.
Different kinds of connectors 290 may be used to connect the imaging sensor 210 to the onboard platform. For example, the following connector types (and any combination thereof) may be used as the connector 290: glue, welding, one or more screws, mechanical latches, clamps, clasps, rivets, clips and/or bolts, hook and loop fasteners, magnetic and/or electromagnetic fasteners, and the like.
It should be noted that connector 290 may connect imaging sensor 210 to an onboard platform either directly (i.e., the sensor is in direct contact with the platform, or only the connector separates the two) or indirectly (e.g., connecting the housing of system 200 to the onboard platform, with imaging sensor 210 connected to the housing).
It should be noted that connector 290 may connect other components of system 200 to the onboard platform-either directly or indirectly. For example, one or more connectors 290 may be used to connect communication module 230, optional processor 220, and/or an optional housing (not shown) of system 200 to an onboard platform. Each of these components may be connected directly or indirectly to the onboard platform by a connector 290. It should be noted that connector 290 may include a number of connection components that may be used to connect different portions of system 200 to an onboard platform. Referring to the example of fig. 1A-1C, connector 290 may include a weld to weld one type of communication module 230 to the rear of the aircraft, and four screws to connect imaging sensor 210 to the front of the aircraft.
It should be noted that connector 290 may be used to removably connect one or more components of system 200 to an onboard platform (e.g., using screws, hook and loop fasteners, snaps, etc.). It should be noted that connector 290 may also be used to connect one or more components of system 200 to an onboard platform in a non-removable manner (e.g., using welding, glue, etc.). Imaging sensor 210 may be connected to an onboard platform using a removable and/or non-removable connector 290. The use of a detachable connector 290 may be useful, for example, if the system 200 is a portable unit that is connected to different aircraft on an as-needed basis (e.g., to a spraying agricultural aircraft according to the spraying plan for the day).
Fig. 10 is a functional block diagram illustrating an example agricultural monitoring system 200 according to an example of the disclosed subject matter.
Optionally, the system 200 may include a processor 220. Processor 220 may be used to receive image data acquired by imaging sensor 210, to process the data, and to transfer information based on the processing of the image data (such information may include, for example, instructions, image data content, etc.) to another component, unit, or system. It should be noted that optional processor 220 may base its processing on sources of information other than the image data acquired by imaging sensor 210. In general, processor 220 may be configured and operable to perform any combination of one or more of the processing, analysis, and computation processes discussed with respect to stages 530 and 550 of method 500.
For example, the processor 220 may be configured and arranged to process the image data content for detecting leaf disease and/or indicating parasite effects on leaves in one or more plants in the agricultural area. For example, the processor 220 can be configured and operable to process the image data content to identify selected agronomic significant data and generate agronomic image data for transmission to a remote system based on the selected agronomic significant data.
Optionally, the imaging sensor 210 may be configured and used to acquire image data at flying speeds in excess of 10 m/s. Optionally, the imaging sensor 210 may be configured and used to acquire image data at a flight speed that is no less than 50% of the average speed of the airborne platform along the flight path or along the imaging leg 911, as depicted in fig. 4B.
Images acquired while the carrying airborne platform is flying at higher speeds may cover a larger portion of the agricultural area. Wide coverage of the agricultural area may also be facilitated by sampling it, i.e., collecting image data from representative samples of the agricultural area (as depicted in fig. 4B). For example, the system 200 may be used to acquire image data of an agricultural area at a coverage rate of less than 500 square meters per hectare.
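The sampling-based coverage figure can be put in perspective with a short calculation (the per-frame footprint below is an assumed value for illustration):

```python
def sampled_fraction(coverage_m2_per_ha: float) -> float:
    """Fraction of each hectare actually imaged (1 ha = 10,000 m^2)."""
    return coverage_m2_per_ha / 10_000.0

def frames_per_ha(coverage_m2_per_ha: float,
                  frame_w_m: float, frame_h_m: float) -> float:
    """Number of image frames needed per hectare at a given coverage rate."""
    return coverage_m2_per_ha / (frame_w_m * frame_h_m)

# 500 m^2 per hectare corresponds to sampling 5% of the area; with an
# assumed 5 m x 3 m footprint per frame, that is about 33 frames per hectare.
```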
11A, 11B, 11C, and 11D are functional block diagrams illustrating an example agricultural monitoring system 200 with a motion compensation mechanism in accordance with an example of the presently disclosed subject matter.
In the example of fig. 11A, motion compensation is achieved by rotating a prism through which light is directed to the imaging sensor 210. In the example of fig. 11A, the system 200 includes one or more mechanical connections 241 (a shaft in the illustrated example) that connect at least one component of the imaging sensor 210 (in this case, the prism 212) to the motor 240. Through mechanical linkage 241, the motion of motor 240 mechanically moves at least one component of imaging sensor 210 (i.e., the prism 212 in the illustrated example) relative to the carrying onboard platform (not shown in fig. 11A). The motion of the motor moves the respective component(s) of the imaging sensor 210 at the same time that image data is acquired by the imaging sensor 210. It should be noted that the prism 212, the lens 211, and the rotating mirror 213 are drawn outside the box of the imaging sensor 210 for convenience, but actually belong to the imaging sensor 210. It should be noted that optical components may belong to the imaging sensor 210 even if they are not enclosed in the same housing that protects the light-sensitive surface of the imaging sensor 210.
It should be noted that other components that deflect light onto the photosensitive surface of the imaging sensor 210 may be used in place of a prism (e.g., a rotatable mirror 213 as shown in fig. 11B). Alternatively, the entire imaging sensor 210 may be moved by the motor 240 relative to the carrying onboard platform while image data is acquired.
The motion of the imaging sensor 210 relative to one or more components of the airborne platform may be used to compensate for the motion of the airborne imaging sensor relative to the crops during acquisition. Thus, the imaging sensor 210 may operate within the system 200 to acquire image data of the agricultural area even when the speed of the carrying onboard platform is high (e.g., above 10 m/s), thereby achieving high coverage of the agricultural area.
The speed at which the mechanical linkage 241 moves the respective components of the imaging sensor 210 may be selected such that, at the time of image data acquisition, the relative speed between the light collecting surface (214 in fig. 11B) of the imaging sensor 210 and the imaged portion of the agricultural area is zero or near zero, but this is not necessarily so.
The imaging sensor 210 may include a focusing mechanism (not shown) for focusing light arriving from a portion of the agricultural area onto the light-sensitive surface of the imaging sensor 210. For example, a focusing mechanism may be required to accommodate image acquisition when flying at different altitudes above ground. The focusing mechanism may be operated automatically (by a focus control processor, not shown). The focus control processor may be configured and used to focus the optics of the imaging sensor 210 while light from a first portion of the agricultural area is projected onto the light collection surface of the imaging sensor 210, where the image data subsequently acquired by the imaging sensor 210 is of a second portion of the agricultural area that does not completely overlap the first portion. Referring to the example of fig. 11B, this may be used, for example, to focus when light rays arrive diagonally (relative to the ground) at the imaging sensor 210, and to acquire image data when light from the agricultural area arrives perpendicular to the imaging sensor 210.
Optionally, the motor 240 may be used to mechanically rotate at least one optical component of the imaging sensor 210 relative to the airborne platform (e.g., via one or more mechanical connections 241) to compensate for movement of the imaging sensor 210 relative to the crop during acquisition. In this case, the imaging sensor 210 may be configured and operable to: (a) rotating the at least one optical component while initiating a focusing process when the collection optical axis is at an angle greater than 20 ° from the vertical axis, and (b) collecting image data using vertical imaging when the collection optical axis is at an angle less than 20 ° from the vertical axis.
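The two-state behavior of (a) and (b) can be sketched as a simple scheduling rule keyed on the collection optical axis angle:

```python
def scheduler_action(optical_axis_angle_deg: float,
                     threshold_deg: float = 20.0) -> str:
    """Focus while the axis is far from vertical; acquire when near vertical.

    The boundary handling at exactly the threshold is an assumption; the
    description above only specifies behavior above and below 20 degrees.
    """
    return "acquire" if abs(optical_axis_angle_deg) < threshold_deg else "focus"
```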
In the example of fig. 11C, motion compensation is implemented using illumination. Optionally, the system 200 includes an illumination unit 250 (e.g., a projector and/or a flash unit) configured and used to illuminate the crop during the acquisition of image data by the on-board imaging sensor. For example, LED (light emitting diode) lighting may be used. The illumination may be used to compensate for motion of the onboard imaging sensor relative to the crop during acquisition. Various types of lighting may be used (e.g., depending on the relative importance of energy consumption considerations with respect to other design factors of the system 200). It should be noted that flash illumination may be used to reduce the time that the light-sensitive surface 214 of the imaging sensor 210 should be exposed to light from the agricultural area 900 to produce an image, which in turn reduces the effect of motion on the blurring of the resulting image data.
In the example of fig. 11D, the agricultural monitoring system 200 includes an altimeter 250. For example, the altimeter 250 may be a laser altimeter whose laser beam passes through a corresponding window (denoted "altimeter window 252") of the agricultural monitoring system 200. The system 200 may also include an Inertial Measurement Unit (IMU) 270 that measures and reports the velocity, orientation, and gravitational forces of the aircraft using a combination of one or more accelerometers, gyroscopes, and/or magnetometers. System 200 may also include a rotary encoder 260, where rotary encoder 260 measures the rate of rotation of rotating mirror 213 (or the rate of rotation of prism 212, as described above).
The motor controller 248 may use information from the IMU 270, altimeter 250, and rotary encoder 260 to determine the rotational speed of the motor 240 (and thus the rotational speed of the rotating mirror 213).
It should be noted that the angular velocity of the imaging plane (e.g., the angular velocity of the transparent window 219 that transmits light from the agricultural area 900 to the imaging sensor 210) depends on various factors, including the airspeed of the aircraft, its pitch angle, and its altitude above the agricultural area 900. In addition, information from the laser altimeter may also require corrections based on pitch angle and tilt angle data.
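One plausible form of the altimeter correction, assuming the laser beam is fixed in the platform frame so that pitch and roll turn the measured range into a slant range (the exact geometry is an assumption; the text above only states that a pitch- and tilt-based correction may be required):

```python
import math

def vertical_height(slant_range_m: float, pitch_deg: float, roll_deg: float) -> float:
    """Project a laser-altimeter slant range onto the vertical using pitch/roll."""
    return (slant_range_m
            * math.cos(math.radians(pitch_deg))
            * math.cos(math.radians(roll_deg)))
```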
Optionally, the axis of rotation of the rotating mirror 213 is parallel to the horizon and perpendicular to the main axis of the aircraft. However, since the aircraft flight direction is not necessarily parallel to the main axis of the aircraft (e.g., because crosswinds may drift, or for maneuvering reasons), the system 200 may also compensate for components that are perpendicular to the main axis of the aircraft.
Fig. 12 is a functional block diagram illustrating an example agricultural monitoring system 200 according to an example of the presently disclosed subject matter. As described above, the agricultural monitoring system 200 may optionally include an airborne platform 100 that may be used to fly an airborne imaging sensor along a flight path over an agricultural area.
Different types of airborne platforms may be used as airborne platform 100. For example, airborne platform 100 may be any of the following airborne platform types: airplanes, helicopters, multi-rotor helicopters (e.g., quadcopters), unmanned aerial vehicles (UAVs), powered parachutes (also known as motorized parachutes, PPCs, or powered soft-wing gliders), and the like. The type of airborne platform 100 may be selected based on, for example, aerodynamic parameters (e.g., speed, altitude, maneuverability, stability, load capacity, etc.), the degree of manual or automated control, and other usage requirements of the airborne platform.
Optionally, the airborne platform 100 included in the system 200 may include an engine that may propel the airborne platform 100 during flight. Optionally, airborne platform 100 included in system 200 may include wings (whether fixed or rotating) that may be used to provide lift to airborne platform 100 during flight.
Fig. 15 is a flow chart illustrating an example of an agricultural monitoring method 1100 according to an example of the presently disclosed subject matter. Referring to the examples described with respect to the preceding figures, the method 1100 may be performed by the server 300. Referring to method 500, it should be noted that execution of method 1100 may be initiated after completion of stage 540 of transferring image data content, but may also be initiated during execution of stage 540. That is, the server may begin receiving, processing, and utilizing some of the image data content before all of the image data content is generated by the on-board system. This may be the case, for example, if the onboard system processes and transmits image data content during the acquisition flight.
Optionally, the processing of stage 1120 may include analyzing the image data content to identify selected agronomically significant data within the image data content, and processing such agronomically significant data to provide the agronomic data.
Optionally, the processing of stage 1120 may include applying computerized processing algorithms to the image data content for detecting leaf disease or indicating parasite effects on the leaves in one or more plants of the agricultural area.
Referring to the examples described in the previous figures, the agronomic data transmitted in stage 1130 may be transmitted to various entities, such as agricultural aircraft 991, agriculturists 992, and/or farmers 993.
It should be noted that method 1100 is performed by a server (e.g., server 300) that supports the various variations described with respect to method 500. For example, with respect to detecting growth of crops in an agricultural area, the receiving of stage 1110 may include image data content of the agricultural area acquired (by the at least one imaging sensor) on different days (possibly extending over several weeks), and the processing of stage 1120 may include processing the image data content for determining growth parameters of plants in the agricultural area.
With respect to another example of monitoring agricultural areas of a plurality of entities, it should be noted that, optionally, the image data content may include first image data content of a first agricultural asset of a first owner and second image data content of a second agricultural asset of a second owner other than the first owner, and the sending of stage 1130 may include sending the first image data content in a first message and sending the second data content in a second message. Each of the first message and the second message may include information identifying the respective agricultural asset owner, and/or may be addressed to a system and/or another entity associated with the respective owner.
The method 1100 may further include applying computerized processing algorithms to the agronomic significant data to select a recipient of the agronomic image data from a plurality of potential recipients based on agronomic expertise of the potential recipients. The sending of stage 1130 may be performed based on the results of the selection.
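A hypothetical sketch of the per-owner message construction and expertise-based recipient selection of stage 1130; all field names and the default recipient are illustrative assumptions, not details from this disclosure:

```python
def build_messages(contents: list) -> list:
    """One message per image data content item, addressed to its asset owner."""
    return [{"to": c["owner"], "owner_id": c["owner"], "payload": c["data"]}
            for c in contents]

def select_recipient(finding: str, experts: dict) -> str:
    """Pick the potential recipient whose agronomic expertise matches the finding."""
    for name, expertise in experts.items():
        if expertise == finding:
            return name
    return "farmer"  # assumed default recipient when no specialist matches
```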
Fig. 16 is a functional block diagram illustrating an example of a server 300 for agricultural monitoring, in accordance with an example of the disclosed subject matter. The server 300 may include a communication module 310 and a server processing module 320, as well as additional components (e.g., power supplies, user interfaces, etc.) that are omitted for simplicity.
As described in detail above, the receipt of image data content may be based on image data acquired in low flight in an agricultural area. In particular, the image data content may be based on image data acquired at a set of imaging locations along the flight path that are less than 20 meters above the agricultural area where the crop is growing.
The systems and methods described above are described in the context of monitoring an agricultural area where crops are growing. It will be apparent to those skilled in the art that these methods and systems may also be used to monitor ground areas on which no crop is currently growing. For example, the systems and methods may be used to determine the soil type in such areas and its material composition, to assess irrigation levels in such areas, to detect parasites or weeds, and the like. It should therefore be noted that the system described above may be adapted, mutatis mutandis, to monitor a ground area with or without a crop. Furthermore, the above methods may be adapted, mutatis mutandis, for monitoring a ground area with or without a crop. In both cases, imaging of the ground area is still performed at sub-millimeter resolution and may be accomplished in any of the ways described above (e.g., with motion compensation, etc.). Some examples are provided in figs. 17, 18 and 19.
Fig. 17 is a flow chart illustrating an example of a method 1800 for surface area monitoring in accordance with an example of the disclosed subject matter. Referring to the examples described with respect to the previous figures, method 1800 may be performed by system 10.
It should be noted that the variations and examples discussed with reference to method 500 apply, mutatis mutandis, to method 1800 where applicable. Where applicable, the relevant variations of stages 510, 520 and possibly 530, 540 and subsequent stages may be effected in the respective stages of method 1800, performed based on the monitoring flight plan developed in stage 1805, with the modification that the ground area is not necessarily an agricultural area where crops are growing.
A stage 1805 of method 1800 includes developing a surveillance flight plan for the on-board surveillance system, the surveillance flight plan including an acquisition location plan indicative of a plurality of imaging locations.
Referring to the example described with respect to the preceding figures, stage 1805 may be performed by a different entity, such as on-board system 10, server 300, and end-user devices (e.g., agriculturist 992, farmer 993, planning center not shown, etc.) or any combination thereof (e.g., agriculturist 992 may propose a plan, which is then modified by on-board system 10 based on meteorological conditions).
The formulation of stage 1805 may be based on various considerations. For example, a monitoring flight path and possible additional parameters can be established in order to be able to carry out image acquisition with the required quality. Stage 1805 may include, for example, the following sub-stages:
determining one or more ground areas needed based on the information obtained from the customer;
receiving Geographic Information System (GIS) information for one or more ground areas, as well as information about the structure of one or more ground areas (e.g., GIS information about irrigation pipes, roads, or other aspects of the structure).
Optionally, information is received about the surface area surface soil, such as soil type, variety, etc.
Mapping of terrain and obstacles, such as field-deployed irrigation systems, tall trees, electrical lines, fixed machinery, and others, on each of the one or more ground areas and their surroundings, according to GIS information (other additional information may also be used).
A surveillance flight plan is developed for each of one or more ground areas (or subdivisions thereof) using the flight planning tool. It should be noted that the general criteria apply alternatively to different types of soil or other different sub-areas of one or more ground areas.
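The sub-stages above could be strung together roughly as follows; this is a sketch only, and the GIS record layout and field names are assumptions:

```python
def develop_monitoring_flight_plan(areas, gis, soil_info=None):
    """Combine client areas, GIS data, and optional soil info into a plan."""
    plan = {}
    for area in areas:
        record = gis.get(area, {})
        plan[area] = {
            "imaging_locations": record.get("waypoints", []),  # acquisition plan
            "avoid": record.get("obstacles", []),              # mapped obstacles
            "soil": (soil_info or {}).get(area),               # optional soil data
        }
    return plan
```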
It should be noted that the surveillance flight plan may be updated. For example, on the day of the actual flight (if a surveillance flight plan is preset), the flight crew and/or local contact arrive at the ground area, verify that obstacles are low enough for low flight, and check the wind direction in order to optimize the flight path by flying downwind or upwind (e.g., it may be preferable to take photographs downwind rather than upwind).
A stage 1810 of method 1800 includes flying an airborne surveillance system along a flight path over a ground area based on the surveillance flight plan (the term "ground area" is explained in the preceding paragraphs). Referring to the example described in the previous figures, the airborne surveillance system may be imaging sensor 210 or the entire on-board system 10 (mutatis mutandis; the ground area is not necessarily an agricultural area where crops are growing), and the flight of stage 1810 may be performed by on-board platform 100. It should be noted that all optional variations, implementations, and sub-stages discussed with respect to stage 510 may be adapted to stage 1810, performed based on the surveillance flight plan.
A stage 1820 of method 1800 includes acquiring, by the airborne surveillance system during the flight, image data of the portion of the ground area at a sub-millimeter image resolution based on the acquisition location plan. Referring to the example described in the previous figures, the airborne surveillance system may be the imaging sensor 210 or the entire on-board system 10 (mutatis mutandis; the ground area is not necessarily an agricultural area where crops are growing). It should be noted that all optional variations, implementations, and sub-stages discussed with respect to stage 520 may be adapted to stage 1820, to be performed based on the surveillance flight plan.
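The claims elaborate the acquisition of stage 1820 with an angle-gated scheme: while an optical component rotates, the sensor runs its focusing process when the acquisition optical axis is more than 20° from vertical, and acquires image data by vertical imaging when the axis is within 20° of vertical. A minimal sketch of that gating logic follows; the function name, the string labels, and the behavior at exactly 20° are assumptions for illustration.

```python
# Sketch of the claimed angle-gated acquisition logic. The claims specify
# focusing above 20 deg off-vertical and capture below 20 deg; the exact
# boundary behavior and all names here are illustrative assumptions.
FOCUS_THRESHOLD_DEG = 20.0

def acquisition_action(axis_angle_deg):
    """Decide the sensor action for the current optical-axis angle,
    measured from the vertical axis, in degrees."""
    if axis_angle_deg > FOCUS_THRESHOLD_DEG:
        return "focus"    # start/continue the focusing process
    return "capture"      # acquire image data by vertical imaging
```

Such gating lets the sensor pre-focus while the rotating optics sweep toward nadir, so that the near-vertical window is spent capturing rather than focusing.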
A stage 1830 of method 1800 includes processing the image data by an onboard processing unit carried by the same onboard platform that flies the airborne surveillance system over the ground area. Referring to the example described with respect to the previous figures, stage 1830 may be performed by a processor of the system of stage 1810 (e.g., processor 220, mutatis mutandis). It should be noted that all optional variations, implementations, and sub-stages discussed with respect to stage 530 may be applied to stage 1830.
Stage 1830 may be performed based on the surveillance flight plan formulated in stage 1805, but this is not necessarily so. For example, the processing of optional stage 1830 may be based on information included in the surveillance flight plan regarding the type of soil or the type of agricultural condition sought (e.g., soil moisture, ground evenness, etc.). It should be noted that the surveillance flight plan (or a more general plan defined for the surveillance flight, i.e., a plan that includes the surveillance flight plan and additional information) may include parameters and/or instructions (e.g., instructions regarding how much information should be transmitted to the external system in stage 1840) that affect the processing of optional stage 1830.
It should be noted that method 1800 may also include processing the image data to provide other decision-facilitating information, similar to the processing discussed with respect to stage 550 (e.g., with respect to stage 551), mutatis mutandis. Like stage 1830, this processing of the image data may be based on the surveillance flight plan, but this is not necessarily so.
A stage 1840 of method 1800 includes sending image data content based on the image data acquired by the airborne surveillance system to an external system. With reference to the examples described with respect to the previous figures, the transmission of stage 1840 may be performed by the communication module 230, mutatis mutandis. It should be noted that all optional variations, implementations, and sub-stages discussed with respect to stage 540 can be adapted, mutatis mutandis, to stage 1840, performed based on the surveillance flight plan.
Referring to method 1800, method 1800 (and the design of the surveillance flight plan, in particular) may be used, for example, to see whether a seeded agricultural area has germinated, whether a ground area is suitable for agricultural use, to determine that plumbing and/or watering and/or irrigation systems and/or other agricultural systems are functioning, and so forth.
For example, the ground area may include different types of soil, and the acquiring may include acquiring image data of different locations in the ground area to generate a soil map of the ground area (e.g., on an onboard platform and/or ground system).
For example, acquiring may include acquiring image data indicative of material composition at different locations in the ground region. Such a composition of matter may include different types of ground and/or stone, different types of minerals, and the like.
For example, acquiring may include acquiring image data indicative of agricultural disaster recovery levels at different locations in the ground area.
It should be noted that more than one type of soil (or other objects on, above, or partially buried in the ground) may be present in the ground area. Stage 1805 may include defining different acquisition parameters for imaging locations associated with different types of terrain (or other objects such as those previously mentioned in this paragraph).
Such acquisition parameters may include operating parameters of the onboard platform (e.g., speed, ground height, stability, etc.) and/or parameters of the onboard monitoring system, particularly its sensors (e.g., exposure time, f-number, lens focal length, resolution, detection sensitivity, speed compensation, etc.).
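The per-terrain parameter definition of stage 1805 can be sketched as a simple lookup from terrain type to acquisition settings. The table values, profile fields, and names below are illustrative assumptions only and are not taken from the patent.

```python
# Hypothetical sketch of defining different acquisition parameters for
# imaging locations associated with different terrain types. All values
# and names are illustrative assumptions.
ACQUISITION_PROFILES = {
    "clay":   {"exposure_us": 250, "f_number": 5.6, "altitude_m": 15},
    "sand":   {"exposure_us": 125, "f_number": 8.0, "altitude_m": 15},  # brighter surface
    "gravel": {"exposure_us": 200, "f_number": 5.6, "altitude_m": 10},
}
DEFAULT_PROFILE = {"exposure_us": 200, "f_number": 5.6, "altitude_m": 15}

def acquisition_parameters(terrain_type):
    """Return sensor/platform settings for a terrain type, falling back
    to a default profile for terrain not in the table."""
    return ACQUISITION_PROFILES.get(terrain_type, DEFAULT_PROFILE)
```

A fuller profile could also carry the platform-side parameters the text mentions (speed, ground height, stability) alongside the sensor-side ones.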
Fig. 18 is a flow chart illustrating an example of a method 1900 of surface area monitoring in accordance with an example of the disclosed subject matter. Referring to the examples set forth with respect to the following figures, method 1900 may be performed by server 1300.
Referring to method 1800, it should be noted that execution of method 1900 may be initiated after completion of stage 1840 of transferring image data content, but may also be initiated during execution of stage 1840. That is, the server may begin receiving, processing, and utilizing some image data content before all of the image data content is generated by the on-board system. This may be the case, for example, if the onboard system processes and transmits image data content during the acquisition flight.
The term "ground data" relates to data pertaining to the ground and/or to the ground surface. In some embodiments of the invention, the term "ground data" may be broadly construed to include objects touching the ground, whether living objects (e.g., worms, fallen leaves) or inanimate objects (e.g., pipelines, sprinklers). However, some embodiments of method 1900 (and server 1300) are implemented in a stricter manner, in which the term "ground data" applies only to the ground itself (topsoil, stones, etc.).
Optionally, the processing of stage 1920 can include analyzing the image data content to identify selected agronomically significant data within the image data content, and processing such agronomically significant data to provide ground data. For example, the selected agronomically significant data may include images that clearly show the ground type, images showing parasites, worms, or other creatures, images showing pipe breaks or wear, and so forth.
Optionally, the processing of stage 1920 may include analyzing the image data content to identify selected ground-significant data within the image data content, and processing such ground-significant data to provide ground data. For example, the selected ground-significant data may include images showing ground type, images indicating lower-soil (below-topsoil) content that may be exposed in some areas, and the like.
Optionally, the processing of stage 1920 can include applying computerized processing algorithms to the image data content to distinguish areas of the ground area having different types of ground. The different types of ground may include different types of soil, rock, stone, and/or other minerals.
Optionally, the processing of stage 1920 may include determining a composition of matter in the ground area, and generating ground data according to the results of the determination.
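As one illustration of the stage 1920 processing that distinguishes areas with different ground types, image tiles could be classified by a simple per-tile statistic and assembled into a coarse soil map. A real implementation would use a trained classifier over spectral features; the thresholds, labels, and function names below are assumptions for the sketch.

```python
# Illustrative sketch of distinguishing ground types per image tile and
# assembling a coarse soil map. Thresholds and labels are assumed, not
# taken from the patent.
def classify_tile(mean_intensity):
    """Map a tile's mean pixel intensity (0-255) to a coarse ground label."""
    if mean_intensity < 80:
        return "wet_soil"   # dark tiles: assume moist soil
    if mean_intensity < 160:
        return "dry_soil"   # mid tiles: assume dry topsoil
    return "stone"          # bright tiles: assume stone/rock

def soil_map(tile_grid):
    """tile_grid: 2-D list of per-tile mean intensities -> 2-D label grid."""
    return [[classify_tile(v) for v in row] for row in tile_grid]
```

The resulting label grid corresponds to the soil map of the ground area mentioned above, which could then be transmitted as ground data in stage 1930.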
Stage 1930 of method 1900 comprises transmitting the ground data to an end-user remote system. Referring to the examples described with respect to the following figures, stage 1930 may be performed by communication module 1310 of server 1300.
The ground data sent in stage 1930 may be communicated to various entities, such as agricultural aircraft 991, agriculturist 992, and/or farmer 993, as well as agronomists and/or geologists, in accordance with the examples described in the preceding figures.
It should be noted that method 1900 may be performed by a server (e.g., server 1300) that supports the variations discussed with respect to method 1800, mutatis mutandis.
Referring to method 1900, it should be noted that the image data content may be based on image data acquired at a set of imaging locations along the flight path, the set of imaging locations being less than 20 meters above the ground area.
Fig. 16 is a functional block diagram illustrating an example of a server 1300 for agricultural monitoring, in accordance with an example of the disclosed subject matter. The server 1300 may include a communication module 1310 and a server processing module 1320, as well as additional components (e.g., power supplies, user interfaces, etc.) that are omitted for simplicity.
As described in detail above, the received image data content may be based on image data acquired in low-altitude flight over a ground area. In particular, the image data content may be based on image data acquired at a set of imaging locations along the flight path, the set of imaging locations being located less than 20 meters above the ground area.
While certain features of the invention have been illustrated and described herein, many modifications, substitutions, changes, and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.
It will be appreciated that the embodiments described above are cited by way of example, and that various features thereof, as well as combinations of such features, may be changed and modified.
While various embodiments have been shown and described, it will be understood that there is no intent to limit the invention by such disclosure, but rather, is intended to cover all modifications and alternate constructions falling within the scope of the invention, as defined in the appended claims.
Claims (46)
1. A method of agricultural monitoring, the method comprising:
flying an airborne imaging sensor along a flight path over an agricultural area where crops are growing;
acquiring image data of a portion of the agricultural area with the on-board imaging sensor, wherein the acquiring of the image data is performed at a set of imaging locations along the flight path, the imaging locations capable of acquiring the image data at a sub-millimeter image resolution, the acquiring comprising:
mechanically rotating at least one optical component of the onboard imaging sensor relative to a carrying onboard platform, for compensating for motion of the onboard imaging sensor relative to the crop during acquisition; and
while the at least one optical component is rotating, for each frame of a plurality of frames of the image data: initiating a focusing process of the imaging sensor when an angle between an acquisition optical axis and a vertical axis is greater than 20°, and acquiring image data using vertical imaging when the angle between the acquisition optical axis and the vertical axis is less than 20°; and
transmitting, to an external system, image data content based on the image data acquired by the onboard imaging sensor.
2. The method of claim 1, comprising: sending the image data content to the external system to display agronomic image data based on the image data content to an agriculturist at a remote location, thereby enabling the agriculturist to remotely analyze an agricultural area.
3. The method of claim 1, wherein the flight path is a terrain following flight path.
4. The method of claim 1, wherein the acquiring comprises acquiring image data at the set of imaging locations while the onboard imaging sensor is flying along the imaging locations at a speed that is no less than 50% of an average speed of the onboard platform along the flight path.
5. The method of claim 4, wherein the acquiring comprises mechanically moving at least one component of the onboard imaging sensor relative to a carrying onboard platform for compensating for motion of the onboard imaging sensor relative to the crop during the acquiring.
6. The method of claim 1, wherein the acquiring comprises illuminating the crop during the acquiring to compensate for movement of the onboard imaging sensor relative to the crop during the acquiring.
7. The method of claim 1, wherein flying comprises flying the onboard imaging sensor along a flight path that extends over at least a first agricultural asset of a first owner and a second agricultural asset of a second owner other than the first owner, wherein the method comprises: acquiring first image data of a portion of the first agricultural asset and acquiring second image data of a portion of the second agricultural asset; generating first image data content based on the first image data and second image data content based on the second image data; and providing the first image data content to a first entity in a first message and the second image data content to a second entity in a second message.
8. The method of claim 1, wherein the acquiring comprises acquiring image data of a portion of the agricultural area inaccessible to the ground vehicle.
9. The method of claim 1, wherein flying comprises flying the imaging sensor by an agricultural aircraft configured for aerial application of crop protection products.
10. The method of claim 9, further comprising selecting aerial application parameters for aerial application of crop protection products by the agricultural aircraft based on the processing of the image data.
11. The method of claim 1, wherein the set of imaging locations along the flight path are located less than 20 meters above a top of the crop growing in the agricultural area.
12. The method of claim 1, wherein the transmitting is followed by a subsequent instance of the flying, acquiring, and transmitting, wherein the method further comprises planning a route for the subsequent instance of the flying based on the image data acquired in the previous instance.
13. The method of claim 1, further comprising applying computerized processing algorithms to the image data content for detecting leaf disease or indicating parasite effects on leaves in one or more plants in the agricultural area.
14. The method of claim 1, wherein said flying, acquiring and transmitting are repeated over a period of weeks, and wherein the method further comprises processing image data acquired at different times over said period of weeks for determining growth parameters of plants in the agricultural area.
15. The method of claim 1, further comprising applying computerized processing algorithms to the image data to identify selected agronomic significant data, and generating agronomic image data for transmission to a remote system based on the selected agronomic significant data.
16. The method of claim 15, further comprising applying computerized processing algorithms to the selected agronomic significant data to select a recipient of the agronomic image data from a plurality of possible recipients based on agronomic expertise of the possible recipients.
17. The method of claim 1, wherein a surveillance flight plan is developed for an airborne surveillance system prior to the flight, the surveillance flight plan including an acquisition location plan indicative of a plurality of imaging locations, wherein flying the onboard imaging sensor along the flight path over the agricultural area according to the surveillance flight plan is part of flying the airborne surveillance system.
18. The method of claim 1, wherein the flight path is a terrain following flight path; wherein flying comprises flying the imaging sensor by an agricultural aircraft configured for aerial application of crop protection products; wherein the set of imaging locations along the flight path are located less than 20 meters above a top of the crop growing in the agricultural area; wherein the acquiring comprises: (a) acquiring image data at the set of imaging locations while the onboard imaging sensor is flying along the imaging locations at a speed that is no less than 50% of an average speed of the onboard platform along the flight path; and (b) compensating for movement of the onboard imaging sensor relative to the crop during acquisition by illuminating the crop during acquisition and by mechanically moving at least one component of the onboard imaging sensor relative to the carrying onboard platform; wherein the sending comprises sending the image data content to the external system to display agronomic image data based on the image data content to an agriculturist at a remote location, thereby enabling the agriculturist to perform remote analysis of the agricultural area; wherein the method further comprises: developing a surveillance flight plan for an airborne surveillance system prior to the flight, the surveillance flight plan including an acquisition location plan indicative of a plurality of imaging locations, wherein flying the onboard imaging sensor along the flight path over the agricultural area according to the surveillance flight plan is part of flying the airborne surveillance system.
19. A method of agricultural monitoring, the method comprising:
formulating a surveillance flight plan for an airborne surveillance system, the surveillance flight plan including an acquisition location plan indicative of a plurality of imaging locations;
flying the airborne surveillance system, based on the surveillance flight plan, along a flight path over an agricultural area where crops are growing, wherein the agricultural area includes a plurality of fields where at least two types of crops are growing; acquiring image data of a portion of the agricultural area at a sub-millimeter image resolution during flight of the airborne surveillance system based on the acquisition location plan, the acquiring comprising:
mechanically rotating at least one optical component of an onboard imaging sensor relative to a carrying onboard platform, for compensating for movement of the onboard imaging sensor relative to the crops during acquisition; and
while the at least one optical component is rotating, for each frame of a plurality of frames of the image data: initiating a focusing process of the imaging sensor when an angle between an acquisition optical axis and a vertical axis is greater than 20°, and acquiring image data using vertical imaging when the angle between the acquisition optical axis and the vertical axis is less than 20°;
and transmitting image data content based on image data acquired by the on-board monitoring system to an external system.
20. The method of claim 19, wherein the monitoring flight plan is formulated by receiving monitoring requests associated with a plurality of independent entities, and comprising formulating the monitoring flight plan to indicate an imaging location of the crop of each of the plurality of independent entities.
21. The method of claim 19, wherein the formulating of the surveillance flight plan includes formulating different acquisition parameters for imaging locations associated with different varieties of crop.
22. A method of monitoring a surface area, the method comprising:
creating a surveillance flight plan for the airborne surveillance system, the surveillance flight plan including an acquisition location plan indicative of a plurality of imaging locations; flying the airborne surveillance system, based on the surveillance flight plan, along a flight path over a ground area;
acquiring image data of a portion of the ground area at a sub-millimeter image resolution during flight of the airborne surveillance system based on the acquisition location plan, the acquiring comprising:
mechanically rotating at least one optical component of an onboard imaging sensor relative to a carrying onboard platform, for compensating for movement of the onboard imaging sensor relative to the crop during acquisition; and
while the at least one optical component is rotating, for each frame of a plurality of frames of the image data: initiating a focusing process of the imaging sensor when an angle between an acquisition optical axis and a vertical axis is greater than 20°, and acquiring image data using vertical imaging when the angle between the acquisition optical axis and the vertical axis is less than 20°; and
transmitting, to an external system, image data content based on the image data acquired by the airborne surveillance system.
23. The method of claim 22, wherein the ground area comprises different types of soil, and wherein the acquiring comprises acquiring image data of different locations in the ground area to generate a soil map of the ground area.
24. The method of claim 22, wherein the acquiring comprises acquiring image data indicative of material composition at different locations in the ground area.
25. The method of claim 22, wherein the acquiring comprises acquiring image data indicative of agricultural disaster recovery levels at different locations in the ground area.
26. An agricultural monitoring system, comprising:
an imaging sensor configured and operable to acquire image data of a portion of an agricultural area where crops are growing at a sub-millimeter image resolution when the imaging sensor is airborne;
a communication module configured and operable to transmit image data content based on image data acquired by the onboard imaging sensor to an external system;
a connector for connecting the imaging sensor and the communication module to an airborne platform; and
a motor operable to mechanically rotate at least one optical component of the imaging sensor relative to the airborne platform to compensate for movement of the imaging sensor relative to the crop during acquisition,
wherein the imaging sensor is configured and operable to: (a) initiate a focusing process while rotating the at least one optical component when the collection optical axis is at an angle greater than 20 ° to the vertical axis, and (b) collect image data using vertical imaging when the collection optical axis is at an angle less than 20 ° to the vertical axis.
27. The agricultural monitoring system according to claim 26, further comprising an onboard platform that flies the onboard imaging sensor along a flight path over an agricultural area.
28. The agricultural monitoring system according to claim 26, wherein the imaging sensor is configured and operable to acquire image data less than 20 meters above a top of a crop growing in the agricultural area.
29. The agricultural monitoring system according to claim 26, wherein the imaging sensor is configured and operable to acquire image data when flight speeds exceed 10 m/s.
30. The agricultural monitoring system according to claim 29, comprising at least one mechanical coupler coupling at least one component of the imaging sensor to the motor, the at least one component of the imaging sensor being mechanically moved relative to the airborne platform by movement of the motor while image data is acquired by the imaging sensor.
31. The agricultural monitoring system according to claim 29, further comprising an illumination unit configured and operable to illuminate the crop during the acquisition of image data by the imaging sensor.
32. The agricultural monitoring system according to claim 26, further comprising a processor configured and operable to process image data content for detecting leaf disease or indicating parasite effects on leaves in one or more plants in the agricultural area.
33. The agricultural monitoring system according to claim 26, further comprising a processor configured and operable to process the image data content to identify selected agronomic significant data, and based on the selected agronomic significant data, generate agronomic image data for transmission to a remote system.
34. A method of agricultural monitoring, the method comprising:
receiving image data content based on agricultural area image data, wherein the image data is sub-millimeter image resolution image data acquired by an onboard imaging sensor at a set of imaging locations along a flight path extending over an agricultural area, wherein the agricultural area includes a plurality of fields in which at least two types of crops are growing; processing image data content to generate agronomic data comprising agronomic image data, wherein the image data is acquired by:
mechanically rotating at least one optical component of the onboard imaging sensor relative to a carrying onboard platform, for compensating for motion of the onboard imaging sensor relative to the crops during acquisition; and
while the at least one optical component is rotating, for each frame of a plurality of frames of the image data: initiating a focusing process of the imaging sensor when an angle between an acquisition optical axis and a vertical axis is greater than 20°, and acquiring image data using vertical imaging when the angle between the acquisition optical axis and the vertical axis is less than 20°; and
transmitting the agronomic data to an end user remote system.
35. The method of claim 34, wherein the processing comprises analyzing the image data content to identify selected agronomically significant data within the image data content; and processing the agronomic critical data to provide the agronomic data.
36. The method of claim 34, wherein the processing comprises applying computerized processing algorithms to the image data content for detecting leaf disease or indicating parasite effects on leaves in one or more plants in the agricultural area.
37. The method of claim 34, wherein the receiving comprises receiving image data content of the agricultural area acquired on different days, and wherein the processing comprises processing the image data content to determine growth parameters of plants in the agricultural area.
38. The method of claim 34, further comprising applying computerized processing algorithms to the agronomic data to select a recipient of the agronomic image data from a plurality of potential recipients based on agronomic expertise of the potential recipients.
39. The method of claim 34, wherein the image data content comprises first image data content of a first agricultural asset of a first owner, and second image data content of a second agricultural asset of a second owner other than the first owner; wherein the sending comprises sending the first image data content in a first message and sending the second data content in a second message.
40. The method of claim 34, wherein the image data content is based on image data acquired at a set of imaging locations along a flight path, the imaging locations being less than 20 meters above a top of crops growing in the agricultural area.
41. A method of monitoring a surface area, the method comprising:
receiving image data content based on ground area image data, wherein the image data is sub-millimeter image resolution image data acquired by an onboard imaging sensor at a set of imaging locations along a flight path extending above a ground area, wherein the image data is acquired by:
mechanically rotating at least one optical component of the onboard imaging sensor relative to a carrying onboard platform, for compensating for movement of the onboard imaging sensor during acquisition; and
while the at least one optical component is rotating, for each frame of a plurality of frames of the image data: initiating a focusing process of the imaging sensor when an angle between an acquisition optical axis and a vertical axis is greater than 20°, and acquiring image data using vertical imaging when the angle between the acquisition optical axis and the vertical axis is less than 20°;
processing the image data content to generate ground data comprising ground image data; and the number of the first and second electrodes,
the terrestrial data is transmitted to the end-user remote system.
42. The method of claim 41, wherein said processing comprises analyzing said image data content to identify selected agronomically significant data within said image data content; and processing the agronomic critical data to provide agronomic data.
43. The method of claim 41, wherein the processing comprises analyzing the image data content to identify selected ground-significant data within the image data content; and processing the ground-significant data to provide ground data indicative of soil quality of the ground area.
44. The method of claim 41, wherein the processing comprises applying computerized processing algorithms to the image data content to distinguish areas of the ground area having different types of soil.
45. The method of claim 41, wherein the image data content is based on image data acquired at a set of imaging locations along the flight path, the imaging locations being less than 20 meters above the ground area.
46. The method of claim 41, wherein the processing comprises determining a material composition in the ground area, and generating the ground data based on a result of the determining.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IL236606 | 2015-01-11 | ||
IL236606A IL236606B (en) | 2015-01-11 | 2015-01-11 | Systems and methods for agricultural monitoring |
PCT/IL2015/051169 WO2016110832A1 (en) | 2015-01-11 | 2015-12-02 | Systems and methods for agricultural monitoring |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107426958A CN107426958A (en) | 2017-12-01 |
CN107426958B true CN107426958B (en) | 2020-08-25 |
Family
ID=56355589
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201580073902.0A Active CN107426958B (en) | 2015-01-11 | 2015-12-02 | Agricultural monitoring system and method |
Country Status (11)
Country | Link |
---|---|
US (2) | US10182214B2 (en) |
EP (1) | EP3242544A4 (en) |
CN (1) | CN107426958B (en) |
AU (1) | AU2015376053B2 (en) |
BR (1) | BR112017014855B1 (en) |
CA (1) | CA2973319C (en) |
EA (1) | EA037035B1 (en) |
IL (1) | IL236606B (en) |
MX (1) | MX2017009061A (en) |
WO (1) | WO2016110832A1 (en) |
ZA (1) | ZA201705448B (en) |
US10123475B2 (en) | 2017-02-03 | 2018-11-13 | Cnh Industrial America Llc | System and method for automatically monitoring soil surface roughness |
US10909367B2 (en) * | 2017-03-02 | 2021-02-02 | Basecamp Networks, LLC | Automated diagnosis and treatment of crop infestations |
US11319067B2 (en) * | 2017-03-12 | 2022-05-03 | Nileworks | Drone for capturing images of field crops |
JP7218722B2 (en) * | 2017-04-05 | 2023-02-07 | ソニーグループ株式会社 | Information processing device, information processing method, program |
CN108688801A (en) * | 2017-04-05 | 2018-10-23 | 中交遥感载荷(北京)科技有限公司 | Unmanned aerial vehicle with a color rice disease image identifier for preventing and controlling rice sheath blight |
CN110914780B (en) * | 2017-04-28 | 2023-08-08 | 株式会社OPTiM | Unmanned aerial vehicle operation plan creation system, method, and program |
US10262206B2 (en) | 2017-05-16 | 2019-04-16 | Cnh Industrial America Llc | Vision-based system for acquiring crop residue data and related calibration methods |
US10438302B2 (en) | 2017-08-28 | 2019-10-08 | The Climate Corporation | Crop disease recognition and yield estimation |
US10514554B2 (en) * | 2017-09-30 | 2019-12-24 | Pixart Imaging Inc. | Optical motion detecting device for a flight vehicle |
US10423850B2 (en) * | 2017-10-05 | 2019-09-24 | The Climate Corporation | Disease recognition from images having a large field of view |
LT6619B (en) * | 2017-10-10 | 2019-05-10 | Robotopia, UAB | Spraying device for liquid means of chemical treatment with replaceable liquid subsystem and spraying systems on the basis thereof |
US10631477B2 (en) | 2017-10-30 | 2020-04-28 | Valmont Industries, Inc. | System and method for irrigation management |
US10779458B2 (en) * | 2017-12-01 | 2020-09-22 | International Business Machines Corporation | Monitoring aerial application tasks and recommending corrective actions |
US10705204B2 (en) * | 2017-12-08 | 2020-07-07 | International Business Machines Corporation | Crop classification and growth tracking with synthetic aperture radar |
WO2019121097A1 (en) * | 2017-12-21 | 2019-06-27 | Basf Se | Apparatus for determining agricultural relevant information |
US10621434B2 (en) * | 2018-01-25 | 2020-04-14 | International Business Machines Corporation | Identification and localization of anomalous crop health patterns |
US10607406B2 (en) * | 2018-01-25 | 2020-03-31 | General Electric Company | Automated and adaptive three-dimensional robotic site surveying |
US11062516B2 (en) | 2018-02-07 | 2021-07-13 | Iunu, Inc. | Augmented reality based horticultural care tracking |
US10769466B2 (en) | 2018-02-20 | 2020-09-08 | International Business Machines Corporation | Precision aware drone-based object mapping based on spatial pattern recognition |
CN110309933A (en) * | 2018-03-23 | 2019-10-08 | 广州极飞科技有限公司 | Plant cultivation data measurement method, working route planning method, device and system |
US10719709B2 (en) * | 2018-04-06 | 2020-07-21 | Cnh Industrial America Llc | Augmented reality for plant stand management |
US10679056B2 (en) * | 2018-04-06 | 2020-06-09 | Cnh Industrial America Llc | Augmented reality for plant stand management |
CN114982602A (en) * | 2018-05-21 | 2022-09-02 | 自主枢转有限公司 | System and method for converting irrigation hub into social network of autonomous AI agricultural robot |
EP3574751A1 (en) * | 2018-05-28 | 2019-12-04 | Bayer Animal Health GmbH | Apparatus for fly management |
CN108594856B (en) * | 2018-05-29 | 2024-08-06 | 农业部南京农业机械化研究所 | Multi-source information fusion intelligent decision-making autonomous flight plant protection unmanned aerial vehicle and control method |
US11144775B2 (en) * | 2018-06-25 | 2021-10-12 | Cnh Industrial Canada, Ltd. | System and method for illuminating the field of view of a vision-based sensor mounted on an agricultural machine |
US11094193B2 (en) * | 2018-06-28 | 2021-08-17 | Intel Corporation | Real-time vehicle-based data gathering |
DE102018120753A1 (en) * | 2018-08-24 | 2020-02-27 | naiture GmbH & Co. KG | Mobile analysis and processing device |
US11166404B2 (en) | 2018-09-02 | 2021-11-09 | FarmX Inc. | Systems and methods for virtual agronomic sensing |
US10660277B2 (en) * | 2018-09-11 | 2020-05-26 | Pollen Systems Corporation | Vine growing management method and apparatus with autonomous vehicles |
US10779476B2 (en) * | 2018-09-11 | 2020-09-22 | Pollen Systems Corporation | Crop management method and apparatus with autonomous vehicles |
US11108849B2 (en) | 2018-12-03 | 2021-08-31 | At&T Intellectual Property I, L.P. | Global internet of things (IOT) quality of service (QOS) realization through collaborative edge gateways |
CN109711272A (en) * | 2018-12-04 | 2019-05-03 | 量子云未来(北京)信息科技有限公司 | Intelligent crop management method, system, electronic device and storage medium |
US20200217830A1 (en) * | 2019-01-08 | 2020-07-09 | AgroScout Ltd. | Autonomous crop monitoring system and method |
EP3679776A1 (en) * | 2019-01-11 | 2020-07-15 | GE Aviation Systems Limited | Method of collecting soil data via an uav |
US10659144B1 (en) | 2019-01-31 | 2020-05-19 | At&T Intellectual Property I, L.P. | Management of massively distributed internet of things (IOT) gateways based on software-defined networking (SDN) via fly-by master drones |
JP2020166584A (en) * | 2019-03-29 | 2020-10-08 | トヨタ自動車株式会社 | Image information collection system and vehicle |
CN110070417A (en) * | 2019-04-19 | 2019-07-30 | 北方天途航空技术发展(北京)有限公司 | Agricultural plant protection UAV operation management system and method |
CN110050619A (en) * | 2019-05-10 | 2019-07-26 | 广西润桂科技有限公司 | UAV spraying method for sugarcane disease and pest control based on precision meteorological support technology |
US10957036B2 (en) | 2019-05-17 | 2021-03-23 | Ceres Imaging, Inc. | Methods and systems for crop pest management utilizing geospatial images and microclimate data |
JP6765738B1 (en) * | 2019-06-21 | 2020-10-07 | 株式会社センシンロボティクス | Flight management server and flight management system for unmanned aerial vehicles |
JP7415348B2 (en) * | 2019-07-03 | 2024-01-17 | ソニーグループ株式会社 | Information processing equipment, information processing method, program, sensing system |
TWI760782B (en) * | 2019-07-08 | 2022-04-11 | 國立臺灣大學 | System and method for orchard recognition on geographic area |
CN110691181B (en) * | 2019-09-09 | 2021-10-08 | 苏州臻迪智能科技有限公司 | Camera equipment, camera and unmanned aerial vehicle |
AU2019466291A1 (en) * | 2019-09-20 | 2022-03-31 | Seeing Systems Pty Ltd | Systems and methods for gathering data relating to crops and for managing crop growing operations |
CN110647935B (en) * | 2019-09-23 | 2023-07-25 | 云南电网有限责任公司电力科学研究院 | Method and device for predicting tree growth trend in power transmission line area |
US11357153B2 (en) * | 2019-12-11 | 2022-06-14 | Cnh Industrial Canada, Ltd. | System and method for determining soil clod size using captured images of a field |
CN111114814B (en) * | 2020-01-16 | 2022-04-08 | 四川川测研地科技有限公司 | UAV-based adaptive-focus gimbal for linear engineering objects |
US11720980B2 (en) | 2020-03-25 | 2023-08-08 | Iunu, Inc. | Crowdsourced informatics for horticultural workflow and exchange |
WO2021221704A1 (en) * | 2020-04-29 | 2021-11-04 | Pollen Systems Corporation | Crop management method and apparatus with autonomous vehicles |
US20220036537A1 (en) * | 2020-07-28 | 2022-02-03 | The Board Of Trustees Of The University Of Alabama | Systems and methods for detecting blight and code violations in images |
US11464179B2 (en) | 2020-07-31 | 2022-10-11 | FarmX Inc. | Systems providing irrigation optimization using sensor networks and soil moisture modeling |
US12131393B2 (en) | 2020-10-02 | 2024-10-29 | Ecoation Innovative Solutions Inc. | Platform for real-time identification and resolution of spatial production anomalies in agriculture |
US11666004B2 (en) | 2020-10-02 | 2023-06-06 | Ecoation Innovative Solutions Inc. | System and method for testing plant genotype and phenotype expressions under varying growing and environmental conditions |
CN112129757A (en) * | 2020-10-15 | 2020-12-25 | 安阳工学院 | Adaptive detection system and method for plant diseases and insect pests |
US11719681B2 (en) | 2020-10-30 | 2023-08-08 | International Business Machines Corporation | Capturing and analyzing data in a drone enabled environment for ecological decision making |
US11719682B2 (en) | 2020-10-30 | 2023-08-08 | International Business Machines Corporation | Capturing and analyzing data in a drone enabled environment for ecological decision making |
US20220414795A1 (en) * | 2020-11-18 | 2022-12-29 | Hijo Ip Pte Ltd. | Crop disease prediction and treatment based on artificial intelligence (ai) and machine learning (ml) models |
CN112612299B (en) * | 2020-12-01 | 2023-05-23 | 北京麦飞科技有限公司 | Variable-rate plant protection method for a miniature UAV swarm |
US20220222819A1 (en) * | 2021-01-11 | 2022-07-14 | Agtonomy | Crop view and irrigation monitoring |
CN112836725A (en) * | 2021-01-11 | 2021-05-25 | 中国农业科学院农业信息研究所 | A Weakly Supervised LSTM Recurrent Neural Network for Rice Field Recognition Based on Time Series Remote Sensing Data |
US12272136B2 (en) * | 2021-04-30 | 2025-04-08 | Cnh Industrial America Llc | Agricultural systems and methods using image quality metrics for vision-based detection of surface conditions |
US11941880B2 (en) * | 2021-06-02 | 2024-03-26 | Ping An Technology (Shenzhen) Co., Ltd. | System and method for image-based crop identification |
CN114066887B (en) * | 2022-01-11 | 2022-04-22 | 安徽高哲信息技术有限公司 | Rice chalkiness area detection method, device, equipment and storage medium |
CN114402995B (en) * | 2022-01-19 | 2023-03-24 | 北京市农林科学院智能装备技术研究中心 | Air-ground coordinated corn detasseling method, system and aerial unmanned detasseling machine |
CN114461741B (en) * | 2022-01-24 | 2022-11-04 | 北京师范大学 | Monitoring sample point layout method and device |
CN114994036B (en) * | 2022-05-26 | 2025-01-24 | 浙江大学 | Rice bacterial blight disease severity prediction method and system based on multiple phenotypic parameters |
CN116434126B (en) * | 2023-06-13 | 2023-09-19 | 清华大学 | Micro-vibration speed detection method and device for crops |
CN118603050B (en) * | 2024-08-08 | 2024-10-29 | 广东新禾道信息科技有限公司 | High-standard farmland monitoring method and system based on multi-source remote sensing technology |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6422508B1 (en) * | 2000-04-05 | 2002-07-23 | Galileo Group, Inc. | System for robotic control of imaging data having a steerable gimbal mounted spectral sensor and methods |
CN103523224A (en) * | 2013-10-31 | 2014-01-22 | 无锡同春新能源科技有限公司 | Unmanned aerial vehicle with colorful rice disease image recognition instrument and used for preventing and controlling rice bacterial leaf blight |
DE202014002338U1 (en) * | 2014-03-15 | 2014-05-14 | Volker Jung | Largely autonomous flying UAV helicopter drone for application of pesticides in agriculture, forestry and viticulture (up to a maximum take-off weight of 150kg) |
CN104199425A (en) * | 2014-09-15 | 2014-12-10 | 中国农业科学院农业信息研究所 | Intelligent agricultural monitoring pre-warning system and method |
Family Cites Families (76)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5517193A (en) | 1993-04-30 | 1996-05-14 | International Business Machines Corporation | Meteorological workstation |
US5467271A (en) * | 1993-12-17 | 1995-11-14 | Trw, Inc. | Mapping and analysis system for precision farming applications |
US5798786A (en) | 1996-05-07 | 1998-08-25 | Recon/Optical, Inc. | Electro-optical imaging detector array for a moving vehicle which includes two axis image motion compensation and transfers pixels in row directions and column directions |
US6266063B1 (en) | 1997-10-20 | 2001-07-24 | Baron Services, Inc. | Real-time three-dimensional weather display method and weathercast system |
JP3932222B2 (en) | 1998-02-23 | 2007-06-20 | ヤンマー農機株式会社 | Precision farming |
FI112402B (en) | 1999-10-28 | 2003-11-28 | Diware Oy | Method for determining the characteristics of tree stocks and computer programs for performing the procedure |
IL156424A0 (en) | 2000-12-15 | 2004-01-04 | Nooly Technologies Ltd | Location-based weather nowcast system and method |
US7149366B1 (en) * | 2001-09-12 | 2006-12-12 | Flight Landata, Inc. | High-definition hyperspectral imaging system |
RU2207504C1 (en) | 2001-12-06 | 2003-06-27 | Закрытое акционерное общество "ЦКМ" | Method for large-scale aerial photography |
US6653947B2 (en) | 2002-02-20 | 2003-11-25 | Honeywell International Inc. | Apparatus for the display of weather and terrain information on a single display |
US20090297049A1 (en) | 2005-07-07 | 2009-12-03 | Rafael Advanced Defense Systems Ltd. | Detection of partially occluded targets in ladar images |
US20070188605A1 (en) | 2006-02-14 | 2007-08-16 | Deere & Company, A Delaware Corporation | Irrigation remote sensing system |
US7417210B2 (en) | 2006-06-30 | 2008-08-26 | Northrop Grumman Corporation | Multi-spectral sensor system and methods |
US7917346B2 (en) | 2008-02-19 | 2011-03-29 | Harris Corporation | Geospatial modeling system providing simulated tree trunks and branches for groups of tree crown vegetation points and related methods |
US9274250B2 (en) | 2008-11-13 | 2016-03-01 | Saint Louis University | Apparatus and method for providing environmental predictive indicators to emergency response managers |
US8577518B2 (en) * | 2009-05-27 | 2013-11-05 | American Aerospace Advisors, Inc. | Airborne right of way autonomous imager |
US8537337B2 (en) | 2009-12-22 | 2013-09-17 | Weyerhaeuser Nr Company | Method and apparatus for analyzing tree canopies with LiDAR data |
JP5722349B2 (en) * | 2010-01-29 | 2015-05-20 | Thomson Licensing | Block-based interleaving |
AU2011213545A1 (en) | 2010-02-02 | 2012-08-16 | Australian Rain Technologies Pty Limited | Estimation of weather modification effects |
US9752932B2 (en) * | 2010-03-10 | 2017-09-05 | Drexel University | Tunable electro-optic filter stack |
MY173920A (en) | 2010-06-04 | 2020-02-27 | Univ Malaysia Perlis | A flying apparatus for aerial agricultural application |
US8538695B2 (en) | 2010-06-30 | 2013-09-17 | Weyerhaeuser Nr Company | System and method for analyzing trees in LiDAR data using views |
US8768667B2 (en) * | 2010-10-25 | 2014-07-01 | Trimble Navigation Limited | Water erosion management incorporating topography, soil type, and weather statistics |
US8897483B2 (en) | 2010-11-09 | 2014-11-25 | Intelescope Solutions Ltd. | System and method for inventorying vegetal substance |
WO2012092554A1 (en) | 2010-12-30 | 2012-07-05 | Utility Risk Management Corporation, Llc | Method for locating vegetation having a potential to impact a structure |
US9756844B2 (en) | 2011-05-13 | 2017-09-12 | The Climate Corporation | Method and system to map biological pests in agricultural fields using remotely-sensed data for field scouting and targeted chemical application |
US8775081B2 (en) | 2011-09-26 | 2014-07-08 | Weyerhaeuser Nr Company | Method and apparatus for sorting LiDAR data |
WO2013056861A1 (en) | 2011-10-21 | 2013-04-25 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Optical device and method for measuring a complexly formed object |
WO2013078328A2 (en) | 2011-11-22 | 2013-05-30 | Precision Planting Llc | Stalk sensor apparatus, systems, and methods |
US9819964B2 (en) | 2012-05-04 | 2017-11-14 | Environmental Systems Research Institute, Inc. | Limited error raster compression |
US10520482B2 (en) | 2012-06-01 | 2019-12-31 | Agerpoint, Inc. | Systems and methods for monitoring agricultural products |
BR112015002799B1 (en) | 2012-08-10 | 2022-08-30 | Climate Llc | METHOD OF MONITORING AN AGRICULTURAL IMPLEMENT |
US9063544B2 (en) | 2012-09-19 | 2015-06-23 | The Boeing Company | Aerial forest inventory system |
US20140089045A1 (en) * | 2012-09-27 | 2014-03-27 | Superior Edge, Inc. | Methods, apparatus and systems for determining stand population, stand consistency and stand quality in an agricultural crop and alerting users |
CA2894568C (en) | 2012-12-17 | 2019-12-10 | Precision Planting Llc | Plot placement systems and methods |
US20140316614A1 (en) * | 2012-12-17 | 2014-10-23 | David L. Newman | Drone for collecting images and system for categorizing image data |
US10327393B2 (en) | 2013-03-07 | 2019-06-25 | Blue River Technology Inc. | Modular precision agriculture system |
US20140312165A1 (en) * | 2013-03-15 | 2014-10-23 | Armen Mkrtchyan | Methods, apparatus and systems for aerial assessment of ground surfaces |
WO2014186810A1 (en) | 2013-05-17 | 2014-11-20 | Precision Planting Llc | System for soil moisture monitoring |
US8849523B1 (en) * | 2013-05-20 | 2014-09-30 | Elwha Llc | Systems and methods for detecting soil characteristics |
US9767521B2 (en) | 2013-08-30 | 2017-09-19 | The Climate Corporation | Agricultural spatial data processing systems and methods |
WO2015102731A2 (en) * | 2013-10-18 | 2015-07-09 | Aerovironment, Inc. | Privacy shield for unmanned aerial systems |
CN103523226B (en) | 2013-10-31 | 2015-09-30 | 无锡同春新能源科技有限公司 | Unmanned aerial vehicle with a color rice disease image identifier for preventing rice sheath blight |
CN203528823U (en) | 2013-10-31 | 2014-04-09 | 无锡同春新能源科技有限公司 | Rice bacterial leaf blight preventing unmanned aerial vehicle with colored rice disease image identifier |
CN203528822U (en) | 2013-10-31 | 2014-04-09 | 无锡同春新能源科技有限公司 | Rice sheath blight disease preventing unmanned aerial vehicle with colored rice disease image identifier |
WO2015075700A1 (en) | 2013-11-25 | 2015-05-28 | First Resource Management Group Inc. | Apparatus for and method of forest-inventory management |
EP3097687A4 (en) | 2014-01-22 | 2017-04-26 | Izak Van Cruyningen | Forward motion compensated flight path |
US9974226B2 (en) | 2014-04-21 | 2018-05-22 | The Climate Corporation | Generating an agriculture prescription |
US9813601B2 (en) | 2014-05-06 | 2017-11-07 | Urugus S.A. | Imaging device for scenes in apparent motion |
CN104050649A (en) * | 2014-06-13 | 2014-09-17 | 北京农业信息技术研究中心 | Agricultural remote sensing system |
US9641736B2 (en) | 2014-06-20 | 2017-05-02 | nearmap australia pty ltd. | Wide-area aerial camera systems |
US9709987B2 (en) | 2014-07-31 | 2017-07-18 | Elwha Llc | Systems and methods for deactivating plant material outside of a growing region |
US9717178B1 (en) | 2014-08-08 | 2017-08-01 | The Climate Corporation | Systems and method for monitoring, controlling, and displaying field operations |
UA123573C2 (en) | 2014-08-22 | 2021-04-28 | Зе Клаймет Корпорейшн | Methods for agronomic and agricultural monitoring using unmanned aerial systems |
US10109024B2 (en) | 2014-09-05 | 2018-10-23 | The Climate Corporation | Collecting data to generate an agricultural prescription |
US9519861B1 (en) | 2014-09-12 | 2016-12-13 | The Climate Corporation | Generating digital models of nutrients available to a crop over the course of the crop's development based on weather and soil data |
CA2964275A1 (en) | 2014-10-21 | 2016-04-28 | Tolo, Inc. | Remote detection of insect infestation |
WO2016099723A2 (en) * | 2014-11-12 | 2016-06-23 | SlantRange, Inc. | Systems and methods for aggregating and facilitating the display of spatially variable geographic data acquired by airborne vehicles |
IL236606B (en) | 2015-01-11 | 2020-09-30 | Gornik Amihay | Systems and methods for agricultural monitoring |
USD768626S1 (en) | 2015-03-05 | 2016-10-11 | The Climate Corporation | Data processing device |
USD783609S1 (en) | 2015-05-07 | 2017-04-11 | The Climate Corporation | Data storage device |
US9609112B2 (en) | 2015-05-19 | 2017-03-28 | The Climate Corporation | Protective connector and applications thereof |
CN105116407B (en) | 2015-06-26 | 2017-08-08 | 北京师范大学 | Method for measuring vegetation coverage using a handheld laser rangefinder |
US9969492B2 (en) | 2015-09-04 | 2018-05-15 | Nutech Ventures | Crop height estimation with unmanned aerial vehicles |
US10025983B2 (en) | 2015-09-21 | 2018-07-17 | The Climate Corporation | Ponding water detection on satellite imagery |
US10046187B2 (en) | 2015-10-09 | 2018-08-14 | Leonard E. Doten | Wildfire aerial fighting system utilizing lidar |
US9721181B2 (en) | 2015-12-07 | 2017-08-01 | The Climate Corporation | Cloud detection on remote sensing imagery |
CN105527969B (en) | 2015-12-17 | 2018-07-06 | 中国科学院测量与地球物理研究所 | UAV-based mountain orchard survey and monitoring method |
WO2017106874A1 (en) | 2015-12-18 | 2017-06-22 | Intellifarm, Inc. | Autonomous integrated farming system |
WO2017167793A1 (en) | 2016-03-31 | 2017-10-05 | Husqvarna Ab | Forestry management device |
US9881214B1 (en) | 2016-07-13 | 2018-01-30 | The Climate Corporation | Generating pixel maps from non-image data and difference metrics for pixel maps |
CN106408578A (en) | 2016-09-22 | 2017-02-15 | 北京数字绿土科技有限公司 | Single-tree segmentation method and device |
US10028451B2 (en) | 2016-11-16 | 2018-07-24 | The Climate Corporation | Identifying management zones in agricultural fields and generating planting plans for the zones |
US10204270B2 (en) | 2016-11-17 | 2019-02-12 | Fruitspec Ltd | Method and system for crop yield estimation |
CN107238574A (en) | 2017-06-07 | 2017-10-10 | 江苏大学 | Cotton-targeted crop growth detection and fertilization diagnosis method |
US11391820B2 (en) | 2019-04-26 | 2022-07-19 | Waymo LLC | Mirrors to extend sensor field of view in self-driving vehicles |
2015
- 2015-01-11 IL IL236606A patent/IL236606B/en active IP Right Grant
- 2015-12-02 EP EP15876753.3A patent/EP3242544A4/en not_active Ceased
- 2015-12-02 BR BR112017014855-2A patent/BR112017014855B1/en active IP Right Grant
- 2015-12-02 CA CA2973319A patent/CA2973319C/en active Active
- 2015-12-02 WO PCT/IL2015/051169 patent/WO2016110832A1/en active Application Filing
- 2015-12-02 MX MX2017009061A patent/MX2017009061A/en unknown
- 2015-12-02 US US15/542,853 patent/US10182214B2/en active Active
- 2015-12-02 AU AU2015376053A patent/AU2015376053B2/en active Active
- 2015-12-02 CN CN201580073902.0A patent/CN107426958B/en active Active
- 2015-12-02 EA EA201791589A patent/EA037035B1/en unknown
2017
- 2017-08-11 ZA ZA2017/05448A patent/ZA201705448B/en unknown
2018
- 2018-11-28 US US16/203,558 patent/US11050979B2/en active Active
Non-Patent Citations (1)
Title |
---|
Hyperspectral kit goes airborne; http://optics.org/news/4/7/12; 2013-07-05; pp. 1-3 *
Also Published As
Publication number | Publication date |
---|---|
US20170374323A1 (en) | 2017-12-28 |
AU2015376053B2 (en) | 2019-07-11 |
ZA201705448B (en) | 2018-07-25 |
CN107426958A (en) | 2017-12-01 |
BR112017014855B1 (en) | 2021-06-08 |
EP3242544A1 (en) | 2017-11-15 |
CA2973319C (en) | 2020-10-13 |
CA2973319A1 (en) | 2016-07-14 |
MX2017009061A (en) | 2018-03-15 |
US11050979B2 (en) | 2021-06-29 |
EP3242544A4 (en) | 2018-08-22 |
IL236606B (en) | 2020-09-30 |
US20190253673A1 (en) | 2019-08-15 |
WO2016110832A1 (en) | 2016-07-14 |
BR112017014855A2 (en) | 2018-01-09 |
EA201791589A1 (en) | 2017-12-29 |
EA037035B1 (en) | 2021-01-28 |
AU2015376053A1 (en) | 2017-08-24 |
US10182214B2 (en) | 2019-01-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107426958B (en) | Agricultural monitoring system and method | |
Mukherjee et al. | A survey of unmanned aerial sensing solutions in precision agriculture | |
Krishna | Agricultural drones: a peaceful pursuit | |
CN107148633B (en) | Method for agronomic and agricultural monitoring using unmanned aerial vehicle system | |
Pajares | Overview and current status of remote sensing applications based on unmanned aerial vehicles (UAVs) | |
Katsigiannis et al. | An autonomous multi-sensor UAV system for reduced-input precision agriculture applications | |
Zhang et al. | The application of small unmanned aerial systems for precision agriculture: a review | |
US12025602B2 (en) | Autonomous crop monitoring system and method | |
CN111225855A | Unmanned aerial vehicle |
Swain et al. | Rice crop monitoring with unmanned helicopter remote sensing images | |
US20180348760A1 (en) | Automatic Change Detection System | |
Tahir et al. | Application of unmanned aerial vehicles in precision agriculture | |
AU2019466291A1 (en) | Systems and methods for gathering data relating to crops and for managing crop growing operations | |
Chavez et al. | A decade of unmanned aerial systems in irrigated agriculture in the Western US | |
WO2011160159A1 (en) | A system and a method for generating a spectral image for a plot of land | |
Rathod et al. | Autonomous aerial system (UAV) for sustainable agriculture: a review | |
Yang | Remote sensing technologies for crop disease and pest detection | |
do Amaral et al. | Application of drones in agriculture | |
Ivošević et al. | A drone view for agriculture | |
Ristorto et al. | Mission planning for the estimation of the field coverage of unmanned aerial systems in monitoring mission in precision farming | |
Izere | Plant Height Estimation Using RTK-GNSS Enabled Unmanned Aerial Vehicle (UAV) Photogrammetry | |
Jackson et al. | How remote sensing is offering complementing and diverging opportunities for precision agriculture users and researchers | |
Ruwanpathirana et al. | Assessment of the Optimal Flight Time of RGB Image Based Unmanned Aerial Vehicles for Crop Monitoring | |
Percival et al. | Potential for commercial unmanned aerial vehicle use in wild blueberry production | |
Bhargaw et al. | Chapter-3 Unmanned Aerial Vehicles and Its Application in Agriculture |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||