Disclosure of Invention
In order to solve the problems in the prior art, the invention provides inspection equipment which runs on a track in an agricultural area, wherein the track is a single track deployed along the crop planting positions in the agricultural area, and the inspection equipment comprises:
a binocular camera module, which includes:
a first camera for acquiring a first image of the crops;
a second camera for acquiring a second image of the crops;
an image processing module connected with the binocular camera module and used for processing the first image and the second image to obtain the actual size of a target feature of the crops and growth information of a non-target feature of the crops;
and a wireless transmission module connected with the image processing module and used for transmitting growth condition information of the crops to a cloud.
Preferably, the image processing module further comprises a ranging module;
the ranging module includes:
a target determining unit for determining target features of the crops appearing simultaneously in the first image and the second image, and for detecting imaging sizes of the target features;
and a depth calculating unit connected with the target determining unit and used for calculating the depth of a target feature from its imaging size and determining the actual size of the target feature according to its depth.
Preferably, the image processing module further comprises an image stitching module for stitching the first image and the second image to obtain a stitched image, wherein the image information of the stitched image is convolutionally extracted by a convolutional neural network of the image processing module to obtain a non-target feature map of the crops in the stitched image; network parameters of a fully connected layer, a Softmax classifier and bounding-box regression are then trained and adjusted by a transfer learning method, and a detection result of the non-target features is output to obtain the growth information of the non-target features of the crops.
Preferably, the binocular camera module is further used for detecting track information of the running track of the inspection equipment;
the inspection apparatus further includes:
The control module is connected with the binocular camera module and is used for at least one of the following:
according to the track information of the running track, controlling the inspection equipment to turn when the track information indicates that the inspection equipment needs to turn on the track;
and according to the track information of the running track, controlling the inspection equipment to climb when the track information indicates in advance that the inspection equipment needs to climb a slope on the track.
Preferably, the inspection apparatus further comprises:
The inspection mode selection module is connected with the control module;
the inspection mode selection module comprises:
the daily inspection mode module is used for periodically inspecting the crops at scheduled times;
The temporary inspection mode module is used for inspecting crops in severe weather.
The invention also provides an inspection method carried out by using the above inspection equipment, comprising the following steps:
Driving the inspection equipment to run on a single track laid in a crop planting position area;
respectively acquiring a first image and a second image of the crops by using a first camera and a second camera of the rotatable binocular camera module;
The first image and the second image are processed to obtain the actual size of the target feature of the crop and the growth information of the non-target feature of the crop;
and transmitting the growth condition information of the crops to a cloud by utilizing the wireless transmission module.
Preferably, the processing the first image and the second image to obtain the actual size of the target feature of the crop and the growth information of the non-target feature of the crop includes:
determining target features of a target crop appearing simultaneously in the first image and the second image, and detecting imaging sizes of the target features;
and calculating the depth of a target feature from its imaging size, and determining the actual size of the target feature according to its depth.
Preferably, the processing the first image and the second image to obtain the actual size of the target feature of the crop and the growth information of the non-target feature of the crop includes:
stitching the first image and the second image to obtain a stitched image, convolutionally extracting the non-target features of the crops in the stitched image by a convolutional neural network of the image processing module to obtain a non-target feature map, training and adjusting network parameters of a fully connected layer, a Softmax classifier and bounding-box regression by a transfer learning method, and outputting a detection result of the non-target features to obtain the growth information of the non-target features of the crops.
Preferably, driving the inspection equipment to run on a single track laid in the crop planting position area comprises one of the following:
in a daily inspection mode, driving the inspection equipment to run on the single track laid in the crop planting position area in response to receiving timed prompt information;
in a temporary inspection mode, driving the inspection equipment to run on the single track laid in the crop planting position area in response to detection of severe weather.
The invention also provides inspection equipment comprising a processor and a memory storing a computer program executable on the processor, wherein the processor implements the above inspection method when running the computer program.
Compared with the prior art, the invention has the beneficial effects that:
According to the invention, inspection equipment for inspecting crops is provided, and a binocular camera module is arranged on the inspection equipment, so that the two cameras of the binocular camera module can photograph each crop while moving along the track. The captured pictures are processed by the image processing module to obtain the actual size of the target features and the growth information of the non-target features of each crop, from which the growth condition information of each crop is determined and, in turn, the growth information of all the crops in the crop planting area. The growth information of the crops is transmitted to a remote end for management and analysis, so that accurate inspection of the crops is realized, with comprehensive inspection coverage and high inspection precision.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used herein in the description of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
In the embodiments of the invention, the inspection equipment is applied to inspecting planted crops. It should be understood that, in the prior art, inspection of crops planted on a large scale generally depends on manual inspection or unmanned aerial vehicle inspection. Manual inspection has the disadvantages of low coverage and long duration; in particular, when crops are planted in hilly areas, the complex topography of the planting area increases the difficulty and workload of manual inspection. Unmanned aerial vehicle inspection, in which a camera shoots from above, can only inspect the crops from a single dimension, so the inspection is not fine enough and the crops cannot be inspected omnidirectionally.
Based on the above, how to inspect crops on agricultural land in an omnidirectional and efficient manner has become a technical problem to be solved urgently.
In the following, the agricultural land is exemplified as an orchard and the crops as fruit trees. Of course, in other embodiments the agricultural land may be, for example, a vegetable garden and the crops may be vegetables, without any limitation.
Fig. 1 is a schematic structural diagram of inspection equipment according to an embodiment of the present invention. As shown in Fig. 1, the inspection equipment runs on a track in a fruit tree planting area, the track is a single track deployed along the fruit tree planting positions in the agricultural area, and the inspection equipment comprises:
a binocular camera module, which includes:
a first camera for collecting a first image of each fruit tree;
a second camera for collecting a second image of each fruit tree;
an image processing module connected with the binocular camera module and used for processing the first image and the second image to obtain the actual size of a target feature of the fruit tree and growth information of a non-target feature of the fruit tree;
and a wireless transmission module connected with the image processing module and used for transmitting growth condition information of the fruit trees to a cloud.
The agricultural land may be located in a plain or hilly area.
The fruit trees are planted row by row (or in strips).
The binocular camera module is arranged at the top of the inspection equipment, and the first camera and the second camera of the binocular camera module can rotate clockwise or anticlockwise. When the inspection equipment advances on the track, both the first camera and the second camera can photograph a fruit tree regardless of which side of the track the fruit tree is located on. The first camera collects a first image of each fruit tree and the second camera collects a second image of the same fruit tree, and the first image and the second image have an overlapping area.
The first camera and the second camera may be visible-light cameras or infrared cameras. A visible-light camera can photograph the fruit trees under good illumination conditions, while an infrared camera forms video images by receiving the infrared rays diffusely reflected from the photographed object and is therefore suitable for photographing the fruit trees under poor illumination or no illumination.
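As an illustration, acquisition of the image pair can be sketched as follows. This is a minimal sketch assuming both cameras are exposed as standard video devices accessible through OpenCV; the device indices are hypothetical and depend on how the cameras are wired to the image processing module.

# Minimal capture sketch; device indices 0 and 1 are hypothetical.
import cv2

def capture_pair(first_index=0, second_index=1):
    """Grab one first image and one second image of the current fruit tree."""
    cap_first = cv2.VideoCapture(first_index)
    cap_second = cv2.VideoCapture(second_index)
    try:
        ok1, first_image = cap_first.read()
        ok2, second_image = cap_second.read()
        if not (ok1 and ok2):
            raise RuntimeError("failed to read from one of the cameras")
        return first_image, second_image
    finally:
        cap_first.release()
        cap_second.release()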
The target features of the fruit tree herein include, but are not limited to, the fruit of the fruit tree and/or the trunk of the fruit tree, etc., and the actual dimensions of the target features herein include, but are not limited to, the actual size of the fruit and/or the actual cross-sectional diameter of the trunk.
Non-target characteristics of fruit trees herein include, but are not limited to, leaves of fruit trees and/or flowers of fruit trees. And the growth information of the non-target characteristics of the fruit tree herein includes, but is not limited to, the morphology of the fruit tree leaves at different times during a growing period of the fruit tree and/or the morphology of the flowers of the fruit tree during the growing period.
The growth status information of the fruit tree refers to the growth situation of the fruit tree in one growth period.
According to this embodiment, the binocular camera module photographs each fruit tree; target features such as the fruits and/or the trunk and non-target features such as the leaves and/or the flowers of the fruit tree are extracted from the pictures; the actual size of the target features is calculated, and the form of the non-target features is obtained from the images, so that the growth situation of the fruit tree in one growth period is determined. The growth situation data is transmitted to a remote end for management and analysis. This embodiment realizes inspection of the fruit trees, the inspection covers the whole orchard, the inspection precision is high, and the grower can more accurately know the growth condition of each fruit tree in one growth period.
Fig. 2 is a schematic structural diagram of an image processing module according to an embodiment of the present invention, where, as shown in fig. 2, the image processing module includes a ranging module and an image stitching module, and in particular:
Fig. 3 is a schematic structural diagram of a ranging module according to an embodiment of the present invention, as shown in fig. 3, where the image processing module in this embodiment is used to obtain an actual size of a target feature of a fruit tree, and specifically includes a ranging module, where the ranging module includes:
a target determining unit for determining target features of the fruit trees appearing simultaneously in the first image and the second image, and for detecting imaging sizes of the target features;
and a depth calculating unit connected with the target determining unit and used for calculating the depth of a target feature from its imaging size and determining the actual size of the target feature according to its depth.
The target determining unit extracts the target features of the fruit trees in the first image and the second image by using a neural network model pre-trained on a large data set, determines the imaging sizes of the target features, and then determines the depth of the target features from the imaging sizes.
There are many ways to calculate depth from imaging size, and the depth may be a relative depth or an absolute depth. The relative depth describes the change in the depth of the target object, for example, the smaller the imaging size, the greater the depth; the absolute depth corresponds to an actual depth value. The depth of the target object may be determined by continuously measuring the relative depth or the absolute depth of the target.
The depth of a fruit tree target feature is determined from its imaging size: the relative depth of the target feature can be determined from the ratio of the imaging length, width or area of the target feature to that of the whole image, or from successive imaging sizes of the target feature; alternatively, the depth of the target feature can be determined by looking up a pre-established mapping between imaging size and depth.
After the depth of the target feature is determined, the actual size of the target feature is obtained by scaling.
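As an illustration of the scaling step, under the pinhole camera model the actual size of a feature follows directly from its imaging size and depth. The sketch below assumes the focal length is known in pixels from calibration; the numeric values in the comment are hypothetical.

# Minimal scaling sketch under the pinhole-camera model.
def actual_size_from_depth(imaging_size_px: float, depth_m: float,
                           focal_length_px: float) -> float:
    """Real-world extent (metres) of a feature spanning imaging_size_px pixels
    at depth depth_m, for a camera with focal length focal_length_px pixels."""
    return imaging_size_px * depth_m / focal_length_px

# Example: a fruit imaged over 120 px at 1.5 m depth with f = 1800 px
# has an estimated diameter of 120 * 1.5 / 1800 = 0.1 m (about 10 cm).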
The depth of the target feature may also be obtained by ranging, as shown in Fig. 4:
P is a point on the target feature, OR and OT are the optical centers of the first camera and the second camera respectively, the imaging points of P on the photoreceptors of the two cameras are p and p' (the imaging plane of each camera is drawn in front of the lens after rotation), f is the focal length of the cameras, B is the center distance between the two cameras, Z is the depth of the target feature, and the distance from point p to point p' is dis. Let X_R and X_T be the horizontal coordinates of the imaging points p and p' respectively; then:
dis = B − (X_R − X_T);
according to the principle of similar triangles:
(B − (X_R − X_T)) / B = (Z − f) / Z;
which gives:
Z = f·B / (X_R − X_T).
The focal length f and the center distance B of the two cameras can be obtained through calibration measurement, and the difference X_R − X_T is the parallax of the point between the two images.
Accordingly, depth information can be obtained by obtaining the parallax between the first image and the second image.
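As an illustration, the relation Z = f·B / (X_R − X_T) derived above can be evaluated as follows; the numeric values in the comment are hypothetical.

# Minimal sketch of Z = f * B / (X_R - X_T); f and B come from calibration.
def depth_from_disparity(focal_length_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth Z of a feature point given its disparity X_R - X_T in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_length_px * baseline_m / disparity_px

# Example: f = 1800 px, B = 0.12 m, disparity = 144 px  ->  Z = 1.5 m.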
A method of obtaining the parallax between the first image and the second image is as follows:
a left-to-right disparity map sequence and a right-to-left disparity map sequence are estimated frame by frame for the first image and the second image by a stereo region matching method;
specifically, the binocular stereoscopic video images are first over-segmented by using the mean-shift algorithm;
secondly, an energy function is constructed as follows:
E_energy = E_data + λ·E_smooth,
where E_data is the data term, E_smooth is the smoothness term, and λ is a weight coefficient;
a corresponding grid graph is then constructed from the data term and the smoothness term, and the disparity corresponding to each pixel is calculated by an optimization algorithm to obtain the disparity map.
The intra-frame error comprehensively uses the mean absolute difference (MAD) criterion of brightness and the sum of absolute gradient differences (SGRAD) criterion, and the weight of each pixel is adaptively adjusted according to its importance within the matching window;
finally, a consistency check is performed between the left-to-right and right-to-left disparity map sequences, unreliable matching information is corrected, and the parallax is then calculated. The depth information is finally obtained from the calculated parallax.
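As an illustration only: the embodiment above estimates the disparity by mean-shift over-segmentation and minimisation of the energy E_energy = E_data + λ·E_smooth. The sketch below is not that algorithm; it uses OpenCV's semi-global block matcher as a readily available stand-in and adds the left-to-right/right-to-left consistency check mentioned above, assuming rectified grayscale inputs.

# Simplified disparity sketch with a left-right consistency check.
import cv2
import numpy as np

def disparity_with_lr_check(left_gray, right_gray, max_disp=64, lr_tol=1.0):
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=max_disp,
                                 blockSize=5)
    # Left-to-right disparity (OpenCV returns fixed-point values scaled by 16).
    disp_lr = sgbm.compute(left_gray, right_gray).astype(np.float32) / 16.0
    # Right-to-left disparity, obtained by matching the horizontally flipped pair.
    disp_rl = sgbm.compute(cv2.flip(right_gray, 1),
                           cv2.flip(left_gray, 1)).astype(np.float32) / 16.0
    disp_rl = cv2.flip(disp_rl, 1)

    h, w = disp_lr.shape
    xs = np.arange(w)[None, :].repeat(h, axis=0)
    ys = np.arange(h)[:, None].repeat(w, axis=1)
    # Location of each left-image pixel in the right image.
    x_right = np.clip((xs - disp_lr).astype(np.int32), 0, w - 1)
    # Reject pixels whose left and right disparities disagree (occlusions,
    # unreliable matches); these can be corrected or filled afterwards.
    consistent = np.abs(disp_lr - disp_rl[ys, x_right]) <= lr_tol
    disp_lr[~consistent] = 0.0
    return disp_lr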
In other embodiments, the image processing module is further used for stitching the crop pictures taken by the first camera and the second camera to obtain stitched images. The stitched images provide complete images of all the fruit trees in the planting area, which avoids the situation in which the first camera or the second camera captures only a partial image of a certain fruit tree so that the growth information of that fruit tree cannot be obtained.
To achieve the above effect, the image processing module comprises an image stitching module for stitching the first image and the second image to obtain a stitched image, and the stitching specifically adopts the following method:
shooting a first image and a second image of the same fruit tree, selecting one of them as the reference image and the other as the image to be stitched, wherein the reference image and the image to be stitched have an overlapping area;
respectively performing a first remapping on the reference image and the image to be stitched;
extracting coarse matching feature point pairs between the remapped reference image and the remapped image to be stitched;
obtaining fine matching feature point pairs from the coarse matching feature point pairs by using a second remapping;
estimating a rotation matrix and an offset matrix of the first camera and the second camera by using the fine matching feature point pairs;
and stitching the reference image and the image to be stitched according to the rotation matrix and the offset matrix to obtain the stitched image.
Performing the first remapping on the reference image and the image to be stitched respectively comprises:
performing the first remapping on the reference image and the image to be stitched by using an equidistant cylindrical (equirectangular) projection to obtain the corresponding equidistant cylindrical projection images, which are used as the remapped reference image and the remapped image to be stitched respectively.
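As an illustration, the first remapping can be sketched as a warp of a pinhole image onto an equidistant cylindrical surface; the sketch assumes the focal length of the camera is known in pixels from calibration.

# Minimal equidistant cylindrical (equirectangular) remapping sketch.
import cv2
import numpy as np

def to_equirectangular(image, focal_px):
    h, w = image.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    # Destination grid: x' = f*theta (longitude), y' = f*phi (latitude).
    xp, yp = np.meshgrid(np.arange(w, dtype=np.float32) - cx,
                         np.arange(h, dtype=np.float32) - cy)
    theta, phi = xp / focal_px, yp / focal_px
    # Back-project each destination pixel to a viewing ray and re-project it
    # through the pinhole model: u = f*tan(theta), v = f*tan(phi)/cos(theta).
    map_u = (focal_px * np.tan(theta) + cx).astype(np.float32)
    map_v = (focal_px * np.tan(phi) / np.cos(theta) + cy).astype(np.float32)
    return cv2.remap(image, map_u, map_v, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_CONSTANT)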
Obtaining the fine matching feature point pairs from the coarse matching feature point pairs by using the second remapping comprises:
performing the second remapping on the coordinates of the coarse matching feature point pairs by using a plane projection;
and taking the feature point pairs among the coarse matching feature point pairs that conform to a homography as the fine matching feature point pairs.
In this embodiment, a static stitching mode is adopted: feature points of the original image set are extracted and matched by exploiting the geometric properties of the projection of the acquired images, so that accurate control point pairs are obtained and a high-quality stitched image is achieved.
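As an illustration only: the sketch below covers the matching and stitching steps in a simplified form, using ORB features (the embodiment does not name a particular detector) and keeping only the matches that conform to a homography estimated by RANSAC as the "fine" pairs; a single homography stands in for the rotation matrix and offset matrix of the embodiment, and blending of the overlap region is omitted.

# Simplified matching-and-stitching sketch.
import cv2
import numpy as np

def stitch_pair(reference, to_stitch):
    orb = cv2.ORB_create(nfeatures=2000)
    kp_ref, des_ref = orb.detectAndCompute(reference, None)
    kp_src, des_src = orb.detectAndCompute(to_stitch, None)

    # Coarse matches by brute-force Hamming matching with cross-check.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_src, des_ref), key=lambda m: m.distance)

    src_pts = np.float32([kp_src[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    ref_pts = np.float32([kp_ref[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Fine matches = RANSAC inliers of the estimated homography.
    H, inlier_mask = cv2.findHomography(src_pts, ref_pts, cv2.RANSAC, 3.0)

    h, w = reference.shape[:2]
    # Warp the image to be stitched into the reference frame; a full stitcher
    # would also blend the overlap region.
    canvas = cv2.warpPerspective(to_stitch, H, (w * 2, h))
    canvas[0:h, 0:w] = reference
    return canvas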
In this embodiment, the stitched image is fed into a 101-layer convolutional neural network, which outputs a non-target feature map of the fruit tree;
the network parameters of the final fully connected layer, the Softmax classifier and the bounding-box regression are trained and adjusted by a transfer learning method to obtain the recognition result of the non-target features of the fruit tree, and the growth form of the non-target features is judged from the recognition result.
The growth condition information of the fruit tree is then obtained from the growth form of the non-target features and the actual size of the target features of the fruit tree.
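As an illustration, the transfer-learning step can be sketched as follows. The sketch assumes the 101-layer network is torchvision's ResNet-101 (an assumption; the embodiment only specifies the depth, and the weights enum requires torchvision 0.13 or later) and fine-tunes only a new fully connected classification head; a detection model with bounding-box regression, as mentioned above, would add a box-regression branch on the same backbone. The class count is hypothetical.

# Minimal transfer-learning sketch on an assumed ResNet-101 backbone.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 4  # hypothetical growth-form classes (e.g. leaf/flower states)

model = models.resnet101(weights=models.ResNet101_Weights.IMAGENET1K_V1)
for p in model.parameters():
    p.requires_grad = False          # keep pretrained convolutional layers fixed
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)  # new fully connected layer

criterion = nn.CrossEntropyLoss()    # Softmax-based classification loss
optimizer = torch.optim.SGD(model.fc.parameters(), lr=1e-3, momentum=0.9)

def train_step(images, labels):
    """One fine-tuning step on a batch of crops taken from stitched images."""
    optimizer.zero_grad()
    logits = model(images)
    loss = criterion(logits, labels)
    loss.backward()
    optimizer.step()
    return loss.item()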
In other embodiments, the binocular camera module may further be configured to detect track information of the running track of the inspection equipment;
the inspection equipment further comprises a control module connected with the binocular camera module and used for at least one of the following:
according to the track information of the running track, controlling the inspection equipment to turn when the track information indicates that the inspection equipment needs to turn on the track;
and according to the track information of the running track, controlling the inspection equipment to climb when the track information indicates that the inspection equipment needs to climb a slope on the track.
The running track information in this embodiment can be obtained from certain special position points on the track; these special positions may be the turning positions or the climbing positions of the track, and a signal transceiver may be provided at each such position. For example, photoelectric sensors can be arranged at the corners of the track, or proximity switches can be arranged at the climbing positions of the track, and the photoelectric sensors and proximity switches are electrically connected with the control module. In this way, the inspection equipment can automatically turn and climb while running on the track.
After the inspection equipment shoots a fruit tree, the inspection equipment automatically moves along the track to shoot the next fruit tree.
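As an illustration, the control logic reacting to the track sensors can be sketched as follows, assuming the control module is a Raspberry-Pi-class board whose GPIO inputs receive the photoelectric sensors and proximity switches; the pin numbers and the drive helpers turn(), climb() and advance() are hypothetical.

# Minimal control-loop sketch; pins and drive helpers are hypothetical.
import time
import RPi.GPIO as GPIO

TURN_SENSOR_PIN = 17      # photoelectric sensor at a track corner (hypothetical)
CLIMB_SENSOR_PIN = 27     # proximity switch before a slope (hypothetical)

GPIO.setmode(GPIO.BCM)
GPIO.setup(TURN_SENSOR_PIN, GPIO.IN)
GPIO.setup(CLIMB_SENSOR_PIN, GPIO.IN)

def control_loop(drive):
    """Poll the track sensors and adjust the drive accordingly."""
    while True:
        if GPIO.input(TURN_SENSOR_PIN):
            drive.turn()          # track information indicates a corner
        elif GPIO.input(CLIMB_SENSOR_PIN):
            drive.climb()         # track information indicates a slope ahead
        else:
            drive.advance()       # move on to the next fruit tree
        time.sleep(0.05)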
The inspection equipment further comprises an inspection mode selection module, and the inspection mode selection module is connected with the control module;
the inspection mode selection module comprises:
the daily inspection mode module is used for periodically inspecting the fruit trees at scheduled times;
the temporary inspection mode module is used for inspecting the fruit trees in severe weather.
The regular inspection in the daily inspection mode can be set, for example, to take three days as one inspection period, and on each inspection day to inspect the fruit trees of the orchard in three time periods: six a.m., one p.m. and eight p.m. Of course, the period and time periods of the inspection can be determined according to actual requirements, and the invention is not limited in this regard.
Furthermore, the inspection mode may be a temporary inspection performed in special weather. In this embodiment, the special weather may be when the air temperature in the orchard is higher than 30 °C, lower than 4 °C, or the air humidity in the orchard is higher than 85%; in any of these cases the control module drives the inspection equipment to perform a temporary inspection. The air temperature and humidity in the orchard can be measured by a temperature sensor and a humidity sensor arranged on the inspection equipment, and the sensors are connected with the control module. Of course, the temperature and humidity thresholds for temporary inspection can be determined according to the fruit tree variety and actual requirements, and the invention is not limited in this regard.
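As an illustration, the two inspection triggers can be sketched as follows. The thresholds follow the values given above; the sensor-reading and drive helpers are hypothetical.

# Minimal sketch of the daily and temporary inspection triggers.
from datetime import datetime

DAILY_PERIOD_DAYS = 3
DAILY_HOURS = (6, 13, 20)          # 6 a.m., 1 p.m., 8 p.m. on an inspection day
TEMP_HIGH_C, TEMP_LOW_C, HUMIDITY_HIGH = 30.0, 4.0, 85.0

def is_daily_inspection_due(now: datetime, start_day: datetime) -> bool:
    on_inspection_day = (now.date() - start_day.date()).days % DAILY_PERIOD_DAYS == 0
    return on_inspection_day and now.hour in DAILY_HOURS

def is_temporary_inspection_due(temperature_c: float, humidity_pct: float) -> bool:
    return (temperature_c > TEMP_HIGH_C or temperature_c < TEMP_LOW_C
            or humidity_pct > HUMIDITY_HIGH)

def check_and_inspect(now, start_day, read_air_temperature, read_air_humidity,
                      start_inspection):
    # read_air_temperature, read_air_humidity and start_inspection are
    # hypothetical callables provided by the sensors and the control module.
    if is_daily_inspection_due(now, start_day):
        start_inspection(mode="daily")
    elif is_temporary_inspection_due(read_air_temperature(), read_air_humidity()):
        start_inspection(mode="temporary")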
It should be noted that the parameters of the regular inspection and the temperature and humidity parameters of the temporary inspection can be set through a control panel on the inspection equipment, or through a remote control terminal of the inspection equipment.
In the embodiment of the invention, by setting the daily inspection mode and the temporary inspection mode, regular inspection of the fruit trees and temporary inspection of the fruit trees in severe weather such as high temperature, low temperature and heavy rain can be realized.
Fig. 5 is a schematic flow chart of an inspection method according to an embodiment of the present invention. As shown in Fig. 5, the inspection method is applied to the above inspection equipment to inspect fruit trees, and the inspection method includes:
driving the inspection equipment to run on a single track laid in a fruit tree planting position area;
respectively acquiring a first image and a second image of each fruit tree by using the first camera and the second camera of the rotatable binocular camera module;
processing the first image and the second image to obtain the actual size of the target features of the fruit tree and the growth information of the non-target features of the fruit tree;
and transmitting the growth condition information of the fruit trees to the cloud by utilizing the wireless transmission module.
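As an illustration, the final transmission step can be sketched as a simple upload of the growth condition information over HTTPS, which is one possible realization of the wireless transmission module; the endpoint URL and the payload layout are hypothetical.

# Minimal cloud-upload sketch; URL and payload fields are hypothetical.
import requests

def upload_growth_info(tree_id, target_size_m, non_target_growth,
                       endpoint="https://cloud.example.com/orchard/growth"):
    """Send one fruit tree's growth condition information to the cloud."""
    payload = {
        "tree_id": tree_id,
        "target_feature_actual_size_m": target_size_m,
        "non_target_feature_growth": non_target_growth,
    }
    response = requests.post(endpoint, json=payload, timeout=10)
    response.raise_for_status()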
In some alternative embodiments, processing the first image and the second image to obtain the actual size of the target feature of the fruit tree and the growth information of the non-target feature of the fruit tree includes:
determining target features of the fruit trees appearing simultaneously in the first image and the second image, and detecting imaging sizes of the target features;
and calculating the depth of a target feature from its imaging size, and determining the actual size of the target feature according to its depth.
In some alternative embodiments, processing the first image and the second image to obtain the actual size of the target feature of the fruit tree and the growth information of the non-target feature of the fruit tree includes:
stitching the first image and the second image to obtain a stitched image, convolutionally extracting the non-target features of the fruit trees in the stitched image by the convolutional neural network of the image processing module to obtain a non-target feature map, training and adjusting network parameters of the fully connected layer, the Softmax classifier and the bounding-box regression by a transfer learning method, and outputting a detection result of the non-target features to obtain the growth information of the non-target features of the fruit trees.
In some alternative embodiments, the inspection device is driven to run on a single track laid in the fruit tree planting position area, comprising one of the following:
under the daily inspection mode, in response to receiving timed prompt information, driving the inspection equipment to run on the single track laid in the fruit tree planting position area;
in the temporary inspection mode, the inspection equipment is driven to run on a single track laid in the fruit tree planting position area in response to the detection of severe weather.
It should be noted that the description of the above inspection method is similar to the description of the inspection equipment, and the description of the same beneficial effects is omitted. For technical details not disclosed in the embodiments of the inspection method of the present invention, please refer to the description of the embodiments of the inspection equipment of the present invention.
As shown in Fig. 6, the embodiment of the invention further provides inspection equipment, which comprises a memory 1, a processor 2 and computer instructions stored in the memory 1 and executable on the processor 2, wherein the steps of the above inspection method are implemented when the processor 2 executes the instructions.
In some embodiments, the memory 1 in the embodiments of the present invention may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM) and Direct Rambus RAM (DRRAM). The memory 1 of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
The processor 2 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 2 or by instructions in the form of software. The processor 2 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be embodied directly as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software modules may be located in a storage medium well known in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 1, and the processor 2 reads the information in the memory 1 and performs the steps of the above method in combination with its hardware.
In some embodiments, the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field-Programmable Gate Arrays (FPGAs), general-purpose processors, controllers, micro-controllers, microprocessors, other electronic units for performing the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
A further embodiment of the present invention provides a computer storage medium storing an executable program which, when executed by the processor 2, implements the steps of the above inspection method, such as the method shown in Fig. 5.
In some embodiments, the computer storage medium may include a USB flash disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk and other media that can store program code.
The technical schemes described in the embodiments of the present invention may be arbitrarily combined without any conflict.
Finally, only specific embodiments of the present invention have been described in detail above; the invention is not limited to these specific embodiments. Equivalent modifications and substitutions will occur to those skilled in the art without departing from the spirit of the invention, and such equivalent changes and modifications are intended to be included within the scope of the present invention.