
CN115019215B - Hyperspectral image-based soybean disease and pest identification method and device - Google Patents


Info

Publication number
CN115019215B
CN115019215B (application number CN202210947014.9A)
Authority
CN
China
Prior art keywords
hyperspectral
image
soybean
data set
spectral characteristic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210947014.9A
Other languages
Chinese (zh)
Other versions
CN115019215A (en)
Inventor
魏日令
徐晓刚
王军
马寅星
虞舒敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Lab
Original Assignee
Zhejiang Lab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Lab filed Critical Zhejiang Lab
Priority to CN202210947014.9A priority Critical patent/CN115019215B/en
Publication of CN115019215A publication Critical patent/CN115019215A/en
Application granted granted Critical
Publication of CN115019215B publication Critical patent/CN115019215B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING; COUNTING
            • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
                • G06V10/00 Arrangements for image or video recognition or understanding
                    • G06V10/20 Image preprocessing
                        • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
                    • G06V10/70 Arrangements using pattern recognition or machine learning
                        • G06V10/764 using classification, e.g. of video objects
                        • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
                            • G06V10/778 Active pattern-learning, e.g. online learning of image or video features
                        • G06V10/82 using neural networks
                • G06V20/00 Scenes; Scene-specific elements
                    • G06V20/10 Terrestrial scenes
                        • G06V20/17 Terrestrial scenes taken from planes or by drones
                        • G06V20/188 Vegetation
                        • G06V20/194 Terrestrial scenes using hyperspectral data, i.e. more or other wavelengths than RGB
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC
        • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
            • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
                • Y02A40/00 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
                    • Y02A40/10 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Remote Sensing (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Image Processing (AREA)

Abstract


The invention discloses a method and device for identifying soybean diseases and insect pests based on hyperspectral images. The method includes: step 1, using a hyperspectral camera and an RGB camera carried by a drone to collect a hyperspectral data set and its corresponding RGB data set; step 2, performing data augmentation on the collected hyperspectral data set; step 3, segmenting the plant area in each RGB image and multiplying the result pixel-by-pixel with the corresponding hyperspectral image to obtain an image containing the plant area, then preprocessing that image to compute the average spectral characteristic curve of each category; step 4, inputting the hyperspectral data set into the soybean pest identification network model and training it with curriculum learning together with the per-category average spectral characteristic curves; step 5, using the trained soybean pest identification network model to predict and classify newly collected hyperspectral images and output the final predicted pest category. The invention can effectively improve the accuracy of soybean disease and pest identification.


Description

Hyperspectral image-based soybean disease and pest identification method and device
Technical Field
The invention relates to the field of hyperspectral image processing, in particular to a method and a device for identifying soybean diseases and insect pests based on hyperspectral images.
Background
In soybean cultivation in China, diseases and insect pests are the main causes of reduced yield and degraded grain quality. Susceptibility to such damage is largely inherent to the soybean plant itself.
Among soybean diseases, sheath blight, sclerotinia, gray leaf spot and root rot are the main types. Gray leaf spot is the most common and has been classified as a worldwide disease. In China it occurs mainly in the soybean production areas of the three northeastern provinces; once plants are infected, serious yield losses follow, typically ranging from 10% to 50%, with severe impact on the economy. Gray leaf spot mainly affects the leaves and seeds of soybeans, and the appearance of the leaves changes markedly when a plant is attacked.
Among soybean pests, the common types are budworms, aphids, red spider mites, etc. In northeast China, a main soybean-producing region, aphids cause the most serious damage; leaves infested with aphids typically curl and secrete a clear viscous liquid. After flowering, buds affected by aphids fill less fully, directly reducing yield, and a heavy aphid infestation can kill the plants.
With the continuous development of electronic technology, hyperspectral remote sensing has gradually been applied to crop nutrient diagnosis, classification and identification, quality assessment, and other applications. A hyperspectral image consists of images of hundreds of bands, which makes it possible to identify or detect materials at a fine-grained level, even materials whose spectral features appear very similar visually. Hyperspectral image classification generally consists of the following steps: preprocessing (denoising), dimensionality reduction, and feature extraction for the final classification. The feature extraction stage receives the widest attention. Over the past decades, hand-crafted features have been widely used in this stage; they work well when the sample size is small, but their effectiveness gradually weakens as the sample size grows.
With the development of deep learning, neural networks are gradually used in feature extraction of images, but the conventional CNN network does not perform well enough in extracting global features.
Disclosure of Invention
In order to solve the technical problems in the prior art, the invention provides a soybean pest and disease identification method based on hyperspectral images, which uses a Transformer as the backbone network model and avoids overfitting through a curriculum learning mode in the training stage. The specific technical scheme is as follows:
a soybean disease and pest identification method based on hyperspectral images comprises the following steps:
shooting hyperspectral images and corresponding RGB images at different heights by using a hyperspectral camera and an RGB camera carried by an unmanned aerial vehicle to obtain an acquired hyperspectral data set and a corresponding RGB data set;
step two, performing data augmentation on the hyperspectral data set acquired in the step one based on the open source RGB data set;
thirdly, carrying out plant region segmentation on the open source RGB data set and the images in the collected RGB data set to obtain a mask image, carrying out pixel point multiplication on the mask image and the corresponding hyperspectral image to obtain an image containing a plant region, and then carrying out pretreatment to calculate an average spectral characteristic curve of each category;
inputting the hyperspectral data set after data augmentation into a soybean pest and disease identification network model, and performing model training using curriculum learning together with the average spectral characteristic curve of each category obtained in step three, to obtain a trained soybean pest and disease identification network model;
and step five, adopting the trained soybean pest and disease identification network model to predict and classify the collected and input hyperspectral images and outputting the finally predicted pest type.
Further, the second step specifically includes the following substeps:
step 2.1, loading a hyperspectral image reconstruction network trained on an open source RGB data set, wherein the reconstruction network adopts the MST++ algorithm;
step 2.2, inputting soybean plant RGB images collected from the internet into the loaded hyperspectral image reconstruction network, obtaining and storing the reconstructed hyperspectral images to form a generated hyperspectral data set, and merging it into the hyperspectral data set collected by the unmanned aerial vehicle, recorded as the total hyperspectral data set; the labels of the data set are $1, 2, \dots, c$, where $c$ is the number of categories, i.e., there are $c$ classes of image data;
and 2.3, randomly flipping and cropping all hyperspectral images and the corresponding RGB images in the total hyperspectral data set.
Further, the third step specifically includes the following substeps:
step 3.1, performing soybean plant area segmentation on each RGB image with an existing image segmentation algorithm to obtain a Mask image of the soybean plant area;
step 3.2, multiplying pixel points of the obtained Mask image and the image of each spectral frequency band of the corresponding hyperspectral image to obtain an image only containing a soybean plant area in the hyperspectral image;
step 3.3, then, normalizing the image containing only the soybean plant area, with the expression:

$$I_{out} = \frac{I_{in} - I_{min}}{I_{max} - I_{min}}$$

where $I_{in}$ is the input image, i.e., the input image containing only the soybean region; $I_{out}$ is the output normalized image; and $I_{max}$, $I_{min}$ are the maximum and minimum pixel values of the input image $I_{in}$, respectively. After normalization, the image is reduced to size $w \times h$, where $w$ is the width and $h$ is the height;
and 3.4, respectively counting the spectral characteristic curves of the images with reduced sizes in each category to obtain the average spectral characteristic curve of each category.
Further, step 3.4 specifically includes: calculating the average pixel value of each spectral band of the size-reduced image, proceeding by analogy over all spectral bands, and arranging the per-band averages in band order to form the image's spectral characteristic curve; then, for each class, all spectral characteristic curves in the class are averaged, finally obtaining the $c$ average spectral characteristic curves.
Furthermore, the soybean pest and disease identification network model takes a Transformer as a backbone network and comprises a spectral characteristic curve extraction module and a classification prediction module; the spectral characteristic curve extraction module is used for extracting a spectral characteristic curve of the soybean plant region image and then calculating a loss function with the average spectral characteristic curve of the same category; the classification prediction module classifies the extracted spectral characteristic curves.
Further, the curriculum learning mode gradually increases, during iterative training, the weight of the loss terms that are difficult to train, with the specific expressions:

$$\alpha_t = \alpha_0 \cdot \frac{t}{N}$$

$$L = L_{cls} + \alpha_t \left( L_{pcc} + L_{time} + L_{freq} \right)$$

where $\alpha_t$ is the curriculum-learning weight; $\alpha_0$, $t$ and $N$ are the initial weight, the current iteration number and the total number of iterations, respectively; and $L_{cls}$, $L_{pcc}$, $L_{time}$ and $L_{freq}$ are the classification loss function, the negative Pearson correlation coefficient, the time-domain loss function and the frequency-domain loss function, respectively. $L_{pcc}$ is computed by taking the Pearson correlation coefficient between the spectral characteristic curve of the current hyperspectral image and the average spectral characteristic curve of the corresponding category, then negating it; $L_{freq}$ is computed by first taking the Fourier-transform spectra of the spectral characteristic curve of the current hyperspectral image and of the average spectral characteristic curve of the corresponding category, and then taking the mean absolute error between the two spectra.
A soybean disease and pest identification device based on hyperspectral images comprises one or more processors and is used for achieving the soybean disease and pest identification method based on the hyperspectral images.
A computer-readable storage medium having stored thereon a program which, when executed by a processor, implements the hyperspectral image-based soybean pest identification method.
Beneficial effects:
first, at the data level: shooting data at different heights with an unmanned aerial vehicle yields images at different scales, giving the model generalization capability; in addition, data augmentation is completed through a hyperspectral image reconstruction network based on the MST++ algorithm, which reduces the cost of acquiring hyperspectral data and ensures balance between data volume and categories;
secondly, at the model level: a structure with a Transformer as the backbone network is used, enhancing global feature extraction; during training, a curriculum learning mode is adopted, which avoids overfitting and accelerates convergence;
finally, the combination of the frequency domain and the time domain is fully considered in the loss function, so that the accuracy of the prediction classification identification of the model is higher.
Drawings
FIG. 1 is a schematic flow chart of a soybean disease and pest identification method based on hyperspectral images;
FIG. 2 is a schematic diagram of a network module of the method of the present invention;
FIG. 3 is a schematic overall flow diagram of the data processing of the present invention;
fig. 4 is a schematic structural diagram of a soybean disease and pest identification device based on hyperspectral images according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and technical effects of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings and embodiments of the specification.
As shown in figures 1 and 2, the soybean pest and disease identification method based on the hyperspectral image comprises the following steps:
the method comprises the steps of firstly, shooting hyperspectral images and corresponding RGB images at different heights by using a hyperspectral camera and an RGB camera carried by an unmanned aerial vehicle to obtain an acquired hyperspectral data set and a corresponding RGB data set.
In the embodiment of the invention, the unmanned aerial vehicle flies at heights of 3, 5 and 7 meters. The captured hyperspectral image and RGB image are required to have the same size, with pixel points in one-to-one correspondence; after acquisition, the hyperspectral image is stored as a mat file (the standard binary data-storage format of MATLAB).
And step two, performing data augmentation on the hyperspectral data set acquired in the step one based on the open source RGB data set.
Hyperspectral data is costly and time-consuming to acquire. In order to train the model sufficiently, a hyperspectral image reconstruction network is therefore first used to reconstruct additional hyperspectral images from open RGB images. Specifically, the method comprises the following substeps:
step 2.1, loading a hyperspectral image reconstruction network (MST++ model) trained on an open source RGB data set;
step 2.2, inputting soybean plant RGB images collected on the internet into the loaded MST++ model, obtaining the reconstructed hyperspectral images and storing them as mat files to obtain a generated hyperspectral data set, then merging it into the hyperspectral data set previously collected by the unmanned aerial vehicle, recorded as the total hyperspectral data set; the labels of the data set are $1, 2, \dots, c$, where $c$ is the number of categories, i.e., there are $c$ classes of image data;
and 2.3, randomly flipping and cropping all hyperspectral images and corresponding RGB images in the total hyperspectral data set to further increase the data volume.
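The flip-and-crop augmentation of step 2.3 can be sketched as follows. This is a minimal illustration, not the patent's code: the crop size is an assumption, and the hyperspectral cube and RGB image are transformed together so their pixels stay in one-to-one correspondence.

```python
import numpy as np

def augment(cube, rgb, crop_hw=(64, 64), rng=None):
    """Randomly flip and crop a hyperspectral cube (H, W, B) together
    with its aligned RGB image (H, W, 3).  The same flips and the same
    crop window are applied to both so pixels remain in correspondence.
    crop_hw is an illustrative assumption, not taken from the patent."""
    rng = rng or np.random.default_rng()
    if rng.random() < 0.5:                 # horizontal flip (width axis)
        cube, rgb = cube[:, ::-1], rgb[:, ::-1]
    if rng.random() < 0.5:                 # vertical flip (height axis)
        cube, rgb = cube[::-1], rgb[::-1]
    ch, cw = crop_hw                       # random crop window
    top = rng.integers(0, cube.shape[0] - ch + 1)
    left = rng.integers(0, cube.shape[1] - cw + 1)
    return (cube[top:top + ch, left:left + cw],
            rgb[top:top + ch, left:left + cw])
```

In practice the same routine would be run several times per image to multiply the data volume.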
Thirdly, performing plant region segmentation on the images in the open source RGB data set and the collected RGB data set to obtain mask images, multiplying each mask image pixel-by-pixel with the corresponding hyperspectral image to obtain an image containing only the plant region, and then preprocessing it to calculate the average spectral characteristic curve of each category.
Specifically, the method comprises the following substeps:
and 3.1, performing soybean plant area segmentation on each RGB image by using the existing segmentation image segmentation algorithm to obtain a Mask image of the soybean plant area. In the Mask image, the pixel value of the soybean-containing area is 255, and the pixel values of other areas are 0;
step 3.2, multiplying pixel points of the obtained Mask image and the image of each spectral frequency band of the corresponding hyperspectral image to obtain an image only containing a soybean plant area in the hyperspectral image;
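The per-band masking of step 3.2 reduces to a broadcast multiply. A minimal sketch, assuming the hyperspectral cube is stored as an (H, W, B) NumPy array and the Mask image uses the 255/0 convention described in step 3.1:

```python
import numpy as np

def apply_mask(cube, mask):
    """Zero out non-plant pixels in every spectral band.
    cube: (H, W, B) hyperspectral image.
    mask: (H, W) Mask image, 255 inside the soybean region, 0 elsewhere."""
    binary = (mask == 255).astype(cube.dtype)   # 255 -> 1, everything else -> 0
    return cube * binary[:, :, None]            # broadcast the mask over all bands
```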
step 3.3, then, normalizing the image containing only the soybean plant area, with the expression:

$$I_{out} = \frac{I_{in} - I_{min}}{I_{max} - I_{min}}$$

where $I_{in}$ is the input image, i.e., the input image containing only the soybean region; $I_{out}$ is the output normalized image; and $I_{max}$, $I_{min}$ are the maximum and minimum pixel values of the input image $I_{in}$, respectively. After normalization, the image is reduced to size $w \times h$, where $w$ is the width and $h$ is the height;
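A sketch of the normalization and size reduction of step 3.3. The min-max formula follows the text; the nearest-neighbour resize is an assumption standing in for whatever resampling the authors actually used:

```python
import numpy as np

def normalize_and_resize(img, w, h):
    """Min-max normalize an image to [0, 1], then shrink it to (h, w)
    by nearest-neighbour sampling.  Works for 2-D (H, W) arrays and
    for (H, W, B) hyperspectral cubes (bands are kept)."""
    lo, hi = img.min(), img.max()
    if hi > lo:
        norm = (img - lo) / (hi - lo)          # (I - I_min) / (I_max - I_min)
    else:
        norm = np.zeros_like(img, dtype=float)  # degenerate constant image
    rows = np.arange(h) * img.shape[0] // h     # nearest-neighbour row indices
    cols = np.arange(w) * img.shape[1] // w     # nearest-neighbour column indices
    return norm[np.ix_(rows, cols)]
```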
step 3.4, respectively counting the spectral characteristic curves of the size-reduced images in each category, specifically: calculating the average pixel value of each spectral band of the size-reduced image, proceeding by analogy over all spectral bands, and arranging the per-band averages in band order to form the image's spectral characteristic curve; then, for each class, all spectral characteristic curves in the class are averaged, finally obtaining the $c$ average spectral characteristic curves, as shown in fig. 3.
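The per-class averaging of step 3.4 can be sketched as follows, assuming each image is an (H, W, B) array and images are grouped by class label; the dictionary-based grouping is an illustrative choice:

```python
import numpy as np

def spectral_curve(cube):
    """Average pixel value of each spectral band -> one curve of length B."""
    return cube.mean(axis=(0, 1))

def class_average_curves(cubes_by_class):
    """Average the per-image spectral curves within each of the c classes.
    cubes_by_class: {class_label: [cube, cube, ...]}."""
    return {label: np.mean([spectral_curve(c) for c in cubes], axis=0)
            for label, cubes in cubes_by_class.items()}
```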
And step four, inputting the data-augmented hyperspectral data set into the soybean disease and insect pest identification network model, and performing model training using curriculum learning together with the average spectral characteristic curves of all categories obtained in step three, to obtain the trained soybean disease and insect pest identification network model.
The images of the data-augmented hyperspectral data set are input into the soybean pest and disease identification network in batches. The main structure of the network comprises a spectral characteristic curve extraction module and a classification prediction module.
The spectral characteristic curve extraction module mainly extracts the spectral characteristic curve of the soybean plant region image through a neural network, and the spectral characteristic curves of the soybean leaves with different plant diseases and insect pests are different; in this module, the final output is a spectral profile of this class, and then a loss function is calculated with the average spectral profile of the same class, as described below.
The classification prediction module classifies the extracted spectral characteristic curves; the number of classes is $c$.
In the training phase, the main strategy adopted is Curriculum Learning. The core idea is that, as training deepens, the weight of the less-well-trained part of the loss is gradually increased, proceeding from shallow to deep like a course of study and accelerating model convergence; after the training converges over N iterations, the trained model is saved. The specific form is as follows:
$$\alpha_t = \alpha_0 \cdot \frac{t}{N}$$

$$L = L_{cls} + \alpha_t \left( L_{pcc} + L_{time} + L_{freq} \right)$$

where $\alpha_t$ is the curriculum-learning weight; $\alpha_0$, $t$ and $N$ are the initial weight, the current iteration number and the total number of iterations, respectively; and $L_{cls}$, $L_{pcc}$, and $L_{time} + L_{freq}$ are the classification loss function, the negative Pearson correlation coefficient, and the time-domain plus frequency-domain loss function, respectively. $L_{pcc}$ is computed by calculating the Pearson correlation coefficient between the spectral characteristic curve of the current hyperspectral image and the average spectral characteristic curve of the corresponding category, then negating it; $L_{freq}$ is computed by first taking the Fourier-transform spectra of the spectral characteristic curve of the current hyperspectral image and of the average spectral characteristic curve of the corresponding category, and then taking the mean absolute error between the two spectra.
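A hedged sketch of the curriculum-weighted loss described above. The original equations are embedded in figure placeholders, so the linear weight schedule and the exact way the terms are combined are assumptions; only the negative Pearson term, the time-domain error on the curves, and the Fourier-spectrum mean absolute error follow directly from the text:

```python
import numpy as np

def curriculum_weight(alpha0, t, n_total):
    """Weight that grows linearly with training progress -- one simple
    way to realise 'gradually increase the weight of the hard loss
    terms'; the patent's exact schedule is an assumption here."""
    return alpha0 * t / n_total

def neg_pearson(curve, ref):
    """Negative Pearson correlation between the predicted spectral
    curve and the class-average reference curve."""
    return -np.corrcoef(curve, ref)[0, 1]

def freq_mae(curve, ref):
    """Mean absolute error between the Fourier magnitude spectra."""
    return np.abs(np.abs(np.fft.rfft(curve)) - np.abs(np.fft.rfft(ref))).mean()

def total_loss(l_cls, curve, ref, alpha0, t, n_total):
    """Classification loss plus curriculum-weighted curve losses."""
    w = curriculum_weight(alpha0, t, n_total)
    l_time = np.abs(curve - ref).mean()          # time-domain MAE on the curves
    return l_cls + w * (neg_pearson(curve, ref) + l_time + freq_mae(curve, ref))
```

With identical predicted and reference curves, the Pearson term contributes its minimum of -1 while the time- and frequency-domain errors vanish, which is the intended optimum of the curve-matching part of the loss.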
And step five, adopting the trained soybean pest and disease identification network model to predict and classify the collected and input hyperspectral images and outputting the finally predicted pest type.
First, the model is switched to eval (evaluation) mode and the saved model file is loaded. In the inference stage, only hyperspectral images are input to the recognition network: each image is normalized and resized before input, and then fed into the classification prediction network to obtain the final output category.
Corresponding to the embodiment of the soybean disease and insect pest identification method based on the hyperspectral image, the invention also provides an embodiment of a soybean disease and insect pest identification device based on the hyperspectral image.
Referring to fig. 4, the soybean pest and disease identification device based on the hyperspectral image provided by the embodiment of the invention comprises one or more processors, and is used for realizing the soybean pest and disease identification method based on the hyperspectral image in the embodiment.
The soybean pest and disease identification device based on hyperspectral images can be applied to any equipment with data processing capability, such as a computer. The device embodiments may be implemented by software, by hardware, or by a combination of the two. Taking software implementation as an example, the device in the logical sense is formed by the processor of the equipment reading the corresponding computer program instructions from non-volatile memory into memory for execution. In terms of hardware, fig. 4 shows a hardware structure diagram of the equipment on which the soybean pest identification device is located; besides the processor, memory, network interface and non-volatile memory shown in fig. 4, the equipment may also include other hardware according to its actual function, which is not described again here.
The specific details of the implementation process of the functions and actions of each unit in the above device are the implementation processes of the corresponding steps in the above method, and are not described herein again.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of the present invention. One of ordinary skill in the art can understand and implement it without inventive effort.
The embodiment of the invention also provides a computer readable storage medium, wherein a program is stored on the computer readable storage medium, and when the program is executed by a processor, the soybean pest and disease identification method based on the hyperspectral image in the embodiment is realized.
The computer readable storage medium may be an internal storage unit, such as a hard disk or a memory, of any data processing device described in any previous embodiment. The computer readable storage medium may also be an external storage device such as a plug-in hard disk, a Smart Media Card (SMC), an SD Card, a Flash memory Card (Flash Card), etc. provided on the device. Further, the computer readable storage medium may include both an internal storage unit and an external storage device of any data processing capable device. The computer-readable storage medium is used for storing the computer program and other programs and data required by the arbitrary data processing capable device, and may also be used for temporarily storing data that has been output or is to be output.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the present invention in any way. Although the foregoing has described the practice of the present invention in detail, it will be apparent to those skilled in the art that modifications may be made to the practice of the invention as described in the foregoing examples, or that certain features may be substituted in the practice of the invention. All changes, equivalents and the like which come within the spirit and principles of the invention are desired to be protected.

Claims (6)

1. A soybean disease and pest identification method based on hyperspectral images, characterized by comprising the following steps:
step one, capturing hyperspectral images and corresponding RGB images at different heights with a hyperspectral camera and an RGB camera carried by an unmanned aerial vehicle, to obtain a collected hyperspectral data set and a corresponding RGB data set;
step two, performing data augmentation on the hyperspectral data set collected in step one based on an open-source RGB data set;
step three, performing plant region segmentation on the images in the open-source RGB data set and the collected RGB data set to obtain mask images, multiplying each mask image pixel-wise with the corresponding hyperspectral image to obtain an image containing the plant region, and then preprocessing the result to calculate the average spectral characteristic curve of each category;
step four, inputting the augmented hyperspectral data set into a soybean pest and disease identification network model, and training the model in a course learning mode with the average spectral characteristic curve of each category obtained in step three, to obtain a trained soybean pest and disease identification network model;
the soybean pest and disease identification network model takes a Transformer as the backbone network and comprises a spectral characteristic curve extraction module and a classification prediction module; the spectral characteristic curve extraction module extracts the spectral characteristic curve of the soybean plant region image and then computes a loss function against the average spectral characteristic curve of the same category; the classification prediction module classifies the extracted spectral characteristic curve;
the course learning mode is to gradually increase the weight of the loss functions that are difficult to train during the iterative training process, with the specific expression:

L = L_cls + λ_t · (L_pear + L_time + L_freq)

λ_t = λ_0 + (1 − λ_0) · t / T

wherein λ_t is the weight for course learning; λ_0, t and T are respectively the initial weight, the current iteration number and the total iteration number; L_cls, L_pear, L_time and L_freq are respectively the classification loss function, the negative Pearson correlation coefficient, the time-domain loss function and the frequency-domain loss function; wherein L_pear is calculated by computing the Pearson correlation coefficient between the spectral characteristic curve of the current hyperspectral image and the average spectral characteristic curve of the corresponding category and taking its negative value; L_freq is calculated by first taking the Fourier-transform magnitude spectrum of both the spectral characteristic curve of the current hyperspectral image and the average spectral characteristic curve of the corresponding category, and then taking the mean absolute error between the two spectra;
and step five, using the trained soybean pest and disease identification network model to predict and classify the collected input hyperspectral images and output the finally predicted disease and pest type.
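The course learning loss of claim 1 combines a linearly growing weight with curve-similarity terms. The sketch below is an illustrative reading of that scheme, not the patented implementation: the function and variable names (`curriculum_weight`, `w0`, etc.) are assumed, and only the negative-Pearson and frequency-domain terms described in the claim are shown.

```python
import numpy as np

def curriculum_weight(w0, t, T):
    """Grow the course-learning weight linearly from w0 toward 1
    as the current iteration t approaches the total iterations T."""
    return w0 + (1.0 - w0) * t / T

def negative_pearson(curve, mean_curve):
    """Negative Pearson correlation between a spectral characteristic
    curve and the average curve of its category (more similar -> lower)."""
    c = curve - curve.mean()
    m = mean_curve - mean_curve.mean()
    return -(c @ m) / (np.linalg.norm(c) * np.linalg.norm(m) + 1e-8)

def frequency_loss(curve, mean_curve):
    """Mean absolute error between the Fourier magnitude spectra of the
    current curve and the category's average curve."""
    spec_a = np.abs(np.fft.rfft(curve))
    spec_b = np.abs(np.fft.rfft(mean_curve))
    return np.abs(spec_a - spec_b).mean()
```

With an initial weight of 0.1, the weight on the hard-to-train terms rises from 0.1 at the first iteration to 1.0 at the last, so the network first learns the easier classification objective before the curve-matching losses dominate.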
2. The hyperspectral-image-based soybean pest and disease identification method according to claim 1, characterized in that step two specifically comprises the following substeps:
step 2.1, loading a hyperspectral image reconstruction network trained on an open-source RGB data set, wherein the hyperspectral image reconstruction network adopts the MST++ algorithm;
step 2.2, inputting RGB images of soybean plants collected from the network into the loaded hyperspectral image reconstruction network, obtaining and saving the corresponding reconstructed hyperspectral images to form a generated hyperspectral data set, and merging it into the hyperspectral data set collected by the unmanned aerial vehicle, recorded as the total hyperspectral data set; the labels of the data set are numbered by category as 1, 2, …, n, wherein n is the number of categories and label n denotes the n-th class of image data;
and step 2.3, randomly flipping and cropping all the hyperspectral images and the corresponding RGB images in the total hyperspectral data set.
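Step 2.3 requires the same random flip and crop to be applied to a hyperspectral cube and its paired RGB image so that they stay aligned. A minimal sketch of that pairing, with an assumed function name and a square crop size:

```python
import numpy as np

def random_flip_crop(hsi, rgb, crop, rng=None):
    """Apply one random flip/crop jointly to a hyperspectral cube
    (H, W, bands) and its RGB image (H, W, 3) so they stay aligned."""
    if rng is None:
        rng = np.random.default_rng()
    if rng.random() < 0.5:                     # random horizontal flip
        hsi, rgb = hsi[:, ::-1], rgb[:, ::-1]
    if rng.random() < 0.5:                     # random vertical flip
        hsi, rgb = hsi[::-1], rgb[::-1]
    h, w = hsi.shape[:2]
    y = rng.integers(0, h - crop + 1)          # same crop window for both
    x = rng.integers(0, w - crop + 1)
    return hsi[y:y + crop, x:x + crop], rgb[y:y + crop, x:x + crop]
```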
3. The hyperspectral-image-based soybean pest and disease identification method according to claim 2, characterized in that step three specifically comprises the following substeps:
step 3.1, applying an existing image segmentation algorithm to each RGB image to segment the soybean plant area and obtain a Mask image of the soybean plant area;
step 3.2, multiplying the obtained Mask image pixel-wise with the image of each spectral band of the corresponding hyperspectral image, to obtain an image containing only the soybean plant area of the hyperspectral image;
step 3.3, normalizing the image containing only the soybean plant area, with the expression:

X_out = (X_in − X_min) / (X_max − X_min)

wherein X_in is the input image, namely the input image containing only the soybean area; X_out is the normalized output image; X_max and X_min are respectively the maximum pixel value and the minimum pixel value of the input image X_in; after normalization, the image is resized to w × h, where w is the width and h is the height;
and step 3.4, separately computing the spectral characteristic curves of the resized images in each category, to obtain the average spectral characteristic curve of each category.
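Steps 3.2 and 3.3 can be sketched as a single masking-and-normalization pass. This is an illustrative reading under the assumption that the mask is a binary (H, W) array broadcast over the spectral bands; the function name is invented for the example.

```python
import numpy as np

def mask_and_normalize(hsi, mask):
    """Multiply each spectral band of an (H, W, bands) cube by the
    binary plant mask, then min-max normalize the result to [0, 1]."""
    plant = hsi * mask[..., None]          # broadcast mask over all bands
    lo, hi = plant.min(), plant.max()      # X_min and X_max of the input
    return (plant - lo) / (hi - lo)
```

Resizing to w × h would follow as a separate step (e.g. with an image library), so it is omitted here.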
4. The soybean pest and disease identification method based on the hyperspectral image according to claim 3, wherein step 3.4 specifically comprises: calculating the average pixel value of each spectral band of the resized image, doing the same for all spectral bands, and arranging the per-band averages in band order to form the spectral characteristic curve of the image; then, for each category, all the spectral characteristic curves within the category are averaged, finally obtaining n average spectral characteristic curves, where n is the number of categories.
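The per-category averaging in step 3.4 reduces each cube to a per-band mean and then averages those curves within a class. A minimal sketch, with assumed names and a plain dict mapping category label to a list of cubes:

```python
import numpy as np

def spectral_curve(cube):
    """Mean pixel value of each spectral band of an (h, w, bands)
    cube, arranged in band order -> a (bands,) curve."""
    return cube.mean(axis=(0, 1))

def class_average_curves(cubes_by_class):
    """Average all spectral characteristic curves within each category,
    yielding one average curve per class label."""
    return {label: np.mean([spectral_curve(c) for c in cubes], axis=0)
            for label, cubes in cubes_by_class.items()}
```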
5. A soybean pest and disease identification device based on hyperspectral images, characterized by comprising one or more processors configured to implement the hyperspectral-image-based soybean pest and disease identification method according to any one of claims 1 to 4.
6. A computer-readable storage medium, characterized in that a program is stored thereon, which when executed by a processor, implements the hyperspectral image-based soybean pest identification method according to any one of claims 1 to 4.
CN202210947014.9A 2022-08-09 2022-08-09 Hyperspectral image-based soybean disease and pest identification method and device Active CN115019215B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210947014.9A CN115019215B (en) 2022-08-09 2022-08-09 Hyperspectral image-based soybean disease and pest identification method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210947014.9A CN115019215B (en) 2022-08-09 2022-08-09 Hyperspectral image-based soybean disease and pest identification method and device

Publications (2)

Publication Number Publication Date
CN115019215A CN115019215A (en) 2022-09-06
CN115019215B true CN115019215B (en) 2022-12-09

Family

ID=83065811

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210947014.9A Active CN115019215B (en) 2022-08-09 2022-08-09 Hyperspectral image-based soybean disease and pest identification method and device

Country Status (1)

Country Link
CN (1) CN115019215B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115456879A (en) * 2022-09-23 2022-12-09 华南农业大学 A microscopic super-resolution multispectral reconstruction method based on deep learning
CN117011316B (en) * 2023-10-07 2024-02-06 之江实验室 A method and system for identifying the internal structure of soybean stems based on CT images
CN119672543B (en) * 2025-02-20 2025-04-11 深圳恒升应急科技有限公司 Spraying method and system for spraying area of unmanned aerial vehicle for forest pest control
CN120259896A (en) * 2025-06-05 2025-07-04 北京观微科技有限公司 Soybean mosaic disease detection method, device and equipment based on drone images
CN120451840B (en) * 2025-07-02 2025-09-05 绵阳恒持金属设备有限公司 Target monitoring method and system based on unmanned aerial vehicle cruising

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112052904A (en) * 2020-09-09 2020-12-08 陕西理工大学 An identification method of pests and diseases based on transfer learning and convolutional neural network
CN114494777A (en) * 2022-01-24 2022-05-13 西安电子科技大学 A hyperspectral image classification method and system based on 3D CutMix-Transformer

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10902577B2 (en) * 2017-06-19 2021-01-26 Apeel Technology, Inc. System and method for hyperspectral image processing to identify object
CN108444928B (en) * 2018-03-12 2020-10-09 浙江大学 Method for identifying cereal seed frostbite condition by using seed embryo spectrogram characteristic wave band
CN112016392B (en) * 2020-07-17 2024-05-28 浙江理工大学 A small sample detection method for soybean pest severity based on hyperspectral images
CN112784774B (en) * 2021-01-27 2022-08-23 山东农业大学 Small sample hyperspectral classification method based on data enhancement
CN113011354A (en) * 2021-03-25 2021-06-22 浙江农林大学 Unmanned aerial vehicle hyperspectral image pine wood nematode disease identification method based on deep learning
CN113837000A (en) * 2021-08-16 2021-12-24 天津大学 A Few-Sample Fault Diagnosis Method Based on Task Ranking Meta-Learning
CN113962258B (en) * 2021-10-09 2025-06-06 中国农业科学院烟草研究所(中国烟草总公司青州烟草研究所) A tobacco disease identification and prevention method, system and storage medium
CN113989639B (en) * 2021-10-20 2024-04-16 华南农业大学 Automatic identification method and device for litchi diseases based on hyperspectral image analysis and processing method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112052904A (en) * 2020-09-09 2020-12-08 陕西理工大学 An identification method of pests and diseases based on transfer learning and convolutional neural network
CN114494777A (en) * 2022-01-24 2022-05-13 西安电子科技大学 A hyperspectral image classification method and system based on 3D CutMix-Transformer

Also Published As

Publication number Publication date
CN115019215A (en) 2022-09-06

Similar Documents

Publication Publication Date Title
CN115019215B (en) Hyperspectral image-based soybean disease and pest identification method and device
Lv et al. Maize leaf disease identification based on feature enhancement and DMS-robust alexnet
Kunduracioglu et al. Advancements in deep learning for accurate classification of grape leaves and diagnosis of grape diseases
Yadav et al. AFD-Net: Apple Foliar Disease multi classification using deep learning on plant pathology dataset
CN110378381B (en) Object detection method, device and computer storage medium
CN106485251B (en) Classification of egg embryos based on deep learning
Arivazhagan et al. Detection of unhealthy region of plant leaves and classification of plant leaf diseases using texture features
Yun et al. PNN based crop disease recognition with leaf image features and meteorological data
CN109961024A (en) Weed detection method in wheat field based on deep learning
CN114693616B (en) A rice disease detection method, device and medium based on improved target detection model and convolutional neural network
CN111639587B (en) Hyperspectral image classification method based on multi-scale spectrum space convolution neural network
CN110110596B (en) Hyperspectral image feature extraction, classification model construction and classification method
CN106971160A (en) Winter jujube disease recognition method based on depth convolutional neural networks and disease geo-radar image
CN111126361B (en) SAR target identification method based on semi-supervised learning and feature constraints
CN114445715A (en) A method of crop disease identification based on convolutional neural network
CN113627240B (en) A Method for Recognition of UAV Tree Types Based on Improved SSD Learning Model
Li et al. Maize leaf disease identification based on WG-MARNet
Singh et al. Performance analysis of CNN models with data augmentation in rice diseases
CN113052130A (en) Hyperspectral image classification method based on depth residual error network and edge protection filtering
CN110503140A (en) Classification method based on depth migration study and neighborhood noise reduction
Bhujade et al. Role of digital, hyper spectral, and SAR images in detection of plant disease with deep learning network
Sahu et al. Classification and activation map visualization of banana diseases using deep learning models
Yang et al. Convolutional neural network-based automatic image recognition for agricultural machinery
Sharma et al. Optimum RBM encoded SVM model with ensemble feature Extractor-based plant disease prediction
CN110991554B (en) Improved PCA (principal component analysis) -based deep network image classification method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant