
CN111784688A - Flower automatic grading method based on deep learning - Google Patents


Info

Publication number
CN111784688A
CN111784688A (application CN202010722613.1A)
Authority
CN
China
Prior art keywords
flower
image
deep learning
flowers
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010722613.1A
Other languages
Chinese (zh)
Inventor
方志斌
王岩松
和江镇
石海军
杨清鉴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Focusight Technology Co Ltd
Original Assignee
Focusight Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Focusight Technology Co Ltd filed Critical Focusight Technology Co Ltd
Priority to CN202010722613.1A
Publication of CN111784688A
Legal status: Pending

Classifications

    • G06T 7/0002 - Image analysis: inspection of images, e.g. flaw detection
    • G01B 11/02 - Optical measurement of length, width or thickness
    • G05B 19/054 - Programmable logic controllers: input/output
    • G06F 18/214 - Pattern recognition: generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06F 18/24 - Pattern recognition: classification techniques
    • G06N 3/045 - Neural networks: combinations of networks
    • G06N 3/084 - Neural network learning methods: backpropagation, e.g. using gradient descent
    • G06T 2207/20032 - Image analysis indexing scheme: median filtering
    • G06T 2207/20081 - Image analysis indexing scheme: training; learning
    • G06T 2207/30188 - Image analysis indexing scheme: vegetation; agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computational Linguistics (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Automation & Control Theory (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to an automatic flower grading method based on deep learning, comprising the following steps: S1, image acquisition: capture top-view images of flowers on a production line with an image acquisition and processing system; S2, image preprocessing: mask the background region with a masking tool and label the characteristic regions of the image with a marking tool; S3, model building: feed the preprocessed images into a GoogLeNet training pipeline and use the resulting deep learning models for openness grading, flower-damage detection, and elbow (bent-neck) detection. By grading openness and detecting defects with a deep learning algorithm, the invention solves the core problem of automatic flower grading and replaces manual grading; it achieves automatic, high-precision grading of fresh flowers, raises the throughput of flower farms, reduces labor costs, and safeguards the quality of flowers entering the market.

Description

Flower automatic grading method based on deep learning
Technical Field
The invention relates to the technical field of machine vision and image processing, and in particular to an automatic flower grading method based on deep learning.
Background
Openness (degree of bloom) is the most important index for flower grading, and samples with elbow (bent-neck) or flower-damage defects must also be removed; openness grading and automatic defect detection are therefore the key problems in automating flower grading. However, fresh flowers appear against varied backgrounds and in variable poses, so traditional image processing algorithms cannot accurately grade openness or detect elbow and flower-damage defects.
Existing machine-vision approaches grade flower openness mainly by image area. But flowers are irregular crops, and area alone cannot reliably indicate openness: when the difference in openness is small, the difference in area is also small, so area-based methods cannot distinguish adjacent grades. Likewise, the image area of an elbow-deformed flower is close to that of a good one, making such defects hard to separate.
Disclosure of Invention
The technical problem addressed by the invention is as follows: an automatic flower grading method based on deep learning in which a camera assembly of the detection system captures flower images on a production line, regions of interest are extracted from the images, and a classification model is trained with a deep learning algorithm, so that flowers are accurately graded, defective products are rejected, and outgoing flower quality is improved.
The technical solution adopted by the invention is as follows: an automatic flower grading method based on deep learning, comprising the following steps:
S1, image acquisition: capture top-view images of flowers on the production line with an image acquisition and processing system;
S2, image preprocessing: mask the background region with a masking tool and label the characteristic regions of the image with a marking tool;
S3, model building: feed the preprocessed images into a GoogLeNet training pipeline and use the resulting deep learning models for openness grading, flower-damage detection, and elbow detection.
Further, the image acquisition and processing system in step S1 comprises an image processing server, an acquisition controller, an encoder, a trigger photoelectric sensor, a length-measuring photoelectric sensor, a camera assembly, and a PLC. The camera assembly comprises a camera and a light source mounted directly above the flowers to capture top-view images; the top-view image is a 3-channel color image. The length-measuring photoelectric sensor is a grating-type photoelectric sensor mounted beside the flowers to measure stem length.
Still further, the encoder and the trigger photoelectric sensor feed the position of each flower on the production line to the acquisition controller, which converts the position into line and trigger signals that fire the length-measuring sensor and the camera; the camera and the length-measuring sensor image and measure the same flower simultaneously and send the image and length data to the image processing server. The server processes the flower image with the deep learning algorithm, combines the result with the measured length to reach a grading decision, and sends the decision to the PLC, which drives the mechanism that routes each flower into the corresponding collection bin.
Further, in step S2, a model data set is built from a region-of-interest sample set, which is divided into a training set and a prediction set that together form the model data set. The sample set comprises flower images of different openness degrees, elbow flower images, and flower-damage images.
Further, in step S3, the training process using the GoogLeNet model is as follows:
A. take a sample (Xp, Yp) from the sample set and feed Xp into the network, where Xp is the p-th input sample and Yp is its ideal output;
B. compute the corresponding actual output Op. At this stage information passes from the input layer to the output layer through a layer-by-layer transformation; this is also the computation the trained network performs in normal operation:
Op=Fn(…(F2(F1(XpW(1))W(2))…)W(n))
where Fn is the computation of the n-th layer and W(n) is the weight matrix of the n-th layer;
C. compute the difference between the actual output Op and the ideal output Yp;
D. backpropagate the error and adjust the weight matrices so as to minimize it.
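A minimal numeric sketch of steps A-D, using a toy two-layer network in place of GoogLeNet (NumPy, illustrative only; the patent does not disclose its training code):

```python
import numpy as np

# Toy stand-in for the patent's notation: Op = F2(F1(Xp W(1)) W(2)),
# with F1 = ReLU and F2 = identity. GoogLeNet replaces this with many
# convolutional layers, but steps A-D are the same.
rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 0.1, (4, 8))   # weight matrix W(1)
W2 = rng.normal(0.0, 0.1, (8, 3))   # weight matrix W(2)

def forward(Xp):
    H = np.maximum(Xp @ W1, 0.0)    # F1(Xp W(1))
    return H, H @ W2                # Op

def train_step(Xp, Yp, lr=0.05):
    """Steps B-D: forward pass, output error, backpropagated weight update."""
    global W1, W2
    H, Op = forward(Xp)
    err = Op - Yp                   # step C: actual minus ideal output
    dW2 = H.T @ err                 # step D: gradients that reduce the error
    dW1 = Xp.T @ ((err @ W2.T) * (H > 0))
    W2 -= lr * dW2
    W1 -= lr * dW1
    return float(np.mean(err ** 2))

Xp = rng.normal(size=(16, 4))       # a batch of p-th input samples
Yp = rng.normal(size=(16, 3))       # their ideal outputs
losses = [train_step(Xp, Yp) for _ in range(100)]
print(f"loss {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The recorded error shrinks over the iterations, which is exactly the minimization criterion of step D.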
Compared with the background art, the invention has the following advantages:
grading openness and detecting defects with a deep learning algorithm solves the core problem of automatic flower grading and replaces manual grading; automatic, high-precision grading of fresh flowers raises the throughput of flower farms, reduces labor costs, and safeguards the quality of flowers entering the market;
the classification results of the deep learning models are combined with the length measurement to produce an accurate grade and to flag defective products; the result is sent to the PLC according to the configured control commands, the mechanical part of the equipment is driven accordingly, and flowers are routed into the collection bins specified by the user.
Drawings
FIG. 1 is a block diagram of the image acquisition and processing system of the present invention;
FIG. 2 is a schematic diagram of the structure of an image acquired by the imaging system of the present invention;
In the figures: 1. camera; 2. light source; 3. fresh flowers; 4. photoelectric sensor.
Detailed Description
The invention will now be described in further detail with reference to the drawings and preferred embodiments. The drawings are simplified schematics showing only the basic structure of the invention and the components relevant to it.
The automatic flower grading method based on deep learning shown in figs. 1-2 comprises three steps: image acquisition, image preprocessing, and model training.
S1: image acquisition: capture top-view images of flowers on the production line with the imaging system;
S2: image preprocessing: mask the background area with a masking tool and label the characteristic regions of the image with a marking tool;
S3: model training: feed the preprocessed images into the GoogLeNet training pipeline and use the resulting deep learning models for openness grading, elbow detection, and flower-damage detection.
The image acquisition and processing system captures the top view of the flowers and measures stem length; it comprises a trigger photoelectric device, an encoder, an acquisition controller, a first assembly, a second assembly, an image processing server, and a PLC.
The first assembly is the imaging system: a camera and a light source mounted directly above the flowers to acquire digital top-view images. Each acquired image is histogram-equalized and then median-filtered; the top-view image is a 3-channel color image.
The second assembly is a grating-type length-measuring photoelectric sensor mounted beside the flowers to measure stem length.
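The histogram equalization and median filtering applied to the first assembly's images can be sketched in NumPy (single-channel, 8-bit version for illustration; a production system would more likely call a library such as OpenCV):

```python
import numpy as np

def equalize_hist(img):
    """Histogram equalization for an 8-bit single-channel image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]
    lut = np.round((cdf - cdf_min) / max(cdf[-1] - cdf_min, 1) * 255)
    return lut.astype(np.uint8)[img]

def median_filter3(img):
    """3x3 median filter; the one-pixel border is left unfiltered."""
    h, w = img.shape
    out = img.copy()
    windows = np.stack([img[i:i + h - 2, j:j + w - 2]
                        for i in range(3) for j in range(3)])
    out[1:-1, 1:-1] = np.median(windows, axis=0).astype(img.dtype)
    return out

# A low-contrast ramp spreads to the full 0-255 range after equalization,
# and an isolated bright (salt-noise) pixel is removed by the median filter.
ramp = np.arange(64, dtype=np.uint8).reshape(8, 8)
noisy = np.full((5, 5), 100, dtype=np.uint8)
noisy[2, 2] = 255
```

For the 3-channel color images the patent describes, the same operations would be applied per channel or on a luminance channel.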
The working flow is as follows.
Image acquisition: the encoder feeds the position of each flower on the production line to the acquisition controller, which converts it into line and trigger signals that fire the length-measuring device and the camera; the camera and the length-measuring device image and measure the same flower simultaneously and send the image and length data to the image processing server.
Image processing: the image processing server processes the flower image with the deep learning algorithm and combines the result with the measured length to reach a grading decision.
Decision handling: finally, the image processing server sends the decision to the PLC, which drives the mechanism that routes the flower into the corresponding collection bin.
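The routing decision can be pictured as a small rule combining the model outputs with the measured stem length; the thresholds and bin names below are invented for illustration, as the patent leaves them to the user's configuration:

```python
def assign_bin(openness_degree, defect, stem_length_mm):
    """Hypothetical routing rule combining the model outputs with stem length.

    `defect` is the combined verdict of the elbow and flower-damage models;
    the 400 mm cut-off and the bin names are made-up example values.
    """
    if defect != "normal":          # elbow or flower-damage samples are rejected
        return "reject"
    if stem_length_mm < 400:        # example stem-length cut-off
        return "short-stem"
    return f"grade-{openness_degree}"

print(assign_bin(2, "normal", 520))   # grade-2
```

In the patented system the returned bin label would be translated into the control command sent to the PLC.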
Separate models are built for the three grading indexes of fresh flowers, namely openness, elbow, and flower damage, to grade the flowers. A model data set is built from a region-of-interest sample set divided into a training set and a prediction set; the two sets together form the model data set. The sample set comprises flower images of different openness degrees, elbow flower images, and flower-damage images.
The openness model is built as follows.
S1: sample set: collect 100 images at each of openness degrees 1, 2, 3, and 4, 400 images in total; randomly split them 1:1 into a training set and a prediction set. The degree-1 to degree-4 sample set is denoted X.
S2: image preprocessing: label the flower-center regions of degrees 1-4 with marking tools of different classes; the degree-1 to degree-4 classes are denoted Y.
S3: train the deep learning model.
S3-1: take a sample (Xp, Yp) from the sample set and feed Xp into the network, where Xp is the p-th input sample and Yp is its ideal output;
S3-2: compute the corresponding actual output Op. Information passes from the input layer to the output layer through a layer-by-layer transformation; this is also the computation the trained network performs in normal operation:
Op=Fn(…(F2(F1(XpW(1))W(2))…)W(n))
where Fn is the computation of the n-th layer and W(n) is the weight matrix of the n-th layer;
S3-3: compute the difference between the actual output Op and the ideal output Yp;
S3-4: backpropagate the error and adjust the weight matrices so as to minimize it.
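The 1:1 random split of step S1 can be sketched as follows (the file names are hypothetical stand-ins for the 400 collected images):

```python
import random

# Hypothetical file names standing in for the 400 openness images;
# each tuple is (image, openness degree), i.e. an (X, Y) pair.
samples = [(f"flower_deg{deg}_{i:03d}.png", deg)
           for deg in (1, 2, 3, 4) for i in range(100)]

def split_1_to_1(pairs, seed=42):
    """Random 1:1 split into a training set and a prediction set."""
    shuffled = pairs[:]
    random.Random(seed).shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

train_set, predict_set = split_1_to_1(samples)
```

The same split applies to the elbow and flower-damage sample sets described below, with 200 and 400 images respectively.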
The elbow model is built as follows.
S1: sample set: collect 100 normal and 100 elbow images, 200 in total; randomly split them 1:1 into a training set and a prediction set. The normal and elbow sample set is denoted X′.
S2: image preprocessing: on elbow samples, mask the region where the bent flower head appears with the masking tool; on normal samples, apply the same mask and label the flower-center region with the marking tool. The normal and elbow classes are denoted Y′.
S3: train the deep learning model.
S3-1: take a sample (X′p, Y′p) from the sample set and feed X′p into the network, where X′p is the p-th input sample and Y′p is its ideal output;
S3-2: compute the corresponding actual output O′p. Information passes from the input layer to the output layer through a layer-by-layer transformation; this is also the computation the trained network performs in normal operation:
O′p=Fn(…(F2(F1(X′pW(1))W(2))…)W(n))
where Fn is the computation of the n-th layer and W(n) is the weight matrix of the n-th layer;
S3-3: compute the difference between the actual output O′p and the ideal output Y′p;
S3-4: backpropagate the error and adjust the weight matrices so as to minimize it.
The flower-damage model is built as follows.
S1: sample set: collect 100 images each of normal, plaque, stain, and crush-damage samples, 400 in total; randomly split them 1:1 into a training set and a prediction set. The normal, plaque, stain, and crush sample set is denoted X″.
S2: image preprocessing: normal samples are left unlabeled; on plaque, stain, and crush samples, label the damaged area with the marking tool. The normal, plaque, stain, and crush classes are denoted Y″.
S3: train the deep learning model.
S3-1: take a sample (X″p, Y″p) from the sample set and feed X″p into the network, where X″p is the p-th input sample and Y″p is its ideal output;
S3-2: compute the corresponding actual output O″p. Information passes from the input layer to the output layer through a layer-by-layer transformation; this is also the computation the trained network performs in normal operation:
O″p=Fn(…(F2(F1(X″pW(1))W(2))…)W(n))
where Fn is the computation of the n-th layer and W(n) is the weight matrix of the n-th layer;
S3-3: compute the difference between the actual output O″p and the ideal output Y″p;
S3-4: backpropagate the error and adjust the weight matrices so as to minimize it.
The deep learning model used by the openness, elbow, and flower-damage models is GoogLeNet.
The structure of the GoogLeNet network is as follows:
0. Input
The original input image is 224x224x3; all images are zero-centered in preprocessing (the per-pixel mean is subtracted from each image).
1. First layer (convolutional layer)
A 7x7 convolution (stride 2, padding 3) with 64 channels gives a 112x112x64 output, followed by a ReLU;
after 3x3 max pooling (stride 2) the output is ((112-3+1)/2)+1 = 56, i.e. 56x56x64, followed by a ReLU.
2. Second layer (convolutional layer)
A 3x3 convolution (stride 1, padding 1) with 192 channels gives a 56x56x192 output, followed by a ReLU;
after 3x3 max pooling (stride 2) the output is ((56-3+1)/2)+1 = 28, i.e. 28x28x192, followed by a ReLU.
3a. Third layer (inception 3a module)
The input splits into four branches processed with convolution kernels of different scales:
(1) 64 1x1 convolution kernels, then ReLU, output 28x28x64;
(2) 96 1x1 convolution kernels as a dimensionality reduction before the 3x3 convolution, giving 28x28x96; after a ReLU, 128 3x3 convolutions (padding 1) output 28x28x128;
(3) 16 1x1 convolution kernels as a dimensionality reduction before the 5x5 convolution, giving 28x28x16; after a ReLU, 32 5x5 convolutions (padding 2) output 28x28x32;
(4) a pooling branch with a 3x3 kernel (padding 1) outputs 28x28x192, followed by 32 1x1 convolutions, output 28x28x32.
The four results are concatenated along the channel dimension: 64+128+32+32 = 256, so the final output is 28x28x256.
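Assuming a PyTorch implementation (the patent names no framework), the inception 3a module just described can be sketched as:

```python
import torch
import torch.nn as nn

class Inception3a(nn.Module):
    """Sketch of the inception 3a module described above; channel counts
    follow the text, so the concatenated output has 64+128+32+32 = 256."""
    def __init__(self, in_ch=192):
        super().__init__()
        self.b1 = nn.Sequential(nn.Conv2d(in_ch, 64, 1), nn.ReLU(inplace=True))
        self.b2 = nn.Sequential(
            nn.Conv2d(in_ch, 96, 1), nn.ReLU(inplace=True),   # 1x1 reduction
            nn.Conv2d(96, 128, 3, padding=1), nn.ReLU(inplace=True))
        self.b3 = nn.Sequential(
            nn.Conv2d(in_ch, 16, 1), nn.ReLU(inplace=True),   # 1x1 reduction
            nn.Conv2d(16, 32, 5, padding=2), nn.ReLU(inplace=True))
        self.b4 = nn.Sequential(
            nn.MaxPool2d(3, stride=1, padding=1),             # keeps 28x28
            nn.Conv2d(in_ch, 32, 1), nn.ReLU(inplace=True))

    def forward(self, x):
        # Concatenate the four branches on the channel dimension.
        return torch.cat([self.b1(x), self.b2(x), self.b3(x), self.b4(x)], dim=1)

x = torch.randn(1, 192, 28, 28)       # output of the second layer
y = Inception3a()(x)
print(tuple(y.shape))                 # (1, 256, 28, 28)
```

The 3b module below differs only in its channel counts.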
3b. Third layer (inception 3b module)
(1) 128 1x1 convolution kernels, then ReLU, output 28x28x128;
(2) 128 1x1 convolution kernels as a dimensionality reduction before the 3x3 convolution, giving 28x28x128; after a ReLU, 192 3x3 convolutions (padding 1) output 28x28x192;
(3) 32 1x1 convolution kernels as a dimensionality reduction before the 5x5 convolution, giving 28x28x32; after a ReLU, 96 5x5 convolutions (padding 2) output 28x28x96;
(4) a pooling branch with a 3x3 kernel (padding 1) outputs 28x28x256, followed by 64 1x1 convolutions, output 28x28x64.
The four results are concatenated along the channel dimension: 128+192+96+64 = 480, so the final output is 28x28x480.
fourth layer (4a,4b,4c,4d,4e), fifth layer (5a,5b) … …, similar to 3a, 3b, are not repeated here.
While particular embodiments of the present invention have been described in the foregoing specification, various modifications and alterations to the previously described embodiments will become apparent to those skilled in the art from this description without departing from the spirit and scope of the invention.

Claims (5)

1. An automatic flower grading method based on deep learning, characterized by comprising the following steps:
S1, image acquisition: capture top-view images of flowers on the production line with an image acquisition and processing system;
S2, image preprocessing: mask the background region with a masking tool and label the characteristic regions of the image with a marking tool;
S3, model building: feed the preprocessed images into a GoogLeNet training pipeline and use the resulting deep learning models for openness grading, flower-damage detection, and elbow detection.
2. The deep-learning-based automatic flower grading method of claim 1, characterized in that: the image acquisition and processing system in step S1 comprises an image processing server, an acquisition controller, an encoder, a trigger photoelectric sensor, a length-measuring photoelectric sensor, a camera assembly, and a PLC; the camera assembly comprises a camera and a light source mounted directly above the flowers to capture top-view images, the top-view image being a 3-channel color image; the length-measuring photoelectric sensor is a grating-type photoelectric sensor mounted beside the flowers to measure stem length.
3. The deep-learning-based automatic flower grading method of claim 2, characterized in that: the encoder and the trigger photoelectric sensor feed the position of each flower on the production line to the acquisition controller, which converts the position into line and trigger signals that fire the length-measuring photoelectric sensor and the camera; the camera and the length-measuring sensor image and measure the same flower simultaneously and send the image and length data to the image processing server; the image processing server processes the flower image with the deep learning algorithm, combines the result with the measured length to reach a grading decision, and sends the decision to the PLC to control the mechanism that routes each flower into the corresponding collection bin.
4. The deep-learning-based automatic flower grading method of claim 1, characterized in that: in step S2, a model data set is built from a region-of-interest sample set, which is divided into a training set and a prediction set that together form the model data set; the sample set comprises flower images of different openness degrees, elbow flower images, and flower-damage images.
5. The deep-learning-based automatic flower grading method of claim 1, characterized in that: in step S3, the training process using the GoogLeNet model is as follows:
A. take a sample (Xp, Yp) from the sample set and feed Xp into the network, where Xp is the p-th input sample and Yp is its ideal output;
B. compute the corresponding actual output Op; information passes from the input layer to the output layer through a layer-by-layer transformation, the same computation the trained network performs in normal operation:
Op=Fn(…(F2(F1(XpW(1))W(2))…)W(n))
where Fn is the computation of the n-th layer and W(n) is the weight matrix of the n-th layer;
C. compute the difference between the actual output Op and the ideal output Yp;
D. backpropagate the error and adjust the weight matrices so as to minimize it.
Application CN202010722613.1A, filed 2020-07-24 (priority date 2020-07-24), published as CN111784688A; status: pending.

Priority Applications (1)

CN202010722613.1A, priority and filing date 2020-07-24: Flower automatic grading method based on deep learning (CN111784688A)


Publications (1)

Publication Number Publication Date
CN111784688A, published 2020-10-16

Family

ID=72764089

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010722613.1A Pending CN111784688A (en) 2020-07-24 2020-07-24 Flower automatic grading method based on deep learning

Country Status (1)

Country Link
CN (1) CN111784688A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101592969A (en) * 2008-05-30 2009-12-02 深圳市能联电子有限公司 Tracking control method and tracking control device for solar power generation
CN107945182A (en) * 2018-01-02 2018-04-20 东北农业大学 Maize leaf disease recognition method based on convolutional neural networks model GoogleNet
CN209985822U (en) * 2019-01-28 2020-01-24 云南安视智能设备有限公司 Fresh cut flower shape detection device
CN111428990A (en) * 2020-03-20 2020-07-17 浙江大学城市学院 Deep neural network-based method for evaluating flower grade of water-cultured flowers in flowering period


Non-Patent Citations (2)

Title
CHRISTIAN SZEGEDY et al.: "Going Deeper with Convolutions", CVPR 2015 *
JIANG Daiqing: "Research on Detection and Grading Indexes of Cut Rose Flowers Based on Image Recognition", China Masters' Theses Full-text Database, Agricultural Science and Technology series *

Cited By (7)

Publication number Priority date Publication date Assignee Title
CN113361642A (en) * 2021-07-02 2021-09-07 柒久园艺科技(北京)有限公司 Fresh cut flower grading method, device and medium
CN113361642B (en) * 2021-07-02 2024-03-19 柒久园艺科技(北京)有限公司 Fresh cut flower grading method, device and medium
CN115158736A (en) * 2022-07-20 2022-10-11 征图新视(江苏)科技股份有限公司 Flower automatic packing machine
CN115158736B (en) * 2022-07-20 2024-04-12 征图新视(江苏)科技股份有限公司 Automatic baling press of fresh flowers
CN115350934A (en) * 2022-10-21 2022-11-18 昆明泰仕达科技有限公司 Automatic flower sorting equipment with flower branches
CN117094997A (en) * 2023-10-18 2023-11-21 深圳市睿阳精视科技有限公司 Flower opening degree detection and evaluation method
CN117094997B (en) * 2023-10-18 2024-02-02 深圳市睿阳精视科技有限公司 Flower opening degree detection and evaluation method

Similar Documents

Publication Publication Date Title
CN111784688A (en) Flower automatic grading method based on deep learning
WO2022236876A1 (en) Cellophane defect recognition method, system and apparatus, and storage medium
CN109840900B (en) A fault online detection system and detection method applied to intelligent manufacturing workshops
CN109785337B (en) A method of counting mammals in pen based on instance segmentation algorithm
CN107804514B (en) Toothbrush sorting method based on image recognition
CN110473173A (en) A kind of defect inspection method based on deep learning semantic segmentation
CN107657603A (en) A kind of industrial appearance detecting method based on intelligent vision
CN110806736A (en) Method for detecting quality information of forge pieces of die forging forming intelligent manufacturing production line
CN107229930A (en) A kind of pointer instrument numerical value intelligent identification Method and device
CN113487533B (en) A digital inspection system and method for parts assembly quality based on machine learning
CN108711148A (en) A kind of wheel tyre defect intelligent detecting method based on deep learning
CN105574897A (en) Crop growth situation monitoring Internet of Things system based on visual inspection
CN112497219B (en) Columnar workpiece classifying and positioning method based on target detection and machine vision
CN115170572A (en) BOPP composite film surface gluing quality monitoring method
CN114596243A (en) Defect detection method, apparatus, device, and computer-readable storage medium
CN111487192A (en) Machine vision surface defect detection device and method based on artificial intelligence
CN110826552A (en) Grape non-destructive automatic detection device and method based on deep learning
CN117314829A (en) Industrial part quality inspection method and system based on computer vision
CN110935646A (en) Full-automatic crab grading system based on image recognition
CN115880209A (en) Surface defect detection system, method, device and medium suitable for steel plate
CN119303860A (en) An intelligent picking method for defective red dates
CN114662594B (en) Target feature recognition analysis system
Janardhana et al. Computer aided inspection system for food products using machine vision—A review
CN119048427A (en) Sagging defect detection method and system for surface spraying of aviation workpiece
CN109063738B (en) Automatic online detection method for compressed sensing ceramic water valve plate

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 2020-10-16)