
CN119295564B - Plant overall measurement calibration method, system and device based on computer vision - Google Patents


Info

Publication number
CN119295564B
CN119295564B
Authority
CN
China
Prior art keywords
calibration
scale
identifier
image
relation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202411850171.3A
Other languages
Chinese (zh)
Other versions
CN119295564A (en)
Inventor
陈渝阳
陈曦
王闯
徐宏利
吕士平
谢朝明
孙哲
刘荣利
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Top Cloud Agri Technology Co ltd
Original Assignee
Zhejiang Top Cloud Agri Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Top Cloud Agri Technology Co ltd filed Critical Zhejiang Top Cloud Agri Technology Co ltd
Priority to CN202411850171.3A priority Critical patent/CN119295564B/en
Publication of CN119295564A publication Critical patent/CN119295564A/en
Application granted granted Critical
Publication of CN119295564B publication Critical patent/CN119295564B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/66 Analysis of geometric attributes of image moments or centre of gravity
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Analysis (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Mathematics (AREA)
  • Geometry (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract


The present invention discloses a computer-vision-based plant overall measurement calibration method, system and device. The method comprises: obtaining the relevant parameters of a calibration plate and the positions of its identifiers; obtaining a side calibration image of the plate captured by a first image acquisition device; obtaining top-surface calibration images of the plate, placed at a number of calibration positions, captured by a second image acquisition device; preprocessing the side and top-surface calibration images respectively to obtain the original identifier positions, determining the identifier positions in the standard state from the actual physical distance ratios of the identifiers, and thereby obtaining a first transformation matrix, a first scale relation, a second transformation matrix and a second scale relation; and performing phenotype measurement on the object to be measured based on the first transformation matrix and the current second scale relation to obtain the actual measurement information of the object. The invention realizes two-dimensional overall measurement of an object and is suitable for fast, accurate and efficient measurement of two-dimensional phenotypes in small and medium-sized application scenarios.

Description

Plant overall measurement calibration method, system and device based on computer vision
Technical Field
The invention relates to the technical field of image processing, in particular to a plant overall measurement calibration method, system and device based on computer vision.
Background
Object phenotype measurement applications mainly include object height measurement, organ phenotype measurement, and canopy coverage measurement. These applications share several limitations: the camera is far from the object while the object depth is small; the crown at the top of the object is flat and dense; organs must be cut and laid flat for organ phenotype measurement; and canopy coverage can only be computed as a proportion, so no physical measurement is obtained.
In object height measurement, corn images have been acquired at close range with an RGB-D camera and single-plant corn height obtained by image analysis; cucumber seedling height in a greenhouse has been measured with a method combining an RGB-D imaging device and a calibration-object reference, where reference-object setting and point-cloud analysis gave an average measurement error of 7.6% in practice; and laser-vision plant height measurement systems project laser lines emitted by a laser onto the agricultural object. The existing main scheme for measuring object canopy coverage is semi-automatic: the analysis area must be delineated manually before the object area is measured and statistically analyzed to obtain the coverage information, which is highly subjective and inefficient.
Disclosure of Invention
A plant overall measurement calibration method based on computer vision comprises the following steps:
acquiring the relevant parameters of a calibration plate and the positions of its identifiers, wherein the calibration plate is rectangular with at least four preset identifiers whose center positions lie at the four corners of the plate and form a rectangle, the relevant parameters including the physical information of the plate;
acquiring a side calibration image captured by a first image acquisition device of the side of the calibration plate, the first image acquisition device being located to the side of the plate and aligned with its center;
acquiring top-surface calibration images captured by a second image acquisition device of the top surface of the calibration plate placed at a number of calibration positions, the plate being set at several preset heights and the second image acquisition device being located above the plate and aligned with its center;
preprocessing the side and top-surface calibration images respectively to obtain the original identifier positions, and determining the identifier positions in the standard state from the actual physical distance ratios of the identifiers, thereby obtaining a first transformation matrix, a first scale relation, a second transformation matrix and a second scale relation;
constructing the current second scale relation from the plurality of calibration-plate heights and the corresponding second scale relations;
and performing phenotype measurement on the object to be measured based on the first transformation matrix and the current second scale relation to obtain the actual measurement information of the object.
As an implementation manner, the preprocessing is performed on the side calibration image and the top calibration image respectively to obtain the original identifier position, which includes the following steps:
converting the side calibration image and the top-surface calibration image to grayscale to obtain a side calibration grayscale image and a top-surface calibration grayscale image;
obtaining the average gray value of the side image and the average gray value of the top image from the two calibration grayscale images;
thresholding the side grayscale image with the side average gray value, and the top grayscale image with the top average gray value, to obtain a side binary image and a top binary image;
performing identifier feature screening on the side and top binary images to obtain identifier regions, and fitting each identifier region to obtain its circumscribed circle, the identifiers being circular;
computing the region circularity from the area of the identifier region and the area of the circumscribed circle, and analyzing the identifier regions obtained by feature screening based on the circularity and the region area to obtain the original identifier positions;
the region circularity is expressed as follows:
C = S_region / S_circle
where C denotes the region circularity, S_region the area of the identifier region, and S_circle the area of the circumscribed circle.
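The mean-gray thresholding and circularity screening described above can be sketched in a few lines. This is a minimal numpy-only illustration; the function names and implementation details are assumptions, not the patent's actual code:

```python
import numpy as np

def binarize_by_mean(gray):
    # Threshold a grayscale image at its own average gray value;
    # dark circular identifiers on a bright background map to 1.
    return (gray < gray.mean()).astype(np.uint8)

def circularity(region_area, circumcircle_radius):
    # Region circularity: area of the identifier region divided by the
    # area of its fitted circumscribed circle (1.0 for a perfect disc).
    return region_area / (np.pi * circumcircle_radius ** 2)
```

A candidate region whose circularity falls below a chosen cutoff, or whose area is implausibly small or large, would be rejected during identifier screening.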
As an implementation manner, the physical information of the calibration plate includes the height of the calibration plate and the width of the calibration plate, and the actual physical distance proportional relationship of the identifier is the proportional relationship of the height of the calibration plate and the width of the calibration plate.
As an embodiment, the first transformation matrix, the first scale relationship, the second transformation matrix, and the second scale relationship are obtained by:
based on the original identifier positions and the identifier positions in the standard state, the pixel width between the centers of the two identifiers in the same row and the pixel height between the centers of the two identifiers in the same column in the corrected state are obtained, expressed as follows:
pixelWidth = |x(C2') - x(C1)|, pixelHeight = |y(C3') - y(C1)|
fitting the positional relation between the original identifier positions and the standard-state identifier positions gives the original-to-standard fitting relation, expressed as follows:
P_std = M * P_orig
and the first or second transformation matrix and the first or second scale relation are obtained from the original identifier positions, the standard-state identifier positions and the original-to-standard fitting relation, expressed as follows:
scale = (W / pixelWidth + H / pixelHeight) / 2
where pixelWidth denotes the pixel width between the corrected identifier centers in the same row, pixelHeight the pixel height between the corrected identifier centers in the same column, P_std the set of identifier position points in the side or top standard state, P_orig the set of side or top original identifier position points, M the first or second transformation matrix, scale the first or second scale relation, W the width of the calibration plate, and H the height of the calibration plate.
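A minimal sketch of the two computations in this embodiment, assuming the transformation is estimated by least squares over the four identifier-center correspondences; the helper names are hypothetical:

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    # Least-squares affine transform mapping original identifier centers
    # (src_pts) onto their standard-state positions (dst_pts).
    # Points are (N, 2) arrays; returns a 2x3 matrix M with dst ~= M @ [x, y, 1].
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    A = np.hstack([src, np.ones((len(src), 1))])  # homogeneous coordinates, (N, 3)
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)   # (3, 2) solution
    return M.T                                    # (2, 3)

def plate_scale(width_mm, height_mm, pixel_width, pixel_height):
    # Average mm-per-pixel scale from the plate's physical size and the
    # identifier-center pixel distances in the corrected image.
    return (width_mm / pixel_width + height_mm / pixel_height) / 2.0
```

With four exact correspondences the least-squares solution reproduces the affine map exactly; with noisy detections it yields the best-fit matrix.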
As an implementation manner, constructing the current second scale relation from the plurality of calibration-plate heights and the corresponding second scale relations includes the following steps:
fitting the calibration-plate heights against the corresponding second scale relations to obtain the fitting relation, expressed as follows:
scale_i = k * h_i + b, i = 1, ..., n
obtaining the current second scale relation from the fitting relation, expressed as follows:
scale_cur = k * h + b
where scale_i denotes the second scale relation at the i-th calibration height, h_i the height of the calibration plate at the i-th position, n the number of calibration plates (positions), h the height at which the scale is evaluated, and k and b the fitted parameters.
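The height-to-scale fit above is an ordinary unary linear regression. The numeric heights and scales below are assumed for illustration only:

```python
import numpy as np

# Calibration heights (mm) and the second scale relation measured at each.
# These values are illustrative, not from the patent.
heights = np.array([100.0, 200.0, 300.0])
scales = np.array([0.50, 0.45, 0.40])  # mm/pixel at each calibration height

k, b = np.polyfit(heights, scales, 1)  # unary linear fit: scale = k*h + b

def current_scale(object_height_mm):
    # Second scale relation predicted for an object of the given height.
    return k * object_height_mm + b
```

Given the actual object height measured by the side camera, `current_scale` returns the top-camera scale to use for that object.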
As an implementation manner, before the actual measurement information of the object to be measured is obtained, the method further includes the following steps:
according to the measured object height, the current second scale relation of the object is obtained by combining the second transformation matrix and the scale fitting relation, expressed as follows:
H_act = h_meas * scale_side, scale_obj = k * H_act + b
where scale_side denotes the side scale, scale_obj the current second scale relation of the object, H_act the actual height of the object, and k and b the fitted parameters.
As an implementation manner, the actual measurement information of the object to be measured is obtained through the following steps:
processing the side image of the object to be measured based on the first transformation matrix to obtain the object side phenotype information, which at least includes the measured object height; converting the measured height with the side scale to obtain the actual object height, expressed as follows:
H_act = h_meas * scale_side
performing scale conversion of the object top-surface phenotype information with the current second scale relation of the object to obtain the actual top-surface phenotype information, which at least includes the actual object width, the actual object spread and the actual object area index, expressed as follows:
W_act = w_meas * scale_obj, S_act = s_meas * scale_obj, A_act = a_meas * scale_obj^2
where H_act denotes the actual object height, h_meas the measured object height, scale_side the side scale, scale_obj the current second scale relation of the object, w_meas the measured object width, W_act the actual object width, s_meas the measured object spread, S_act the actual object spread, a_meas the measured object area index, and A_act the actual object area index.
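The conversions above amount to multiplying each pixel measurement by the appropriate scale. The dictionary layout and names below are illustrative, and the squared scale for the area index is an assumption that follows from the mm-per-pixel units:

```python
def to_actual(measurements_px, side_scale, top_scale):
    # Convert pixel-space phenotype measurements to physical units.
    # Lengths scale linearly with the mm/pixel scale; the area index is
    # assumed to scale with the square of the scale (pixels^2 -> mm^2).
    return {
        "height_mm": measurements_px["height"] * side_scale,
        "width_mm": measurements_px["width"] * top_scale,
        "spread_mm": measurements_px["spread"] * top_scale,
        "area_mm2": measurements_px["area"] * top_scale ** 2,
    }
```

The side scale converts the height from the side camera, while the object's current second scale relation converts the top-surface width, spread and area index.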
A plant overall measurement calibration system based on computer vision comprises an image acquisition module, a preprocessing and analysis module, a scale fitting module and a phenotype information calculation module;
The image acquisition module is used for acquiring the relevant parameters of the calibration plate and the positions of its identifiers, wherein the calibration plate is rectangular with at least four preset identifiers whose center positions lie at the four corners of the plate and form a rectangle, the relevant parameters including the physical information of the plate; acquiring a side calibration image captured by a first image acquisition device, which is located to the side of the plate and aligned with its center; and acquiring top-surface calibration images captured by a second image acquisition device of the top surface of the calibration plate placed at a number of calibration positions, the plate being set at several preset heights and the second image acquisition device being located above the plate and aligned with its center;
the preprocessing and analysis module is used for preprocessing the side and top-surface calibration images respectively to obtain the original identifier positions, and determining the identifier positions in the standard state from the actual physical distance ratios of the identifiers, thereby obtaining a first transformation matrix, a first scale relation, a second transformation matrix and a second scale relation;
the scale fitting module constructs the current second scale relation from the plurality of calibration-plate heights and the corresponding second scale relations;
the phenotype information calculation module performs phenotype measurement on the object to be measured based on the first transformation matrix and the current second scale relation to obtain the actual measurement information of the object.
A computer readable storage medium storing a computer program which, when executed by a processor, performs the following method:
acquiring the relevant parameters of a calibration plate and the positions of its identifiers, wherein the calibration plate is rectangular with at least four preset identifiers whose center positions lie at the four corners of the plate and form a rectangle, the relevant parameters including the physical information of the plate;
acquiring a side calibration image captured by a first image acquisition device of the side of the calibration plate, the first image acquisition device being located to the side of the plate and aligned with its center;
acquiring top-surface calibration images captured by a second image acquisition device of the top surface of the calibration plate placed at a number of calibration positions, the plate being set at several preset heights and the second image acquisition device being located above the plate and aligned with its center;
preprocessing the side and top-surface calibration images respectively to obtain the original identifier positions, and determining the identifier positions in the standard state from the actual physical distance ratios of the identifiers, thereby obtaining a first transformation matrix, a first scale relation, a second transformation matrix and a second scale relation;
constructing the current second scale relation from the plurality of calibration-plate heights and the corresponding second scale relations;
and performing phenotype measurement on the object to be measured based on the first transformation matrix and the current second scale relation to obtain the actual measurement information of the object.
A plant overall measurement calibration device based on computer vision, comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, the processor executing the computer program to implement the following method:
acquiring the relevant parameters of a calibration plate and the positions of its identifiers, wherein the calibration plate is rectangular with at least four preset identifiers whose center positions lie at the four corners of the plate and form a rectangle, the relevant parameters including the physical information of the plate;
acquiring a side calibration image captured by a first image acquisition device of the side of the calibration plate, the first image acquisition device being located to the side of the plate and aligned with its center;
acquiring top-surface calibration images captured by a second image acquisition device of the top surface of the calibration plate placed at a number of calibration positions, the plate being set at several preset heights and the second image acquisition device being located above the plate and aligned with its center;
preprocessing the side and top-surface calibration images respectively to obtain the original identifier positions, and determining the identifier positions in the standard state from the actual physical distance ratios of the identifiers, thereby obtaining a first transformation matrix, a first scale relation, a second transformation matrix and a second scale relation;
constructing the current second scale relation from the plurality of calibration-plate heights and the corresponding second scale relations;
and performing phenotype measurement on the object to be measured based on the first transformation matrix and the current second scale relation to obtain the actual measurement information of the object.
The invention has the remarkable technical effects due to the adoption of the technical scheme:
The calibration method for two-dimensional overall measurement can flexibly adjust the scale and shape characteristics of the calibration object. It extracts the calibration object with a color-based segmentation algorithm and a shape-based feature selection algorithm, and locates the object and computes the scale from its physical characteristics. During actual measurement, the scale corresponding to the top camera is obtained from the fitted relation between the calibration-object height sequence and the corresponding scale sequence, combined with the actual height of the object, so the scale computation adapts to top-surface phenotype analysis of objects of different heights and can serve two-dimensional overall measurement. By processing and analyzing images of the side and top surfaces of the object, the calibration scheme and scale computation method automate and simplify two-dimensional overall measurement and improve the universality, robustness and operational convenience of the application. Because the scale transformation relation adapts to objects of different heights, the system needs to be calibrated only once for its whole service life as long as the system structure is unchanged, providing a reference basis for two-dimensional overall measurement analysis. The method is suitable for fast, accurate and efficient measurement of two-dimensional phenotypes of objects in small and medium-sized application scenarios.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the invention, and other drawings can be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a schematic overall flow diagram of the method of the present invention;
FIG. 2 is a schematic diagram of the overall structure of the system of the present invention;
FIG. 3 is a schematic diagram of a two-dimensional phenotype platform architecture of the present invention;
FIG. 4 is an exemplary diagram of a calibration plate;
FIGS. 5-6 are exemplary diagrams of side camera and top camera calibration image acquisition;
FIG. 7 is an exemplary view of calibration plate image acquisition;
The reference numbers in the drawings denote: 101, top RGB camera; 102, side RGB camera; 103, box bottom platform; 104, object; 201, computer; 202, cloud platform; 100, image acquisition module; 200, preprocessing and analysis module; 300, scale fitting module; 400, phenotype information calculation module.
Detailed Description
The present invention will be described in further detail with reference to the following examples, which are illustrative of the present invention and are not intended to limit the present invention thereto.
Example 1:
A plant overall measurement calibration method based on computer vision, as shown in figure 1, comprises the following steps:
S100, acquiring the relevant parameters of a calibration plate and the positions of its identifiers, wherein the calibration plate is rectangular with at least four preset identifiers whose center positions lie at the four corners of the plate and form a rectangle, the relevant parameters including the physical information of the plate;
S200, acquiring a side calibration image captured by a first image acquisition device of the side of the calibration plate, the first image acquisition device being located to the side of the plate and aligned with its center;
S300, acquiring top-surface calibration images captured by a second image acquisition device of the top surface of the calibration plate placed at a number of calibration positions, the plate being set at several preset heights and the second image acquisition device being located above the plate and aligned with its center;
S400, preprocessing the side and top-surface calibration images respectively to obtain the original identifier positions, and determining the identifier positions in the standard state from the actual physical distance ratios of the identifiers, thereby obtaining a first transformation matrix, a first scale relation, a second transformation matrix and a second scale relation;
S500, constructing the current second scale relation from the plurality of calibration-plate heights and the corresponding second scale relations;
S600, performing phenotype measurement on the object to be measured based on the first transformation matrix and the current second scale relation to obtain the actual measurement information of the object.
In one embodiment, the calibration plate is shown schematically in FIG. 7. The plate is made larger than the object actually measured, to ensure the accuracy of the scale conversion, and the contrast between the plate color and the background color is kept high to ease segmentation. In this embodiment the plate is a bright-background rectangle carrying 4 dark circular identifiers that form a rectangle; the physical diameter of each circular identifier is R, the physical distance between the centers of the two identifiers in the same row is Width, and the physical distance between the centers of the two identifiers in the same column is Height. Let the identifier center coordinates on the plate be C1, C2, C3 and C4, where C1 is the upper-left, C2 the upper-right, C3 the lower-left and C4 the lower-right center. The identifier centers in the initial standard state are fitted from this distribution, i.e. a rectangle with C1 and C4 as diagonal vertices is fitted, giving the standard-state upper-right center C2' and lower-left center C3'. A transformation matrix transformMatrix is obtained from [C1, C2, C3, C4] and [C1, C2', C3', C4]; the pixel distances between the centers are pixelWidth and pixelHeight; the original image src is affine-transformed to give the corrected result image dest; and the initial scale is scaleSrc = (Width/pixelWidth + Height/pixelHeight)/2.0, in mm/pixel.
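The rectangle fitting and initial-scale computation of this embodiment can be sketched directly from C1 and C4 as diagonal vertices (image coordinates, x to the right and y downward; the function names are illustrative, while transformMatrix/scaleSrc follow the embodiment's notation):

```python
def standard_corners(c1, c4):
    # Fit the standard-state rectangle with C1 (upper-left) and C4
    # (lower-right) as diagonal vertices: C2' takes C4's column and C1's
    # row, C3' takes C1's column and C4's row.
    c2p = (c4[0], c1[1])  # corrected upper-right center C2'
    c3p = (c1[0], c4[1])  # corrected lower-left center C3'
    return c2p, c3p

def initial_scale(width_mm, height_mm, c1, c4):
    # scaleSrc = (Width/pixelWidth + Height/pixelHeight) / 2, in mm/pixel.
    pixel_width = abs(c4[0] - c1[0])
    pixel_height = abs(c4[1] - c1[1])
    return (width_mm / pixel_width + height_mm / pixel_height) / 2.0
```

The transformation matrix transformMatrix would then be estimated from the correspondence [C1, C2, C3, C4] to [C1, C2', C3', C4] and applied to src to produce the corrected image dest.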
In addition, several calibration positions are set in the top-surface calibration process, and the scale corresponding to the top camera is computed from the actual object height via the fitted relation between position information and scale, reducing the top-camera measurement error caused by uncertainty in the object height. In this embodiment, 3 calibration positions are set at 100 mm, 200 mm and 300 mm from the bottom surface, as shown in FIG. 6. The corresponding transformation matrices and scales are computed by the calibration algorithm as verticalTransformMatrix1, verticalTransformMatrix2, verticalTransformMatrix3 and verticalScaleSrc1, verticalScaleSrc2, verticalScaleSrc3. Taking the physical heights of the calibration object, heights = [100, 200, 300], as the independent variable and the scales, verticalScaleSrcs = [verticalScaleSrc1, verticalScaleSrc2, verticalScaleSrc3], as the dependent variable yields a unary linear equation; with the actual object height measured by the side camera as input, the corresponding top-view camera scale at the current object height can be computed. The whole scheme is low-cost and easy to verify and extend, and the size of the calibration object and the choice of calibration positions can be adjusted according to actual needs.
In one embodiment, the acquisition and analysis are implemented by the apparatus of FIG. 3, which comprises a top RGB camera 101, a side RGB camera 102, a box bottom platform 103, an object 104, a computer 201 and a cloud platform 202. The calibration plate is shown schematically in FIG. 7. For an actual measurement of the object to be measured, the side calibration image and the top calibration image sequence are first acquired and the corresponding transformation matrices and initial scales computed, and the relation formula is fitted from the plate height sequence and the corresponding scales obtained during top-camera calibration. The side and top images of the object are then acquired. The side image is corrected and analyzed first to obtain the side phenotype parameters, which include the actual height information; the current top-camera scale of the object is obtained from the relation between the object height and the top-camera scale; and the top-camera image of the object is corrected, analyzed and computed with this scale to obtain the top phenotype parameters. In this way the calibration object can be accurately located, the scale computed, and the overall measurement of the side and top surfaces of the object realized.
In one embodiment, the preprocessing is performed on the side calibration image and the top calibration image to obtain the original identifier position, and the method includes the following steps:
Respectively carrying out gray scale treatment on the side surface calibration image and the top surface calibration image to obtain a side surface calibration gray scale image and a top surface calibration gray scale image;
Obtaining an average gray value of the side image and an average gray value of the top image based on the side calibration grayscale image and the top calibration grayscale image;
Performing threshold segmentation on the side grayscale image using the side average gray value, and on the top grayscale image using the top average gray value, to obtain a side binary image and a top binary image;
performing feature screening of identifiers on the side binary images and the top binary images to obtain identifier areas, and fitting the identifier areas to obtain area circumscribing circles, wherein the identifiers are circular identifiers;
Obtaining the regional circularity based on the area of the identifier region and the area of the region circumscribing circle, and analyzing the identifier region obtained by screening the identifier features based on the regional circularity and the area of the identifier region to obtain the original identifier position;
The region circularity is expressed as follows:

circlePor = area / area'

wherein circlePor denotes the region circularity, area denotes the area of the identifier region, and area' denotes the area of the region circumscribed circle.
In one embodiment, the first transformation matrix, the first scale relationship, the second transformation matrix, and the second scale relationship are obtained by:
Based on the original identifier positions and the identifier positions in the standard state, the pixel width between the centers of two identifiers in the same row and the pixel height between the centers of two identifiers in the same column in the corrected state are respectively obtained, expressed as follows:

pixelWidth = max(pixelDis1, pixelDis2)

pixelHeight = min(pixelDis1, pixelDis2)
Fitting is carried out through the position relation between the original identifier positions and the identifier positions in the standard state to obtain the original-standard fitting relation, expressed as follows:

dstPoints = transformMatrix · srcPoints
The first transformation matrix or second transformation matrix and the first scale relation or second scale relation are then obtained through the original identifier positions, the identifier positions in the standard state and the original-standard fitting relation, expressed as follows:

scale = (Width / pixelWidth + Height / pixelHeight) / 2.0

wherein pixelWidth denotes the pixel width between the centers of the corrected identifiers in the same row, pixelDis1 denotes the pixel distance between standard identifiers in the same row, pixelDis2 denotes the pixel distance between the upper-right and lower-right corner coordinates of the standard identifiers, pixelHeight denotes the pixel height between the centers of the corrected identifiers in the same column, dstPoints denotes the set of identifier position points in the side standard state or the set of identifier position points in the top standard state, srcPoints denotes the set of side original identifier position points or the set of top original identifier position points, transformMatrix denotes the first transformation matrix or the second transformation matrix, scale denotes the first scale relation or the second scale relation, Width denotes the width of the calibration plate, and Height denotes the height of the calibration plate.
This process can be understood as calibrating the side camera and the top camera. As shown in fig. 5, during side camera calibration a calibration plate is placed with its plane facing the side camera, located at the center of the position where the object to be measured is placed, and a side calibration image1 is acquired. The larger the calibration plate is relative to the actual object size, the higher the phenotype measurement accuracy, provided the calibration plate stays within the effective imaging size range of the camera.
Therefore, the calibration parameters need to be calculated according to the following principle:
The diameter of each calibration plate identifier is R, and the physical distances between the circle centers in the width and height directions are Width and Height respectively, in mm;
the algorithm process is as follows:
inputting side calibration image1, physical information Width and Height of the calibration plate;
the algorithm flow is as follows:
Step one: perform image processing on the calibration image to obtain the identifier center point positions C1, C2, C3 and C4, where the identifier center points (circle centers) correspond to the upper-left, upper-right, lower-left and lower-right corners of the calibration plate respectively. The identifier detection algorithm for the calibration plate is as follows:
The calibration plate identifiers have high contrast, so side calibration image1 is grayed to obtain image2, the average gray value aveGray of image2 is calculated, and image2 is threshold-segmented with aveGray/2.0 as the threshold to obtain a binary image3 containing only regions darker than aveGray/2.0. Circularity-based feature screening is then performed on image3: for a single region with actual area area, a minimum circumscribed circle is fitted to obtain the circumscribed circle area area' and the circle center position, and the region circularity is circlePor = area/area'. A region is retained only when circlePor > thre1, area < thre2 and area > thre3. Finally, the identifier position coordinates C1, C2, C3 and C4 are obtained;
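The circularity screening above can be sketched in NumPy as follows. This is a minimal illustration, not the patent's implementation: the minimum circumscribed circle is approximated via the maximum distance from the region centroid, which is adequate for the near-circular blobs being screened.

```python
import numpy as np

def region_circularity(points):
    # Circularity of a pixel region: area / area of the enclosing circle.
    # The enclosing-circle radius is approximated as the max distance from
    # the centroid plus half a pixel (reasonable for near-circular blobs).
    pts = np.asarray(points, dtype=float)
    area = len(pts)  # pixel count as the region area
    center = pts.mean(axis=0)
    radius = np.sqrt(((pts - center) ** 2).sum(axis=1)).max() + 0.5
    return area / (np.pi * radius * radius)

# a filled disc and a filled square on a 41x41 grid
yy, xx = np.mgrid[0:41, 0:41]
disc = np.argwhere((yy - 20) ** 2 + (xx - 20) ** 2 <= 10 ** 2)
square = np.argwhere((np.abs(yy - 20) <= 10) & (np.abs(xx - 20) <= 10))
```

A disc scores close to 1 and survives a threshold such as circlePor > thre1, while a square of the same extent scores markedly lower and is rejected.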
Step two: given that the actual physical distance proportional relation of the identifier center points is Width/Height, as shown in fig. 4, a rectangle with C1 and C4 as diagonal vertices is constructed, yielding the standard-state upper-right center coordinate C2' and lower-left center coordinate C3'. The original identifier center positions and the standard identifier center positions are thus [C1, C2, C3, C4] and [C1, C2', C3', C4], where the pixel distance between C1 and C2' is pixelDis1 and the pixel distance between C3' and C4 is pixelDis2;
Step three: from [C1, C2, C3, C4] and [C1, C2', C3', C4], the corrected calibration plate identifier center pixel distances are calculated as pixelWidth = max(pixelDis1, pixelDis2) and pixelHeight = min(pixelDis1, pixelDis2). According to the formula dstPoints = transformMatrix · srcPoints, where dstPoints corresponds to the coordinates of the C1, C2', C3', C4 point set and srcPoints corresponds to the coordinates of the C1, C2, C3, C4 point set, the matrix sideTransformMatrix is obtained, i.e. the first transformation matrix or side transformation matrix, and sideScale = (Width/pixelWidth + Height/pixelHeight)/2.0 is the side camera scale relation, i.e. the first scale relation, in mm/pixel.
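Steps two and three can be sketched as follows. The homography solver below is a standard direct-linear-transform reconstruction playing the role the patent's transformation-matrix formula describes (the patent does not name a library); the point coordinates and plate dimensions are illustrative assumptions.

```python
import numpy as np

def perspective_matrix(src, dst):
    # Solve the 3x3 matrix M with dst ~ M @ [x, y, 1] from 4 point pairs
    # (direct linear transform; the bottom-right entry is fixed to 1).
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

# detected identifier centers (skewed) and their standard-state targets:
# C1 and C4 are kept as diagonal vertices, giving C2' and C3'
C1, C2, C3, C4 = (100, 100), (420, 90), (110, 410), (430, 420)
std = [(100, 100), (430, 100), (100, 420), (430, 420)]  # C1, C2', C3', C4
M = perspective_matrix([C1, C2, C3, C4], std)  # side (or top) transform matrix

pixelWidth = 430 - 100    # row center distance in the standard state
pixelHeight = 420 - 100   # column center distance in the standard state
Width, Height = 200.0, 195.0  # physical plate size in mm (assumed)
sideScale = (Width / pixelWidth + Height / pixelHeight) / 2.0  # mm per pixel
```

Correcting an image then amounts to warping it with M, after which sideScale converts corrected pixel distances to millimetres.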
The algorithm finally outputs the side camera image correction matrix sideTransformMatrix and the scale sideScale.
This step also includes the process of calibrating the top camera, so that top-view phenotype measurement can adapt to objects of different heights. The specific process is as follows:
as shown in fig. 6, fig. 6 is a placement mode of a calibration plate in the calibration process of the top-surface camera, the plane of the calibration plate faces the top-surface camera, the calibration plate is located at the center of the platform, 3 calibration positions are set in the embodiment, the distances between the calibration plate and the bottom surface are 100mm, 200mm and 300mm respectively, and the acquired top-surface calibration images with corresponding heights are respectively a top-surface calibration image4, a top-surface calibration image5 and a top-surface calibration image6;
Following the same transformation-matrix and scale computation used for the side calibration images, a second transformation matrix and a second scale are obtained for each height: verticalTransformMatrix1, verticalTransformMatrix2, verticalTransformMatrix3 and verticalScaleSrc1, verticalScaleSrc2, verticalScaleSrc3.
In a specific embodiment, the step of constructing the current second scale relationship by calibrating the plurality of heights of the plates and each second scale relationship includes the following steps:
Fitting is carried out based on the calibration plate heights and the corresponding second scale relations to obtain a fitting relation, expressed as follows:

verticalScaleSrc_i = k · height_i + b, i = 1, …, n

Based on the fitting relation, the current second scale relation is obtained, expressed as follows:

verticalScale = k · height + b

wherein verticalScale denotes the second scale relation, height denotes the height of the calibration plate, n denotes the number of calibration plate positions, and k and b respectively denote the fitted parameters.
That is, since the calibration plates are at different heights, in order to obtain more accurate height information of the object to be measured (and other information related to it), a top-surface camera scale-relation fitting method is also designed. Specifically:
Inputs are the calibration plate heights [height1, height2, height3, ...] and the scale matrix [verticalScaleSrc1, verticalScaleSrc2, verticalScaleSrc3, ...];
According to the formula verticalScale = k · height + b, a least-squares fit over these inputs yields the parameters k and b;
This relation can be understood as the current second scale relation, which expresses the second scale relation as a function of the calibration plate height, fitted over the set of calibration plate positions.
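The height–scale fitting can be sketched as a least-squares line through the per-height scales; the numeric scale values below are assumptions for illustration, not figures from the patent.

```python
import numpy as np

heights = np.array([100.0, 200.0, 300.0])         # calibration plate heights (mm)
verticalScaleSrcs = np.array([0.50, 0.44, 0.38])  # per-height top scales (mm/px, assumed)

# unary linear fit: verticalScale = k * height + b
k, b = np.polyfit(heights, verticalScaleSrcs, 1)

def current_vertical_scale(height_mm):
    # top-camera scale at the actual object height measured by the side camera
    return k * height_mm + b
```

A plant whose side-view measurement gives an actual height of 150 mm would then use current_vertical_scale(150.0) as its top-view scale.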
If this information is used for measuring an actual object, the following transformation can be performed during the measurement; that is, this step can be performed before the actual measurement information of the object to be measured is obtained. Specifically:
According to the measured object height, the current second scale relation of the object is obtained by combining the second transformation matrix and the scale fitting matrix, expressed as follows:

verticalScale = k · plantHeightActual + b

wherein sideScale denotes the side scale, verticalScale denotes the current second scale relation of the object, plantHeightActual = plantHeight · sideScale denotes the actual height of the object, and k and b denote the fitted parameters.
In one embodiment, the actual measurement information of the object to be measured is obtained by the following steps:
Processing the object side image based on the first transformation matrix to obtain object side phenotype information, wherein the object side phenotype information at least comprises the measured object height; the measured object height is converted with the side scale to obtain the actual object height, expressed as follows:

plantHeightActual = plantHeight · sideScale

Performing scale transformation on the object top-surface phenotype information based on the current second scale relation of the object to obtain the actual object top-surface phenotype information, which at least comprises the actual object width, the actual object span and the actual object area index, expressed as follows:

plantWidthActual = plantWidth · verticalScale

plantSpanActual = plantSpan · verticalScale

plantAreaActual = plantArea · verticalScale²

wherein plantHeightActual denotes the actual height of the object, plantHeight denotes the measured height of the object, sideScale denotes the side scale, verticalScale denotes the current second scale relation of the object, plantWidth denotes the measured width of the object, plantWidthActual denotes the actual width of the object, plantSpan denotes the measured span of the object, plantSpanActual denotes the actual span of the object, plantArea denotes the measured area index of the object, and plantAreaActual denotes the actual area index of the object.
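The conversions above amount to simple multiplications by the calibrated scales. The sketch below uses assumed values, and treats the area index as scaling with the square of the top scale, which is an assumption of this illustration (pixel areas convert to mm² quadratically).

```python
sideScale = 0.6       # mm per pixel, side camera (assumed)
verticalScale = 0.47  # mm per pixel, top camera at this object height (assumed)

# measured (pixel-domain) phenotype values
plantHeight, plantWidth, plantSpan, plantArea = 400, 300, 350, 52000

plantHeightActual = plantHeight * sideScale       # mm, from the side view
plantWidthActual = plantWidth * verticalScale     # mm, from the top view
plantSpanActual = plantSpan * verticalScale       # mm, from the top view
plantAreaActual = plantArea * verticalScale ** 2  # mm^2: areas scale quadratically
```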
Specifically, the first transformation matrix sideTransformMatrix (side camera transformation matrix), the first scale relation sideScale, the second transformation matrices verticalTransformMatrix1, verticalTransformMatrix2 and verticalTransformMatrix3 (top camera transformation matrices) and the current second scale relation verticalScale are used for phenotype measurement of the object to be measured, which may be a crop or any other object. The specific process is as follows:
Analyzing an image acquired by a side camera, and using the analyzed object height information as the selection of a top-surface camera transformation matrix and the calculation of an actual scale;
The side camera and the top camera respectively acquire object information, obtaining the corresponding side image7 and top image8;
Image processing is performed on side image7 to obtain object phenotype information, which is converted with the side camera scale sideScale to obtain the actual side phenotype information containing the object height. The image processing includes correcting the side image with the transformation matrix sideTransformMatrix, after which measurement yields the actual object height plantHeight;
According to the object height plantHeight, the nearest calibration plate position from the top camera calibration process is determined; the transformation matrix corresponding to that position, verticalTransformMatrix, is the matrix used to correct the image acquired by the top camera in the current phenotype analysis, with the corresponding scale verticalScale;
Image processing is performed on top image8 to obtain object phenotype information, which is converted with the top camera scale verticalScale to obtain the actual top-surface phenotype information of the object. The image processing includes correcting the top image with the transformation matrix verticalTransformMatrix;
The object phenotype parameters include, but are not limited to, the object height plantHeight, the object width plantWidth, the object span plantSpan and the object leaf area index plantArea, wherein plantHeight is side information and plantWidth, plantSpan and plantArea are top information; the actual measurement information is obtained by converting these values with the calibrated scales.
In this way, the actual measurement information of the object to be measured is finally obtained.
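Putting the steps together, the measurement flow can be sketched end to end; all numbers here (side scale, fit parameters) are assumed for illustration, not taken from the patent.

```python
def measure(plant_height_px, plant_width_px, side_scale, k, b):
    # side image gives the pixel height; the side scale converts it to mm
    height_mm = plant_height_px * side_scale
    # the fitted relation gives the top-camera scale at this object height
    vertical_scale = k * height_mm + b
    # top image measurements are then converted with that scale
    width_mm = plant_width_px * vertical_scale
    return height_mm, vertical_scale, width_mm

h, vs, w = measure(250, 300, side_scale=0.6, k=-0.0006, b=0.56)
```

This mirrors the order of operations above: the side view fixes the object height first, and only then is the top-view scale chosen.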
Example 2:
The plant overall measurement calibration system based on computer vision, as shown in fig. 2, comprises an image acquisition module 100, a preprocessing and analysis module 200, a scale fitting module 300 and a phenotype information calculation module 400;
The image acquisition module 100 acquires relevant parameters and positions of identifiers of a calibration plate, wherein the calibration plate is rectangular and at least four identifiers are preset, the central positions of the identifiers are respectively positioned at four corners of the calibration plate and form a rectangle, and the relevant parameters comprise physical information of the calibration plate; acquiring a side face calibration image acquired by a first image acquisition device on the side face of the calibration plate, wherein the first image acquisition device is positioned at the side of the calibration plate and at the center of the calibration plate; acquiring top surface calibration images acquired by a second image acquisition device on the top surfaces of a plurality of calibration plates positioned in a calibration mode, wherein the calibration plates are arranged according to a plurality of preset heights, and the second image acquisition device is positioned above the calibration plates and positioned in the center of the calibration plates;
The preprocessing and analyzing module 200 performs preprocessing on the side calibration image and the top calibration image respectively to obtain an original identifier position, and determines the identifier position in a standard state by combining with the actual physical distance proportional relation of the identifier, so as to obtain a first transformation matrix, a first proportional relation, a second transformation matrix and a second proportional relation;
the scale fitting module 300 constructs a current second scale relationship through calibrating a plurality of heights of the plates and each second scale relationship;
the phenotype information calculation module 400 performs phenotype measurement on the object to be measured based on the first transformation matrix and the current second scale relationship, and obtains actual measurement information of the object to be measured.
All changes and modifications that come within the spirit and scope of the invention are desired to be protected and all equivalent thereto are deemed to be within the scope of the invention.
In this specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different manner from other embodiments, so that identical and similar parts of each embodiment are mutually referred to.
It will be apparent to those skilled in the art that embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that:
Reference in the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. Thus, the appearances of the phrase "one embodiment" or "an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment.
In addition, the specific embodiments described in the present specification may differ in terms of parts, shapes of components, names, and the like. All equivalent or simple changes of the structure, characteristics and principle according to the inventive concept are included in the protection scope of the present invention. Those skilled in the art may make various modifications or additions to the described embodiments or substitutions in a similar manner without departing from the scope of the invention as defined in the accompanying claims.

Claims (6)

1. The plant overall measurement calibration method based on computer vision is characterized by comprising the following steps of:
Acquiring relevant parameters and positions of identifiers of a calibration plate, wherein the calibration plate is rectangular, at least four identifiers are preset, the central positions of the identifiers are respectively positioned at four corners of the calibration plate and form a rectangle, and the relevant parameters comprise physical information of the calibration plate;
acquiring a side face calibration image acquired by a first image acquisition device on the side face of the calibration plate, wherein the first image acquisition device is positioned at the side of the calibration plate and at the center of the calibration plate;
Acquiring top surface calibration images acquired by a second image acquisition device on the top surfaces of a plurality of calibration plates positioned in a calibration mode, wherein the calibration plates are arranged according to a plurality of preset heights, and the second image acquisition device is positioned above the calibration plates and positioned in the center of the calibration plates;
preprocessing the side surface calibration image and the top surface calibration image respectively to obtain an original identifier position, and determining the identifier position in a standard state by combining with the actual physical distance proportional relation of the identifier to obtain a first transformation matrix, a first proportional relation, a second transformation matrix and a second proportional relation;
The side calibration image and the top calibration image are respectively preprocessed to obtain the original identifier position, and the method comprises the following steps:
Respectively carrying out gray scale treatment on the side surface calibration image and the top surface calibration image to obtain a side surface calibration gray scale image and a top surface calibration gray scale image;
Obtaining an average gray value of the side image and an average gray value of the top image based on the side calibration gray image and the top calibration gray image;
threshold segmentation is carried out on the side gray level image through the average gray level value of the side image and on the top gray level image through the average gray level value of the top image to obtain a side binary image and a top binary image;
performing feature screening of identifiers on the side binary images and the top binary images to obtain identifier areas, and fitting the identifier areas to obtain area circumscribing circles, wherein the identifiers are circular identifiers;
Obtaining the regional circularity based on the area of the identifier region and the area of the region circumscribing circle, and analyzing the identifier region obtained by screening the identifier features based on the regional circularity and the area of the identifier region to obtain the original identifier position;
the region circularity is expressed as follows:

circlePor = area / area'

wherein circlePor denotes the region circularity, area denotes the area of the identifier region, and area' denotes the area of the region circumscribed circle;
the first transformation matrix, the first scale relation, the second transformation matrix and the second scale relation are obtained through the following steps:
based on the original identifier positions and the identifier positions in the standard state, the pixel width between the centers of two identifiers in the same row and the pixel height between the centers of two identifiers in the same column in the corrected state are respectively obtained, expressed as follows:

pixelWidth = max(pixelDis1, pixelDis2)

pixelHeight = min(pixelDis1, pixelDis2)
fitting is carried out through the position relation between the original identifier positions and the identifier positions in the standard state to obtain the original-standard fitting relation, expressed as follows:

dstPoints = transformMatrix · srcPoints
and obtaining the first transformation matrix or second transformation matrix and the first scale relation or second scale relation through the original identifier positions, the identifier positions in the standard state and the original-standard fitting relation, expressed as follows:

scale = (Width / pixelWidth + Height / pixelHeight) / 2.0

wherein pixelWidth denotes the pixel width between the centers of the corrected identifiers in the same row, pixelDis1 denotes the pixel distance between standard identifiers in the same row, pixelDis2 denotes the pixel distance between the upper-right and lower-right corner coordinates of the standard identifiers, pixelHeight denotes the pixel height between the centers of the corrected identifiers in the same column, dstPoints denotes the set of identifier position points in the side standard state or the set of identifier position points in the top standard state, srcPoints denotes the set of side original identifier position points or the set of top original identifier position points, transformMatrix denotes the first transformation matrix or the second transformation matrix, scale denotes the first scale relation or the second scale relation, Width denotes the width of the calibration plate, and Height denotes the height of the calibration plate;
constructing a current second scale relationship by calibrating a plurality of heights of the plates and each second scale relationship;
fitting is carried out based on the calibration plate heights and the corresponding second scale relations to obtain a fitting relation, expressed as follows:

verticalScaleSrc_i = k · height_i + b, i = 1, …, n

based on the fitting relation, the current second scale relation is obtained, expressed as follows:

verticalScale = k · height + b

wherein verticalScale denotes the second scale relation, height denotes the height of the calibration plate, n denotes the number of calibration plate positions, and k and b respectively denote the fitted parameters;
before the actual measurement information of the object to be measured is obtained, the method further comprises the following steps:
according to the measured object height, the current second scale relation of the object is obtained by combining the second transformation matrix and the scale fitting matrix, expressed as follows:

verticalScale = k · plantHeightActual + b

wherein sideScale denotes the side scale, verticalScale denotes the current second scale relation of the object, plantHeightActual = plantHeight · sideScale denotes the actual height of the object, and k and b denote the fitted parameters;
And performing phenotype measurement on the object to be measured based on the first transformation matrix and the current second scale relationship to obtain actual measurement information of the object to be measured.
2. The method for calibrating plant overall measurement based on computer vision according to claim 1, wherein the physical information of the calibration plate comprises the height of the calibration plate and the width of the calibration plate, and the actual physical distance proportional relationship of the identifier is the proportional relationship of the height of the calibration plate and the width of the calibration plate.
3. The method for calibrating plant overall measurement based on computer vision according to claim 1, wherein the actual measurement information of the object to be measured is obtained by:
processing the object side image based on the first transformation matrix to obtain object side phenotype information, wherein the object side phenotype information at least comprises the measured object height; the measured object height is converted with the side scale to obtain the actual object height, expressed as follows:

plantHeightActual = plantHeight · sideScale

performing scale transformation on the object top-surface phenotype information based on the current second scale relation of the object to obtain the actual object top-surface phenotype information, which at least comprises the actual object width, the actual object span and the actual object area index, expressed as follows:

plantWidthActual = plantWidth · verticalScale

plantSpanActual = plantSpan · verticalScale

plantAreaActual = plantArea · verticalScale²

wherein plantHeightActual denotes the actual height of the object, plantHeight denotes the measured height of the object, sideScale denotes the side scale, verticalScale denotes the current second scale relation of the object, plantWidth denotes the measured width of the object, plantWidthActual denotes the actual width of the object, plantSpan denotes the measured span of the object, plantSpanActual denotes the actual span of the object, plantArea denotes the measured area index of the object, and plantAreaActual denotes the actual area index of the object.
4. The plant overall measurement calibration system based on computer vision is characterized by comprising an image acquisition module, a preprocessing and analyzing module, a scale fitting module and a phenotype information calculating module;
The image acquisition module is used for acquiring relevant parameters and positions of identifiers of the calibration plate, wherein the calibration plate is rectangular, at least four identifiers are preset, the center positions of the identifiers are respectively positioned at four corners of the calibration plate and form a rectangle, and the relevant parameters comprise physical information of the calibration plate; acquiring a side face calibration image acquired by a first image acquisition device on the side face of the calibration plate, wherein the first image acquisition device is positioned at the side of the calibration plate and at the center of the calibration plate; acquiring top surface calibration images acquired by a second image acquisition device on the top surfaces of a plurality of calibration plates positioned in a calibration mode, wherein the calibration plates are arranged according to a plurality of preset heights, and the second image acquisition device is positioned above the calibration plates and positioned in the center of the calibration plates;
the preprocessing and analyzing module is used for respectively preprocessing the side calibration image and the top calibration image to obtain an original identifier position, determining the identifier position in a standard state by combining with the actual physical distance proportional relation of the identifier, and further obtaining a first transformation matrix, a first proportional relation, a second transformation matrix and a second proportional relation;
The side calibration image and the top calibration image are respectively preprocessed to obtain the original identifier position, and the method comprises the following steps:
Respectively carrying out gray scale treatment on the side surface calibration image and the top surface calibration image to obtain a side surface calibration gray scale image and a top surface calibration gray scale image;
Obtaining an average gray value of the side image and an average gray value of the top image based on the side calibration gray image and the top calibration gray image;
threshold segmentation is carried out on the side gray level image through the average gray level value of the side image and on the top gray level image through the average gray level value of the top image to obtain a side binary image and a top binary image;
performing feature screening of identifiers on the side binary image and the top binary image to obtain candidate identifier regions, and fitting each identifier region to obtain a region circumscribing circle, wherein the identifiers are circular identifiers;
obtaining the region circularity based on the area of the identifier region and the area of the region circumscribing circle, and analyzing the candidate regions obtained by the identifier feature screening based on the region circularity and the region area to obtain the original identifier positions;
the region circularity is expressed as follows:

e = S_r / S_c

wherein e denotes the region circularity, S_r denotes the area of the identifier region, and S_c denotes the area of the region circumscribing circle;
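The circularity check can be sketched as below. It is only an illustrative approximation: the circumscribing-circle radius is taken as the farthest region pixel from the region centroid, a hypothetical stand-in for a true minimum-enclosing-circle fit (e.g. OpenCV's `minEnclosingCircle`), and the function name is invented here.

```python
import numpy as np

def region_circularity(mask):
    """Approximate the region circularity e = S_r / S_c, where S_r is
    the pixel area of a candidate identifier region and S_c the area of
    its circumscribed circle.  The circle radius is approximated by the
    farthest region pixel from the region centroid."""
    ys, xs = np.nonzero(mask)
    s_r = len(xs)                                   # region pixel area
    cy, cx = ys.mean(), xs.mean()                   # region centroid
    r = np.sqrt((ys - cy) ** 2 + (xs - cx) ** 2).max()
    s_c = np.pi * r ** 2                            # circumcircle area
    return s_r / s_c
```

A filled disc scores close to 1 and is kept as a circular identifier; elongated noise regions score much lower and are rejected.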
the first transformation matrix, the first scale relation, the second transformation matrix and the second scale relation are obtained through the following steps:
based on the original identifier positions and the identifier positions in the standard state, the pixel width between the centers of two same-row identifiers and the pixel height between the centers of two same-column identifiers in the corrected (rotated-to-standard) state are respectively obtained, expressed as follows:

w = |x_UR − x_UL|,  h = |y_LR − y_UR|

fitting is performed through the positional relation between the original identifier positions and the identifier positions in the standard state to obtain an original-standard fitting relation, expressed as follows:

P_s = M · P_o

and the first transformation matrix or the second transformation matrix, and the first scale relation or the second scale relation, are obtained through the original identifier positions, the identifier positions in the standard state and the original-standard fitting relation, expressed as follows:

k_w = W / w,  k_h = H / h

wherein w denotes the pixel width between the corrected identifier centers, i.e. the distance between standard identifiers in the same row, h denotes the pixel height between the corrected identifier centers, i.e. the distance between the upper-right and lower-right corner coordinates in the standard state, P_s denotes the set of identifier position points in the side standard state or the top standard state, P_o denotes the set of side original identifier position points or top original identifier position points, M denotes the first transformation matrix or the second transformation matrix, k_w and k_h denote the first scale relation or the second scale relation, W denotes the width of the calibration plate, and H denotes the height of the calibration plate;
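The fitting of original identifier positions to standard positions, and the derivation of the scale relation, can be sketched as follows. This assumes an affine model fitted by least squares and a corner ordering of upper-left, upper-right, lower-right, lower-left; the patent may instead use a full homography, and all names here are illustrative.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform mapping original identifier
    centers `src` (Nx2) to standard-state positions `dst` (Nx2),
    i.e. a sketch of the original-standard fitting relation
    P_s ~ M @ P_o.  Returns the 2x3 matrix M."""
    n = len(src)
    a = np.hstack([src, np.ones((n, 1))])           # rows [x, y, 1]
    m, *_ = np.linalg.lstsq(a, dst, rcond=None)     # solves a @ m ≈ dst
    return m.T                                      # 2x3 matrix

def scale_relation(std_pts, plate_w, plate_h):
    """Pixel-to-physical scale from the corrected identifier centers:
    plate width over the pixel width of a same-row pair, plate height
    over the pixel height of a same-column pair.  Assumes the order
    upper-left, upper-right, lower-right, lower-left."""
    w_pix = abs(std_pts[1][0] - std_pts[0][0])      # same-row pair
    h_pix = abs(std_pts[2][1] - std_pts[1][1])      # same-column pair
    return plate_w / w_pix, plate_h / h_pix
```

With four non-degenerate point pairs the affine fit is exact; with more identifiers the least-squares solution averages out detection noise.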
the scale fitting module constructs the current second scale relation from the plurality of calibration plate heights and the corresponding second scale relations;
fitting is performed based on the calibration plate heights and the second scale relations to obtain a fitting relation, expressed as follows:

k_i = a · h_i + b,  i = 1, 2, …, n

based on the fitting relation, the current second scale relation at an arbitrary height h is obtained, expressed as follows:

k(h) = a · h + b

wherein k_i denotes the second scale relation of the i-th calibration plate, h_i denotes the height of the i-th calibration plate, n denotes the number of calibration plates, and a and b respectively denote the fitted parameters;
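The height-to-scale fit can be sketched as below. A first-order (linear) model is assumed here, since the exact model order is not recoverable from the text; the function names are illustrative.

```python
import numpy as np

def fit_height_scale(heights, scales):
    """Fit the per-plate second scale relations k_i against the plate
    heights h_i with a least-squares linear model k = a*h + b."""
    a, b = np.polyfit(heights, scales, 1)
    return a, b

def scale_at(height, a, b):
    """Current second scale relation at an arbitrary measurement height."""
    return a * height + b
```

Once a and b are fitted from the calibration plates, the top camera's scale can be predicted at any object height between (or near) the calibrated heights.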
before the actual measurement information of the object to be measured is obtained, the method further comprises the following steps:
according to the object measurement height, the current second scale relation of the object is obtained by combining the second transformation matrix and the scale fitting relation, expressed as follows:

h_obj = k_1 · h_pix,  k_obj = a · h_obj + b,  W_obj = k_obj · w_pix

wherein k_1 denotes the side scale, k_obj denotes the current second scale relation of the object, W_obj denotes the actual width of the object, h_pix denotes the object's pixel height in the side image, w_pix denotes the object's pixel width in the corrected top image, and a and b denote the fitted parameters;
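The final measurement chain can be sketched as a single function. All names are illustrative; the patent's own symbols for these quantities are not preserved in the extracted text.

```python
def object_width(pixel_height_side, side_scale, pixel_width_top, a, b):
    """Sketch of the measurement chain: the side camera's scale converts
    the object's pixel height to a physical height, the fitted relation
    k = a*h + b gives the top camera's scale at that height, and the
    object's pixel width in the corrected top image is converted to a
    physical width."""
    h_obj = pixel_height_side * side_scale          # physical height
    k_obj = a * h_obj + b                           # current second scale
    return k_obj * pixel_width_top                  # physical width
```

For example, an object 200 px tall in the side image at a side scale of 0.5 mm/px stands 100 mm high; the fitted relation then selects the top-view scale for that height before the width is converted.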
the phenotype information calculation module performs phenotype measurement on the object to be measured based on the first transformation matrix and the current second scale relationship to obtain actual measurement information of the object to be measured.
5. A computer readable storage medium storing a computer program which, when executed by a processor, implements the method of any one of claims 1 to 3.
6. A plant overall measurement calibration device based on computer vision, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any one of claims 1 to 3 when executing the computer program.
CN202411850171.3A 2024-12-16 2024-12-16 Plant overall measurement calibration method, system and device based on computer vision Active CN119295564B (en)

Publications (2)

Publication Number Publication Date
CN119295564A CN119295564A (en) 2025-01-10
CN119295564B true CN119295564B (en) 2025-05-27

Family

ID=94165664


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117315049A (en) * 2023-11-28 2023-12-29 浙江托普云农科技股份有限公司 Scale calibration method, system and device for three-dimensional overall measurement
CN117788601A (en) * 2023-12-26 2024-03-29 同方威视技术股份有限公司 Calibration methods, devices, systems and storage media for image acquisition equipment

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017053505A2 (en) * 2015-09-25 2017-03-30 General Electric Company Method and device for measuring features on or near an object
CA3009798A1 (en) * 2017-07-12 2019-01-12 General Electric Company Graphic overlay for measuring dimensions of features using a video inspection device
CN109191520B (en) * 2018-09-30 2020-08-04 湖北工程学院 Plant leaf area measuring method and system based on color calibration
CN109708578B (en) * 2019-02-25 2020-07-24 中国农业科学院农业信息研究所 Device, method and system for measuring plant phenotype parameters
CN110689579B (en) * 2019-10-18 2022-08-30 华中科技大学 Rapid monocular vision pose measurement method and measurement system based on cooperative target
US11425852B2 (en) * 2020-10-16 2022-08-30 Verdant Robotics, Inc. Autonomous detection and control of vegetation
CN114170148B (en) * 2021-11-12 2025-05-09 浙江托普云农科技股份有限公司 Corn plant type parameter measurement method, system, device and storage medium
CN114463412B (en) * 2022-02-07 2025-06-27 浙江托普云农科技股份有限公司 Plant phenotype measurement method, system and device based on computer vision
CN114612574B (en) * 2022-03-18 2025-09-09 南昌智能新能源汽车研究院 Vehicle-mounted panoramic camera panoramic aerial view calibration and conversion splicing method based on unmanned aerial vehicle
CN115690215B (en) * 2022-11-04 2025-08-12 上海市特种设备监督检验技术研究院 Dynamic measurement helmet based on vision and pose fusion
CN117173668A (en) * 2023-09-25 2023-12-05 同济大学 Road pothole area measurement method using chassis camera combined with instance segmentation network




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant