CN109000587A - The method for obtaining accurate high density point cloud - Google Patents
- Publication number: CN109000587A (application CN201811003887.4A)
- Authority
- CN
- China
- Prior art keywords
- striped
- image
- high frequency
- point cloud
- code
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/254—Projection of a pattern, viewing through a pattern, e.g. moiré
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2545—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with one projection direction and several detection directions, e.g. stereo
Abstract
The invention discloses a method for obtaining an accurate high-density point cloud, comprising: projecting a coded structured-light pattern onto a target workpiece, i.e., projecting patterns that generate high-frequency coding stripes and high-frequency sampling stripes; acquiring images of the target workpiece after the light pattern is projected; preprocessing the images; performing positive/negative-code binarization on the preprocessed images; decoding the binarized images; sampling the decoded images to obtain sample points; stereo-matching the sample points; and computing three-dimensional coordinates for the matched sample points. Projecting the coded structured-light pattern and the high-frequency sampling stripes increases the sampling density, eliminates image-binarization errors, improves decoding and sampling accuracy, and yields an accurate high-density point cloud even under the influence of reflected light. Problems under global illumination, such as missing regions in the reconstructed point cloud of the measured object and low sampling accuracy caused by reflected light, are thereby avoided.
Description
Technical field
The present invention relates to the field of three-dimensional reconstruction, and in particular to a method for obtaining an accurate high-density point cloud.
Background technique
Three-dimensional reconstruction based on coded structured light has important research and application value for computer vision and robotics, and offers advantages such as high accuracy, high speed, and wide applicability. It is now widely used in face recognition, automated measurement, aerospace, reverse engineering, and many other fields. Three-dimensional measurement methods for objects fall into two major classes: contact and non-contact. Contact measurement instruments mostly use trigger-type probes such as mechanical styli that touch the measured surface directly; their accuracy is high, but they are unsuitable for measuring flexible objects, and drawbacks such as slow measurement speed and low measurement-data density also limit the technology's application.
Non-contact measurement systems mainly use light, sound, or electromagnetism, as in ultrasonic ranging and nuclear magnetic resonance imaging. With the growing maturity of photoelectric detection and computer technology, optical non-contact techniques have become the main approach to three-dimensional measurement. Non-contact measurement comes in active and passive forms. Passive methods provide no light source; they obtain the three-dimensional data of the measured object from image feature-point matches across multiple cameras, as in stereo vision. The structured-light method, also known as active triangulation, projects a pre-designed structured-light image onto the measured object with a light source; after the coded pattern is modulated by the object surface, the resulting deformed stripes carry surface-height information. This information is captured and recorded by left and right cameras, and a computer recovers the three-dimensional information of the object through image processing. Unlike binocular stereo vision, the binocular structured-light method can impose coded feature points on a textureless surface; stereo matching finds the corresponding feature-point positions in the left and right images, from which depth values are computed. Once depth values have been computed for all coded points on the measured surface, the three-dimensional point cloud of the object is obtained, and with suitable processing the point cloud yields a three-dimensional model.
According to the projection and decoding principle, structured-light coding methods divide into spatial coding and temporal coding; by the gray-level distribution of the patterns, into binary and multi-level gray-value coding; and by the color channels of the projected patterns and the camera, into color coding and monochrome coding. Spatial coding needs only a single frame or a few frames to encode a scene and suits dynamic scenes, but such coding patterns are easily disturbed by the surface properties of the object and generate noise. Temporal coding schemes encode and decode with a sequence of patterns, greatly reducing the errors caused by surface properties, but they require the scene to be static and are unsuitable for dynamic scenes. With the maturity of fast DLP projection and high-speed industrial cameras, however, a large number of coded patterns can now be projected and captured in a short time.
At present, structured-light measurement is mostly applied in scenes with little interference, where the measured object can be regarded as illuminated only by the single direct projection source. In the industrial environments of practical application, however, the object to be measured may, in addition to the direct projection source, also be illuminated by light reflected between surrounding surfaces, and multiple interfering light sources may be present in the environment; all of these can corrupt the structured-light coding information captured by the camera and cause decoding errors.
Summary of the invention
In view of the above problems, an object of the present invention is to propose a method for obtaining an accurate high-density point cloud, so as to solve at least some of the problems of the prior art.
To achieve the above object, the present invention adopts the following technical solution:
A method for obtaining an accurate high-density point cloud, comprising:
projecting a coded structured-light pattern onto a target workpiece, i.e., projecting patterns that generate high-frequency coding stripes and high-frequency sampling stripes;
acquiring images of the target workpiece after the light pattern is projected;
preprocessing the images;
performing positive/negative-code binarization on the preprocessed images;
decoding the binarized images;
sampling the decoded images to obtain sample points;
stereo-matching the sample points;
computing three-dimensional coordinates for the matched sample points.
Preferably, projecting patterns that generate high-frequency coding stripes comprises:
projecting a Gray-code stripe pattern onto the target workpiece;
choosing high-frequency stripes in the Gray-code stripe pattern as reference stripes;
applying an XOR transform between each stripe of the Gray-code pattern whose frequency is lower than the reference frequency and the reference stripes.
Preferably, choosing high-frequency stripes in the Gray-code stripe pattern as reference stripes comprises:
binarizing each bit pattern of the Gray code;
judging how strongly each binarization result is affected by reflected light;
choosing high-frequency stripes little affected by reflected light as the reference stripes.
Preferably, choosing high-frequency stripes in the Gray-code stripe pattern as reference stripes is specifically: selecting high-frequency stripes of m different frequencies as reference stripes, m ≥ 2.
Preferably, m = 2.
Preferably, projecting patterns that generate high-frequency sampling stripes comprises:
selecting the highest-frequency bit pattern of the Gray-code stripe sequence as the high-frequency sampling stripes;
translating the pattern symmetrically n times to each side of its initial position, obtaining 2n sampling patterns;
taking the two sampling patterns of each symmetric translation as one group.
Preferably, in translating symmetrically n times to each side of the initial position of the high-frequency sampling stripes: each translation distance is D = i × d, where i = 1, 2, …, n; D is smaller than the minimum stripe width of the sampling pattern; c is the number of Gray-code bits and L is the width of the coded image.
Preferably, preprocessing the images comprises:
applying median filtering and gray-level normalization to the images.
Preferably, decoding the binarized images comprises:
correcting the decoded code values, i.e., subtracting the different code values obtained by decoding from one another, taking the absolute value of each difference, and comparing it with a given threshold.
Preferably, the positive/negative-code binarization of the preprocessed images is specifically: determining from a formula whether a pixel lies in an illuminated region of the coded stripes, the formula being
f(x, y) = 1 if I(x, y) > I′(x, y), 0 otherwise,
where I is the coded image and I′ is the inverted coded image of I.
The technical solution of the present invention has the following advantages:
The technical solution projects a coded structured-light pattern onto the target workpiece and projects high-frequency sampling stripes, which raises the sampling density, eliminates image-binarization errors, improves decoding and sampling accuracy, and yields an accurate high-density point cloud under the influence of reflected light. Problems under global illumination, such as missing regions in the reconstructed point cloud of the measured object and low sampling accuracy caused by reflected light, are thereby avoided.
The technical solution of the present invention is described in further detail below with reference to the drawings and embodiments.
Detailed description of the invention
Fig. 1 is a flowchart of the method for obtaining an accurate high-density point cloud according to an embodiment of the invention;
Fig. 2 is a flowchart of a specific embodiment of the method for obtaining an accurate high-density point cloud;
Fig. 3 is a schematic diagram of the XOR transform applied to the low-frequency stripes using the high-frequency stripes of the Gray code as reference;
Fig. 4a is the stripe image after projecting the Gray-code stripes;
Fig. 4b is the stripe image after projecting XOR6;
Fig. 4c is the stripe image after projecting XOR7;
Fig. 5 is a schematic diagram of the moving distance and moving direction of the high-frequency sampling stripes;
Fig. 6 is a schematic diagram of the combination of the Gray-code edges and one set of high-frequency sampling edges.
Specific embodiment
Preferred embodiments of the present invention are described below with reference to the accompanying drawings. It should be understood that the preferred embodiments described here serve only to illustrate and explain the present invention and are not intended to limit it.
The present invention mainly discloses a machine-vision measurement method, in particular a structured-light three-dimensional measurement method that improves the traditional Gray code with high-frequency reference stripes to obtain new coded structured-light stripes, and performs multiple high-density samplings of the coded region with high-frequency sampling stripes, thereby obtaining a high-density point cloud under reflected-light conditions.
As shown in Fig. 1, a method for obtaining an accurate high-density point cloud comprises:
S101: projecting a coded structured-light pattern onto the target workpiece, i.e., projecting patterns that generate high-frequency coding stripes and high-frequency sampling stripes.
The high-frequency sampling stripes are generated as follows: the last (highest-frequency) Gray-code pattern is translated symmetrically n times to the left and to the right, each time by a distance D = i × d, where i = 1, 2, …, n, c is the number of Gray-code bits, and L is the width of the coded image.
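The symmetric translation described above can be sketched in code. This is a minimal illustration only: since the formula defining d is not reproduced in the text, the stripe width L / 2**c used by `highest_freq_stripe` is an assumption, and both function names are hypothetical.

```python
def highest_freq_stripe(c, length):
    """Model the highest-frequency Gray-code bit as a 0/1 square wave.

    Assumes a stripe width of length / 2**c pixels (illustrative choice).
    """
    width = length // (2 ** c)          # stripe width in pixels
    return [(x // width) % 2 for x in range(length)]

def shifted_sampling_patterns(pattern, n, d):
    """Translate `pattern` symmetrically n times by i*d pixels to each side,
    yielding 2n sampling patterns grouped as (left shift, right shift) pairs."""
    length = len(pattern)
    groups = []
    for i in range(1, n + 1):
        shift = i * d
        left = [pattern[(x + shift) % length] for x in range(length)]
        right = [pattern[(x - shift) % length] for x in range(length)]
        groups.append((left, right))
    return groups
```

Each pair of shifted patterns forms one sampling group, matching the "2n sampling patterns, two per symmetric translation" description above.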
S102: acquiring images of the target workpiece after the light pattern is projected.
Image information is collected with two acquisition devices in a symmetric arrangement; the symmetric directions may be left-right, front-back, etc.
In a specific embodiment, the images of the target workpiece after the light pattern is projected are acquired with a left and a right camera, mainly to obtain the left and right views of the modulated stripes.
S103: preprocessing the images.
Preferably, median filtering is used to eliminate systematic errors, and gray-level normalization removes the influence of differing apertures and illumination between the two cameras.
Median filtering removes system noise introduced by the equipment and the signal-transmission process; the two-dimensional median-filter formula is
g(x, y) = med{ f(x − k, y − l); k, l ∈ W },
where f(x, y) and g(x, y) are the original and filtered images and W is the two-dimensional template, usually a 3×3 or 5×5 window; for a 3×3 window, k, l ∈ {−1, 0, 1}.
Gray-level normalization is applied to the input images to remove the influence of inconsistent illumination between the left and right cameras:
h(x, y) = (v(x, y) − min(v(x, y))) / (max(v(x, y)) − min(v(x, y))),
where v(x, y) and h(x, y) are the gray values of the image before and after normalization, and max(v(x, y)) and min(v(x, y)) are the maximum and minimum gray values of the image.
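The two preprocessing steps above can be sketched as follows; this is a minimal pure-Python illustration (the patent does not specify window handling at image borders, so leaving borders unchanged is an assumption).

```python
def median_filter_3x3(img):
    """3x3 median filter over a list-of-lists image; borders left unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]       # median of the 9 window values
    return out

def normalize_gray(img):
    """Map gray values linearly onto [0, 1]: (v - min) / (max - min)."""
    flat = [v for row in img for v in row]
    lo, hi = min(flat), max(flat)
    span = (hi - lo) or 1               # avoid division by zero on flat images
    return [[(v - lo) / span for v in row] for row in img]
```

Applied in sequence, these mirror the S103 pipeline: the median filter suppresses isolated noise spikes, and the normalization equalizes the gray ranges of the left and right views.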
S104: performing positive/negative-code binarization on the preprocessed images.
This step binarizes the preprocessed images. Preferably, a formula decides whether a given pixel lies in an illuminated region of the coded stripes:
f(x, y) = 1 if I(x, y) > I′(x, y), 0 otherwise,
where I is a coded image and I′ is the inverted coded image of I.
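The positive/negative-code decision rule can be sketched directly; the comparison `p > q` is one common convention for this decision, since the patent's exact formula is not reproduced in the text.

```python
def pos_neg_binarize(pos, neg):
    """Per-pixel positive/negative-code decision: a pixel is 'lit' (1) where
    the positive-code image is brighter than its inverted counterpart."""
    return [[1 if p > q else 0 for p, q in zip(prow, qrow)]
            for prow, qrow in zip(pos, neg)]
```

Comparing each image against its own inverse, rather than against a fixed threshold, is what makes the binarization robust to uneven illumination across the scene.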
S105: decoding the binarized images.
The Gray-code stripes and the m groups of high-frequency coding stripes are binarized; the Gray-code stripes are decoded according to the Gray-code-to-decimal relationship. The binarization result of each high-frequency coding stripe is XORed pixel-wise with the binarization result of its corresponding reference high-frequency pattern, restoring the Gray-code binarization result; the binary code values are then converted to the corresponding decimal values. If the high-frequency coding stripes are obtained with m reference frequencies, (m + 1) groups of Gray-code decimal values are available at each pixel position; the (m + 1) code values of each pixel are cross-checked, and outlier code values are corrected or rejected.
Taking the XOR6 image sequence as an example, XOR6 is projected as the coding pattern; let the images acquired by the left camera be L1, L2, …, L8. These images are first binarized:
L_new(i) = f(L_i), i = 1, 2, …, 8.
From the binary images the XOR6 and XOR7 sequences are restored; for XOR6 the restoration is
L_final(i) = L_new(i) ⊕ L_new(6) for i = 1, …, 5, and L_final(i) = L_new(i) otherwise,
where L_new is the binarization result of the high-frequency coding stripes and L_final is the result restored to the Gray-code binarization.
Decoding then proceeds with the traditional Gray-code decoding process, giving the decoding result of the XOR6 stripe sequence; the decoding result of the XOR7 stripe sequence is obtained likewise. The decoding result is a binary Gray-code value. If the binary Gray-code value of a sample point is (G0 G1 G2 … G_{i−1})_2, it is converted according to
L_h_code = 2^(c−i) + ((G0 G1 G2 … G_{i−1})_2)_10 · 2^(c−i+1),
where c is the number of Gray-code bits, i = 1, 2, …, c is the ordinal of the binary image, G_i is the binarization result of this point in the i-th image, and G0 = 0.
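The Gray-code-to-decimal relationship mentioned above can be illustrated with the standard conversion (b0 = g0, b_i = b_{i−1} XOR g_i); this sketch shows only that generic conversion, not the patent's L_h_code offset formula.

```python
def gray_to_decimal(bits):
    """Standard Gray-to-binary conversion: b0 = g0, b_i = b_{i-1} XOR g_i,
    then read the binary word as a decimal code value."""
    value, prev = 0, 0
    for g in bits:
        prev ^= g                       # running XOR gives the binary bit
        value = (value << 1) | prev
    return value
```

For example, the Gray word 111 decodes to binary 101, i.e., code value 5.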
Preferably, decoding the binarized images comprises: correcting the decoded code values, i.e., subtracting the different code values obtained by decoding from one another, taking the absolute value of each difference, and comparing it with a given threshold, preferably 3.
Code-value correction proceeds as follows: for the pixel at a given position, the (m + 1) groups of decoded code values are cross-checked, the given threshold decides whether each code value is correct, and outlier code values are corrected or rejected. For m = 2, decoding the obtained XOR6 and XOR7 sequences yields the decoding results of three different coding schemes.
Consider any pixel Q after projecting the Gray-code, XOR6, and XOR7 stripes; the stripe images after projecting the Gray-code stripes, XOR6, and XOR7 are shown in Figs. 4a, 4b, and 4c respectively. Let the decoded code values be R1, R2, and R3; the correction compares the pairwise absolute differences |R1 − R2|, |R1 − R3|, and |R2 − R3| with the threshold tresh, usually set to 3. Through these mutual comparisons each pixel is corrected, giving the final code value R. In an actual measurement, only the sample points need to be decoded and corrected.
S106: sampling the decoded images to obtain sample points.
Sampling uses an edge-sampling method: an edge-detection operator extracts edge pixels from every image acquired for the Gray-code stripes and the high-frequency sampling stripes, and the extracted pixels are thinned to single-pixel edges. The single-pixel edges of all bit images in the Gray-code sequence are merged into the Gray-code sampling edges; the single-pixel edges of the high-frequency sampling stripe images obtained from each symmetric translation are merged into the sampling edges of that translation. In total (n + 1) groups of edge-sampling stripes are obtained: the Gray-code stripe edges and n groups of high-frequency sampling stripe edges, as shown in Fig. 6.
Sampling the edge points of the Gray-code sequence works as follows: an edge-detection operator extracts edge pixels from each image collected in the Gray-code sequence, giving L1, …, L8; a logical OR over these eight edge images yields the complete single-pixel edge-sampling map in one direction.
To prevent one-to-many or many-to-many correspondences from arising during matching, the extracted edges must be thinned.
Let (x, y) be the coordinate of an edge non-zero pixel, (x′, y′) a valid address, and g(x, y) the gray value of the corresponding image at (x, y). For edge sampling of transverse stripes, the valid pixel addresses are
{(x′, y′)} = {(x, y) | f(x + 1, y) = 0},
and for edge sampling of longitudinal stripes
{(x′, y′)} = {(x, y) | f(x, y + 1) = 0}.
Single-pixel edges are extracted from the horizontal and vertical stripes, and the intersections of the single-pixel edges serve as sample points.
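The thinning rule just quoted can be sketched as follows; boundary pixels with no neighbor in the tested direction are kept here, which is an assumption the text does not settle.

```python
def thin_edges(edge, axis):
    """Single-pixel edge thinning: keep an edge pixel (x, y) only if its
    neighbour along `axis` is background. axis='x' tests f(x+1, y)
    (transverse stripes); axis='y' tests f(x, y+1) (longitudinal stripes)."""
    h, w = len(edge), len(edge[0])
    keep = []
    for y in range(h):
        for x in range(w):
            if not edge[y][x]:
                continue
            nx, ny = (x + 1, y) if axis == 'x' else (x, y + 1)
            if nx >= w or ny >= h or edge[ny][nx] == 0:
                keep.append((x, y))
    return keep
```

Of a two-pixel-wide edge, only the pixel whose right (or lower) neighbor is background survives, so each edge contributes exactly one sample per row or column.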
Sampling the edge points of the high-frequency sampling stripes: each high-frequency sampling stripe edge point is sampled with the same method as for the Gray-code sequence, and the code value of the Gray-coded region containing the point is taken as the code value of the sample point.
S107: stereo-matching the sample points.
To avoid code-value conflicts of the sample points during matching, the Gray-code stripe sample points and the sample points of the n high-frequency translations must be stereo-matched separately. Compared with existing line-shift coding methods, the stereo-matching method of this technical solution does not need to re-encode additional sampling stripes, reducing the computational load of the algorithm.
Steps S102 to S106 are repeated in the direction perpendicular to the pattern stripes projected in step S101. The intersections of the single-pixel edges in the two orthogonal directions serve as sample points, and the combined code values of the two directions serve as the final code value; since the code values in both directions increase monotonically, the final code value is globally unique. With the positions and code values of the (n + 1) groups of sample points obtained in step S106, the left and right images are matched pixel-wise within each group at the same horizontal pixel position according to the epipolar constraint.
S108: computing three-dimensional coordinates for the matched sample points.
From the pixel coordinates of the same point in the left and right images obtained in step S107, the three-dimensional coordinate of the sample point is computed by triangulation; the (n + 1) groups of three-dimensional points are merged into the final result.
Binocular triangulation is used to compute the three-dimensional coordinates of the matched points. Let the coordinate of a sample point in the left camera be P_l(u_l, v_l) and its match in the right camera be P_r(u_r, v_r); the corresponding world coordinate P(x_w, y_w, z_w) of this point pair is computed from the projection relations, where S1 and S2 are scale factors and M_l and M_r are the image transformation matrices of the left and right cameras.
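Since the matrix formulation (M_l, M_r) is not reproduced in the text, the triangulation can only be illustrated with a simplified stand-in: the depth formula z = f·B / disparity for a rectified stereo pair, where the focal length f, baseline B, and principal point (cx, cy) are assumed calibration values.

```python
def triangulate_rectified(ul, vl, ur, f, baseline, cx, cy):
    """Simplified rectified-stereo triangulation: depth from disparity,
    then back-projection of the left-image pixel to world coordinates.
    A stand-in for the patent's full matrix formulation, not that method."""
    disparity = ul - ur
    if disparity <= 0:
        raise ValueError("non-positive disparity")
    z = f * baseline / disparity        # z = f * B / d
    x = (ul - cx) * z / f
    y = (vl - cy) * z / f
    return x, y, z
```

Matching at the same horizontal pixel position (the epipolar constraint of S107) is exactly what makes this rectified form applicable: corresponding points differ only in their u coordinate.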
Preferably, projecting patterns that generate high-frequency coding stripes comprises:
projecting a Gray-code stripe pattern onto the target workpiece;
choosing high-frequency stripes in the Gray-code stripe pattern as reference stripes;
applying an XOR transform between each stripe of the Gray-code pattern whose frequency is lower than the reference frequency and the reference stripes.
Preferably, choosing high-frequency stripes in the Gray-code stripe pattern as reference stripes comprises:
binarizing each bit pattern of the Gray code, using the positive/negative-code method of step S104;
judging how strongly each binarization result is affected by reflected light;
choosing high-frequency stripes little affected by reflected light as the reference stripes.
Preferably, choosing high-frequency stripes as reference stripes is specifically: selecting high-frequency stripes of m different frequencies as reference stripes, m ≥ 2.
Preferably, m = 2.
The high-frequency coding stripes are generated as follows: a Gray-code stripe pattern is first projected onto the target workpiece; then a high-frequency stripe of the Gray-code sequence is taken as the reference. The selection criterion: each bit pattern of the traditional Gray code is binarized in turn, and the high-frequency stripes whose binarization results are least affected by reflected light serve as reference stripes, usually starting from the sixth bit pattern. Through the logical XOR operation, each stripe of lower frequency than the reference is XORed pixel-wise with the reference stripe, while the remaining stripes stay unchanged. Reference stripes of m frequencies are selected (m ≥ 2; m = 2 is recommended), generating m groups of high-frequency Gray-code sequences, and the transformed high-frequency stripes are projected in turn, as shown in Fig. 3.
Since the coding patterns are black-and-white, their binarized values are only 0 and 1. For arbitrary values the logical XOR operation satisfies
a ⊕ b ⊕ b = a,
and for two stripes P1, P2 of different frequencies the binarization satisfies
f(P1 ⊕ P2) = f(P1) ⊕ f(P2),
i.e., the binarization result of a stripe of one frequency is equivalent to the XOR combination of the respective binarization results of the two frequencies into which it is decomposed:
f(P_l) = f(P_h) ⊕ f(P_h ⊕ P_l).
Therefore, a high-frequency stripe P_h of the Gray code is selected as the reference stripe and each low-frequency stripe P_l is decomposed, giving the new high-frequency stripe P_h ⊕ P_l, from which P_l = P_h ⊕ (P_h ⊕ P_l) can be restored.
Take an 8-bit Gray code as an example, with coding stripe sequence P1, P2, …, P8. Suppose the sixth Gray-code pattern P6 serves as the reference pattern; the reference pattern is XORed pixel-wise with each of the first five patterns, i.e., P_i′ = P_i ⊕ P6 (i = 1, …, 5), giving the first five transformed stripes and their inverses. The coded sequence obtained with the sixth Gray-code pattern as reference is named XOR6, and the XOR7 coded sequence is obtained likewise.
Preferably, projecting patterns that generate high-frequency sampling stripes comprises:
selecting the highest-frequency bit pattern of the Gray-code stripe sequence as the high-frequency sampling stripes;
translating the pattern symmetrically n times to each side of its initial position, obtaining 2n sampling patterns;
taking the two sampling patterns of each symmetric translation as one group.
Preferably, in translating symmetrically n times to each side of the initial position of the high-frequency sampling stripes: each translation distance is D = i × d, where i = 1, 2, …, n; D is smaller than the minimum stripe width of the sampling pattern; c is the number of Gray-code bits and L is the width of the coded image, as shown in Fig. 5.
In the original Gray-code stripe sequence, the highest-frequency bit pattern is selected as the high-frequency sampling stripes. Taking the initial position of this pattern as the reference, it is translated symmetrically n times to each side, each time by a distance D = i × d, giving 2n sampling patterns, with the two patterns of each symmetric translation forming one group. After the high-frequency coding stripes have been projected, the high-frequency sampling stripes are projected in turn.
In a specific embodiment, as shown in Fig. 2, the left and right cameras each capture an image of the target workpiece after the light patterns are projected, chiefly to obtain the left and right views of the modulated fringes. The left and right view images are then read in, together with the n Gray-code patterns and m groups of high-frequency shifted fringes for each view. The views are preprocessed, i.e. median-filtered and normalized, and then binarized by the positive/inverse-code judgement. The binarized left and right views undergo Gray-code edge sampling and high-frequency line-shift fringe-edge sampling, after which it is judged whether the number of code-value groups at each sampled point is less than m + 1: if not less, the combined code of the sampled point is decoded; if less, code-value correction is performed first and the code values are then decoded globally. The horizontal and vertical code values are combined to compute a unique code for every sampled pixel of the left and right views; pixels with equal unique codes are matched, and three-dimensional coordinates are computed, completing the three-dimensional measurement.
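Two of the pipeline stages above, positive/inverse-code binarization and Gray-code decoding, can be sketched as follows; the function names and array layouts are assumptions for illustration, not the patent's implementation:

```python
import numpy as np

def binarize_pos_neg(img: np.ndarray, img_inv: np.ndarray) -> np.ndarray:
    """Positive/inverse-code binarization: a pixel counts as 'lit' where
    the pattern image is brighter than its inverted counterpart."""
    return (img.astype(np.int32) > img_inv.astype(np.int32)).astype(np.uint8)

def gray_to_binary(bits: np.ndarray) -> np.ndarray:
    """Decode per-pixel Gray-code bit planes (shape: n_bits x H x W,
    most significant plane first) into integer code values."""
    value = bits[0].astype(np.int64)
    prev = bits[0]
    for plane in bits[1:]:
        prev = prev ^ plane        # Gray decode: b_k = g_k XOR b_{k-1}
        value = (value << 1) | prev
    return value
```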
In conclusion the present invention provides a kind of structural light three-dimensional surveys using the improved high-frequency coding of XOR logic operation
Amount method can obtain accurate high density point cloud under the influence of reflected light.This method generates high frequency knot using XOR logic operation
Structure pumped FIR laser striped, and joined high frequency sampling striped, avoid due to reflected light influence cause measured workpiece point cloud error, lack
It loses and the low problem of Points Sample precision;Has the characteristics that anti-interference ability to reflected light using high frequency fringes, by Gray code
Low frequency part carry out logic XOR operation with benchmark high frequency patterns, be translated into high frequency fringes and projected again, so
Afterwards by striped reduction, the decoding after acquisition, finally different groups of striped code values are carried out again mutually to compare correction, obtain final essence
Quasi- code value;Striped is sampled by projection high frequency, can be further improved sampling density and sampling precision;High frequency samples striped code value
It is identical as place Gray code section code value;Left and right view medium-high frequency sampled point is individually matched with Gray code sampled point;Relatively
In existing gray encoding method, the present invention, which uses, utilizes the improved high-frequency coding striped of XOR operation, in conjunction with high frequency sampling
Striped can be improved 50% or more to the workpiece reconstruction point cloud quantity under reflected light;Have in the stronger region of reflected light relatively strong
Resistance reflected light interference ability, available accurate code value.
Finally, it should be noted that the foregoing is only a preferred embodiment of the present invention and is not intended to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described therein or replace some of the technical features with equivalents. Any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall fall within its scope of protection.
Claims (10)
1. A method for obtaining an accurate high-density point cloud, characterized by comprising:
projecting a coded structured-light pattern onto a target workpiece, i.e. projecting onto the target workpiece to generate high-frequency coded fringes and high-frequency sampling fringes;
acquiring an image of the target workpiece after the light pattern has been projected;
preprocessing the image;
performing positive/inverse-code binarization judgement on the preprocessed image;
decoding the image after the positive/inverse-code binarization judgement;
sampling the decoded image to obtain sampled points;
stereo-matching the sampled points;
computing three-dimensional coordinates for the matched sampled points.
2. The method for obtaining an accurate high-density point cloud according to claim 1, characterized in that projecting onto the target workpiece to generate the high-frequency coded fringes comprises:
projecting a Gray-code fringe pattern onto the target workpiece;
choosing high-frequency fringes in the Gray-code fringe pattern as reference fringes;
performing an XOR transformation between the reference fringes and the fringes in the Gray-code fringe pattern whose frequency is lower than that of the reference fringes.
3. The method for obtaining an accurate high-density point cloud according to claim 2, characterized in that choosing high-frequency fringes in the Gray-code fringe pattern as reference fringes comprises:
binarizing each bit pattern of the Gray code;
judging how strongly each binarization result is affected by reflected light;
choosing the high-frequency fringes least affected by reflected light as the reference fringes.
4. The method for obtaining an accurate high-density point cloud according to claim 2, characterized in that choosing high-frequency fringes in the Gray-code fringe pattern as reference fringes is specifically: selecting high-frequency fringes of m different frequencies as reference fringes, where m ≥ 2.
5. The method for obtaining an accurate high-density point cloud according to claim 4, characterized in that m = 2.
6. The method for obtaining an accurate high-density point cloud according to claim 2, characterized in that projecting onto the target workpiece to generate the high-frequency sampling fringes comprises:
selecting the highest-frequency bit pattern in the Gray-code fringe pattern as the high-frequency sampling fringe;
taking the initial position of the high-frequency sampling fringe as the reference and translating it symmetrically to both sides n times, obtaining 2n sample patterns;
treating the two sample patterns of each symmetric translation as one group.
7. The method for obtaining an accurate high-density point cloud according to claim 6, characterized in that, in translating the high-frequency sampling fringe symmetrically to both sides n times from its initial position:
each translation distance is D = i × d, where i = 1, 2, …, n and D is smaller than the minimum stripe width of the sampling fringe; C is the number of Gray-code bits and L is the width of the coded image.
8. The method for obtaining an accurate high-density point cloud according to claim 1, characterized in that preprocessing the image comprises:
applying median filtering and gray-scale normalization to the image.
9. The method for obtaining an accurate high-density point cloud according to claim 1, characterized in that decoding the image after the positive/inverse-code binarization judgement comprises:
correcting the decoded code values, i.e. subtracting the different code values obtained by decoding from one another, taking the absolute value of the result and comparing it with a set threshold.
10. The method for obtaining an accurate high-density point cloud according to claim 1, characterized in that performing positive/inverse-code binarization judgement on the preprocessed image is specifically: determining according to a formula whether a pixel lies in an illuminated region of the fringe code, the formula being:
where I is the coded image and I` is the inverted coded image of I.
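The code-value cross-check of claim 9 (subtract the code values decoded from different fringe groups, take the absolute value, compare with a set threshold) might be sketched as follows; the default threshold and the use of -1 as an "unreliable" marker are illustrative assumptions, not values given in the claims:

```python
import numpy as np

def correct_code_values(code_groups: list, threshold: int = 1):
    """Cross-check decoded code values from different fringe groups.

    Pixels whose code values agree across groups to within `threshold`
    are kept; other pixels are flagged as unreliable (-1) so that they
    can be corrected or discarded in a later step.
    """
    stack = np.stack(code_groups).astype(np.int64)
    # Largest pairwise absolute difference at each pixel.
    diff = np.abs(stack.max(axis=0) - stack.min(axis=0))
    reliable = diff <= threshold
    corrected = np.where(reliable, stack[0], -1)
    return corrected, reliable
```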
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201811003887.4A CN109000587A (en) | 2018-08-30 | 2018-08-30 | The method for obtaining accurate high density point cloud |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN109000587A true CN109000587A (en) | 2018-12-14 |
Family
ID=64594635
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201811003887.4A Pending CN109000587A (en) | 2018-08-30 | 2018-08-30 | The method for obtaining accurate high density point cloud |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN109000587A (en) |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH1062556A (en) * | 1996-08-19 | 1998-03-06 | Kyosan Electric Mfg Co Ltd | Brace detector |
| CN101666631A (en) * | 2009-09-07 | 2010-03-10 | 东南大学 | Three-dimensional measuring method based on positive and inverse code color encoding stripes |
| CN104315996A (en) * | 2014-10-20 | 2015-01-28 | 四川大学 | Method for realizing fourier transform profilometry by using binary encoding strategy |
| US20150062558A1 (en) * | 2013-09-05 | 2015-03-05 | Texas Instruments Incorporated | Time-of-Flight (TOF) Assisted Structured Light Imaging |
| CN104677308A (en) * | 2015-01-30 | 2015-06-03 | 宋展 | Three-dimensional scanning method for high-frequency two-value strip |
| CN105890546A (en) * | 2016-04-22 | 2016-08-24 | 无锡信捷电气股份有限公司 | Structured light three-dimensional measurement method based on orthogonal Gray code and line shift combination |
| CN108332670A (en) * | 2018-02-06 | 2018-07-27 | 浙江大学 | A kind of structured-light system coding method for merging the positive and negative Gray code of RGB channel and striped blocks translation |
Non-Patent Citations (1)
| Title |
|---|
| LIN Hui: "Research and Implementation of Structured-Light 3D Shape Measurement Methods for Glossy Surfaces with High Dynamic Range", China Doctoral Dissertations Full-Text Database, Information Science and Technology Series | * |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111489382A (en) * | 2019-01-28 | 2020-08-04 | 合肥美亚光电技术股份有限公司 | Method and device for obtaining coded fringe pattern and reconstructing based on structured light |
| CN112164072A (en) * | 2020-09-18 | 2021-01-01 | 深圳市南科信息科技有限公司 | Visible light imaging communication decoding method, device, device and medium |
| CN112710254A (en) * | 2020-12-21 | 2021-04-27 | 珠海格力智能装备有限公司 | Object measuring method, system, device, storage medium and processor |
| CN112967205A (en) * | 2021-03-25 | 2021-06-15 | 苏州天准科技股份有限公司 | Gray code filter-based outlier correction method, storage medium, and system |
| CN115564893A (en) * | 2022-09-28 | 2023-01-03 | 华南理工大学 | Image coding and decoding method based on coding structure light |
| CN115564893B (en) * | 2022-09-28 | 2026-01-06 | 华南理工大学 | An image encoding and decoding method based on coded structured light |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| Feng et al. | High dynamic range 3D measurements with fringe projection profilometry: a review | |
| CN109000587A (en) | The method for obtaining accurate high density point cloud | |
| Wang et al. | Robust active stereo vision using Kullback-Leibler divergence | |
| Zhang et al. | Spacetime stereo: Shape recovery for dynamic scenes | |
| US9524578B2 (en) | Projection system, semiconductor integrated circuit, and image correction method | |
| JP5317169B2 (en) | Image processing apparatus, image processing method, and program | |
| KR101974651B1 (en) | Measuring method of 3d image depth and a system for measuring 3d image depth using boundary inheritance based hierarchical orthogonal coding | |
| US10973581B2 (en) | Systems and methods for obtaining a structured light reconstruction of a 3D surface | |
| KR101733228B1 (en) | Apparatus for three dimensional scanning with structured light | |
| US11023762B2 (en) | Independently processing plurality of regions of interest | |
| CN101482398B (en) | A method and device for rapid three-dimensional shape measurement | |
| CN101303229A (en) | Structured light 3D measurement technology based on edge gray code and line shift | |
| CN105069789A (en) | Structured light dynamic scene depth acquiring method based on encoding network template | |
| Tran et al. | A Structured Light RGB‐D Camera System for Accurate Depth Measurement | |
| CN105303572B (en) | Based on the main depth information acquisition method passively combined | |
| Chen et al. | A self-alignment XOR coding strategy resistant to global illumination | |
| CN100368767C (en) | 2D Image Region Location Method Based on Raster Projection | |
| CN113345039A (en) | Three-dimensional reconstruction quantization structure optical phase image coding method | |
| Chen et al. | Realtime structured light vision with the principle of unique color codes | |
| Li et al. | Structured light based high precision 3d measurement and workpiece pose estimation | |
| Li et al. | Surface reconstruction based on computer stereo vision using structured light projection | |
| Zhang et al. | Determination of edge correspondence using color codes for one-shot shape acquisition | |
| JP4840822B2 (en) | Image processing method, apparatus and program | |
| Ishii et al. | Fast 3D shape measurement using structured light projection for a one-directionally moving object | |
| Zhang | Fringe Projection Profilometry: Performance Improvement using Robust Principal Component Analysis |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | RJ01 | Rejection of invention patent application after publication | Application publication date: 20181214 |