CN102447917A - Three-dimensional image matching method and equipment thereof - Google Patents
Three-dimensional image matching method and equipment thereof
- Publication number
- CN102447917A CN2010105025844A CN201010502584A
- Authority
- CN
- China
- Prior art keywords
- pixel
- cost
- unit
- matching
- parallax
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Image Processing (AREA)
Abstract
The invention provides a three-dimensional (stereo) image matching method and equipment thereof. The equipment comprises a cost computing unit, a cost aggregation unit and a disparity determination unit. The cost computing unit computes the matching cost between a pixel of a reference image and a pixel of a target image: it respectively computes a non-parametric statistical cost function and a color mean absolute difference function between the two pixels, linearly combines these two functions, and takes the combined result as the matching cost between the pixel of the reference image and the pixel of the target image. The cost aggregation unit aggregates the matching costs between the pixels of the reference image and the pixels of the target image computed by the cost computing unit. The disparity determination unit determines the disparity from the pixel of the reference image to the target image on the basis of the aggregation result produced by the cost aggregation unit, thereby realizing stereo image matching.
Description
Technical field
The present invention relates to an image processing method and apparatus, and more particularly to a method and apparatus for performing stereo image matching between a reference image and a target image.
Background art
Stereo image matching is an important technical problem in the field of computer vision. In the process of performing stereo image matching between a reference image (for example, the left image) and a target image (for example, the right image), a disparity map is obtained. Here, disparity (that is, binocular parallax) refers, in binocular vision, to the position vector between the two projections of the same spatial point in the left image and the right image seen by the two eyes. In practice, because the human perception of depth depends mainly on the horizontal displacement, disparity often refers only to the horizontal component of this position vector.
As stated above, disparity is the basis for producing depth perception; in the development of 3D imaging technology, realizing stereo image matching through disparity estimation is therefore a core technique for computing scene depth, for three-dimensional scene reconstruction and for stereoscopic display. Existing stereo image matching implementations are mainly concerned with two requirements: matching accuracy and processing efficiency. However, although many systems for stereo image matching exist at present, it is difficult for these systems to achieve a large improvement in both matching accuracy and processing efficiency at the same time; an improvement in one aspect usually comes at the cost of performance in the other. For example, many stereo image matching schemes with good accuracy share the following characteristics: 1) they adopt a large support window and perform a heavy cost aggregation operation; 2) they use a complex global optimizer to perform energy minimization when computing the disparity map; 3) they rely mainly on a segmentation of the matched images to raise the matching accuracy. Although these measures can successfully suppress blurring during matching, they all rest on complex computational methods and consume a large amount of computation time, so that they are difficult to apply in situations with real-time requirements. Other stereo image matching schemes take simplifying measures in order to reach real-time matching performance, for example image pyramids, progressive matching, robust measures, or optimizers based on simple dynamic programming, with the consequence that the matching accuracy of these systems has to be reduced.
Therefore, there is a need for a stereo image matching scheme that is significantly improved in both matching accuracy and processing speed, so that the computational requirements of real-time processing can be met while high-accuracy matching is achieved.
Summary of the invention
The object of the present invention is to provide a fast stereo image matching method and system capable of high-accuracy matching; in addition, the performance of stereo image matching can be further improved by applying refinement processing to the generated disparity map.
According to an aspect of the present invention, a stereo image matching device is provided. The device comprises: a cost computing unit, used to compute the matching cost between a pixel of a reference image and a pixel of a target image, wherein the cost computing unit respectively computes a non-parametric statistical cost function and a color mean absolute difference function between the pixel of the reference image and the pixel of the target image, linearly combines the non-parametric statistical cost function and the color mean absolute difference function, and takes the combined result as the matching cost between the pixel of the reference image and the pixel of the target image; a cost aggregation unit, used to aggregate the matching costs between the pixels of the reference image and the pixels of the target image computed by the cost computing unit; and a disparity determination unit, used to determine the disparity from the pixel of the reference image to the target image based on the result of the aggregation of the matching costs performed by the cost aggregation unit, so as to realize stereo image matching.
For each pixel in the reference image, the cost aggregation unit may determine a dynamic support region of that pixel and, within each determined dynamic support region, aggregate the matching costs between the pixels of the reference image and the pixels of the target image computed by the cost computing unit.
The disparity determination unit may construct a global energy of the matching costs with an attached depth smoothness constraint, and determine the set of disparity levels that minimizes the global energy as the final stereo image matching disparity map.
The cost aggregation unit may comprise: a pixel dynamic support region construction unit, used to construct a dynamic support region for each pixel of the reference image; and a dynamic support region cost aggregation unit, used to perform cost aggregation over all pixels within each constructed dynamic support region.
The stereo image matching device may further comprise a disparity map refinement unit, used to apply refinement processing to the disparity map output by the disparity determination unit.
The disparity map refinement unit may comprise at least one of the following: an occlusion handling unit, used to detect occluded points in the disparity map and to estimate the disparity of an occluded point using the image around the occluded point; a discontinuity boundary correction unit, used to detect discontinuity boundaries in the disparity map where the disparity value jumps and, for a pixel on a discontinuity boundary, to re-determine its disparity using pixels located a certain distance from that pixel on either side of the discontinuity boundary; and a sub-pixel interpolation unit, used to perform sub-pixel-level interpolation on the disparity map.
In the linear combination, the ratio between the non-parametric statistical cost function and the color mean absolute difference function may be set adaptively based on the local texture, the local gradient and/or the matching confidence information of each pixel.
According to another aspect of the present invention, a stereo image matching method is provided. The method comprises: computing the matching cost between a pixel of a reference image and a pixel of a target image, wherein a non-parametric statistical cost function and a color mean absolute difference function between the pixel of the reference image and the pixel of the target image are respectively computed, the non-parametric statistical cost function and the color mean absolute difference function are linearly combined, and the combined result is taken as the matching cost between the pixel of the reference image and the pixel of the target image; aggregating the computed matching costs between the pixels of the reference image and the pixels of the target image; and determining the disparity from the pixel of the reference image to the target image based on the result of aggregating the matching costs, so as to realize stereo image matching.
The aggregating step may comprise: determining, for each pixel in the reference image, a dynamic support region of that pixel, and aggregating, within each determined dynamic support region, the computed matching costs between the pixels of the reference image and the pixels of the target image.
The step of determining the disparity may comprise: constructing a global energy of the matching costs with an attached depth smoothness constraint, and determining the set of disparity levels that minimizes the global energy as the final stereo image matching disparity map.
The stereo image matching method may further comprise: applying refinement processing to the final stereo image matching disparity map.
The step of applying refinement processing to the final stereo image matching disparity map may comprise at least one of the following steps: detecting occluded points in the disparity map and estimating the disparity of an occluded point using the image around the occluded point; detecting discontinuity boundaries in the disparity map where the disparity value jumps and, for a pixel on a discontinuity boundary, re-determining its disparity using pixels located a certain distance from that pixel on either side of the discontinuity boundary; and performing sub-pixel-level interpolation on the disparity map.
According to another aspect of the present invention, a generating apparatus for 3D content is provided, comprising a 3D data input unit, a 3D data analysis unit and a 3D content generation unit, the generating apparatus further comprising a stereo image matching device according to the present invention.
According to another aspect of the present invention, a display apparatus for 3D content is provided, comprising a 3D data input unit, a 3D data analysis unit, a 3D content generation unit and a 3D content display unit, the display apparatus further comprising a stereo image matching device according to the present invention.
According to another aspect of the present invention, a stereo image matching device is provided. The device comprises: a cost computing unit, used to compute the matching cost between a pixel of a reference image and a pixel of a target image; a cost aggregation unit, used to determine, for each pixel in the reference image, a dynamic support region of that pixel and to aggregate, within each determined dynamic support region, the matching costs between the pixels of the reference image and the pixels of the target image computed by the cost computing unit; and a disparity determination unit, used to determine the disparity from the pixel of the reference image to the target image based on the result of the aggregation of the matching costs performed by the cost aggregation unit, so as to realize stereo image matching.
According to another aspect of the present invention, a stereo image matching method is provided. The method comprises: computing the matching cost between a pixel of a reference image and a pixel of a target image; determining, for each pixel in the reference image, a dynamic support region of that pixel and aggregating, within each determined dynamic support region, the computed matching costs between the pixels of the reference image and the pixels of the target image; and determining the disparity from the pixel of the reference image to the target image based on the result of aggregating the matching costs, so as to realize stereo image matching.
According to the present invention, the matching cost function can be computed according to specific rules, and cost aggregation can further be performed over a dynamic support region for each pixel; on the basis of the aggregated costs, the global depth smoothness can be taken into account to produce the disparity map of the reference image, so that both matching accuracy and processing speed are improved in the process of realizing stereo image matching. In addition, by applying refinement and correction processing to the produced disparity map, the performance of image matching can be further improved.
Description of drawings
The above and/or other objects and advantages of the present invention will become apparent from the following description of the embodiments in conjunction with the accompanying drawings, in which:
Fig. 1 is a block diagram of a stereo image matching device according to an exemplary embodiment of the present invention;
Fig. 2 is a flowchart of a stereo image matching method according to an exemplary embodiment of the present invention;
Fig. 3 is a block diagram of the detailed structure of the cost aggregation unit according to an exemplary embodiment of the present invention;
Fig. 4 illustrates an example of constructing the dynamic support region of a pixel according to an exemplary embodiment of the present invention;
Fig. 5 illustrates an example of performing cost aggregation based on dynamic support regions according to an exemplary embodiment of the present invention;
Fig. 6 is a block diagram of the detailed structure of the disparity map refinement unit according to an exemplary embodiment of the present invention;
Fig. 7 illustrates an example of occlusion handling according to an exemplary embodiment of the present invention;
Fig. 8 illustrates an example of discontinuity boundary correction according to an exemplary embodiment of the present invention; and
Fig. 9 illustrates the performance improvement of a stereo image matching system according to an exemplary embodiment of the present invention compared with the prior art.
Embodiment
Embodiments of the present invention will now be described in detail, examples of which are illustrated in the accompanying drawings, in which like reference numerals refer to like parts throughout. The embodiments are described below with reference to the drawings in order to explain the present invention.
Fig. 1 is a block diagram of a stereo image matching device according to an exemplary embodiment of the present invention. As shown in Fig. 1, the stereo image matching device according to an exemplary embodiment of the present invention comprises: a cost computing unit 10, used to compute the matching cost between a pixel of the reference image and a pixel of the target image, wherein the cost computing unit 10 respectively computes a non-parametric statistical cost function and a color mean absolute difference function between the pixel of the reference image and the pixel of the target image, linearly combines the two functions, and takes the combined result as the matching cost between the pixel of the reference image and the pixel of the target image; a cost aggregation unit 20, used to aggregate the matching costs between the pixels of the reference image and the pixels of the target image computed by the cost computing unit 10; and a disparity determination unit 30, used to determine the disparity from the pixel of the reference image to the target image based on the result of the aggregation of the matching costs performed by the cost aggregation unit 20, so as to realize stereo image matching. As a preferred but non-essential embodiment, a disparity map refinement unit 40 may additionally be included, used to apply refinement processing to the disparity map formed by the disparities output by the disparity determination unit 30, thereby further improving the performance of stereo image matching.
An example of realizing the stereo image matching method according to the present invention with the stereo image matching device shown in Fig. 1 is described below with reference to Fig. 2.
Fig. 2 is a flowchart of a stereo image matching method according to an exemplary embodiment of the present invention. Referring to Fig. 2, at step S100 the cost computing unit 10 computes the matching cost between a pixel of the reference image and a pixel of the target image; specifically, the cost computing unit 10 respectively computes a non-parametric statistical cost function and a color mean absolute difference function between the pixel of the reference image and the pixel of the target image, linearly combines the two functions, and takes the combined result as the matching cost between the pixel of the reference image and the pixel of the target image. At step S200, the cost aggregation unit 20 aggregates the matching costs between the pixels of the reference image and the pixels of the target image computed by the cost computing unit 10. At step S300, the disparity determination unit 30 determines the disparity from the pixel of the reference image to the target image based on the result of the aggregation of the matching costs performed by the cost aggregation unit 20, so as to realize stereo image matching. As a preferred but non-essential embodiment, a step S400 may additionally be included, in which the disparity map refinement unit 40 applies refinement processing to the disparity map formed by the disparities output by the disparity determination unit 30, thereby further improving the performance of stereo image matching.
First, the processing by which the cost computing unit 10 computes the matching cost between a pixel of the reference image and a pixel of the target image at step S100 is described.
In the present exemplary embodiment of the invention, the cost computing unit 10 respectively computes a non-parametric statistical cost function and a color mean absolute difference function between the pixel of the reference image and the pixel of the target image, linearly combines the two functions, and takes the combined result as the matching cost between the pixel of the reference image and the pixel of the target image.
Specifically, the cost computing unit 10 first computes the non-parametric statistical cost function between pixel p(x, y) of the reference image and pixel pd(x-d, y) of the target image. In this case, a square window N_p around pixel p(x, y) and a corresponding square window N_pd around pixel pd(x-d, y) are selected, and the pixels in the selected square windows N_p and N_pd are respectively expressed, in a certain order, in the form of bit strings. In such a bit string, a bit Bit_p(p1) represents the relative relationship (for example, greater than, less than or approximately equal) between the intensity of pixel p(x, y) and the intensity of an arbitrary pixel p1 in its square window N_p; for example, when the intensity of pixel p(x, y) is greater than the intensity of pixel p1, the value of Bit_p(p1) is set to 1, and otherwise it is set to 0, where various concrete comparison rules can be set according to actual needs. Similarly, a bit Bit_pd(pd1) represents the relative relationship (for example, greater than, less than or approximately equal) between the intensity of pixel pd(x-d, y) and the intensity of an arbitrary pixel pd1 in its square window N_pd; for example, when the intensity of pixel pd(x-d, y) is greater than the intensity of pixel pd1, the value of Bit_pd(pd1) is set to 1, and otherwise it is set to 0, where again various concrete comparison rules can be set according to actual needs.
Based on the above setting, the cost computing unit 10 computes the non-parametric statistical cost function C_Census(p, d) between pixel p(x, y) and pixel pd(x-d, y) according to Equation 1, where W(p1, p) and W(pd1, pd) are the weights that modulate the Hamming distance between the square windows N_p and N_pd. The value of W(p1, p) can be computed from the spatial and color relationship between pixel p1 and pixel p; specifically, W(p1, p) can be obtained according to the following rule: the closer the distance between pixel p1 and pixel p, the larger the value of W(p1, p), and at the same time, the closer the color of pixel p1 to the color of pixel p, the larger the value of W(p1, p). Similarly, the value of W(pd1, pd) can be computed from the spatial and color relationship between pixel pd1 and pixel pd, according to the following rule: the closer the distance between pixel pd1 and pixel pd, the larger the value of W(pd1, pd), and at the same time, the closer the color of pixel pd1 to the color of pixel pd, the larger the value of W(pd1, pd).
After the cost computing unit 10 has computed the non-parametric statistical cost function C_Census(p, d) between pixel p(x, y) and pixel pd(x-d, y) as above, the cost computing unit 10 computes the color mean absolute difference function between the pixel of the reference image and the pixel of the target image. As an example, the color mean absolute difference function C_AD(p, d) between pixel p(x, y) and pixel pd(x-d, y) over the three color channels R (red), G (green) and B (blue) can be computed through Equation 2, where I(p) and I(pd) respectively represent the intensity values of pixel p(x, y) and pixel pd(x-d, y) in the corresponding R, G and B channels, and λ_AD is a truncation value.
Then, the cost computing unit 10 linearly combines the non-parametric statistical cost function C_Census(p, d) and the color mean absolute difference function C_AD(p, d), and takes the combined result as the matching cost C(p, d) between the pixel of the reference image and the pixel of the target image.
As an example, the cost computing unit 10 computes the matching cost C(p, d) between the pixel of the reference image and the pixel of the target image according to Equation 3:
C(p, d) = z(p) · C_Census(p, d) + (1 - z(p)) · C_AD(p, d)     (Equation 3)
where z(p) is a weight used to control the proportional relationship between the non-parametric statistical cost function C_Census(p, d) and the color mean absolute difference function C_AD(p, d), and its value lies between 0 and 1. For example, z(p) can be set adaptively in consideration of the local texture, the local gradient and/or the matching confidence information of each pixel p(x, y).
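As an illustration of Equations 1-3, the sketch below computes a cost volume from a Census-style bit-string comparison blended with a truncated color absolute difference. It is a minimal sketch under stated assumptions, not the patent's exact formulation: it uses a grayscale Census transform over a fixed square window, uniform Hamming weights instead of the distance- and color-adaptive weights W(p1, p) and W(pd1, pd), and a constant blending weight z; the window radius and the values of lam_ad and z are illustrative.

```python
import numpy as np

def census_transform(gray, radius=3):
    """Per-pixel bit string: bit = 1 where the centre pixel is brighter than a neighbour."""
    bits = []
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = np.roll(np.roll(gray, dy, axis=0), dx, axis=1)
            bits.append((gray > shifted).astype(np.uint8))
    return np.stack(bits, axis=-1)                 # H x W x (window size^2 - 1)

def matching_cost_volume(ref, tgt, max_d, z=0.5, lam_ad=30.0):
    """Blend of normalised Census (Hamming) cost and truncated colour AD (cf. Eq. 3)."""
    ref = ref.astype(np.float32)
    tgt = tgt.astype(np.float32)
    ref_bits = census_transform(ref.mean(axis=2))
    tgt_bits = census_transform(tgt.mean(axis=2))
    h, w = ref.shape[:2]
    # Columns with no valid correspondence keep the maximum normalised cost of 1.
    cost = np.full((max_d + 1, h, w), 1.0, dtype=np.float32)
    for d in range(max_d + 1):
        tb = np.roll(tgt_bits, d, axis=1)          # target pixel (x - d, y) aligned with x
        tc = np.roll(tgt, d, axis=1)
        census = (ref_bits != tb).sum(axis=-1) / ref_bits.shape[-1]
        ad = np.minimum(np.abs(ref - tc).mean(axis=2), lam_ad) / lam_ad
        cost[d, :, d:] = (z * census + (1.0 - z) * ad)[:, d:]
    return cost                                    # shape: (max_d + 1) x H x W
```

A disparity hypothesis d compares reference column x with target column x - d, which is why only columns x >= d receive a computed cost.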
After the matching cost functions of the pixels in the reference image have been obtained as above, at step S200 the cost aggregation unit 20 aggregates the matching costs between the pixels of the reference image and the pixels of the target image computed by the cost computing unit 10, thereby producing the aggregated cost result C_Aggr(p, d).
Here, as an optional mode, a preferred aggregation scheme according to an exemplary embodiment of the present invention can be adopted, namely, cost aggregation based on a dynamic support region of each pixel. The detailed structure of the cost aggregation unit according to an exemplary embodiment of the present invention is described below with reference to Fig. 3.
Fig. 3 is a block diagram of the detailed structure of the cost aggregation unit according to an exemplary embodiment of the present invention. Referring to Fig. 3, as an exemplary structure, the cost aggregation unit 20 may comprise: a pixel dynamic support region construction unit 201, used to construct a dynamic support region for each pixel of the reference image; and a dynamic support region cost aggregation unit 202, used to perform cost aggregation over all pixels within each constructed dynamic support region.
As is known to the person skilled in the art, the dynamic support region constructed for each pixel serves as the basis of the cost aggregation, and the accuracy of its shape and size has a large influence on the matching result. For example, for regions with little texture a larger support region needs to be constructed, but if the support region crosses the boundary of an object in the image, the matching performance at the object boundary becomes poor; on the other hand, a smaller support region is more sensitive to image noise and to corner points. Therefore, taking these factors into account, the pixel dynamic support region construction unit 201 can construct the dynamic support region of each pixel following two criteria: (1) the pixels in a dynamic support region have color information similar to that of the center pixel of the support region; (2) the boundary of a dynamic support region should follow the existing edges in the reference image, where the local image gradient information of the pixels can be used to detect the existing edges in the reference image. Fig. 4 illustrates an example of constructing the dynamic support region of a pixel according to an exemplary embodiment of the present invention; as shown in Fig. 4, the dynamic support regions of pixel P and pixel Q are each constructed based on the above two criteria.
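One common way to realize the two criteria above is a cross-shaped support region whose four arms grow from the pixel until the color difference to the center exceeds a threshold or a maximum length is reached. The sketch below follows that idea under illustrative parameters (tau, max_arm); the patent does not commit to this particular construction, so treat it as an assumption rather than the described method.

```python
import numpy as np

def support_arms(image, tau=20.0, max_arm=17):
    """Left/right/up/down arm length of each pixel's dynamic support region:
    grow while the neighbour's colour stays within tau of the centre pixel."""
    img = image.astype(np.float32)
    h, w = img.shape[:2]
    arms = np.zeros((h, w, 4), dtype=np.int32)         # order: left, right, up, down
    steps = [(0, -1), (0, 1), (-1, 0), (1, 0)]
    for k, (dy, dx) in enumerate(steps):
        for y in range(h):
            for x in range(w):
                length, yy, xx = 0, y + dy, x + dx
                while (0 <= yy < h and 0 <= xx < w and length < max_arm
                       and np.abs(img[yy, xx] - img[y, x]).max() < tau):
                    length += 1
                    yy += dy
                    xx += dx
                arms[y, x, k] = length
    return arms
```

The color-similarity test keeps region pixels close to the center pixel's color (criterion 1), and because an arm stops at a large color change it tends not to cross existing image edges (criterion 2).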
After the pixel dynamic support region construction unit 201 has constructed the dynamic support region of each pixel, the dynamic support region cost aggregation unit 202 performs cost aggregation over all pixels within each constructed dynamic support region.
As an example, the dynamic support region cost aggregation unit 202 can aggregate the costs of all pixels in the dynamic support region of pixel P in the manner shown in Fig. 5. As shown in Fig. 5, the dynamic support region cost aggregation unit 202 can aggregate the costs of all pixels in the dynamic support region of pixel P along two passes: first, the costs of all pixels of each row are aggregated in the horizontal direction and each horizontal aggregation result is stored; then, the stored horizontal aggregation results are aggregated in the vertical direction.
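The horizontal-then-vertical aggregation of Fig. 5 can be sketched with running sums as below, reusing the arm lengths from the previous sketch. For simplicity the sketch bounds both passes with the arms of the center pixel, whereas a full cross-based aggregation would bound the horizontal pass with each row pixel's own arms; the simplification is an assumption made for brevity.

```python
import numpy as np

def aggregate_cost(cost, arms):
    """Two-pass aggregation over dynamic support regions (cf. Fig. 5):
    store horizontal sums first, then sum them vertically."""
    d_levels, h, w = cost.shape
    rows = np.arange(h)[:, None]
    cols = np.arange(w)[None, :]
    lo = np.maximum(cols - arms[:, :, 0], 0)           # left arm bound
    hi = np.minimum(cols + arms[:, :, 1], w - 1)       # right arm bound
    top = np.maximum(rows - arms[:, :, 2], 0)          # up arm bound
    bot = np.minimum(rows + arms[:, :, 3], h - 1)      # down arm bound
    aggregated = np.empty_like(cost)
    for d in range(d_levels):
        # Pass 1: horizontal aggregation via a row-wise cumulative sum.
        row_csum = np.cumsum(np.pad(cost[d], ((0, 0), (1, 0))), axis=1)
        horiz = row_csum[rows, hi + 1] - row_csum[rows, lo]
        # Pass 2: vertical aggregation of the stored horizontal sums.
        col_csum = np.cumsum(np.pad(horiz, ((1, 0), (0, 0))), axis=0)
        aggregated[d] = col_csum[bot + 1, cols] - col_csum[top, cols]
    return aggregated
```

Because each pass is a difference of two cumulative sums, the work per pixel and per disparity level is constant regardless of the support-region size, which is what makes per-pixel dynamic regions affordable.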
It will be understood by those skilled in the art that the above example of performing cost aggregation based on the dynamic support regions of pixels through the cost aggregation unit 20 shown in Fig. 3 does not depend on a specific pixel matching cost algorithm; that is, the pixel matching cost function being aggregated is not limited to the concrete cost function produced by the cost computing unit 10 of Fig. 1, but can be any of various pixel matching cost functions. Although the dynamic region described above is a two-dimensional spatial region, a three-dimensional support region over space and the disparity domain can also be applied in the present invention. In addition, those skilled in the art should also understand that the pixel dynamic support region construction unit 201 and the dynamic support region cost aggregation unit 202 shown in Fig. 3 are only one example, and this example does not constitute a structural limitation of the cost aggregation unit 20. That is to say, the cost aggregation unit 20 can entirely be realized as a single physical unit performing all the operations of the pixel dynamic support region construction unit 201 and the dynamic support region cost aggregation unit 202, or the cost aggregation unit 20 can be further subdivided into more units to perform the operations.
Returning to Fig. 2, after the aggregation result of the cost function has been obtained as above, at step S300 the disparity determination unit 30 can determine the disparity from the pixel of the reference image to the target image based on the result of the aggregation of the matching costs performed by the cost aggregation unit 20. Specifically, the disparity determination unit 30 can compute the disparity map with an energy optimizer, that is, determine the set of disparity levels that minimizes the matching cost according to a specific energy computation, where p denotes an arbitrary pixel in the reference image and d_p is the disparity value of pixel p. If a local WTA (Winner-Take-All) algorithm were applied directly to the aggregated cost result to select, for each pixel p, the disparity with the smallest matching cost, the produced disparity map would contain many noisy depth values. Therefore, as a preferred mode, a global algorithm that imposes an additional constraint on depth smoothness can be adopted, and the processing of determining the disparity is configured as determining the set of disparity levels that minimizes a global energy, which is taken as the determined disparity map.
As an example, the disparity determination unit 30 can construct, according to Equation 4, the global energy of the matching costs with the attached depth smoothness constraint, and then determine the set of disparity levels that minimizes this global energy as the final stereo image matching disparity map. In Equation 4, C_Aggr(p, d_p) is the aggregation result of the matching cost, N is a neighborhood of pixel p that can be chosen according to actual needs, q is a pixel in the neighborhood N, d_q is the disparity value of pixel q, T(A) is a truth function (that is, T(A) takes the value 1 when the condition A holds and 0 otherwise), and P_1 and P_2 are penalty parameters that can be set according to actual needs. In addition, as a preferred mode, a scan-line optimization approach can be adopted to carry out the above determination of Equation 4 efficiently.
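The combination of an aggregated data term with P_1/P_2 smoothness penalties minimized by scan-line optimization matches the well-known semi-global scheme, so the sketch below shows one left-to-right scan-line recursion under that assumption; the exact form of Equation 4 is not reproduced here and the penalty values are illustrative. In a typical implementation the pass is repeated along several directions (for example, four), the per-direction path costs are summed, and the final disparity of each pixel is the level with the minimum summed cost.

```python
import numpy as np

def scanline_pass(cost, p1=0.1, p2=0.8):
    """One left-to-right scan-line pass (SGM-style recursion, assumed form):
    L(x, d) = C(x, d) + min(L(x-1, d), L(x-1, d±1) + P1, min_k L(x-1, k) + P2)
              - min_k L(x-1, k)."""
    d_levels, h, w = cost.shape
    L = np.zeros_like(cost)
    L[:, :, 0] = cost[:, :, 0]
    for x in range(1, w):
        prev = L[:, :, x - 1]                                   # D x H
        prev_min = prev.min(axis=0, keepdims=True)              # 1 x H
        up = np.vstack([prev[1:], np.full((1, h), np.inf)])     # L(x-1, d+1)
        down = np.vstack([np.full((1, h), np.inf), prev[:-1]])  # L(x-1, d-1)
        best = np.minimum.reduce([prev, up + p1, down + p1, prev_min + p2])
        L[:, :, x] = cost[:, :, x] + best - prev_min
    return L

def winner_take_all(path_costs):
    """Final disparity selection: the level minimizing the (summed) path cost."""
    return path_costs.argmin(axis=0)
```

Subtracting the previous column's minimum keeps the path costs bounded without changing which disparity wins, a standard trick in scan-line dynamic programming.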
Returning to Fig. 2, after the disparity determination unit 30 has determined the disparity map from the pixels of the reference image to the target image, as a preferred but non-essential embodiment, a step S400 may additionally be included, in which the disparity map refinement unit 40 applies refinement processing to the disparity map output by the disparity determination unit 30, thereby further improving the performance of stereo image matching.
The detailed structure of the disparity map refinement unit 40 according to an exemplary embodiment of the present invention is described below with reference to Fig. 6.
Fig. 6 is a block diagram of the detailed structure of the disparity map refinement unit according to an exemplary embodiment of the present invention. Referring to Fig. 6, as an exemplary structure, the disparity map refinement unit 40 may comprise: an occlusion handling unit 401, used to detect occluded points in the disparity map and to estimate the disparity of an occluded point using the image around the occluded point; a discontinuity boundary correction unit 402, used to detect, in the disparity map processed by the occlusion handling unit 401, discontinuity boundaries where the disparity value jumps and, for a pixel on a discontinuity boundary, to re-determine its disparity using pixels located a certain distance from that pixel on either side of the discontinuity boundary; and a sub-pixel interpolation unit 403, used to perform sub-pixel-level interpolation on the disparity map corrected by the discontinuity boundary correction unit 402.
Specifically, the occlusion handling unit 401 detects the occluded points in the disparity map (that is, points whose disparity values cannot be confirmed through the corresponding pixels in the target image) by checking the left-right consistency between the reference image and the target image. Then, as shown in (a) of Fig. 7, the disparity value of an occluded point is first estimated using the other pixels in the dynamic support region of that occluded point; for example, the most common disparity value among the disparity values of all pixels of the dynamic support region can be taken as the disparity value of the occluded point. In this way, the disparity values of some or all occluded points can be determined; for occluded points whose disparity values still cannot be determined, as shown in (b) of Fig. 7, scanning can be performed from the occluded point along a plurality of directions (for example, 16 directions), and among the pixels reached along these scanning directions, the disparity value of the pixel whose color is closest to that of the occluded point is determined as the disparity value of the occluded point.
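The sketch below illustrates the two ideas in this paragraph: a left-right consistency check to detect occluded points, and filling an occluded point from its surroundings. As a simplification it votes over a fixed square neighborhood rather than over the pixel's dynamic support region or the 16 scan directions of Fig. 7, so the window radius and the consistency threshold are illustrative assumptions.

```python
import numpy as np

def detect_occlusions(disp_left, disp_right, thresh=1):
    """A pixel is flagged as occluded when its disparity and the disparity of its
    correspondence in the target image disagree by more than thresh."""
    h, w = disp_left.shape
    cols = np.arange(w)[None, :]
    corr_x = np.clip(cols - disp_left.astype(np.int64), 0, w - 1)
    disp_back = disp_right[np.arange(h)[:, None], corr_x]
    return np.abs(disp_left - disp_back) > thresh          # True = occluded

def fill_occlusions(disp, occluded, radius=7):
    """Fill each occluded pixel with the most frequent non-occluded disparity in a
    small neighbourhood (stand-in for the support-region vote described above)."""
    filled = disp.copy()
    h, w = disp.shape
    for y, x in zip(*np.nonzero(occluded)):
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        patch = disp[y0:y1, x0:x1][~occluded[y0:y1, x0:x1]]
        if patch.size:
            values, counts = np.unique(patch, return_counts=True)
            filled[y, x] = values[counts.argmax()]
    return filled
```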
Then, the discontinuity boundary correction unit 402 detects, in the disparity map processed by the occlusion handling unit 401, the discontinuity boundaries where the disparity value jumps (for example, where the difference of disparity values is greater than 2). For a pixel on a discontinuity boundary, the discontinuity boundary correction unit 402 re-determines the disparity of that pixel using the two pixels located on either side of the discontinuity boundary at a certain distance (for example, 4 pixels) from the pixel. For example, the color values of these two pixels can each be compared with that of the pixel on the discontinuity boundary, and the disparity value of the pixel whose color is closer to the color of the boundary pixel is chosen as the disparity value of the boundary pixel. As a preferred mode, the color information of the dynamic support region of a pixel can be used to represent the color value of the pixel, thereby reducing the corresponding computational cost. The result of the above discontinuity boundary correction processing is shown in Fig. 8, where the left part shows the discontinuity boundaries detected before correction and the right part shows the corrected result.
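A row-wise sketch of the boundary correction just described is given below; the jump threshold of 2 and the 4-pixel offset come from the examples in the text, while comparing raw pixel colors (rather than support-region color statistics) is a simplifying assumption.

```python
import numpy as np

def correct_discontinuities(disp, image, jump=2, offset=4):
    """At horizontal disparity jumps, re-assign the boundary pixel the disparity of
    whichever pixel `offset` columns to its left or right is closer to it in colour."""
    img = image.astype(np.float32)
    corrected = disp.copy()
    h, w = disp.shape
    for y in range(h):
        for x in range(offset, w - offset):
            if abs(int(disp[y, x]) - int(disp[y, x - 1])) > jump:
                left_diff = np.abs(img[y, x] - img[y, x - offset]).sum()
                right_diff = np.abs(img[y, x] - img[y, x + offset]).sum()
                source = x - offset if left_diff < right_diff else x + offset
                corrected[y, x] = disp[y, source]
    return corrected
```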
Then, the sub-pixel interpolation unit 403 can perform sub-pixel-level interpolation on the disparity map corrected by the discontinuity boundary correction unit 402. Since all of the preceding processing is based on the integer pixel level, in order to further refine the produced disparity map, sub-pixel-level interpolation can be performed, for example, according to Equation 5, where d* is the sub-pixel-level disparity value and E denotes the energy computed through Equation 4.
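Equation 5 itself is not reproduced on this page, so the sketch below uses the common quadratic (parabola) interpolation of the energy around the integer minimum; it serves the stated purpose of sub-pixel refinement but is an assumption about the exact formula.

```python
import numpy as np

def subpixel_refine(disp, energy_volume):
    """Fit a parabola through the energies at d-1, d, d+1 and take its minimum
    (assumed form of the sub-pixel interpolation; Eq. 5 may differ in detail)."""
    d_levels, h, w = energy_volume.shape
    ys, xs = np.mgrid[0:h, 0:w]
    d = np.clip(disp.astype(np.int64), 1, d_levels - 2)
    e_prev = energy_volume[d - 1, ys, xs]
    e_curr = energy_volume[d, ys, xs]
    e_next = energy_volume[d + 1, ys, xs]
    denom = e_prev - 2.0 * e_curr + e_next
    safe = np.where(np.abs(denom) > 1e-6, denom, 1.0)
    delta = np.where(np.abs(denom) > 1e-6, 0.5 * (e_prev - e_next) / safe, 0.0)
    return d + np.clip(delta, -0.5, 0.5)
```

The correction is clipped to ±0.5 pixel so that the refined value stays between the neighbouring integer disparity levels.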
It will be understood by those skilled in the art that the occlusion handling unit 401, the discontinuity boundary correction unit 402 and the sub-pixel interpolation unit 403 shown in Fig. 6 are only one example, and this example does not constitute a structural limitation of the disparity map refinement unit 40. That is to say, the disparity map refinement unit 40 can entirely be realized as a single physical unit performing all the operations of the occlusion handling unit 401, the discontinuity boundary correction unit 402 and the sub-pixel interpolation unit 403, or the disparity map refinement unit 40 can be further subdivided into more units to perform the operations. In addition, the occlusion handling unit 401, the discontinuity boundary correction unit 402 and the sub-pixel interpolation unit 403 each perform an independent disparity map refinement process; therefore, the disparity map refinement unit 40 may comprise only any one or two of the above three units, and there is no fixed order among the three units.
The device and method for performing stereo image matching according to exemplary embodiments of the present invention have been shown above. Fig. 9 illustrates the performance improvement of a stereo image matching system according to an exemplary embodiment of the present invention compared with the prior art, where (a) and (b) of Fig. 9 respectively show the test results on the Middlebury website for error threshold = 1 and error threshold = 0.5. As shown in the figure, the stereo image matching scheme according to an exemplary embodiment of the present invention (shown as "YOUR METHOD" in the test result interface) clearly leads the other prior art schemes in both matching accuracy and processing efficiency.
According to the present invention, the matching cost function can be computed according to specific rules, and cost aggregation can further be performed over a dynamic support region for each pixel; on the basis of the aggregated costs, the global depth smoothness can be taken into account to produce the disparity map of the reference image, so that both matching accuracy and processing speed are improved in the process of realizing stereo image matching. In addition, by applying refinement and correction processing to the produced disparity map, the performance of image matching can be further improved.
It should be noted that the stereo image matching method and device according to exemplary embodiments of the present invention can be included in a generating apparatus for 3D content, and can also be included in a display apparatus for 3D content. Besides the stereo image matching device according to an exemplary embodiment of the present invention, such an apparatus also comprises a 3D data input unit, a 3D data analysis unit, a 3D content generation unit or a 3D content display unit; since these units all belong to the prior art outside the present invention, they are not described in detail here in order to avoid obscuring the subject matter of the present invention.
The embodiments of the present invention described above are only exemplary, and the present invention is not limited to them. Those skilled in the art should understand that any stereo image matching scheme that separately involves the matching cost processing, the pixel dynamic support region construction, the disparity map generation or the refinement described above falls within the scope of the present invention. These embodiments may be changed without departing from the principle and spirit of the present invention, the scope of which is defined in the claims and their equivalents.
Claims (15)
1. A stereo image matching device, the device comprising:
a cost computing unit, used to compute the matching cost between a pixel of a reference image and a pixel of a target image, wherein the cost computing unit respectively computes a non-parametric statistical cost function and a color mean absolute difference function between the pixel of the reference image and the pixel of the target image, linearly combines the non-parametric statistical cost function and the color mean absolute difference function, and takes the combined result as the matching cost between the pixel of the reference image and the pixel of the target image;
a cost aggregation unit, used to aggregate the matching costs between the pixels of the reference image and the pixels of the target image computed by the cost computing unit; and
a disparity determination unit, used to determine the disparity from the pixel of the reference image to the target image based on the result of the aggregation of the matching costs performed by the cost aggregation unit, so as to realize stereo image matching.
2. The device as claimed in claim 1, wherein the cost aggregation unit determines, for each pixel in the reference image, a dynamic support region of that pixel and aggregates, within each determined dynamic support region, the matching costs between the pixels of the reference image and the pixels of the target image computed by the cost computing unit.
3. The device as claimed in claim 1, wherein the disparity determination unit constructs a global energy of the matching costs with an attached depth smoothness constraint, and determines the set of disparity levels that minimizes the global energy as the final stereo image matching disparity map.
4. The device as claimed in claim 3, wherein the cost aggregation unit comprises:
a pixel dynamic support region construction unit, used to construct a dynamic support region for each pixel of the reference image; and
a dynamic support region cost aggregation unit, used to perform cost aggregation over all pixels within each constructed dynamic support region.
5. The device as claimed in claim 3, further comprising: a disparity map refinement unit, used to apply refinement processing to the disparity map output by the disparity determination unit.
6. The device as claimed in claim 5, wherein the disparity map refinement unit comprises at least one of the following:
an occlusion handling unit, used to detect occluded points in the disparity map and to estimate the disparity of an occluded point using the image around the occluded point;
a discontinuity boundary correction unit, used to detect discontinuity boundaries in the disparity map where the disparity value jumps and, for a pixel on a discontinuity boundary, to re-determine its disparity using pixels located a certain distance from that pixel on either side of the discontinuity boundary; and
a sub-pixel interpolation unit, used to perform sub-pixel-level interpolation on the disparity map.
7. The device as claimed in claim 1, wherein, in the linear combination, the ratio between the non-parametric statistical cost function and the color mean absolute difference function is set adaptively based on the local texture, the local gradient and/or the matching confidence information of each pixel.
8. A stereo image matching method, the method comprising:
computing the matching cost between a pixel of a reference image and a pixel of a target image, wherein a non-parametric statistical cost function and a color mean absolute difference function between the pixel of the reference image and the pixel of the target image are respectively computed, the non-parametric statistical cost function and the color mean absolute difference function are linearly combined, and the combined result is taken as the matching cost between the pixel of the reference image and the pixel of the target image;
aggregating the computed matching costs between the pixels of the reference image and the pixels of the target image; and
determining the disparity from the pixel of the reference image to the target image based on the result of aggregating the matching costs, so as to realize stereo image matching.
9. The method as claimed in claim 8, wherein the aggregating step comprises: determining, for each pixel in the reference image, a dynamic support region of that pixel, and aggregating, within each determined dynamic support region, the computed matching costs between the pixels of the reference image and the pixels of the target image.
10. The method as claimed in claim 8, wherein the step of determining the disparity comprises: constructing a global energy of the matching costs with an attached depth smoothness constraint, and determining the set of disparity levels that minimizes the global energy as the final stereo image matching disparity map.
11. The method as claimed in claim 10, further comprising: applying refinement processing to the final stereo image matching disparity map.
12. The method as claimed in claim 11, wherein the step of applying refinement processing to the final stereo image matching disparity map comprises at least one of the following steps:
detecting occluded points in the disparity map, and estimating the disparity of an occluded point using the image around the occluded point;
detecting discontinuity boundaries in the disparity map where the disparity value jumps and, for a pixel on a discontinuity boundary, re-determining its disparity using pixels located a certain distance from that pixel on either side of the discontinuity boundary; and
performing sub-pixel-level interpolation on the disparity map.
13. The method as claimed in claim 8, wherein, in the linear combination, the ratio between the non-parametric statistical cost function and the color mean absolute difference function is set adaptively based on the local texture, the local gradient and/or the matching confidence information of each pixel.
14. A stereo image matching device, the device comprising:
a cost computing unit, used to compute the matching cost between a pixel of a reference image and a pixel of a target image;
a cost aggregation unit, used to determine, for each pixel in the reference image, a dynamic support region of that pixel and to aggregate, within each determined dynamic support region, the matching costs between the pixels of the reference image and the pixels of the target image computed by the cost computing unit; and
a disparity determination unit, used to determine the disparity from the pixel of the reference image to the target image based on the result of the aggregation of the matching costs performed by the cost aggregation unit, so as to realize stereo image matching.
15. A stereo image matching method, the method comprising:
computing the matching cost between a pixel of a reference image and a pixel of a target image;
determining, for each pixel in the reference image, a dynamic support region of that pixel, and aggregating, within each determined dynamic support region, the computed matching costs between the pixels of the reference image and the pixels of the target image; and
determining the disparity from the pixel of the reference image to the target image based on the result of aggregating the matching costs, so as to realize stereo image matching.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2010105025844A CN102447917A (en) | 2010-10-08 | 2010-10-08 | Three-dimensional image matching method and equipment thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2010105025844A CN102447917A (en) | 2010-10-08 | 2010-10-08 | Three-dimensional image matching method and equipment thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN102447917A true CN102447917A (en) | 2012-05-09 |
Family
ID=46009941
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2010105025844A Pending CN102447917A (en) | 2010-10-08 | 2010-10-08 | Three-dimensional image matching method and equipment thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102447917A (en) |
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100142828A1 (en) * | 2008-12-10 | 2010-06-10 | Electronics And Telecommunications Research Institute | Image matching apparatus and method |
CN101841730A (en) * | 2010-05-28 | 2010-09-22 | 浙江大学 | Real-time stereoscopic vision implementation method based on FPGA |
Non-Patent Citations (1)
Title |
---|
KRISTIAN AMBROSCH等: "Parameter Optimization of the SAD-IGMCT forStereo Vision in RGB and HSV Color Spaces", 《ELMAR, 2010 PROCEEDINGS》, 17 September 2010 (2010-09-17) * |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10469829B2 (en) | 2012-05-22 | 2019-11-05 | Sony Interactive Entertainment Inc. | Information processor and information processing method |
CN104380338A (en) * | 2012-05-22 | 2015-02-25 | 索尼电脑娱乐公司 | Information processing device and information processing method |
CN102831601A (en) * | 2012-07-26 | 2012-12-19 | 中北大学 | Three-dimensional matching method based on union similarity measure and self-adaptive support weighting |
CN102881014A (en) * | 2012-09-07 | 2013-01-16 | 航天恒星科技有限公司 | Quick stereo matching method based on graph cut |
CN102881014B (en) * | 2012-09-07 | 2015-02-11 | 航天恒星科技有限公司 | Quick stereo matching method based on graph cut |
CN103167306A (en) * | 2013-03-22 | 2013-06-19 | 上海大学 | A device and method for real-time extraction of high-resolution depth maps based on image matching |
WO2017067390A1 (en) * | 2015-10-20 | 2017-04-27 | 努比亚技术有限公司 | Method and terminal for obtaining depth information of low-texture regions in image |
CN107770512A (en) * | 2016-08-22 | 2018-03-06 | 现代自动车株式会社 | The system and method for disparity map are produced by matching stereo-picture |
CN107770512B (en) * | 2016-08-22 | 2020-11-06 | 现代自动车株式会社 | System and method for generating disparity map by matching stereo images |
CN107588721A (en) * | 2017-08-28 | 2018-01-16 | 武汉科技大学 | The measuring method and system of a kind of more sizes of part based on binocular vision |
CN108230273B (en) * | 2018-01-05 | 2020-04-07 | 西南交通大学 | Three-dimensional image processing method of artificial compound eye camera based on geometric information |
CN108230273A (en) * | 2018-01-05 | 2018-06-29 | 西南交通大学 | A kind of artificial compound eye camera three dimensional image processing method based on geological information |
CN109544611A (en) * | 2018-11-06 | 2019-03-29 | 深圳市爱培科技术股份有限公司 | A kind of binocular vision solid matching method and system based on bit feature |
CN109544611B (en) * | 2018-11-06 | 2021-05-14 | 深圳市爱培科技术股份有限公司 | Binocular vision stereo matching method and system based on bit characteristics |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11562498B2 (en) | Systems and methods for hybrid depth regularization | |
CN102447917A (en) | Three-dimensional image matching method and equipment thereof | |
CN106780590B (en) | Method and system for acquiring depth map | |
US10477178B2 (en) | High-speed and tunable scene reconstruction systems and methods using stereo imagery | |
CN113034568B (en) | Machine vision depth estimation method, device and system | |
Nalpantidis et al. | Stereo vision for robotic applications in the presence of non-ideal lighting conditions | |
US9350969B2 (en) | Target region filling involving source regions, depth information, or occlusions | |
CN102665086B (en) | Method for obtaining parallax by using region-based local stereo matching | |
Chen et al. | Transforming a 3-d lidar point cloud into a 2-d dense depth map through a parameter self-adaptive framework | |
CN103310421B (en) | The quick stereo matching process right for high-definition image and disparity map acquisition methods | |
US20130064443A1 (en) | Apparatus and method for determining a confidence value of a disparity estimate | |
CN103177451B (en) | Based on the self-adapting window of image border and the Stereo Matching Algorithm of weight | |
CN104065947B (en) | The depth map acquisition methods of a kind of integration imaging system | |
JP2014150521A (en) | Method and system for increasing resolution of depth image | |
CN105335955A (en) | Object detection method and object detection apparatus | |
CN102831601A (en) | Three-dimensional matching method based on union similarity measure and self-adaptive support weighting | |
US20140313187A1 (en) | Stereoscopic Target Region Filling | |
CN102136136A (en) | Luminosity insensitivity stereo matching method based on self-adapting Census conversion | |
CN106408596B (en) | Edge-Based Local Stereo Matching Method | |
Hua et al. | Extended guided filtering for depth map upsampling | |
CN104156957A (en) | Stable and high-efficiency high-resolution stereo matching method | |
CN109887008B (en) | Method, device and equipment for parallax stereo matching based on forward and backward smoothing and O (1) complexity | |
CN114004754A (en) | Scene depth completion system and method based on deep learning | |
CN112734822B (en) | Stereo matching algorithm based on infrared and visible light images | |
Shivakumar et al. | Real time dense depth estimation by fusing stereo with sparse depth measurements |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| AD01 | Patent right deemed abandoned | Effective date of abandoning: 20160302 |
| C20 | Patent right or utility model deemed to be abandoned or is abandoned | |