CN110490829A - Depth image filtering method and system - Google Patents
Depth image filtering method and system
- Publication number
- CN110490829A (application number CN201910789404.6A)
- Authority
- CN
- China
- Prior art keywords
- pixel
- image
- target pixel
- value
- depth value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20024—Filtering details
- G06T2207/20028—Bilateral filtering
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a depth image filtering method and system, comprising: determining a target pixel in a second image obtained by extending an original depth image, and obtaining attribute information of the target pixel based on a set dynamic threshold; if the target pixel lies in a planar region, calculating the mean of the pixel depth values in the neighborhood of the target pixel and updating all pixel depth values in the neighborhood to the mean, to obtain the filtering result of the planar region; if the target pixel lies in an edge region, calculating weights for all pixels in the neighborhood of the target pixel, calculating a target depth value for the target pixel from the weights, and updating the depth value of the target pixel to the target depth value, to obtain the filtering result of the edge region; and obtaining the filtered depth image based on the filtering result of the planar region and the filtering result of the edge region. The invention solves the problem that depth map filtering cannot simultaneously achieve a pronounced smoothing effect in planar regions and preserve edge information.
Description
Technical field
The present invention relates to the technical field of image processing, and more particularly to a depth image filtering method and system.
Background art
A depth image, also referred to as a range image, is an image in which the pixel values are the distances (depths) from the image acquisition device to points in the scene; it directly reflects the geometry of the visible surfaces of the scene. Depth images are currently acquired in two main ways: direct measurement by hardware devices, principally structured-light depth cameras and TOF (Time of Flight) depth cameras; or computation in software using binocular (stereo) matching algorithms.
In hardware measurement, the depth image is affected by the manufacturing process and performance of the hardware device and therefore contains a certain measurement error, and this error increases as the depth value increases. In software measurement, the depth image obtained is also subject to error due to factors such as the resolution of the binocular device and the accuracy of the matching algorithm.
Although such errors in the depth image can be reduced by filtering algorithms in the prior art, traditional filtering algorithms either have little smoothing effect in the planar regions of the depth image or fail to preserve edge information.
Summary of the invention
In view of the above problems, the present invention provides a depth image filtering method and system that solve the problem that depth map filtering cannot simultaneously achieve a pronounced smoothing effect in planar regions and preserve edge information.
To achieve the above objects, the present invention provides the following technical solution:
A depth image filtering method, the method comprising:
performing image extension on a first image according to a preset sliding window to obtain a second image, wherein the first image represents an original depth image, and the size of the second image has a preset correspondence with the size of the first image and the size of the preset sliding window;
determining a target pixel in the second image, and performing attribute determination on the target pixel based on a set dynamic threshold to obtain attribute information of the target pixel, the attribute information indicating that the target pixel lies in an edge region or that the target pixel lies in a planar region;
if the target pixel lies in a planar region, calculating the mean of the pixel depth values in the neighborhood of the target pixel and updating all pixel depth values in the neighborhood to the mean, thereby filtering each pixel of the planar region and obtaining the filtering result of the planar region;
if the target pixel lies in an edge region, calculating the weights of all pixels in the neighborhood of the target pixel, calculating a target depth value for the target pixel from the weights, and updating the depth value of the target pixel to the target depth value, thereby filtering the pixels of the edge region and obtaining the filtering result of the edge region;
obtaining the filtered depth image based on the filtering result of the planar region and the filtering result of the edge region.
Optionally, performing image extension on the first image according to the preset sliding window to obtain the second image comprises:
extending each of the four sides (top, bottom, left and right) of the first image by a first number of pixels according to the size of the preset sliding window, wherein the size of the preset sliding window is N*N, N is a positive odd number greater than or equal to 3, the first number = (N-1)/2, and the depth value of each extended pixel equals the depth value of its adjacent pixel;
calculating the size of the second image according to the size of the first image and the size of the sliding window, wherein, if the width of the first image is W1 and its height is H1, the width of the second image is W2 = W1 + N - 1 and the height of the second image is H2 = H1 + N - 1;
assigning values to the pixels in the second image according to the depth values of the pixels of the first image and the size of the preset sliding window, to obtain the second image after assignment.
Optionally, determining the target pixel in the second image and performing attribute determination on the target pixel based on the set dynamic threshold to obtain the attribute information of the target pixel comprises:
determining a dynamic threshold matching the target pixel according to reference parameters of the target pixel of the second image, the reference parameters including a reference threshold, a reference percentage and a reference depth value;
obtaining the neighborhood of the target pixel;
calculating the gradient of each pixel in the neighborhood to obtain a gradient set;
comparing each gradient value in the gradient set with the dynamic threshold; if any gradient value is greater than the dynamic threshold, determining that the target pixel lies in an edge region; otherwise, the target pixel lies in a planar region.
Optionally, if the target pixel lies in a planar region, calculating the mean of the pixel depth values in the neighborhood of the target pixel and updating all pixel depth values in the neighborhood to the mean comprises:
if the target pixel lies in a planar region, calculating the sum of all pixels in the neighborhood of the target pixel;
calculating the mean of all pixels in the neighborhood according to the sum of all pixels;
updating the depth values of all pixels in the neighborhood to the mean;
updating the coordinate values of the target pixel and filtering the updated target pixel.
Optionally, if the target pixel lies in an edge region, calculating the weights of all pixels in the neighborhood of the target pixel, calculating the target depth value of the target pixel from the weights, and updating the depth value of the target pixel to the target depth value, thereby filtering the pixels of the edge region and obtaining the filtering result of the edge region, comprises:
if the target pixel lies in an edge region, calculating the weights of all pixels in the neighborhood of the target pixel;
normalizing the weights of the pixels to obtain normalized weight values;
calculating the target depth value of the target pixel based on the normalized weight values;
updating the depth value of the target pixel to the target depth value, the depth values of the pixels in the edge region other than the target pixel being their corresponding original depth values, thereby filtering the pixels of the edge region and obtaining the filtering result of the edge region.
Optionally, the method further comprises:
if the depth value of a pixel in the neighborhood is 0, rejecting the pixel whose depth value is 0.
A depth image filtering system, the system comprising:
an extension unit, configured to perform image extension on a first image according to a preset sliding window to obtain a second image, wherein the first image represents an original depth image, and the size of the second image has a preset correspondence with the size of the first image and the size of the preset sliding window;
an attribute determination unit, configured to determine a target pixel in the second image and perform attribute determination on the target pixel based on a set dynamic threshold to obtain attribute information of the target pixel, the attribute information indicating that the target pixel lies in an edge region or that the target pixel lies in a planar region;
a planar region processing unit, configured to, if the target pixel lies in a planar region, calculate the mean of the pixel depth values in the neighborhood of the target pixel and update all pixel depth values in the neighborhood to the mean, thereby filtering each pixel of the planar region and obtaining the filtering result of the planar region;
an edge region processing unit, configured to, if the target pixel lies in an edge region, calculate the weights of all pixels in the neighborhood of the target pixel, calculate a target depth value for the target pixel from the weights, and update the depth value of the target pixel to the target depth value, thereby filtering the pixels of the edge region and obtaining the filtering result of the edge region;
a generation unit, configured to obtain the filtered depth image based on the filtering result of the planar region and the filtering result of the edge region.
Optionally, the extension unit comprises:
an extension subunit, configured to extend each of the four sides (top, bottom, left and right) of the first image by a first number of pixels according to the size of the preset sliding window, wherein the size of the preset sliding window is N*N, N is a positive odd number greater than or equal to 3, the first number = (N-1)/2, and the depth value of each extended pixel equals the depth value of its adjacent pixel;
a first calculation subunit, configured to calculate the size of the second image according to the size of the first image and the size of the sliding window, wherein, if the width of the first image is W1 and its height is H1, the width of the second image is W2 = W1 + N - 1 and the height of the second image is H2 = H1 + N - 1;
an assignment subunit, configured to assign values to the pixels in the second image according to the depth values of the pixels of the first image and the size of the preset sliding window, to obtain the second image after assignment.
Optionally, the attribute determination unit comprises:
a determination subunit, configured to determine a dynamic threshold matching the target pixel according to reference parameters of the target pixel of the second image, the reference parameters including a reference threshold, a reference percentage and a reference depth value;
a first obtaining subunit, configured to obtain the neighborhood of the target pixel;
a second calculation subunit, configured to calculate the gradient of each pixel in the neighborhood to obtain a gradient set;
a comparison subunit, configured to compare each gradient value in the gradient set with the dynamic threshold; if any gradient value is greater than the dynamic threshold, it is determined that the target pixel lies in an edge region; otherwise, the target pixel lies in a planar region.
Optionally, the planar region processing unit comprises:
a third calculation subunit, configured to, if the target pixel lies in a planar region, calculate the sum of all pixels in the neighborhood of the target pixel;
a fourth calculation subunit, configured to calculate the mean of all pixels in the neighborhood according to the sum of all pixels;
a first updating subunit, configured to update the depth values of all pixels in the neighborhood to the mean;
a first processing subunit, configured to update the coordinate values of the target pixel and filter the updated target pixel;
and the edge region processing unit comprises:
a fifth calculation subunit, configured to, if the target pixel lies in an edge region, calculate the weights of all pixels in the neighborhood of the target pixel;
a normalization subunit, configured to normalize the weights of the pixels to obtain normalized weight values;
a sixth calculation subunit, configured to calculate the target depth value of the target pixel based on the normalized weight values;
a second updating subunit, configured to update the depth value of the target pixel to the target depth value, the depth values of the pixels in the edge region other than the target pixel being their corresponding original depth values, thereby filtering the pixels of the edge region and obtaining the filtering result of the edge region.
Compared with the prior art, the present invention provides a depth image filtering method and system. The original depth image is extended to obtain a second image, and the attribute of the target pixel in the second image is then determined based on a dynamic threshold. Using a dynamic threshold gives adaptivity and robustness to different depth values; depth pixel values are classified according to the dynamic threshold, so that different filtering processes can be applied to different attributes, i.e. in planar regions the depth values of all pixels in the neighborhood are updated with the mean, while for edge regions the depth value of the target pixel is updated with a recalculated depth value. This solves the problem that depth map filtering cannot simultaneously achieve a pronounced smoothing effect in planar regions and preserve edge information.
Brief description of the drawings
In order to describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from the provided drawings without creative effort.
Fig. 1 is a schematic flowchart of a depth image filtering method provided in an embodiment of the present invention;
Fig. 2 is a schematic diagram of an original depth image provided in an embodiment of the present invention;
Fig. 3 is a three-dimensional display in MATLAB of the original depth map provided in an embodiment of the present invention;
Fig. 4 is a schematic diagram of the depth map after filtering provided in an embodiment of the present invention;
Fig. 5 is a three-dimensional display in MATLAB of the depth map after filtering provided in an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a depth image filtering system provided in an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
The terms "first" and "second" in the description, claims and drawings of this specification are used to distinguish different objects, not to describe a specific order. Furthermore, the terms "comprising" and "having" and any variations thereof are intended to cover a non-exclusive inclusion; for example, a process, method, system, product or device comprising a series of steps or units is not limited to the listed steps or units, but may include steps or units that are not listed.
An embodiment of the present invention provides a depth image filtering method. Referring to Fig. 1, the method comprises the following steps:
S101, performing image extension on the first image according to a preset sliding window to obtain a second image.
Here the first image represents the original depth image, and the size of the second image has a preset correspondence with the size of the first image and the size of the preset sliding window. In order to avoid losing edge information, the four sides (top, bottom, left and right) of the original depth image need to be extended by a preset number of pixels, so that the extended second image satisfies the processing requirements. Correspondingly, the original image can also be extended according to the size of the set sliding window.
An embodiment of the present application further provides an image extension method, which may comprise the following steps:
S1011, extending each of the four sides (top, bottom, left and right) of the first image by a first number of pixels according to the size of the preset sliding window;
S1012, calculating the size of the second image according to the size of the first image and the size of the sliding window;
S1013, assigning values to the pixels in the second image according to the depth values of the pixels of the first image and the size of the preset sliding window, to obtain the second image after assignment.
Here the size of the preset sliding window is set to N*N, where N is a positive odd number greater than or equal to 3. It should be noted that N is set to an odd number so that the centre pixel of the sliding window has the same number of pixels on both sides. The first number = (N-1)/2, i.e. each of the four sides (top, bottom, left and right) of the original depth image is extended by (N-1)/2 pixels, and the depth value of each extended pixel equals the depth value of its adjacent pixel. Specifically:
If the width of the first image is W1 and its height is H1, the width of the second image is W2 = W1 + N - 1 and the height of the second image is H2 = H1 + N - 1.
Values are assigned to the depth values of the pixels in the second image as shown in formula (1), where d2(i, j) denotes the depth value at row coordinate i and column coordinate j in the second image, d1(i, j) likewise denotes the depth value at row coordinate i and column coordinate j in the first image, max(a, b) denotes the larger of the two numbers a and b, and min(a, b) denotes the smaller of the two numbers a and b.
It should be noted that the value of N needs to be chosen according to factors such as the image resolution, the filtering accuracy, and the running time of the algorithm. For ease of calculation, values of 3 to 11 are preferred, but other values are not excluded and may be chosen according to the specific situation. For example, in a specific implementation of the present invention the parameter values are W1 = 552, H1 = 664, N = 7, and (N-1)/2 = 3. In the following parts of the present invention, a pixel of the second image is denoted d(i, j), where d(i, j) denotes the pixel at the i-th row and j-th column of the second image and at the same time denotes the depth value of that pixel.
S102, determining a target pixel in the second image, and performing attribute determination on the target pixel based on a set dynamic threshold to obtain attribute information of the target pixel.
After the extended image of the original depth image has been obtained, attribute determination needs to be performed on the pixel d(i, j) to be filtered in the second image, i.e. it is determined whether the pixel lies in an edge region or in a planar region.
An embodiment of the present invention further provides an attribute determination method, comprising:
S1021, determining a dynamic threshold matching the target pixel according to reference parameters of the target pixel of the second image, the reference parameters including a reference threshold, a reference percentage and a reference depth value;
S1022, obtaining the neighborhood of the target pixel;
S1023, calculating the gradient of each pixel in the neighborhood to obtain a gradient set;
S1024, comparing each gradient value in the gradient set with the dynamic threshold; if any gradient value is greater than the dynamic threshold, determining that the target pixel lies in an edge region; otherwise, the target pixel lies in a planar region.
Specifically, the starting coordinates of the pixel d(i, j) to be processed in the second image are i = (N-1)/2, j = (N-1)/2. The attribute determination proceeds as follows.
First, the reference threshold for the attribute determination of pixel d(i, j) is obtained according to the dynamic threshold calculation formula, shown as formula (2):
where DT(i, j) denotes the reference dynamic threshold for the attribute determination of pixel d(i, j), base is the reference percentage, and ref_depth is the reference depth value; for example, in one implementation the parameter values are base = 5 and ref_depth = 1000.
Next, the neighborhood of pixel d(i, j) is obtained. With pixel d(i, j) as the centre of the sliding window, the set of all elements covered by the sliding window on the depth image is denoted the neighborhood of pixel d(i, j); the neighborhood therefore contains the target pixel d(i, j).
The gradient between each pixel in the neighborhood and d(i, j) is then calculated, the absolute value of every gradient value is taken, and the set of all absolute gradient values is denoted the gradient set GS of d(i, j). The elements of GS are calculated by formula (3):
grad(r, c) = abs(d(i+r, j+c) - d(i, j))    (3)
where grad(r, c) denotes the ((r+a)*N + (c+a))-th element of GS, r = -a, -(a-1), ..., (a-1), a; c = -a, -(a-1), ..., (a-1), a; a = (N-1)/2; and abs denotes the absolute value operation. For example, in one implementation the parameter values are N = 7 and a = 3.
Each gradient value in GS is compared with the reference dynamic threshold. If any gradient value is greater than the threshold, d(i, j) lies in an edge region of the depth image; otherwise, d(i, j) lies in a planar region of the depth image.
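The following Python sketch illustrates this attribute determination for a single pixel of the extended image. Because formula (2) is not reproduced in this text, the dynamic threshold is assumed, purely for illustration, to scale the reference percentage base by the ratio of the pixel's own depth to the reference depth ref_depth; the gradient test itself follows formula (3).

```python
import numpy as np

def is_edge_pixel(d2: np.ndarray, i: int, j: int, N: int = 7,
                  base: float = 5.0, ref_depth: float = 1000.0) -> bool:
    """Attribute determination for pixel d(i, j) of the extended image d2.

    Assumption: the dynamic threshold DT(i, j) below is a stand-in for
    formula (2), which is not reproduced here; it grows with the pixel's own
    depth so that deeper (noisier) pixels tolerate larger gradients.
    """
    a = (N - 1) // 2
    dt = base * d2[i, j] / ref_depth                   # assumed DT(i, j)

    # Neighborhood: the N*N sliding window centred on (i, j).
    window = d2[i - a:i + a + 1, j - a:j + a + 1]

    # Gradient set GS per formula (3): grad(r, c) = abs(d(i+r, j+c) - d(i, j)).
    gradients = np.abs(window - d2[i, j])

    # Edge region if any gradient exceeds the threshold, planar region otherwise.
    return bool(np.any(gradients > dt))

# Example usage on a synthetic extended image:
d2 = np.pad(np.random.randint(900, 1100, size=(664, 552)).astype(np.float32),
            3, mode="edge")
print(is_edge_pixel(d2, i=3, j=3, N=7))
```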
S103, if the target pixel lies in a planar region, calculating the mean of the pixel depth values in the neighborhood of the target pixel and updating all pixel depth values in the neighborhood to the mean, thereby filtering each pixel of the planar region and obtaining the filtering result of the planar region;
S104, if the target pixel lies in an edge region, calculating the weights of all pixels in the neighborhood of the target pixel, calculating a target depth value for the target pixel from the weights, and updating the depth value of the target pixel to the target depth value, to obtain the filtering result of the edge region;
S105, obtaining the filtered depth image based on the filtering result of the planar region and the filtering result of the edge region.
After the attribute information of the target pixel d(i, j) has been obtained, the target pixel needs to be filtered.
When the pixel d(i, j) lies in a planar region of the depth image, the mean of the depth values of all pixels in the window is calculated by mean filtering and the depth values of all pixels in the window are updated with that mean. The specific implementation is as follows.
The sum of all pixels in the neighborhood of d(i, j) is calculated as shown in formula (4):
sum(i, j) = Σ d(i+r, j+c), summed over r = -a, -(a-1), ..., (a-1), a and c = -a, -(a-1), ..., (a-1), a    (4)
where sum(i, j) denotes the sum of the depth values of all elements in the neighborhood of d(i, j) and a = (N-1)/2.
The mean of all pixels in the neighborhood of d(i, j) is calculated by formula (5):
mean(i, j) = sum(i, j)/(N*N)    (5)
As shown in formula (6), the depth value of every pixel in the neighborhood of d(i, j) is updated:
d(i+r, j+c) = mean(i, j)    (6)
where r = -a, -(a-1), ..., (a-1), a; c = -a, -(a-1), ..., (a-1), a; a = (N-1)/2.
After the filtering of the current target pixel d(i, j) is completed, the values of i and j need to be updated so that the next pixel can be filtered. In a planar region the sliding step of the sliding window is a: let j = j + a; if j < W1, i remains unchanged; otherwise, let i = i + a; if i < H1, let j = (N-1)/2 and filter the next pixel d(i, j) to be processed; otherwise, filtering of the second image is complete.
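A sketch of this planar-region branch is given below, again in Python/NumPy. The per-pixel update follows formulas (4) to (6) directly; the window traversal with step a is simplified to a plain double loop and, as an assumption for illustration, is applied here without the edge/plane classification that the method performs first.

```python
import numpy as np

def filter_plane_pixel(d2: np.ndarray, i: int, j: int, N: int = 7) -> None:
    """Mean filtering of a planar-region pixel d(i, j), performed in place.

    Follows formulas (4)-(6): sum the N*N neighborhood, divide by N*N to get
    the mean, and write the mean back to every pixel of the neighborhood.
    """
    a = (N - 1) // 2
    window = d2[i - a:i + a + 1, j - a:j + a + 1]
    total = window.sum()                               # formula (4): sum(i, j)
    mean = total / (N * N)                             # formula (5): mean(i, j)
    d2[i - a:i + a + 1, j - a:j + a + 1] = mean        # formula (6)

# Because the whole neighborhood is updated at once, the window can advance
# with step a = (N-1)/2 in planar regions instead of step 1 (simplified
# traversal shown here; the classification step is omitted for brevity):
N, a, H1, W1 = 7, 3, 664, 552
d2 = np.pad(np.full((H1, W1), 1000.0), a, mode="edge")
for i in range(a, a + H1, a):
    for j in range(a, a + W1, a):
        filter_plane_pixel(d2, i, j, N=N)
```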
When the pixel d(i, j) lies in an edge region of the depth image, a new depth value is calculated using an improved bilateral filtering method and d(i, j) is then updated with the new depth value. The specific process is as follows.
Pixels whose depth value is 0 in the depth map are invalid depth points, so these pixels need to be rejected when the new depth value is calculated. The weights of all pixels in the neighborhood of d(i, j) are calculated according to the improved bilateral filtering method; the weight calculation is shown in formula (7), where abs denotes the absolute value operation and min(a, b) denotes the minimum of a and b.
The weights of all pixels in the neighborhood of d(i, j) are then normalized, as shown in formula (8).
The depth value of d(i, j) is recalculated as shown in formula (9).
Finally, the values of i and j are updated. In an edge region, in order to preserve edge detail, the sliding step of the sliding window is 1: let j = j + 1; if j < W1, i remains unchanged; otherwise, let i = i + 1; if i < H1, let j = (N-1)/2 and filter the next pixel d(i, j) to be processed; otherwise, filtering of the second image is complete.
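Formulas (7) to (9) are not reproduced in this text, so the Python sketch below only illustrates the general shape of the improved bilateral step described above: an exponential spatial weight based on the pixel-separation count rather than the Euclidean distance, an exponential range weight based on a scaled depth difference, rejection of zero-depth (invalid) pixels, normalization of the weights, and a weighted recalculation of the centre depth. The specific weight expressions and the scale parameters sigma_s and sigma_r are assumptions for illustration only.

```python
import numpy as np

def filter_edge_pixel(d2: np.ndarray, i: int, j: int, N: int = 7,
                      sigma_s: float = 3.0, sigma_r: float = 30.0) -> float:
    """Improved-bilateral recalculation of an edge-region pixel d(i, j).

    Assumption: the weights below stand in for formulas (7)-(9), which are
    not reproduced here. The spatial term uses the pixel-separation count
    |r| + |c|, the range term uses a scaled absolute depth difference, and
    pixels with depth value 0 (invalid depth) are rejected.
    """
    a = (N - 1) // 2
    num, den = 0.0, 0.0
    for r in range(-a, a + 1):
        for c in range(-a, a + 1):
            q = float(d2[i + r, j + c])
            if q == 0.0:                      # reject invalid (zero-depth) pixels
                continue
            w_space = np.exp(-(abs(r) + abs(c)) / sigma_s)
            w_range = np.exp(-abs(q - float(d2[i, j])) / sigma_r)
            w = w_space * w_range
            num += w * q
            den += w
    if den == 0.0:                            # every neighbour was invalid
        return float(d2[i, j])
    return num / den                          # normalized weighted depth value

# Example usage (in edge regions the window advances with step 1 so that
# edge detail is preserved):
d2 = np.pad(np.random.randint(0, 2000, size=(664, 552)).astype(np.float32),
            3, mode="edge")
d2[3, 3] = filter_edge_pixel(d2, 3, 3, N=7)
```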
For example, Fig. 2 shows a schematic diagram of the original depth map: a depth image of size 552*664 before filtering, whose foreground is a person with open arms and whose background is a flat wall; the black parts of the image represent pixels with invalid depth values. Fig. 3 is the three-dimensional display of Fig. 2 in MATLAB. It can be seen from Fig. 3 that the depth map captured by the depth camera fluctuates considerably in planar regions, so the depth image needs to be filtered. The filtering method provided above in the embodiments of the present invention is applied with W1 = 552, H1 = 664, N = 7, (N-1)/2 = 3 and a = 3 for the first image. The depth image obtained after filtering the second image is shown in Fig. 4, and Fig. 5 is the three-dimensional display of Fig. 4 in MATLAB; it can be seen from Fig. 5 that, compared with Fig. 3, the planar region has been smoothed. Comparing Fig. 4 with Fig. 2, the detail information of Fig. 2 is preserved: for example, the finger edges in Fig. 2 are still clearly visible in Fig. 4. In addition, some of the holes in Fig. 2 are effectively filled after filtering: for example, positions corresponding to some of the black holes inside the ellipse in Fig. 2 have depth values in Fig. 4.
The depth image filtering method provided in the embodiments of the present invention dynamically determines, from the pixel depth values of the depth image, the threshold used to classify pixels; the dynamic threshold enhances the adaptivity and robustness of the method to different depth values. Secondly, the method classifies depth pixel values according to the dynamic threshold and applies different filtering modes to different classes: mean filtering is used in planar regions and improved bilateral filtering is used in edge regions. This classified filtering can smooth planar regions while preserving the detail information at depth image edges. During plane filtering, updating the depth values of all pixels inside the window allows the step length of the sliding window to be increased at the same time, which both propagates planar smoothness and reduces filtering time. Finally, an improved bilateral filter is used during edge filtering: when the position weight is calculated, the positional distance is replaced with the pixel-separation count; when the depth value weight is calculated, the depth gradient value is scaled down, which makes the weight distribution after the exponential transform more reasonable; and pixels with depth value 0 are rejected when the weights are calculated. As a result, the improved bilateral filter reduces the filtering run time, gives a better filtering effect at depth map edges, and can fill some of the holes in the depth image.
Referring to Fig. 6, an embodiment of the present application further provides a depth image filtering system, comprising:
an extension unit 10, configured to perform image extension on a first image according to a preset sliding window to obtain a second image, wherein the first image represents an original depth image, and the size of the second image has a preset correspondence with the size of the first image and the size of the preset sliding window;
an attribute determination unit 20, configured to determine a target pixel in the second image and perform attribute determination on the target pixel based on a set dynamic threshold to obtain attribute information of the target pixel, the attribute information indicating that the target pixel lies in an edge region or that the target pixel lies in a planar region;
a planar region processing unit 30, configured to, if the target pixel lies in a planar region, calculate the mean of the pixel depth values in the neighborhood of the target pixel and update all pixel depth values in the neighborhood to the mean, thereby filtering each pixel of the planar region and obtaining the filtering result of the planar region;
an edge region processing unit 40, configured to, if the target pixel lies in an edge region, calculate the weights of all pixels in the neighborhood of the target pixel, calculate a target depth value for the target pixel from the weights, and update the depth value of the target pixel to the target depth value, thereby filtering the pixels of the edge region and obtaining the filtering result of the edge region;
a generation unit 50, configured to obtain the filtered depth image based on the filtering result of the planar region and the filtering result of the edge region.
On the basis of the above embodiments, the extension unit comprises:
an extension subunit, configured to extend each of the four sides (top, bottom, left and right) of the first image by a first number of pixels according to the size of the preset sliding window, wherein the size of the preset sliding window is N*N, N is a positive odd number greater than or equal to 3, the first number = (N-1)/2, and the depth value of each extended pixel equals the depth value of its adjacent pixel;
a first calculation subunit, configured to calculate the size of the second image according to the size of the first image and the size of the sliding window, wherein, if the width of the first image is W1 and its height is H1, the width of the second image is W2 = W1 + N - 1 and the height of the second image is H2 = H1 + N - 1;
an assignment subunit, configured to assign values to the pixels in the second image according to the depth values of the pixels of the first image and the size of the preset sliding window, to obtain the second image after assignment.
On the basis of the above embodiments, the attribute determination unit comprises:
a determination subunit, configured to determine a dynamic threshold matching the target pixel according to reference parameters of the target pixel of the second image, the reference parameters including a reference threshold, a reference percentage and a reference depth value;
a first obtaining subunit, configured to obtain the neighborhood of the target pixel;
a second calculation subunit, configured to calculate the gradient of each pixel in the neighborhood to obtain a gradient set;
a comparison subunit, configured to compare each gradient value in the gradient set with the dynamic threshold; if any gradient value is greater than the dynamic threshold, it is determined that the target pixel lies in an edge region; otherwise, the target pixel lies in a planar region.
On the basis of the above embodiments, the planar region processing unit comprises:
a third calculation subunit, configured to, if the target pixel lies in a planar region, calculate the sum of all pixels in the neighborhood of the target pixel;
a fourth calculation subunit, configured to calculate the mean of all pixels in the neighborhood according to the sum of all pixels;
a first updating subunit, configured to update the depth values of all pixels in the neighborhood to the mean;
a first processing subunit, configured to update the coordinate values of the target pixel and filter the updated target pixel.
The edge region processing unit comprises:
a fifth calculation subunit, configured to, if the target pixel lies in an edge region, calculate the weights of all pixels in the neighborhood of the target pixel;
a normalization subunit, configured to normalize the weights of the pixels to obtain normalized weight values;
a sixth calculation subunit, configured to calculate the target depth value of the target pixel based on the normalized weight values;
a second updating subunit, configured to update the depth value of the target pixel to the target depth value, the depth values of the pixels in the edge region other than the target pixel being their corresponding original depth values, thereby filtering the pixels of the edge region and obtaining the filtering result of the edge region.
On the basis of the above embodiments, the edge region processing unit further comprises:
a rejection subunit, configured to, if the depth value of a pixel in the neighborhood is 0, reject the pixel whose depth value is 0.
The present invention provides a depth image filtering system. The original depth image is extended to obtain a second image, and the attribute of the target pixel in the second image is then determined based on a dynamic threshold. Using a dynamic threshold gives adaptivity and robustness to different depth values. Depth pixel values are classified according to the dynamic threshold, so that different filtering processes can be applied to different attributes: in planar regions the depth values of all pixels in the neighborhood are updated with the mean, while for edge regions the depth value of the target pixel is updated with a recalculated depth value. This solves the problem that depth map filtering cannot simultaneously achieve a pronounced smoothing effect in planar regions and preserve edge information.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts of the embodiments may be referred to each other. Since the device disclosed in the embodiments corresponds to the method disclosed in the embodiments, its description is relatively brief, and the relevant parts may refer to the description of the method.
The above description of the disclosed embodiments enables those skilled in the art to implement or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not intended to be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (10)
1. A depth image filtering method, characterized in that the method comprises:
performing image extension on a first image according to a preset sliding window to obtain a second image, wherein the first image represents an original depth image, and the size of the second image has a preset correspondence with the size of the first image and the size of the preset sliding window;
determining a target pixel in the second image, and performing attribute determination on the target pixel based on a set dynamic threshold to obtain attribute information of the target pixel, the attribute information indicating that the target pixel lies in an edge region or that the target pixel lies in a planar region;
if the target pixel lies in a planar region, calculating the mean of the pixel depth values in the neighborhood of the target pixel and updating all pixel depth values in the neighborhood to the mean, thereby filtering each pixel of the planar region and obtaining the filtering result of the planar region;
if the target pixel lies in an edge region, calculating the weights of all pixels in the neighborhood of the target pixel, calculating a target depth value for the target pixel from the weights, and updating the depth value of the target pixel to the target depth value, thereby filtering the pixels of the edge region and obtaining the filtering result of the edge region;
obtaining the filtered depth image based on the filtering result of the planar region and the filtering result of the edge region.
2. The method according to claim 1, characterized in that performing image extension on the first image according to the preset sliding window to obtain the second image comprises:
extending each of the four sides (top, bottom, left and right) of the first image by a first number of pixels according to the size of the preset sliding window, wherein the size of the preset sliding window is N*N, N is a positive odd number greater than or equal to 3, the first number = (N-1)/2, and the depth value of each extended pixel equals the depth value of its adjacent pixel;
calculating the size of the second image according to the size of the first image and the size of the sliding window, wherein, if the width of the first image is W1 and its height is H1, the width of the second image is W2 = W1 + N - 1 and the height of the second image is H2 = H1 + N - 1;
assigning values to the pixels in the second image according to the depth values of the pixels of the first image and the size of the preset sliding window, to obtain the second image after assignment.
3. The method according to claim 1, characterized in that determining the target pixel in the second image and performing attribute determination on the target pixel based on the set dynamic threshold to obtain the attribute information of the target pixel comprises:
determining a dynamic threshold matching the target pixel according to reference parameters of the target pixel of the second image, the reference parameters including a reference threshold, a reference percentage and a reference depth value;
obtaining the neighborhood of the target pixel;
calculating the gradient of each pixel in the neighborhood to obtain a gradient set;
comparing each gradient value in the gradient set with the dynamic threshold; if any gradient value is greater than the dynamic threshold, determining that the target pixel lies in an edge region; otherwise, the target pixel lies in a planar region.
4. The method according to claim 1, characterized in that, if the target pixel lies in a planar region, calculating the mean of the pixel depth values in the neighborhood of the target pixel and updating all pixel depth values in the neighborhood to the mean comprises:
if the target pixel lies in a planar region, calculating the sum of all pixels in the neighborhood of the target pixel;
calculating the mean of all pixels in the neighborhood according to the sum of all pixels;
updating the depth values of all pixels in the neighborhood to the mean;
updating the coordinate values of the target pixel and filtering the updated target pixel.
5. The method according to claim 1, characterized in that, if the target pixel lies in an edge region, calculating the weights of all pixels in the neighborhood of the target pixel, calculating the target depth value of the target pixel from the weights, and updating the depth value of the target pixel to the target depth value, thereby filtering the pixels of the edge region and obtaining the filtering result of the edge region, comprises:
if the target pixel lies in an edge region, calculating the weights of all pixels in the neighborhood of the target pixel;
normalizing the weights of the pixels to obtain normalized weight values;
calculating the target depth value of the target pixel based on the normalized weight values;
updating the depth value of the target pixel to the target depth value, the depth values of the pixels in the edge region other than the target pixel being their corresponding original depth values, thereby filtering the pixels of the edge region and obtaining the filtering result of the edge region.
6. The method according to claim 5, characterized in that the method further comprises:
if the depth value of a pixel in the neighborhood is 0, rejecting the pixel whose depth value is 0.
7. A depth image filtering system, characterized in that the system comprises:
an extension unit, configured to perform image extension on a first image according to a preset sliding window to obtain a second image, wherein the first image represents an original depth image, and the size of the second image has a preset correspondence with the size of the first image and the size of the preset sliding window;
an attribute determination unit, configured to determine a target pixel in the second image and perform attribute determination on the target pixel based on a set dynamic threshold to obtain attribute information of the target pixel, the attribute information indicating that the target pixel lies in an edge region or that the target pixel lies in a planar region;
a planar region processing unit, configured to, if the target pixel lies in a planar region, calculate the mean of the pixel depth values in the neighborhood of the target pixel and update all pixel depth values in the neighborhood to the mean, thereby filtering each pixel of the planar region and obtaining the filtering result of the planar region;
an edge region processing unit, configured to, if the target pixel lies in an edge region, calculate the weights of all pixels in the neighborhood of the target pixel, calculate a target depth value for the target pixel from the weights, and update the depth value of the target pixel to the target depth value, thereby filtering the pixels of the edge region and obtaining the filtering result of the edge region;
a generation unit, configured to obtain the filtered depth image based on the filtering result of the planar region and the filtering result of the edge region.
8. The system according to claim 7, characterized in that the extension unit comprises:
an extension subunit, configured to extend each of the four sides (top, bottom, left and right) of the first image by a first number of pixels according to the size of the preset sliding window, wherein the size of the preset sliding window is N*N, N is a positive odd number greater than or equal to 3, the first number = (N-1)/2, and the depth value of each extended pixel equals the depth value of its adjacent pixel;
a first calculation subunit, configured to calculate the size of the second image according to the size of the first image and the size of the sliding window, wherein, if the width of the first image is W1 and its height is H1, the width of the second image is W2 = W1 + N - 1 and the height of the second image is H2 = H1 + N - 1;
an assignment subunit, configured to assign values to the pixels in the second image according to the depth values of the pixels of the first image and the size of the preset sliding window, to obtain the second image after assignment.
9. The system according to claim 7, characterized in that the attribute determination unit comprises:
a determination subunit, configured to determine a dynamic threshold matching the target pixel according to reference parameters of the target pixel of the second image, the reference parameters including a reference threshold, a reference percentage and a reference depth value;
a first obtaining subunit, configured to obtain the neighborhood of the target pixel;
a second calculation subunit, configured to calculate the gradient of each pixel in the neighborhood to obtain a gradient set;
a comparison subunit, configured to compare each gradient value in the gradient set with the dynamic threshold; if any gradient value is greater than the dynamic threshold, it is determined that the target pixel lies in an edge region; otherwise, the target pixel lies in a planar region.
10. The system according to claim 7, characterized in that the planar region processing unit comprises:
a third calculation subunit, configured to, if the target pixel lies in a planar region, calculate the sum of all pixels in the neighborhood of the target pixel;
a fourth calculation subunit, configured to calculate the mean of all pixels in the neighborhood according to the sum of all pixels;
a first updating subunit, configured to update the depth values of all pixels in the neighborhood to the mean;
a first processing subunit, configured to update the coordinate values of the target pixel and filter the updated target pixel;
and the edge region processing unit comprises:
a fifth calculation subunit, configured to, if the target pixel lies in an edge region, calculate the weights of all pixels in the neighborhood of the target pixel;
a normalization subunit, configured to normalize the weights of the pixels to obtain normalized weight values;
a sixth calculation subunit, configured to calculate the target depth value of the target pixel based on the normalized weight values;
a second updating subunit, configured to update the depth value of the target pixel to the target depth value, the depth values of the pixels in the edge region other than the target pixel being their corresponding original depth values, thereby filtering the pixels of the edge region and obtaining the filtering result of the edge region.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910789404.6A CN110490829B (en) | 2019-08-26 | 2019-08-26 | Depth image filtering method and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910789404.6A CN110490829B (en) | 2019-08-26 | 2019-08-26 | Depth image filtering method and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110490829A true CN110490829A (en) | 2019-11-22 |
CN110490829B CN110490829B (en) | 2022-03-15 |
Family
ID=68553945
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910789404.6A Active CN110490829B (en) | 2019-08-26 | 2019-08-26 | Depth image filtering method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110490829B (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120002863A1 (en) * | 2010-07-02 | 2012-01-05 | Samsung Electronics Co., Ltd. | Depth image encoding apparatus and depth image decoding apparatus using loop-filter, method and medium |
CN103581633A (en) * | 2012-07-24 | 2014-02-12 | 索尼公司 | Image processing device, image processing method, program, and imaging apparatus |
CN103854257A (en) * | 2012-12-07 | 2014-06-11 | 山东财经大学 | Depth image enhancement method based on self-adaptation trilateral filtering |
CN104683783A (en) * | 2015-01-08 | 2015-06-03 | 电子科技大学 | An Adaptive Depth Image Filtering Method |
Non-Patent Citations (4)
Title |
---|
XU FEI et al.: "MATLAB Applied Image Processing", 31 May 2002 *
ZHAO WEIZHOU et al.: "A high-density impulse noise filtering algorithm preserving edge information and specific targets", Journal of Northwest Normal University (Natural Science Edition) *
ZOU XINGXING et al.: "Kinect depth image denoising algorithm based on edge detection", Journal of Hunan University of Technology *
CHEN XIAOHONG et al.: "High-definition video noise reduction algorithm based on joint spatio-temporal filtering", Journal of Zhejiang University (Engineering Science) *
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111323756A (en) * | 2019-12-30 | 2020-06-23 | 北京海兰信数据科技股份有限公司 | Deep learning-based marine radar target detection method and device |
CN111415310A (en) * | 2020-03-26 | 2020-07-14 | Oppo广东移动通信有限公司 | Image processing method and device and storage medium |
CN111415310B (en) * | 2020-03-26 | 2023-06-30 | Oppo广东移动通信有限公司 | Image processing method and device and storage medium |
CN111986124A (en) * | 2020-09-07 | 2020-11-24 | 北京凌云光技术集团有限责任公司 | Filling method and device for missing pixels of depth image |
CN111986124B (en) * | 2020-09-07 | 2024-05-28 | 凌云光技术股份有限公司 | Filling method and device for missing pixels of depth image |
CN113570530A (en) * | 2021-06-10 | 2021-10-29 | 北京旷视科技有限公司 | Image fusion method, apparatus, computer-readable storage medium and electronic device |
CN113570530B (en) * | 2021-06-10 | 2024-04-16 | 北京旷视科技有限公司 | Image fusion method, device, computer-readable storage medium, and electronic device |
CN114066779A (en) * | 2022-01-13 | 2022-02-18 | 杭州蓝芯科技有限公司 | Depth map filtering method and device, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN110490829B (en) | 2022-03-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110490829A (en) | A kind of filtering method and system of depth image | |
CN109409437B (en) | Point cloud segmentation method and device, computer readable storage medium and terminal | |
CN108876749A (en) | A kind of lens distortion calibration method of robust | |
CN109741356A (en) | Subpixel edge detection method and system | |
CN106991693B (en) | Binocular Stereo Matching Method Based on Fuzzy Support Weight | |
CN108399611A (en) | Multi-focus image fusing method based on gradient regularisation | |
CN108369737B (en) | Using heuristic graph search to segment layered images quickly and automatically | |
JP2014096062A (en) | Image processing method and image processing apparatus | |
NL2016660B1 (en) | Image stitching method and device. | |
CN107798702A (en) | A kind of realtime graphic stacking method and device for augmented reality | |
JP5969460B2 (en) | Nail region detection method, program, storage medium, and nail region detection device | |
CN107346041A (en) | The determination method, apparatus and electronic equipment of the grating parameter of bore hole 3D display equipment | |
CN115439615B (en) | Distributed integrated management system based on three-dimensional BIM | |
CN109816749A (en) | Method, device, computer equipment and storage medium for filling in dotted map symbols | |
Kim et al. | Adaptive weighted sum method for multiobjective optimization | |
CN104899592B (en) | A kind of road semiautomatic extraction method and system based on circular shuttering | |
JP2006513483A (en) | How to segment a 3D structure | |
CN114332291A (en) | Oblique photography model building outer contour rule extraction method | |
CN109327712A (en) | Video de-shake method for fixed scenes | |
CN106296604B (en) | A kind of image repair method and device | |
CN108983233B (en) | PS point combination selection method in GB-InSAR data processing | |
CN106908747B (en) | Chemical shift coding imaging method and device | |
CN106157301B (en) | A kind of certainly determining method and device of the threshold value for Image Edge-Detection | |
CN109584166A (en) | Disparity map denseization method, apparatus and computer readable storage medium | |
CN108961292A (en) | A kind of method and apparatus detecting MSP in brain medical image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |