
CN102737245B - Three-dimensional scene object boundary detection method and device - Google Patents


Info

Publication number
CN102737245B
Authority
CN
China
Prior art keywords: differential, pixel, brt, clr, color
Prior art date
Legal status: Expired - Fee Related
Application number
CN201210185752.0A
Other languages
Chinese (zh)
Other versions
CN102737245A (en)
Inventor
戴琼海
林靖宇
曹汛
王竞瑶
Current Assignee
BEIJING TSING HUA VISION TECHNOLOGY Co Ltd
Tsinghua University
Original Assignee
BEIJING TSING HUA VISION TECHNOLOGY Co Ltd
Tsinghua University
Priority date
Filing date
Publication date
Application filed by BEIJING TSING HUA VISION TECHNOLOGY Co Ltd and Tsinghua University
Priority to CN201210185752.0A
Publication of CN102737245A
Application granted
Publication of CN102737245B

Landscapes

  • Image Analysis (AREA)

Abstract


The present invention proposes a method for detecting object boundaries in a three-dimensional scene, comprising: computing, for each pixel of a three-dimensional image, the differentials of its brightness feature, color feature, brightness texture feature, and color texture feature in multiple directions; combining, for each pixel and each direction, the differential of the brightness feature with the differential of the brightness texture feature to generate a first differential, and combining the differential of the color feature with the differential of the color texture feature to generate a second differential; combining each pixel's first differentials over the multiple directions to generate a third differential, and combining its second differentials over the multiple directions to generate a fourth differential; and computing, from the third and fourth differentials, the probability that each pixel is a boundary point, thereby obtaining an object boundary map. The invention also proposes a device for detecting object boundaries in a three-dimensional scene. The invention can quickly detect object boundaries in a three-dimensional scene and significantly reduces the influence of shadows and surface texture on boundary recognition.

Description

Three-dimensional scene object boundary detection method and device
Technical field
The present invention relates to the field of computer vision, and in particular to a method and device for detecting object boundaries in three-dimensional scenes.
Background technology
Boundary detection plays an important role in computer vision. The early Canny algorithm filters a grayscale image with differential filters and can find edges where the brightness jumps. Later algorithms detect object edges from color changes, or from changes in brightness and color combined. In a real three-dimensional scene, however, relying on jumps in color or brightness to judge object boundaries is unreliable. On the one hand, surface texture is easily detected as an edge (for example, the stripes of a zebra); on the other hand, gradual brightness changes and shadows produced by illumination are also detected as edges. In such cases it is difficult for the above algorithms to estimate the correct object boundary.
Summary of the invention
The object of the present invention is to address at least one of the above technical deficiencies.
To this end, a first object of the present invention is to propose a three-dimensional scene object boundary detection method that can quickly and effectively detect object boundaries in a three-dimensional scene, significantly reduces the influence of shadows and surface texture on boundary recognition, and improves the accuracy of boundary detection.
To achieve the above object, an embodiment of the first aspect of the present invention proposes a three-dimensional scene object boundary detection method comprising the steps of: computing the differentials, in multiple directions, of one or more features of each pixel of a three-dimensional image, wherein the one or more features include a brightness feature, a color feature, a brightness texture feature, and a color texture feature; combining, for each pixel and each direction, the differential of the brightness feature with the differential of the brightness texture feature to generate a first differential, and combining the differential of the color feature with the differential of the color texture feature to generate a second differential; combining each pixel's first differentials over the multiple directions to generate a third differential, and combining its second differentials over the multiple directions to generate a fourth differential; and computing, from the third and fourth differentials of each pixel, the probability that the pixel is a boundary point, thereby obtaining an object boundary map.
According to the three-dimensional scene object boundary detection method of the embodiment of the present invention, object boundaries in a three-dimensional scene can be detected quickly and effectively, the influence of shadows and surface texture on boundary recognition is significantly reduced, and the accuracy of boundary detection is improved.
In one embodiment of the invention, let P be the current pixel. A 2W × W window centered on pixel P is chosen along direction θ and divided laterally into four W/2 × W subwindows; the brightness histogram H_i^brt and color histogram H_i^clr of each subwindow are then computed, where i is the subwindow index.
In one embodiment of the invention, the differentials of the brightness feature, the color feature, the brightness texture feature, and the color texture feature are computed from the brightness histograms H_i^brt and color histograms H_i^clr of the subwindows, wherein:
the differential of the brightness feature of pixel P in direction θ is: G_int^brt(P, θ) = f[d(H_2^brt, H_3^brt)];
the differential of the color feature of pixel P in direction θ is: G_int^clr(P, θ) = f[d(H_2^clr, H_3^clr)];
the differential of the brightness texture feature of pixel P in direction θ is: G_txt^brt(P, θ) = f[d(H_1^brt, H_3^brt) + d(H_2^brt, H_4^brt)];
the differential of the color texture feature of pixel P in direction θ is: G_txt^clr(P, θ) = f[d(H_1^clr, H_3^clr) + d(H_2^clr, H_4^clr)].
In one embodiment of the invention, the first differential is: G_cmb^brt(P, θ) = min[G_int^brt, G_txt^brt], and the second differential is: G_cmb^clr(P, θ) = min[G_int^clr, G_txt^clr].
In one embodiment of the invention, the third differential is: G^brt(P) = max_θ[G_cmb^brt(P, θ)], and the fourth differential is: G^clr(P) = max_θ[G_cmb^clr(P, θ)].
In one embodiment of the invention, the probability that each pixel is a boundary point is computed from its third and fourth differentials as Pb(P) = α·G^brt(P) + (1 − α)·G^clr(P), where Pb(P) is the probability that pixel P is a boundary point and α is the weight of brightness as an edge decision factor.
An embodiment of a second aspect of the present invention proposes a three-dimensional scene object boundary detection device comprising: a differential calculation module for computing the differentials, in multiple directions, of one or more features of each pixel of a three-dimensional image, wherein the one or more features include a brightness feature, a color feature, a brightness texture feature, and a color texture feature; a differential merging module for combining the differentials of multiple features of a pixel in the same direction, and for combining the differentials of a pixel over multiple directions; and a probability calculation module for computing the probability that each pixel is a boundary point.
According to the three-dimensional scene object boundary detection device of the embodiment of the present invention, object boundaries in a three-dimensional scene can be detected quickly and effectively, the influence of shadows and surface texture on boundary recognition is significantly reduced, and the accuracy of boundary detection is improved.
In one embodiment of the invention, let P be the current pixel. The differential calculation module chooses a 2W × W window centered on pixel P along direction θ, divides the window laterally into four W/2 × W subwindows, and then computes the brightness histogram H_i^brt and color histogram H_i^clr of each subwindow, where i is the subwindow index.
In one embodiment of the invention, the differential calculation module computes the differentials of the brightness feature, the color feature, the brightness texture feature, and the color texture feature from the brightness histograms H_i^brt and color histograms H_i^clr of the subwindows, wherein the differential of the brightness feature of pixel P in direction θ is: G_int^brt(P, θ) = f[d(H_2^brt, H_3^brt)]; the differential of the color feature is: G_int^clr(P, θ) = f[d(H_2^clr, H_3^clr)]; the differential of the brightness texture feature is: G_txt^brt(P, θ) = f[d(H_1^brt, H_3^brt) + d(H_2^brt, H_4^brt)]; and the differential of the color texture feature is: G_txt^clr(P, θ) = f[d(H_1^clr, H_3^clr) + d(H_2^clr, H_4^clr)].
In one embodiment of the invention, the differential merging module combines the differential of the brightness feature with the differential of the brightness texture feature in the same direction to generate the first differential, computed as G_cmb^brt(P, θ) = min[G_int^brt, G_txt^brt], and combines the differential of the color feature with the differential of the color texture feature in the same direction to generate the second differential, computed as G_cmb^clr(P, θ) = min[G_int^clr, G_txt^clr].
In one embodiment of the invention, the differential merging module combines the first differentials over the multiple directions to generate the third differential, computed as G^brt(P) = max_θ[G_cmb^brt(P, θ)], and combines the second differentials over the multiple directions to generate the fourth differential, computed as G^clr(P) = max_θ[G_cmb^clr(P, θ)].
In one embodiment of the invention, the probability calculation module computes the probability that each pixel is a boundary point from its third and fourth differentials as Pb(P) = α·G^brt(P) + (1 − α)·G^clr(P), where Pb(P) is the probability that pixel P is a boundary point and α is the weight of brightness as an edge decision factor.
Additional aspects and advantages of the present invention are set forth in part in the following description; they will in part become apparent from the description, or may be learned through practice of the invention.
Brief description of the drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily understood from the following description of the embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a flowchart of the three-dimensional scene object boundary detection method of an embodiment of the present invention;
Fig. 2 is a schematic diagram of the window division of an embodiment of the present invention;
Fig. 3 is a schematic diagram of the 3D image object boundary detection process of an embodiment of the present invention; and
Fig. 4 is a schematic diagram of the three-dimensional scene object boundary detection device of an embodiment of the present invention.
Embodiment
Embodiments of the invention are described in detail below; examples of the embodiments are shown in the drawings, in which the same or similar reference numerals throughout denote the same or similar elements or elements with the same or similar functions. The embodiments described below with reference to the drawings are exemplary, serve only to explain the present invention, and are not to be construed as limiting the present invention.
The disclosure below provides many different embodiments or examples for realizing different structures of the present invention. To simplify the disclosure, components and arrangements of specific examples are described below. They are, of course, merely examples and are not intended to limit the present invention. Furthermore, the present invention may repeat reference numerals and/or letters in different examples; this repetition is for simplicity and clarity and does not in itself indicate a relationship between the various embodiments and/or arrangements discussed. In addition, the present invention provides examples of various specific processes and materials, but a person of ordinary skill in the art will recognize the applicability of other processes and/or the use of other materials. Moreover, a structure described below in which a first feature is "on" a second feature may include embodiments in which the first and second features are formed in direct contact, and may also include embodiments in which another feature is formed between them, so that the first and second features may not be in direct contact.
In the description of the present invention, it should be noted that, unless otherwise specified and limited, the terms "mounted", "linked", and "connected" are to be interpreted broadly; for example, a connection may be mechanical or electrical, may be internal to two elements, and may be direct or indirect through an intermediary. A person of ordinary skill in the art can understand the specific meaning of the above terms according to the circumstances.
These and other aspects of embodiments of the present invention will become apparent with reference to the following description and drawings. The description and drawings specifically disclose some particular implementations of embodiments of the invention to indicate some ways of implementing the principles of the embodiments, but it should be understood that the scope of the embodiments is not limited thereby. On the contrary, the embodiments of the present invention include all changes, modifications, and equivalents falling within the spirit and scope of the appended claims.
As shown in Fig. 1, the three-dimensional scene object boundary detection method according to an embodiment of the first aspect of the present invention comprises the following steps:
S101: Compute the differentials, in multiple directions, of one or more features of each pixel of the three-dimensional image, wherein the one or more features include the brightness feature, the color feature, the brightness texture feature, and the color texture feature.
As shown in Fig. 2, let P be the current pixel. A 2W × W window centered on pixel P is chosen along direction θ and divided laterally into four W/2 × W subwindows, numbered 1, 2, 3, 4; the brightness histogram H_i^brt and color histogram H_i^clr of each subwindow are then computed, where i is the subwindow index. In Fig. 2, θ is taken as 90°.
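As a hedged illustration only, the subwindow histograms above might be computed as in the following NumPy sketch. It handles only the axis-aligned case θ = 90° shown in Fig. 2, and the bin count and normalization are assumptions not fixed by the patent text:

```python
import numpy as np

def subwindow_histograms(img, p, W, bins=16):
    """Brightness histograms H_1..H_4 of the four W/2 x W subwindows of the
    2W x W window centered on pixel p, for the theta = 90 deg case: the
    window spans W rows above and below p and W columns around p, and is
    cut into four horizontal strips of height W/2 (numbered top to bottom)."""
    r, c = p
    hists = []
    for i in range(4):
        top = r - W + i * (W // 2)
        strip = img[top:top + W // 2, c - W // 2:c + W // 2]
        h, _ = np.histogram(strip, bins=bins, range=(0, 256))
        hists.append(h / max(h.sum(), 1))  # normalize each histogram to sum 1
    return hists

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64))      # synthetic brightness image
H = subwindow_histograms(img, (32, 32), W=8)
print(len(H), round(H[0].sum(), 6))
```

The color histograms H_i^clr would be built the same way over a color representation of the pixels; how color is quantized into bins is not specified in the text above.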
The differentials of the brightness feature and the color feature of pixel P in direction θ are, respectively:
G_int^brt(P, θ) = f[d(H_2^brt, H_3^brt)], G_int^clr(P, θ) = f[d(H_2^clr, H_3^clr)],
where f and d are the functions f(x) = 1 − exp(−Cx), with C a constant, and d(g, h) = Σ_n [g(n) − h(n)]²; G_int^brt is the brightness feature differential and G_int^clr is the color feature differential.
The differentials of the brightness texture feature and the color texture feature of pixel P in direction θ are, respectively:
G_txt^brt(P, θ) = f[d(H_1^brt, H_3^brt) + d(H_2^brt, H_4^brt)], G_txt^clr(P, θ) = f[d(H_1^clr, H_3^clr) + d(H_2^clr, H_4^clr)],
where G_txt^brt is the brightness texture feature differential and G_txt^clr is the color texture feature differential.
In particular, from f(x) = 1 − exp(−Cx), with C a constant, and d(g, h) = Σ_n [g(n) − h(n)]², it follows that G_txt^clr(P, θ) = 1 − exp(−C{Σ_n [H_1^clr(n) − H_3^clr(n)]² + Σ_n [H_2^clr(n) − H_4^clr(n)]²}). The differentials of the other features follow analogously and are not repeated here.
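The functions f and d and the two per-direction differentials of one feature can be sketched as follows (a minimal illustration assuming the histograms are already available as NumPy arrays; the value of the constant C is not fixed by the patent and is chosen arbitrarily here):

```python
import numpy as np

C = 5.0  # constant of the method; its actual value is not fixed by the patent

def f(x):
    """f(x) = 1 - exp(-C*x): maps a non-negative distance into [0, 1)."""
    return 1.0 - np.exp(-C * x)

def d(g, h):
    """d(g, h) = sum_n [g(n) - h(n)]^2: squared distance between histograms."""
    return float(np.sum((np.asarray(g) - np.asarray(h)) ** 2))

def direction_differentials(H):
    """Given the four subwindow histograms H = [H_1, H_2, H_3, H_4] of one
    feature (brightness or color) in direction theta, return the feature
    differential G_int and the texture differential G_txt."""
    H1, H2, H3, H4 = H
    G_int = f(d(H2, H3))               # jump across the center of the window
    G_txt = f(d(H1, H3) + d(H2, H4))   # change between subwindows one stride apart
    return G_int, G_txt

# Four identical subwindows: no jump and no texture change, so both vanish.
H_flat = [np.array([0.5, 0.5])] * 4
print(direction_differentials(H_flat))
```

Taking the minimum of G_int and G_txt later (step S102) suppresses responses where only one of the two cues fires, which is how repeated surface texture avoids being reported as a boundary.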
S102: Combine, for each pixel and each direction, the differential of the brightness feature with the differential of the brightness texture feature to generate the first differential, and combine the differential of the color feature with the differential of the color texture feature to generate the second differential.
The first differential is computed as: G_cmb^brt(P, θ) = min[G_int^brt, G_txt^brt].
The second differential is computed as: G_cmb^clr(P, θ) = min[G_int^clr, G_txt^clr].
S103: Combine each pixel's first differentials over the multiple directions to generate the third differential, and its second differentials over the multiple directions to generate the fourth differential.
The third differential is computed as: G^brt(P) = max_θ[G_cmb^brt(P, θ)].
The fourth differential is computed as: G^clr(P) = max_θ[G_cmb^clr(P, θ)].
S104: Compute, from the third and fourth differentials of each pixel, the probability that the pixel is a boundary point, and obtain the object boundary map.
Specifically, the probability that a pixel is a boundary point is: Pb(P) = α·G^brt(P) + (1 − α)·G^clr(P),
where Pb(P) is the probability that pixel P is a boundary point and α is the weight of brightness as an edge decision factor.
Converting Pb(P) to an integer gray value in the range 0–255 yields the object boundary map. A probability threshold can also be set; combining all pixels whose boundary probability exceeds the threshold yields a binarized object boundary map.
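Steps S102–S104 reduce to elementwise minima, a maximum over directions, and a weighted sum. A hedged sketch, assuming the per-direction differentials have already been computed into arrays of shape height × width × number-of-directions:

```python
import numpy as np

def boundary_probability(G_int_brt, G_txt_brt, G_int_clr, G_txt_clr, alpha=0.5):
    """S102-S104 as array operations. Each input has shape (H, W, n_theta):
    one differential per pixel and direction. Returns Pb of shape (H, W)."""
    G_cmb_brt = np.minimum(G_int_brt, G_txt_brt)  # S102: first differential
    G_cmb_clr = np.minimum(G_int_clr, G_txt_clr)  # S102: second differential
    G_brt = G_cmb_brt.max(axis=-1)                # S103: third differential
    G_clr = G_cmb_clr.max(axis=-1)                # S103: fourth differential
    return alpha * G_brt + (1.0 - alpha) * G_clr  # S104: boundary probability

rng = np.random.default_rng(1)
shape = (4, 4, 8)                                 # 4x4 image, 8 directions
Pb = boundary_probability(*(rng.random(shape) for _ in range(4)))
gray = np.round(Pb * 255).astype(np.uint8)        # 0-255 object boundary map
binary = Pb > 0.5                                 # thresholded, binarized map
print(Pb.shape, gray.dtype)
```

The number of directions and the threshold value are free choices here; the patent fixes neither, only the min/max/weighted-sum structure.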
According to the three-dimensional scenic object boundary detection method of the embodiment of the present invention, can detect fast and effectively the border of object in three-dimensional scenic, the impact of the texture that significantly reduces shade and body surface on border recognition effect, has improved the precision of Boundary Detection.
In one embodiment of the invention, the detection process for three-dimensional scene object boundaries is shown in Fig. 3. Fig. 3(a) is the three-dimensional image to be detected. The differentials in multiple directions of the brightness feature, color feature, brightness texture feature, and color texture feature of each pixel of the image are computed; for each direction, the differential of the brightness feature is combined with the differential of the brightness texture feature to generate the first differential, and the differential of the color feature is combined with the differential of the color texture feature to generate the second differential. Then the first differentials over the multiple directions are combined to generate the third differential, which gives the brightness boundary map of Fig. 3(b); the second differentials over the multiple directions are combined to generate the fourth differential, which gives the color boundary map of Fig. 3(c). Finally, the probability that each pixel is a boundary point is computed from its third and fourth differentials as Pb(P) = α·G^brt(P) + (1 − α)·G^clr(P), where G^brt(P) is the third differential, G^clr(P) is the fourth differential, Pb(P) is the probability that pixel P is a boundary point, and α is the weight of brightness as an edge decision factor; in this embodiment α = 0.5. This yields the final object boundary map of Fig. 3(d).
As shown in Fig. 4, the three-dimensional scene object boundary detection device according to an embodiment of the second aspect of the present invention comprises: a differential calculation module 410, a differential merging module 420, and a probability calculation module 430. The differential calculation module 410 computes the differentials, in multiple directions, of one or more features of each pixel of the three-dimensional image, wherein the one or more features include the brightness feature, the color feature, the brightness texture feature, and the color texture feature. The differential merging module 420 combines the differentials of multiple features of a pixel in the same direction, and combines the differentials of a pixel over multiple directions. The probability calculation module 430 computes the probability that each pixel is a boundary point.
Specifically, let P be the current pixel. The differential calculation module 410 chooses a 2W × W window centered on pixel P along direction θ, divides the window laterally into four W/2 × W subwindows, and then computes the brightness histogram H_i^brt and color histogram H_i^clr of each subwindow, where i is the subwindow index. From these histograms, the differential calculation module 410 computes the differentials of the brightness feature, the color feature, the brightness texture feature, and the color texture feature, wherein:
The differential of the brightness feature of pixel P in direction θ is: G_int^brt(P, θ) = f[d(H_2^brt, H_3^brt)];
The differential of the color feature of pixel P in direction θ is: G_int^clr(P, θ) = f[d(H_2^clr, H_3^clr)];
The differential of the brightness texture feature of pixel P in direction θ is: G_txt^brt(P, θ) = f[d(H_1^brt, H_3^brt) + d(H_2^brt, H_4^brt)];
The differential of the color texture feature of pixel P in direction θ is: G_txt^clr(P, θ) = f[d(H_1^clr, H_3^clr) + d(H_2^clr, H_4^clr)].
The differential merging module 420 combines the differential of the brightness feature with the differential of the brightness texture feature in the same direction to generate the first differential, computed as G_cmb^brt(P, θ) = min[G_int^brt, G_txt^brt], and combines the differential of the color feature with the differential of the color texture feature in the same direction to generate the second differential, computed as G_cmb^clr(P, θ) = min[G_int^clr, G_txt^clr]. The differential merging module 420 then combines the first differentials over the multiple directions to generate the third differential, computed as G^brt(P) = max_θ[G_cmb^brt(P, θ)], and combines the second differentials over the multiple directions to generate the fourth differential, computed as G^clr(P) = max_θ[G_cmb^clr(P, θ)].
The probability calculation module 430 computes the probability that each pixel is a boundary point from its third and fourth differentials as:
Pb(P) = α·G^brt(P) + (1 − α)·G^clr(P),
where Pb(P) is the probability that pixel P is a boundary point and α is the weight of brightness as an edge decision factor.
Converting Pb(P) to an integer gray value in the range 0–255 yields the object boundary map. A probability threshold can also be set; combining all pixels whose boundary probability exceeds the threshold yields a binarized object boundary map.
According to the three-dimensional scene object boundary detection device of the embodiment of the present invention, object boundaries in a three-dimensional scene can be detected quickly and effectively, the influence of shadows and surface texture on boundary recognition is significantly reduced, and the accuracy of boundary detection is improved.
Any process or method described in the flowchart or otherwise described herein may be understood as representing a module, segment, or portion of code comprising one or more executable instructions for implementing the steps of a specific logical function or process; and the scope of the preferred embodiments of the present invention includes other implementations in which functions may be performed out of the order shown or discussed, including substantially concurrently or in reverse order depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present invention belong.
The logic and/or steps represented in the flowchart or otherwise described herein may, for example, be considered an ordered list of executable instructions for implementing logical functions, and may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from an instruction execution system, apparatus, or device). For the purposes of this specification, a "computer-readable medium" may be anything that can contain, store, communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of computer-readable media include: an electrical connection with one or more wires (an electronic device), a portable computer diskette (a magnetic device), random-access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). The computer-readable medium may even be paper or another suitable medium on which the program can be printed, because the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, and then stored in computer memory.
It should be understood that the parts of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented by any of the following technologies known in the art, or a combination thereof: discrete logic circuits with logic gates for implementing logic functions on data signals, application-specific integrated circuits with suitable combinational logic gates, programmable gate arrays (PGA), field-programmable gate arrays (FPGA), and so on.
Those skilled in the art will appreciate that all or part of the steps carried by the above method embodiments can be completed by program instructions executed on relevant hardware; the program can be stored in a computer-readable storage medium and, when executed, performs one of the steps of the method embodiments or a combination thereof.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing module, may exist physically as separate units, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.
In the description of this specification, reference to the terms "an embodiment", "some embodiments", "an example", "a specific example", or "some examples" means that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic uses of the above terms do not necessarily refer to the same embodiment or example, and the specific features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples.
Although embodiments of the present invention have been shown and described, those of ordinary skill in the art will appreciate that various changes, modifications, substitutions, and variations can be made to these embodiments without departing from the principles and spirit of the present invention; the scope of the present invention is defined by the claims and their equivalents.

Claims (2)

1. A three-dimensional scene object boundary detection method, characterized by comprising the steps of:
computing the differentials, in multiple directions, of one or more features of each pixel of a three-dimensional image, wherein the one or more features include a brightness feature, a color feature, a brightness texture feature, and a color texture feature, specifically comprising:
letting P be the current pixel, choosing a 2W × W window centered on pixel P along direction θ, dividing the window laterally into four W/2 × W subwindows, and computing the brightness histogram H_i^brt and color histogram H_i^clr of each subwindow, where i is the subwindow index; and
computing, from the brightness histograms H_i^brt and color histograms H_i^clr of the subwindows, the differential of the brightness feature, the differential of the color feature, the differential of the brightness texture feature, and the differential of the color texture feature, wherein
the differential of the brightness feature of pixel P in direction θ is: G_int^brt(P, θ) = f[d(H_2^brt, H_3^brt)],
the differential of the color feature of pixel P in direction θ is: G_int^clr(P, θ) = f[d(H_2^clr, H_3^clr)],
the differential of the brightness texture feature of pixel P in direction θ is: G_txt^brt(P, θ) = f[d(H_1^brt, H_3^brt) + d(H_2^brt, H_4^brt)], and
the differential of the color texture feature of pixel P in direction θ is: G_txt^clr(P, θ) = f[d(H_1^clr, H_3^clr) + d(H_2^clr, H_4^clr)];
The differential of the differential of the described brightness on the equidirectional of described each pixel and described luminance texture feature is merged to generate the first differential, and the differential of the differential of the described color characteristic on equidirectional and described color and vein feature is merged to generate the second differential, wherein
Described the first differential is: G cmb brt ( P , θ ) = min [ G int brt , G txt brt ] ,
Described the second differential is: G cmb clr ( P , θ ) = min [ G int clr , G txt clr ] ;
Multiple the first differential in the described multiple directions of described each pixel are merged to generate the 3rd differential, and multiple the second differential in described multiple directions are merged to generate the 4th differential, wherein,
Described the 3rd differential is:
Figure FDA00004370403600000111
Described the 4th differential is:
Figure FDA00004370403600000112
and
The probability that is frontier point according to each pixel described in described the 3rd differential of described each pixel and described the 4th differential calculation, obtains object boundary figure, and wherein, the probability that described each pixel is frontier point is:
Pb(P)=αG brt(P)+(1-α)G clr(P),
Wherein, Pb (P) for pixel P be the probability of frontier point, α is that brightness is as the shared weight of edge determination factor.
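The method of claim 1 can be sketched in Python for a single direction (θ = 0, i.e. a horizontal sweep). The claim leaves the histogram distance d and the mapping f unspecified, and the formula merging the directional differentials is not recoverable from this excerpt; the sketch below therefore assumes a chi-squared distance for d, the identity for f, and single-direction operation, so the third and fourth differentials reduce to the first and second. These are illustrative assumptions, not the claimed implementation.

```python
import numpy as np

def sub_histogram(patch, bins=8):
    # Normalized histogram of one W/2 x W sub-window (pixel values in [0, 1]).
    h, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    return h / max(h.sum(), 1)

def d(h1, h2):
    # Assumed histogram distance d: chi-squared.
    denom = h1 + h2
    m = denom > 0
    return 0.5 * float(np.sum((h1[m] - h2[m]) ** 2 / denom[m]))

def boundary_probability(brt, clr, x, y, W=8, alpha=0.5):
    """Pb(P) at pixel (x, y), horizontal direction only (theta = 0)."""
    # 2W x W window centered on P, split laterally into four W/2 x W sub-windows.
    rows = slice(y - W // 2, y + W // 2)
    Hb = [sub_histogram(brt[rows, x - W + i * (W // 2): x - W + (i + 1) * (W // 2)])
          for i in range(4)]
    Hc = [sub_histogram(clr[rows, x - W + i * (W // 2): x - W + (i + 1) * (W // 2)])
          for i in range(4)]
    # Brightness / color feature differentials: f[d(H2, H3)], with f = identity.
    g_int_brt, g_int_clr = d(Hb[1], Hb[2]), d(Hc[1], Hc[2])
    # Brightness / color texture differentials: f[d(H1, H3) + d(H2, H4)].
    g_txt_brt = d(Hb[0], Hb[2]) + d(Hb[1], Hb[3])
    g_txt_clr = d(Hc[0], Hc[2]) + d(Hc[1], Hc[3])
    # First / second differentials: min of feature and texture differentials.
    g_brt = min(g_int_brt, g_txt_brt)
    g_clr = min(g_int_clr, g_txt_clr)
    # Pb(P) = alpha * G^brt(P) + (1 - alpha) * G^clr(P)
    return alpha * g_brt + (1 - alpha) * g_clr
```

On a synthetic vertical step edge the probability peaks at the edge column and is zero in flat regions; the min-combination lets a strong intensity step be vetoed when the texture evidence disagrees, which is what suppresses shadow edges.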
2. A three-dimensional scene object boundary detection apparatus, comprising:
a differential calculation module for computing, in a plurality of directions, the differential of one or more features of each pixel of a three-dimensional image, wherein the one or more features comprise a brightness feature, a color feature, a brightness texture feature, and a color texture feature, specifically comprising:
for a current pixel P, the differential calculation module takes a 2W × W window centered on P along a direction θ, divides the window laterally into four W/2 × W sub-windows, and computes for each sub-window a brightness histogram H_i^brt and a color histogram H_i^clr, where i is the index of the sub-window; the differential calculation module computes, from the brightness histograms H_i^brt and color histograms H_i^clr of the sub-windows, the differential of the brightness feature, the differential of the color feature, the differential of the brightness texture feature, and the differential of the color texture feature, respectively, wherein
the differential of the brightness feature of pixel P in direction θ is:
G_int^brt(P, θ) = f[d(H_2^brt, H_3^brt)],
the differential of the color feature of pixel P in direction θ is:
G_int^clr(P, θ) = f[d(H_2^clr, H_3^clr)],
the differential of the brightness texture feature of pixel P in direction θ is:
G_txt^brt(P, θ) = f[d(H_1^brt, H_3^brt) + d(H_2^brt, H_4^brt)],
the differential of the color texture feature of pixel P in direction θ is:
G_txt^clr(P, θ) = f[d(H_1^clr, H_3^clr) + d(H_2^clr, H_4^clr)];
a differential merging module for merging the differentials of the plurality of features of a pixel in the same direction, and for merging the differentials of the pixel over the plurality of directions, specifically comprising:
the differential merging module merges the differential of the brightness feature and the differential of the brightness texture feature in the same direction to generate a first differential, computed as G_cmb^brt(P, θ) = min[G_int^brt, G_txt^brt],
and merges the differential of the color feature and the differential of the color texture feature in the same direction to generate a second differential, computed as G_cmb^clr(P, θ) = min[G_int^clr, G_txt^clr]; and
a probability calculation module for computing the probability that each pixel is a boundary point, specifically comprising:
the differential merging module merges the plurality of first differentials of each pixel over the plurality of directions to generate a third differential G^brt(P), and merges the plurality of second differentials over the plurality of directions to generate a fourth differential G^clr(P);
the probability calculation module computes, from the third differential and the fourth differential of each pixel, the probability that the pixel is a boundary point as:
Pb(P) = αG^brt(P) + (1 − α)G^clr(P),
where Pb(P) is the probability that pixel P is a boundary point, and α is the weight given to the brightness feature as a boundary determination factor.
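The three modules of claim 2 map naturally onto a small class. The sketch below mirrors that decomposition for a single direction (θ = 0); as with the method, the histogram distance d (chi-squared here), the mapping f (identity here), and the single-direction simplification are illustrative assumptions, not the claimed implementation.

```python
import numpy as np

class BoundaryDetector:
    """Apparatus sketch: differential calculation, differential merging,
    and probability calculation modules (theta = 0 only)."""

    def __init__(self, W=8, bins=8, alpha=0.5):
        self.W, self.bins, self.alpha = W, bins, alpha

    def _hist(self, patch):
        # Normalized histogram of one W/2 x W sub-window.
        h, _ = np.histogram(patch, bins=self.bins, range=(0.0, 1.0))
        return h / max(h.sum(), 1)

    def _d(self, h1, h2):
        # Assumed histogram distance d: chi-squared.
        denom = h1 + h2
        m = denom > 0
        return 0.5 * float(np.sum((h1[m] - h2[m]) ** 2 / denom[m]))

    def differentials(self, img, x, y):
        # Differential calculation module: G_int = f[d(H2, H3)] and
        # G_txt = f[d(H1, H3) + d(H2, H4)] over the four sub-windows
        # of the 2W x W window centered on P, with f = identity.
        W = self.W
        rows = slice(y - W // 2, y + W // 2)
        H = [self._hist(img[rows, x - W + i * (W // 2): x - W + (i + 1) * (W // 2)])
             for i in range(4)]
        g_int = self._d(H[1], H[2])
        g_txt = self._d(H[0], H[2]) + self._d(H[1], H[3])
        return g_int, g_txt

    def merge(self, g_int, g_txt):
        # Differential merging module: first/second differential = min[G_int, G_txt].
        return min(g_int, g_txt)

    def probability(self, brt, clr, x, y):
        # Probability calculation module: Pb(P) = alpha*G^brt + (1 - alpha)*G^clr.
        g_brt = self.merge(*self.differentials(brt, x, y))
        g_clr = self.merge(*self.differentials(clr, x, y))
        return self.alpha * g_brt + (1 - self.alpha) * g_clr
```

Keeping the three stages as separate methods matches the claim's module boundaries, so a multi-direction variant would only need to loop `differentials`/`merge` over θ before the final weighted sum.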
CN201210185752.0A 2012-06-06 2012-06-06 Three-dimensional scene object boundary detection method and device Expired - Fee Related CN102737245B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210185752.0A CN102737245B (en) 2012-06-06 2012-06-06 Three-dimensional scene object boundary detection method and device


Publications (2)

Publication Number Publication Date
CN102737245A CN102737245A (en) 2012-10-17
CN102737245B true CN102737245B (en) 2014-06-11

Family

ID=46992711

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210185752.0A Expired - Fee Related CN102737245B (en) 2012-06-06 2012-06-06 Three-dimensional scene object boundary detection method and device

Country Status (1)

Country Link
CN (1) CN102737245B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114721511B (en) * 2019-04-24 2025-03-25 北京星宿视觉文化传播有限公司 A method and device for positioning three-dimensional objects

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101676953A (en) * 2008-08-22 2010-03-24 奥多比公司 Automatic video image segmentation
CN101689300A (en) * 2007-04-27 2010-03-31 惠普开发有限公司 Image segmentation and enhancement
WO2011042601A1 (en) * 2009-10-09 2011-04-14 Visidon Oy Face recognition in digital images


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Xie Ming, "Target boundary flow and edge flow feature extraction in SAR images," Opto-Electronic Engineering, Vol. 35, No. 6, Jun. 2008, pp. 95-100. *

Also Published As

Publication number Publication date
CN102737245A (en) 2012-10-17

Similar Documents

Publication Publication Date Title
CN103150708B (en) Based on the image Quick demisting optimization method of black channel
US8873835B2 (en) Methods and apparatus for correcting disparity maps using statistical analysis on local neighborhoods
WO2016127736A1 (en) Computing method for area of fingerprint overlapping area and electronic apparatus
CN102411774A (en) Processing method, device and system based on single image defogging
CN104584076B (en) Image processing device and image processing method
Ying et al. Robust lane marking detection using boundary-based inverse perspective mapping
CN102982545B (en) A kind of image depth estimation method
CN103927717A (en) Depth image recovery method based on improved bilateral filters
CN102074014A (en) Stereo matching method by utilizing graph theory-based image segmentation algorithm
Hu et al. An adaptive lighting indoor vSLAM with limited on-device resources
CN103280052B (en) Be applied to the intrusion detection method of long distance track circuit intelligent video monitoring
CN102034230B (en) Method for enhancing image visibility
CN101582171A (en) A method and device for creating a depth map
CN109816645A (en) A kind of automatic testing method of coil of strip loose winding
JP7428870B2 (en) Tire wear degree estimation device, tire wear degree learning device, tire wear degree estimation method, learned model generation method and program
CN105447489A (en) Character and background adhesion noise elimination method for image OCR system
CN102737245B (en) Three-dimensional scene object boundary detection method and device
CN104981844A (en) Moving object detection
CN108256470A (en) A kind of lane shift judgment method and automobile
CN108174087A (en) A method and system for updating reference frames in grayscale projection image stabilization
Huang et al. Unstructured lane identification based on hough transform and improved region growing
CN105069773B (en) The auto-adaptable image edge detection computational methods being combined based on mask with canny
JP4825737B2 (en) Eye opening degree determination device
Pertuz et al. Region-based depth recovery for highly sparse depth maps
CN103986880B (en) Optical processing device and light source brightness setting method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20140611