
CN102905147A - Three-dimensional image correction method and apparatus - Google Patents


Info

Publication number
CN102905147A
CN102905147A
Authority
CN
China
Prior art keywords
correction
matrix
point
corner
points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012103205396A
Other languages
Chinese (zh)
Inventor
姚华
钟雄光
彭超建
何光彩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SHANGHAI STEREOSCOPIC DIGITAL TECHNOLOGY DEVELOPMENT Co Ltd
Original Assignee
SHANGHAI STEREOSCOPIC DIGITAL TECHNOLOGY DEVELOPMENT Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SHANGHAI STEREOSCOPIC DIGITAL TECHNOLOGY DEVELOPMENT Co Ltd filed Critical SHANGHAI STEREOSCOPIC DIGITAL TECHNOLOGY DEVELOPMENT Co Ltd
Priority to CN2012103205396A priority Critical patent/CN102905147A/en
Publication of CN102905147A publication Critical patent/CN102905147A/en
Pending legal-status Critical Current

Landscapes

  • Image Processing (AREA)

Abstract

The invention provides a three-dimensional image correction method and apparatus, and belongs to the technical field of three-dimensional (3D) image display. The three-dimensional image correction method comprises the following steps: an acquiring step: acquiring an original three-dimensional image pair; a correction parameter extracting step: finding matching point pairs in the original three-dimensional image pair to form a matching point pair set, and extracting correction parameters; a correction step: generating a correction matrix at least according to the correction parameters and a depth-of-field regulation parameter, and correcting the original three-dimensional image pair based on the correction matrix so as to eliminate vertical parallax; and a feedback step: feeding back and outputting the depth-of-field regulation parameter according to the screen-exiting/screen-entering degree of the displayed three-dimensional image. With this method, vertical parallax can be eliminated and the screen-exiting and screen-entering degrees of the displayed 3D image can be regulated, so that the viewer's experience of the three-dimensional image is good.

Description

Stereo image correction method and device
Technical Field
The invention belongs to the technical field of three-dimensional (3D) image display, and relates to a method and a device for correcting a three-dimensional image.
Background
A stereoscopic (3D) image display device is provided with a single camera or multiple cameras. For example, a dual-camera shooting module may be provided whose two cameras are arranged in parallel along the left-eye/right-eye direction: the left camera shoots and acquires a left view, the right camera shoots and acquires a right view, and the two views form an original stereoscopic image pair. This image pair is then further subjected to 3D processing by the stereoscopic image display device to form a 3D image with a good stereoscopic display effect for a viewer. Alternatively, the left view is shot by a single camera and the right view is then obtained by an auxiliary device or an auxiliary program; after further 3D processing by the stereoscopic image display device, the obtained image pair forms a 3D image with a stereoscopic display effect for a viewer.
In the above 3D processing, correction processing of the original stereoscopic image pair is usually included to improve the 3D image display effect, and such correction processing has been continuously improved in pursuit of a better 3D display effect.
Disclosure of Invention
One of the objects of the present invention is to obtain a better 3D image display effect.
It is still another object of the present invention to eliminate vertical parallax in the display of 3D images.
It is still another object of the present invention to provide a stereo image correction method suitable for use in a terminal with relatively low operation processing capability and relatively small memory capacity.
To achieve the above and other objects, the present invention provides the following technical solutions.
According to an aspect of the present invention, there is provided a stereoscopic image correction method, including:
an acquisition step: acquiring an original stereo image pair;
a correction parameter extraction step: searching a matching point pair in the original stereo image pair to form a matching point pair set, and extracting correction parameters;
a correction step: generating a correction matrix according to at least the correction parameter and the depth of field adjustment parameter, and correcting the original stereo image pair based on the correction matrix to eliminate vertical parallax; and
a feedback step: feeding back and outputting the depth of field adjustment parameter according to the screen-out degree/screen-in degree displayed by the stereo image;
wherein the depth of field adjustment parameter output by the feedback step is used in the correction step.
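The four steps above can be sketched as a small control loop. This is an illustrative sketch only: all function names (`acquire`, `extract_params`, `build_and_apply`, `get_feedback`) are hypothetical stand-ins, and only the data flow follows the text.

```python
# Hypothetical sketch of the four-step correction loop described above.
# The callables are stand-ins; only the data flow follows the text.

def correct_stereo_pair(acquire, extract_params, build_and_apply, get_feedback):
    left, right = acquire()                       # acquisition step
    depth_adj = 0.0                               # depth-of-field adjustment parameter
    while True:
        # correction parameter extraction step: matching point pairs -> parameters
        params = extract_params(left, right)
        # correction step: build a correction matrix from parameters + depth_adj
        corrected = build_and_apply(left, right, params, depth_adj)
        # feedback step: viewer feeds back a new adjustment, or accepts the result
        depth_adj, done = get_feedback(corrected)
        if done:
            return corrected
```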
In this correction method, the viewer can feed back the depth-of-field adjustment parameter through the feedback module, and the original stereo image pair can then be corrected again in combination with that parameter. Not only is the vertical parallax eliminated, but the original stereo image pair can also be re-corrected according to the viewer's viewing requirements, so that the screen-out and screen-in degrees of the 3D display are adjusted and the viewer's stereoscopic viewing experience is improved.
The stereoscopic image correction method according to an embodiment of the present invention, wherein the feedback step further includes:
a judging step: judging, according to the display experience of the stereo image, whether the stereo image needs to be corrected again; if so, proceeding to the correction parameter extraction step, and if not, proceeding to the correction step.
Specifically, in the acquiring step, the original stereo image pair is composed of a left view and a right view respectively photographed by a left camera and a right camera for the same scene.
The stereoscopic image correction method according to still another embodiment of the present invention, wherein the correction parameter extraction step includes:
a corner extraction step: extracting, from the views of the original stereo image pair, corner points whose brightness changes sharply and which are relatively easy to identify;
a corner matching step: extracting a robust feature descriptor for each corner point, and matching the corner points according to the feature descriptors to form an initial matching point pair set;
a mismatch elimination step: rejecting mismatching point pairs in the initial matching point pair set using a robust model estimation method to form a relatively stable and reliable second inner point set; and
a correction parameter optimization step: parameterizing the basis matrix (i.e., the fundamental matrix), establishing an error equation for the matching point pairs in the second inner point set based on the parameterized basis matrix, and optimizing the correction parameters using a nonlinear least squares method to obtain their optimal values.
In the stereo image correction method of this embodiment, the mismatch elimination step and the correction parameter optimization step make the correction parameters more accurate and reliable.
Further, preferably, in the corner extraction step, an OFAST corner is extracted by using an OFAST corner detection method;
the OFAST corner detection method comprises an OFAST corner determination step and an OFAST corner main direction extraction step.
Further, preferably, in the step of determining the OFAST corner, for a detected point p, a Bresenham circle is drawn with the point p as the center and a radius of r; if the gray values of n consecutive points on the Bresenham circle are all simultaneously greater than I(p)+t or all simultaneously less than I(p)−t, the point p is determined to be an OFAST corner point, where I(p) represents the gray value of point p and t is a detection threshold; the value range of n was given in the source as a formula rendered as an image, which is not preserved here.
In the step of extracting the principal direction of the OFAST corner, the points on the Bresenham circle of a determined OFAST corner are labeled: the point directly above the circle center is labeled 1 and the other points are labeled sequentially in the clockwise direction; the labels of the two endpoints of the n consecutive points are denoted a and b in clockwise order, and the principal direction α of the OFAST corner is determined by relation (1), which was rendered as an image in the source and is not preserved here, where α is the principal direction of the OFAST corner and m is the number of points contained in the Bresenham circle of radius r. The corner extraction method of this embodiment can determine corner points without extra information and simultaneously extracts a principal direction for each corner point; this principal direction can be exploited in the subsequent correction process to reduce the amount of computation.
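A minimal sketch of the segment test just described, assuming a 16-point Bresenham circle of radius r = 3 and n = 9. The patent gives the admissible range of n only as a formula image, so these concrete values are assumptions borrowed from common FAST variants.

```python
# Offsets of the 16 points on a radius-3 Bresenham circle, starting directly
# above the centre and proceeding clockwise (dx, dy with y growing downward).
CIRCLE = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2), (1, 3),
          (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1), (-2, -2), (-1, -3)]

def is_corner(img, x, y, t=20, n=9):
    """Return True if n consecutive circle points are all brighter than
    I(p)+t or all darker than I(p)-t (img is a list of pixel rows)."""
    center = img[y][x]
    ring = [img[y + dy][x + dx] for dx, dy in CIRCLE]
    for sign in (+1, -1):
        # Walk the circle twice so runs that wrap around the seam still count.
        run = 0
        for v in ring + ring:
            run = run + 1 if sign * (v - center) > t else 0
            if run >= n:
                return True
    return False
```

Note that this accepts any point whose whole ring differs from the centre, so an isolated dot also passes; real detectors add non-maximum suppression on top.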
Further, preferably, in the corner matching step, an OBRIEF feature descriptor is adopted and matching is performed based on the OBRIEF feature descriptor.
In a preferred embodiment, the corner point matching step includes the steps of:
generating a standard sampling pattern;
constructing the OBRIEF feature descriptor for the corner points; and
matching the corner points in the left view and the right view of the original stereo image pair using the OBRIEF feature descriptors to form matching point pairs.
In another preferred example, the corner point matching step includes:
a standard sampling pattern generation step: within a square box centered at the point (0, 0) with side length S, N point pairs (x_i, y_i) are drawn at random according to a uniform distribution or a Gaussian distribution (the distribution parameters were given in the source as formulas rendered as images and are not preserved here); the two sampling points x_i and y_i of each point pair are connected by a straight line segment, and the resulting segments generate the standard sampling pattern, where x_i and y_i are the two sampling points of the i-th point pair, 1 ≤ i ≤ N, the value range of the side length S of the square box is [2r+1, 12r+6], r is the radius of the Bresenham circle used in the OFAST corner detection method, and N is an even number in an interval whose bounds were given in the source as formulas rendered as images;
an OBRIEF feature descriptor construction step for OFAST corners: the standard sampling pattern is rotated according to the principal direction α of the OFAST corner, and the gray values of the two sampling points of each point pair in the rotated standard sampling pattern are compared to construct a binary OBRIEF feature descriptor;
a matching point pair formation step: the corner points in the left view and the right view of the original stereo image pair are matched using the OBRIEF feature descriptors to form matching point pairs.
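The pattern generation and descriptor construction above can be sketched as follows. The pair distribution, the box size, and the comparison rule involving the quantization threshold t are assumptions here, since the patent specifies them only in formulas rendered as images; `make_pattern` and `obrief` are illustrative names.

```python
import math
import random

def make_pattern(n_pairs=32, side=15, seed=0):
    """Draw n_pairs random point pairs inside a side x side box centred
    at (0, 0). Uniform sampling is an assumption."""
    rng = random.Random(seed)
    half = side // 2
    return [((rng.randint(-half, half), rng.randint(-half, half)),
             (rng.randint(-half, half), rng.randint(-half, half)))
            for _ in range(n_pairs)]

def obrief(img, cx, cy, alpha, pattern, t=4):
    """Binary descriptor packed into an int: bit i is 1 when the first
    rotated sample of pair i is brighter than the second by more than t."""
    c, s = math.cos(alpha), math.sin(alpha)
    def sample(pt):
        x, y = pt
        rx, ry = c * x - s * y, s * x + c * y   # rotate by principal direction
        return img[cy + int(round(ry))][cx + int(round(rx))]
    d = 0
    for i, (p1, p2) in enumerate(pattern):
        if sample(p1) > sample(p2) + t:
            d |= 1 << i
    return d
```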
Further, preferably, in the OBRIEF feature descriptor construction step, the OBRIEF feature descriptor D(p) of an OFAST corner p is constructed by relation (2), which was rendered as an image in the source and is not preserved here, where α is the principal direction of the OFAST corner p, N is the number of point pairs in the standard sampling pattern, (x_i, y_i) is a point pair in the standard sampling pattern, t is a quantization threshold with value range [2, 64], the descriptor D(p) is a binary number, R(α) denotes a 2-dimensional rotation matrix, and I(R(α)x_i) and I(R(α)y_i) denote the gray values of the rotated sampling points R(α)x_i and R(α)y_i.
In the correction method of any of the foregoing examples or embodiments, preferably, in the corner matching process, the similarity between OFAST corner points p_L and p_R is calculated by the following relation (3):

Dist(p_L, p_R) = bitcount(D(p_L) XOR D(p_R))    (3)

where p_L is an OFAST corner point of the left view, p_R is an OFAST corner point of the right view, D(·) is the OBRIEF feature descriptor, XOR denotes the bitwise exclusive-or operation, bitcount denotes counting the number of 1s in a binary number, and Dist(p_L, p_R) represents the similarity (Hamming distance) between p_L and p_R.
In the correction method of any of the foregoing examples or embodiments, for an OFAST corner p_L of the left view, all OFAST corner points in the right view are traversed, and the point that minimizes relation (3) is taken as the match of p_L, thereby forming a matching point pair.
In the above example, because OFAST corner detection and the binary OBRIEF feature descriptor are used, the bulk of the operations are bitwise operations and comparisons, so that both the amount of computation and the storage consumption are small.
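The similarity relation and the nearest-neighbour traversal reduce to an XOR plus a bit count over the binary descriptors; a minimal sketch:

```python
def dist(d_left, d_right):
    """Hamming distance between two packed binary descriptors:
    bitcount of their bitwise XOR."""
    return bin(d_left ^ d_right).count("1")

def match(left_desc, right_descs):
    """Index of the right-view descriptor with minimum Hamming distance,
    found by traversing all right-view candidates."""
    return min(range(len(right_descs)), key=lambda i: dist(left_desc, right_descs[i]))
```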
In the correction method of any of the foregoing examples or embodiments, in order to further reduce the amount of computation, it is preferable that in the correction parameter optimization step the basis matrix F is parameterized according to relation (4), which was rendered as an image in the source and is not preserved here; in this parameterization, [t]× is the antisymmetric matrix determined by a 3-dimensional vector t, derived by the following relation (5):

[t]× = |  0   −t3   t2 |
       |  t3   0   −t1 |    (5)
       | −t2   t1   0  |

where F is the basis matrix; R is a 3-dimensional rotation matrix, θ is the rotation angle around the camera Z axis, β is the rotation angle around the camera Y axis, and α is the rotation angle around the camera X axis; t represents the offset direction of the right camera, used to capture the right view of the original stereo image pair, relative to the left camera, used to capture the left view; φ is the angle between the offset direction of the right camera and the Y axis of the left camera, and γ is the angle between the offset direction of the right camera and the Z axis of the left camera; w_L and h_L are respectively the width and height of the left view in pixels, and w_R and h_R are respectively the width and height of the right view in pixels; f_L and f_R are respectively the focal lengths of the left camera and the right camera in pixel units.
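Relation (5) is the standard cross-product (skew-symmetric) matrix; the sketch below builds it and checks the defining property [t]× v = t × v.

```python
def skew(t):
    """Relation (5): antisymmetric matrix [t]x of a 3-vector t."""
    t1, t2, t3 = t
    return [[0.0, -t3,  t2],
            [ t3, 0.0, -t1],
            [-t2,  t1, 0.0]]

def matvec(m, v):
    """3x3 matrix-vector product."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def cross(a, b):
    """Ordinary 3-vector cross product, for checking [t]x v == t x v."""
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]
```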
In the correction method of any of the foregoing examples or embodiments, preferably, the error equation is a relation that was rendered as an image in the source and is not preserved here, where F is the basis matrix, F^T is the transpose of F, x_L is the point of each matching point pair in the second inner point set corresponding to the left view, x_R is the point corresponding to the right view, and Error denotes the correction error; the initial value taken for the parameters of the basis matrix F was likewise given in the source as a formula rendered as an image.
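The error equation itself survives only as an image in the source. A commonly used residual consistent with the symbols named above (the basis matrix F and matching points x_L, x_R taken in homogeneous coordinates) is the summed squared algebraic epipolar error Σ (x_R^T F x_L)²; the sketch below is that assumed form, not necessarily the patent's exact equation.

```python
def epipolar_error(F, pts_left, pts_right):
    """Sum of squared algebraic residuals x_R^T F x_L over matched points;
    points are (x, y) pixel coordinates lifted to homogeneous form.
    This assumed residual stands in for the patent's unrecoverable equation."""
    total = 0.0
    for (xl, yl), (xr, yr) in zip(pts_left, pts_right):
        hl, hr = (xl, yl, 1.0), (xr, yr, 1.0)
        fx = [sum(F[i][j] * hl[j] for j in range(3)) for i in range(3)]  # F x_L
        r = sum(hr[i] * fx[i] for i in range(3))                         # x_R^T F x_L
        total += r * r
    return total
```

With F fixed, a nonlinear least squares solver would adjust the parameterized entries of F to minimize this total, as the optimization step describes.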
The stereoscopic image correction method according to still another embodiment of the present invention, wherein the correcting step includes:
a correction matrix construction step: constructing a correction matrix using the correction parameters;
a correction matrix fine-tuning step: fine-tuning the correction matrix at least in combination with the depth-of-field adjustment parameter; and
a stereo image cropping step: processing the views of the original stereo image pair respectively using the fine-tuned correction matrix to obtain a corrected stereo image pair.
The stereoscopic image correction method according to still another embodiment of the invention, wherein the correcting step includes:
a correction matrix construction step: constructing a correction matrix using the correction parameters;
a correction matrix fine-tuning step: fine-tuning the correction matrix at least in combination with the depth-of-field adjustment parameter and the second inner point set; and
a stereo image cropping step: processing the views of the original stereo image pair respectively using the fine-tuned correction matrix to obtain a corrected stereo image pair.
In the stereoscopic image correction method of any of the foregoing embodiments, preferably, the correction matrix is constructed based on relation (7), which was rendered as an image in the source and is not preserved here, where H_L is the correction matrix corresponding to the left view of the original stereo image pair and H_R is the correction matrix corresponding to the right view; K_N is the internal parameter matrix of the corrected camera, K_L(f_L) is the internal parameter matrix of the left camera before correction, and K_R(f_R) is the internal parameter matrix of the right camera before correction; f_L and f_R are respectively the focal lengths, in pixel units, of the left camera used to capture the left view and the right camera used to capture the right view; θ is the rotation angle around the camera Z axis, β is the rotation angle around the camera Y axis, and α is the rotation angle around the camera X axis; φ is the angle between the offset direction of the right camera and the Y axis of the left camera, and γ is the angle between the offset direction of the right camera and the Z axis of the left camera;
wherein K_L(f_L) and K_R(f_R) are calculated by the following relation:

K_L(f_L) = | f_L   0   w_L/2 |        K_R(f_R) = | f_R   0   w_R/2 |
           |  0   f_L  h_L/2 |                   |  0   f_R  h_R/2 |
           |  0    0     1   |                   |  0    0     1   |

where w_L and h_L are respectively the width and height of the left view in pixels, and w_R and h_R are respectively the width and height of the right view in pixels;
wherein R is a rotation matrix, R_L is the rotation matrix of the left camera in the correction process, and R_R is the rotation matrix of the right camera in the correction process.
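Relation (7) survives only as an image in the source; the composition H = K_N · R · K(f)⁻¹, with the principal point taken at the image centre, is the standard rectifying homography consistent with the matrices named above, and is sketched here under that assumption.

```python
def intrinsics(f, w, h):
    """Pinhole matrix with the principal point assumed at the image centre."""
    return [[f, 0.0, w / 2.0],
            [0.0, f, h / 2.0],
            [0.0, 0.0, 1.0]]

def inv_intrinsics(f, w, h):
    """Closed-form inverse of the pinhole matrix above."""
    return [[1.0 / f, 0.0, -w / (2.0 * f)],
            [0.0, 1.0 / f, -h / (2.0 * f)],
            [0.0, 0.0, 1.0]]

def matmul(a, b):
    """3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def correction_matrix(K_N, R, f, w, h):
    """Assumed rectifying form H = K_N . R . K(f)^-1 for one view."""
    return matmul(matmul(K_N, R), inv_intrinsics(f, w, h))
```

With R the identity and K_N equal to the view's own intrinsics, the correction matrix reduces to the identity, i.e. an already-rectified view is left untouched.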
Further, preferably, the fine-tuned correction matrix is calculated by relation (8), which was rendered as an image in the source and is not preserved here, where H_L is the correction matrix corresponding to the left view, H_R is the correction matrix corresponding to the right view, λ is the depth-of-field adjustment parameter, M_L(λ) denotes the depth adjustment matrix corresponding to the left view, M_R(λ) denotes the depth adjustment matrix corresponding to the right view, Δy is the adjustment amount in the vertical direction, and Δx is the adjustment amount in the horizontal direction.
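If the depth adjustment matrices are modelled as planar translations by the horizontal and vertical adjustment amounts (Δx, Δy) named above, the fine-tuning can be sketched as follows; the pure-translation form is an assumption, since relation (8) is not preserved.

```python
def adjust(H, dx, dy):
    """Left-multiply a 3x3 correction matrix H by a translation matrix,
    shifting the corrected view by (dx, dy) pixels. The translation form
    of the depth adjustment matrix is an assumption."""
    M = [[1.0, 0.0, dx],
         [0.0, 1.0, dy],
         [0.0, 0.0, 1.0]]
    return [[sum(M[i][k] * H[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]
```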
In the stereoscopic image correction method of any of the foregoing embodiments, preferably, the stereo image cropping step includes the steps of:
respectively acquiring the croppable areas of the corrected left view and the corrected right view of the original stereo image pair;
acquiring the maximum common cropping area between the corrected left view and the corrected right view; and
filling the maximum common cropping area of the corrected left and right views with the corresponding gray values of the original stereo image pair.
According to still another aspect of the present invention, there is provided a stereoscopic image correction device, including:
an acquisition module for acquiring an original stereoscopic image pair;
the correction parameter extraction module is used for searching a matching point pair in the original stereo image pair to form a matching point pair set and extracting a correction parameter;
the correction module is used for generating a correction matrix at least according to the correction parameters and the depth adjustment parameters and correcting the original stereo image pair based on the correction matrix so as to eliminate vertical parallax; and
the feedback module is used for feeding back and outputting the depth of field adjustment parameter according to the screen output degree/screen input degree of the stereoscopic image display;
and the depth of field adjustment parameter output by the feedback module is output to the correction module.
The stereoscopic image correction apparatus according to an embodiment of the present invention further includes:
a judging module for judging whether correction needs to be carried out again according to the display experience of the stereo image.
Specifically, the original stereo image pair acquired by the acquisition module is composed of a left view and a right view which are respectively photographed by a left camera and a right camera for the same scene.
The stereoscopic image correction device according to still another embodiment of the present invention, wherein the correction parameter extraction module includes:
a corner extraction unit for extracting, from the views of the original stereo image pair, corner points whose brightness changes sharply and which are relatively easy to identify;
a corner matching unit for extracting a robust feature descriptor for each corner point and matching the corner points according to the feature descriptors to form an initial matching point pair set;
a mismatch rejection unit for rejecting mismatching point pairs in the initial matching point pair set using a robust model estimation method to form a relatively stable and reliable second inner point set; and
a correction parameter optimization unit for parameterizing the basis matrix, establishing an error equation for the matching point pairs in the second inner point set based on the parameterized basis matrix, and optimizing the correction parameters using a nonlinear least squares method to obtain their optimal values.
Further, preferably, the corner extraction unit extracts an OFAST corner by using an OFAST corner detection component;
the OFAST corner detection component comprises an OFAST corner determination submodule and an OFAST corner main direction extraction submodule.
Further, preferably, in the OFAST corner determination submodule, for a detected point p, a Bresenham circle is drawn with the point p as the center and a radius of r; if the gray values of n consecutive points on the Bresenham circle are all simultaneously greater than I(p)+t or all simultaneously less than I(p)−t, the point p is determined to be an OFAST corner point, where I(p) represents the gray value of point p and t is a detection threshold; the value range of n was given in the source as a formula rendered as an image, which is not preserved here.
In the OFAST corner principal direction extraction submodule, the points on the Bresenham circle of a determined OFAST corner are labeled: the point directly above the circle center is labeled 1 and the other points are labeled sequentially in the clockwise direction; the labels of the two endpoints of the n consecutive points are denoted a and b in clockwise order, and the principal direction α of the OFAST corner is determined by relation (1), which was rendered as an image in the source and is not preserved here, where α is the principal direction of the OFAST corner and m is the number of points contained in the Bresenham circle of radius r.
Further, preferably, the corner matching unit uses an OBRIEF feature descriptor and performs matching based on the OBRIEF feature descriptor.
In a preferred example, the corner matching unit includes:
a component for generating a standard sampling pattern;
a component for constructing an OBRIEF feature descriptor for the corner points; and
a component for matching the corner points in the left view and the right view of the original stereo image pair using the OBRIEF feature descriptors to form matching point pairs.
In still another preferred example, the corner matching unit includes:
a standard sampling pattern generation component: within a square box centered at the point (0, 0) with side length S, N point pairs (x_i, y_i) are drawn at random according to a uniform distribution or a Gaussian distribution (the distribution parameters were given in the source as formulas rendered as images and are not preserved here); the two sampling points x_i and y_i of each point pair are connected by straight line segments to generate the standard sampling pattern, where x_i and y_i are the two sampling points of the i-th point pair, 1 ≤ i ≤ N, the value range of the side length S of the square box is [2r+1, 12r+6], r is the radius of the Bresenham circle used in the OFAST corner detection process, and N is an even number in an interval whose bounds were given in the source as formulas rendered as images;
an OBRIEF feature descriptor construction component: the standard sampling pattern is rotated according to the principal direction α of the OFAST corner, and the gray values of the two sampling points of each point pair in the rotated standard sampling pattern are compared to construct a binary OBRIEF feature descriptor;
a matching point pair formation component: the corner points in the left view and the right view of the original stereo image pair are matched using the OBRIEF feature descriptors to form matching point pairs.
Further, preferably, in the OBRIEF feature descriptor construction component, the OBRIEF feature descriptor D(p) of an OFAST corner p is constructed by relation (2), which was rendered as an image in the source and is not preserved here, where α is the principal direction of the OFAST corner p, N is the number of point pairs in the standard sampling pattern, (x_i, y_i) is a point pair in the standard sampling pattern, t is a quantization threshold with value range [2, 64], the descriptor D(p) is a binary number, R(α) denotes a 2-dimensional rotation matrix, and I(R(α)x_i) and I(R(α)y_i) denote the gray values of the rotated sampling points R(α)x_i and R(α)y_i.
In the correction device of any of the foregoing examples or embodiments, preferably, in the matching point pair formation component, the similarity between OFAST corner points p_L and p_R is calculated by the following relation (3):

Dist(p_L, p_R) = bitcount(D(p_L) XOR D(p_R))    (3)

where p_L is an OFAST corner point of the left view, p_R is an OFAST corner point of the right view, D(·) is the OBRIEF feature descriptor, XOR denotes the bitwise exclusive-or operation, bitcount denotes counting the number of 1s in a binary number, and Dist(p_L, p_R) represents the similarity (Hamming distance) between p_L and p_R.
In the correction device of any of the foregoing examples or embodiments, preferably, the correction parameter optimization unit parameterizes the basis matrix F according to relation (4), which was rendered as an image in the source and is not preserved here; in this parameterization, [t]× is the antisymmetric matrix determined by a 3-dimensional vector t, derived by the following relation (5):

[t]× = |  0   −t3   t2 |
       |  t3   0   −t1 |    (5)
       | −t2   t1   0  |

where F is the basis matrix; R is a 3-dimensional rotation matrix, θ is the rotation angle around the camera Z axis, β is the rotation angle around the camera Y axis, and α is the rotation angle around the camera X axis; t represents the offset direction of the right camera, used to capture the right view of the original stereo image pair, relative to the left camera, used to capture the left view; φ is the angle between the offset direction of the right camera and the Y axis of the left camera, and γ is the angle between the offset direction of the right camera and the Z axis of the left camera; w_L and h_L are respectively the width and height of the left view in pixels, and w_R and h_R are respectively the width and height of the right view in pixels; f_L and f_R are respectively the focal lengths of the left camera and the right camera in pixel units.
In the correction device of any of the foregoing examples or embodiments, preferably, the error equation is a relation that was rendered as an image in the source and is not preserved here, where F is the basis matrix, F^T is the transpose of F, x_L is the point of each matching point pair in the second inner point set corresponding to the left view, x_R is the point corresponding to the right view, and Error denotes the correction error; the initial value taken for the parameters of the basis matrix F was likewise given in the source as a formula rendered as an image.
According to still another embodiment of the present invention, in the stereoscopic image correction device, the correction module includes:
a correction matrix construction unit for constructing a correction matrix using the correction parameters;
a correction matrix fine tuning unit for fine tuning the correction matrix at least in combination with the depth of field adjustment parameter; and
a stereo image cropping unit for respectively processing the views in the original stereo image pair with the fine-tuned correction matrix to obtain a corrected stereo image pair.
The stereoscopic image correction device according to still another embodiment of the present invention, wherein the correction module includes:
a correction matrix construction unit for constructing a correction matrix using the correction parameters;
a correction matrix fine tuning unit for fine tuning the correction matrix at least in combination with the depth of field adjustment parameter and a second inner point set; and
a stereo image cropping unit for respectively processing the views in the original stereo image pair with the fine-tuned correction matrix to obtain a corrected stereo image pair.
In the stereoscopic image correction device of any of the foregoing embodiments, preferably, the correction matrix construction unit constructs the correction matrix by executing the following relational expression (7):

H_L = K_N R_L K_L(f_L)^(-1),  H_R = K_N R_R K_R(f_R)^(-1)    (7)

wherein H_L is the correction matrix corresponding to the left view of the original stereo image pair and H_R is the correction matrix corresponding to the right view of the original stereo image pair; K_N is the internal parameter matrix of the corrected camera, K_L(f_L) is the internal parameter matrix of the left camera before correction, and K_R(f_R) is the internal parameter matrix of the right camera before correction; f_L and f_R are the focal lengths, in pixel units, of the left camera used to photograph the left view and the right camera used to photograph the right view; θ is the rotation angle around the camera Z axis, β is the rotation angle around the camera Y axis, and α is the rotation angle around the camera X axis; φ is the included angle between the offset direction of the right camera and the Y axis of the left camera, and γ is the included angle between the offset direction of the right camera and the Z axis of the left camera;

wherein K_L(f_L) and K_R(f_R) are calculated by the following relation:

K_L(f_L) = [[f_L, 0, w_L/2], [0, f_L, h_L/2], [0, 0, 1]],  K_R(f_R) = [[f_R, 0, w_R/2], [0, f_R, h_R/2], [0, 0, 1]]

wherein w_L and h_L are respectively the width and height of the left view in pixels, and w_R and h_R are respectively the width and height of the right view in pixels;

wherein R is a rotation matrix, R_L is the rotation matrix of the left camera in the correction process, and R_R is the rotation matrix of the right camera in the correction process.
Further, preferably, the correction matrix fine adjustment unit calculates the fine-tuned correction matrices by the following relation (8):

H'_L = M_L(d) H_L,  H'_R = M_R(d) H_R    (8)

wherein H_L is the correction matrix corresponding to the left view and H_R is the correction matrix corresponding to the right view; d is the depth of field adjustment parameter, M_L(d) represents the depth adjustment matrix corresponding to the left view, and M_R(d) represents the depth adjustment matrix corresponding to the right view; Δy is the adjustment amount in the vertical direction and Δx is the adjustment amount in the horizontal direction.
In the stereoscopic image correction device of any of the foregoing embodiments, preferably, the stereo image cropping unit includes:
a component for respectively obtaining the croppable regions of the corrected left view and the corrected right view of the original stereo image pair;
a component for acquiring the maximum common cropping region between the corrected left view and the corrected right view; and
a component for filling the portions of the corrected left view and right view lying within the maximum common cropping region with the corresponding gray values of the original stereo image pair.
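The cropping logic described above can be sketched as follows; this is an illustrative sketch (function and variable names are ours, not the patent's), in which the maximum common cropping region is simply the intersection of the two croppable rectangles:

```python
def intersect_regions(a, b):
    """Intersect two axis-aligned rectangles given as (left, top, right, bottom).

    Returns the maximum common region, or None if they do not overlap.
    """
    left = max(a[0], b[0])
    top = max(a[1], b[1])
    right = min(a[2], b[2])
    bottom = min(a[3], b[3])
    if right <= left or bottom <= top:
        return None  # the croppable regions do not overlap
    return (left, top, right, bottom)

# Example: croppable regions of a corrected left view and a corrected right view.
left_region = (12, 8, 630, 470)
right_region = (0, 15, 615, 480)
common = intersect_regions(left_region, right_region)
```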
Drawings
The above and other objects and advantages of the present invention will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings, in which like or similar elements are designated by like reference numerals.
Fig. 1 is a schematic block diagram of a stereo image correction apparatus according to an embodiment of the present invention.
Fig. 2 is a flowchart illustrating a stereo image correction method according to an embodiment of the present invention.
FIG. 3 is a flowchart illustrating a method for extracting calibration parameters according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of the OFAST corner extraction used in the corner extraction step.
Fig. 5 is a schematic diagram of a standard sampling pattern.
Fig. 6 is a schematic block diagram of a correction parameter extraction module according to an embodiment of the present invention.
FIG. 7 is a flowchart of a method of the calibration step according to an embodiment of the invention.
FIG. 8 is a schematic diagram of a tailorable area of a corrected view.
Fig. 9 is a schematic block diagram of a calibration module according to an embodiment of the present invention.
Detailed Description
The following description is of some of the many possible embodiments of the invention and is intended to provide a basic understanding of the invention and is not intended to identify key or critical elements of the invention or to delineate the scope of the invention. It is easily understood that according to the technical solution of the present invention, other implementations that can be substituted with each other can be suggested by those skilled in the art without changing the spirit of the present invention. Therefore, the following detailed description and the accompanying drawings are merely illustrative of the technical aspects of the present invention, and should not be construed as all of the present invention or as limitations or limitations on the technical aspects of the present invention.
In the following description, for clarity and conciseness, not all of the various components shown in the figures are described. The drawings show various components sufficient to enable one of ordinary skill in the art to practice the present invention; the operation of many of these components is familiar and obvious to those skilled in the art.
In the description, components of the various embodiments that are described using directional terms (e.g., "left," "right," etc.) and similar terms represent the directions shown in the drawings or directions that can be understood by those skilled in the art. These directional terms are used for relative description and clarification and are not intended to limit the orientation of any embodiment to a particular direction or orientation.
Fig. 1 is a schematic block diagram of a stereo image correction apparatus according to an embodiment of the present invention. In this embodiment, the stereoscopic image correction device 20 is configured to perform stereoscopic correction processing on an original stereoscopic image pair captured by the camera module 10 and to output the corrected stereoscopic image pair to the 3D display screen 30, so as to improve the 3D image display effect. The stereoscopic image correction device 20 may be a part of, or included in, a stereoscopic image display device. Moreover, the camera module 10 may be a single-camera module, a dual-camera module, or a camera module of any other structural form capable of shooting the images that form an original stereo image pair.
Fig. 2 is a flowchart illustrating a stereo image correction method according to an embodiment of the invention. Corresponding to fig. 1, the stereo image correction apparatus 20 is configured to complete the method flow shown in fig. 2. Specifically, in this embodiment, the stereoscopic image correction method mainly includes the steps of:
step S910, acquiring step: an original stereo image pair is acquired. Specifically, when the camera module 10 is a dual-camera module, it includes a left camera and a right camera, and the original stereo image pair is composed of a left view and a right view respectively photographed by the left camera and the right camera for the same scene; when the camera module 10 is a single-camera module, a left view is taken at one position and a right view is taken at another position with the help of an auxiliary device or an auxiliary program. In the following embodiments, the description mainly takes a dual-camera module as an example, but it should be understood that when the camera module 10 is a single-camera module, it can be virtually regarded as the "left camera" when taking the left view and as the "right camera" when taking the right view; although the "left camera" and the "right camera" are then substantially the same camera, the positional parameters and the like between the two can be calculated similarly.
Step S920, a correction parameter extraction step: matching point pairs are searched for in the original stereo image pair to form a matching point pair set, and correction parameters are extracted. This step is specifically performed by the correction parameter extraction module 21 shown in fig. 1, and the extracted correction parameters are output to the correction module 22.
In step S930, a correction matrix is generated according to the correction parameters and the depth adjustment parameters, and the original stereo image pair is corrected based on the correction matrix to eliminate the vertical parallax. The step is specifically completed by the correction module 22 shown in fig. 1, the depth-of-field adjustment parameter is specifically provided by feedback from the feedback module 23, and the correction module 22 may output a corrected stereoscopic image pair, so that the stereoscopic image pair may be displayed on the 3D display screen 30 shown in fig. 1, and the vertical parallax may be eliminated.
Step S940, a feedback step: the viewer feeds back the depth of field adjustment parameter according to the screen-out degree/screen-in degree of the stereoscopic image display. This step is accomplished in particular by the feedback module 23 shown in fig. 1. After watching the 3D image display on the 3D display screen 30, the viewer obtains a viewing experience and can intuitively judge the screen-out or screen-in degree of the 3D image display; according to the viewer's personal experience requirement, the viewer may then feed back the depth of field adjustment parameter d through the feedback module 23 shown in fig. 1. The larger the screen-out degree selected by the viewer, the larger the depth of field adjustment parameter d fed back by the feedback module 23; conversely, the larger the screen-in degree selected by the viewer, the smaller d. The depth of field adjustment parameter is further input to the correction module 22, so that the original stereo image pair can be further corrected in step S930 according to the change of the depth of field adjustment parameter.
Step S950, a determination step: the viewer judges whether re-correction is required according to the stereoscopic display experience, and if yes, the process proceeds to the correction parameter extraction step S920, and if no, the process proceeds to the correction step S930. This step is accomplished in the recalibration decision module 24 as shown in fig. 1.
To complete the process of the stereo image correction method in the above embodiment, in an embodiment, the stereo image correction apparatus 20 may further include an obtaining module, which is configured to complete the above step S910.
In the process of the stereo image correction method of the above embodiment, the introduced feedback step feeds back the depth of field adjustment parameter d to the correction step, so that the screen-out/screen-in degree can be flexibly adjusted according to the requirements of the viewer; the viewer can thus conveniently obtain the desired 3D viewing experience, and the 3D image display effect is greatly improved.
FIG. 3 is a flowchart illustrating a method of the correction parameter extraction step according to an embodiment of the invention. The correction parameter extraction method of this embodiment is specifically described below with reference to fig. 2 and 3.
First, in step S921, the corner point extraction step: corner points, whose intensity varies relatively sharply and which are relatively easy to recognize, are extracted from the views of the original stereo image pair (transmitted by the acquisition unit). Many corner detection methods can be used in this step, for example: the Harris corner detection method, SUSAN (Smallest Univalue Segment Assimilating Nucleus) corner detection, SIFT (Scale Invariant Feature Transform) corner detection, and the like.
The Chinese patent application No. CN200910118629.5, entitled "multi-view video image correction method, apparatus and system", uses a SIFT-like corner detection method and feature descriptor, which involve complex operations such as convolution, scale spaces and histograms, and which are therefore not suitable for portable terminals such as mobile phones and tablets with relatively low CPU processing capability and small RAM capacity. Considering the limited processing capability and memory available when various portable digital terminals display 3D stereoscopic images, the applicant preferably extracts OFAST (Oriented Features from Accelerated Segment Test) corners. The following description takes the extraction of OFAST corners in the left view as an example; similar operations can be adopted to realize OFAST corner extraction for the right view.
Fig. 4 is a schematic diagram illustrating the OFAST corner extraction used in the corner extraction step. In fig. 4, a local area in a view (left view or right view) is shown, and each cell represents a pixel point. The method for extracting the OFAST corner mainly comprises two steps of OFAST corner judgment and OFAST corner main direction extraction.
First, an OFAST corner point determining step.
Taking point p in fig. 4 as an example, it is determined whether p is an OFAST corner point. For point p to be an OFAST corner point, the gray values of n continuous points on the discretized Bresenham circle of radius r centered on p must all be greater than I(p) + t or all be less than I(p) - t; wherein I(p) represents the gray value of point p, t represents a gray threshold whose value range is [0, 128], the radius r of the Bresenham circle has a value range of [2, 24], and the number n of continuous points takes a value in a range proportional to the circle circumference, π representing the circumference ratio. In the example of fig. 4, the OFAST corner determination parameters are set to r = 3 and n = 9; the 16 points on the Bresenham circle are labeled 1-16, the point directly above the circle center being labeled 1 and the labels increasing to 16 in the clockwise direction. For any point p in the left view, if the gray values of 9 continuous points (points 14, 15, 16, 1, 2, 3, 4, 5 and 6; in practical implementation the number of continuous points may be more than 9) on the discretized Bresenham circle of radius 3 centered on p are all greater than I(p) + t or all less than I(p) - t, point p is determined to be an OFAST corner point.
Second, the OFAST corner main direction extraction step.
When a point is determined to be an OFAST corner point, its main direction also needs to be determined and extracted. As described above, when labeling the points on the discretized Bresenham circle, the point directly above the circle center is labeled 1 and the other points are labeled in sequence in the clockwise direction. In the OFAST corner determination, let the labels of the two end points of the n continuous points on the Bresenham circle that satisfy the OFAST corner criterion be denoted a and b respectively in the clockwise direction; the main direction θ_p of the OFAST corner is then determined from a, b and N according to equation (1), wherein N is the number of points contained in the discretized Bresenham circle of radius r. As shown in fig. 4, the labels of the two end points of the continuous points satisfying the OFAST corner criterion on the discretized Bresenham circle of radius 3 centered on point p are 14 and 6 respectively in the clockwise direction, so that a = 14 and b = 6; the arrow at point p represents the main direction θ_p of the OFAST corner.
By adopting this corner extraction method, the main direction can be extracted at the same time as the corner is determined, without extra information; the main direction is thus easy to extract, and its use in the subsequent corner matching helps reduce the amount of calculation.
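The two steps above can be roughly illustrated with the following sketch (our own simplification, not the patent's code; in particular, reading equation (1) as the angle of the midpoint of the qualifying arc is an assumption, since the equation itself is given only as an image):

```python
import math

# Offsets (dx, dy) of the 16 points on the discretized Bresenham circle of
# radius 3, labeled 1..16 clockwise starting directly above the center
# (x grows rightward, y grows downward).
CIRCLE = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2), (1, 3),
          (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1), (-2, -2), (-1, -3)]

def ofast_test(img, x, y, t=20, n=9):
    """Return (is_corner, main_direction_in_radians) for pixel (x, y)."""
    center = img[y][x]
    vals = [img[y + dy][x + dx] for dx, dy in CIRCLE]
    for sign in (1, -1):  # sign=1: brighter-than test; sign=-1: darker-than test
        ok = [sign * (v - center) > t for v in vals]
        ok2 = ok + ok  # unroll the circle so wrap-around runs become contiguous
        best_len, best_start, run = 0, -1, 0
        for i, flag in enumerate(ok2):
            if flag:
                run += 1
                start = i - run + 1
                if run > best_len and start < len(ok):
                    best_len, best_start = run, start
            else:
                run = 0
        if best_len >= n:
            # Assumed reading of equation (1): the main direction points at the
            # midpoint of the qualifying arc (label 1, index 0, points straight up).
            mid = (best_start + (best_len - 1) / 2.0) % len(ok)
            return True, mid * 2.0 * math.pi / len(ok)
    return False, None

# A bright square on a dark background: its boundary pixels respond as corners.
img = [[0] * 20 for _ in range(20)]
for yy in range(8, 13):
    for xx in range(8, 13):
        img[yy][xx] = 200
is_corner, direction = ofast_test(img, 8, 8)  # top-left pixel of the square
```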
Further, in step S922, the corner point matching step: a feature descriptor with robust characteristics is extracted for each corner point, and the corner points are matched according to the feature descriptors to form an initial matching point pair set. Many methods can be used for feature descriptor extraction, for example SURF (Speeded Up Robust Features) and SIFT (Scale Invariant Feature Transform). In order to reduce the amount of computation and the storage consumption in this step, in a preferred embodiment the feature descriptor extraction in step S922 uses OBRIEF (Oriented Binary Robust Independent Elementary Features) descriptors, and matching is performed based on the OBRIEF feature descriptors. As before, the extraction of OBRIEF feature descriptors for the OFAST corners extracted from the left view is taken as the example; a similar operation can be adopted for the right view to extract the OBRIEF feature descriptors of its OFAST corners.
The corner matching step based on the OBIEF feature descriptors includes the following sub-steps.
First, a standard sampling pattern is generated. The standard sampling pattern is obtained by randomly extracting n_d point pairs (x_i, y_i), centered on the point (0, 0), within a square frame of size S × S according to either a uniform distribution or a Gaussian distribution, each point pair comprising two sampling points x_i and y_i. The value range of S is [2r + 1, 12r + 6], where r is the radius of the Bresenham circle used in the OFAST corner detection process, and the number n_d of point pairs in the standard sampling pattern is an even number within a prescribed interval. The two sampling points of each point pair (x_i and y_i) are then connected by a straight line segment, producing the standard sampling pattern shown in fig. 5.
Secondly, an OBRIEF feature descriptor is constructed for each OFAST corner point. The standard sampling pattern is rotated according to the OFAST corner main direction θ_p, and the gray values at the two sampling points of each point pair in the rotated sampling pattern are then compared to construct a binary feature descriptor. In this embodiment, the OBRIEF feature descriptor D_p for point p is constructed by relation (2), wherein θ_p is the main direction of the OFAST corner point p; n_d is the number of point pairs in the standard sampling pattern; (x_i, y_i) is a point pair in the standard sampling pattern; t is a quantization threshold with value range [2, 64]; each comparison yields one binary digit; R(θ_p) represents the 2-dimensional rotation matrix through θ_p; and I(R(θ_p)x_i) and I(R(θ_p)y_i) denote the gray values at the two rotated sampling points.
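A minimal sketch of the pattern generation and descriptor construction just described (our simplification: the uniform-distribution variant of the sampling pattern, and one plausible reading of relation (2), whose exact expression is given only as an image):

```python
import math
import random

def make_pattern(num_pairs=32, s=31, seed=7):
    """Standard sampling pattern: num_pairs point pairs drawn uniformly from an
    s x s square centered on (0, 0); each pair holds two sampling points."""
    rng = random.Random(seed)
    half = s / 2.0
    pt = lambda: (rng.uniform(-half, half), rng.uniform(-half, half))
    return [(pt(), pt()) for _ in range(num_pairs)]

def obrief_descriptor(img, x, y, theta, pattern, t=8):
    """Build a binary descriptor at corner (x, y) with main direction theta:
    rotate each sampling pair by theta (a 2-D rotation matrix), then emit bit 1
    when the first sample is brighter than the second by more than the
    quantization threshold t (assumed reading of relation (2))."""
    c, s = math.cos(theta), math.sin(theta)
    bits = 0
    for (x1, y1), (x2, y2) in pattern:
        u1, v1 = x + round(c * x1 - s * y1), y + round(s * x1 + c * y1)
        u2, v2 = x + round(c * x2 - s * y2), y + round(s * x2 + c * y2)
        bits = (bits << 1) | (1 if img[v1][u1] - img[v2][u2] > t else 0)
    return bits

# Horizontal intensity ramp; descriptor at the image center, unrotated.
img = [[min(255, 4 * xx) for xx in range(64)] for yy in range(64)]
pattern = make_pattern()
d0 = obrief_descriptor(img, 32, 32, 0.0, pattern)
```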
Thirdly, the corner points in the left view and the right view are matched using the OBRIEF feature descriptors to form a matching point pair set. The OBRIEF feature descriptors formed through the above steps are binary features, which not only saves storage space but also allows the similarity between the descriptors of corners in the left view and the descriptors of corners in the right view to be compared rapidly using a bitwise XOR operation before the corners are matched. Specifically, for OFAST corner points p_L and p_R, the similarity between the two is calculated by the following relation (3):

D(p_L, p_R) = bitcount(XOR(D_{p_L}, D_{p_R}))    (3)

wherein p_L is an OFAST corner point of the left view, p_R is an OFAST corner point of the right view, XOR represents a bitwise exclusive-or operation, and bitcount represents counting the number of 1s in a binary number. The smaller the value of D(p_L, p_R), the more similar the corner points p_L and p_R.
In particular, for an OFAST corner point p_L in the left view, all OFAST corner points in the right view are traversed, and the point p_R that minimizes the value of relation (3) is taken as the matching point of p_L; the two form a group of matching point pairs, and all matching point pairs form the initial matching point pair set.
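The XOR-and-bitcount comparison of relation (3) and the left-to-right traversal can be sketched as follows (with illustrative toy descriptors, not real image data):

```python
def hamming(a, b):
    """Relation (3): bitcount(XOR(D_a, D_b)); the smaller, the more similar."""
    return bin(a ^ b).count("1")

def match_corners(left_desc, right_desc):
    """For each left-view descriptor, traverse all right-view descriptors and
    keep the index of the one with the smallest Hamming distance; the result
    plays the role of the initial matching point pair set."""
    return [(i, min(range(len(right_desc)), key=lambda k: hamming(dl, right_desc[k])))
            for i, dl in enumerate(left_desc)]

left_desc = [0b10110010, 0b01011100]
right_desc = [0b10110011, 0b11100001, 0b01011000]
matches = match_corners(left_desc, right_desc)  # -> [(0, 0), (1, 2)]
```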
Further, step S923, a mismatch rejection step: mismatching point pairs existing in the initial matching point pair set are rejected using a robust model estimation method, forming a relatively stable and reliable second inner point set. Among others, the following methods may be used in this step: GCE (Genetic Consistency Estimation), LMedS (Least Median of Squares), MLESAC (Maximum Likelihood SAmple Consensus), MAPSAC (Maximum A Posteriori SAmple Consensus), and StaRSaC (Stable Random SAmple Consensus). After the processing of this step, the resulting second inner point set is the stable and reliable inner point set with the best consistency.
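The consensus-style rejection shared by these estimators can be illustrated with a generic skeleton (ours, not the patent's; a toy horizontal-shift model stands in for the fundamental-matrix model a real implementation would fit):

```python
import random

def robust_inliers(pairs, fit, residual, sample_size, thresh, iters=200, seed=0):
    """Generic robust-estimation skeleton: repeatedly fit a model to a random
    minimal sample and keep the model with the largest consensus set. The
    surviving pairs play the role of the 'second inner point set'."""
    rng = random.Random(seed)
    best = []
    for _ in range(iters):
        sample = rng.sample(pairs, sample_size)
        model = fit(sample)
        inliers = [p for p in pairs if residual(model, p) < thresh]
        if len(inliers) > len(best):
            best = inliers
    return best

# Toy stand-in model: matched x-coordinates related by a shift x_r = x_l + d.
fit = lambda s: sum(xr - xl for xl, xr in s) / len(s)
residual = lambda d, p: abs(p[1] - p[0] - d)

# 20 consistent matches plus 2 gross mismatches.
pairs = [(float(x), x + 5.0) for x in range(20)] + [(3.0, 40.0), (7.0, -2.0)]
inner = robust_inliers(pairs, fit, residual, sample_size=2, thresh=1.0)
```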
Further, in step S924, a correction parameter optimization step: the basis matrix is parameterized, an error equation is established for the matching point pairs in the second inner point set based on the parameterized basis matrix, and the correction parameters are optimized using a nonlinear least squares method to obtain their optimal values. In a preferred embodiment, step S924 includes the following substeps.
First, the basis matrix is parameterized; specifically, the basis matrix F is parameterized by the following relation (4):

F = K_R(f_R)^(-T) [t]_x R(θ, β, α) K_L(f_L)^(-1)    (4)

wherein [t]_x is the antisymmetric matrix determined by the 3-dimensional vector t, derived by the following relation (5):

[t]_x = [[0, -t_3, t_2], [t_3, 0, -t_1], [-t_2, t_1, 0]]    (5)

wherein F is the basis matrix; R(θ, β, α) is a 3-dimensional rotation matrix that first rotates by the angle θ around the camera Z axis, then by the angle β around the camera Y axis, and finally by the angle α around the camera X axis; t represents the offset direction of the right camera relative to the left camera in the dual-camera shooting module 10; φ is the included angle between the offset direction of the right camera and the Y axis of the left camera, and γ is the included angle between the offset direction of the right camera and the Z axis of the left camera; w_L and h_L are respectively the width and height of the left view in pixels, and w_R and h_R are respectively the width and height of the right view in pixels; f_L and f_R are respectively the focal lengths of the left camera and the right camera in pixel units, and K_L(f_L) and K_R(f_R) are the corresponding internal parameter matrices. The width and height of the images are known prior to the iterative optimization. In this step, according to relation (4), the basis matrix F can be parameterized using the parameters θ, β, α, φ, γ, f_L and f_R.
The X-axis, Y-axis, and Z-axis of the camera may refer to the X-axis, Y-axis, and Z-axis of the left camera or the X-axis, Y-axis, and Z-axis of the right camera, and according to the known definition of the camera in the art, the X-axis is parallel to the image plane and points to the image width direction, the Y-axis is parallel to the image plane and points to the image height direction, and the Z-axis is the optical axis direction of the camera and is perpendicular to the image plane.
In order to reduce the number of parameters to be estimated while remaining reasonable, it can be assumed in the iterative optimization process that one of f_L and f_R is constant and the other is variable. Therefore, in this embodiment, the basis matrix F comprises 6 variable parameters: θ, β, α, φ, γ, and f_L or f_R.
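One standard-form reading of this 7-parameter parameterization can be sketched as follows (the placement of the principal points at the image centers and the exact angular construction of t are our assumptions, since the patent's expressions are given only as images):

```python
import numpy as np

def K(f, w, h):
    """Internal parameter matrix; principal point assumed at the image center."""
    return np.array([[f, 0.0, w / 2.0], [0.0, f, h / 2.0], [0.0, 0.0, 1.0]])

def rotation(theta, beta, alpha):
    """Rotate first about the camera Z axis (theta), then Y (beta), then X (alpha)."""
    cz, sz = np.cos(theta), np.sin(theta)
    cy, sy = np.cos(beta), np.sin(beta)
    cx, sx = np.cos(alpha), np.sin(alpha)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rx @ Ry @ Rz

def skew(t):
    """Antisymmetric matrix [t]_x of a 3-vector t (relation (5))."""
    return np.array([[0, -t[2], t[1]], [t[2], 0, -t[0]], [-t[1], t[0], 0]])

def basis_matrix(theta, beta, alpha, phi, gamma, f_l, f_r, w_l, h_l, w_r, h_r):
    """F = K_R(f_R)^(-T) [t]_x R(theta, beta, alpha) K_L(f_L)^(-1).

    t makes angle phi with the left camera's Y axis and gamma with its Z axis;
    the x component below is one way to complete a unit vector from those two
    constraints (an assumption on our part)."""
    tx = np.sqrt(max(0.0, 1.0 - np.cos(phi) ** 2 - np.cos(gamma) ** 2))
    t = np.array([tx, np.cos(phi), np.cos(gamma)])
    F = np.linalg.inv(K(f_r, w_r, h_r)).T @ skew(t) @ rotation(theta, beta, alpha)
    return F @ np.linalg.inv(K(f_l, w_l, h_l))

# Nearly parallel cameras with a mostly horizontal baseline.
F = basis_matrix(0.01, 0.02, -0.01, np.pi / 2, np.pi / 2, 800.0, 810.0,
                 640, 480, 640, 480)
```

A fundamental matrix built this way is rank 2 by construction, since the antisymmetric factor is singular.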
Second, an error equation is established and the correction parameters are optimized. Specifically, a matching point pair in the second inner point set is denoted (m_L, m_R), wherein m_L is a point in the left view and m_R is the corresponding point in the right view, and the following error equation (6) is established over all the matching point pairs:

Error = Σ_i (m_R,i^T F m_L,i)^2 / ((F m_L,i)_1^2 + (F m_L,i)_2^2 + (F^T m_R,i)_1^2 + (F^T m_R,i)_2^2)    (6)

wherein F is the basis matrix and F^T is the transpose of F. The parameters θ, β, α, φ, γ, f_L and f_R included in F are assigned initial values determined from the image widths and heights (w_L and h_L being the width and height of the left view in pixels, and w_R and h_R the width and height of the right view in pixels), and iterative optimization is then performed using the nonlinear least squares method to obtain the optimal values of the correction parameters.
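The error term being minimized can be illustrated with a simplified algebraic residual (the patent's exact expression for equation (6) is given only as an image; we use the plain epipolar residual m_R^T F m_L here). For a purely horizontal camera offset, F is, up to scale, the antisymmetric matrix of (1, 0, 0), and the residual reduces to the vertical parallax of each pair:

```python
import numpy as np

def epipolar_error(F, left_pts, right_pts):
    """Sum over matching pairs of the squared epipolar residual m_R^T F m_L
    (a simplified algebraic form of the relation (6) correction error)."""
    err = 0.0
    for (xl, yl), (xr, yr) in zip(left_pts, right_pts):
        ml = np.array([xl, yl, 1.0])
        mr = np.array([xr, yr, 1.0])
        err += float(mr @ F @ ml) ** 2
    return err

# [t]_x for t = (1, 0, 0): the constraint says matching points share a row.
F = np.array([[0.0, 0.0, 0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0, 0.0]])
good_left = [(10.0, 50.0), (200.0, 120.0)]
good_right = [(25.0, 50.0), (180.0, 120.0)]  # same rows: zero vertical parallax
bad_right = [(25.0, 58.0), (180.0, 111.0)]   # vertical parallax -> nonzero error
```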
At this point, the correction parameter extraction step is complete and the optimal correction parameters have been obtained. In this embodiment of the correction parameter extraction method, OFAST corner detection and binary OBRIEF feature descriptors are adopted, and a large part of the computation consists of bitwise and comparison operations, so that both the amount of computation and the storage consumption are small. Furthermore, 7 parameters are used to parameterize the basis matrix, and only 6 of them are treated as variables in the optimization process, which further reduces the amount of computation. The correction parameter extraction method of this embodiment is therefore particularly suitable for portable digital terminals (e.g., mobile phones and tablet computers) with relatively low CPU processing capability and small RAM capacity.
Fig. 6 is a schematic block diagram of a correction parameter extraction module according to an embodiment of the present invention. In this embodiment, the correction parameter extraction module 21 is configured to perform a correction parameter extraction step as shown in fig. 3, and specifically, the correction parameter extraction module 21 includes a corner extraction unit 211, a corner matching unit 212, an mis-matching rejection unit 213, and a correction parameter optimization unit 214. The corner point extracting unit 211 is configured to complete the step S921, the corner point matching unit 212 is configured to complete the step S922, the mis-matching rejecting unit 213 is configured to complete the step S923, and the correction parameter optimizing unit 214 is configured to complete the step S924; the corner matching unit 212 outputs an initial matching point pair set, the mismatch culling unit 213 outputs a relatively stable and reliable second inner point set, and the correction parameter optimizing unit 214 outputs correction parameters to the correcting unit 22.
FIG. 7 is a flowchart illustrating a method of performing a calibration step according to an embodiment of the invention. The correction method of this embodiment will be specifically described below with reference to fig. 2 and 7.
First, in step S931, a correction matrix construction step: a correction matrix is constructed using the correction parameters. In this embodiment, the parameter information obtained through iterative optimization with the nonlinear least squares method is used to construct the stereo correction matrices H_L and H_R, the construction being based on the following relation (7):

H_L = K_N R_L K_L(f_L)^(-1),  H_R = K_N R_R K_R(f_R)^(-1)    (7)

wherein H_L is the correction matrix corresponding to the left view and H_R is the correction matrix corresponding to the right view; K_N is the internal parameter matrix of the corrected camera, K_L(f_L) is the internal parameter matrix of the left camera before correction, and K_R(f_R) is the internal parameter matrix of the right camera before correction; f_L and f_R are the focal lengths of the left camera and the right camera in pixel units, and K_L(f_L) and K_R(f_R) can be calculated by the relevant part of relation (4); R_L is the rotation matrix of the left camera in the correction process and R_R is the rotation matrix of the right camera in the correction process, and they can also be calculated by the relevant part of relation (4).
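A sketch of relation (7) as it reads, H = K_N R K^(-1) (the principal-point placement inside the internal parameter matrices is our assumption):

```python
import numpy as np

def intrinsics(f, w, h):
    """Internal parameter matrix; principal point assumed at the image center."""
    return np.array([[f, 0.0, w / 2.0], [0.0, f, h / 2.0], [0.0, 0.0, 1.0]])

def correction_matrix(K_new, R, K_old):
    """H = K_N R K^(-1): maps original pixels to corrected (rectified) pixels."""
    return K_new @ R @ np.linalg.inv(K_old)

def apply_h(H, x, y):
    """Apply a 3x3 homography to a pixel and dehomogenize."""
    v = H @ np.array([x, y, 1.0])
    return v[0] / v[2], v[1] / v[2]

K_L = intrinsics(800.0, 640, 480)   # left camera before correction
K_N = intrinsics(820.0, 640, 480)   # corrected camera
R_L = np.eye(3)                     # identity rotation: a pure focal rescale
H_L = correction_matrix(K_N, R_L, K_L)
u, v = apply_h(H_L, 320.0, 240.0)   # the principal point maps to itself
```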
Further, in step S932, the correction matrix fine-tuning step fine-tunes the correction matrix at least in combination with the depth-of-field adjustment parameter. In this embodiment, the fine adjustment is performed based on the second inner point set formed in the above embodiment, wherein the depth-of-field adjustment parameter is output by the feedback module 23; by combining the fed-back depth-of-field adjustment parameter, the 3D display image content can be located in a comfortable 3D area, so that the screen-out/screen-in effect is better. The fine-tuned correction matrices H'_L and H'_R are calculated by the following relation (8):

H'_L = M_L(s) · H_L,  H'_R = M_R(s) · H_R

wherein H_L is the correction matrix corresponding to the left view, H_R is the correction matrix corresponding to the right view, s is the depth-of-field adjustment parameter, M_L(s) denotes the depth adjustment matrix corresponding to the left view, and M_R(s) denotes the depth adjustment matrix corresponding to the right view; d_y is the adjustment amount in the vertical direction, and d_x is the adjustment amount in the horizontal direction.
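A minimal sketch of the fine-tuning step. The exact form of the depth adjustment matrices is given only as equation images in the source, so the sketch assumes M_L(s) and M_R(s) are translation homographies that split the vertical offset d_y and the s-scaled horizontal offset d_x between the two views with opposite signs; the function names and this matrix form are illustrative assumptions.

```python
import numpy as np

def depth_adjust_matrix(s, d_x, d_y, left=True):
    """Hypothetical depth adjustment matrix M(s): a translation homography that
    applies half of d_y and half of the s-scaled d_x, with opposite signs for
    the left and right views (an assumption, not the patent's exact form)."""
    sign = 1.0 if left else -1.0
    return np.array([[1.0, 0.0, sign * s * d_x / 2.0],
                     [0.0, 1.0, sign * d_y / 2.0],
                     [0.0, 0.0, 1.0]])

def fine_tune(H, s, d_x, d_y, left=True):
    """Fine-tuned correction matrix H' = M(s) . H (form of relation (8))."""
    return depth_adjust_matrix(s, d_x, d_y, left) @ H

# Example: fine-tune an identity correction matrix for the left view.
H_L_tuned = fine_tune(np.eye(3), s=1.0, d_x=4.0, d_y=2.0, left=True)
```

Because M(s) is a pure translation here, a homogeneous point is simply shifted; the left and right views are shifted in opposite directions, which is what moves content in front of or behind the screen plane.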
The vertical adjustment amount d_y can be calculated by relation (9), and the horizontal adjustment amount d_x can be calculated by relation (10), both evaluated over the matching point pairs of the second inner point set, wherein (·)_i represents the i-th component of a vector, (p_L, p_R) is a matching point pair in the second inner point set, p_L is a point in the left view, and p_R is a point in the right view.
The depth-of-field adjustment parameter s is fed back by the feedback module 23 of the 3D image display system according to the viewer's selection of the screen-out/screen-in degree: the greater the screen-out degree selected by the viewer, the larger s becomes; conversely, the greater the screen-in degree selected by the viewer, the smaller s becomes. Since the viewing experience of each person differs for the same 3D image display, and the stereoscopic display effect is also related to factors such as the viewing distance, the adjustment amount of the correction matrix in the horizontal direction needs to be determined according to circumstances, and can be adjusted through the experience of the viewer in the viewing process. In the default state of the system, d_x is set to 0, and can later be changed by the viewer according to his own experience.
Further, in step S933, the stereoscopic image cropping step processes the views in the original stereo image pair respectively using the fine-tuned correction matrix to obtain a corrected stereo image pair. In this embodiment, this step may be specifically divided into the following three substeps.
First, the croppable regions of the corrected left and right views are acquired respectively. The description is given taking the left view as an example; the right view may be treated similarly. The 4 vertices of the left view in the original stereo image pair are subjected to correction transformation using the fine-tuned correction matrix H'_L corresponding to the left view, obtaining 4 corrected new vertices that form a corrected quadrilateral. Then, the 4 vertices of the corrected quadrilateral are sorted by Y-axis coordinate (i.e., row coordinate); the two middle vertices are taken, a horizontal line is drawn through each of them, and the parts of the corrected quadrilateral outside the two horizontal lines are cut off to form a horizontally cropped quadrilateral. Likewise, the 4 vertices of the horizontally cropped quadrilateral are sorted by X-axis coordinate (i.e., column coordinate); the two middle vertices are taken, a vertical line is drawn through each of them, and the parts of the horizontally cropped quadrilateral outside the two vertical lines are cut off to form the horizontally and vertically cropped quadrilateral.
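The first substep can be sketched as follows. This is a simplified illustration that takes the two middle order statistics of the four transformed corners in each axis (the embodiment re-sorts the vertices of the horizontally cropped quadrilateral before the vertical cut, which can differ in degenerate cases); the function name is hypothetical.

```python
import numpy as np

def croppable_rect(vertices, H):
    """Transform the 4 view corners by correction homography H and return the
    axis-aligned rectangle bounded by the two middle vertices in each axis
    (an approximation of the horizontally-and-vertically cropped quadrilateral)."""
    pts = np.array([H @ np.array([x, y, 1.0]) for x, y in vertices])
    pts = pts[:, :2] / pts[:, 2:3]      # back to inhomogeneous coordinates
    xs = np.sort(pts[:, 0])
    ys = np.sort(pts[:, 1])
    # the second and third order statistics bound the cropped rectangle
    return xs[1], xs[2], ys[1], ys[2]   # x_min, x_max, y_min, y_max

# Example: with an identity homography the croppable region is the full view.
corners = [(0, 0), (1919, 0), (1919, 1079), (0, 1079)]
rect = croppable_rect(corners, np.eye(3))
```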
FIG. 8 is a schematic diagram of the croppable area of a corrected view, wherein FIG. 8(a) is a schematic diagram of a corrected quadrilateral ABCD after cropping to form a horizontally and vertically cropped quadrilateral, and FIG. 8(b) is a schematic diagram of another corrected quadrilateral ABCD after cropping to form a horizontally and vertically cropped quadrilateral.
Second, the maximum common clipping region between the corrected left view and the corrected right view is acquired. In this embodiment, it is assumed that the minimum and maximum X-axis coordinates of the horizontally and vertically cropped quadrilateral of the corrected left view are X_Lmin and X_Lmax respectively, and the minimum and maximum Y-axis coordinates are Y_Lmin and Y_Lmax respectively; the minimum and maximum X-axis coordinates of the horizontally and vertically cropped quadrilateral of the corrected right view are X_Rmin and X_Rmax respectively, and the minimum and maximum Y-axis coordinates are Y_Rmin and Y_Rmax respectively. The diagonal points of the maximum common clipping region are then the point (max(X_Lmin, X_Rmin), max(Y_Lmin, Y_Rmin)) and the point (min(X_Lmax, X_Rmax), min(Y_Lmax, Y_Rmax)).
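The second substep amounts to intersecting the two croppable rectangles; a small sketch, where the (x_min, x_max, y_min, y_max) rectangle layout and the function name are assumed conventions for illustration.

```python
def common_crop_region(left_rect, right_rect):
    """Intersect the croppable rectangles (x_min, x_max, y_min, y_max) of the
    corrected left and right views; the two returned points are the diagonal
    corners of the maximum common clipping region."""
    lx0, lx1, ly0, ly1 = left_rect
    rx0, rx1, ry0, ry1 = right_rect
    return (max(lx0, rx0), max(ly0, ry0)), (min(lx1, rx1), min(ly1, ry1))

# Example: slightly offset croppable regions for the two views.
top_left, bottom_right = common_crop_region((10, 1900, 5, 1070),
                                            (0, 1890, 12, 1079))
# top_left == (10, 12), bottom_right == (1890, 1070)
```

Cropping both views to this common region keeps the stereo pair the same size, so corresponding rows stay aligned after correction.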
Third, the left and right views in the corrected maximum common clipping region are filled accordingly with the gray values of the original stereo image pair. In this embodiment, taking the processing of the left view as an example, suppose q is any point on the corrected left view within the maximum common clipping region; its corresponding point on the left view in the original stereo image pair is then the point (H'_L)^(-1)·q. The gray value at that point of the left view in the original stereo image pair is used to fill the point q on the corrected left view, and the corrected left view can thus be generated. The right view in the maximum common clipping region may be processed similarly.
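The third substep can be sketched as an inverse warp. This is a simplified illustration assuming nearest-neighbour sampling (the embodiment does not specify the interpolation), the (x_min, x_max, y_min, y_max) rectangle convention used above, and pixels that map outside the source left at 0; the function name is hypothetical.

```python
import numpy as np

def fill_corrected_view(src, H, rect):
    """Inverse-map each pixel of the corrected view through H^-1 and copy the
    nearest-neighbour gray value from the original view."""
    x0, x1, y0, y1 = rect
    H_inv = np.linalg.inv(H)
    out = np.zeros((y1 - y0, x1 - x0), dtype=src.dtype)
    for y in range(y0, y1):
        for x in range(x0, x1):
            q = H_inv @ np.array([x, y, 1.0])
            u, v = int(round(q[0] / q[2])), int(round(q[1] / q[2]))
            if 0 <= v < src.shape[0] and 0 <= u < src.shape[1]:
                out[y - y0, x - x0] = src[v, u]
    return out

# Example: with an identity homography the view is copied unchanged.
src = np.arange(16, dtype=np.uint8).reshape(4, 4)
out = fill_corrected_view(src, np.eye(3), (0, 4, 0, 4))
```

Mapping backwards from the output grid (rather than forwards from the source) guarantees every output pixel receives exactly one value, which is why inverse warping is the usual choice here.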
Thus, a corrected stereo image pair is formed. In the correction method of this embodiment, the correction matrix is fine-tuned using the second inner point set and the feedback information of the viewer when performing the stereo correction, and the generated stereo image pair has a comfortable 3D viewing effect.
Fig. 9 is a schematic block diagram of a correction module according to an embodiment of the invention.
In this embodiment, the correction module 22 is used to perform the correction step as shown in fig. 7; specifically, the correction module 22 includes a correction matrix construction unit 221, a correction matrix fine-tuning unit 222, and a stereoscopic image pair cropping unit 223. The correction matrix construction unit 221 is configured to complete the above step S931, the correction matrix fine-tuning unit 222 is configured to complete the above step S932, and the stereoscopic image pair cropping unit 223 is configured to complete the above step S933; the relatively stable and reliable second inner point set output by the mismatching rejection unit 213 and the depth-of-field adjustment parameter fed back by the feedback module 23 are used in the correction matrix fine-tuning unit 222.
The above examples mainly illustrate the stereoscopic image correction method and apparatus of the present invention. Although only a few embodiments of the present invention have been described, those skilled in the art will appreciate that the present invention may be embodied in many other forms without departing from the spirit or scope thereof. Accordingly, the present examples and embodiments are to be considered as illustrative and not restrictive, and various modifications and substitutions may be made therein without departing from the spirit and scope of the present invention as defined by the appended claims.

Claims (37)

1. A stereoscopic image correction method, comprising:
an acquisition step: acquiring an original stereo image pair;
a correction parameter extraction step: searching a matching point pair in the original stereo image pair to form a matching point pair set, and extracting correction parameters;
a correction step: generating a correction matrix according to at least the correction parameter and the depth of field adjustment parameter, and correcting the original stereo image pair based on the correction matrix to eliminate vertical parallax; and
a feedback step: feeding back and outputting the depth of field adjustment parameter according to the screen-out degree/screen-in degree displayed by the stereo image;
wherein the depth of field adjustment parameter output by the feedback step is used in the correction step.
2. The stereoscopic image correction method according to claim 1, wherein the feedback step further comprises:
a judging step: and judging whether the stereo image needs to be corrected again according to the display experience of the stereo image, if so, entering the correction parameter extraction step, and if not, entering the correction step.
3. The stereoscopic image correction method according to claim 1, wherein in the acquiring step, the original stereoscopic image pair is composed of a left view and a right view taken by a left camera and a right camera, respectively, for the same scene.
4. The stereoscopic image correction method according to claim 1, wherein the correction parameter extraction step includes:
a corner extraction step: extracting, from the views of the original stereo image pair, corner points whose brightness changes relatively severely and which are relatively easy to identify;
a corner matching step: respectively extracting a feature descriptor with robust characteristics for each corner point, and matching the corner points according to the feature descriptors to form an initial matching point pair set;
a mismatching elimination step: rejecting mismatching point pairs existing in the initial matching point pair set by using a robust model estimation method to form a relatively stable and reliable second inner point set; and
a correction parameter optimization step: parameterizing the basic matrix, establishing an error equation for the matching point pairs in the second inner point set based on the parameterized basic matrix, and optimizing the correction parameters by using a nonlinear least squares method to obtain the optimal values of the correction parameters.
5. The stereoscopic image correction method according to claim 4, wherein, in the corner extraction step, an oriented FAST (OFAST, Oriented Features from Accelerated Segment Test) corner is extracted using an OFAST corner detection method;
the OFAST corner detection method comprises an OFAST corner determination step and an OFAST corner principal direction extraction step.
6. The stereoscopic image correction method according to claim 5, wherein,
in the OFAST corner determination step, for a certain point p to be detected, a Bresenham circle is drawn with the point p as its centre and r as its radius; if the gray values of n consecutive points on the Bresenham circle are simultaneously greater than I(p)+t or simultaneously less than I(p)-t, the point p is determined to be an OFAST corner point, wherein I(p) represents the gray value of the point p, t is a threshold, and n takes a value within a prescribed range;
in the OFAST corner principal direction extraction step, when the points on the Bresenham circle are labelled for the determined OFAST corner point, the point directly above the centre of the circle is labelled 1 and the other points are labelled sequentially in the clockwise direction; with the labels of the two end points of the n consecutive points denoted a and b respectively in the clockwise direction, the principal direction θ_p of the OFAST corner point is determined by relation (1),
wherein θ_p is the principal direction of the OFAST corner point, and m is the number of points included in the Bresenham circle of radius r.
7. The stereoscopic image correction method as claimed in claim 4 or 5, wherein, in the corner matching step, an oriented BRIEF (OBRIEF, Oriented Binary Robust Independent Elementary Features) feature descriptor is adopted and matching is performed based on the OBRIEF feature descriptor.
8. The stereoscopic image correction method as claimed in claim 7, wherein the corner matching step comprises the steps of:
generating a standard sampling pattern;
constructing the OBRIEF feature descriptor for the corner points; and
matching the corner points in the left view and the right view in the original stereo image pair by using the OBRIEF feature descriptors to form matching point pairs.
9. The stereoscopic image correction method according to claim 6, wherein the corner matching step includes:
generating a standard sampling pattern: within a square frame centred on the point (0, 0) and having side length S, n_s groups of point pairs (p_i, q_i) are randomly drawn according to a uniform distribution or a Gaussian distribution, and the two sampling points p_i and q_i of each group of point pairs (p_i, q_i) are connected by straight line segments to generate the standard sampling pattern, wherein p_i and q_i are the two sampling points in the i-th group of point pairs, 1 ≤ i ≤ n_s, the value range of the side length S of the square frame is [2r+1, 12r+6], r is the radius of the Bresenham circle used in the OFAST corner detection method, and n_s is an even number within a prescribed interval;
an OBRIEF feature descriptor construction step for the OFAST corner: the standard sampling pattern is rotated according to the principal direction of the OFAST corner, and then the gray values of the two sampling points in each group of point pairs in the rotated standard sampling pattern are compared to construct a binary OBRIEF feature descriptor;
matching point pair formation: the corner points in the left view and the right view in the original stereo image pair are matched using the OBRIEF feature descriptors to form matching point pairs.
10. The stereoscopic image correction method as claimed in claim 9, wherein, in the OBRIEF feature descriptor construction step, the OBRIEF feature descriptor of a certain OFAST corner p is constructed by relation (2), wherein θ_p is the principal direction of the OFAST corner p, n_s is the number of point pairs in the standard sampling pattern, (p_i, q_i) is a point pair in the standard sampling pattern, t is a quantization threshold whose value range is [2, 64], τ represents a binary number, R_θ represents a 2-dimensional rotation matrix, I(R_θ·p_i) represents the gray value of the point R_θ·p_i, and I(R_θ·q_i) represents the gray value of the point R_θ·q_i.
11. The stereoscopic image correction method as claimed in claim 9, wherein, in the corner matching process, for OFAST corners p_L and p_R, the similarity between the two is calculated by the following relation (3):

Dist(p_L, p_R) = bitcount(D(p_L) XOR D(p_R))

wherein p_L is an OFAST corner of the left view, p_R is an OFAST corner of the right view, D(p_L) and D(p_R) are their OBRIEF feature descriptors, XOR represents a bitwise exclusive-or operation, bitcount represents counting the number of 1s in a binary number, and Dist(p_L, p_R) represents the similarity between the OFAST corners p_L and p_R.
12. The stereoscopic image correction method as claimed in claim 11, wherein, for an OFAST corner p_L of the left view, all OFAST corners in the right view are traversed, and the point for which relation (3) takes its minimum value is found as the match of the OFAST corner p_L, thereby forming said matching point pair.
13. The stereoscopic image correction method according to claim 4, wherein, in the correction parameter optimization step, the parameters Θ of the basic matrix are parameterized according to relation (4),
wherein [a]_x is the antisymmetric matrix determined by a 3-dimensional vector a and is derived by relation (5);
wherein F is the basic matrix, R is a 3-dimensional rotation matrix, θ is the rotation angle about the Z axis of the camera, β is the rotation angle about the Y axis of the camera, and α is the rotation angle about the X axis of the camera;
t represents the offset direction of the right camera used to capture the right view of the original stereo image pair relative to the left camera used to capture the left view of the original stereo image pair, φ is the angle between the offset direction of the right camera and the Y axis of the left camera, and γ is the angle between the offset direction of the right camera and the Z axis of the left camera;
w_L and h_L are respectively the width and height of the left view in pixels, and w_R and h_R are respectively the width and height of the right view in pixels;
f_L and f_R are respectively the focal lengths of the left camera and the right camera in pixel units.
14. The stereoscopic image correction method according to claim 13, wherein the error equation is established over the matching point pairs in the second inner point set, wherein F is the basic matrix, F^T is the transpose of the matrix F, p_L is the point of a matching point pair in the second inner point set corresponding to the left view, p_R is the point of a matching point pair in the second inner point set corresponding to the right view, and Error represents the correction error;
wherein the parameters Θ included in the basic matrix F take Θ_0 as their initial value.
15. The stereoscopic image correction method according to claim 1, wherein the correction step includes:
a correction matrix construction step: constructing a correction matrix by using the correction parameters;
a correction matrix fine-tuning step: fine-tuning the correction matrix at least in combination with the depth-of-field adjustment parameter; and
a stereoscopic image cropping step: processing the views in the original stereo image pair respectively by using the fine-tuned correction matrix to obtain a corrected stereo image pair.
16. The stereoscopic image correction method according to claim 4, wherein the correction step includes:
a correction matrix construction step: constructing a correction matrix by using the correction parameters;
a correction matrix fine-tuning step: fine-tuning the correction matrix at least in combination with the depth-of-field adjustment parameter and the second inner point set; and
a stereoscopic image cropping step: processing the views in the original stereo image pair respectively by using the fine-tuned correction matrix to obtain a corrected stereo image pair.
17. The stereoscopic image correction method according to claim 15 or 16, characterized in that the correction matrix is constructed based on the following relation (7):

H_L = K_N · R_L · K_L(f_L)^(-1),  H_R = K_N · R_R · K_R(f_R)^(-1)

wherein H_L is the correction matrix corresponding to the left view of the original stereo image pair, and H_R is the correction matrix corresponding to the right view of the original stereo image pair; K_N is the internal parameter matrix of the corrected camera, K_L(f_L) is the internal parameter matrix of the left camera before correction, and K_R(f_R) is the internal parameter matrix of the right camera before correction; f_L and f_R are respectively the focal lengths, in pixel units, of the left camera used to photograph the left view and the right camera used to photograph the right view; θ is the rotation angle about the Z axis of the camera, β is the rotation angle about the Y axis of the camera, and α is the rotation angle about the X axis of the camera; φ is the angle between the offset direction of the right camera and the Y axis of the left camera, and γ is the angle between the offset direction of the right camera and the Z axis of the left camera;
wherein K_L(f_L) and K_R(f_R) are calculated from f_L, f_R and the view dimensions, where w_L and h_L are respectively the width and height of the left view in pixels, and w_R and h_R are respectively the width and height of the right view in pixels;
wherein R is a rotation matrix, R_L is the rotation matrix of the left camera in the correction process, and R_R is the rotation matrix of the right camera in the correction process.
18. The stereoscopic image correction method as claimed in claim 17, wherein the fine-tuned correction matrices are calculated by the following relation (8):

H'_L = M_L(s) · H_L,  H'_R = M_R(s) · H_R

wherein H_L is the correction matrix corresponding to the left view, H_R is the correction matrix corresponding to the right view, s is the depth-of-field adjustment parameter, M_L(s) denotes the depth adjustment matrix corresponding to the left view, and M_R(s) denotes the depth adjustment matrix corresponding to the right view; d_y is the adjustment amount in the vertical direction, and d_x is the adjustment amount in the horizontal direction.
19. The stereoscopic image correction method as claimed in claim 15 or 16, wherein the stereoscopic image cropping step comprises:
respectively acquiring croppable areas of the corrected left view and the corrected right view of the original stereo image pair;
acquiring a maximum common clipping area between the corrected left view and the corrected right view; and
filling the left and right views in the corrected maximum common clipping area accordingly with the gray values of the original stereo image pair.
20. A stereoscopic image correction apparatus, characterized by comprising:
an acquisition module for acquiring an original stereoscopic image pair;
the correction parameter extraction module is used for searching a matching point pair in the original stereo image pair to form a matching point pair set and extracting a correction parameter;
the correction module is used for generating a correction matrix at least according to the correction parameters and the depth adjustment parameters and correcting the original stereo image pair based on the correction matrix so as to eliminate vertical parallax; and
the feedback module is used for feeding back and outputting the depth of field adjustment parameter according to the screen output degree/screen input degree of the stereoscopic image display;
and the depth of field adjustment parameter output by the feedback module is output to the correction module.
21. The stereoscopic image correction apparatus as claimed in claim 20, further comprising:
and the judging module is used for judging whether the correction needs to be carried out again according to the display experience of the stereo image.
22. The stereoscopic image correction apparatus as claimed in claim 20, wherein the original stereoscopic image pair acquired by the acquisition module is composed of a left view and a right view respectively photographed by a left camera and a right camera for the same scene.
23. The stereoscopic image correction apparatus as claimed in claim 20, wherein the correction parameter extraction module comprises:
a corner extraction unit for extracting corners from the views of the original stereo image pair, the corners having relatively severe brightness variation and being relatively easily identified;
a corner matching unit for respectively extracting a feature descriptor with robust characteristics from each corner and matching the corners according to the feature descriptors to form an initial matching point pair set;
a mismatching rejection unit for rejecting mismatching point pairs existing in the initial matching point pair set by using a robust model estimation method to form a relatively stable and reliable second inner point set; and
a correction parameter optimization unit for parameterizing the basic matrix, establishing an error equation for the matching point pairs in the second inner point set based on the parameterized basic matrix, and optimizing the correction parameters by using a nonlinear least squares method to obtain the optimal values of the correction parameters.
24. The stereoscopic image correction apparatus as claimed in claim 23, wherein the corner extraction unit extracts an oriented FAST (OFAST, Oriented Features from Accelerated Segment Test) corner using an OFAST corner detection component;
the OFAST corner detection component comprises an OFAST corner determination submodule and an OFAST corner principal direction extraction submodule.
25. The stereoscopic image correction apparatus as claimed in claim 24, wherein, in the OFAST corner determination submodule, for a certain point p to be detected, a Bresenham circle is drawn with the point p as its centre and r as its radius; if the gray values of n consecutive points on the Bresenham circle are simultaneously greater than I(p)+t or simultaneously less than I(p)-t, the point p is determined to be an OFAST corner point, wherein I(p) represents the gray value of the point p, t is a threshold, and n takes a value within a prescribed range;
in the OFAST corner principal direction extraction submodule, when the points on the Bresenham circle are labelled for the determined OFAST corner point, the point directly above the centre of the circle is labelled 1 and the other points are labelled sequentially in the clockwise direction; with the labels of the two end points of the n consecutive points denoted a and b respectively in the clockwise direction, the principal direction θ_p of the OFAST corner point is determined by relation (1),
wherein θ_p is the principal direction of the OFAST corner point, and m is the number of points included in the Bresenham circle of radius r.
26. The stereoscopic image correction apparatus as claimed in claim 23 or 24, wherein the corner matching unit adopts an oriented BRIEF (OBRIEF, Oriented Binary Robust Independent Elementary Features) feature descriptor and performs matching based on the OBRIEF feature descriptor.
27. The stereoscopic image correction apparatus as claimed in claim 26, wherein the corner matching unit comprises:
a component that generates a standard sampling pattern;
a component that constructs an OBRIEF feature descriptor for the corner points; and
a component that matches the corner points in the left view and the right view in the original stereo image pair using the OBRIEF feature descriptors to form matching point pairs.
28. The stereoscopic image correction apparatus as claimed in claim 25, wherein the corner matching unit comprises:
a standard sampling pattern generation component: within a square frame centred on the point (0, 0) and having side length S, n_s groups of point pairs (p_i, q_i) are randomly drawn according to a uniform distribution or a Gaussian distribution, and the two sampling points p_i and q_i of each group of point pairs (p_i, q_i) are connected by straight line segments to generate the standard sampling pattern, wherein p_i and q_i are the two sampling points in the i-th group of point pairs, 1 ≤ i ≤ n_s, the value range of the side length S of the square frame is [2r+1, 12r+6], r is the radius of the Bresenham circle used in the OFAST corner detection method, and n_s is an even number within a prescribed interval;
an OBRIEF feature descriptor construction component: the standard sampling pattern is rotated according to the principal direction of the OFAST corner, and then the gray values of the two sampling points in each group of point pairs in the rotated standard sampling pattern are compared to construct a binary OBRIEF feature descriptor;
a matching point pair forming component: the corner points in the left view and the right view in the original stereo image pair are matched using the OBRIEF feature descriptors to form matching point pairs.
29. The stereoscopic image correction apparatus as claimed in claim 28, wherein, in the OBRIEF feature descriptor construction component, the OBRIEF feature descriptor of a certain OFAST corner p is constructed by relation (2), wherein θ_p is the principal direction of the OFAST corner p, n_s is the number of point pairs in the standard sampling pattern, (p_i, q_i) is a point pair in the standard sampling pattern, t is a quantization threshold whose value range is [2, 64], τ represents a binary number, R_θ represents a 2-dimensional rotation matrix, I(R_θ·p_i) represents the gray value of the point R_θ·p_i, and I(R_θ·q_i) represents the gray value of the point R_θ·q_i.
30. The stereoscopic image correction apparatus as claimed in claim 28, wherein, in the matching point pair forming means, for OFAST corner points p_L and p_R, the similarity between the two is calculated by the following relation (3):

Dist(p_L, p_R) = bitcount(D(p_L) XOR D(p_R)),

wherein p_L is an OFAST corner point of the left view and p_R is an OFAST corner point of the right view; XOR represents a bitwise exclusive-or operation; bitcount represents counting the number of 1s in a binary number; and Dist(p_L, p_R) represents the similarity between the OFAST corner points p_L and p_R.
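The Hamming-distance similarity of relation (3) maps directly onto bit operations; the sketch below also adds a simple nearest-neighbour pairing loop, which is an assumption — the claim specifies the distance, not the matching policy.

```python
def hamming_distance(d_left, d_right):
    """Relation (3): similarity of two binary OBRIEF descriptors as the
    number of 1 bits in their bitwise XOR (smaller means more similar)."""
    return bin(d_left ^ d_right).count("1")

def match_corners(left_desc, right_desc, max_dist=64):
    """Greedy nearest-neighbour matching sketch: for each left descriptor,
    pick the right descriptor with the smallest Hamming distance, keeping
    the pair only if the distance stays below a cutoff (hypothetical)."""
    matches = []
    for i, dl in enumerate(left_desc):
        j, best = min(((j, hamming_distance(dl, dr))
                       for j, dr in enumerate(right_desc)),
                      key=lambda x: x[1])
        if best <= max_dist:
            matches.append((i, j, best))  # (left index, right index, distance)
    return matches
```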
31. The stereoscopic image correction device according to claim 23, wherein the correction parameter optimization unit parameterizes the basis matrix F by executing the following relation (4):

F = K_R^{−T} [t]_× R K_L^{−1},

wherein [t]_× is the antisymmetric matrix determined by the 3-dimensional vector t, derived by the following relation (5):

[t]_× = [[0, −t_3, t_2], [t_3, 0, −t_1], [−t_2, t_1, 0]],

wherein F is the basis matrix; R is a 3-dimensional rotation matrix, θ is the rotation angle around the Z axis of the camera, β is the rotation angle around the Y axis of the camera, and α is the rotation angle around the X axis of the camera; t represents the offset direction of the right camera used to capture the right view of the original stereo image pair relative to the left camera used to capture the left view of the original stereo image pair; ψ is the included angle between the offset direction of the right camera and the Y axis of the left camera, and γ is the included angle between the offset direction of the right camera and the Z axis of the left camera; w_L and h_L are respectively the width and height of the left view in pixels, and w_R and h_R are respectively the width and height of the right view in pixels; f_L and f_R are respectively the focal lengths of the left camera and the right camera in pixel units.
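A minimal sketch of relations (4) and (5), assuming the standard fundamental-matrix factorization F = K_R^{-T} [t]_x R K_L^{-1}; the claim's exact angle parameterization of R and t is not reproduced here.

```python
import numpy as np

def skew(t):
    """Relation (5): the antisymmetric (cross-product) matrix [t]_x
    determined by a 3-dimensional vector t."""
    t1, t2, t3 = t
    return np.array([[0.0, -t3,  t2],
                     [t3,  0.0, -t1],
                     [-t2, t1,  0.0]])

def fundamental_from_params(K_L, K_R, R, t):
    """Sketch of relation (4): build the basis (fundamental) matrix from the
    camera intrinsics K_L, K_R, the relative rotation R, and the offset
    direction t of the right camera relative to the left camera."""
    return np.linalg.inv(K_R).T @ skew(t) @ R @ np.linalg.inv(K_L)
```

With identity intrinsics and rotation, F reduces to [t]_x, and the epipolar constraint p^T F p = 0 holds for any point matched with itself.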
32. The stereoscopic image correction apparatus as claimed in claim 31, wherein the error equation is the following relation (6):

Error = Σ (m_R^T F m_L)^2 / ((F m_L)_1^2 + (F m_L)_2^2 + (F^T m_R)_1^2 + (F^T m_R)_2^2),

wherein F is the basis matrix and F^T is the transpose of the matrix F; m_L is a matching point of the second inner point set corresponding to the left view, m_R is the corresponding matching point of the second inner point set in the right view, and the sum runs over all matching point pairs of the second inner point set; Error indicates the correction error;

wherein the parameters included in the basis matrix F are assigned initial values for the optimization.
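The error equation survives only as an image in the original; the sketch below uses the standard Sampson epipolar error, which is consistent with the quantities the claim names (F, its transpose F^T, and the matched left/right points of the second inner point set).

```python
import numpy as np

def correction_error(F, pts_left, pts_right):
    """Sampson epipolar error over matched point pairs (an assumption for
    the claim's error equation). Points are (x, y) pixel coordinates and
    are lifted to homogeneous 3-vectors before evaluation."""
    err = 0.0
    for pl, pr in zip(pts_left, pts_right):
        xl = np.array([pl[0], pl[1], 1.0])
        xr = np.array([pr[0], pr[1], 1.0])
        Fx = F @ xl           # epipolar line of xl in the right view
        Ftx = F.T @ xr        # epipolar line of xr in the left view (uses F^T)
        num = (xr @ F @ xl) ** 2
        den = Fx[0] ** 2 + Fx[1] ** 2 + Ftx[0] ** 2 + Ftx[1] ** 2
        err += num / den
    return err
```

Perfectly corresponding points on their epipolar lines contribute zero error, so minimizing this sum over the basis-matrix parameters drives the vertical parallax toward zero.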
33. The stereoscopic image correction apparatus as claimed in claim 20, wherein the correction module comprises:
a correction matrix construction unit for constructing a correction matrix using the correction parameters;
a correction matrix fine-tuning unit for fine-tuning the correction matrix at least in combination with the depth-of-field adjustment parameter; and
a stereo image cropping unit for respectively processing the views in the original stereo image pair using the fine-tuned correction matrices, so as to obtain a corrected stereo image pair.
34. The stereoscopic image correction apparatus as claimed in claim 23, wherein the correction module comprises:
a correction matrix construction unit for constructing a correction matrix using the correction parameters;
a correction matrix fine-tuning unit for fine-tuning the correction matrix at least in combination with the depth-of-field adjustment parameter and a second inner point set; and
a stereo image cropping unit for respectively processing the views in the original stereo image pair using the fine-tuned correction matrices, so as to obtain a corrected stereo image pair.
35. The stereoscopic image correction apparatus according to claim 33 or 34, wherein the correction matrix construction unit constructs the correction matrices by executing the following relation (7):

H_L = K_N · R_L · K_L(f_L)^{−1},  H_R = K_N · R_R · K_R(f_R)^{−1},

wherein H_L is the correction matrix corresponding to the left view of the original stereo image pair, and H_R is the correction matrix corresponding to the right view of the original stereo image pair; K_N is the internal parameter matrix of the corrected camera, K_L(f_L) is the internal parameter matrix of the left camera before correction, and K_R(f_R) is the internal parameter matrix of the right camera before correction; f_L and f_R are respectively the focal lengths, in pixel units, of the left camera used to photograph the left view and the right camera used to photograph the right view; θ is the rotation angle around the Z axis of the camera, β is the rotation angle around the Y axis of the camera, and α is the rotation angle around the X axis of the camera; ψ is the included angle between the offset direction of the right camera and the Y axis of the left camera, and γ is the included angle between the offset direction of the right camera and the Z axis of the left camera;

wherein K_L(f_L) and K_R(f_R) are calculated by the following relation:

K_L(f_L) = [[f_L, 0, w_L/2], [0, f_L, h_L/2], [0, 0, 1]],  K_R(f_R) = [[f_R, 0, w_R/2], [0, f_R, h_R/2], [0, 0, 1]],

wherein w_L and h_L are respectively the width and height of the left view in pixels, and w_R and h_R are respectively the width and height of the right view in pixels;

wherein R is a rotation matrix, R_L is the rotation matrix of the left camera in the correction process, and R_R is the rotation matrix of the right camera in the correction process.
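Relation (7) and the intrinsic matrices can be sketched as follows, assuming the usual rectifying-homography form H = K_N · R · K^{-1} and a principal point at the image center:

```python
import numpy as np

def intrinsics(f, w, h):
    """Internal parameter matrix K(f) with focal length f (pixel units) and
    the principal point at the image center (w/2, h/2) — the center placement
    is an assumption inferred from the width/height parameters in the claim."""
    return np.array([[f,   0.0, w / 2.0],
                     [0.0, f,   h / 2.0],
                     [0.0, 0.0, 1.0]])

def correction_matrices(K_N, R_L, R_R, K_L, K_R):
    """Sketch of relation (7): one rectifying homography per view,
    H = K_N · R · K^{-1}."""
    H_L = K_N @ R_L @ np.linalg.inv(K_L)
    H_R = K_N @ R_R @ np.linalg.inv(K_R)
    return H_L, H_R
```

When the corrected intrinsics equal the original intrinsics and both rotations are identity, each homography collapses to the identity, i.e. no correction is applied.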
36. The stereoscopic image correction apparatus according to claim 35, wherein the correction matrix fine-tuning unit calculates the fine-tuned correction matrices by the following relation (8):

H_L′ = M_L(Δ) · H_L,  H_R′ = M_R(Δ) · H_R,

wherein H_L is the correction matrix corresponding to the left view and H_R is the correction matrix corresponding to the right view; Δ is the depth-of-field adjustment parameter, M_L(Δ) is the depth adjustment matrix corresponding to the left view, and M_R(Δ) is the depth adjustment matrix corresponding to the right view; d_y is the adjustment amount in the vertical direction, and d_x is the adjustment amount in the horizontal direction.
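A hedged sketch of relation (8): the depth adjustment matrices are modeled here as pixel translations, with the horizontal amount split with opposite signs between the two views to change the horizontal disparity (the screen-in/screen-out effect). That split is an assumption — the claim only names the two adjustment amounts d_x and d_y.

```python
import numpy as np

def depth_adjust_matrix(dx, dy, sign):
    """Hypothetical depth adjustment matrix M(delta): a homogeneous
    translation shifting a view horizontally by sign * dx / 2 and
    vertically by dy (pixel units)."""
    return np.array([[1.0, 0.0, sign * dx / 2.0],
                     [0.0, 1.0, dy],
                     [0.0, 0.0, 1.0]])

def fine_tune(H_L, H_R, dx, dy):
    """Relation (8) sketch: left-multiply each correction matrix by the
    corresponding depth adjustment matrix (opposite horizontal signs)."""
    return (depth_adjust_matrix(dx, dy, +1) @ H_L,
            depth_adjust_matrix(dx, dy, -1) @ H_R)
```

Shifting the two views in opposite horizontal directions changes every point's disparity by the same amount, moving the whole scene toward or away from the screen plane while the shared vertical shift keeps the views row-aligned.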
37. The stereoscopic image correction apparatus as claimed in claim 33 or 34, wherein the stereoscopic image cropping unit comprises:
means for obtaining the croppable regions of the corrected left view and the corrected right view of the original stereo image pair, respectively;
means for acquiring the maximum common cropping region between the corrected left view and the corrected right view; and
means for filling the parts of the corrected left view and right view lying within the maximum common cropping region with the corresponding gray values of the original stereo image pair.
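The cropping steps of claim 37 reduce to a rectangle intersection when the croppable regions are axis-aligned; the `(x0, y0, x1, y1)` rectangle representation is an assumption.

```python
def max_common_crop(region_left, region_right):
    """Sketch of the maximum common clipping region: each region is an
    axis-aligned rectangle (x0, y0, x1, y1) of valid pixels in a corrected
    view; the common region is the intersection of the two rectangles."""
    lx0, ly0, lx1, ly1 = region_left
    rx0, ry0, rx1, ry1 = region_right
    x0, y0 = max(lx0, rx0), max(ly0, ry0)
    x1, y1 = min(lx1, rx1), min(ly1, ry1)
    if x0 >= x1 or y0 >= y1:
        return None  # no overlap: no region is valid in both views
    return (x0, y0, x1, y1)
```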
CN2012103205396A 2012-09-03 2012-09-03 Three-dimensional image correction method and apparatus Pending CN102905147A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2012103205396A CN102905147A (en) 2012-09-03 2012-09-03 Three-dimensional image correction method and apparatus


Publications (1)

Publication Number Publication Date
CN102905147A true CN102905147A (en) 2013-01-30

Family

ID=47577160

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012103205396A Pending CN102905147A (en) 2012-09-03 2012-09-03 Three-dimensional image correction method and apparatus

Country Status (1)

Country Link
CN (1) CN102905147A (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103927760A (en) * 2014-04-30 2014-07-16 重庆环视科技有限公司 Automatic stereoscopic vision color calibration system
CN104103255A (en) * 2013-04-07 2014-10-15 深圳富泰宏精密工业有限公司 Display screen correction system and method
CN104581136A (en) * 2013-10-14 2015-04-29 钰创科技股份有限公司 Image calibration system and calibration method of stereo camera
CN104735436A (en) * 2014-12-29 2015-06-24 深圳超多维光电子有限公司 Single camera three-dimensional imaging method and electronic device
CN105096296A (en) * 2014-05-05 2015-11-25 富士通株式会社 Stereo camera imaging correction device, method and electronic equipment
CN105812766A (en) * 2016-03-14 2016-07-27 吉林大学 Vertical parallax subtraction method
CN105898268A (en) * 2015-12-21 2016-08-24 乐视致新电子科技(天津)有限公司 Stereo camera system and virtual reality helmet
CN106663327A (en) * 2014-08-27 2017-05-10 卡尔斯特里姆保健公司 Automatic restitching of 3-D surfaces
US9697604B2 (en) 2014-01-28 2017-07-04 Altek Semiconductor Corp. Image capturing device and method for detecting image deformation thereof
WO2017113849A1 (en) * 2015-12-28 2017-07-06 乐视控股(北京)有限公司 Method and apparatus for adjusting parallax in virtual reality
CN106952231A (en) * 2017-03-20 2017-07-14 南京大学 A real-time image correction method based on mobile phone platform
CN104811688B (en) * 2014-01-28 2017-09-01 聚晶半导体股份有限公司 Image acquisition device and image deformation detection method thereof
CN107622510A (en) * 2017-08-25 2018-01-23 维沃移动通信有限公司 A kind of information processing method and device
CN108305281A (en) * 2018-02-09 2018-07-20 深圳市商汤科技有限公司 Calibration method, device, storage medium, program product and the electronic equipment of image
CN110741633A (en) * 2017-09-11 2020-01-31 深圳市柔宇科技有限公司 Image processing method, electronic device, and computer-readable storage medium
TWI692741B (en) * 2018-01-18 2020-05-01 鈺立微電子股份有限公司 System of camera calibration
CN111432117A (en) * 2020-03-23 2020-07-17 北京迈格威科技有限公司 Image rectification method, device and electronic system
CN111457886A (en) * 2020-04-01 2020-07-28 北京迈格威科技有限公司 Distance determination method, device and system
CN112911264A (en) * 2021-01-27 2021-06-04 广东未来科技有限公司 3D shooting method and device, storage medium and mobile terminal
CN113766209A (en) * 2020-05-29 2021-12-07 上海汉时信息科技有限公司 Camera offset processing method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101820550A (en) * 2009-02-26 2010-09-01 华为终端有限公司 Multi-viewpoint video image correction method, device and system
US20100277572A1 (en) * 2009-04-30 2010-11-04 Canon Kabushiki Kaisha Information processing apparatus and control method thereof
CN102404595A (en) * 2011-08-16 2012-04-04 上海交通大学 Epipolar line correction method capable of providing 3D program shooting guidance
CN102592124A (en) * 2011-01-13 2012-07-18 汉王科技股份有限公司 Geometrical correction method, device and binocular stereoscopic vision system of text image


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
RUBLEE, E. et al.: "ORB: An efficient alternative to SIFT or SURF", IEEE International Conference on Computer Vision (ICCV), 13 November 2011 (2011-11-13), pages 2564-2571, XP032101497, DOI: 10.1109/ICCV.2011.6126544 *

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104103255A (en) * 2013-04-07 2014-10-15 深圳富泰宏精密工业有限公司 Display screen correction system and method
CN104103255B (en) * 2013-04-07 2018-09-07 深圳富泰宏精密工业有限公司 Display screen correction system and method
CN104581136A (en) * 2013-10-14 2015-04-29 钰创科技股份有限公司 Image calibration system and calibration method of stereo camera
US9697604B2 (en) 2014-01-28 2017-07-04 Altek Semiconductor Corp. Image capturing device and method for detecting image deformation thereof
CN104811688B (en) * 2014-01-28 2017-09-01 聚晶半导体股份有限公司 Image acquisition device and image deformation detection method thereof
CN103927760A (en) * 2014-04-30 2014-07-16 重庆环视科技有限公司 Automatic stereoscopic vision color calibration system
CN105096296A (en) * 2014-05-05 2015-11-25 富士通株式会社 Stereo camera imaging correction device, method and electronic equipment
CN105096296B (en) * 2014-05-05 2018-02-09 富士通株式会社 Means for correcting, method and the electronic equipment of stereoscopic camera imaging
CN106663327A (en) * 2014-08-27 2017-05-10 卡尔斯特里姆保健公司 Automatic restitching of 3-D surfaces
CN106663327B (en) * 2014-08-27 2020-01-24 锐珂牙科技术顶阔有限公司 Automatic rejoining of 3-D surfaces
CN104735436A (en) * 2014-12-29 2015-06-24 深圳超多维光电子有限公司 Single camera three-dimensional imaging method and electronic device
CN105898268A (en) * 2015-12-21 2016-08-24 乐视致新电子科技(天津)有限公司 Stereo camera system and virtual reality helmet
WO2017113849A1 (en) * 2015-12-28 2017-07-06 乐视控股(北京)有限公司 Method and apparatus for adjusting parallax in virtual reality
CN105812766A (en) * 2016-03-14 2016-07-27 吉林大学 Vertical parallax subtraction method
CN105812766B (en) * 2016-03-14 2017-07-04 吉林大学 A kind of vertical parallax method for reducing
CN106952231A (en) * 2017-03-20 2017-07-14 南京大学 A real-time image correction method based on mobile phone platform
CN106952231B (en) * 2017-03-20 2019-06-11 南京大学 A real-time image correction method based on mobile phone platform
CN107622510A (en) * 2017-08-25 2018-01-23 维沃移动通信有限公司 A kind of information processing method and device
CN110741633A (en) * 2017-09-11 2020-01-31 深圳市柔宇科技有限公司 Image processing method, electronic device, and computer-readable storage medium
TWI692741B (en) * 2018-01-18 2020-05-01 鈺立微電子股份有限公司 System of camera calibration
CN108305281A (en) * 2018-02-09 2018-07-20 深圳市商汤科技有限公司 Calibration method, device, storage medium, program product and the electronic equipment of image
CN108305281B (en) * 2018-02-09 2020-08-11 深圳市商汤科技有限公司 Image calibration method, device, storage medium, program product and electronic equipment
CN111432117A (en) * 2020-03-23 2020-07-17 北京迈格威科技有限公司 Image rectification method, device and electronic system
CN111432117B (en) * 2020-03-23 2021-08-10 北京迈格威科技有限公司 Image rectification method, device and electronic system
CN111457886A (en) * 2020-04-01 2020-07-28 北京迈格威科技有限公司 Distance determination method, device and system
CN111457886B (en) * 2020-04-01 2022-06-21 北京迈格威科技有限公司 Distance determination method, device and system
CN113766209A (en) * 2020-05-29 2021-12-07 上海汉时信息科技有限公司 Camera offset processing method and device
CN113766209B (en) * 2020-05-29 2024-04-30 上海汉时信息科技有限公司 Camera offset processing method and device
CN112911264A (en) * 2021-01-27 2021-06-04 广东未来科技有限公司 3D shooting method and device, storage medium and mobile terminal

Similar Documents

Publication Publication Date Title
CN102905147A (en) Three-dimensional image correction method and apparatus
CN107680128B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
EP3568971B1 (en) Apparatus and methods for the storage of overlapping regions of imaging data for the generation of optimized stitched images
CN107330439B (en) Method for determining posture of object in image, client and server
US20200160102A1 (en) Keypoint unwarping for machine vision applications
CN107852533B (en) Three-dimensional content generation device and three-dimensional content generation method thereof
US9727775B2 (en) Method and system of curved object recognition using image matching for image processing
Xu et al. Fast feature-based video stabilization without accumulative global motion estimation
US20130169760A1 (en) Image Enhancement Methods And Systems
CN106981078B (en) Sight line correction method and device, intelligent conference terminal and storage medium
CN103530599A (en) Method and system for distinguishing real face and picture face
CN107392958A (en) A kind of method and device that object volume is determined based on binocular stereo camera
CN106570899B (en) Target object detection method and device
US9613404B2 (en) Image processing method, image processing apparatus and electronic device
KR101510312B1 (en) 3D face-modeling device, system and method using Multiple cameras
WO2014180255A1 (en) Data processing method, apparatus, computer storage medium and user terminal
CN104392416A (en) Video stitching method for sports scene
CN113538569A (en) Weak texture object pose estimation method and system
KR101983586B1 (en) Method of stitching depth maps for stereo images
US20150131853A1 (en) Stereo matching system and method for generating disparity map using same
KR101853269B1 (en) Apparatus of stitching depth maps for stereo images
KR100943635B1 (en) Method and apparatus for generating disparity map using digital camera image
CN116012432A (en) Stereoscopic panoramic image generation method and device and computer equipment
CN108062765A (en) Binocular image processing method, imaging device and electronic equipment
CN111630569B (en) Binocular matching method, visual imaging device and device with storage function

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C53 Correction of patent of invention or patent application
CB03 Change of inventor or designer information

Inventor after: Yao Da

Inventor after: Zhong Xiongguang

Inventor after: Peng Chaojian

Inventor after: He Guangcai

Inventor before: Yao Hua

Inventor before: Zhong Xiongguang

Inventor before: Peng Chaojian

Inventor before: He Guangcai

COR Change of bibliographic data

Free format text: CORRECT: INVENTOR; FROM: YAO HUA ZHONG XIONGGUANG PENG CHAOJIAN HE GUANGCAI TO: YAO DA ZHONG XIONGGUANG PENG CHAOJIAN HE GUANGCAI

AD01 Patent right deemed abandoned

Effective date of abandoning: 20151216

C20 Patent right or utility model deemed to be abandoned or is abandoned