
CN105282532B - 3D display method and apparatus - Google Patents


Info

Publication number
CN105282532B
CN105282532B (application CN201410243385.4A)
Authority
CN
China
Prior art keywords
plane
cameras
position coordinate
eye position
clipped value
Prior art date
Legal status
Expired - Fee Related
Application number
CN201410243385.4A
Other languages
Chinese (zh)
Other versions
CN105282532A (en)
Inventor
李今
Current Assignee
Tianjin Tuoshi Science & Technology Co Ltd
Original Assignee
Tianjin Tuoshi Science & Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Tianjin Tuoshi Science & Technology Co Ltd
Priority to CN201410243385.4A
Publication of CN105282532A
Application granted
Publication of CN105282532B

Landscapes

  • Stereoscopic And Panoramic Photography (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a 3D display method and apparatus, relating to the field of virtual reality, devised to solve the prior-art problem that virtual scenes lack realism. The technical solution disclosed by the embodiments of the invention includes: S10, obtaining the left-eye position coordinate and right-eye position coordinate of a user; S20, calculating, according to the left-eye position coordinate and right-eye position coordinate, the parameters of two cameras in the virtual scene of the 3D display respectively; S30, adjusting the two cameras according to the parameters respectively; S40, rendering the viewports of the two cameras after the parameter adjustment to obtain rendered images; S50, sending the rendered images to a pre-connected 3D display device, so that the 3D display device processes the rendered images and then displays them. The solution can be applied in virtual reality systems.

Description

3D display method and apparatus
Technical field
The present invention relates to the field of virtual reality, and more particularly to a 3D display method and apparatus.
Background technology
At present, common 3D display technologies include non-flicker (polarized) 3D, shutter 3D, and naked-eye (autostereoscopic) 3D. In the prior art, 3D display works as follows: two groups of virtual-scene images, corresponding to the user's left and right eyes, are output separately to the two eyes. The left eye can only see the image corresponding to the left eye, and the right eye can only see the image corresponding to the right eye, thereby achieving a 3D display.
However, because the two groups of virtual-scene images output to the left and right eyes do not change when the user moves to a different position, the realism of the virtual scene is insufficient.
Summary of the invention
The present invention provides a 3D display method and apparatus that can improve the sense of realism of a virtual scene.
The present invention solves the technical problem with the following technical solution: a 3D display method, comprising: S10, obtaining the left-eye position coordinate and right-eye position coordinate of a user; S20, calculating, according to the left-eye position coordinate and right-eye position coordinate, the parameters of two cameras in the virtual scene of the 3D display respectively; S30, adjusting the two cameras according to the parameters respectively; S40, rendering the viewports of the two cameras after the parameter adjustment to obtain rendered images; S50, sending the rendered images to a pre-connected 3D display device, so that the 3D display device processes the rendered images and then displays them.
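As a rough illustration of steps S10 through S50, the loop below sketches one frame of the method in Python. The function names, the stubbed eye tracker, and the stand-in renderer are illustrative assumptions, not part of the patent; only the camera-position formula A+B*eye comes from the text.

```python
import numpy as np

def locate_eyes():
    # S10 - stub for the eye tracker: screen-centred coordinates in cm
    return np.array([-3.25, 0.0, 60.0]), np.array([3.25, 0.0, 60.0])

def camera_params(eye, A=np.zeros(3), B=np.eye(3)):
    # S20 - scene-camera position from the tracked eye position: A + B * eye
    return {"position": A + B @ eye}

def render_viewport(cam):
    # S40 - stand-in for the real renderer: records the viewpoint used
    return {"rendered_from": tuple(cam["position"])}

def display_frame():
    left_eye, right_eye = locate_eyes()                         # S10
    cams = [camera_params(left_eye), camera_params(right_eye)]  # S20/S30
    images = [render_viewport(c) for c in cams]                 # S40
    return images                                               # S50: hand off to the display

left_img, right_img = display_frame()
```

Each frame re-runs the whole loop, which is why the patent requires a tracker update rate above the display refresh rate.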
Optionally, S20 includes: S201, calculating the position parameters of the two cameras in the virtual scene respectively; S202, calculating the projection matrix parameters of the two cameras in the virtual scene respectively.
Optionally, S201 includes: S2011, calculating the position parameter of the left camera of the two cameras as A+B*L; S2012, calculating the position parameter of the right camera of the two cameras as A+B*R; where A is a preset reference-point coordinate, B is a preset rotation matrix, L is the left-eye position coordinate, and R is the right-eye position coordinate.
Optionally, S202 includes: S2021, calculating the left view frustum clipping planes of the left camera of the two cameras, where the left clipping value of the left frustum is (Xl-W/2)*N/Zl, the right clipping value is (Xl+W/2)*N/Zl, the top clipping value is (Yl+H/2)*N/Zl, the bottom clipping value is (Yl-H/2)*N/Zl, the near clipping value is N, and the far clipping value is F; S2022, calculating the right view frustum clipping planes of the right camera of the two cameras, where the left clipping value of the right frustum is (Xr-W/2)*N/Zr, the right clipping value is (Xr+W/2)*N/Zr, the top clipping value is (Yr+H/2)*N/Zr, the bottom clipping value is (Yr-H/2)*N/Zr, the near clipping value is N, and the far clipping value is F; S2023, calculating the projection matrix parameters of the two cameras from the left and right view frustum clipping planes respectively. Here the left-eye position coordinate is denoted (Xl, Yl, Zl), the right-eye position coordinate is denoted (Xr, Yr, Zr), W is the width of the 3D display device, H is the height of the 3D display device, N is a preset near clipping value, and F is a preset far clipping value.
Optionally, S10 includes: S101, obtaining a real-time image of the user; S102, performing image recognition on the real-time image to obtain the left-eye position coordinate and right-eye position coordinate of the user.
The present invention also solves the technical problem with the following technical solution: a 3D display apparatus, comprising:
A human-eye positioning unit, configured to obtain the left-eye position coordinate and right-eye position coordinate of a user;
A parameter calculation unit, connected with the human-eye positioning unit and configured to calculate, according to the left-eye position coordinate and right-eye position coordinate obtained by the human-eye positioning unit, the parameters of two cameras in the virtual scene of the 3D display respectively;
A parameter adjustment unit, connected with the parameter calculation unit and configured to adjust the two cameras respectively according to the parameters obtained by the parameter calculation unit;
A graphics rendering unit, connected with the parameter adjustment unit and configured to render the viewports of the two cameras after the parameter adjustment unit has adjusted their parameters, obtaining rendered images;
An image display unit, connected with the graphics rendering unit and configured to send the rendered images obtained by the graphics rendering unit to a pre-connected 3D display device, so that the 3D display device processes the rendered images and then displays them.
Optionally, the parameter calculation unit includes:
A position parameter calculation module, configured to calculate the position parameters of the two cameras in the virtual scene respectively;
A matrix parameter calculation module, configured to calculate the projection matrix parameters of the two cameras in the virtual scene respectively.
Optionally, the position parameter calculation module includes:
A left-position calculation submodule, configured to calculate the position parameter of the left camera of the two cameras as A+B*L;
A right-position calculation submodule, configured to calculate the position parameter of the right camera of the two cameras as A+B*R;
where A is a preset reference-point coordinate, B is a preset rotation matrix, L is the left-eye position coordinate, and R is the right-eye position coordinate.
Optionally, the matrix parameter calculation module includes:
A left-plane calculation submodule, configured to calculate the left view frustum clipping planes of the left camera of the two cameras, where the left clipping value of the left frustum is (Xl-W/2)*N/Zl, the right clipping value is (Xl+W/2)*N/Zl, the top clipping value is (Yl+H/2)*N/Zl, the bottom clipping value is (Yl-H/2)*N/Zl, the near clipping value is N, and the far clipping value is F;
A right-plane calculation submodule, configured to calculate the right view frustum clipping planes of the right camera of the two cameras, where the left clipping value of the right frustum is (Xr-W/2)*N/Zr, the right clipping value is (Xr+W/2)*N/Zr, the top clipping value is (Yr+H/2)*N/Zr, the bottom clipping value is (Yr-H/2)*N/Zr, the near clipping value is N, and the far clipping value is F;
A matrix calculation submodule, connected with the left-plane calculation submodule and the right-plane calculation submodule respectively, and configured to calculate the projection matrix parameters of the two cameras from the left view frustum clipping planes calculated by the left-plane calculation submodule and the right view frustum clipping planes calculated by the right-plane calculation submodule;
where the left-eye position coordinate is denoted (Xl, Yl, Zl), the right-eye position coordinate is denoted (Xr, Yr, Zr), W is the width of the 3D display device, H is the height of the 3D display device, N is a preset near clipping value, and F is a preset far clipping value.
Optionally, the human-eye positioning unit includes:
An image acquisition module, configured to obtain a real-time image of the user;
An image recognition module, connected with the image acquisition module and configured to perform image recognition on the real-time image obtained by the image acquisition module, obtaining the left-eye position coordinate and right-eye position coordinate of the user.
The present invention has the following beneficial effects: the two cameras are adjusted according to the user's left-eye and right-eye position coordinates, the adjusted cameras are rendered, and the corresponding images are output by the 3D display device, thereby achieving a 3D display. Because the parameters of the two cameras are adjusted according to the user's left-eye and right-eye position coordinates, the real spatial position of the human eyes is linked to the virtual scene, and the left and right eyes are given images that correspond to the current position, so that the viewer perceives a holographic effect and a virtual holographic display is achieved. The technical solution provided by the embodiments of the present invention solves the prior-art problem that the two groups of virtual-scene images output to the left and right eyes do not change when the user is at different positions, leaving the realism of the virtual scene insufficient.
Description of the drawings
Fig. 1 is a flowchart of the 3D display method provided by Embodiment 1 of the present invention;
Fig. 2 is a structural diagram of the 3D display apparatus provided by Embodiment 2 of the present invention;
Fig. 3 is a structural diagram of the parameter calculation unit in the 3D display apparatus shown in Fig. 2;
Fig. 4 is a structural diagram of the position parameter calculation module in the parameter calculation unit shown in Fig. 3;
Fig. 5 is a structural diagram of the matrix parameter calculation module in the parameter calculation unit shown in Fig. 3;
Fig. 6 is a structural diagram of the human-eye positioning unit in the 3D display apparatus shown in Fig. 2.
Specific embodiment
The technical solution of the present invention is further elaborated below with reference to the embodiments and the accompanying drawings.
Embodiment 1:
As shown in Fig. 1, this embodiment provides a 3D display method, including:
Step 101: obtain the left-eye position coordinate and right-eye position coordinate of the user.
In this embodiment, the process by which step 101 obtains the left-eye and right-eye position coordinates may include: first obtaining a real-time image of the user, then performing image recognition on the real-time image to obtain the user's left-eye and right-eye position coordinates. The real-time image of the user may be obtained by a digital camera containing an image sensor; in particular, to guarantee the update rate of the data, the output frame rate of the digital camera should exceed 60 frames per second, and the digital camera uses a wide-angle lens.
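One common way to turn the recognized 2-D eye positions into the 3-D coordinates that step 101 needs is a pinhole-camera model with an assumed interpupillary distance; this geometry is an illustrative assumption, not specified by the patent, and the focal length and principal point below are made-up calibration values.

```python
FOCAL_PX = 800.0   # camera focal length in pixels (assumed calibration value)
IPD_CM = 6.5       # assumed adult interpupillary distance in cm

def eyes_3d(left_px, right_px, cx, cy):
    """Estimate 3-D eye coordinates (cm) from detected pixel positions.

    Depth follows from the pixel disparity of the two eye detections under
    the pinhole model: Z = f * IPD / disparity.
    """
    disparity = right_px[0] - left_px[0]
    z = FOCAL_PX * IPD_CM / disparity
    def back_project(p):
        # note: the image y-axis may need flipping to match a y-up screen frame
        return ((p[0] - cx) * z / FOCAL_PX, (p[1] - cy) * z / FOCAL_PX, z)
    return back_project(left_px), back_project(right_px)

# eyes detected 80 px apart around principal point (600, 360) -> ~65 cm away
left_eye, right_eye = eyes_3d((560, 360), (640, 360), cx=600, cy=360)
```

A real implementation would also smooth these estimates over frames, since the 60+ fps tracker output feeds the camera update directly.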
In this embodiment, besides the image-processing approach above, methods such as active-marker tracking may also be used to obtain the left-eye and right-eye position coordinates, which will not be elaborated here.
In this embodiment, the three-dimensional space coordinates in the virtual scene of the 3D display and the human-eye position coordinates may refer to the same reference frame; they may also refer to different reference frames and be converted into the same frame when used. For convenience of representation, the origin can be set at the centre of the screen of the 3D display device: the origin O is located at the screen centre, the x-axis is parallel to the horizontal direction of the 3D display device, the y-axis is parallel to the vertical direction of the 3D display device, and the z-axis is perpendicular to the screen of the 3D display device, pointing inwards.
Step 102: according to the left-eye and right-eye position coordinates, calculate the parameters of the two cameras in the virtual scene of the 3D display respectively.
In this embodiment, a spatial scale relationship between the human-eye position and the virtual scene is established; that is, the scale at which the spatial dimensions of the virtual scene appear to the eye must be determined. The unit of this relationship can be referenced to the width and height of the 3D display device. Taking a 3D display device whose width and height are measured in centimetres as an example, the eye coordinates are also expressed in centimetres, so 1 unit in the virtual scene then corresponds visually to 1 centimetre.
In this embodiment, the camera parameters calculated in step 102 may include only the position parameters of the two cameras. However, in order to make the rendered images match the size of the 3D display device and guarantee the mapping between the scale of the virtual scene and the visual scale, the parameters of the two cameras may include, in addition to their position parameters, their projection matrix parameters.
In this embodiment, the process of calculating the position parameters of the two cameras includes: calculating the position parameter of the left camera as A+B*L, and the position parameter of the right camera as A+B*R, where A is a preset reference-point coordinate, B is a preset rotation matrix, L is the left-eye position coordinate, and R is the right-eye position coordinate. The rotation matrix represents a reference attitude; the reference point and reference attitude can serve as control inputs of the scene logic state, so that the current viewing angle can be controlled or changed to roam through the scene.
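A minimal numeric sketch of A+B*L and A+B*R, assuming numpy and illustrative values: the reference point A places the viewer in the scene, and the rotation matrix B (here a 90-degree yaw) re-orients the tracked eye offsets so the viewing direction can be roamed.

```python
import numpy as np

def camera_positions(A, B, L, R):
    # S2011/S2012: positions of the left and right scene cameras
    return A + B @ L, A + B @ R

A = np.array([10.0, 0.0, 5.0])            # preset reference point (scene units)
theta = np.pi / 2                         # 90-degree yaw about the y-axis (roaming)
B = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0,           1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
L = np.array([-3.25, 0.0, 60.0])          # tracked left-eye coordinate (cm)
R = np.array([3.25, 0.0, 60.0])           # tracked right-eye coordinate (cm)

left_cam, right_cam = camera_positions(A, B, L, R)
```

With B as the identity matrix the cameras simply sit at the eyes' tracked offsets from A; a non-trivial B swings both eye offsets together, which keeps the stereo baseline intact while changing the viewpoint.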
In this embodiment, the process of calculating the projection matrix parameters of the two cameras includes: calculating the left view frustum clipping planes of the left camera; calculating the right view frustum clipping planes of the right camera; and calculating the projection matrix parameters of the two cameras from the left and right view frustum clipping planes respectively. The left clipping value of the left frustum is (Xl-W/2)*N/Zl, its right clipping value is (Xl+W/2)*N/Zl, its top clipping value is (Yl+H/2)*N/Zl, its bottom clipping value is (Yl-H/2)*N/Zl, its near clipping value is N, and its far clipping value is F. The left clipping value of the right frustum is (Xr-W/2)*N/Zr, its right clipping value is (Xr+W/2)*N/Zr, its top clipping value is (Yr+H/2)*N/Zr, its bottom clipping value is (Yr-H/2)*N/Zr, its near clipping value is N, and its far clipping value is F. The left-eye position coordinate is denoted (Xl, Yl, Zl) and the right-eye position coordinate (Xr, Yr, Zr); W is the width of the 3D display device, H is its height, N is a preset near clipping value, and F is a preset far clipping value.
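The clipping values above define an off-axis (asymmetric) frustum per eye. The sketch below computes them and assembles the corresponding OpenGL-style off-axis projection matrix; the patent does not give the matrix form, so the glFrustum layout here, together with the display size W×H and the N, F values, are stated assumptions.

```python
import numpy as np

W, H = 52.0, 29.0      # display width/height in cm (example values)
N, F = 10.0, 1000.0    # preset near/far clipping values (example values)

def frustum_bounds(eye):
    # S2021/S2022: left/right/bottom/top clipping values for one eye,
    # projected onto the near plane: e.g. left = (x - W/2) * N / z
    x, y, z = eye
    left = (x - W / 2) * N / z
    right = (x + W / 2) * N / z
    top = (y + H / 2) * N / z
    bottom = (y - H / 2) * N / z
    return left, right, bottom, top

def projection_matrix(eye):
    # S2023: off-axis perspective matrix in the usual glFrustum form
    l, r, b, t = frustum_bounds(eye)
    return np.array([
        [2 * N / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * N / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(F + N) / (F - N), -2 * F * N / (F - N)],
        [0.0, 0.0, -1.0, 0.0]])

l, r, b, t = frustum_bounds((-3.25, 0.0, 65.0))   # left eye, 65 cm from screen
P = projection_matrix((-3.25, 0.0, 65.0))
```

Because the bounds depend on the eye's (x, y, z), the frustum skews as the viewer moves, which is exactly what keeps the screen plane registered as a fixed window into the scene.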
Step 103: adjust the two cameras according to the parameters respectively.
Step 104: render the viewports of the two cameras after the parameter adjustment to obtain rendered images.
In this embodiment, rendering the viewports of the two cameras in step 104 converts the virtual three-dimensional scene into rendered two-dimensional images.
Step 105: send the rendered images to the pre-connected 3D display device, so that the 3D display device processes the rendered images and then displays them.
In this embodiment, the 3D display device in step 105 may be a non-flicker (polarized) 3D display device, a shutter 3D display device, or a naked-eye 3D display device; this is not restricted here.
The present invention has the following beneficial effects: the two cameras are adjusted according to the user's left-eye and right-eye position coordinates, the adjusted cameras are rendered, and the corresponding images are output by the 3D display device, thereby achieving a 3D display. Because the parameters of the two cameras are adjusted according to the user's left-eye and right-eye position coordinates, the real spatial position of the human eyes is linked to the virtual scene, and the left and right eyes are given images that correspond to the current position, so that the viewer perceives a holographic effect and a virtual holographic display is achieved. The technical solution provided by the embodiments of the present invention solves the prior-art problem that the two groups of virtual-scene images output to the left and right eyes do not change when the user is at different positions, leaving the realism of the virtual scene insufficient.
Embodiment 2:
As shown in Fig. 2, an embodiment of the present invention provides a 3D display apparatus, including:
A human-eye positioning unit 201, configured to obtain the left-eye position coordinate and right-eye position coordinate of a user;
A parameter calculation unit 202, connected with the human-eye positioning unit and configured to calculate, according to the left-eye position coordinate and right-eye position coordinate obtained by the human-eye positioning unit, the parameters of two cameras in the virtual scene of the 3D display respectively;
A parameter adjustment unit 203, connected with the parameter calculation unit and configured to adjust the two cameras respectively according to the parameters calculated by the parameter calculation unit;
A graphics rendering unit 204, connected with the parameter adjustment unit and configured to render the viewports of the two cameras after the parameter adjustment unit has adjusted their parameters, obtaining rendered images;
An image display unit 205, connected with the graphics rendering unit and configured to send the rendered images obtained by the graphics rendering unit to a pre-connected 3D display device, so that the 3D display device processes the rendered images and then displays them.
In this embodiment, the 3D display process implemented by the human-eye positioning unit 201, parameter calculation unit 202, parameter adjustment unit 203, graphics rendering unit 204 and image display unit 205 described above is similar to the 3D display process shown in Fig. 1 and will not be repeated here.
Further, as shown in Fig. 3, the parameter calculation unit 202 in this embodiment includes:
A position parameter calculation module 2021, configured to calculate the position parameters of the two cameras in the virtual scene respectively;
A matrix parameter calculation module 2022, configured to calculate the projection matrix parameters of the two cameras in the virtual scene respectively.
As shown in Fig. 4, the position parameter calculation module 2021 includes:
A left-position calculation submodule 20211, configured to calculate the position parameter of the left camera of the two cameras as A+B*L;
A right-position calculation submodule 20212, configured to calculate the position parameter of the right camera of the two cameras as A+B*R;
where A is a preset reference-point coordinate, B is a preset rotation matrix, L is the left-eye position coordinate, and R is the right-eye position coordinate.
As shown in Fig. 5, the matrix parameter calculation module 2022 includes:
A left-plane calculation submodule 20221, configured to calculate the left view frustum clipping planes of the left camera of the two cameras, where the left clipping value of the left frustum is (Xl-W/2)*N/Zl, the right clipping value is (Xl+W/2)*N/Zl, the top clipping value is (Yl+H/2)*N/Zl, the bottom clipping value is (Yl-H/2)*N/Zl, the near clipping value is N, and the far clipping value is F;
A right-plane calculation submodule 20222, configured to calculate the right view frustum clipping planes of the right camera of the two cameras, where the left clipping value of the right frustum is (Xr-W/2)*N/Zr, the right clipping value is (Xr+W/2)*N/Zr, the top clipping value is (Yr+H/2)*N/Zr, the bottom clipping value is (Yr-H/2)*N/Zr, the near clipping value is N, and the far clipping value is F;
A matrix calculation submodule 20223, connected with the left-plane and right-plane calculation submodules respectively, and configured to calculate the projection matrix parameters of the two cameras from the left view frustum clipping planes calculated by the left-plane calculation submodule and the right view frustum clipping planes calculated by the right-plane calculation submodule;
where the left-eye position coordinate is denoted (Xl, Yl, Zl), the right-eye position coordinate is denoted (Xr, Yr, Zr), W is the width of the 3D display device, H is the height of the 3D display device, N is a preset near clipping value, and F is a preset far clipping value.
Further, as shown in Fig. 6, the human-eye positioning unit 201 in this embodiment includes:
An image acquisition module 2011, configured to obtain a real-time image of the user;
An image recognition module 2012, connected with the image acquisition module and configured to perform image recognition on the real-time image obtained by the image acquisition module, obtaining the left-eye position coordinate and right-eye position coordinate of the user.
In this embodiment, human-eye positioning can be implemented by the image acquisition module 2011 and the image recognition module 2012, or by other means, which will not be elaborated here.
In this embodiment, the 3D display device may be a non-flicker (polarized) 3D display device, a shutter 3D display device, or a naked-eye 3D display device; this is not restricted here.
The present invention has the following beneficial effects: the two cameras are adjusted according to the user's left-eye and right-eye position coordinates, the adjusted cameras are rendered, and the corresponding images are output by the 3D display device, thereby achieving a 3D display. Because the parameters of the two cameras are adjusted according to the user's left-eye and right-eye position coordinates, the real spatial position of the human eyes is linked to the virtual scene, and the left and right eyes are given images that correspond to the current position, so that the viewer perceives a holographic effect and a virtual holographic display is achieved. The technical solution provided by the embodiments of the present invention solves the prior-art problem that the two groups of virtual-scene images output to the left and right eyes do not change when the user is at different positions, leaving the realism of the virtual scene insufficient.
The ordering of the above embodiments is merely for convenience of description and does not represent their relative merits.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solution of the present invention rather than to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that they may still modify the technical solutions recorded in the foregoing embodiments or replace some of their technical features by equivalents, and that such modifications or replacements do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (8)

  1. A 3D display method, characterized by comprising:
    S10, obtaining the left-eye position coordinate and right-eye position coordinate of a user;
    S20, calculating, according to the left-eye position coordinate and right-eye position coordinate, the parameters of two cameras in the virtual scene of the 3D display respectively, establishing the spatial scale relationship between the human-eye position and the virtual scene, and determining the scale at which the spatial dimensions of the virtual scene appear visually;
    S30, adjusting the two cameras according to the parameters respectively;
    S40, rendering the viewports of the two cameras after the parameter adjustment to obtain rendered images;
    S50, sending the rendered images to a pre-connected 3D display device, so that the 3D display device processes the rendered images and then displays them;
    wherein the step S20 further includes:
    S201, calculating the position parameters of the two cameras in the virtual scene respectively;
    S202, calculating the projection matrix parameters of the two cameras in the virtual scene respectively.
  2. The 3D display method according to claim 1, characterized in that S201 includes:
    S2011, calculating the position parameter of the left camera of the two cameras as A+B*L;
    S2012, calculating the position parameter of the right camera of the two cameras as A+B*R;
    where A is a preset reference-point coordinate in the virtual scene, B is a preset rotation matrix, L is the left-eye position coordinate, and R is the right-eye position coordinate.
  3. The 3D display method according to claim 1 or 2, characterized in that S202 includes:
    S2021, calculating the left view frustum clipping planes of the left camera of the two cameras, where the left clipping value of the left frustum is (Xl-W/2)*N/Zl, the right clipping value is (Xl+W/2)*N/Zl, the top clipping value is (Yl+H/2)*N/Zl, the bottom clipping value is (Yl-H/2)*N/Zl, the near clipping value is N, and the far clipping value is F;
    S2022, calculating the right view frustum clipping planes of the right camera of the two cameras, where the left clipping value of the right frustum is (Xr-W/2)*N/Zr, the right clipping value is (Xr+W/2)*N/Zr, the top clipping value is (Yr+H/2)*N/Zr, the bottom clipping value is (Yr-H/2)*N/Zr, the near clipping value is N, and the far clipping value is F;
    S2023, calculating the projection matrix parameters of the two cameras from the left view frustum clipping planes and the right view frustum clipping planes respectively;
    where the left-eye position coordinate is denoted (Xl, Yl, Zl), the right-eye position coordinate is denoted (Xr, Yr, Zr), W is the width of the 3D display device, H is the height of the 3D display device, N is a preset near clipping value, and F is a preset far clipping value.
  4. The 3D display method according to claim 1, characterized in that S10 includes:
    S101, obtaining a real-time image of the user;
    S102, performing image recognition on the real-time image to obtain the left-eye position coordinate and right-eye position coordinate of the user.
  5. A 3D display apparatus, characterized by comprising:
    a human-eye positioning unit, configured to obtain the left-eye position coordinate and right-eye position coordinate of a user;
    a parameter calculation unit, connected with the human-eye positioning unit and configured to calculate, according to the left-eye position coordinate and right-eye position coordinate obtained by the human-eye positioning unit, the parameters of two cameras in the virtual scene of the 3D display respectively, establish the spatial scale relationship between the human-eye position and the virtual scene, and determine the scale at which the spatial dimensions of the virtual scene appear visually;
    a parameter adjustment unit, connected with the parameter calculation unit and configured to adjust the two cameras respectively according to the parameters calculated by the parameter calculation unit;
    a graphics rendering unit, connected with the parameter adjustment unit and configured to render the viewports of the two cameras after the parameter adjustment unit has adjusted their parameters, obtaining rendered images;
    an image display unit, connected with the graphics rendering unit and configured to send the rendered images obtained by the graphics rendering unit to a pre-connected 3D display device, so that the 3D display device processes the rendered images and then displays them;
    wherein the parameter calculation unit includes:
    a position parameter calculation module, configured to calculate the position parameters of the two cameras in the virtual scene respectively;
    a matrix parameter calculation module, configured to calculate the projection matrix parameters of the two cameras in the virtual scene respectively.
  6. The 3D display apparatus according to claim 5, characterized in that the position parameter calculation module comprises:
    a left position calculation submodule, configured to calculate the position parameter of the left-side camera of the two cameras as A+B*L;
    a right position calculation submodule, configured to calculate the position parameter of the right-side camera of the two cameras as A+B*R;
    wherein A is a preset reference point coordinate, B is a preset rotation matrix, L is the left-eye position coordinate, and R is the right-eye position coordinate.
  7. The 3D display apparatus according to claim 5, characterized in that the matrix parameter calculation module comprises:
    a left plane calculation submodule, configured to calculate a left view frustum clipping plane of the left-side camera of the two cameras, wherein the left clipping value of the left view frustum clipping plane is (Xl-W/2)*N/Zl, the right clipping value of the left view frustum clipping plane is (Xl+W/2)*N/Zl, the top clipping value of the left view frustum clipping plane is (Yl+H/2)*N/Zl, the bottom clipping value of the left view frustum clipping plane is (Yl-H/2)*N/Zl, the near clipping value of the left view frustum clipping plane is N, and the far clipping value of the left view frustum clipping plane is F;
    a right plane calculation submodule, configured to calculate a right view frustum clipping plane of the right-side camera of the two cameras, wherein the left clipping value of the right view frustum clipping plane is (Xr-W/2)*N/Zr, the right clipping value of the right view frustum clipping plane is (Xr+W/2)*N/Zr, the top clipping value of the right view frustum clipping plane is (Yr+H/2)*N/Zr, the bottom clipping value of the right view frustum clipping plane is (Yr-H/2)*N/Zr, the near clipping value of the right view frustum clipping plane is N, and the far clipping value of the right view frustum clipping plane is F;
    a matrix calculation submodule, connected to the left plane calculation submodule and the right plane calculation submodule and configured to calculate the projection matrix parameters of the two cameras respectively according to the left view frustum clipping plane calculated by the left plane calculation submodule and the right view frustum clipping plane calculated by the right plane calculation submodule;
    wherein the left-eye position coordinate is denoted as (Xl, Yl, Zl), the right-eye position coordinate is denoted as (Xr, Yr, Zr), W is the length of the 3D display device, H is the width of the 3D display device, N is a preset near clipping value, and F is a preset far clipping value.
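The claims leave open exactly how the projection matrix parameters are formed from the six clipping values. One conventional choice, assumed here rather than stated in the patent, is the standard OpenGL glFrustum form, in which the head-tracking asymmetry enters through the third-column terms:

```python
def projection_matrix(left, right, bottom, top, near, far):
    """Standard off-axis perspective projection matrix (glFrustum
    convention) built from the six clipping values. For a symmetric
    frustum the third-column terms (r+l)/(r-l) and (t+b)/(t-b)
    vanish; a tracked eye off the screen axis makes them nonzero."""
    return [
        [2 * near / (right - left), 0.0, (right + left) / (right - left), 0.0],
        [0.0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ]

m = projection_matrix(-0.25, 0.25, -0.125, 0.125, 0.5, 100.0)
# m[0][0] == 2.0 and m[1][1] == 4.0; this frustum is symmetric,
# so m[0][2] == 0.0 and m[1][2] == 0.0
```

Building one such matrix from the left view frustum clipping plane and one from the right yields the per-camera projection matrix parameters the matrix calculation submodule produces.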
  8. The 3D display apparatus according to claim 5, characterized in that the human-eye positioning unit comprises:
    an image acquisition module, configured to acquire a real-time image of the user;
    an image recognition module, connected to the image acquisition module and configured to perform image recognition on the real-time image acquired by the image acquisition module, to obtain the left-eye position coordinate and the right-eye position coordinate of the user.
CN201410243385.4A 2014-06-03 2014-06-03 3D display method and apparatus Expired - Fee Related CN105282532B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410243385.4A CN105282532B (en) 2014-06-03 2014-06-03 3D display method and apparatus

Publications (2)

Publication Number Publication Date
CN105282532A CN105282532A (en) 2016-01-27
CN105282532B true CN105282532B (en) 2018-06-22

Family

ID=55150747

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410243385.4A Expired - Fee Related CN105282532B (en) 2014-06-03 2014-06-03 3D display method and apparatus

Country Status (1)

Country Link
CN (1) CN105282532B (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107147899B (en) * 2017-06-06 2020-02-11 北京德火新媒体技术有限公司 CAVE display system and method adopting LED3D screen
CN115842907A (en) * 2018-03-27 2023-03-24 京东方科技集团股份有限公司 Rendering method, computer product and display device
CN109087260A (en) * 2018-08-01 2018-12-25 北京七鑫易维信息技术有限公司 A kind of image processing method and device
CN109640070A (en) * 2018-12-29 2019-04-16 上海曼恒数字技术股份有限公司 A kind of stereo display method, device, equipment and storage medium
CN109785445B (en) 2019-01-22 2024-03-08 京东方科技集团股份有限公司 Interaction method, device, system and computer readable storage medium
CN109829981B (en) * 2019-02-16 2023-06-27 深圳市未来感知科技有限公司 Three-dimensional scene presentation method, device, equipment and storage medium
CN109901713B (en) * 2019-02-25 2020-07-17 山东大学 Multi-person cooperative assembly system and method
CN110222289A (en) * 2019-06-13 2019-09-10 厦门商集网络科技有限责任公司 A kind of implementation method and computer media of the digital exhibition room that can flexibly manipulate
CN111142825B (en) * 2019-12-27 2024-04-16 杭州拓叭吧科技有限公司 Multi-screen visual field display method and system and electronic equipment
CN111915711B (en) * 2020-08-04 2025-01-28 北京吉威空间信息股份有限公司 Method and device for obtaining stereoscopic images of three-level land classification spots supporting virtual VR
CN112235562B (en) * 2020-10-12 2023-09-15 聚好看科技股份有限公司 3D display terminal, controller and image processing method
CN112354179B (en) * 2020-11-23 2023-09-05 浙江中控信息产业股份有限公司 Three-dimensional geographic information content display and interaction method
CN112929651B (en) * 2021-01-25 2025-01-24 北京信息科技大学 Display method, device, electronic device and storage medium
CN114327346B (en) * 2021-12-27 2023-09-29 北京百度网讯科技有限公司 Display method, display device, electronic apparatus, and storage medium
CN114862657A (en) * 2022-06-02 2022-08-05 北京蔚领时代科技有限公司 Dual-display-card rendering method and device
CN116880723B (en) * 2023-09-08 2023-11-17 江西格如灵科技股份有限公司 3D scene display method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102157012A (en) * 2011-03-23 2011-08-17 深圳超多维光电子有限公司 Method for three-dimensionally rendering scene, graphic image treatment device, equipment and system
CN102520970A (en) * 2011-12-28 2012-06-27 Tcl集团股份有限公司 Dimensional user interface generating method and device
CN103279942A (en) * 2013-04-10 2013-09-04 北京航空航天大学 Control method for realizing virtual 3D (3-dimension) display on 2D (2-dimension) screen on basis of environment sensor
CN103402106A (en) * 2013-07-25 2013-11-20 青岛海信电器股份有限公司 Method and device for displaying three-dimensional image

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2361341A1 (en) * 2001-11-07 2003-05-07 Idelix Software Inc. Use of detail-in-context presentation on stereoscopically paired images


Also Published As

Publication number Publication date
CN105282532A (en) 2016-01-27

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20180622

Termination date: 20200603