CN114100134B - Picture display method, device, equipment, medium and program product of virtual scene - Google Patents
- Publication number
- CN114100134B (application CN202111672379.7A)
- Authority
- CN
- China
- Prior art keywords
- pupil
- sighting telescope
- state
- frame
- displaying
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/537—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/837—Shooting of targets
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/30—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
- A63F2300/303—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display
- A63F2300/307—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display for displaying an additional window with a view from the top of the game field, e.g. radar screen
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8076—Shooting
Abstract
The application provides a picture display method, device, equipment, computer readable storage medium, and computer program product for a virtual scene. The method comprises the following steps: presenting, from a first-person perspective, an interface in which a virtual object aims using a sighting telescope included in a shooting prop; displaying, in the interface, the frame of the eyepiece of the sighting telescope, the inner area of the frame displaying a view field picture of the virtual scene seen by the virtual object through the sighting telescope; displaying, between the frame and the view field picture, a pupil formed by optical imaging of the outer frame of the sighting telescope; and, when the state of the sighting telescope or the state of the virtual object changes, controlling the display effect of the pupil to change synchronously so as to adapt to the changed state. The application can thus display a real and vivid scene picture and improve the sense of presence in the virtual scene.
Description
Priority description
This application claims priority to Chinese patent application No. 202111226478.2, filed on October 21, 2021, and entitled "Picture display method, device, equipment, medium and program product of virtual scene".
Technical Field
The present application relates to image processing technologies, and in particular to a picture display method, apparatus, device, computer readable storage medium, and computer program product for a virtual scene.
Background
In virtual-scene applications such as games, the sense of presence is a key factor in whether a player can be drawn into the virtual scene. For shooting games, the key factor in immersing the player in the shooting scene is whether the game scene the player observes through the sighting telescope of a shooting prop looks realistic; the related art lacks technical means for achieving this sense of presence.
Disclosure of Invention
Embodiments of the present application provide a picture display method, apparatus, device, computer readable storage medium, and computer program product for a virtual scene, which can display a real and vivid scene picture to improve the sense of presence in the virtual scene.
The technical solutions in the embodiments of the present application are implemented as follows:
An embodiment of the present application provides a picture display method for a virtual scene, comprising the following steps:
presenting, from a first-person perspective, an interface in which a virtual object aims using a sighting telescope included in a shooting prop;
displaying, in the interface, the frame of the eyepiece of the sighting telescope, the inner area of the frame displaying a view field picture of the virtual scene seen by the virtual object through the sighting telescope;
displaying, between the frame and the view field picture, a pupil formed by optical imaging of the outer frame of the sighting telescope;
and, when the state of the sighting telescope or the state of the virtual object changes, controlling the display effect of the pupil to change synchronously so as to adapt to the changed state.
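As an illustration of the last step, the following is a minimal sketch, in Python, of keeping the pupil's display effect synchronized with the state of the sighting telescope and the virtual object. All names (ScopeState, PupilDisplay, sync_pupil) and fields are illustrative assumptions, not identifiers from this application.

```python
from dataclasses import dataclass

@dataclass
class ScopeState:
    moving: bool = False          # whether the scope or the virtual object is moving
    barrel_color: str = "black"   # current color of the lens barrel

@dataclass
class PupilDisplay:
    color: str = "black"
    mode: str = "static"          # "static" (first state) or "dynamic" (second state)

def sync_pupil(pupil: PupilDisplay, state: ScopeState) -> None:
    """Change the pupil's display effect whenever the scope/object state changes."""
    pupil.color = state.barrel_color                      # pupil color follows the barrel
    pupil.mode = "dynamic" if state.moving else "static"  # traveling -> dynamic pupil

pupil, state = PupilDisplay(), ScopeState()
state.moving = True
sync_pupil(pupil, state)
print(pupil)  # PupilDisplay(color='black', mode='dynamic')
```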
An embodiment of the present application provides a picture display apparatus for a virtual scene, comprising:
a first presenting module, configured to present, from a first-person perspective, an interface in which a virtual object aims using a sighting telescope included in a shooting prop;
a second presenting module, configured to display, in the interface, the frame of the eyepiece of the sighting telescope, the inner area of the frame displaying a view field picture of the virtual scene seen by the virtual object through the sighting telescope;
a third presenting module, configured to display, between the frame and the view field picture, a pupil formed by optical imaging of the outer frame of the sighting telescope;
and a presentation control module, configured to control, when the state of the sighting telescope or the state of the virtual object changes, the display effect of the pupil to change synchronously so as to adapt to the changed state.
In the above solution, the first presenting module is further configured to perform a mirror opening operation for the sighting telescope in response to a mirror opening instruction for the shooting prop;
and, during execution of the mirror opening operation for the sighting telescope, to display, from a first-person perspective, the process in which the aiming interface corresponding to the sighting telescope is gradually opened;
the third presenting module is further configured to display, between the frame and the view field picture during execution of the mirror opening operation, the process in which the pupil formed by optical imaging of the outer frame of the sighting telescope changes from partial to full as the mirror opening operation proceeds.
In the above solution, the sighting telescope further comprises an objective lens and a lens barrel, the objective lens and the eyepiece being located at the two ends of the lens barrel respectively;
the first presenting module is further configured to display, between the frame and the view field picture, a pupil formed by optical imaging of the lens barrel of the sighting telescope, the color of the pupil being consistent with the color of the lens barrel.
In the above solution, the apparatus further comprises:
a color switching module, configured to switch, in response to a color switching instruction for the lens barrel, the current color of the frame of the eyepiece in the interface to a target color, and to switch the color of the pupil to the target color.
In the above solution, the third presenting module is further configured to acquire the material of the objective lens included in the sighting telescope and the material of the eyepiece;
to determine an adapted transparency based on the material of the objective lens and the material of the eyepiece;
and to display, between the frame and the view field picture, a pupil formed by optical imaging of the outer frame of the sighting telescope based on the transparency.
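As a concrete reading of this module, the sketch below derives a pupil transparency from the two lens materials; the material names and transmittance values are invented for illustration and are not taken from this application.

```python
# Assumed per-material transmittance values (fraction of light passed through).
MATERIAL_TRANSMITTANCE = {
    "standard_glass": 0.90,
    "coated_glass": 0.95,   # e.g. a fluoride-coated lens
    "plastic": 0.80,
}

def pupil_alpha(objective_material: str, eyepiece_material: str) -> float:
    """Light passes through both lenses, so their transmittances multiply;
    the pupil's opacity is taken as the complement of the combined value."""
    t = (MATERIAL_TRANSMITTANCE[objective_material]
         * MATERIAL_TRANSMITTANCE[eyepiece_material])
    return 1.0 - t

print(f"{pupil_alpha('coated_glass', 'standard_glass'):.3f}")  # 0.145
```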
In the above solution, the third presenting module is further configured to display, between the frame and the view field picture, a pupil in a first state formed by optical imaging of the outer frame of the sighting telescope while the sighting telescope is in a stationary state or the virtual object is in a stationary state;
wherein the first state is static; when the pupil is in the first state, the pupil comprises two concentric rings, the color within the outer ring being uniform, and the color within the inner ring gradually fading along the radial direction from the edge of the inner ring toward the center of the eyepiece.
In the above solution, the third presenting module is further configured to display, while the virtual object is in a traveling state, a pupil in a second state formed by optical imaging of the outer frame of the sighting telescope;
wherein the second state is dynamic; while the pupil is in the second state, the pupil partially masks the view field picture, and the size of the masked view field picture changes.
In the above solution, the apparatus further comprises:
a fourth presenting module, configured to dynamically display, while the pupil in the second state is displayed, the proportion of the view field picture masked by the pupil.
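One way to compute the displayed mask proportion is sketched below, assuming (purely for illustration) that both the view field picture and the unmasked center are circular.

```python
import math

def mask_proportion(view_radius: float, inner_radius: float) -> float:
    """Fraction of the circular view field picture covered by the pupil,
    where inner_radius bounds the unmasked center."""
    view_area = math.pi * view_radius ** 2
    clear_area = math.pi * inner_radius ** 2
    return (view_area - clear_area) / view_area

# While the pupil is in the second (dynamic) state, the inner radius varies
# frame by frame, so the displayed proportion changes with it.
for r in (0.9, 0.8, 0.7):
    print(f"inner radius {r:.1f} -> masked {mask_proportion(1.0, r):.0%}")
```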
In the above solution, the apparatus further comprises:
a state holding module, configured to control the shooting prop to continuously fire at the object aimed at through the sighting telescope in response to a continuous firing instruction for the shooting prop;
and to keep the pupil in a default state while the object is continuously fired at, so that the display effect of the pupil remains unchanged during continuous firing.
In the above solution, the apparatus further comprises:
a state control module, configured to control the shooting prop to fire at the target aimed at through the sighting telescope in response to a firing instruction for the shooting prop triggered at a first moment, and
to control the pupil to be in a default state within a target period starting from the first moment, so that the display effect of the pupil remains unchanged within the target period.
In the above solution, the apparatus further comprises:
a display updating module, configured to, when a second moment after the first moment arrives and the interval between the second moment and the first moment equals the target period, release the default state of the pupil,
acquire the state of the sighting telescope and the state of the virtual object,
and update the display effect of the pupil according to the state of the sighting telescope and the state of the virtual object.
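The interaction of the state control module and the display updating module amounts to a time window. The sketch below pins the pupil to its default state for a target period after a shot and re-synchronizes it afterwards; the class name and the period length are assumptions for illustration.

```python
import time

TARGET_PERIOD = 0.5  # seconds; an illustrative value, not specified by the application

class PupilController:
    def __init__(self) -> None:
        self.locked_until = 0.0

    def on_fire(self, now: float) -> None:
        """First moment: the prop fires and the pupil enters its default state."""
        self.locked_until = now + TARGET_PERIOD

    def update(self, now: float, scope_state: str, object_state: str) -> str:
        if now < self.locked_until:
            return "default"  # display effect unchanged within the target period
        # Second moment reached: release the default state and re-sync.
        return f"synced({scope_state}, {object_state})"

ctl = PupilController()
t0 = time.monotonic()
ctl.on_fire(t0)
print(ctl.update(t0 + 0.2, "stationary", "moving"))  # default
print(ctl.update(t0 + 0.6, "stationary", "moving"))  # synced(stationary, moving)
```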
In the above solution, the shooting prop is further equipped with a side sighting telescope, and the apparatus further comprises:
a scope switching module, configured to switch the sighting telescope to the side sighting telescope in response to a scope switching instruction for the shooting prop, the magnification of the side sighting telescope being different from that of the sighting telescope;
and to display the aiming interface corresponding to the side sighting telescope, and to display, in the aiming interface, a first pupil formed by optical imaging of the outer frame of the side sighting telescope, the width of the first pupil corresponding to the magnification of the side sighting telescope.
In the above solution, the apparatus further comprises:
a magnification selection module, configured to present at least two magnification options in the interface;
in response to a switching instruction for the sighting telescope triggered based on the at least two magnification options, to cancel displaying the aiming interface, display the interface of the virtual scene, and control the shooting prop to mount the target sighting telescope indicated by the switching instruction;
in response to a mirror opening instruction triggered based on the interface of the virtual scene, to present the aiming interface corresponding to the target sighting telescope;
and to display, in the aiming interface, a second pupil formed by optical imaging of the outer frame of the target sighting telescope, the width of the second pupil corresponding to the magnification of the target sighting telescope.
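The application states only that the pupil width "corresponds to" the magnification. One natural choice, borrowed from real optics where the exit pupil diameter equals the objective diameter divided by the magnification, is sketched below; treating the rendered pupil width this way is an assumption, not the application's stated formula.

```python
def exit_pupil_width(objective_diameter_mm: float, magnification: float) -> float:
    """Real-optics relation: exit pupil = objective diameter / magnification."""
    return objective_diameter_mm / magnification

# Higher magnification -> narrower exit pupil -> wider dark ring in the eyepiece.
for mag in (2, 4, 8):
    print(f"{mag}x scope -> exit pupil {exit_pupil_width(32, mag):.1f} mm")
```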
In the above solution, the apparatus further comprises:
a definition adjusting module, configured to acquire the light environment of the virtual scene in which the virtual object is located, different light environments having different light intensities;
and, when the light environment in which the virtual object is located changes, to change the definition of the pupil synchronously so as to adapt to the changed light environment.
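A possible mapping from light intensity to pupil definition is sketched below; the normalization and blur values are illustrative assumptions.

```python
def pupil_blur_radius(light_intensity: float) -> float:
    """Darker environments give a softer (less defined) pupil edge,
    brighter ones a crisper edge. Intensity is normalized to [0, 1]."""
    light_intensity = max(0.0, min(1.0, light_intensity))
    return 4.0 * (1.0 - light_intensity)  # blur radius in pixels

print(f"{pupil_blur_radius(0.2):.1f}")  # dim environment: 3.2 px of softening
print(f"{pupil_blur_radius(1.0):.1f}")  # bright environment: 0.0 px, fully sharp
```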
In the above solution, before the view field picture of the virtual scene seen by the virtual object through the sighting telescope is displayed, the apparatus further comprises:
a patch determining module, configured to create a patch conforming to the eyepiece of the sighting telescope and to draw a map corresponding to the patch, the map comprising a first portion corresponding to the view field picture and a second portion corresponding to the pupil, the first portion being transparent;
and to combine the map with the patch to obtain a target patch;
the third presenting module is further configured to acquire a scene patch corresponding to the virtual scene and a frame patch corresponding to the outer frame of the sighting telescope;
and to render the scene patch, the frame patch, and the target patch in a preset rendering order, so as to display, between the frame and the view field picture, the pupil formed by optical imaging of the outer frame of the sighting telescope.
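The preset rendering order can be read as a simple back-to-front compositing pass: scene patch first, then the outer-frame patch, then the target patch carrying the transparent view window and the pupil. The sketch below illustrates this; the draw callback is a placeholder, not an engine API.

```python
def render_frame(scene_patch, frame_patch, target_patch, draw) -> None:
    """Draw patches in the preset order; later patches composite over earlier ones."""
    for patch in (scene_patch, frame_patch, target_patch):
        draw(patch)

render_frame("scene", "outer-frame", "eyepiece-map",
             draw=lambda p: print("draw:", p))
```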
In the above solution, the patch determining module is further configured to acquire an initial map corresponding to the patch, the initial map comprising the first portion and an initial portion corresponding to the pupil, the color value of each pixel in the initial portion corresponding to the default state of the pupil;
to determine the state of the sighting telescope and the state of the virtual object, and to determine a center point for pixel sampling according to the state of the sighting telescope and the state of the virtual object;
and to sample pixels in the initial map based on the center point, obtaining the map corresponding to the patch.
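A minimal sketch of this sampling step follows: the initial map holds the pupil in its default state, and shifting the center point of pixel sampling according to the scope/object state yields a map with a displaced pupil. The array-based nearest-neighbour resampling here is an illustrative stand-in for the engine's texture sampler.

```python
import numpy as np

def sample_map(initial_map: np.ndarray, center_offset: tuple) -> np.ndarray:
    """Resample initial_map about a center shifted by (dy, dx), clamping at edges."""
    dy, dx = center_offset
    h, w = initial_map.shape
    ys = np.clip(np.arange(h) + dy, 0, h - 1)
    xs = np.clip(np.arange(w) + dx, 0, w - 1)
    return initial_map[np.ix_(ys, xs)]

initial = np.zeros((8, 8), dtype=np.uint8)
initial[2:6, 2:6] = 255                 # stand-in for the pupil portion of the map
shifted = sample_map(initial, (1, 2))   # state-dependent shift of the sampling center
print(shifted)
```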
The embodiment of the application provides a terminal device, which comprises:
a memory for storing executable instructions;
a processor, configured to implement the picture display method for a virtual scene provided by the embodiments of the present application when executing the executable instructions stored in the memory.
An embodiment of the present application provides a computer readable storage medium storing executable instructions that, when executed by a processor, implement the picture display method for a virtual scene described above.
The embodiment of the application has the following beneficial effects:
During aiming through the sighting telescope included in the shooting prop, in addition to the view field picture of the virtual scene, a pupil formed by optical imaging of the outer frame of the sighting telescope is displayed between the frame of the sighting telescope and the view field picture; when the state of the sighting telescope or the state of the virtual object changes, the display effect of the pupil is controlled to change synchronously. In this way, a real optical phenomenon from real life is introduced into aiming in the virtual scene, so that a real and vivid scene picture can be displayed and the sense of presence in the virtual scene is improved.
Drawings
Fig. 1A is an application mode schematic diagram of a method for displaying a picture of a virtual scene according to an embodiment of the present application;
Fig. 1B is an application mode schematic diagram of a method for displaying a picture of a virtual scene according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a terminal device 400 according to an embodiment of the present application;
Fig. 3 is a flowchart illustrating a method for displaying a picture of a virtual scene according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a composition structure of a telescope according to an embodiment of the present application;
FIG. 5 is a schematic view of an aiming interface provided by an embodiment of the present application;
FIG. 6 is a schematic view of an aiming interface of a mirror opening process according to an embodiment of the present application;
FIG. 7 is a schematic view of an aiming interface according to an embodiment of the present application;
FIG. 8 is a schematic view of a telescope according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a mapping provided by an embodiment of the present application;
FIG. 10 is a schematic diagram of a map display provided by an embodiment of the present application;
FIG. 11 is a flow chart of a method of generating a map provided by an embodiment of the present application;
FIG. 12 is a diagram illustrating a map update according to an embodiment of the present application;
fig. 13 is a flowchart of a mapping updating method according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be construed as limiting the present application, and all other embodiments obtained by those of ordinary skill in the art without inventive effort fall within the protection scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is to be understood that "some embodiments" can be the same subset or different subsets of all possible embodiments and can be combined with one another without conflict.
In the following description, the term "first/second…" merely distinguishes similar objects and does not denote a particular ordering of objects. It is understood that, where permitted, "first/second…" may be interchanged in a particular order or sequence, so that the embodiments of the application described herein can be implemented in an order other than that illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the application only and is not intended to be limiting of the application.
Before the embodiments of the present application are described in further detail, the terms involved in the embodiments of the present application are explained; the following explanations apply to the terms used herein.
1) Client: an application program running in the terminal to provide various services, such as a video playing client or a game client.
2) In response to: indicates the condition or state on which a performed operation depends; when the condition or state on which it depends is satisfied, the operation or operations performed may be in real time or may have a set delay. Unless otherwise specified, there is no restriction on the order in which multiple performed operations are executed.
3) The virtual scene is a virtual scene displayed (or provided) when the application program runs on the terminal, and the virtual scene can be a simulation environment for a real world, a semi-simulation and semi-fictional virtual environment, or a pure fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene or a three-dimensional virtual scene, and the dimension of the virtual scene is not limited in the embodiment of the present application. For example, the virtual scene may include sky, land, sea, etc., the land may include environmental elements such as desert, city, etc., and the user may control the virtual object to move in the virtual scene.
4) Virtual objects, images of various people and objects in a virtual scene that can interact, or movable objects in a virtual scene. The movable object may be a virtual character, a virtual animal, a cartoon character, etc., such as a character, an animal, etc., displayed in a virtual scene. The virtual object may be an avatar in the virtual scene for representing a user. A virtual scene may include a plurality of virtual objects, each virtual object having its own shape and volume in the virtual scene, occupying a portion of space in the virtual scene.
5) Scene data representing feature data in a virtual scene may include, for example, the position of a virtual object in the virtual scene, time to wait when various functions are configured in the virtual scene (depending on the number of times the same function can be used in a specific time), and attribute values representing various states of the game virtual object, such as a life value and a magic value.
6) Diaphragm: an entity in an optical system that limits a light beam, for example an aperture stop that limits the imaging beam, or a field stop or frame that limits the imaging range. It may be the edge of a lens, a frame, or a specially provided screen with a hole. The aperture diaphragm, or the image of the aperture diaphragm, should lie outside the optical system, so that the pupil of the eye can coincide with it and a good observation effect is obtained.
7) Pupil: the image of the aperture diaphragm, divided into an entrance pupil and an exit pupil. The entrance pupil is the image of the aperture diaphragm formed by the optical system in front of it, called the entrance pupil for short; the exit pupil is the image of the aperture diaphragm formed by the optical system behind it, called the exit pupil for short.
The embodiments of the present application provide a picture display method, apparatus, terminal device, computer readable storage medium, and computer program product for a virtual scene, which can display a real and vivid scene picture to improve the sense of presence in the virtual scene. To make the picture display method of a virtual scene provided by the embodiments of the present application easier to understand, an exemplary implementation scenario is first described. The virtual scene in the picture display method provided by the embodiments of the present application may be output entirely by a terminal device, or output cooperatively by a terminal device and a server.
In some embodiments, the virtual scene may also be an environment for interaction of game characters; for example, game characters may fight in the virtual scene, and both parties may interact in the virtual scene by controlling the actions of their game characters, so that users can relieve the pressures of life during the game.
In an implementation scenario, referring to fig. 1A, fig. 1A is a schematic diagram of an application mode of the picture display method for a virtual scene provided by an embodiment of the present application, suitable for application modes in which the relevant data calculation of the virtual scene 100 can be completed entirely by the graphics processing hardware computing capability of the terminal device 400, for example games in stand-alone/offline mode, where output of the virtual scene is completed through various types of terminal devices 400 such as smart phones, tablet computers, and virtual reality/augmented reality devices. By way of example, the types of graphics processing hardware include the central processing unit (CPU, Central Processing Unit) and the graphics processor (GPU, Graphics Processing Unit).
When forming the visual perception of the virtual scene 100, the terminal device 400 calculates the data required for display through the graphics computing hardware, completes the loading, parsing, and rendering of the display data, and outputs, through the graphics output hardware, video frames capable of forming visual perception of the virtual scene; for example, two-dimensional video frames are presented on the display screen of a smart phone, or video frames realizing a three-dimensional display effect are projected on the lenses of augmented reality/virtual reality glasses. In addition, to enrich the perceived effect, the terminal device 400 may also form one or more of auditory perception, tactile perception, motion perception, and gustatory perception by means of different hardware.
As an example, the terminal device 400 runs a client 410 (e.g., a stand-alone game application) and outputs a virtual scene including role playing while the client 410 runs. The virtual scene may be an environment for game character interaction, such as a plain, street, or valley in which game characters fight. Taking display of the virtual scene 100 from the first-person perspective as an example, an interface 101 in which the virtual object aims using a sighting telescope included in a shooting prop is presented in the virtual scene 100. The virtual object may be a game character controlled by a user, that is, the virtual object is controlled by a real user and moves in the virtual scene 100 in response to the real user's operation of a controller (e.g., touch screen, voice-operated switch, keyboard, mouse, or joystick); for example, when the real user moves the joystick to the right, the virtual object moves to the right in the virtual scene 100. The virtual object may also remain stationary in place, jump, be controlled to perform a shooting operation, and so on.
For example, the interface 101 in which the virtual object aims using the sighting telescope included in the shooting prop is presented in the virtual scene 100; the frame 102 of the eyepiece of the sighting telescope is presented in the aiming interface 101; the inner area of the frame presents a view field picture 103 of the virtual scene seen by the virtual object through the sighting telescope; and a pupil 104 formed by optical imaging of the outer frame of the sighting telescope is displayed between the frame 102 and the view field picture 103. When the state of the sighting telescope or the state of the virtual object changes, the display effect of the pupil is controlled to change synchronously so as to adapt to the changed state; for example, when the color of the lens barrel of the sighting telescope is black, a black pupil is displayed between the frame 102 and the view field picture 103, and when the color of the lens barrel is switched from black to red, the color of the pupil displayed between the frame 102 and the view field picture 103 is switched from black to red. In this way, compared with the related art in which the entire area within the frame 102 presents the view field picture, a pupil corresponding to the sighting telescope is also presented between the frame 102 and the view field picture 103, forming a realistic stereoscopic impression and improving the user's sense of presence in the virtual scene.
In another implementation scenario, referring to fig. 1B, fig. 1B is a schematic diagram of an application mode of the picture display method for a virtual scene provided by an embodiment of the present application, applied to a terminal device 400 and a server 200 and suitable for application modes that rely on the computing capability of the server 200 to complete the calculation of the virtual scene and output the virtual scene at the terminal device 400.
Taking the formation of visual perception of the virtual scene 100 as an example, the server 200 calculates display data related to the virtual scene (such as scene data) and sends the calculated display data to the terminal device 400 through the network 300; the terminal device 400 completes the loading, parsing, and rendering of the calculated display data by means of graphics computing hardware, and outputs the virtual scene by means of graphics output hardware to form visual perception; for example, two-dimensional video frames may be presented on the display screen of a smart phone, or video frames realizing a three-dimensional display effect may be projected on the lenses of augmented reality/virtual reality glasses. As for perception in other forms, it is understood that auditory perception may be formed by means of the corresponding hardware output of the terminal device 400, and tactile perception may be formed using a vibrator, etc.
As an example, the terminal device 400 runs a client 410 (e.g., a network version of a game application) and performs game interaction with other users through the connected server 200 (e.g., a game server); the terminal device 400 outputs the virtual scene 100 of the client 410. Taking display of the virtual scene 100 from the first-person perspective as an example, an interface 101 in which the virtual object aims using a sighting telescope included in a shooting prop is presented in the virtual scene 100. The virtual object may be a game character controlled by a user, that is, the virtual object is controlled by a real user and moves in the virtual scene 100 in response to the real user's operation of a controller (e.g., touch screen, voice-operated switch, keyboard, mouse, or joystick); for example, when the real user moves the joystick to the right, the virtual object moves to the right in the virtual scene 100. The virtual object may also remain stationary in place, jump, be controlled to perform a shooting operation, and so on.
For example, the interface 101 in which the virtual object aims using the sighting telescope included in the shooting prop is presented in the virtual scene 100; the frame 102 of the eyepiece of the sighting telescope is presented in the aiming interface 101; the inner area of the frame presents a view field picture 103 of the virtual scene seen by the virtual object through the sighting telescope; and a pupil 104 formed by optical imaging of the outer frame of the sighting telescope is displayed between the frame 102 and the view field picture 103. When the state of the sighting telescope or the state of the virtual object changes, the display effect of the pupil is controlled to change synchronously so as to adapt to the changed state; for example, when the color of the lens barrel of the sighting telescope is black, a black pupil is displayed between the frame 102 and the view field picture 103, and when the color of the lens barrel is switched from black to red, the color of the pupil displayed between the frame 102 and the view field picture 103 is switched from black to red. In this way, compared with the related art in which the entire area within the frame 102 presents the view field picture, a pupil corresponding to the sighting telescope is also presented between the frame 102 and the view field picture 103, forming a realistic stereoscopic impression and improving the user's sense of presence in the virtual scene.
In some embodiments, the terminal device 400 may implement the picture display method for a virtual scene provided by the embodiments of the present application by running a computer program. For example, the computer program may be a native program or a software module in an operating system; a native (Native) application (APP, APPlication), that is, a program that needs to be installed in the operating system to run, such as a shooting game APP (i.e., the client 410 described above); an applet, that is, a program that only needs to be downloaded into a browser environment to run; or a game applet that can be embedded in any APP. In general, the computer program may be any form of application, module, or plug-in.
Taking the computer program being an application as an example, in actual implementation the terminal device 400 installs and runs an application supporting virtual scenes. The application may be any one of a first-person shooting game (FPS, First-Person Shooting game), a third-person shooting game, a virtual reality application, a three-dimensional map program, or a multiplayer gunfight survival game. The user uses the terminal device 400 to operate a virtual object located in the virtual scene to perform activities, including but not limited to: at least one of adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing, or building a virtual building. Illustratively, the virtual object may be a virtual character, such as a simulated character or a cartoon character.
In other embodiments, the embodiments of the present application may also be implemented by means of cloud technology (Cloud Technology), which refers to a hosting technology that unifies a series of resources such as hardware, software, and networks in a wide area network or a local area network to realize the calculation, storage, processing, and sharing of data.
Cloud technology is a general term for network technology, information technology, integration technology, management platform technology, application technology, and the like applied based on the cloud computing business model; it can form a resource pool and be used on demand flexibly and conveniently. Cloud computing technology will become an important support, since the background services of technical network systems require a large amount of computing and storage resources.
For example, the server 200 in fig. 1B may be an independent physical server, or may be a server cluster or a distributed system formed by a plurality of physical servers, or may be a cloud server that provides cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs, and basic cloud computing services such as big data and artificial intelligence platforms. The terminal device 400 may be, but is not limited to, a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, etc. The terminal device 400 and the server 200 may be directly or indirectly connected through wired or wireless communication, which is not limited in the embodiment of the present application.
The structure of the terminal device 400 shown in fig. 1A is described below. Referring to fig. 2, fig. 2 is a schematic structural diagram of a terminal device 400 provided by an embodiment of the present application. The terminal device 400 shown in fig. 2 includes: at least one processor 420, a memory 460, at least one network interface 430, and a user interface 440. The components of the terminal device 400 are coupled together by a bus system 450. It is understood that the bus system 450 is used to implement connection and communication between these components. In addition to a data bus, the bus system 450 includes a power bus, a control bus, and a status signal bus. For clarity of illustration, however, the various buses are all labeled as the bus system 450 in fig. 2.
The processor 420 may be an integrated circuit chip having signal processing capabilities, such as a general-purpose processor (for example a microprocessor or any conventional processor), a digital signal processor (DSP, Digital Signal Processor), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
The user interface 440 includes one or more output devices 441 that enable presentation of media content, including one or more speakers and/or one or more visual displays. The user interface 440 also includes one or more input devices 442, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
Memory 460 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard drives, optical drives, and the like. Memory 460 optionally includes one or more storage devices physically remote from processor 420.
The memory 460 includes volatile memory or non-volatile memory, and may also include both volatile and non-volatile memory. The non-volatile memory may be a read-only memory (ROM, Read Only Memory), and the volatile memory may be a random access memory (RAM, Random Access Memory). The memory 460 described in the embodiments of the present application is intended to include any suitable type of memory.
In some embodiments, memory 460 is capable of storing data to support various operations, examples of which include programs, modules and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 461 including system programs for handling various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and handling hardware-based tasks;
A network communication module 462, for reaching other computing devices via one or more (wired or wireless) network interfaces 430; exemplary network interfaces 430 include: Bluetooth, wireless fidelity (Wi-Fi), universal serial bus (USB, Universal Serial Bus), etc.;
a presentation module 463 for enabling presentation of information (e.g., a user interface for operating peripheral devices and displaying content and information) via one or more output devices 441 (e.g., a display screen, speakers, etc.) associated with the user interface 440;
an input processing module 464 for detecting one or more user inputs or interactions from one of the one or more input devices 442 and translating the detected inputs or interactions.
In some embodiments, the picture display apparatus for a virtual scene provided by the embodiments of the present application may be implemented in software. Fig. 2 shows a picture display apparatus 465 for a virtual scene stored in the memory 460, which may be software in the form of a program, plug-in, or the like, and includes the following software modules: a first presenting module 4651, a second presenting module 4652, a third presenting module 4653, and a presentation control module 4654. These modules are logical and may therefore be arbitrarily combined or further split according to the functions implemented; the functions of each module are described below.
In other embodiments, the picture display apparatus for a virtual scene provided by the embodiments of the present application may be implemented in hardware. By way of example, the apparatus may be a processor in the form of a hardware decoding processor programmed to perform the picture display method for a virtual scene provided by the embodiments of the present application; for example, the processor in the form of a hardware decoding processor may use one or more application-specific integrated circuits (ASICs, Application Specific Integrated Circuits), DSPs, programmable logic devices (PLDs, Programmable Logic Devices), complex programmable logic devices (CPLDs, Complex Programmable Logic Devices), field-programmable gate arrays (FPGAs, Field-Programmable Gate Arrays), or other electronic components.
The method for displaying the virtual scene provided by the embodiment of the application is specifically described below with reference to the accompanying drawings. The method for displaying the virtual scene provided by the embodiment of the application can be independently executed by the terminal equipment 400 in fig. 1A, or can be cooperatively executed by the terminal equipment 400 and the server 200 in fig. 1B.
Next, a picture presentation method of a virtual scene provided by the embodiment of the present application is described by taking a terminal device 400 in fig. 1A as an example. Referring to fig. 3, fig. 3 is a flowchart of a method for displaying a picture of a virtual scene according to an embodiment of the present application, and will be described with reference to the steps shown in fig. 3.
It should be noted that the method shown in fig. 3 may be executed by various computer programs running on the terminal device 400, and is not limited to the above-mentioned client 410, but may also be the operating system 461, software modules and scripts described above, and therefore the client should not be considered as limiting the embodiments of the present application.
Step 101: the terminal presents, from a first-person perspective, an interface in which a virtual object aims using a sighting telescope included in a shooting prop.
The sighting telescope is used to improve the accuracy with which the shooting prop hits its target; through the sighting telescope, the virtual object can observe the scene picture of the virtual scene.
Referring to fig. 4, fig. 4 is a schematic diagram of the composition of a sighting telescope provided by an embodiment of the present application. As shown in fig. 4, the sighting telescope includes an objective lens, an image-inverting group, adjusting handwheels, a lens barrel, an eyepiece, and other components. The objective lens is located at the front end of the sighting telescope and is the component that receives external light; the larger the diameter of the objective lens, the more light it can receive, so at the same distance a virtual object using the sighting telescope sees a clearer image. To gather more light, a fluoride coating may be applied to the surface of the objective lens to increase light transmission and reduce reflection; if purple or yellow reflections are seen on the objective lens, they are caused by this coating. The adjusting handwheels are located in the middle of the lens barrel and include the windage handwheel of the sighting telescope and the focusing handwheel of the objective lens; the windage handwheel adjusts the horizontal direction to correct for wind drift and the lead of a moving target, while the focusing handwheel makes the image clearer and reduces error. The image-inverting group is used to erect the image: a convex lens forms an inverted, magnified real image beyond twice the focal length, and because the distance to the pupil is too small, the eyepiece would otherwise show a magnified, inverted virtual image; the image-inverting group therefore erects the magnified, inverted image seen through the eyepiece. The lens barrel houses the image-inverting group, the adjusting handwheels, and other components, protecting them and transmitting light; the larger the diameter of the lens barrel, the greater the light brightness and the smaller the refraction angle, so the image is clearer. The eyepiece is located at the rearmost end of the sighting telescope; it further magnifies the image already magnified by the objective lens and transmits it into the human eye.
Step 102: display, in the interface, the frame of the eyepiece of the sighting telescope, the inner area of the frame displaying a view field picture of the virtual scene seen by the virtual object through the sighting telescope.
Step 103: display, between the frame and the view field picture, a pupil formed by optical imaging of the outer frame of the sighting telescope.
Referring to fig. 5, fig. 5 is a schematic diagram of an aiming interface provided by an embodiment of the present application. The interface 501 in which the virtual object aims using the sighting telescope is presented from a first-person perspective; the frame 502 of the eyepiece is presented in the interface 501, and the inner area of the frame 502 presents a view field picture 503 of the virtual scene seen by the virtual object through the sighting telescope, that is, the view field picture 503 displays the scene elements of the virtual scene at the corresponding position seen by the virtual object through the sighting telescope, such as virtual trees, stones, and grassland. A pupil, such as a black ring 504 of a certain width, is displayed between the frame 502 and the view field picture 503, simulating a realistic stereoscopic impression.
In some embodiments, the terminal may present, from a first-person perspective, the interface in which the virtual object aims using the sighting telescope included in the shooting prop as follows: in response to a mirror opening instruction for the shooting prop, a mirror opening operation for the sighting telescope is performed; during execution of the mirror opening operation for the sighting telescope, the process in which the aiming interface corresponding to the sighting telescope is gradually opened is displayed from a first-person perspective. Correspondingly, the terminal may display the pupil formed by optical imaging of the outer frame of the sighting telescope between the frame and the view field picture as follows: during execution of the mirror opening operation for the sighting telescope, as the mirror opening operation proceeds, the process in which the pupil formed by optical imaging of the outer frame of the sighting telescope changes from partial to full is displayed between the frame and the view field picture.
The mirror opening operation refers to the process of moving the sighting telescope included in the shooting prop to the sight position of the virtual object, and is also referred to as an aiming operation or a focusing operation. For example, referring to fig. 6, fig. 6 is a schematic diagram of the aiming interface during the mirror opening process provided by an embodiment of the present application. During execution of the mirror opening operation, the aiming interface for viewing the virtual scene through the sighting telescope is gradually opened, that is, the view field picture in the aiming interface is displayed going from absent to full; correspondingly, as the mirror opening operation is executed, the pupil changes from partial to full, and when the mirror opening operation is completed, the pupil is uniformly distributed between the frame and the view field picture.
In some embodiments, if the user cancels the mirror opening operation during its execution, the aiming interface is no longer displayed and the perspective is switched, for example from the first-person perspective to the third-person perspective, presenting a picture of the virtual scene from the third-person perspective.
In some embodiments, the sighting telescope further comprises an objective lens and a lens barrel, the objective lens and the eyepiece being located at the two ends of the lens barrel respectively. The terminal may display the pupil formed by optical imaging of the outer frame of the sighting telescope between the frame and the view field picture as follows: a pupil formed by optical imaging of the lens barrel of the sighting telescope is displayed between the frame and the view field picture, the color of the pupil being consistent with the color of the lens barrel.
In general, the lens barrel connecting the eyepiece and the objective lens affects the color of the pupil. To give the user a sense of realism, the pupil may be displayed in the color corresponding to the lens barrel; for example, if the lens barrel is black, a black pupil is displayed between the frame and the view field picture.
In some embodiments, the terminal may switch the color of the pupil as follows: in response to a color switching instruction for the lens barrel, the current color of the frame of the eyepiece in the interface is switched to a target color, and the color of the pupil is switched to the target color.
The color switching instruction may be triggered by a color switching control presented in the aiming interface, or by a color switching button located on the shooting prop. In some embodiments, multiple colors may be preset for the lens barrel of the sighting telescope and arranged in a certain order; when the color of the pupil is switched, the colors may be switched in the preset order. For example, if the current lens barrel is black, the frame of the eyepiece in the aiming interface is black and the pupil is also black; if red follows black in the arrangement, then when the terminal receives a color switching instruction, the color of the eyepiece frame in the aiming interface is automatically switched from black to red, and the color of the pupil is switched from black to red. In other embodiments, when the terminal receives the color switching instruction, at least two options may be presented for the user to select a color; when the user selects the option corresponding to a target color, the terminal, in response to the selection operation, automatically switches the current color of the eyepiece frame in the aiming interface to the selected target color, and switches the color of the pupil from the current color to the target color.
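A sketch of cycling through a preset ordered palette on each color switching instruction is given below; the palette itself is an assumption.

```python
PALETTE = ["black", "red", "blue", "green"]  # assumed preset order

def next_color(current: str) -> str:
    """Advance to the next preset color, wrapping around at the end."""
    return PALETTE[(PALETTE.index(current) + 1) % len(PALETTE)]

color = "black"
color = next_color(color)  # frame and pupil both switch to this color
print(color)               # red
```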
In some embodiments, the terminal may display the pupil formed by optical imaging of the outer frame of the sighting telescope between the frame and the view field picture as follows: acquiring the material of the objective lens and the material of the eyepiece of the sighting telescope; determining an adapted transparency based on the two materials; and displaying, between the frame and the view field picture, the pupil formed by optical imaging of the outer frame of the sighting telescope at that transparency.
The transparency of the pupil is related to the materials of the eyepiece and the objective lens: lenses of different materials have different absorbance and transmittance. To make the pupil observed through the sighting telescope more realistic, the corresponding transparency can be determined from the materials of the objective lens and the eyepiece, and the pupil displayed at that transparency.
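As a minimal sketch, one way to derive the adapted transparency is to summarize each lens material by a single transmittance factor and combine the two lenses multiplicatively; the material names and values below are illustrative assumptions.

```python
# Assumed per-material transmittance factors; real lenses would have
# wavelength-dependent absorbance and transmittance.
TRANSMITTANCE = {
    "crown_glass": 0.92,
    "flint_glass": 0.88,
    "fluoride_glass": 0.95,
}

def pupil_transparency(objective_material: str, eyepiece_material: str) -> float:
    """Light passes through both lenses, so their transmittances multiply;
    the product serves as the adapted transparency of the displayed pupil."""
    return TRANSMITTANCE[objective_material] * TRANSMITTANCE[eyepiece_material]

print(pupil_transparency("crown_glass", "flint_glass"))  # 0.8096
```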
In some embodiments, the terminal may display the pupil formed by optical imaging of the outer frame of the sighting telescope between the frame and the view field picture as follows: while the sighting telescope or the virtual object is in a stationary state, displaying between the frame and the view field picture a pupil in a first state formed by optical imaging of the outer frame of the sighting telescope. The first state is static: while the pupil is in it, the pupil comprises two concentric rings, where the pupil color within the outer ring is uniform, and the pupil color within the inner ring gradually fades along the radial direction from the edge of the inner ring toward the center of the eyepiece.
For example, in fig. 5, when the virtual object using the sighting telescope, or the telescope itself, is stationary, light in the virtual scene enters the telescope parallel to its axis; the objective lens and the eyepiece lie on the same axis, the light is not blocked, and the pupil observed through the telescope exhibits a gradient effect. As shown, the pupil divides into two concentric rings: the pupil color in the outer ring is unchanged, while in the inner ring it fades gradually along the radial direction toward the center of the eyepiece.
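The first (static) state can be sketched as a pupil opacity that depends only on the normalized distance from the eyepiece center: zero inside the view field picture, a linear fade across the inner ring, and a uniform outer ring. The ring radii and the linear fade are illustrative assumptions.

```python
def pupil_alpha(r: float, inner_edge: float = 0.7, outer_edge: float = 0.85) -> float:
    """Pupil opacity at normalized distance r from the eyepiece center (0..1).

    r < inner_edge         -> view field picture, no pupil (alpha 0)
    inner_edge..outer_edge -> inner ring: color fades in toward the rim
    r >= outer_edge        -> outer ring: uniform, fully opaque
    """
    if r < inner_edge:
        return 0.0
    if r < outer_edge:
        return (r - inner_edge) / (outer_edge - inner_edge)  # radial fade
    return 1.0

for r in (0.5, 0.75, 0.8, 0.9):
    print(r, round(pupil_alpha(r), 2))  # 0.0, 0.33, 0.67, 1.0
```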
In some embodiments, the terminal may display the pupil formed by optical imaging of the outer frame of the sighting telescope between the frame and the view field picture as follows: while the virtual object is in a traveling state, displaying a pupil in a second state formed by optical imaging of the outer frame of the sighting telescope. The second state is dynamic: while the pupil is in it, the pupil partially masks the view field picture, and the size of the masked portion changes.
When the virtual object is traveling, the shooting prop shakes, so light in the virtual scene can no longer enter the sighting telescope parallel to its axis; the light enters obliquely and is partly blocked by the lens barrel, causing imaging differences. For example, referring to fig. 7, fig. 7 is a schematic view of the aiming interface in an embodiment of the present application: while the virtual object travels, the pupil seen in the aiming interface masks part of the view field picture, and the size of the masked portion changes dynamically with the object's movement.
In some embodiments, while displaying the pupil in the second state, the terminal may also dynamically show the proportion of the view field picture masked by the pupil. This mask ratio indicates how far the direction of the shooting prop deviates from the line of sight of the virtual object, prompting the player to adjust the prop's direction.
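A sketch of one way to compute the mask ratio, assuming the deviation is measured as the angle between the shooting prop's facing direction and the virtual object's line of sight and mapped linearly onto [0, 1]; the maximum angle is an assumed tuning constant.

```python
import math

def mask_ratio(prop_dir, sight_dir, max_angle_deg: float = 10.0) -> float:
    """Fraction of the view field picture masked by the pupil, growing
    linearly with the angular deviation and clamped to [0, 1]."""
    dot = sum(a * b for a, b in zip(prop_dir, sight_dir))
    norm = math.hypot(*prop_dir) * math.hypot(*sight_dir)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return min(angle / max_angle_deg, 1.0)

# A ~2.9 degree deviation masks about 29% of the view, hinting the player
# to bring the prop back in line with the sight direction.
print(round(mask_ratio((1.0, 0.0), (1.0, 0.05)), 2))  # 0.29
```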
In some embodiments, the terminal may also, in response to a continuous firing instruction for the shooting prop, control the prop to fire continuously at the object aimed at by the sighting telescope; the pupil is kept in a default state during the successive shots, so that its display does not change while the aimed object is being shot.
On the basis of displaying the pupil between the frame and the view field picture: when the virtual object shoots a target using the launching prop, the width of the pupil would ordinarily grow because of the viewing angle, and the sighting telescope may also shake during the scope-opening operation; in either case the presented pupil would change, which degrades aiming accuracy. According to the embodiment of the application, while the virtual object is controlled to shoot the aimed target with the launching prop, the pupil is held in a default state. For example, when the pupil is black, the width of the black edge is constrained within a fixed range so that it cannot mask an oversized portion of the view field picture. This avoids the negative influence of the pupil effect on aiming and preserves the balance between the realism and the playability of the virtual scene.
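The clamp can be sketched as below, assuming the black-edge width is expressed as a fraction of the eyepiece radius and pinned inside a narrow band while the aimed object is being shot; all numeric values are illustrative assumptions.

```python
DEFAULT_WIDTH = 0.08  # assumed nominal black-edge width (fraction of radius)
CLAMP_BAND = 0.01     # assumed variation permitted while firing

def displayed_width(raw_width: float, firing: bool) -> float:
    """Width actually rendered for the pupil's black edge."""
    if not firing:
        return raw_width  # follow the scope/object state freely
    lo, hi = DEFAULT_WIDTH - CLAMP_BAND, DEFAULT_WIDTH + CLAMP_BAND
    return max(lo, min(raw_width, hi))  # hold near the default while firing

print(displayed_width(0.2, firing=True))   # 0.09: clamped, aim stays clear
print(displayed_width(0.2, firing=False))  # 0.2: free to widen when not firing
```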
Step 104: when the state of the sighting telescope or the state of the virtual object changes, the display effect of the pupil is controlled to change synchronously to adapt to the changed state.
After the pupil is displayed, whenever the state of the sighting telescope or of the virtual object changes, the display effect of the pupil is changed in synchronization to adapt to the new state.
In some embodiments, the terminal may, in response to a shooting instruction for the shooting prop triggered at a first moment, control the prop to shoot the object aimed at by the sighting telescope, and control the pupil to remain in a default state for a target period starting at the first moment, so that the display effect of the pupil is unchanged throughout that period.
Each time the shooting instruction is triggered, the shooting prop fires one bullet to shoot the aimed target object once. During the target period following the first shot, the pupil can be held in the default state; for example, when the pupil is black, the width of the black edge is constrained within a fixed range during that period so it cannot mask an oversized portion of the view field picture, meeting the user's need to shoot the target repeatedly within a short time.
In some embodiments, when a second moment arrives after the first moment, the interval between them equaling the target period, the default state of the pupil is released and the state of the sighting telescope and the state of the virtual object are acquired; the display effect of the pupil is then updated according to those states.
When the duration for which the pupil is held in the default state reaches the target duration, the default state is released and the display effect of the pupil is updated in real time according to the state of the sighting telescope and the state of the virtual object, so that the updated display matches their current states.
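The first-moment/second-moment behavior amounts to a small timer, sketched below; the class name, the target period value and the explicit time arguments are illustrative assumptions.

```python
class PupilState:
    """Holds the pupil in a default state for a target period after a shot."""

    def __init__(self, target_period: float = 0.5):
        self.target_period = target_period
        self.shot_time = None  # the first moment, set when a shot is fired

    def on_shot(self, now: float) -> None:
        self.shot_time = now  # the pupil enters the default state

    def in_default_state(self, now: float) -> bool:
        if self.shot_time is None:
            return False
        if now - self.shot_time >= self.target_period:
            self.shot_time = None  # second moment reached: release the default
            return False
        return True

p = PupilState()
p.on_shot(now=10.0)
print(p.in_default_state(10.2))  # True: inside the target period
print(p.in_default_state(10.6))  # False: default released, state re-read
```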
In some embodiments, the firing prop is further equipped with a side sighting telescope, and the terminal may switch the sighting telescope to the side sighting telescope in response to a scope switching instruction for the firing prop, the magnification of the side telescope differing from that of the current one; a first pupil formed by optical imaging of the outer frame of the side sighting telescope is then displayed in the aiming interface, the width of the first pupil corresponding to the magnification of the side telescope.
In practice, sighting telescopes of different magnifications have objective or eyepiece lenses of different diameters and lengths, and therefore form pupils of different widths. As shown in fig. 8, a schematic diagram of the sighting telescope according to an embodiment of the present application, after the current telescope is switched to a side telescope of a different magnification, the pupil corresponding to the magnification of the new side telescope is displayed in that telescope's aiming interface.
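In real optics the exit-pupil diameter is the objective diameter divided by the magnification, so a higher-magnification side telescope produces a smaller bright circle and hence a wider black edge; the sketch below applies that relation with assumed dimensions.

```python
def exit_pupil_mm(objective_diameter_mm: float, magnification: float) -> float:
    """Classical exit-pupil relation: objective diameter over magnification."""
    return objective_diameter_mm / magnification

def black_edge_width_mm(eyepiece_view_mm: float, objective_diameter_mm: float,
                        magnification: float) -> float:
    """Radial width left between the eyepiece view circle and the exit pupil."""
    bright = exit_pupil_mm(objective_diameter_mm, magnification)
    return max(0.0, (eyepiece_view_mm - bright) / 2)

print(black_edge_width_mm(12.0, 24.0, 4.0))  # 3.0 mm on a 4x scope
print(black_edge_width_mm(12.0, 24.0, 8.0))  # 4.5 mm on an 8x side scope
```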
In some embodiments, the terminal may also present at least two magnification options in the aiming interface; based on these options, in response to a switching instruction for the sighting telescope, it cancels the aiming interface, displays the interface of the virtual scene, and controls the shooting prop to mount the target sighting telescope indicated by the instruction. Then, in response to a scope-opening instruction triggered from the virtual scene interface, it presents the aiming interface of the target telescope and displays in it a second pupil formed by optical imaging of the outer frame of the target telescope, the width of the second pupil corresponding to the target telescope's magnification.
In the interface of the virtual scene, the terminal may further present options for scope selection, with different options corresponding to telescopes of different magnifications. When the user selects the target option, the terminal receives a switching instruction for the corresponding target telescope; in response, it cancels the current aiming interface and controls the shooting prop to mount the target telescope. When the scope-opening operation is performed for the target telescope, the corresponding aiming interface is presented together with the pupil associated with that telescope.
In some embodiments, the terminal may further acquire the light environment of the virtual scene in which the virtual object is located, light intensity differing between environments; when the light environment of the virtual object changes, the definition of the pupil changes synchronously to adapt to the new environment.
When the light environment of the virtual object changes, the visibility of the pupil can also change synchronously, in addition to its definition.
In some embodiments, before displaying the view field picture of the virtual scene seen by the virtual object through the sighting telescope, the terminal may create a patch matching the eyepiece of the telescope and draw a map for the patch, the map comprising a first portion corresponding to the view field picture and a second portion corresponding to the pupil, the first portion being a transparent color; the map and the patch are then combined into a target patch. Correspondingly, the terminal can display the pupil formed by optical imaging of the outer frame of the telescope between the frame and the view field picture as follows: acquiring a scene patch corresponding to the virtual scene and a frame patch corresponding to the outer frame of the telescope; and rendering the scene patch, the frame patch and the target patch in a preset rendering order, so that the pupil formed by optical imaging of the outer frame is displayed between the frame and the view field picture.
For example, the scene patch is rendered before the frame patch, and the frame patch before the target patch; rendering the three in this sequence displays the pupil formed by optical imaging of the outer frame of the sighting telescope between the frame and the view field picture.
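The preset order can be sketched as a fixed sequence of draw calls, assuming a renderer that composites each later patch over the earlier ones; the renderer interface and patch representation are illustrative assumptions.

```python
def render_frame(renderer, scene_patch, frame_patch, target_patch) -> None:
    """Draw in the preset order: later patches composite over earlier ones,
    so the target patch's transparent hole reveals the scene while its pupil
    region lands between the frame and the view field picture."""
    for patch in (scene_patch, frame_patch, target_patch):
        renderer.draw(patch)

class LogRenderer:
    def draw(self, patch) -> None:  # stand-in for a real draw call
        print("drawing", patch)

render_frame(LogRenderer(), "scene_patch", "frame_patch", "target_patch")
```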
In some embodiments, the terminal may draw the map for the patch as follows: acquiring an initial map for the patch, the initial map comprising the first portion and an initial portion corresponding to the pupil, where the color values of the pixels in the initial portion correspond to the default state of the pupil; determining the state of the sighting telescope and the state of the virtual object, and determining the center point of pixel sampling from those states; and, based on the center point, sampling pixels in the initial map to obtain the map for the patch.
In practice, the pupil has attributes such as width, color, gradient and transparency; to achieve a given pupil effect, the corresponding attributes are set. For example, the color value of each pixel in the initial map of the patch corresponds to the default state of the pupil, such as the transparent first portion, the width of the pupil (the width of the black edge) and the gradient color (the gradient of the black edge). The state of the sighting telescope and of the virtual object are then determined, and from them the center point for pixel sampling; finally, pixels are sampled in the initial map around that center point to obtain the map for the patch. For example, a circle is drawn with the center point as its center and the size of the first portion as its radius; the inside of this circle is the pure white part, that is, the area occupied by the view field picture. Two further circles drawn from the same center, with the first-portion size and the sum of the first-portion size and the gradient size as radii, yield a concentric ring: the inside of the ring is the gradient area, and the remainder is the black area.
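The construction just described can be sketched as below: a transparent disc for the view field picture, a concentric gradient ring, and an opaque black remainder; the map size, radii and RGBA values are illustrative assumptions.

```python
def draw_map(size: int = 64, fov_radius: float = 22.0, gradient: float = 6.0):
    """Return a size x size grid of (r, g, b, a) pixels for the patch map."""
    cx = cy = size / 2.0  # sampling center: map middle while the scope is at rest
    pixels = []
    for y in range(size):
        row = []
        for x in range(size):
            d = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
            if d < fov_radius:                  # first portion: transparent disc,
                row.append((255, 255, 255, 0))  # the view field picture shows through
            elif d < fov_radius + gradient:     # concentric ring: gradient area
                a = int(255 * (d - fov_radius) / gradient)
                row.append((0, 0, 0, a))
            else:                               # remainder: opaque black edge
                row.append((0, 0, 0, 255))
        pixels.append(row)
    return pixels

m = draw_map()
print(m[32][32], m[32][0])  # center transparent, edge fully opaque
```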
In practice, when the virtual object or the sighting telescope is in motion, the coordinates used during map sampling can be modified, for example sampling around a point other than the original center. The pupil then changes with the virtual object: while the object moves to the left, more pupil should appear on the left, so the sampling center is likewise shifted left according to the object's current motion state before sampling. This yields an updated map for the patch, and rendering based on the updated map presents the desired pupil effect; for example, as the scope-opening operation proceeds, the aiming interface presents the pupil changing from partial to full.
An exemplary application of the embodiment of the present application in a practical scenario is described below, taking a game as the virtual scene. When the virtual object is controlled to observe the game picture through the sighting telescope of a shooting prop, displaying a realistic, vivid picture improves the player's sense of immersion. The embodiment of the application therefore provides a picture display method for the virtual scene in which a first-person perspective presents the interface in which the virtual object aims with the sighting telescope included in the shooting prop; when the state of the telescope or of the virtual object changes, the display effect of the pupil is controlled to change synchronously to adapt to it, reproducing the pupil effect produced by real optics. In the following, a pupil with a black edge of a certain width is used as the example.
As shown in fig. 6, while the virtual object is controlled to execute the scope-opening operation, the aiming interface through which it views the virtual scene is gradually opened; that is, the view field picture in the aiming interface goes from absent to full. Correspondingly, as the operation is executed, the pupil (black edge) presented in the aiming interface changes from partial to full, and when the operation completes, the pupil (black edge) is uniformly distributed between the frame and the view field picture.
As shown in fig. 5, when the sighting telescope is stationary, light in the virtual scene enters it parallel to its axis; the objective lens and the eyepiece lie on the same axis, the light is unobstructed, and the pupil (black edge) observed through the telescope shows a gradient effect. As illustrated, the black edge divides into two concentric rings: its color in the outer ring is unchanged, while in the inner ring it fades gradually along the radial direction toward the center of the eyepiece.
As shown in fig. 7, when the virtual object is traveling, the shooting prop shakes, so the light cannot enter the sighting telescope parallel to its axis; it enters obliquely and is partly blocked by the lens barrel, causing imaging differences. In this case the pupil (black edge) seen in the aiming interface masks part of the view field picture, and the size of the masked portion changes dynamically as the object travels.
To enrich the display modes of the pupil, the embodiment of the application also provides selectable debugging options, such as the pupil's color, width, gradient and transparency. The color of the pupil matches the color of the lens barrel: for example, a black barrel yields a black pupil displayed between the frame and the view field picture, referred to as the black edge for short. The width and gradient of the pupil are related to the incident angle of the light; its transparency is related to the materials of the eyepiece and the objective lens, whose absorbance and transmittance differ by material, so the pupil is displayed at the transparency determined from those materials.
In practice, on the basis of displaying the pupil (black edge) between the frame and the view field picture, when the virtual object is controlled to shoot a target with the launching prop, the width of the pupil would ordinarily grow because of the viewing angle; if the presented pupil (black edge) changed in sync, the change would degrade aiming accuracy, so the pupil is held in its default state during shooting, as described above.
During rendering, to present the pupil (black edge) effect, a patch matching the eyepiece of the sighting telescope can be created and a map drawn for it, the map comprising a first portion corresponding to the view field picture and a second portion corresponding to the pupil (black edge), the first portion being a transparent color; the map and the patch are combined into a target patch. A scene patch corresponding to the virtual scene and a frame patch corresponding to the outer frame of the telescope are acquired, and the scene, frame and target patches are rendered in the preset order. The rendered interface, that is, the interface aimed through the sighting telescope, presents the frame of the eyepiece with the view field picture in its inner area, so that the pupil (black edge) is displayed between the frame and the view field picture.
Referring to fig. 9, fig. 9 is a schematic diagram of the map provided by an embodiment of the present application. In fig. 9, the white portion is the transparent first portion and the black portion is the opaque second portion. With the sighting telescope in its normal state, a circular patch is centered on the middle of the map, with the map's side length (width and height are equal) as diameter; color values are sampled from the map, and the gray portion in between is the transitional effect between the pupil (black edge) and the view field seen through the telescope.
In practice, because the design calls for modifiable pupil (black edge) width, gradient and transparency, the actual sampling of the map can be modified dynamically at sampling time, according to the position of the current sampling coordinate and the attribute values configured for each requirement, to generate a suitable map. As shown in fig. 10, a schematic diagram of the map display provided by an embodiment of the present application, the map in fig. 10 is generated from attributes such as the black-edge width, gradient and transparency.
Referring to fig. 11, fig. 11 is a flowchart of a method for generating a map according to an embodiment of the present application, where the method includes:
Step 201: obtain the initial map corresponding to the patch.
Here, the color value of each pixel in the initial map corresponds to the default state of the pupil, such as the transparent first portion (the white edge in fig. 10), the black-edge size (corresponding to the width of the black edge) and the gradient color size (corresponding to the gradient of the black edge).
Step 202: determine the state of the sighting telescope and the state of the virtual object, and determine the center point of pixel sampling from those states.
Step 203: based on the center point, sample pixels in the initial map to obtain the map corresponding to the patch.
Here, a circle is drawn with the center point as its center and the white-edge size as its radius; the inside of this circle is the pure white part, that is, the area occupied by the view field picture. Two circles drawn from the same center, with the white-edge size and the sum of the white-edge size and the gradient size as radii, yield a concentric ring: the inside of the ring is the gradient area, and the remainder is the black area.
In practice, when the virtual object or the sighting telescope is in motion, the pupil can be made to change with the object by modifying the coordinates used during map sampling, for example by sampling around a point other than the center. As shown in fig. 12, a schematic diagram of map updating provided by an embodiment of the present application: when the virtual object moves to the left, the black edge of the pupil should be greater on the left, so the sampling center is likewise shifted left according to the object's current motion state before sampling, yielding an updated map for the patch. Rendering based on the updated map presents the desired pupil effect; for example, as the scope-opening operation proceeds, the aiming interface presents the pupil (black edge) changing from partial to full.
Referring to fig. 13, fig. 13 is a flowchart of the map updating method provided by an embodiment of the present application, comprising:

Step 301: obtain the moving speed of the virtual object.

Step 302: based on the moving speed, determine the offset from the center point, the offset magnitude being proportional to the moving speed.

Step 303: determine the updated center point from the offset and the original pixel-sampling center point.

Step 304: based on the updated center point, sample pixels in the initial map to obtain the updated map corresponding to the patch.

After the updated map is obtained, rendering the picture based on it presents the required pupil effect.
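Steps 301 to 304 can be sketched as follows, assuming the offset is applied along the virtual object's velocity and scaled by an assumed proportionality constant; re-sampling the initial map around the shifted center then yields the updated map.

```python
OFFSET_PER_SPEED = 0.8  # assumed map-pixel shift per unit of moving speed

def updated_center(base_center, velocity):
    """Steps 301-303: the offset magnitude is proportional to the moving
    speed and is applied to the original pixel-sampling center; the document
    shifts the center the same way the object moves (leftward move, leftward
    shift), so the black edge thickens on that side after re-sampling."""
    bx, by = base_center
    vx, vy = velocity
    return (bx + vx * OFFSET_PER_SPEED, by + vy * OFFSET_PER_SPEED)

# Step 304 would re-sample the initial map around this point.
print(updated_center((32.0, 32.0), (-2.0, 0.0)))  # (30.4, 32.0): shifted left
```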
In this way, a real optical phenomenon from everyday life is introduced into the aiming interface during aiming through the sighting telescope, so that a realistic, vivid scope effect is displayed and a distinctive sense of in-game presence is created.
The following continues with an exemplary structure of the picture display device 465 of a virtual scene provided by an embodiment of the present application, implemented as software modules. In some embodiments, the software modules stored in the picture display device 465 of a virtual scene in the memory 460 of fig. 2 may include:
A first presenting module 4651, configured to present, from a first-person perspective, an interface in which the virtual object aims using a sighting telescope included in a shooting prop;
a second presenting module 4652, configured to present, in the interface, a bezel of an eyepiece in the scope, where a view field of a virtual scene seen by the virtual object through the scope is presented in an inner area of the bezel;
a third presenting module 4653, configured to display, between the frame and the view field frame, a pupil formed by optical imaging of an outer frame of the scope;
and the presentation control module 4654 is configured to control the display effect of the pupil to be changed synchronously when the state of the sighting telescope or the state of the virtual object is changed, so as to adapt to the changed state.
In some embodiments, the first presentation module is further for performing a scope opening operation for the scope in response to a scope opening instruction for the firing prop;
in the process of executing the opening operation of the sighting telescope, display, from a first-person view angle, the process of the aiming interface corresponding to the sighting telescope being gradually opened;
The third presentation module is also used for displaying, between the frame and the visual field picture, the process of the pupil formed by optical imaging of the outer frame of the sighting telescope changing from partial to full as the opening operation of the sighting telescope is executed.
In some embodiments, the sighting telescope further comprises an objective lens and a lens cone, wherein the objective lens and the ocular lens are respectively positioned at two ends of the lens cone;
the first presentation module is further configured to display, between the frame and the view field picture, a pupil formed by optical imaging of a barrel of the sighting telescope, where a color of the pupil is consistent with a color of the barrel.
In some embodiments, the apparatus further comprises:
And the color switching module is used for switching the current color of the frame of the ocular in the interface to a target color and switching the color of the pupil to the target color in response to a color switching instruction for the lens barrel.
In some embodiments, the third rendering module is further configured to obtain a material of an objective lens included in the scope and a material of the eyepiece;
determining an adapted transparency based on the material of the objective lens and the material of the eyepiece;
And displaying a pupil formed by optical imaging of the outer frame of the sighting telescope based on the transparency between the frame and the visual field picture.
In some embodiments, the third presenting module is further configured to display, between the frame and the field of view picture, a pupil in a first state formed by optically imaging an outer frame of the scope, in a process that the scope is in a stationary state or the virtual object is in a stationary state;
The first state is static, and when the pupil is in the first state, the pupil comprises two concentric circles, wherein the colors of the pupils in the outer circle are the same, and the colors of the pupils in the inner circle gradually fade along the radial direction from the edge of the inner circle to the center of the eyepiece.
In some embodiments, the third presenting module is further configured to display, during the process that the virtual object is in the traveling state, a pupil in a second state formed by optically imaging an outer frame of the scope;
Wherein the second state is dynamic, and the pupil part masks the view field picture and the size of the masked view field picture changes in the process that the pupil is in the second state.
In some embodiments, the apparatus further comprises:
And the fourth presentation module is used for dynamically displaying the mask proportion of the pupil to the visual field picture in the process of displaying the pupil in the second state.
In some embodiments, the apparatus further comprises:
a state holding module for controlling the shooting prop to continuously shoot the object aimed by the sighting telescope in response to a continuous shooting instruction for the shooting prop;
and in the process of continuously shooting the object, keeping the pupil in a default state so that the display effect of the pupil in the process of continuously shooting the object is unchanged.
In some embodiments, the apparatus further comprises:
a state control module for controlling the shooting prop to shoot the target aimed by the sighting telescope in response to the shooting command to the shooting prop triggered at the first moment, and
And controlling the pupil to be in a default state in a target period taking the first moment as a starting moment, so that the display effect of the pupil in the target period is unchanged.
In some embodiments, the apparatus further comprises:
A display updating module, configured to release the default state of the pupil when a second moment after the first moment arrives, the interval between the second moment and the first moment equaling the target period, and
Acquiring the state of the sighting telescope and the state of the virtual object;
And updating the display effect of the pupil according to the state of the sighting telescope and the state of the virtual object.
In some embodiments, the firing prop is further equipped with a side scope, the device further comprising:
a scope switching module for switching the scope to the side scope in response to a scope switching instruction for the firing prop, the magnification of the side scope being different from the scope;
and displaying an aiming interface corresponding to the side sighting telescope, and displaying, in the aiming interface, a first pupil formed by optical imaging of the outer frame of the side sighting telescope, wherein the width of the first pupil corresponds to the magnification of the side sighting telescope.
In some embodiments, the apparatus further comprises:
the magnification selection module is used for presenting at least two magnification options in the interface;
based on the at least two magnification options, responding to a switching instruction for the sighting telescope, canceling the display of the interface so as to display the interface of the virtual scene, and controlling the shooting prop to mount the target sighting telescope indicated by the switching instruction;
responding to a mirror opening instruction triggered based on the interface of the virtual scene, and presenting a sighting interface corresponding to the target sighting telescope;
And displaying a second pupil formed by optical imaging of the outer frame of the target sighting telescope in the sighting interface, wherein the width of the second pupil corresponds to the magnification of the target sighting telescope.
In the above scheme, the device further includes:
The definition adjusting module is used for acquiring light environments in the virtual scene where the virtual object is located, and the light intensities in different light environments are different;
When the light environment where the virtual object is located is changed, the definition of the pupil is synchronously changed so as to adapt to the changed light environment.
In some embodiments, before the view field picture of the virtual scene seen by the virtual object through the sighting telescope is displayed, the apparatus further comprises:
The surface patch determining module is used for creating a surface patch consistent with an eyepiece of the sighting telescope and drawing a mapping corresponding to the surface patch, wherein the mapping comprises a first part corresponding to the visual field picture and a second part corresponding to the pupil, and the first part is transparent;
combining the map with the patch to obtain a target patch;
the third presentation module is further configured to obtain a scene patch corresponding to the virtual scene and a frame patch corresponding to an outer frame of the sighting telescope;
And rendering the scene surface patch, the frame surface patch and the target surface patch according to a preset rendering sequence so as to display a pupil formed by optical imaging of the outer frame of the sighting telescope between the frame and the visual field picture.
In some embodiments, the patch determining module is further configured to obtain an initial map corresponding to the patch, where the initial map includes the first portion and an initial portion corresponding to the pupil, and a color value of each pixel in the initial portion corresponds to a default state of the pupil;
determining the state of the sighting telescope and the state of the virtual object, and determining a center point of pixel sampling according to the state of the sighting telescope and the state of the virtual object;
and based on the center point, sampling pixels in the initial mapping to obtain the mapping corresponding to the patch.
Embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of a computer device reads the computer instructions from the computer readable storage medium and executes them, causing the computer device to perform the picture display method of a virtual scene described in the embodiments of the present application above.
Embodiments of the present application provide a computer readable storage medium storing executable instructions that, when executed by a processor, cause the processor to perform a method for displaying a picture of a virtual scene provided by embodiments of the present application, for example, a method as shown in fig. 3.
In some embodiments, the computer readable storage medium may be FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, an optical disk, or a CD-ROM; it may also be any device including one of, or any combination of, the above memories.
In some embodiments, the executable instructions may be in the form of programs, software modules, scripts, or code, written in any form of programming language (including compiled or interpreted languages, or declarative or procedural languages), and they may be deployed in any form, including as stand-alone programs or as modules, components, subroutines, or other units suitable for use in a computing environment.
As an example, executable instructions may, but need not, correspond to files in a file system, may be stored as part of a file that holds other programs or data, such as in one or more scripts in a hypertext markup language (HTML, hyper Text Markup Language) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
As an example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices located at one site or distributed across multiple sites and interconnected by a communication network.
The foregoing is merely exemplary embodiments of the present application and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and scope of the present application are included in the protection scope of the present application.
Claims (19)
1. A picture presentation method of a virtual scene, the method comprising:
presenting, from a first-person perspective, an interface in which the virtual object aims using a sighting telescope included in a shooting prop;
Displaying a frame of an eyepiece in the sighting telescope in the interface, wherein a visual field picture of a virtual scene seen by the virtual object through the sighting telescope is displayed in an inner area of the frame;
Displaying a pupil formed by optical imaging of the outer frame of the sighting telescope between the frame and the view field picture;
when the state of the sighting telescope or the state of the virtual object is changed, controlling the display effect of the pupil to synchronously change so as to adapt to the changed state;
wherein the displaying, between the frame and the view field picture, of a pupil formed by optical imaging of the outer frame of the sighting telescope comprises the following steps:
acquiring a material of an objective lens included in the sighting telescope and a material of the ocular lens;
determining an adapted transparency based on the material of the objective lens and the material of the eyepiece;
And displaying a pupil formed by optical imaging of the outer frame of the sighting telescope based on the transparency between the frame and the visual field picture.
2. The method of claim 1, wherein presenting an interface for aiming a virtual object using a sighting telescope included in a shooting prop using a first person perspective comprises:
in response to a mirror-opening instruction for the shooting prop, performing a mirror-opening operation for the sighting telescope;
in the process of executing the opening operation of the sighting telescope, displaying, from a first-person view angle, the process of the aiming interface corresponding to the sighting telescope being gradually opened;
and the displaying, between the frame and the view field picture, of a pupil formed by optical imaging of the outer frame of the sighting telescope comprises:
and in the process of executing the opening operation of the sighting telescope, displaying, between the frame and the visual field picture, the process of the pupil formed by optical imaging of the outer frame of the sighting telescope changing from partial to full as the opening operation is executed.
3. The method of claim 1, wherein the scope further comprises an objective lens and a barrel, the objective lens and the eyepiece being located at two ends of the barrel, respectively;
and the displaying, between the frame and the view field picture, of a pupil formed by optical imaging of the outer frame of the sighting telescope comprises:
and displaying a pupil formed by optical imaging of a lens barrel of the sighting telescope between the frame and the view field picture, wherein the color of the pupil is consistent with that of the lens barrel.
4. A method as claimed in claim 3, wherein the method further comprises:
and responding to a color switching instruction aiming at the lens barrel, switching the current color of the frame of the ocular in the interface to a target color, and switching the color of the pupil to the target color.
5. The method of claim 1, wherein the displaying, between the frame and the view field picture, of a pupil formed by optical imaging of the outer frame of the sighting telescope comprises:
Displaying a pupil in a first state, which is formed by optical imaging of an outer frame of the sighting telescope, between the frame and the view field picture in the process that the sighting telescope is in a static state or the virtual object is in a static state;
The first state is static, and when the pupil is in the first state, the pupil comprises two concentric circles, wherein the colors of the pupils in the outer circle are the same, and the colors of the pupils in the inner circle gradually fade along the radial direction from the edge of the inner circle to the center of the eyepiece.
6. The method of claim 1, wherein the displaying, between the frame and the view field picture, of a pupil formed by optical imaging of the outer frame of the sighting telescope comprises:
displaying a pupil in a second state formed by optical imaging of an outer frame of the sighting telescope in the process that the virtual object is in a traveling state;
Wherein the second state is dynamic, and the pupil part masks the view field picture and the size of the masked view field picture changes in the process that the pupil is in the second state.
7. The method of claim 6, wherein the method further comprises:
the mask ratio of the pupil to the field of view picture is dynamically shown in displaying the pupil in the second state.
8. The method of claim 1, wherein the method further comprises:
in response to a continuous shooting instruction aiming at the shooting prop, controlling the shooting prop to continuously shoot the object aimed by the sighting telescope;
and in the process of continuously shooting the object, keeping the pupil in a default state so that the display effect of the pupil in the process of continuously shooting the object is unchanged.
9. The method of claim 1, wherein the method further comprises:
In response to a shooting instruction for the shooting prop triggered at a first moment, controlling the shooting prop to shoot an object aimed by the sighting telescope, and
And controlling the pupil to be in a default state in a target period taking the first moment as a starting moment, so that the display effect of the pupil in the target period is unchanged.
10. The method of claim 9, wherein the method further comprises:
when a second time after the first time arrives, the time interval between the second time and the first time is the same as the target period, releasing the default state of the pupil, and
Acquiring the state of the sighting telescope and the state of the virtual object;
And updating the display effect of the pupil according to the state of the sighting telescope and the state of the virtual object.
11. The method of claim 1, wherein the firing prop is further equipped with a side scope, the method further comprising:
Switching the scope to the side scope in response to a scope switching instruction for the firing prop, the magnification of the side scope being different from the scope;
and displaying an aiming interface corresponding to the side sighting telescope, and displaying, in the aiming interface, a first pupil formed by optical imaging of the outer frame of the side sighting telescope, wherein the width of the first pupil corresponds to the magnification of the side sighting telescope.
12. The method of claim 1, wherein the method further comprises:
In the interface, presenting at least two magnification options;
based on the at least two magnification options, responding to a switching instruction for the sighting telescope, canceling the display of the interface so as to display the interface of the virtual scene, and controlling the shooting prop to mount a target sighting telescope indicated by the switching instruction;
responding to a mirror opening instruction triggered based on the interface of the virtual scene, and presenting a sighting interface corresponding to the target sighting telescope;
And displaying a second pupil formed by optical imaging of the outer frame of the target sighting telescope in the sighting interface, wherein the width of the second pupil corresponds to the magnification of the target sighting telescope.
13. The method of claim 1, wherein the method further comprises:
acquiring light environments in a virtual scene where the virtual object is located, wherein the light intensities in different light environments are different;
When the light environment where the virtual object is located is changed, the definition of the pupil is synchronously changed so as to adapt to the changed light environment.
14. The method of claim 1, wherein before the displaying of the view field picture of the virtual scene seen by the virtual object through the sighting telescope, the method further comprises:
creating a patch consistent with an eyepiece of the sighting telescope, and drawing a mapping corresponding to the patch, wherein the mapping comprises a first part corresponding to the visual field picture and a second part corresponding to the pupil, and the first part is transparent;
combining the map with the patch to obtain a target patch;
and the displaying, between the frame and the view field picture, of a pupil formed by optical imaging of the outer frame of the sighting telescope comprises:
Acquiring a scene surface patch corresponding to the virtual scene and a frame surface patch corresponding to an outer frame of the sighting telescope;
And rendering the scene surface patch, the frame surface patch and the target surface patch according to a preset rendering sequence so as to display a pupil formed by optical imaging of the outer frame of the sighting telescope between the frame and the visual field picture.
15. The method of claim 14, wherein the drawing the corresponding map for the patch comprises:
Acquiring an initial mapping corresponding to the patch, wherein the initial mapping comprises the first part and an initial part corresponding to the pupil, and the color value of each pixel in the initial part corresponds to the default state of the pupil;
determining the state of the sighting telescope and the state of the virtual object, and determining a center point of pixel sampling according to the state of the sighting telescope and the state of the virtual object;
and based on the center point, sampling pixels in the initial mapping to obtain the mapping corresponding to the patch.
16. A visual presentation apparatus for a virtual scene, the apparatus comprising:
The first presenting module is used for presenting, from a first-person perspective, an interface in which the virtual object aims using a sighting telescope included in a shooting prop;
The second presentation module is used for presenting the frame of the ocular in the sighting telescope in the interface, and the internal area of the frame is provided with a visual field picture of the virtual scene seen by the virtual object through the sighting telescope;
The third presentation module is used for displaying a pupil formed by optical imaging of the outer frame of the sighting telescope between the frame and the view field picture;
the display control module is used for controlling the display effect of the pupil to synchronously change when the state of the sighting telescope or the state of the virtual object is changed so as to adapt to the changed state;
The third presentation module is further configured to obtain a material of an objective lens included in the sighting telescope and a material of the eyepiece; determining an adapted transparency based on the material of the objective lens and the material of the eyepiece; and displaying a pupil formed by optical imaging of the outer frame of the sighting telescope based on the transparency between the frame and the visual field picture.
17. A terminal device, comprising:
a memory for storing executable instructions;
A processor for implementing the picture presentation method of a virtual scene as claimed in any one of claims 1 to 15 when executing executable instructions stored in said memory.
18. A computer readable storage medium storing executable instructions for implementing the visual presentation method of a virtual scene according to any one of claims 1 to 15 when executed by a processor.
19. A computer program product comprising a computer program or instructions which, when executed by a processor, implements the method of visual presentation of a virtual scene as claimed in any one of claims 1 to 15.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2021112264782 | 2021-10-21 | ||
CN202111226478.2A CN113926194A (en) | 2021-10-21 | 2021-10-21 | Method, apparatus, device, medium, and program product for displaying picture of virtual scene |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114100134A CN114100134A (en) | 2022-03-01 |
CN114100134B true CN114100134B (en) | 2024-09-20 |
Family
ID=79280830
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111226478.2A Withdrawn CN113926194A (en) | 2021-10-21 | 2021-10-21 | Method, apparatus, device, medium, and program product for displaying picture of virtual scene |
CN202111672379.7A Active CN114100134B (en) | 2021-10-21 | 2021-12-31 | Picture display method, device, equipment, medium and program product of virtual scene |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111226478.2A Withdrawn CN113926194A (en) | 2021-10-21 | 2021-10-21 | Method, apparatus, device, medium, and program product for displaying picture of virtual scene |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN113926194A (en) |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108579083B (en) * | 2018-04-27 | 2020-04-21 | 腾讯科技(深圳)有限公司 | Virtual scene display method and device, electronic device and storage medium |
CN108671540A (en) * | 2018-05-09 | 2018-10-19 | 腾讯科技(深圳)有限公司 | Accessory switching method, equipment and storage medium in virtual environment |
CN110141869A (en) * | 2019-04-11 | 2019-08-20 | 腾讯科技(深圳)有限公司 | Method of controlling operation thereof, device, electronic equipment and storage medium |
CN112221134B (en) * | 2020-11-09 | 2022-05-31 | 腾讯科技(深圳)有限公司 | Virtual environment-based picture display method, device, equipment and medium |
Also Published As
Publication number | Publication date |
---|---|
CN114100134A (en) | 2022-03-01 |
CN113926194A (en) | 2022-01-14 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |