CN111467803A - In-game display control method and device, storage medium, and electronic device - Google Patents
- Publication number
- CN111467803A (application number CN202010255567.9A)
- Authority
- CN
- China
- Prior art keywords
- display
- game
- scene
- game scene
- auxiliary game
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
- A63F13/5255—Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1068—Input arrangements specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
- A63F2300/1075—Input arrangements using a touch screen
- A63F2300/80—Features specially adapted for executing a specific type of game
- A63F2300/807—Role playing or strategy games
Abstract
The disclosure relates to the technical field of games and provides an in-game display control method, an in-game display control apparatus, a computer storage medium, and an electronic device. The in-game display control method comprises the following steps: in response to a display trigger operation acting on the graphical user interface, acquiring a pre-created auxiliary game scene; generating a display interface according to the auxiliary game scene and the virtual character in the game; and, in response to a view-angle adjustment operation acting on the display interface, adjusting the display view angles of the auxiliary game scene and the virtual character in the display interface. This method solves two prior-art problems: the inability to open two scenes simultaneously to rapidly switch interfaces in battle, and the separation of background rotation from virtual-character rotation, thereby achieving synchronized transformation of the game scene and the virtual character.
Description
Technical Field
The present disclosure relates to the field of game technologies, and in particular, to a display control method in a game, a display control apparatus in a game, a computer storage medium, and an electronic device.
Background
With the development of computer technology, the game field has advanced rapidly, and players' expectations for a game's sense of immersion and display quality have gradually risen. Blending the detailed depiction of a game's virtual characters with their environment has therefore become a focus of attention for developers.
Currently, a 3D (three-dimensional) character is typically projected onto a UI (user interface), with the character's shadow generated procedurally while the background remains a 2D (two-dimensional) picture. Because the 2D background is not associated with the 3D model, the edges are hard and inconsistent with the environment, and the rotation of the character is decoupled from the rotation of the background.
In view of the above, there is a need in the art to develop a new display control method and device for a game.
It should be noted that the information disclosed in the Background section above is provided only to enhance understanding of the background of the present disclosure.
Disclosure of Invention
The present disclosure provides an in-game display control method, an in-game display control apparatus, a computer storage medium, and an electronic device, so as to avoid, at least to some extent, the prior-art defect that a scene and a virtual character cannot be changed synchronously.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided an in-game display control method in which a graphical user interface is provided through a terminal device, the method comprising: in response to a display trigger operation acting on the graphical user interface, acquiring a pre-created auxiliary game scene; generating a display interface according to the auxiliary game scene and the virtual character in the game; and, in response to a view-angle adjustment operation acting on the display interface, adjusting the display view angles of the auxiliary game scene and the virtual character in the display interface.
In an exemplary embodiment of the present disclosure, the graphical user interface includes at least one function control, and acquiring the pre-created auxiliary game scene in response to a display trigger operation acting on the graphical user interface includes: acquiring the pre-created auxiliary game scene in response to a first touch operation acting on the function control.
In an exemplary embodiment of the present disclosure, generating a display interface according to the auxiliary game scene and the virtual character in the game includes: generating the display interface corresponding to the function control according to the auxiliary game scene and the virtual character in the game.
In an exemplary embodiment of the present disclosure, a virtual camera is disposed in the auxiliary game scene, and generating a display interface according to the auxiliary game scene and the virtual character in the game includes: capturing a scene image of the auxiliary game scene through the virtual camera; and generating the display interface according to the scene image and the virtual character in the game.
In an exemplary embodiment of the present disclosure, the auxiliary game scene includes at least one of: the game scene area in which the virtual character is located in the main game scene of the game; a particular game scene area in the main game scene of the game; and a background scene associated with the main game scene in the game.
In an exemplary embodiment of the present disclosure, the view-angle adjustment operation includes a second touch operation acting on the display interface, and adjusting the display view angles of the auxiliary game scene and the virtual character in the display interface in response to the view-angle adjustment operation includes: acquiring an operation parameter of the second touch operation acting on the display interface; determining a view-angle adjustment parameter according to the operation parameter; and adjusting the display view angles of the auxiliary game scene and the virtual character in the display interface according to the view-angle adjustment parameter.
In an exemplary embodiment of the present disclosure, determining a view-angle adjustment parameter according to the operation parameter includes: determining a rotation parameter of the virtual camera disposed in the auxiliary game scene according to the operation parameter; and taking the rotation parameter as the view-angle adjustment parameter.
In an exemplary embodiment of the present disclosure, the second touch operation is a sliding operation, and the operation parameters include the touch-point movement distance and the sliding direction of the sliding operation. Determining a view-angle adjustment parameter according to the operation parameters includes: determining a rotation angle of the virtual camera disposed in the auxiliary game scene according to the touch-point movement distance; and determining the rotation direction of the virtual camera according to the sliding direction.
In an exemplary embodiment of the present disclosure, the second touch operation is a click operation or a re-click operation, and the operation parameters include the relative distance and the relative positional relationship between the touch point of the second touch operation and a preset reference position. Determining a view-angle adjustment parameter according to the operation parameters includes: determining a rotation angle of the virtual camera disposed in the auxiliary game scene according to the relative distance; and determining the rotation direction of the virtual camera according to the relative positional relationship.
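For the click case, one plausible reading (an assumption, since the embodiment gives no formula) maps the tap's signed offset from the reference position to an angle per axis, so the magnitude of the offset gives the rotation angle and its sign gives the rotation direction:

```python
import math

def rotation_from_tap(tap, ref, radius):
    # Angle magnitude from the distance between the touch point and the
    # preset reference position; rotation direction from which side of
    # the reference the touch point lies on (assumed mapping, treating
    # the offset as arc length on a circle of the camera's radius).
    dx, dy = tap[0] - ref[0], tap[1] - ref[1]
    yaw = math.degrees(dx / radius)    # signed: left/right of reference
    pitch = math.degrees(dy / radius)  # signed: above/below reference
    return yaw, pitch

yaw, pitch = rotation_from_tap(tap=(300.0, 500.0), ref=(200.0, 500.0), radius=200.0)
```

A tap 100 px to the right of the reference with a 200 px radius yields roughly a 28.6° yaw and no pitch change.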
In an exemplary embodiment of the present disclosure, the touch-point movement distance includes a horizontal movement distance and a vertical movement distance, and determining the rotation angle of the virtual camera disposed in the auxiliary game scene according to the touch-point movement distance includes: determining a first rotation angle of the virtual camera according to the horizontal movement distance and the rotation radius of the virtual camera; determining a second rotation angle of the virtual camera according to the horizontal movement distance, the vertical movement distance, and the rotation radius of the virtual camera; and taking the first rotation angle and the second rotation angle as the rotation angles of the virtual camera disposed in the auxiliary game scene.
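The two-angle computation above can be sketched by treating each movement distance as an arc length on a circle of the camera's rotation radius; the patent does not give the exact formula, so this mapping is an assumption:

```python
import math

def rotation_from_swipe(dx, dy, radius):
    """Map a swipe's horizontal (dx) and vertical (dy) touch-point
    movement distances to two camera rotation angles, given the
    virtual camera's rotation radius (assumed arc-length mapping)."""
    # First rotation angle from the horizontal movement distance alone.
    first = math.degrees(dx / radius)
    # Second rotation angle from both the horizontal and vertical
    # distances plus the radius, here via the total swipe length,
    # signed by the vertical direction.
    second = math.degrees(math.copysign(math.hypot(dx, dy), dy) / radius)
    return first, second

first, second = rotation_from_swipe(dx=0.0, dy=157.0, radius=300.0)
```

A purely vertical swipe leaves the first angle at zero and converts the swipe length into the second angle, consistent with the split described above.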
In an exemplary embodiment of the present disclosure, generating the display interface according to the scene image and the virtual character in the game includes: blurring the scene image; adjusting the display parameters of the blurred scene image to target display parameters; and rendering the adjusted scene image together with the virtual character in the game onto the graphical user interface to obtain the display interface.
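A toy illustration of the blur-then-adjust pipeline. A real engine would blur the camera's render texture on the GPU; here a grayscale box blur stands in for the blur step, and "target display parameters" are assumed to mean a simple brightness change:

```python
def box_blur(img, k=1):
    # Tiny grayscale box blur standing in for the blurring step;
    # img is a list of rows of pixel values.
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - k), min(h, y + k + 1))
                    for xx in range(max(0, x - k), min(w, x + k + 1))]
            out[y][x] = sum(vals) / len(vals)
    return out

def adjust_brightness(img, factor):
    # Stand-in for adjusting the blurred image's display parameters
    # to target values (e.g. dimming the background behind the UI).
    return [[min(255.0, p * factor) for p in row] for row in img]

blurred = box_blur([[0, 255, 0], [255, 0, 255], [0, 255, 0]], k=1)
dimmed = adjust_brightness(blurred, 0.5)
```

The blurred, dimmed image would then be composited under the 3D character when the display interface is rendered.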
In an exemplary embodiment of the present disclosure, the method further includes: scaling the image size of the adjusted scene image according to the screen size of the terminal device.
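The scaling step could be an aspect-preserving fit to the terminal's screen; the patent does not specify the policy (fit vs. fill), so the choice below is an assumption:

```python
def fit_to_screen(img_w, img_h, screen_w, screen_h):
    # Aspect-preserving "fit" scale of the processed scene image to the
    # terminal device's screen (assumed policy).
    scale = min(screen_w / img_w, screen_h / img_h)
    return round(img_w * scale), round(img_h * scale)

size = fit_to_screen(1920, 1080, 1280, 720)
```

With matching aspect ratios the image fills the screen exactly; with mismatched ratios it is letterboxed rather than cropped.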
According to a second aspect of the present disclosure, there is provided an in-game display control apparatus in which a graphical user interface is provided through a terminal device, the apparatus comprising: an acquisition module, configured to acquire a pre-created auxiliary game scene in response to a display trigger operation acting on the graphical user interface; an interface generation module, configured to generate a display interface according to the auxiliary game scene and the virtual character in the game; and a view-angle adjustment module, configured to adjust the display view angles of the auxiliary game scene and the virtual character in the display interface in response to a view-angle adjustment operation acting on the display interface.
According to a third aspect of the present disclosure, there is provided a computer storage medium having stored thereon a computer program which, when executed by a processor, implements the in-game display control method of the first aspect described above.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to execute the in-game display control method of the first aspect via execution of the executable instructions.
As can be seen from the foregoing technical solutions, the in-game display control method, in-game display control apparatus, computer storage medium, and electronic device in the exemplary embodiments of the present disclosure have at least the following advantages and positive effects:
In the technical solutions provided by some embodiments of the present disclosure, on the one hand, a pre-created auxiliary game scene is acquired in response to a display trigger operation acting on a graphical user interface, and a display interface is generated according to the auxiliary game scene and the virtual character in the game. This solves the prior-art problem that only a single scene can be opened at a time rather than two scenes simultaneously, and enables rapid switching of interfaces in battle. On the other hand, in response to a view-angle adjustment operation acting on the display interface, the display view angles of the auxiliary game scene and the virtual character are adjusted together. This solves the prior-art problem that background rotation is separated from virtual-character rotation, achieves synchronized rotation of the game scene and the virtual character, and improves the game's display quality and sense of immersion.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 shows a flow diagram of a method for in-game display control in an exemplary embodiment of the present disclosure;
FIG. 2 illustrates a schematic diagram of a primary game scenario in an exemplary embodiment of the present disclosure;
FIG. 3 illustrates a schematic diagram of a secondary game scenario in an exemplary embodiment of the present disclosure;
FIG. 4 is a sub-flow diagram illustrating a method of controlling display in a game according to an exemplary embodiment of the present disclosure;
FIG. 5 illustrates a schematic diagram of a graphical user interface in an exemplary embodiment of the present disclosure;
FIG. 6 is a sub-flow diagram illustrating a method of controlling display in a game according to an exemplary embodiment of the present disclosure;
FIG. 7 is a diagram illustrating a display control method in a game in an exemplary embodiment of the present disclosure;
FIG. 8 is a sub-flow diagram illustrating a method of controlling display in a game according to an exemplary embodiment of the present disclosure;
FIG. 9 is a sub-flow diagram illustrating a method of controlling display in a game according to an exemplary embodiment of the present disclosure;
FIG. 10 is a sub-flow diagram illustrating a method of controlling display in a game according to an exemplary embodiment of the present disclosure;
FIG. 11A illustrates a schematic diagram of display effects in an exemplary embodiment of the present disclosure;
fig. 11B illustrates a schematic diagram of a display effect after performing display perspective adjustment in an exemplary embodiment of the present disclosure;
fig. 12 is a schematic view showing the structure of a display control device in a game in an exemplary embodiment of the present disclosure;
FIG. 13 shows a schematic diagram of a structure of a computer storage medium in an exemplary embodiment of the disclosure;
fig. 14 shows a schematic structural diagram of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/parts/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. other than the listed elements/components/etc.; the terms "first" and "second", etc. are used merely as labels, and are not limiting on the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
Currently, 3D characters are generally projected onto a UI, with the characters' shadows generated procedurally while the background remains a 2D picture; the 2D background is not associated with the 3D model, the edges are hard and inconsistent with the environment, and character rotation is decoupled from background rotation. Alternatively, a 3D scene and character are used directly, in which case only a single scene can be open at a time, two scenes cannot be opened simultaneously, and interfaces cannot be switched rapidly in battle. Alternatively again, 2D character animation is produced, which limits stereoscopic depth, shadows, depth of field, and animation deformation, giving a poor user experience.
In the embodiments of the present disclosure, an in-game display control method is first provided that overcomes, at least to some extent, the prior-art defect that a scene and a virtual character cannot rotate synchronously.
Fig. 1 is a flowchart illustrating an in-game display control method according to an exemplary embodiment of the present disclosure; the execution subject of the method may be a server that performs display control for the game.
Referring to fig. 1, a display control method in a game according to one embodiment of the present disclosure includes the steps of:
step S110, acquiring a pre-created auxiliary game scene in response to a display trigger operation acting on a graphical user interface;
step S120, generating a display interface according to the auxiliary game scene and the virtual character in the game;
step S130, adjusting the display view angles of the auxiliary game scene and the virtual character in the display interface in response to a view-angle adjustment operation acting on the display interface.
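The three steps above can be sketched as a minimal controller. All class, method, and field names here are illustrative assumptions, not the patent's actual implementation; the point is that one view-angle adjustment is applied to scene and character together:

```python
class AuxiliarySceneController:
    def __init__(self, scene_cache):
        # scene_cache maps a function-control id to a pre-created
        # auxiliary game scene (the "pre-created" requirement of S110).
        self.scene_cache = scene_cache
        self.active_interface = None

    def on_show_trigger(self, control_id, character):
        # S110: fetch the pre-created auxiliary scene.
        scene = self.scene_cache[control_id]
        # S120: compose the display interface from scene + character.
        self.active_interface = {"scene": scene, "character": character}
        return self.active_interface

    def on_view_adjust(self, yaw_delta, pitch_delta):
        # S130: apply one view-angle adjustment to the scene and the
        # character together, so background and character stay in sync.
        ui = self.active_interface
        for part in ("scene", "character"):
            ui[part]["yaw"] += yaw_delta
            ui[part]["pitch"] += pitch_delta
        return ui

ctrl = AuxiliarySceneController({"BAG": {"name": "backpack", "yaw": 0.0, "pitch": 0.0}})
ctrl.on_show_trigger("BAG", {"name": "hero", "yaw": 0.0, "pitch": 0.0})
ui = ctrl.on_view_adjust(30.0, -10.0)
```

Because both parts receive the same deltas, the scene's and character's angles can never drift apart, which is the synchronization property the method claims.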
In the technical solution provided by the embodiment shown in fig. 1, on the one hand, a pre-created auxiliary game scene is acquired in response to a display trigger operation acting on a graphical user interface, and a display interface is generated according to the auxiliary game scene and the virtual character in the game. This solves the prior-art problem that only a single scene can be opened at a time rather than two simultaneously, and enables rapid switching of interfaces in battle. On the other hand, in response to a view-angle adjustment operation acting on the display interface, the display view angles of the auxiliary game scene and the virtual character are adjusted together. This solves the prior-art problem that background rotation is separated from virtual-character rotation, achieves synchronized rotation of the game scene and the virtual character, and improves the game's display quality and sense of immersion.
The following describes the specific implementation of each step in fig. 1 in detail:
in an exemplary embodiment of the present disclosure, a terminal device (i.e., a computer display terminal) is a device that inputs programs and data to a computer, or receives the computer's processing results, via a communication facility; it is the input/output device of a computer system. For example, the terminal device in the present disclosure may be a mobile phone, a computer, a tablet computer, a vehicle-mounted computer, or the like.
A graphical user interface (GUI) is a computer user interface displayed in graphical form: the user manipulates on-screen icons or menu options with an input device such as a mouse to select commands, open files, start programs, or perform other everyday tasks. Compared with a character interface, in which routine tasks are accomplished by typing text or character commands on a keyboard, a graphical user interface has many advantages. It consists of windows, pull-down menus, dialog boxes, and the corresponding control mechanisms, and is standardized across applications, i.e., the same operation is always performed in the same way; in a graphical user interface the user sees and operates on graphical objects, applying techniques from computer graphics.
A game scene comprises the environment, buildings, machinery, props, and so on in a game. Building a game scene is generally understood as restoring, according to the design requirements, the usable elements of the game, such as buildings, trees, sky, roads, weapon props, and NPCs (non-player characters, i.e., game characters not controlled by a player).
In an exemplary embodiment of the present disclosure, at least a portion of the main game scene and at least one function control may be displayed in the graphical user interface described above. The main game scene may be larger than the display screen of the graphical user interface; the player can control a game character to fight monsters, level up, accept quests, run, and perform other game operations in the main game scene, and as the game character moves, the portion of the main game scene displayed in the graphical user interface is updated accordingly. The function control may be one or more of a virtual-prop control, a skill control, a character-attribute control, and a weapon control, and may be used to open a function interface. For example, referring to fig. 2, which shows an interface schematic diagram of a main game scene in an exemplary embodiment of the present disclosure, 201 denotes a portion of the main game scene, 202 denotes a virtual character in the main game scene, and 203 denotes a function control, such as a virtual-prop control for opening a virtual-prop interface.
For example, an auxiliary game scene may be created in advance, and a virtual camera may be set in the auxiliary game scene to capture scene images of the auxiliary game scene in real time. The auxiliary game scene may be the game scene area in which the virtual character is located in the game's main game scene, for example, the same scene as that displayed in the main game scene.
The auxiliary game scene may also be a particular game scene area in the main game scene (i.e., a portion of the game scene in the main game scene). For example: when the virtual character runs through the training ground in the main game scene, the auxiliary game scene may use a training-ground scene; when the virtual character runs past the weapon shop, the auxiliary game scene may use a weapon-shop scene; and when the virtual character runs past a hot spring, the auxiliary game scene may use a hot-spring scene. The auxiliary game scene can thus be updated according to the real-time position of the virtual character, improving the visual appeal and interest of the game.
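The position-driven scene choice described above amounts to a lookup from the character's current region to an auxiliary scene; the region and scene names below are illustrative assumptions:

```python
# Hypothetical region-to-auxiliary-scene mapping; names are illustrative.
REGION_SCENES = {
    "training_ground": "training_ground_scene",
    "weapon_shop": "weapon_shop_scene",
    "hot_spring": "hot_spring_scene",
}

def auxiliary_scene_for(region, default="main_scene_area"):
    # Fall back to the area of the main scene the character occupies
    # when no special region applies.
    return REGION_SCENES.get(region, default)

scene = auxiliary_scene_for("weapon_shop")
```

Re-evaluating this lookup as the character's position changes gives the real-time scene updates described above.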
The auxiliary game scene may also be a reconstructed background scene associated with the main game scene in the game, for example, a background scene that differs from what the main game scene displays but is consistent with the main game scene's display attributes (e.g., style and type), such as a Chinese-style game scene, a realistic-style game scene, or a post-apocalyptic-style game scene. Illustratively, when the auxiliary game scene is the game scene area in which the virtual character in the game's main game scene is located, reference may be made to fig. 3, which shows a schematic diagram of an auxiliary game scene in an exemplary embodiment of the present disclosure, where 301 is the auxiliary game scene and 302 is the virtual character in the auxiliary game scene (the same as the virtual character in the main game scene).
With continued reference to fig. 1, in step S110, a pre-created auxiliary game scene is acquired in response to a presentation trigger operation applied to the graphical user interface.
In an exemplary embodiment of the present disclosure, taking the function control to be a virtual-prop control as an example and continuing to refer to fig. 2, 203 denotes the function control (for example, a backpack control BAG among the virtual-prop controls). When a first touch operation (for example, a click, long-press, drag, or slide operation) performed by the user on the function control 203 is received, a pre-created auxiliary game scene may be acquired, for example the backpack scene corresponding to the backpack control BAG, which may be the background shown in the game when the virtual character's backpack is opened and the items in it are displayed.
In step S120, a presentation interface is generated according to the auxiliary game scene and the virtual character in the game.
In an exemplary embodiment of the present disclosure, a presentation interface corresponding to a functionality control may be generated according to an auxiliary game scene and a virtual character in a game.
Specifically, a scene image in the auxiliary game scene may be acquired by the virtual camera, and then a display interface may be generated according to the scene image and the virtual character in the game. The content displayed on the display interface may further include virtual articles (e.g., a backpack interface corresponding to a backpack control, virtual medicines, virtual equipment, virtual ores, virtual foods, etc.), a display position relationship between the virtual articles and the virtual characters, and the like (e.g., the virtual articles may be displayed at the bottom of the display interface, the virtual characters may be displayed in the middle of the display interface, and the like, which may be set according to actual conditions, and belong to the protection scope of the present disclosure).
After the scene image is acquired, referring to fig. 4, fig. 4 shows a sub-flow diagram of a display control method in a game in an exemplary embodiment of the present disclosure, specifically a sub-flow diagram of generating a presentation interface according to an auxiliary game scene and a virtual character in the game, including steps S401 to S403, and the following explains step S120 with reference to fig. 4.
In step S401, the scene image is subjected to blurring processing.
In an exemplary embodiment of the present disclosure, after the scene image of the auxiliary game scene is acquired, the scene image may be blurred. Specifically, a depth-of-field (DOF, the range of distances in front of and behind the focal plane within which objects imaged by a camera lens or other imager appear acceptably sharp) blur may be applied to the scene image. This gives the scene image a depth-of-field effect, improves the lighting, color, and immersion of the game picture, and effectively avoids the phenomenon of the 3D character appearing disjointed from the 2D background.
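As an illustrative sketch only (not the patent's implementation), the depth-of-field blur described above can be approximated by blending a blurred copy of the scene image with the sharp original, weighting each pixel by how far its depth lies from the focal plane:

```python
import numpy as np

def box_blur(img, k=3):
    """Naive box blur: average over a (2k+1)x(2k+1) neighborhood, edges clamped."""
    pad = np.pad(img, ((k, k), (k, k)), mode="edge")
    out = np.zeros_like(img, dtype=float)
    h, w = img.shape
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            out += pad[k + dy : k + dy + h, k + dx : k + dx + w]
    return out / (2 * k + 1) ** 2

def apply_dof(img, depth, focus, dof_range):
    """Blend sharp and blurred pixels: pixels whose depth lies within
    [focus - dof_range, focus + dof_range] stay sharp; the blur weight
    grows linearly with distance beyond that range."""
    blurred = box_blur(img)
    w = np.clip((np.abs(depth - focus) - dof_range) / max(dof_range, 1e-6), 0.0, 1.0)
    return img * (1 - w) + blurred * w
```

In a real engine this would be a post-processing pass on the GPU; the blend-by-depth structure is the same.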
In step S402, the display parameters of the scene image after the blurring process are adjusted to target display parameters.
In an exemplary embodiment of the present disclosure, the display parameters of the scene image after the blurring process may be adjusted to target display parameters, wherein the display parameters may be color, brightness, color temperature, and the like. For example, when the display parameter is a color, the display parameter may be adjusted according to the display requirement of the user, so as to adjust the display parameter to the target display parameter. For example, when the game type is a battle type, the color can be adjusted to be a dark color system to improve the game atmosphere and the game substituting feeling; when the game type is the intelligence-developing game, the color can be adjusted to be brighter and more beautiful so as to increase the interest of the game.
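A minimal sketch of such a display-parameter adjustment (the function name and value ranges are illustrative assumptions, not taken from the disclosure): brightness is applied as a multiplicative gain, and saturation by interpolating each pixel toward its per-pixel gray value:

```python
import numpy as np

def adjust_display_params(img, brightness=1.0, saturation=1.0):
    """img: float RGB array with values in [0, 1].
    Scale brightness, then push colors toward (saturation < 1) or away
    from (saturation > 1) their per-pixel gray value."""
    out = np.clip(img * brightness, 0.0, 1.0)
    gray = out.mean(axis=-1, keepdims=True)
    return np.clip(gray + (out - gray) * saturation, 0.0, 1.0)
```

For a battle-type game the caller would lower brightness/saturation toward a darker palette; for a puzzle game it would raise them, per the text above.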
In the exemplary embodiment of the present disclosure, after the display parameters are adjusted to the target display parameters, if a full-screen image is to be displayed, the image size of the scene image may be scaled according to the type of terminal device to ensure that the scene image matches the display screens of multiple terminal devices. For example, the Render Texture of the scene image may be set to a standard cell phone size of 1920 × 1080, or to 2340 × 1440, to ensure that the texture of the scene image matches both cell phone and iPad screens.
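The scaling described above can be sketched as a uniform "cover" fit, where one Render Texture is scaled so it fills any target screen without distortion (the function name and fit policy are assumptions for illustration):

```python
def fit_render_texture(tex_w, tex_h, screen_w, screen_h):
    """Uniformly scale a tex_w x tex_h scene image so it fully covers a
    screen_w x screen_h display, as when one Render Texture must match
    both phone and tablet screens."""
    scale = max(screen_w / tex_w, screen_h / tex_h)
    return round(tex_w * scale), round(tex_h * scale)
```

With a centered anchor (as in the next paragraph), the overflow on the longer axis is cropped evenly on both sides, which is what avoids visible edges.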
In an exemplary embodiment of the present disclosure, the anchor point of the scene image (the reference point used to align the image within its container) may also be set to be centered, so as to avoid exposed edges and optimize the visual experience of the user.
In step S403, the scene image and the virtual character in the game after being adjusted to the target display parameter are rendered on the graphical user interface, so as to obtain a display interface.
In an exemplary embodiment of the present disclosure, after the display parameters of the scene image are adjusted to the target display parameters, the scene image and the virtual character in the game may be rendered on the graphical user interface, resulting in a presentation interface.
After obtaining the presentation interface, for example, referring to fig. 5, fig. 5 shows a schematic diagram of a graphical user interface in an exemplary embodiment of the present disclosure, specifically shows a schematic diagram of displaying a main game scene and the presentation interface on the graphical user interface, and as can be seen from fig. 5, the present disclosure can open multiple game interfaces simultaneously, and implement fast switching of battle interfaces in a game.
In step S130, in response to the view angle adjustment operation applied to the display interface, the display view angles of the auxiliary game scene and the virtual character in the display interface are adjusted.
In an exemplary embodiment of the present disclosure, in response to a viewing angle adjustment operation applied to the presentation interface, a display viewing angle of the auxiliary game scene and the virtual character in the presentation interface may be adjusted.
Specifically, the above-mentioned view angle adjusting operation may be a second touch operation applied to the presentation interface, for example, referring to fig. 6, fig. 6 shows a sub-flow diagram of a display control method in a game in an exemplary embodiment of the present disclosure, and specifically shows a sub-flow diagram for adjusting the display view angles of the auxiliary game scene and the virtual character in the presentation interface in response to the view angle adjusting operation applied to the presentation interface, which includes steps S601 to S603, and the following explains step S130 with reference to fig. 6.
In step S601, an operation parameter corresponding to a second touch operation applied to the display interface is obtained.
In an exemplary embodiment of the present disclosure, an operation parameter corresponding to a second touch operation applied to a presentation interface may be obtained, for example, referring to fig. 7, fig. 7 shows a schematic diagram of a display control method in a game in an exemplary embodiment of the present disclosure, and specifically shows a schematic diagram of an operation parameter corresponding to a second touch operation, referring to fig. 7, 701, which shows the presentation interface described above, where point O is a world center point, XOYZ is a spatial coordinate system, and OA is a rotation radius r of a virtual camera.
When the second touch operation is a sliding operation, the operation parameters may include a touch point movement distance of the sliding operation (including a touch point movement distance x in a horizontal direction and a touch point movement distance y in a vertical direction) and a sliding direction, and for example, when the second touch operation is a sliding operation of sliding from a point a to a point B, the touch point movement distance AB and the sliding direction from the point a to the point B are the above operation parameters, and then, the touch point movement distance AB may be decomposed to obtain the touch point movement distance x in the horizontal direction and the touch point movement distance y in the vertical direction.
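The decomposition of the slide AB into horizontal and vertical components can be sketched as a straightforward vector decomposition (not the patent's exact code):

```python
import math

def decompose_slide(ax, ay, bx, by):
    """Decompose a slide from touch point A to touch point B into its
    horizontal component x, vertical component y, and total distance AB."""
    x, y = bx - ax, by - ay
    return x, y, math.hypot(x, y)
```

The same decomposition serves the click-operation case below, where AB is the offset of the touch point from the preset reference position.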
When the second touch operation is a click operation or a re-click operation, the operation parameters include a relative distance (including a relative distance x in the horizontal direction and a relative distance y in the vertical direction) and a relative position relationship between the touch point position of the second touch operation and a preset reference position, where the preset reference position may be the position of a virtual character in the game. For example, when the preset reference position is the position of the virtual character (e.g., point A in fig. 7) and the player's touch point position is point B, the relative distance between the touch point position and the preset reference position is AB; the relative distance AB may then be decomposed to obtain the relative distance x in the horizontal direction and the relative distance y in the vertical direction. The relative position relationship between the touch point position and the preset reference position is that the touch point position is located to the right of the preset reference position.
In step S602, a viewing angle adjustment parameter is determined according to the operation parameter.
In an exemplary embodiment of the present disclosure, after obtaining the operation parameter, the viewing angle adjustment parameter may be determined according to the operation parameter. Specifically, the rotation parameter of the virtual camera set in the auxiliary game scene may be determined according to the operation parameter, and then the rotation parameter of the virtual camera may be determined as the viewing angle adjustment parameter.
For example, referring to the related explanation of the above step S601, when the second touch operation is a sliding operation, the operation parameters may include a touch point moving distance (including a touch point moving distance x in a horizontal direction and a touch point moving distance y in a vertical direction) and a sliding direction of the sliding operation, and further, referring to fig. 8, fig. 8 shows a sub-flow diagram of a display control method in a game in an exemplary embodiment of the present disclosure, and specifically shows a sub-flow diagram for determining the viewing angle adjustment parameter according to the operation parameters (the touch point moving distance and the sliding direction), including steps S801-S802, and the following describes a specific implementation manner with reference to fig. 8.
In step S801, the rotation angle of the virtual camera set in the auxiliary game scene is determined according to the touch point movement distance.
For example, referring to fig. 9, fig. 9 shows a sub-flowchart of a display control method in a game in an exemplary embodiment of the present disclosure, and in particular, shows a sub-flowchart of determining a rotation angle of a virtual camera disposed in an auxiliary game scene according to a movement distance of a touch point, including steps S901 to S903, and the following explains step S801 in conjunction with fig. 9.
In step S901, a first rotation angle corresponding to the virtual camera is determined according to the movement distance of the touch point in the horizontal direction and the rotation radius of the virtual camera.
In an exemplary embodiment of the present disclosure, a first rotation angle α corresponding to the virtual camera (i.e., the rotation angle of the virtual camera in the horizontal direction) may be determined according to the touch point movement distance x of the second touch operation in the horizontal direction and the rotation radius r of the virtual camera; specifically, the first rotation angle α may be determined according to formula 1.
in step S902, a second rotation angle corresponding to the virtual camera is determined according to the touch point moving distance of the second touch operation in the horizontal direction, the touch point moving distance of the second touch operation in the vertical direction, and the rotation radius of the virtual camera.
In an exemplary embodiment of the present disclosure, the second rotation angle β corresponding to the virtual camera (i.e., the rotation angle of the virtual camera in the vertical direction) may be determined according to the touch point movement distance x of the second touch operation in the horizontal direction, the touch point movement distance y in the vertical direction, and the rotation radius r of the virtual camera; specifically, the second rotation angle β may be determined according to formula 2.
In step S903, the first rotation angle and the second rotation angle are determined as rotation angles corresponding to the virtual camera set in the secondary game scene.
In an exemplary embodiment of the present disclosure, the above-described first and second rotation angles α and β may be determined as rotation angles corresponding to virtual cameras in the auxiliary game scene.
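Formulas 1 and 2 are not reproduced in this text. One plausible reconstruction, offered purely as an assumption consistent with the stated inputs (α from x and r; β from x, y, and r), treats each touch displacement as an arc length on the camera's orbit, so that angle = arc / radius:

```python
import math

def rotation_angles(x, y, r):
    """Hypothetical reconstruction of the omitted formulas 1 and 2.
    Formula 1 (assumed): alpha = x / r, horizontal arc over orbit radius.
    Formula 2 (assumed): beta = y / hypot(x, r), so the vertical angle
    depends on x, y, and r as the text states.  Returned in degrees."""
    alpha = math.degrees(x / r)
    beta = math.degrees(y / math.hypot(x, r))
    return alpha, beta
```

The true formulas may differ; the point is only that both angles are monotone in the touch displacement and inversely related to the rotation radius r.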
With continued reference to fig. 8, in step S802, the rotation direction of the virtual camera is determined according to the sliding direction.
For example, when the sliding direction of the sliding operation is leftward, the rotation direction of the virtual camera may be determined to be clockwise; when the sliding direction is rightward, the rotation direction may be determined to be counterclockwise. For example, when the sliding operation slides rightward from point A to point B, the rotation direction corresponding to the virtual camera may be determined to be counterclockwise. It should be noted that the correspondence between the sliding direction and the rotation direction can be set according to the actual situation, and all such settings fall within the protection scope of the present disclosure.
For example, referring to the related explanation of step S601 above, when the second touch operation is a click operation or a re-click operation, the operation parameters include a relative distance (including a relative distance x in the horizontal direction and a relative distance y in the vertical direction) and a relative position relationship between the touch point position of the second touch operation and a preset reference position. Further, referring to fig. 10, fig. 10 shows a sub-flow diagram of a display control method in a game in an exemplary embodiment of the present disclosure, specifically a sub-flow diagram for determining the viewing angle adjustment parameter according to the operation parameters (the relative distance and the relative position relationship), including steps S1001 to S1002; a specific implementation is explained below with reference to fig. 10.
In step S1001, the rotation angle of the virtual camera set in the secondary game scene is determined according to the relative distance.
In the exemplary embodiment of the present disclosure, after the relative distance x in the horizontal direction and the relative distance y in the vertical direction are obtained, the first rotation angle α corresponding to the virtual camera may be determined according to the relative distance x in the horizontal direction and the rotation radius r of the virtual camera, and the second rotation angle β corresponding to the virtual camera may be determined according to the relative distance x in the horizontal direction, the relative distance y in the vertical direction and the rotation radius r of the virtual camera, and the first rotation angle α and the second rotation angle β may be determined as the rotation angles corresponding to the virtual camera disposed in the auxiliary game scene, with reference to the relevant explanations of the above-described steps S901 to S903.
In step S1002, the rotation direction of the virtual camera is determined from the relative positional relationship.
In an exemplary embodiment of the present disclosure, when the touch point position is located to the left of the preset reference position, the rotation direction of the virtual camera may be determined to be clockwise; when the touch point position is located to the right of the preset reference position, the rotation direction may be determined to be counterclockwise. Referring to the explanation of the above steps, when the touch point position B is located to the right of the virtual character, the rotation direction of the virtual camera may be determined to be counterclockwise.
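Both direction rules (slide direction for a sliding operation, left/right of the reference position for a click operation) reduce to the same mapping, sketched here with assumed names:

```python
def rotation_direction(side: str) -> str:
    """Map 'left' (slide left, or touch left of the reference position) to
    clockwise rotation and 'right' to counterclockwise, per the text above.
    The correspondence is configurable in practice."""
    mapping = {"left": "clockwise", "right": "counterclockwise"}
    return mapping[side]
```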
In an exemplary embodiment of the present disclosure, after the rotation parameters of the virtual camera are obtained, the rotation parameters of the virtual camera (including the rotation angle and the rotation direction) may be determined as the viewing angle adjustment parameters.
With continuing reference to fig. 6, in step S603, the display view angles of the auxiliary game scenes and the virtual characters in the display interface are adjusted according to the view angle adjustment parameter.
In an exemplary embodiment of the present disclosure, after obtaining the above-mentioned viewing angle adjustment parameter, the display viewing angles of the auxiliary game scene and the virtual character in the display interface may be adjusted according to the viewing angle adjustment parameter, and for example, the virtual camera may be controlled to rotate counterclockwise α degrees in the horizontal direction and rotate counterclockwise β degrees in the vertical direction, so as to achieve adjustment of the display viewing angles of the auxiliary game scene and the virtual character.
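Applying the adjustment can be sketched as repositioning the virtual camera on its orbit sphere of radius r around the world center point O by the horizontal angle α (yaw) and vertical angle β (pitch). This is a common orbit-camera parameterization, assumed here for illustration rather than taken from the patent:

```python
import math

def orbit_camera(r, yaw_deg, pitch_deg):
    """Return the camera position on a sphere of radius r centered on the
    world origin O, after rotating yaw_deg horizontally and pitch_deg
    vertically from the +Z axis."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    x = r * math.cos(pitch) * math.sin(yaw)
    y = r * math.sin(pitch)
    z = r * math.cos(pitch) * math.cos(yaw)
    return x, y, z
```

Because the character stands at the focal point, rotating the camera this way rotates scene and character together, matching the synchronized rotation described below.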
In an exemplary embodiment of the present disclosure, fig. 11A and 11B are schematic diagrams illustrating a display effect of the display control method in a game of the present disclosure in a Messiah engine, 1101 is an auxiliary game scene, 1102 is a virtual character in the auxiliary game scene, specifically, fig. 11A is a schematic diagram illustrating a display effect of a presentation interface, and fig. 11B is a schematic diagram illustrating a display effect after a display perspective of the auxiliary game scene and the virtual character in the presentation interface is adjusted in response to a perspective adjustment operation applied to the presentation interface. As can be seen from fig. 11A and 11B, when the display angle of view of the virtual character in the game changes, the display angle of view of the auxiliary game scene also changes synchronously, that is, the present disclosure can achieve synchronous rotation of the auxiliary game scene and the virtual character in the game, improve the sense of substitution of the game, and solve the technical problem of separation of rotation of the scene and the character in the prior art.
Based on the above technical solution, on one hand, the method enhances the sense of space and immersion by adjusting camera parameters such as depth-of-field blurring and color correction, so that the phenomenon of the 3D character appearing disjointed from the 2D background can be effectively avoided. Furthermore, the virtual character is not limited to a single light source, ensuring rich and fine lighting and shadow to the maximum extent. On the other hand, the game scene can use abundant 3D special effects and dynamic materials, realizing synchronous transformation of the virtual character and the game scene, so that the player can appreciate the beautiful environment while appreciating the details of the virtual character, greatly improving the display quality of the game.
The present disclosure also provides an in-game display control apparatus, and fig. 12 shows a schematic structural diagram of the in-game display control apparatus in an exemplary embodiment of the present disclosure; as shown in fig. 12, the in-game display control apparatus 1200 may include an acquisition module 1201, an interface generation module 1202, and a viewing angle adjustment module 1203. Wherein:
an obtaining module 1201, configured to obtain a pre-created auxiliary game scene in response to a display trigger operation acting on the graphical user interface.
In an exemplary embodiment of the present disclosure, the graphical user interface includes at least one function control, and the obtaining module is configured to obtain the pre-created auxiliary game scene in response to a first touch operation performed on the function control.
In an exemplary embodiment of the present disclosure, a virtual camera is disposed in an auxiliary game scene, and a display interface is generated according to the auxiliary game scene and a virtual character in a game, including: acquiring scene images in the auxiliary game scene through a virtual camera; and generating a display interface according to the scene image and the virtual character in the game.
In an exemplary embodiment of the present disclosure, the secondary game scenario includes at least one of: a game scene area where a virtual character is located in a main game scene of a game; a specific game scene area in a main game scene of the game; a background scene associated with a primary game scene in the game.
An interface generating module 1202, configured to generate a display interface according to the auxiliary game scene and the virtual character in the game.
In an exemplary embodiment of the disclosure, the interface generating module is configured to generate a presentation interface corresponding to the function control according to the auxiliary game scene and the virtual character in the game.
In an exemplary embodiment of the present disclosure, the interface generating module is configured to perform blurring processing on the scene image; adjusting the display parameters of the scene image after the fuzzification processing to target display parameters; and rendering the scene image and the virtual character in the game after the target display parameters are adjusted to the graphical user interface to obtain a display interface.
In an exemplary embodiment of the present disclosure, the interface generating module is configured to perform zoom adjustment on the image size of the scene image after being adjusted to the target display parameter according to the screen size of the terminal device.
A view angle adjusting module 1203, configured to adjust a display view angle of the auxiliary game scene and the virtual character in the display interface in response to a view angle adjusting operation applied to the display interface.
In an exemplary embodiment of the present disclosure, the viewing angle adjustment operation includes a second touch operation that acts on the presentation interface; the visual angle adjusting module is used for acquiring operation parameters corresponding to second touch control operation acting on the display interface; determining a visual angle adjusting parameter according to the operation parameter; and adjusting the display visual angles of the auxiliary game scenes and the virtual characters in the display interface according to the visual angle adjusting parameters.
In an exemplary embodiment of the present disclosure, the view angle adjusting module is configured to determine a rotation parameter of a virtual camera disposed in the auxiliary game scene according to the operation parameter; and determining the rotation parameter as a visual angle adjusting parameter.
In an exemplary embodiment of the present disclosure, the second touch operation is a sliding operation, and the operation parameters include a touch point movement distance and a sliding direction of the sliding operation; the visual angle adjusting module is used for determining the rotation angle of a virtual camera arranged in the auxiliary game scene according to the movement distance of the touch point; the rotation direction of the virtual camera is determined according to the sliding direction.
In an exemplary embodiment of the present disclosure, the second touch operation is a click operation or a re-click operation, and the operation parameters include a relative distance and a relative position relationship between a touch point position of the second touch operation and a preset reference position; the visual angle adjusting module is used for determining the rotation angle of the virtual camera arranged in the auxiliary game scene according to the relative distance; and determining the rotating direction of the virtual camera according to the relative position relation.
In an exemplary embodiment of the present disclosure, the touch point movement distance includes a touch point movement distance in a horizontal direction and a touch point movement distance in a vertical direction; the visual angle adjusting module is used for determining a first rotating angle corresponding to the virtual camera according to the moving distance of the touch point in the horizontal direction and the rotating radius of the virtual camera; determining a second rotation angle corresponding to the virtual camera according to the movement distance of the touch point in the horizontal direction, the movement distance of the touch point in the vertical direction and the rotation radius of the virtual camera; and determining the first rotation angle and the second rotation angle as rotation angles corresponding to the virtual camera arranged in the auxiliary game scene.
The specific details of each module in the display control device in the game are already described in detail in the corresponding display control method in the game, and therefore, the details are not repeated here.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Moreover, although the steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a mobile terminal, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer storage medium capable of implementing the above method. On which a program product capable of implementing the above-described method of the present specification is stored. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the "exemplary methods" section above of this specification, when the program product is run on the terminal device.
Referring to fig. 13, a program product 1300 for implementing the above method according to an embodiment of the present disclosure is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++, as well as conventional procedural programming languages, such as the "C" language or similar programming languages.
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," module "or" system.
An electronic device 1400 according to such an embodiment of the present disclosure is described below with reference to fig. 14. The electronic device 1400 shown in fig. 14 is only an example and should not bring any limitations to the functionality and scope of use of the embodiments of the present disclosure.
As shown in fig. 14, the electronic device 1400 is embodied in the form of a general purpose computing device. The components of the electronic device 1400 may include, but are not limited to: the at least one processing unit 1410, the at least one memory unit 1420, the bus 1430 that connects the various system components (including the memory unit 1420 and the processing unit 1410), and the display unit 1440.
The storage unit stores program code that can be executed by the processing unit 1410, causing the processing unit 1410 to perform the steps of the exemplary embodiments described in the "exemplary methods" section above in this specification. For example, the processing unit 1410 may perform the steps shown in fig. 1: step S110, in response to a display trigger operation acting on the graphical user interface, acquiring a pre-created auxiliary game scene; step S120, generating a display interface according to the auxiliary game scene and the virtual character in the game; and step S130, in response to a viewing-angle adjustment operation acting on the display interface, adjusting the display viewing angles of the auxiliary game scene and the virtual character in the display interface.
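The three steps above can be sketched as follows. This is a hypothetical illustration only; the class, method, and field names are invented for clarity and are not taken from the patent, which does not specify an implementation.

```python
# Hypothetical sketch of steps S110-S130 from fig. 1; all names here are
# illustrative, not from the patent.

class Game:
    """Minimal stand-in for the game runtime (hypothetical)."""
    def __init__(self):
        self.auxiliary_scene = {"name": "aux_scene"}   # pre-created scene (S110 input)
        self.player_character = {"name": "hero"}


class DisplayController:
    def __init__(self, game):
        self.game = game
        self.interface = None

    def on_display_trigger(self):
        # S110: in response to the display trigger operation, acquire the
        # pre-created auxiliary game scene.
        aux = self.game.auxiliary_scene
        # S120: generate the display interface from the scene and the character.
        self.interface = {
            "scene": aux,
            "character": self.game.player_character,
            "view_angle": 0.0,
        }

    def on_view_adjust(self, delta_degrees):
        # S130: adjust the display viewing angle of the auxiliary scene and
        # the virtual character together.
        self.interface["view_angle"] += delta_degrees


ctrl = DisplayController(Game())
ctrl.on_display_trigger()
ctrl.on_view_adjust(15.0)
```

The sketch keeps scene and character in one interface record so that a single viewing-angle value drives both, mirroring the synchronized adjustment described in step S130.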
The storage unit 1420 may include readable media in the form of volatile memory units, such as a random access memory (RAM) unit 14201 and/or a cache memory unit 14202, and may further include a read-only memory (ROM) unit 14203.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes several instructions to cause a computing device (which may be a personal computer, a server, a terminal device, a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
Claims (15)
1. An in-game display control method, characterized in that a graphical user interface is provided by a terminal device, the method comprising:
in response to a display trigger operation acting on the graphical user interface, acquiring a pre-created auxiliary game scene;
generating a display interface according to the auxiliary game scene and a virtual character in the game; and
in response to a viewing-angle adjustment operation acting on the display interface, adjusting display viewing angles of the auxiliary game scene and the virtual character in the display interface.
2. The method of claim 1, wherein the graphical user interface comprises at least one function control, and
acquiring the pre-created auxiliary game scene in response to the display trigger operation acting on the graphical user interface comprises:
acquiring the pre-created auxiliary game scene in response to a first touch operation acting on the function control.
3. The method of claim 2, wherein generating the display interface according to the auxiliary game scene and the virtual character in the game comprises:
generating the display interface corresponding to the function control according to the auxiliary game scene and the virtual character in the game.
4. The method of claim 1, wherein a virtual camera is disposed in the auxiliary game scene, and generating the display interface according to the auxiliary game scene and the virtual character in the game comprises:
acquiring a scene image of the auxiliary game scene through the virtual camera; and
generating the display interface according to the scene image and the virtual character in the game.
5. The method of claim 1, wherein the auxiliary game scene comprises at least one of:
a game scene area, in a main game scene of the game, in which the virtual character is located;
a specific game scene area in the main game scene of the game; and
a background scene associated with the main game scene of the game.
6. The method of claim 1 or 2, wherein the viewing-angle adjustment operation comprises a second touch operation acting on the display interface, and
adjusting the display viewing angles of the auxiliary game scene and the virtual character in the display interface in response to the viewing-angle adjustment operation comprises:
acquiring an operation parameter corresponding to the second touch operation acting on the display interface;
determining a viewing-angle adjustment parameter according to the operation parameter; and
adjusting the display viewing angles of the auxiliary game scene and the virtual character in the display interface according to the viewing-angle adjustment parameter.
7. The method of claim 6, wherein determining the viewing-angle adjustment parameter according to the operation parameter comprises:
determining a rotation parameter of a virtual camera disposed in the auxiliary game scene according to the operation parameter; and
determining the rotation parameter as the viewing-angle adjustment parameter.
8. The method of claim 7, wherein the second touch operation is a sliding operation, and the operation parameter comprises a touch-point movement distance and a sliding direction of the sliding operation; and
determining the viewing-angle adjustment parameter according to the operation parameter comprises:
determining a rotation angle of the virtual camera disposed in the auxiliary game scene according to the touch-point movement distance; and
determining a rotation direction of the virtual camera according to the sliding direction.
9. The method of claim 7, wherein the second touch operation is a click operation or a re-click operation, and the operation parameter comprises a relative distance and a relative positional relationship between a touch-point position of the second touch operation and a preset reference position; and
determining the viewing-angle adjustment parameter according to the operation parameter comprises:
determining a rotation angle of the virtual camera disposed in the auxiliary game scene according to the relative distance; and
determining a rotation direction of the virtual camera according to the relative positional relationship.
10. The method of claim 8, wherein the touch-point movement distance comprises a touch-point movement distance in a horizontal direction and a touch-point movement distance in a vertical direction; and
determining the rotation angle of the virtual camera disposed in the auxiliary game scene according to the touch-point movement distance comprises:
determining a first rotation angle of the virtual camera according to the touch-point movement distance in the horizontal direction and a rotation radius of the virtual camera;
determining a second rotation angle of the virtual camera according to the touch-point movement distance in the horizontal direction, the touch-point movement distance in the vertical direction, and the rotation radius of the virtual camera; and
determining the first rotation angle and the second rotation angle as the rotation angles of the virtual camera disposed in the auxiliary game scene.
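Claims 8 and 10 map touch-point movement to camera rotation via the camera's rotation radius. One plausible reading, sketched below, treats the drag distance as an arc length on a circle of that radius, so that angle = distance / radius (in radians). This formula and the function below are assumptions for illustration; the claims do not fix a concrete formula.

```python
import math

def rotation_angles(dx_px, dy_px, radius_px):
    """Map touch-point movement to two camera rotation angles (hypothetical).

    Assumption: the drag distance is treated as an arc length on a circle
    whose radius equals the virtual camera's rotation radius, so that
    angle (radians) = arc length / radius.
    """
    # First rotation angle: from horizontal movement only, per claim 10.
    first = dx_px / radius_px
    # Second rotation angle: from combined horizontal and vertical movement.
    second = math.hypot(dx_px, dy_px) / radius_px
    return first, second

yaw, pitch = rotation_angles(dx_px=100.0, dy_px=0.0, radius_px=200.0)
# With dy_px == 0, both angles reduce to dx_px / radius_px.
```

The sliding direction (claim 8) would then select the sign of each angle, i.e. whether the camera orbits left or right, up or down.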
11. The method of claim 4, wherein generating the display interface according to the scene image and the virtual character in the game comprises:
blurring the scene image;
adjusting a display parameter of the blurred scene image to a target display parameter; and
rendering the scene image adjusted to the target display parameter, together with the virtual character in the game, to the graphical user interface to obtain the display interface.
12. The method of claim 11, further comprising:
scaling the image size of the scene image adjusted to the target display parameter according to the screen size of the terminal device.
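Claims 11 and 12 describe a blur-then-scale pipeline for the scene image. A minimal sketch of the two operations follows; the box filter and the fit-to-screen scaling rule are illustrative assumptions, since the claims do not prescribe a particular blur algorithm or scaling policy.

```python
def box_blur(gray, k=1):
    """Blur a 2D grayscale grid with a (2k+1)x(2k+1) box filter (illustrative)."""
    h, w = len(gray), len(gray[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Average over the clamped neighbourhood around (x, y).
            window = [gray[j][i]
                      for j in range(max(0, y - k), min(h, y + k + 1))
                      for i in range(max(0, x - k), min(w, x + k + 1))]
            out[y][x] = sum(window) / len(window)
    return out


def fit_scale(img_w, img_h, screen_w, screen_h):
    """Uniform scale factor so the processed scene image fits the screen
    of the terminal device (assumed fit-inside policy)."""
    return min(screen_w / img_w, screen_h / img_h)
```

For example, an 800x600 scene image shown on a 400x300 screen would be scaled by `fit_scale(800, 600, 400, 300)`, i.e. by half in each dimension.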
13. An in-game display control apparatus, wherein a graphical user interface is provided by a terminal device, the apparatus comprising:
an acquisition module configured to acquire a pre-created auxiliary game scene in response to a display trigger operation acting on the graphical user interface;
an interface generation module configured to generate a display interface according to the auxiliary game scene and a virtual character in the game; and
a viewing-angle adjustment module configured to adjust display viewing angles of the auxiliary game scene and the virtual character in the display interface in response to a viewing-angle adjustment operation acting on the display interface.
14. A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the in-game display control method of any one of claims 1 to 12.
15. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to execute the in-game display control method of any one of claims 1 to 12 via execution of the executable instructions.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010255567.9A CN111467803B (en) | 2020-04-02 | 2020-04-02 | Display control method and device in game, storage medium and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111467803A true CN111467803A (en) | 2020-07-31 |
CN111467803B CN111467803B (en) | 2023-07-14 |
Family
ID=71749636
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010255567.9A Active CN111467803B (en) | 2020-04-02 | 2020-04-02 | Display control method and device in game, storage medium and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111467803B (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112044064A (en) * | 2020-09-02 | 2020-12-08 | 完美世界(北京)软件科技发展有限公司 | Game skill display method, device, equipment and storage medium |
CN112076470A (en) * | 2020-08-26 | 2020-12-15 | 北京完美赤金科技有限公司 | Virtual object display method, device and equipment |
CN112121415A (en) * | 2020-09-30 | 2020-12-25 | 腾讯科技(深圳)有限公司 | Method, device and equipment for controlling interface display and storage medium |
CN113440846A (en) * | 2021-07-15 | 2021-09-28 | 网易(杭州)网络有限公司 | Game display control method and device, storage medium and electronic equipment |
CN113750529A (en) * | 2021-09-13 | 2021-12-07 | 网易(杭州)网络有限公司 | Direction indicating method and device in game, electronic equipment and readable storage medium |
CN113893531A (en) * | 2021-09-30 | 2022-01-07 | 完美世界(北京)软件科技发展有限公司 | Game role creating method and device, storage medium and computer equipment |
CN115054919A (en) * | 2022-06-09 | 2022-09-16 | 咪咕互动娱乐有限公司 | Method, system, equipment and storage medium for adjusting angle of attack picture of cloud game |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105068706A (en) * | 2015-07-31 | 2015-11-18 | 张维谦 | Slide steering method and device of shooting game |
CN107977141A (en) * | 2017-11-24 | 2018-05-01 | 网易(杭州)网络有限公司 | Interaction control method, device, electronic equipment and storage medium |
CN108854068A (en) * | 2018-06-27 | 2018-11-23 | 网易(杭州)网络有限公司 | Display control method and device, storage medium and terminal in game |
CN110180168A (en) * | 2019-05-31 | 2019-08-30 | 网易(杭州)网络有限公司 | A kind of display methods and device, storage medium and processor of game picture |
Also Published As
Publication number | Publication date |
---|---|
CN111467803B (en) | 2023-07-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111467803B (en) | Display control method and device in game, storage medium and electronic equipment | |
US10863168B2 (en) | 3D user interface—360-degree visualization of 2D webpage content | |
CN108762482B (en) | A method and system for data interaction between large screen and augmented reality glasses | |
US11003305B2 (en) | 3D user interface | |
CN112243583B (en) | Multi-endpoint mixed reality conference | |
WO2018188499A1 (en) | Image processing method and device, video processing method and device, virtual reality device and storage medium | |
KR101086570B1 (en) | Dynamic window structure | |
CN111701238A (en) | Virtual picture volume display method, device, equipment and storage medium | |
US20090251460A1 (en) | Systems and methods for incorporating reflection of a user and surrounding environment into a graphical user interface | |
US10049490B2 (en) | Generating virtual shadows for displayable elements | |
US7982751B2 (en) | Methods and systems for controlling a computer using a video image and for combining the video image with a computer desktop | |
CN114928673B (en) | Shot picture display method, terminal and storage medium | |
US10623713B2 (en) | 3D user interface—non-native stereoscopic image conversion | |
CN104811639B (en) | Information processing method and electronic equipment | |
CN111973984A (en) | Coordinate control method and device for virtual scene, electronic equipment and storage medium | |
CN116820291A (en) | Article display method, device and storage medium of VR scene | |
CN114546228A (en) | Expression image sending method, device, equipment and medium | |
US20240020910A1 (en) | Video playing method and apparatus, electronic device, medium, and program product | |
CN115145451B (en) | Frame selection method, device and equipment on terminal equipment and storage medium | |
Xu et al. | Design and implementation of interactive game based on augmented reality | |
Li | Design and Implementation of 3D Virtual Campus Online Interaction Based on Untiy3D | |
Huynh et al. | Motion Recognition Control by Mediapipe Holistic | |
HK40073409A (en) | Display method of shooting picture, terminal and storage medium | |
HK40073409B (en) | Display method of shooting picture, terminal and storage medium | |
WO2025020657A1 (en) | Image display method for virtual scene, device, medium and program product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||