CN113926191B - Virtual camera control method, device and electronic device - Google Patents
- Publication number: CN113926191B
- Application number: CN202111372360.0A
- Authority
- CN
- China
- Prior art keywords: virtual, virtual camera, plots, reference surfaces, sampling points
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications
- A63F13/52 — Video games: controlling the output signals based on the game progress, involving aspects of the displayed game scene
- A63F13/525 — Video games: changing parameters of virtual cameras
- G06F9/451 — Electric digital data processing: execution arrangements for user interfaces
Abstract
The invention discloses a virtual camera control method, a virtual camera control device, and an electronic device. The method comprises: displaying, through a graphical user interface, a game scene captured by a virtual camera, the game scene comprising a plurality of virtual plots that are separated from each other; sampling the virtual plots to obtain a plurality of sampling points; connecting the sampling points to generate a plurality of continuous reference surfaces; and controlling the virtual camera to move according to the reference surfaces. The invention solves the technical problem in the prior art that a virtual camera cannot move continuously over discrete virtual plots.
Description
Technical Field
The present invention relates to the field of computers, and in particular, to a method and apparatus for controlling a virtual camera, and an electronic device.
Background
Virtual cameras in games are typically used to capture a game scene, enabling a player to view the scene and play the game. The virtual camera may shoot the game scene from a top-down (depression) angle; for example, in the camera navigation of a simulation-management game, the virtual camera moves along with the undulation of a continuous surface in the game scene.
However, in the prior art, the camera navigation scheme for continuous ground is not suitable for scenes composed of discrete plots. If that scheme is adopted, then when the camera moves into a blank area between discrete plots, the virtual camera may fall because no plot supports it; even if it does not fall, the height differences between discrete plots may make the picture shot by the camera unrealistic. The prior art therefore lacks a navigation solution for a virtual camera in a game scene that includes discrete plots.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The embodiments of the invention provide a virtual camera control method and device and an electronic device, which at least solve the technical problem in the prior art that a virtual camera cannot move continuously over discrete virtual plots.
According to one aspect of the embodiments of the invention, a virtual camera control method is provided, in which a graphical user interface is provided through a terminal device. The method comprises: displaying, through the graphical user interface, a game scene captured by the virtual camera, wherein the game scene comprises a plurality of virtual plots that are separated from each other; sampling the plurality of virtual plots respectively to obtain a plurality of sampling points; connecting the plurality of sampling points to generate a plurality of continuous reference surfaces; and controlling the virtual camera to move according to the plurality of reference surfaces.
Further, the virtual camera control method further includes: before connecting the plurality of sampling points to generate the continuous plurality of reference surfaces, determining that an edit-completion instruction for at least one of the plurality of virtual plots has been received.
Further, the virtual camera control method further comprises: generating the edit-completion instruction for at least one of the plurality of virtual plots when a touch operation on an edit-completion control is received, or when the edited state of at least one of the plurality of virtual plots is detected to meet a preset condition.
Further, the virtual camera control method further comprises: projecting the plurality of sampling points onto a preset plane to obtain reference points corresponding to the sampling points on the preset plane; performing a triangulation operation based on the plurality of reference points to obtain a subdivision result; and connecting the plurality of sampling points based on the subdivision result to generate the continuous plurality of reference surfaces.
Further, the virtual camera control method further comprises: obtaining an association relationship between the virtual camera and the plurality of reference surfaces; obtaining navigation coordinates corresponding to the virtual camera according to the association relationship and the plurality of reference surfaces; and controlling the virtual camera to move according to the navigation coordinates.
Further, the virtual camera control method further comprises: generating drag data in response to a drag operation on the game scene, and determining the association relationship between the virtual camera and the plurality of reference surfaces according to the drag data.
Further, the virtual camera control method further comprises: obtaining initial coordinates of the virtual camera relative to the plurality of reference surfaces according to the association relationship and the plurality of reference surfaces, and correcting the initial coordinates with a preset algorithm to obtain the navigation coordinates corresponding to the virtual camera.
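The "preset algorithm" for correcting the initial coordinates is not specified here. One plausible reading, offered only as an assumption, is per-frame exponential damping that eases the camera from its previous coordinate toward the raw surface-derived coordinate, so height jumps between plots do not translate into camera jumps. The function name and the `damping` value below are hypothetical:

```python
def smooth_coordinate(previous, raw, damping=0.15):
    """Close a fixed fraction of the gap between the previous camera
    coordinate and the raw surface-derived coordinate each frame.
    `damping` is an illustrative tuning value, not from the patent."""
    return tuple(p + damping * (r - p) for p, r in zip(previous, raw))

# Easing a camera whose reference surface jumps from height 10 to 20:
corrected = smooth_coordinate((0.0, 10.0, 5.0), (0.0, 20.0, 5.0))
```

Repeating the call each frame converges the camera to the new height while keeping its motion continuous.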
Further, the virtual camera control method further comprises: adjusting the angle of the virtual camera during movement according to the plurality of reference surfaces.
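The claim does not fix how the camera angle follows the reference surfaces. One natural sketch, stated only as an assumption, tilts the camera according to the unit normal of the reference triangle currently beneath it; the hypothetical helper below computes that normal with a cross product (y is treated as the height axis):

```python
def triangle_normal(a, b, c):
    """Unit normal of the reference-surface triangle (a, b, c), each a
    3D point (x, y, z). A hypothetical helper: the patent does not
    prescribe this formula."""
    u = [b[i] - a[i] for i in range(3)]  # edge a -> b
    v = [c[i] - a[i] for i in range(3)]  # edge a -> c
    n = [u[1] * v[2] - u[2] * v[1],      # cross product u x v
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    length = sum(x * x for x in n) ** 0.5
    return [x / length for x in n]
```

The camera's pitch can then be blended toward this normal as it moves, so the view stays aligned with the undulating reference surface.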
According to another aspect of the embodiments of the invention, a virtual camera control device is provided, in which a graphical user interface is provided through a terminal device. The device comprises: a display module for displaying, through the graphical user interface, a game scene captured by the virtual camera, wherein the game scene comprises a plurality of virtual plots that are separated from each other; a sampling module for sampling the plurality of virtual plots respectively to obtain a plurality of sampling points; a connection module for connecting the plurality of sampling points to generate a plurality of continuous reference surfaces; and a control module for controlling the virtual camera to move according to the plurality of reference surfaces.
According to another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium having a computer program stored therein, wherein the computer program is configured to perform the above-described method of virtual camera control when run.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device comprising one or more processors and a storage device for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the above-described virtual camera control method.
In the embodiments of the invention, discrete plots are sampled and connected to generate continuous reference surfaces: a game scene captured by a virtual camera is displayed on a graphical user interface provided by a terminal device; the plurality of virtual plots included in the game scene, which are separated from each other, are sampled respectively to obtain a plurality of sampling points; the plurality of sampling points are then connected to generate a plurality of continuous reference surfaces; and finally the virtual camera is controlled to move according to the plurality of reference surfaces.
In the above process, the plurality of virtual plots are separated from each other, that is, they are in a discrete state. In the present application, by connecting the sampling points on the discrete virtual plots, discontinuous surfaces are turned into continuous reference surfaces, so that the virtual camera can move continuously through the game scene across the discrete virtual plots. Compared with the prior art, the player only needs to perform a two-dimensional operation on the virtual camera to view the game scene, which simplifies the operation of the virtual camera.
Therefore, the scheme provided by the application achieves the purpose of enabling the virtual camera to continuously move in a plurality of discrete virtual plots, thereby realizing the technical effect of applying the virtual camera to the discrete virtual plots and further solving the technical problem that the virtual camera cannot continuously move on the discrete virtual plots in the prior art.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a schematic view of camera navigation of a continuous surface according to the prior art;
FIG. 2 is a diagram of camera navigation on a discrete surface according to the prior art;
FIG. 3 is a flow chart of a method of virtual camera control according to an embodiment of the invention;
FIG. 4 is a top view of an alternative virtual parcel in accordance with an embodiment of the invention;
FIG. 5 is a schematic diagram of an alternative virtual parcel shape editing according to an embodiment of the invention;
FIG. 6 is a schematic diagram of an alternative triangulation according to an embodiment of the present invention;
FIG. 7 is a grid schematic of an alternative navigation surface in accordance with an embodiment of the present invention;
fig. 8 is a schematic diagram of an apparatus for virtual camera control according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present invention, a method embodiment of virtual camera control is provided. It should be noted that the steps shown in the flowchart of the figures may be performed in a computer system, for example as a set of computer-executable instructions, and, although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order other than that shown or described herein.
In addition, it should be noted that the terminal device running the game may perform the method for controlling the virtual camera provided in this embodiment, where the terminal device may be a portable terminal device (e.g., a smart phone, a smart tablet) or a non-mobile device (e.g., a computer). Optionally, in this embodiment, a graphical user interface is provided through the terminal device, where the graphical user interface may display the game scene captured by the virtual camera.
In an alternative embodiment, fig. 3 is a flowchart of a method of virtual camera control according to an embodiment of the present invention, as shown in fig. 3, the method comprising the steps of:
In step S302, a game scene captured by the virtual camera is displayed through the graphical user interface, wherein the game scene includes a plurality of virtual plots, and the plurality of virtual plots are separated from each other.
In step S302, the plurality of virtual plots are plots in the game, and "separated from each other" means the virtual plots are not connected to each other; for example, in fig. 2 the two virtual plots are not connected. In addition, there may be a height difference between virtual plots; for example, in fig. 2 there is a height difference between the two virtual plots. Optionally, in the present application, separation between the plurality of virtual plots means both that the plots are not connected and that a height difference exists between them.
In an alternative embodiment, in a simulation-management game based on floating islands, after detecting a game start instruction, the terminal device acquires the game scene captured by the virtual camera in real time and displays it to the player through the graphical user interface, so that the player can play the game according to the displayed game scene.
Step S304, sampling is carried out on the plurality of virtual plots respectively, and a plurality of sampling points are obtained.
In step S304, in the process of sampling the plurality of virtual plots, the terminal device displays the plurality of virtual plots from a top-down angle in the graphical user interface, so that the virtual plots are displayed in two-dimensional space. Optionally, during sampling, the locations and/or the number of sampling points on each virtual plot may be the same or different; for example, in the top view of the virtual plots shown in fig. 4, the number of sampling points on the two virtual plots is the same, but the locations of the sampling points differ.
In an alternative embodiment, each virtual plot is a standard module, and the terminal device may receive a sampling instruction from the player and sample the plurality of virtual plots after receiving it. Optionally, after receiving the sampling instruction, the terminal device enters a sampling mode, in which the player may mark sampling points on each virtual plot by mouse click or finger touch. For example, in fig. 4, the player has marked three sampling points on each of the two virtual plots.
In an alternative embodiment, after receiving the sampling instruction, the terminal device enters a sampling mode, in which it invokes a preset sampling algorithm to automatically sample each virtual plot. Optionally, the terminal device determines the number and positions of the sampling points of each virtual plot by running the preset sampling algorithm, according to the shape, area, and position information of each virtual plot in combination with the game scene. For example, the terminal device determines the number of sampling points according to the area of a virtual plot: one sampling point is set for a small plot and several for a large plot. This avoids wasting system resources by setting many sampling points on a small plot, saving system performance and maximizing the utilization of system resources.
In an alternative embodiment, after receiving the sampling instruction, the terminal device enters a sampling mode, in which it reads sampling-point information from a preset storage area and samples each virtual plot according to the information read. The sampling-point information may include, but is not limited to, the number and positions of the sampling points; a designer may mark the sampling points in advance and store this information.
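The area-based rule described above (one sampling point for a small plot, several for a large one) can be sketched as a simple heuristic. The thresholds `base_area` and `max_points` below are illustrative tuning values, not taken from the patent:

```python
import math

def sampling_point_count(area, base_area=100.0, max_points=12):
    """One sampling point per `base_area` units of plot area, with at
    least one point per plot and a cap to bound the mesh size.
    Both parameters are hypothetical tuning values."""
    return max(1, min(max_points, math.ceil(area / base_area)))
```

A small plot (e.g., area 50) then gets a single sampling point while a large one gets proportionally more, matching the resource-saving argument in the description.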
Step S306, performing connection operation on the plurality of sampling points to generate a plurality of continuous reference planes.
As noted for step S304, the plurality of virtual plots are not connected to each other, and height differences exist between them. In step S306, the virtual plots in a discrete state (i.e., not connected to each other) are joined by connecting the sampling points on them, thereby generating a plurality of continuous reference surfaces used for navigation of the virtual camera. This solves the problem that, in the prior art, the virtual camera cannot navigate in a game scene of discrete plots.
Step S308, the virtual camera is controlled to move according to the plurality of reference planes.
It should be noted that the reference surfaces generated in step S306 are continuous; that is, in step S306 the terminal device generates a plurality of complete, continuous reference surfaces from the virtual plots in a discrete state. Therefore, in step S308, the terminal device can control the virtual camera to move along the reference surfaces according to a certain rule, and the player can then move the virtual camera with a continuous viewing angle across the space formed by the discrete virtual plots.
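The "certain rule" by which the camera follows the reference surfaces is not detailed here. A common way to realize such following, sketched below only as an assumption, is to look up the surface height under the camera's (x, z) position by barycentric interpolation within the containing reference triangle (y is treated as the height axis; the function name is hypothetical):

```python
def surface_height(p, tri):
    """Interpolate the reference-surface height at 2D point p = (x, z)
    inside triangle `tri` of three 3D sampling points (x, y, z),
    using barycentric coordinates on the ground plane."""
    (ax, ay, az), (bx, by, bz), (cx, cy, cz) = tri
    det = (bz - cz) * (ax - cx) + (cx - bx) * (az - cz)
    l1 = ((bz - cz) * (p[0] - cx) + (cx - bx) * (p[1] - cz)) / det
    l2 = ((cz - az) * (p[0] - cx) + (ax - cx) * (p[1] - cz)) / det
    l3 = 1.0 - l1 - l2
    return l1 * ay + l2 * by + l3 * cy
```

The camera's target height is then this interpolated value plus a fixed offset, so it rises and falls smoothly even between plots of different heights.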
Furthermore, it should be noted that in the prior art the camera navigation scheme for a continuous surface is not suitable for a scene of discrete plots. For example, in the camera navigation diagram for a continuous surface shown in fig. 1, navigation essentially keeps the virtual camera at a certain height above the surface according to a certain rule. In that scenario the surface is typically preset and continuous, and the scheme cannot be applied to a plurality of discrete plots whose surfaces are unconnected and differ in height. For discrete plots, when the virtual camera moves to the boundary of the current plot it may fall because there is no ground to support it. Even if the virtual camera floats in the game scene, when it is moved it will be far from low-lying ground and close to high ground, so it cannot properly shoot a game scene that undulates with the ground. For example, in the game scene of a simulation-management game based on floating islands, if the game designer does not place a plot in a certain area, that area has no plot; the plots of the game scene are thus placed discontinuously, and the prior-art scheme is not applicable, as in the camera navigation diagram for a discrete surface shown in fig. 2.
In addition, in the ghost mode or flying mode of various first-person-perspective games, although the virtual camera can move freely in three-dimensional space, the player must control both translation and orientation, involving at least four control parameters (two translation parameters and two rotation parameters), and the camera is usually operated with a keyboard and mouse or with dual joysticks. Although such a scheme achieves maximal freedom of viewing angle, it is complex to operate and is generally difficult to apply to games that are not first-person-perspective.
Based on the scheme defined in steps S302 to S308, it can be seen that in the embodiments of the present invention, discrete plots are sampled and connected to generate continuous reference surfaces: a game scene captured by a virtual camera is displayed on a graphical user interface provided by a terminal device; the plurality of virtual plots included in the game scene, which are separated from each other, are sampled respectively to obtain a plurality of sampling points; the plurality of sampling points are then connected to generate a plurality of continuous reference surfaces; and finally the virtual camera is controlled to move according to the plurality of reference surfaces.
It is easy to note that in the above process the plurality of virtual plots are separated from each other, that is, they are in a discrete state. In the present application, by connecting the sampling points on the discrete virtual plots, discontinuous surfaces are turned into continuous reference surfaces, so that the virtual camera can move continuously through the game scene across the discrete virtual plots. Compared with the prior art, the player only needs to perform a two-dimensional operation on the virtual camera to view the game scene, which simplifies the operation of the virtual camera.
Therefore, the scheme provided by the application achieves the purpose of enabling the virtual camera to continuously move in a plurality of discrete virtual plots, thereby realizing the technical effect of applying the virtual camera to the discrete virtual plots and further solving the technical problem that the virtual camera cannot continuously move on the discrete virtual plots in the prior art.
In an alternative embodiment, before connecting the plurality of sampling points to generate the continuous reference surfaces, the terminal device determines that an edit-completion instruction for at least one of the plurality of virtual plots has been received. The edit-completion instruction may be a control instruction generated after the player operates a control in the graphical user interface representing completion of editing, or it may be generated automatically by the terminal device according to the state of the virtual plot.
Specifically, when a touch operation on the edit-completion control is received, or when the edited state of at least one of the plurality of virtual plots is detected to meet a preset condition, the terminal device generates an edit-completion instruction for the at least one virtual plot.
In an alternative embodiment, an edit-completion control is provided in the graphical user interface. After detecting that the player has touched the edit-completion control, the terminal device determines that the player has finished editing at least one of the plurality of virtual plots, and at that point generates an edit-completion instruction.
In another alternative embodiment, when the placement style of the at least one virtual plot is a preset style and/or its shape is a preset shape, the terminal device determines that the player has finished editing the at least one virtual plot, and generates an edit-completion instruction.
Further, after receiving the edit-completion instruction, the terminal device stops moving the plurality of virtual plots, and/or stops modifying the shape of the at least one virtual plot, and/or stops placing the at least one virtual plot. That is, upon receiving the edit-completion instruction, the terminal device finishes the editing operation on the at least one virtual plot; and/or, upon detecting that the placement style of the at least one virtual plot is the style selected by the player, the terminal device determines that editing of the at least one virtual plot is complete.
It should be noted that, before connecting the plurality of sampling points to generate the continuous reference surfaces, the terminal device may also edit at least one virtual plot, including its position and/or shape and/or placement style.
Optionally, the terminal device receives a first editing instruction for the at least one virtual plot and determines its position in three-dimensional space. The player can set the position of each virtual plot in three-dimensional space by dragging it with a manipulation medium such as a mouse or a finger; for example, the player moves virtual plot A from position 1 to position 2 in the three-dimensional space by dragging it with the mouse.
The position of the at least one virtual plot is its position in three-dimensional space, that is, a three-dimensional position.
Optionally, the terminal device may also receive a second editing instruction for the plurality of virtual plots and determine the shape of at least one virtual plot in three-dimensional space. A plurality of edit points is arranged on each virtual plot, and the shape of the plot is changed by dragging the edit points with a manipulation medium such as a mouse or a finger. For example, in the shape-editing diagram of the virtual plot shown in fig. 5, the player can drag an edit point to edit the shape of the virtual plot.
It should be noted that the player may set edit points on a virtual plot through a manipulation medium such as a mouse or a finger. In addition, the player may send to the terminal device a second editing instruction for editing the shapes of the plurality of virtual plots; after receiving it, the terminal device displays edit points on the virtual plot. For example, when the player double-clicks or long-presses a virtual plot, the terminal device receives the second editing instruction, and the virtual plot with its edit points is then displayed in the graphical user interface.
In addition, it should be noted that after at least one virtual plot is re-edited, the terminal device recalculates the generated continuous reference surfaces from the edited plot; that is, the terminal device samples the edited at least one virtual plot again and performs the connection operation on the resulting sampling points to regenerate the continuous plurality of reference surfaces.
Optionally, the terminal device may further receive a third editing instruction for the plurality of virtual plots and place at least one virtual plot in the three-dimensional space according to a preset placement style. The player can set the placement style of the virtual plots on the terminal device through a control medium such as a mouse or a finger, for example, placing virtual plots A, B and C on the vertices of an equilateral triangle. In addition, the player can also select a preset placement style from a plurality of placement styles pre-stored in the terminal device, and the terminal device automatically places the at least one virtual plot according to the selected placement style.
In an alternative embodiment, after sampling the plurality of virtual plots respectively to obtain a plurality of sampling points, the terminal device performs a connection operation on the plurality of sampling points to generate a plurality of continuous reference surfaces. Specifically, the terminal device projects the plurality of sampling points onto a preset plane to obtain reference points corresponding to the sampling points on the preset plane, then performs a triangulation operation based on the reference points to obtain a subdivision result, and finally performs the connection operation on the sampling points based on the subdivision result to generate the plurality of continuous reference surfaces.
Optionally, the terminal device projects all sampling points of all virtual plots onto a preset plane, where the preset plane may be the ground plane or a plane preset by the player. After the projection, the terminal device obtains a plurality of projected sampling points on the preset plane. The terminal device may then apply the Delaunay triangulation algorithm to the projected sampling points and connect the sampling points according to the triangulation result to generate a mesh, thereby generating the continuous reference surfaces; that is, the plurality of sampling points in the three-dimensional space are connected based on the triangulation result. Fig. 6 shows a schematic diagram of an alternative triangulation: in fig. 6, 3 sampling points (sampling points 1, 2, 3) are arranged on virtual plot A and 3 sampling points (sampling points 4, 5, 6) are arranged on virtual plot B, where sampling point 1 is connected with sampling point 4, sampling point 4 with sampling point 2, sampling point 2 with sampling point 5, sampling point 5 with sampling point 3, and sampling point 3 with sampling point 6.
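The projection-and-triangulation step described above can be sketched as follows. This is a minimal illustration rather than the patented implementation: it assumes SciPy's `Delaunay` class for the triangulation, treats dropping the height coordinate as the projection onto a ground-level preset plane, and reuses the resulting 2D triangle indices to connect the original 3D sampling points into one continuous mesh (all names are illustrative).

```python
import numpy as np
from scipy.spatial import Delaunay

def build_reference_surface(sample_points_3d):
    """Project 3D sampling points onto the ground plane (z = 0),
    triangulate the 2D projections with Delaunay, and reuse the
    triangle indices to connect the original 3D points into a mesh."""
    pts = np.asarray(sample_points_3d, dtype=float)
    projected = pts[:, :2]        # drop height -> projection onto the preset plane
    tri = Delaunay(projected)     # 2D Delaunay triangulation of projected points
    # Each row of tri.simplices indexes three sampling points forming one
    # triangle of the reference surface; the same indices apply in 3D.
    return pts, tri.simplices

# Two discrete "plots" of three sampling points each, at different heights
plot_a = [(0, 0, 1.0), (1, 0, 1.2), (0, 1, 1.1)]
plot_b = [(3, 0, 0.5), (4, 0, 0.6), (3, 1, 0.4)]
vertices, triangles = build_reference_surface(plot_a + plot_b)
print(len(triangles))  # triangles spanning both plots, including the gap between them
```

Because the triangulation is performed on the projections but the indices are applied to the original points, the generated surface bridges the gap between the discrete plots, exactly the property the navigation surface in fig. 7 relies on.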
It should be noted that the Delaunay triangulation algorithm triangulates any given set of planar points according to a "max-min angle" criterion: among all possible triangulations of the point set, it maximizes the minimum interior angle, which avoids thin, sliver-shaped triangles.
In addition, it should be further noted that the terminal device first projects the plurality of sampling points of the plurality of virtual plots in the three-dimensional space onto the preset plane to obtain a plurality of projected sampling points on that plane. After the subdivision result is obtained, the plurality of sampling points in the three-dimensional space are connected based on the subdivision result, and a mesh map of the navigation surface can be generated, as shown in fig. 7. As can be seen from fig. 7, the method provided by the present application can generate a complete and continuous navigation surface (i.e., continuous reference surfaces) for a plurality of virtual plots in discrete states.
Further, after the connection operation is performed on the plurality of sampling points to generate the plurality of continuous reference surfaces, the terminal device can control the virtual camera to move according to the plurality of reference surfaces. Specifically, the terminal device acquires an association relationship between the virtual camera and the plurality of reference surfaces, acquires navigation coordinates corresponding to the virtual camera according to the association relationship and the plurality of reference surfaces, and then controls the virtual camera to move according to the navigation coordinates.
It should be noted that the above association relationship indicates that there is a certain association between the virtual camera and the reference surfaces. For example, the relative distance between the virtual camera and the reference surface may be a fixed value; in this scenario, the target relative distance between the virtual camera and the reference surface remains unchanged, that is, the virtual camera rises and falls along with the undulation of the reference surface during movement. As another example, the distance between the virtual camera and the reference surface may be a dynamic value, which may be calculated by a damping algorithm. The association between the virtual camera and the reference surfaces is not limited to the above and may take other forms, which are not enumerated here.
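The fixed-value association described above can be sketched in a few lines (function and parameter names are illustrative assumptions, not taken from the patent): the camera's height simply tracks the height of the reference surface beneath its navigation coordinate plus a constant offset.

```python
def camera_position(nav_xy, surface_height_at, target_offset):
    """Place the camera at a fixed distance above the reference surface:
    as the surface rises and falls, the camera rises and falls with it."""
    x, y = nav_xy
    return (x, y, surface_height_at(x, y) + target_offset)

# A toy height field standing in for the triangulated reference surface
surface = lambda x, y: 0.1 * x
print(camera_position((0, 0), surface, 5.0))   # (0, 0, 5.0)
print(camera_position((10, 0), surface, 5.0))  # (10, 0, 6.0)
```

A dynamic (e.g. damped) association would replace `target_offset` with a value recomputed each frame, but the same structure applies.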
In addition, it should be noted that, in the present application, the navigation coordinate is a coordinate for determining a position of the virtual camera, and the navigation coordinate may be, but is not limited to, a position coordinate in a world coordinate system in a game scene.
In an alternative embodiment, the terminal device may generate drag data in response to a drag operation for the game scene, and determine association relationships between the virtual camera and the plurality of reference planes according to the drag data. The drag data at least comprises information such as a drag speed and a drag direction of a player on a game scene, and the terminal equipment can determine an association relationship between the virtual camera and the reference surface according to the drag speed and/or the drag direction of the player on the game scene.
Alternatively, the player may input a control instruction into the terminal device, where the control instruction is used to control the virtual camera to move along the reference surfaces, and the control instruction may be a two-dimensional instruction such as a mouse drag instruction or a single-finger drag instruction. After receiving the control instruction, the terminal device parses it and determines the moving direction and/or moving speed of the virtual camera from the parsing result, so that the virtual camera keeps a certain rule relative to the reference surfaces (for example, a certain height) while moving two-dimensionally according to the player's control instruction. In this way, the player can continuously move the viewing angle of the virtual camera in the three-dimensional space formed by the virtual plots in discrete states. For example, the direction in which the player drags the mouse is the moving direction and the speed at which the player drags the mouse is the moving speed; after detecting the player's mouse operation, the terminal device controls the virtual camera to move along the reference surfaces at that speed.
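A minimal sketch of turning such a two-dimensional control instruction into camera motion, under the assumption that the drag direction sets the moving direction and the drag speed sets the moving speed (all names are illustrative): only the horizontal navigation coordinate is updated, and the height is then re-read from the reference surface as in the previous sketch, so one 2D gesture yields continuous 3D motion.

```python
def step_camera(cam_xy, drag_vector, drag_speed, dt=1.0):
    """Advance the camera's 2D navigation coordinate along the drag
    direction at the drag speed; a zero drag leaves the camera in place."""
    dx, dy = drag_vector
    norm = (dx * dx + dy * dy) ** 0.5 or 1.0  # avoid division by zero
    x, y = cam_xy
    return (x + dx / norm * drag_speed * dt,
            y + dy / norm * drag_speed * dt)

print(step_camera((0.0, 0.0), (1.0, 0.0), 2.0))  # (2.0, 0.0) — 2 units along +x
```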
It should be noted that, because the virtual camera in the present application moves along the continuous reference surfaces, the player can view the game scene by moving the virtual camera in only two dimensions, without simultaneously performing rotation and displacement operations on the virtual camera.
In an alternative embodiment, in the process of obtaining the navigation coordinates corresponding to the virtual camera according to the association relationship and the plurality of reference surfaces, the terminal device obtains initial coordinates corresponding to the virtual camera and the plurality of reference surfaces according to the association relationship and the plurality of reference surfaces, and corrects the initial coordinates according to a preset algorithm to obtain the navigation coordinates corresponding to the virtual camera.
It should be noted that, in practical application, the relationship between the reference surfaces and the movement track of the virtual camera is not limited to a strictly parallel one. For example, when the reference surface formed by the triangles is not smooth enough, the player may input an adjustment instruction to the terminal device, so that the terminal device corrects, based on the preset algorithm, the initial coordinates of the virtual camera corresponding to the plurality of reference surfaces, thereby smoothing the effect of the reference surface and obtaining the navigation coordinates corresponding to the virtual camera. As another example, the terminal device may determine a height variation trend parameter of the reference surface according to its height variation, and then adjust the initial coordinates of the virtual camera corresponding to the plurality of reference surfaces according to that parameter, so that the variation trend of the virtual camera's path is gentler or steeper than the variation trend of the terrain.
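One simple way to realize a "height variation trend parameter" of the kind described above is to scale each initial height's deviation from the mean path height: a trend factor below 1 makes the camera path gentler than the terrain, above 1 steeper. This is an illustrative assumption, not the patent's preset algorithm, and all names are hypothetical.

```python
def smooth_heights(raw_heights, trend=0.5):
    """Correct the camera's initial heights along a path: trend < 1 flattens
    the terrain's ups and downs, trend > 1 exaggerates them, relative to the
    mean height sampled from the reference surface."""
    mean = sum(raw_heights) / len(raw_heights)
    return [mean + trend * (h - mean) for h in raw_heights]

print(smooth_heights([0.0, 4.0, 2.0], trend=0.5))  # [1.0, 3.0, 2.0]
```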
In another optional embodiment, in addition to adjusting the initial coordinates of the virtual camera corresponding to the plurality of reference surfaces, the terminal device may adjust the angle of the virtual camera during movement according to the plurality of reference surfaces, so that the terminal device adjusts the shooting angle of the virtual camera according to the undulation of the surface, and the game scene displayed by the graphical user interface adapts to that undulation.
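A toy illustration of tilting the shooting angle with the surface: the pitch offset is derived from the local slope of the reference surface ahead of the camera. The base pitch value and the slope-to-angle mapping are assumptions made for illustration only.

```python
import math

def camera_pitch(slope_dz_per_unit, base_pitch_deg=-45.0):
    """Adjust the camera's pitch with the terrain: a rising surface ahead
    tilts the camera up from its base pitch, a falling surface tilts it down."""
    return base_pitch_deg + math.degrees(math.atan(slope_dz_per_unit))

print(round(camera_pitch(0.0), 1))  # -45.0 on flat ground
print(round(camera_pitch(1.0), 1))  # 0.0 when the surface climbs at 45°
```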
From the foregoing, it can be seen that the present application provides a navigation scheme for a dynamically changeable scene of discrete virtual plots: it generates continuous reference surfaces for a plurality of virtual plots in discrete states and uses those surfaces for navigation of the virtual camera. Through this scheme, the virtual camera can be applied to game scenes containing a plurality of discrete virtual plots, that is, the virtual camera can move continuously in a discontinuous, non-preset (i.e., player-editable) game scene. In addition, compared with the prior-art scheme in which rotation and displacement operations must be performed on the virtual camera simultaneously, the scheme provided by the application has the advantage of convenient operation.
According to an embodiment of the present invention, there is further provided an apparatus embodiment of virtual camera control, where a graphical user interface is provided by a terminal device, and fig. 8 is a schematic diagram of an apparatus for virtual camera control according to an embodiment of the present invention, and as shown in fig. 8, the apparatus includes a display module 801, a sampling module 803, a connection module 805, and a control module 807.
The display module 801 is used for displaying, through the graphical user interface, a game scene captured by the virtual camera, where the game scene includes a plurality of virtual plots separated from each other; the sampling module 803 is used for sampling the plurality of virtual plots respectively to obtain a plurality of sampling points; the connection module 805 is used for performing a connection operation on the plurality of sampling points to generate a plurality of continuous reference surfaces; and the control module 807 is used for controlling the virtual camera to move according to the plurality of reference surfaces.
It should be noted that the display module 801, the sampling module 803, the connection module 805, and the control module 807 correspond to steps S302 to S308 in the above embodiment; the four modules implement the same examples and application scenarios as the corresponding steps, but are not limited to the disclosure of the above embodiment.
Optionally, the device for controlling the virtual camera further comprises a first determining module, configured to determine that an edit completion instruction for at least one virtual parcel in the plurality of virtual parcels is received before performing a connection operation on the plurality of sampling points to generate a plurality of continuous reference planes.
Optionally, the device for controlling the virtual camera further comprises a first generation module, configured to generate an edit completion instruction for at least one of the plurality of virtual plots when a touch operation for the edit completion control is received or when it is detected that the state after editing of the at least one of the plurality of virtual plots meets a preset condition.
Optionally, the connection module comprises a projection module, a subdivision module and a first connection module. The projection module is used for projecting the plurality of sampling points onto a preset plane to obtain reference points corresponding to the sampling points on the preset plane; the subdivision module is used for performing a triangulation operation based on the reference points to obtain a subdivision result; and the first connection module is used for performing a connection operation on the plurality of sampling points based on the subdivision result to generate a plurality of continuous reference surfaces.
Optionally, the control module comprises a first acquisition module, a second acquisition module and a first control module. The first acquisition module is used for acquiring the association relationship between the virtual camera and the plurality of reference surfaces; the second acquisition module is used for acquiring navigation coordinates corresponding to the virtual camera according to the association relationship and the plurality of reference surfaces; and the first control module is used for controlling the virtual camera to move according to the navigation coordinates.
Optionally, the first acquisition module comprises a second generation module and a second determination module. The second generation module is used for generating drag data in response to a drag operation for the game scene, and the second determination module is used for determining the association relationship between the virtual camera and the plurality of reference surfaces according to the drag data.
Optionally, the second acquisition module comprises a third acquisition module and a first adjustment module. The third acquisition module is used for acquiring initial coordinates of the virtual camera corresponding to the plurality of reference surfaces according to the association relationship and the plurality of reference surfaces, and the first adjustment module is used for correcting the initial coordinates according to a preset algorithm to obtain the navigation coordinates corresponding to the virtual camera.
Optionally, the device for controlling the virtual camera further comprises a second adjusting module, which is used for adjusting the angle of the virtual camera in the moving process according to the plurality of reference surfaces.
According to another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium having a computer program stored therein, wherein the computer program is configured to execute the control method of the virtual camera in the above embodiments when running.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device including one or more processors and a storage device for storing one or more programs, where the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the control method of the virtual camera in the above embodiments.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present invention, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technology may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary, and the division of the units, for example, may be a logic function division, and may be implemented in another manner, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interfaces, units or modules, or may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The storage medium includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is merely a preferred embodiment of the present invention. It should be noted that those skilled in the art may make various modifications and improvements without departing from the principles of the present invention, and such modifications and improvements shall also fall within the protection scope of the present invention.
Claims (11)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111372360.0A CN113926191B (en) | 2021-11-18 | 2021-11-18 | Virtual camera control method, device and electronic device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113926191A CN113926191A (en) | 2022-01-14 |
CN113926191B true CN113926191B (en) | 2024-12-20 |
Family
ID=79287070
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111372360.0A Active CN113926191B (en) | 2021-11-18 | 2021-11-18 | Virtual camera control method, device and electronic device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113926191B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108710525A (en) * | 2018-05-18 | 2018-10-26 | 腾讯科技(深圳)有限公司 | Map methods of exhibiting, device, equipment and storage medium in virtual scene |
CN110313020A (en) * | 2018-01-22 | 2019-10-08 | 深圳市大疆创新科技有限公司 | Image processing method, equipment and computer readable storage medium |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4558399B2 (en) * | 1996-07-25 | 2010-10-06 | 株式会社セガ | Play equipment |
GB2571306A (en) * | 2018-02-23 | 2019-08-28 | Sony Interactive Entertainment Europe Ltd | Video recording and playback systems and methods |
CN111494943B (en) * | 2020-04-21 | 2023-03-31 | 网易(杭州)网络有限公司 | Image display method and device, electronic equipment and readable storage medium |
CN113117327B (en) * | 2021-04-12 | 2024-02-02 | 网易(杭州)网络有限公司 | Augmented reality interaction control method and device, electronic equipment and storage medium |
- 2021-11-18: CN application CN202111372360.0A granted as patent CN113926191B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN113926191A (en) | 2022-01-14 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||