CN116474367A - Virtual lens control method and device, storage medium and computer equipment - Google Patents
- Publication number
- CN116474367A (application number CN202310239205.4A)
- Authority
- CN
- China
- Prior art keywords
- virtual lens
- virtual
- user interface
- graphical user
- response
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/23—Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/837—Shooting of targets
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Abstract
An embodiment of the present application discloses a virtual lens control method and device, a storage medium, and a computer device. The method comprises the following steps: displaying a graphical user interface, wherein the graphical user interface comprises at least part of a virtual scene picture and game function controls, and the virtual scene picture is obtained by a virtual lens acquiring the virtual scene along a designated acquisition direction; in response to a first touch operation on a first response area in the graphical user interface, masking the functional responses of the game function controls in the graphical user interface and generating a second response area in the graphical user interface; and in response to a second touch operation on the second response area in the graphical user interface, controlling the virtual lens to move according to the designated acquisition direction. Because the first touch operation on the first response area masks the functional responses of the game function controls before the virtual lens is controlled, other game function controls cannot be touched by mistake while the virtual lens is being controlled.
Description
Technical Field
The present invention relates to the field of computers, and in particular to a virtual lens control method and device, a computer-readable storage medium, and a computer device.
Background
In recent years, with the development and popularization of computer device technology, more and more applications provide a three-dimensional virtual environment, such as virtual reality applications, three-dimensional map programs, military simulation programs, first-person shooter (FPS) games, role-playing games (RPG), and the like.
In the prior art, when a user playing a game needs to observe a virtual character or virtual object in the virtual scene in detail, the user slides two fingers apart (from near to far) in an interactable area near the virtual character under the user's control, thereby moving the virtual lens closer and enlarging the displayed picture; if a smaller picture is required, the user slides the two fingers together (from far to near), thereby moving the virtual lens away.
In researching and practicing the prior art, the inventors of the present application found that when a user adjusts the virtual lens during a game, the two-finger slide easily touches other game function controls by mistake.
Disclosure of Invention
The embodiments of the present application provide a virtual lens control method and device that can avoid mistakenly touching other game function controls when the virtual lens is adjusted.
To solve the above technical problem, the embodiments of the present application provide the following technical solutions:
a control method of a virtual lens, comprising:
displaying a graphical user interface, wherein the graphical user interface comprises at least part of a virtual scene picture and game function controls, and the virtual scene picture is obtained by a virtual lens acquiring the virtual scene along a designated acquisition direction;
in response to a first touch operation on a first response area in the graphical user interface, masking a functional response of the game function controls in the graphical user interface, and generating a second response area in the graphical user interface;
and in response to a second touch operation on the second response area in the graphical user interface, controlling the virtual lens to move according to the designated acquisition direction.
A control device of a virtual lens, comprising:
a display module, configured to display a graphical user interface, wherein the graphical user interface comprises at least part of a virtual scene picture and game function controls, and the virtual scene picture is obtained by a virtual lens acquiring the virtual scene along a designated acquisition direction;
a masking module, configured to mask a functional response of the game function controls in the graphical user interface in response to a first touch operation on a first response area in the graphical user interface, and to generate a second response area in the graphical user interface;
and a control module, configured to control the virtual lens to move according to the designated acquisition direction in response to a second touch operation on the second response area in the graphical user interface.
A computer readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps in the virtual lens control method described above.
A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the method for controlling a virtual lens as described above when executing the program.
In the embodiments of the present application, a graphical user interface is displayed, wherein the graphical user interface comprises at least part of a virtual scene picture and game function controls, and the virtual scene picture is obtained by a virtual lens acquiring the virtual scene along a designated acquisition direction; in response to a first touch operation on a first response area in the graphical user interface, a functional response of the game function controls is masked and a second response area is generated in the graphical user interface; and in response to a second touch operation on the second response area, the virtual lens is controlled to move according to the designated acquisition direction. Because the first touch operation on the first response area masks the functional response of the game function controls before the virtual lens is controlled, other game function controls cannot be touched by mistake while the virtual lens is being controlled.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1a is a schematic system diagram of a method for controlling a virtual lens according to an embodiment of the present application.
Fig. 1b is a schematic flow chart of a first method for controlling a virtual lens according to an embodiment of the present application.
Fig. 1c is a first schematic diagram of a graphical user interface provided in an embodiment of the present application.
Fig. 1d is a second schematic diagram of a graphical user interface according to an embodiment of the present application.
Fig. 1e is a third schematic diagram of a graphical user interface according to an embodiment of the present application.
Fig. 2 is a schematic structural diagram of a control device for a virtual lens according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The embodiments of the present application provide a virtual lens control method and device, a storage medium, and a computer device. Specifically, the virtual lens control method in the embodiments of the present application may be performed by a computer device, where the computer device may be a terminal, a server, or the like. The terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer (PC), or a personal digital assistant (PDA), and the terminal device may further include a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs, big data, and artificial intelligence platforms.
For example, when the control method of the virtual lens is run on the terminal, the terminal device stores a game application program and presents a part of game scenes in the game through the display component. The terminal device is used for interacting with a user through a graphical user interface, for example, the terminal device downloads and installs a game application program and runs the game application program. The way in which the terminal device presents the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device, or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including game screens and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for running the game, generating the graphical user interface, responding to the operation instructions, and controlling the display of the graphical user interface on the touch display screen.
For example, when the virtual lens control method runs on a server, the game may be a cloud game. Cloud gaming refers to a game mode based on cloud computing. In the cloud-game running mode, the entity that runs the game application is separated from the entity that presents the game picture: the storage and execution of the virtual lens control method are completed on a cloud game server, while the game picture is presented on a cloud game client. The cloud game client is mainly used for receiving and sending game data and presenting game pictures; for example, it may be a display device with a data transmission function near the user side, such as a mobile terminal, a television, a computer, a palmtop computer, or a personal digital assistant, while the terminal device that performs the virtual lens control is the cloud game server in the cloud. When playing the game, the user operates the cloud game client to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as game pictures, and returns them to the cloud game client through the network; finally, the cloud game client decodes the data and outputs the game pictures.
Referring to fig. 1a, fig. 1a is a schematic system diagram of a method for controlling a virtual lens according to an embodiment of the present application. The system may include at least one computer device 1000, at least one server 2000, at least one database 3000, and a network 4000. The computer device 1000 held by the user may be connected to servers of different games through the network 4000. Computer device 1000 is any device having computing hardware capable of supporting and executing software products corresponding to a game. In addition, the computer device 1000 has one or more multi-touch sensitive screens for sensing and obtaining input of a user through touch or slide operations performed at multiple points of the one or more touch sensitive display screens. In addition, when the system includes a plurality of computer devices 1000, a plurality of servers 2000, and a plurality of networks 4000, different computer devices 1000 may be connected to each other through different networks 4000, through different servers 2000. The network 4000 may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, etc. In addition, the different computer devices 1000 may be connected to other terminals or to a server or the like using their own bluetooth network or hotspot network. For example, multiple users may be online through different computer devices 1000 so as to be connected through an appropriate network and synchronized with each other to support multi-user gaming. In addition, the system may include a plurality of databases 3000, the plurality of databases 3000 being coupled to different servers 2000, and information related to game environments may be continuously stored in the databases 3000 when different users play multi-user games online.
The embodiments of the present application provide a virtual lens control method, which may be executed by a terminal or a server. The embodiments are described by taking the case where the method is executed by a terminal as an example. The terminal comprises a display component and a processor, where the display component is used to present a graphical user interface and to receive operation instructions generated by the user acting on it. When the user operates the graphical user interface through the display component, the graphical user interface can control local content of the terminal in response to the received operation instructions, and can also control content of the peer server. For example, the operation instructions generated by the user include an instruction for launching the game application, and the processor is configured to launch the game application after receiving that instruction. Further, the processor is configured to render and draw a graphical user interface associated with the game on a touch display screen, which is a multi-touch-sensitive screen capable of sensing touch or slide operations performed simultaneously at multiple points on the screen. The user performs touch operations on the graphical user interface with a finger, and when a touch operation is detected, different virtual objects in the game's graphical user interface are controlled to perform the actions corresponding to it. The game may be any one of a casual game, an action game, a role-playing game, a strategy game, a sports game, an educational game, a first-person shooter (FPS) game, and the like.
Wherein the game may comprise a virtual scene of the game drawn on a graphical user interface. Further, one or more virtual objects, such as virtual characters, controlled by a user (or users) may be included in the virtual scene of the game. In addition, one or more obstacles, such as rails, ravines, walls, etc., may also be included in the virtual scene of the game to limit movement of the virtual object, e.g., to limit movement of the one or more objects to a particular area within the virtual scene. Optionally, the virtual scene of the game also includes one or more elements, such as skills, scores, character health status, energy, etc., to provide assistance to the user, provide virtual services, increase scores related to the user's performance, etc. In addition, the graphical user interface may also present one or more indicators to provide indication information to the user. For example, a game may include a user-controlled virtual object and one or more other virtual objects (such as enemy characters). In one embodiment, one or more other virtual objects are controlled by other users of the game. For example, one or more other virtual objects may be computer controlled, such as a robot using an Artificial Intelligence (AI) algorithm, implementing a human-machine engagement mode. For example, virtual objects possess various skills or capabilities that a game user uses to achieve a goal. For example, the virtual object may possess one or more weapons, props, tools, etc. that may be used to eliminate other objects from the game. Such skills or capabilities may be activated by a user of the game using one of a plurality of preset touch operations with the touch display screen of the terminal. The processor may be configured to present a corresponding game screen in response to an operation instruction generated by a touch operation of the user.
It should be noted that the system diagram of the virtual lens control system shown in fig. 1a is only an example. The virtual lens control system and the scenes described in the embodiments of the present application are intended to explain the technical solutions of the embodiments more clearly and do not limit them. As a person of ordinary skill in the art will appreciate, with the evolution of virtual lens control systems and the emergence of new service scenarios, the technical solutions provided by the embodiments of the present application are equally applicable to similar technical problems.
In the present embodiment, the description is given from the viewpoint of a virtual lens control device, which may be integrated in a computer device that has a storage unit, is equipped with a microprocessor, and has computing capability.
Referring to fig. 1b, fig. 1b is a first flowchart of a method for controlling a virtual lens according to an embodiment of the present application. The control method of the virtual lens comprises the following steps:
In step 101, a graphical user interface is displayed, wherein the graphical user interface comprises at least part of a virtual scene picture and game function controls, and the virtual scene picture is obtained by a virtual lens acquiring the virtual scene along a designated acquisition direction.
The virtual scene is provided when an application program runs on the terminal and may be a simulation of the real world, a semi-simulated and semi-fictional scene, or a purely fictional scene. The scene picture displayed on the graphical user interface is the picture presented when a virtual object observes the three-dimensional virtual scene. The user controls the virtual object in the game scene through the terminal, and the virtual object observes the three-dimensional virtual scene through the virtual lens. Taking an FPS game as an example, from the first-person perspective the virtual lens is located at the head or neck of the target virtual object, and only the arms of the virtual character are displayed in the graphical user interface; from the third-person perspective the virtual lens is located behind the target virtual object, and the upper body of the virtual character is displayed. The graphical user interface presents the part of the virtual scene observed by the virtual lens along a designated acquisition direction, and the user can adjust the designated acquisition direction by sliding on the graphical user interface.
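The slide-to-adjust behavior just described can be sketched as follows. The yaw/pitch representation of the acquisition direction, the sensitivity value, and the clamping bounds are illustrative assumptions rather than details given in this application:

```python
def update_acquisition_direction(yaw, pitch, slide_dx, slide_dy, sensitivity=0.1):
    """Return a new (yaw, pitch) for the designated acquisition direction
    after the user slides by (slide_dx, slide_dy) pixels on the GUI.

    All numeric values here are illustrative assumptions.
    """
    # Horizontal slide rotates the lens around the vertical axis.
    yaw = (yaw + slide_dx * sensitivity) % 360.0
    # Vertical slide tilts the lens; clamp so it cannot flip over the pole.
    pitch = max(-89.0, min(89.0, pitch + slide_dy * sensitivity))
    return yaw, pitch
```

A slide of 100 pixels to the right with the assumed sensitivity of 0.1 would turn the lens 10 degrees; the pitch clamp keeps the lens from inverting.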
Specifically, referring to fig. 1c, fig. 1c is a first schematic diagram of a graphical user interface according to an embodiment of the present application. The graphical user interface is presented on the screen of the computer device 1000 and includes a virtual object 10 manipulated by the user, where the virtual object 10 is configured with a virtual firearm for shooting. The game function controls may specifically include: an aiming identifier 20 for assisting the virtual firearm in aiming; a cursor control 30 for prompting the user with the current direction information of the virtual object 10; a movement control 40 for controlling the movement of the virtual object 10 in the three-dimensional virtual environment; an aim control 50 that can be used when the target virtual object 10 is attacked; a map control 60 for prompting the user with the location of the virtual object 10 in the three-dimensional virtual environment; and a firing control 70 for controlling the virtual object 10 to fire the firearm in the three-dimensional virtual environment. An indication control 31 is further disposed in the cursor control 30 and is used for indicating the direction of the virtual object 10 within the cursor control 30. It will be appreciated that the graphical user interface may include not only the above identifiers and controls but also other function controls or identifiers, which depend on the specific game content and are not limited herein.
In step 102, in response to a first touch operation for a first response area in the graphical user interface, functional responses of the game functionality controls within the graphical user interface are masked, and a second response area is generated within the graphical user interface.
As shown in fig. 1d, fig. 1d is a second schematic diagram of the graphical user interface provided in the embodiment of the present application. A first response area 90 is provided within the graphical user interface, and it is set in order to avoid the situation in which the user mistakenly touches other game function controls when adjusting the virtual lens. When a first touch operation on the first response area 90 is received, the functional response of the game function controls in the graphical user interface is masked and a second response area 80 is generated; the second response area 80 is configured to receive the user's touch operations so as to control the movement of the virtual lens.
Specifically, since the function of the game function control is already masked at this time, when the user clicks the area where the game function control is located, only the virtual lens adjustment function corresponding to the second response area 80 is implemented, and the function corresponding to the game function control is not implemented.
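The two-stage flow of steps 102 and 103 — masking the game function controls on the first touch, then routing later touches to lens adjustment — might be sketched as follows. All class and method names, coordinates, and area bounds are illustrative assumptions, not details from this application:

```python
class LensControlUI:
    """Minimal sketch of the masking flow described in step 102."""

    def __init__(self):
        self.controls_masked = False
        self.second_response_area = None  # (x, y, w, h) once generated

    def on_first_touch(self, x, y):
        """A first touch inside the first response area masks the game
        function controls and generates the second response area."""
        if self._in_first_response_area(x, y):
            self.controls_masked = True
            # Place the second response area near the movement control
            # (one of the layouts shown in figs. 1d/1e); bounds are assumed.
            self.second_response_area = (40, 300, 120, 120)

    def dispatch(self, x, y):
        """While controls are masked, a touch landing on a game function
        control triggers lens adjustment instead of the control."""
        if self.controls_masked:
            return "adjust_lens"
        if self._on_game_control(x, y):
            return "game_control"
        return "none"

    def _in_first_response_area(self, x, y):
        return 0 <= x <= 100 and 0 <= y <= 100  # illustrative bounds

    def _on_game_control(self, x, y):
        return 200 <= x <= 260 and 200 <= y <= 260  # illustrative bounds
```

Before the first touch, a tap on the control's area drives the control; after it, the same tap only adjusts the lens, which is the mistouch-avoidance effect the application claims.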
In step 103, in response to a second touch operation for a second response area in the graphical user interface, the virtual lens is controlled to move according to the designated acquisition direction.
As shown in fig. 1d and fig. 1e, a second response area 80 is generated in the graphical user interface. The second response area 80 may be adjacent to a game function control; for example, in fig. 1d the second response area 80 is adjacent to the movement control 40. The second response area 80 may also contain a game function control; for example, in fig. 1e the second response area 80 contains the movement control 40.
Specifically, the second response area 80 is used for receiving a second touch operation of the user on the second response area 80, so as to control the virtual lens to move according to the designated collection direction.
In some embodiments, the second touch operation is a click operation, and the step of controlling the virtual lens to move according to the designated collection direction in response to the second touch operation for the second response area in the graphical user interface includes:
And responding to the clicking operation aiming at the second response area, and controlling the virtual lens to move along the same direction as the designated acquisition direction.
Since both moving the virtual lens in the same direction as the designated acquisition direction and moving it in the opposite direction are triggered by a second touch operation on the second response area, the second touch operation must be classified by operation mode in order to identify the user's control intention. When the operation mode of the second touch operation is a click operation, the virtual lens is controlled to move in the same direction as the designated acquisition direction.
In some embodiments, the second touch operation is a long press operation, and the step of controlling the virtual lens to move according to the designated collection direction in response to the second touch operation for the second response area in the graphical user interface includes:
In response to the long press operation on the second response area, the virtual lens is controlled to move in the direction opposite to the designated acquisition direction.
Correspondingly, when the operation mode of the second touch operation is a long press operation, the virtual lens is controlled to move in the direction opposite to the designated acquisition direction.
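The two branches above amount to classifying the second touch operation by its operation mode and mapping each mode to a movement sign. The following Python sketch illustrates this dispatch under stated assumptions; `TouchMode` and `lens_direction` are hypothetical names, not part of the embodiment.

```python
from enum import Enum, auto

class TouchMode(Enum):
    """Operation modes of the second touch operation."""
    CLICK = auto()
    LONG_PRESS = auto()

def lens_direction(mode: TouchMode) -> int:
    """Map the operation mode to a movement sign: +1 moves the virtual
    lens along the designated acquisition direction (picture zooms in),
    -1 moves it in the opposite direction (picture zooms out)."""
    if mode is TouchMode.CLICK:
        return +1   # same direction as the designated acquisition direction
    if mode is TouchMode.LONG_PRESS:
        return -1   # opposite to the designated acquisition direction
    raise ValueError(f"unsupported touch mode: {mode}")
```

The later embodiments that swap the roles of click and long press would simply invert the two return values.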
In some embodiments, the step of controlling the virtual lens to move in the same direction as the specified acquisition direction in response to the click operation for the second response area includes:
In response to the click operation on the second response area, the movement parameter of the virtual lens is determined according to the operation parameter of the click operation, and the virtual lens is controlled to move with the movement parameter in the same direction as the designated acquisition direction.
The movement parameter for controlling the virtual lens to move is determined by the operation parameter of the second touch operation. For the case where the virtual lens is controlled to move in the same direction as the specified collection direction by the click operation, the movement parameter of the virtual lens may be determined according to the operation parameter of the click operation, and the virtual lens may be controlled to move in the same direction as the specified collection direction by the movement parameter.
In some embodiments, the step of determining the movement parameter of the virtual lens according to the operation parameter of the click operation includes:
(1) Acquiring a first maximum moving distance of the virtual lens along the appointed acquisition direction;
(2) Acquiring the clicking times of the clicking operation, and determining the clicking times as operation parameters of the clicking operation;
(3) And determining the movement parameters of the virtual lens based on the click times and the first maximum movement distance.
Owing to design factors of the virtual scene, the maximum distance the virtual lens can move differs with the collection direction it faces. For example, when the virtual character approaches and faces a wall, the maximum moving distance is small, so as to prevent the virtual lens from clipping through the model and showing the virtual scene on the other side of the wall. Therefore, since the virtual lens is to move in the same direction as the designated acquisition direction, the first maximum movement distance of the virtual lens in the designated acquisition direction needs to be acquired. The first maximum movement distance may be the farthest distance from the target virtual character or target virtual object. For example, if the designated acquisition direction looks toward the target virtual character, the first maximum movement distance is the farthest distance from the target virtual character in the designated acquisition direction, for example, 10m.
Specifically, after the first maximum movement distance is obtained, the number of clicks of the click operation is determined as the operation parameter of the click operation, and the movement parameter of the virtual lens is determined in combination with the number of clicks.
Wherein the step of determining the movement parameter of the virtual lens based on the click times and the first maximum movement distance includes:
(1.1) acquiring a first current position of the virtual lens along the designated acquisition direction;
(1.2) calculating the product of the first maximum moving distance and a first preset distribution proportion to obtain a first reference moving distance;
(1.3) Calculating the product of the first reference moving distance and the number of clicks to obtain a first target moving distance, and determining the first target moving distance as the movement parameter of the virtual lens.
The movement parameter of the virtual lens may be determined from the number of clicks of the click operation and the first maximum movement distance as follows: each time the user clicks once, the virtual lens is moved in the designated acquisition direction by the product of the first maximum moving distance and the first preset distribution proportion, i.e. the first reference moving distance; the total distance the virtual lens moves in the same direction as the designated acquisition direction, i.e. the first target moving distance, is then determined from the user's number of clicks. The virtual lens is controlled to move the first target moving distance from the first current position in the same direction as the designated acquisition direction, thereby magnifying the picture.
For example, if the first maximum moving distance is 10m, the first preset distribution proportion is 20%, and the number of clicks is 3, then the first reference moving distance is 10m×20% = 2m, the first target moving distance is 2m×3 = 6m, and the virtual lens is controlled to move 6m from the first current position in the same direction as the designated collection direction.
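The per-click arithmetic described above can be sketched as a small helper function. This is an illustrative sketch only (the function and parameter names are assumptions); it additionally clamps the result so the lens never travels beyond the first maximum movement distance, consistent with the stated purpose of that limit.

```python
def click_move_distance(max_distance: float,
                        distribution_ratio: float,
                        clicks: int,
                        current_offset: float = 0.0) -> float:
    """Distance the virtual lens moves along the designated acquisition
    direction for a click operation.

    reference = max_distance * distribution_ratio   (step per click)
    target    = reference * clicks                  (requested total)
    The result is clamped so current_offset + move never exceeds
    max_distance (e.g. the lens must not clip through a wall model).
    """
    reference = max_distance * distribution_ratio      # first reference moving distance
    target = reference * clicks                        # first target moving distance
    return min(target, max_distance - current_offset)  # respect the maximum

# Worked example from the text: 10 m x 20% = 2 m per click, 3 clicks -> 6 m
```

With the text's numbers, `click_move_distance(10.0, 0.20, 3)` yields 6.0; a larger click count is simply capped at the remaining travel.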
In some embodiments, the step of controlling the virtual lens to move in a direction opposite to the specified acquisition direction in response to the long press operation on the second response area includes:
In response to the long press operation on the second response area, the movement parameter of the virtual lens is determined according to the operation parameter of the long press operation, and the virtual lens is controlled to move with the movement parameter in the direction opposite to the designated acquisition direction.
Wherein, for the case that the virtual lens is controlled to move in the direction opposite to the specified collection direction by the long press operation, the movement parameter of the virtual lens can be determined according to the operation parameter of the long press operation, and the virtual lens can be controlled to move in the direction opposite to the specified collection direction by the movement parameter.
In some embodiments, the step of determining the movement parameter of the virtual lens according to the operation parameter of the long press operation includes:
(1) Acquiring a second maximum moving distance of the virtual lens along a target acquisition direction, wherein the target acquisition direction is opposite to the designated acquisition direction;
(2) Acquiring the pressing duration of the long press operation, and determining the pressing duration as an operation parameter of the long press operation;
(3) And determining the movement parameter of the virtual lens based on the pressing duration and the second maximum movement distance.
Owing to design factors of the virtual scene, the maximum distance the virtual lens can move in reverse differs with the collection direction it faces. For example, when the virtual character leans against a wall, the maximum moving distance is small, so as to prevent the virtual lens from clipping through the model and showing the virtual scene on the other side of the wall. Therefore, since the virtual lens is to move in the direction opposite to the designated acquisition direction, the second maximum movement distance of the virtual lens in the target acquisition direction needs to be acquired. The second maximum movement distance may be the farthest distance from the target virtual character or target virtual object. For example, if the designated acquisition direction looks toward the target virtual character, the second maximum movement distance is the farthest distance from the target virtual character in the target acquisition direction, for example, 20m.
Specifically, after the second maximum movement distance is obtained, the pressing duration of the long press operation is determined as the operation parameter of the long press operation, and the movement parameter of the virtual lens is determined in combination with the pressing duration.
In some embodiments, the step of determining the movement parameter of the virtual lens based on the pressing duration and the second maximum movement distance includes:
(1.1) obtaining a second current position of the virtual lens along the target acquisition direction;
(1.2) calculating the product of the second maximum movement distance and a second preset distribution proportion to obtain a second reference movement distance;
(1.3) calculating the ratio of the pressing time length to the unit time length to obtain a time length ratio;
(1.4) Calculating the product of the second reference moving distance and the duration proportion to obtain a second target moving distance, and determining the second target moving distance as the movement parameter of the virtual lens.
The movement parameter of the virtual lens may be determined from the pressing duration of the long press operation and the second maximum movement distance as follows: each time the user's pressing duration reaches the unit duration, the virtual lens is moved in the target acquisition direction by the product of the second maximum moving distance and the second preset distribution proportion, i.e. the second reference moving distance. The duration proportion is determined from the user's pressing duration and the unit duration, the product of the second reference moving distance and the duration proportion is calculated, and the total distance the virtual lens moves in the direction opposite to the designated acquisition direction, i.e. the second target moving distance, is obtained. The virtual lens is controlled to move the second target moving distance from the second current position in the direction opposite to the designated acquisition direction, thereby shrinking the picture.
For example, if the second maximum moving distance is 10m, the second preset distribution proportion is 10%, the pressing duration is 3s, and the unit duration is 1s, then the second reference moving distance is 10m×10% = 1m, the duration proportion is 3s/1s = 3, the second target moving distance is 1m×3 = 3m, and the virtual lens is controlled to move 3m from the second current position in the direction opposite to the designated collection direction.
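The duration-based arithmetic can be sketched in the same way. Again the function and parameter names are hypothetical assumptions; the result is clamped to the second maximum movement distance in keeping with its stated purpose.

```python
def press_move_distance(max_distance: float,
                        distribution_ratio: float,
                        press_duration: float,
                        unit_duration: float = 1.0) -> float:
    """Distance the virtual lens moves opposite to the designated
    acquisition direction for a long press operation.

    reference = max_distance * distribution_ratio   (step per unit time)
    ratio     = press_duration / unit_duration      (duration proportion)
    target    = reference * ratio, clamped to max_distance
    """
    reference = max_distance * distribution_ratio   # second reference moving distance
    ratio = press_duration / unit_duration          # duration proportion
    return min(reference * ratio, max_distance)     # second target moving distance

# Worked example from the text: 10 m x 10% = 1 m per second, held 3 s -> 3 m
```

With the text's numbers, `press_move_distance(10.0, 0.10, 3.0)` yields 3.0; holding the press indefinitely is capped at the maximum movement distance.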
In some embodiments, the second touch operation is a click operation, and the step of controlling the virtual lens to move according to the designated collection direction in response to the second touch operation for the second response area in the graphical user interface includes:
In response to the click operation on the second response area, the virtual lens is controlled to move in the direction opposite to the designated acquisition direction.
That is, in this embodiment the click operation controls the virtual lens to move in the direction opposite to the designated collection direction.
In some embodiments, the second touch operation is a long press operation, and the step of controlling the virtual lens to move according to the designated collection direction in response to the second touch operation for the second response area in the graphical user interface includes:
In response to the long press operation on the second response area, the virtual lens is controlled to move in the same direction as the designated acquisition direction.
The long press operation is used for controlling the virtual lens to move along the same direction as the designated acquisition direction.
In some embodiments, the step of controlling the virtual lens to move in a direction opposite to the specified collection direction in response to the click operation for the second response area includes:
In response to the click operation on the second response area, the movement parameter of the virtual lens is determined according to the operation parameter of the click operation, and the virtual lens is controlled to move with the movement parameter in the direction opposite to the designated acquisition direction.
Wherein, for the case that the virtual lens is controlled to move in the direction opposite to the designated collection direction by the click operation, the movement parameter of the virtual lens may be determined according to the operation parameter of the click operation, and the virtual lens may be controlled to move in the direction opposite to the designated collection direction by the movement parameter.
In some embodiments, the step of determining the movement parameter of the virtual lens according to the operation parameter of the click operation includes:
(1) Acquiring a third maximum moving distance of the virtual lens along a target acquisition direction, wherein the target acquisition direction is opposite to the designated acquisition direction;
(2) Acquiring the clicking times of the clicking operation, and determining the clicking times as operation parameters of the clicking operation;
(3) And determining the movement parameters of the virtual lens based on the click times and the third maximum movement distance.
Owing to design factors of the virtual scene, the maximum distance the virtual lens can move in reverse differs with the collection direction it faces. For example, when the virtual character leans against a wall, the maximum moving distance is small, so as to prevent the virtual lens from clipping through the model and showing the virtual scene on the other side of the wall. Therefore, since the virtual lens is to move in the direction opposite to the designated acquisition direction, the third maximum movement distance of the virtual lens in the target acquisition direction needs to be acquired. The third maximum movement distance may be the farthest distance from the target virtual character or target virtual object. For example, if the designated acquisition direction looks toward the target virtual character, the third maximum movement distance is the farthest distance from the target virtual character in the target acquisition direction, for example, 20m.
Specifically, after the third maximum movement distance is obtained, the number of clicks of the click operation is determined as the operation parameter of the click operation, and the movement parameter of the virtual lens is determined in combination with the number of clicks.
In some embodiments, the step of determining the movement parameter of the virtual lens based on the number of clicks and the third maximum movement distance includes:
(1.1) obtaining a third current position of the virtual lens along the designated acquisition direction;
(1.2) calculating the product of the third maximum movement distance and a third preset distribution proportion to obtain a third reference movement distance;
(1.3) Calculating the product of the third reference moving distance and the number of clicks to obtain a third target moving distance, and determining the third target moving distance as the movement parameter of the virtual lens.
The movement parameter of the virtual lens may be determined from the number of clicks of the click operation and the third maximum movement distance as follows: each time the user clicks once, the virtual lens is moved in the target acquisition direction by the product of the third maximum moving distance and the third preset distribution proportion, i.e. the third reference moving distance; the total distance the virtual lens moves in the direction opposite to the designated acquisition direction, i.e. the third target moving distance, is then determined from the user's number of clicks. The virtual lens is controlled to move the third target moving distance from the third current position in the direction opposite to the designated acquisition direction, thereby shrinking the picture.
For example, if the third maximum moving distance is 10m, the third preset distribution proportion is 20%, and the number of clicks is 3, then the third reference moving distance is 10m×20% = 2m, the third target moving distance is 2m×3 = 6m, and the virtual lens is controlled to move 6m from the third current position in the direction opposite to the designated collection direction.
In some embodiments, the step of controlling the virtual lens to move in the same direction as the specified acquisition direction in response to the long press operation of the second response area includes:
In response to the long press operation on the second response area, the movement parameter of the virtual lens is determined according to the operation parameter of the long press operation, and the virtual lens is controlled to move with the movement parameter in the same direction as the designated acquisition direction.
Wherein, for the case that the virtual lens is controlled to move in the same direction as the specified collection direction by the long press operation, the movement parameter of the virtual lens can be determined according to the operation parameter of the long press operation, and the virtual lens can be controlled to move in the same direction as the specified collection direction by the movement parameter.
In some embodiments, the step of determining the movement parameter of the virtual lens according to the operation parameter of the long press operation includes:
(1) Acquiring a fourth maximum moving distance of the virtual lens along the appointed acquisition direction;
(2) Acquiring the pressing duration of the long press operation, and determining the pressing duration as an operation parameter of the long press operation;
(3) And determining the movement parameters of the virtual lens based on the pressing duration and the fourth maximum movement distance.
Owing to design factors of the virtual scene, the maximum distance the virtual lens can move differs with the collection direction it faces. For example, when the virtual character approaches and faces a wall, the maximum moving distance is small, so as to prevent the virtual lens from clipping through the model and showing the virtual scene on the other side of the wall. Therefore, since the virtual lens is to move in the same direction as the designated acquisition direction, the fourth maximum movement distance of the virtual lens in the designated acquisition direction needs to be acquired. The fourth maximum movement distance may be the farthest distance from the target virtual character or target virtual object. For example, if the designated acquisition direction looks toward the target virtual character, the fourth maximum movement distance is the farthest distance from the target virtual character in the designated acquisition direction, for example, 10m.
Specifically, after the fourth maximum movement distance is obtained, the pressing duration of the long press operation is determined as the operation parameter of the long press operation, and the movement parameter of the virtual lens is determined in combination with the pressing duration.
In some embodiments, the step of determining the movement parameter of the virtual lens based on the pressing duration and the fourth maximum movement distance includes:
(1.1) obtaining a fourth current position of the virtual lens along the target acquisition direction;
(1.2) calculating the product of the fourth maximum moving distance and a fourth preset distribution proportion to obtain a fourth reference moving distance;
(1.3) calculating the ratio of the pressing time length to the unit time length to obtain a time length ratio;
(1.4) Calculating the product of the fourth reference moving distance and the duration proportion to obtain a fourth target moving distance, and determining the fourth target moving distance as the movement parameter of the virtual lens.
The movement parameter of the virtual lens may be determined from the pressing duration of the long press operation and the fourth maximum movement distance as follows: each time the user's pressing duration reaches the unit duration, the virtual lens is moved in the designated acquisition direction by the product of the fourth maximum moving distance and the fourth preset distribution proportion, i.e. the fourth reference moving distance. The duration proportion is determined from the user's pressing duration and the unit duration, the product of the fourth reference moving distance and the duration proportion is calculated, and the total distance the virtual lens moves in the same direction as the designated acquisition direction, i.e. the fourth target moving distance, is obtained. The virtual lens is controlled to move the fourth target moving distance from the fourth current position in the same direction as the designated acquisition direction, thereby magnifying the picture.
For example, if the fourth maximum moving distance is 10m, the fourth preset distribution proportion is 10%, the pressing duration is 3s, and the unit duration is 1s, then the fourth reference moving distance is 10m×10% = 1m, the duration proportion is 3s/1s = 3, the fourth target moving distance is 1m×3 = 3m, and the virtual lens is controlled to move 3m from the fourth current position in the same direction as the designated collection direction.
In some embodiments, the first touch operation is a persistent operation, and after the step of controlling the virtual lens to approach or depart from the designated collection direction in response to the first touch operation for the first response area in the graphical user interface, the method further includes:
and if the continuous interruption of the operation of the first touch operation is detected, hiding the second response area, and recovering the functional response of the game functional control in the graphical user interface.
The second touch operation is a continuous operation, such as a long press operation or consecutive click operations, and is performed on the second response area 80 while the user's first touch operation on the first response area 90 persists. When the first touch operation on the first response area ends, the second response area is hidden, and the functional response of the game function control in the graphical user interface is restored.
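The mask-and-restore behavior described above can be sketched as a minimal state holder: the start of the persistent first touch operation masks the game function controls and reveals the second response area, and its interruption reverses both. The class and method names below are hypothetical, not part of the embodiment.

```python
class LensControlState:
    """Tracks whether the game function controls are masked while the
    persistent first touch operation on the first response area is held."""

    def __init__(self):
        self.controls_masked = False      # functional response of game function controls suppressed?
        self.second_area_visible = False  # second response area generated and shown?

    def on_first_touch_begin(self):
        # Mask the functional response of the game function controls and
        # generate the second response area in the graphical user interface.
        self.controls_masked = True
        self.second_area_visible = True

    def on_first_touch_end(self):
        # Persistence of the first touch operation interrupted: hide the
        # second response area and restore the controls' functional response.
        self.controls_masked = False
        self.second_area_visible = False
```

A second touch operation would only be dispatched to the lens while `second_area_visible` is true, which is what prevents accidental triggering of other game function controls.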
As can be seen from the foregoing, the embodiment of the present application displays a graphical user interface that includes at least part of a virtual scene image and game function controls, where the virtual scene image is an image obtained by the virtual lens collecting the virtual scene along a designated collection direction; in response to a first touch operation on a first response area in the graphical user interface, the functional response of the game function controls in the graphical user interface is masked and a second response area is generated in the graphical user interface; and in response to a second touch operation on the second response area, the virtual lens is controlled to move according to the designated collection direction. In this way, the first touch operation on the first response area masks the functional response of the game function controls before the virtual lens is controlled, which avoids accidentally touching other game function controls while controlling the virtual lens.
In order to facilitate better implementation of the virtual lens control method provided by the embodiments of the present application, an embodiment of the present application further provides a device based on the virtual lens control method. The terms have the same meanings as in the above virtual lens control method; for implementation details, refer to the description of the method embodiments.
Referring to fig. 2, fig. 2 is a schematic structural diagram of a control device for a virtual lens according to an embodiment of the present application, where the control device for a virtual lens may include a display module 301, a shielding module 302, a control module 303, and the like.
The display module 301 is configured to display a graphical user interface, where the graphical user interface includes at least a part of virtual scene images and game function controls, where the virtual scene images are images obtained by collecting virtual scenes by a virtual lens along a specified collection direction;
a masking module 302, configured to mask a functional response of the game function control in the graphical user interface in response to a first touch operation for a first response area in the graphical user interface, and generate a second response area in the graphical user interface;
and the control module 303 is configured to control the virtual lens to move according to the specified acquisition direction in response to a second touch operation for a second response area in the graphical user interface.
In some embodiments, the second touch operation is a click operation, and the control module 303 includes:
and the first control sub-module is used for responding to the clicking operation aiming at the second response area and controlling the virtual lens to move along the direction same as the designated acquisition direction.
In some embodiments, the second touch operation is a long press operation, and the control module 303 includes:
and the second control sub-module is used for responding to the long-press operation aiming at the second response area and controlling the virtual lens to move along the direction opposite to the designated acquisition direction.
In some embodiments, the first control sub-module comprises:
and the first determining unit is used for responding to the clicking operation aiming at the second response area, determining the movement parameter of the virtual lens according to the operation parameter of the clicking operation, and controlling the virtual lens to move along the same direction as the designated acquisition direction by the movement parameter.
In some embodiments, the first determining unit includes:
the first acquisition subunit is used for acquiring a first maximum moving distance of the virtual lens along the appointed acquisition direction;
the first determining subunit is used for obtaining the clicking times of the clicking operation and determining the clicking times as the operation parameters of the clicking operation;
and the second determining subunit is used for determining the movement parameters of the virtual lens based on the click times and the first maximum movement distance.
In some embodiments, the second determining subunit is configured to:
acquiring a first current position of the virtual lens along the appointed acquisition direction;
calculating the product of the first maximum moving distance and a first preset distribution proportion to obtain a first reference moving distance;
and calculating the product of the first reference moving distance and the clicking times to obtain a first target moving distance, and determining the first target moving distance as a moving parameter of the virtual lens.
In some embodiments, the second control sub-module comprises:
and the second determining unit is used for responding to the long-press operation aiming at the second response area, determining the movement parameter of the virtual lens according to the operation parameter of the long-press operation and controlling the virtual lens to move along the direction opposite to the designated acquisition direction by the movement parameter.
In some embodiments, the second determining unit includes:
the second acquisition subunit is used for acquiring a second maximum moving distance of the virtual lens along a target acquisition direction, wherein the target acquisition direction is opposite to the designated acquisition direction;
a third determining subunit, configured to obtain a pressing duration of the long press operation, and determine the pressing duration as an operation parameter of the long press operation;
And a fourth determining subunit, configured to determine the movement parameter of the virtual lens based on the pressing duration and the second maximum movement distance.
In some embodiments, the fourth determining subunit is configured to:
acquiring a second current position of the virtual lens along the target acquisition direction;
calculating the product of the second maximum moving distance and a second preset distribution proportion to obtain a second reference moving distance;
calculating the ratio of the pressing time length to the unit time length to obtain the time length ratio;
and calculating the product of the second reference moving distance and the time length proportion to obtain a second target moving distance, and determining the second target moving distance as the moving parameter of the virtual lens.
In some embodiments, the second touch operation is a click operation, and the control module 303 includes:
and the third control sub-module is used for responding to the clicking operation aiming at the second response area and controlling the virtual lens to move along the direction opposite to the designated acquisition direction.
In some embodiments, the second touch operation is a long press operation, and the control module 303 includes:
And the fourth control sub-module is used for responding to the long-press operation aiming at the second response area and controlling the virtual lens to move along the direction same as the designated acquisition direction.
In some embodiments, the third control sub-module comprises:
and the third determining unit is used for responding to the clicking operation aiming at the second response area, determining the movement parameter of the virtual lens according to the operation parameter of the clicking operation, and controlling the virtual lens to move along the direction opposite to the designated acquisition direction by the movement parameter.
In some embodiments, the third determining unit includes:
the third acquisition subunit is used for acquiring a third maximum moving distance of the virtual lens along a target acquisition direction, wherein the target acquisition direction is opposite to the designated acquisition direction;
a fifth determining subunit, configured to obtain a number of clicks of the clicking operation, and determine the number of clicks as an operation parameter of the clicking operation;
and a sixth determining subunit, configured to determine a movement parameter of the virtual lens based on the number of clicks and the third maximum movement distance.
In some embodiments, the sixth determining subunit is configured to:
Acquiring a third current position of the virtual lens along the appointed acquisition direction;
calculating the product of the third maximum moving distance and a third preset distribution proportion to obtain a third reference moving distance;
and calculating the product of the third reference moving distance and the clicking times to obtain a third target moving distance, and determining the third target moving distance as the moving parameter of the virtual lens.
In some embodiments, the fourth control sub-module comprises:
and a fourth determining unit, configured to determine, in response to a long-press operation for the second response area, a movement parameter of the virtual lens according to an operation parameter of the long-press operation, and control the virtual lens to move in the same direction as the specified acquisition direction with the movement parameter.
In some embodiments, the fourth determining unit includes:
a fourth obtaining subunit, configured to obtain a fourth maximum movement distance of the virtual lens along the specified collection direction;
a seventh obtaining subunit, configured to obtain the pressing duration of the long press operation, and determine the pressing duration as an operation parameter of the long press operation;
and an eighth obtaining subunit, configured to determine a movement parameter of the virtual lens based on the pressing duration and the fourth maximum movement distance.
In some embodiments, the eighth acquisition subunit is configured to:
acquiring a fourth current position of the virtual lens along the target acquisition direction;
calculating the product of the fourth maximum moving distance and a fourth preset distribution proportion to obtain a fourth reference moving distance;
calculating the ratio of the pressing time length to the unit time length to obtain the time length ratio;
and calculating the product of the fourth reference moving distance and the time length proportion to obtain a fourth target moving distance, and determining the fourth target moving distance as the moving parameter of the virtual lens.
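The long-press calculation described above can be sketched the same way: the press duration divided by a unit duration yields a proportion that scales the reference distance. Again a sketch under stated assumptions — the naming and the clamp to the maximum distance are not specified by the patent.

```python
def press_move_distance(max_distance: float, ratio: float,
                        press_duration: float, unit_duration: float) -> float:
    """Movement distance for a long-press operation on the second response area.

    reference      = max_distance * ratio            (reference moving distance)
    duration_ratio = press_duration / unit_duration  (duration proportion)
    target         = reference * duration_ratio      (target moving distance)

    The clamp to max_distance is an assumption, as the text does not
    say what happens when the product exceeds the maximum.
    """
    reference = max_distance * ratio
    duration_ratio = press_duration / unit_duration
    target = reference * duration_ratio
    return min(target, max_distance)

# e.g. an 8-unit maximum, a 25% proportion, a 1.0 s press with a 0.5 s unit -> 4.0 units
print(press_move_distance(8.0, 0.25, 1.0, 0.5))
```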
In some embodiments, the apparatus further comprises:
and the recovery module is used for hiding the second response area and restoring the functional response of the game function control in the graphical user interface if an interruption of the continuous first touch operation is detected.
As can be seen from the foregoing, in the embodiment of the present application, the display module 301 displays a graphical user interface that includes at least a partial virtual scene picture and game function controls, where the virtual scene picture is a picture obtained by a virtual lens collecting the virtual scene along a specified collection direction; the shielding module 302, in response to a first touch operation on a first response area in the graphical user interface, shields the functional response of the game function controls in the graphical user interface and generates a second response area in the graphical user interface; and the control module 303, in response to a second touch operation on the second response area, controls the virtual lens to move according to the specified collection direction. Thus, the first touch operation on the first response area shields the functional response of the game function controls before the virtual lens is controlled, which avoids mistakenly triggering other game function controls while the virtual lens is being controlled.
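The display–shield–control flow of the three modules can be sketched as a minimal state machine. All class and method names here are illustrative assumptions, not the patent's implementation; the sketch only shows the ordering of state changes described above.

```python
class VirtualLensController:
    """Sketch of the shield/generate/move/restore flow described above."""

    def __init__(self):
        self.controls_enabled = True      # game function controls respond normally
        self.second_area_visible = False  # second response area not yet generated
        self.lens_offset = 0.0            # lens position along the collection direction

    def on_first_touch(self):
        """First touch on the first response area: shield the game function
        controls and generate the second response area."""
        self.controls_enabled = False
        self.second_area_visible = True

    def on_second_touch(self, distance: float, same_direction: bool):
        """Second touch on the second response area: move the virtual lens
        along, or against, the specified collection direction."""
        if self.second_area_visible:
            self.lens_offset += distance if same_direction else -distance

    def on_first_touch_released(self):
        """Interruption of the continuous first touch: hide the second response
        area and restore the controls' functional response."""
        self.second_area_visible = False
        self.controls_enabled = True
```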
For the specific implementation of each of the above operations, reference may be made to the previous embodiments, and details are not repeated here.
Correspondingly, the embodiment of the present application also provides a computer device, which may be a terminal or a server; the terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen device, a game console, a personal computer (PC, Personal Computer), or a personal digital assistant (Personal Digital Assistant, PDA). As shown in fig. 3, which is a schematic structural diagram of a computer device according to an embodiment of the present application, the computer device 1000 includes a processor 401 having one or more processing cores, a memory 402 having one or more computer-readable storage media, and a computer program stored in the memory 402 and executable on the processor. The processor 401 is electrically connected to the memory 402. Those skilled in the art will appreciate that the computer device structure shown in the figure does not limit the computer device, and the device may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
The processor 401 is the control center of the computer device 1000; it connects the various parts of the entire computer device 1000 using various interfaces and lines, and performs the various functions of the computer device 1000 and processes data by running or loading the software programs and/or modules stored in the memory 402 and calling the data stored in the memory 402, thereby monitoring the computer device 1000 as a whole.
In the embodiment of the present application, the processor 401 in the computer device 1000 loads instructions corresponding to the processes of one or more application programs into the memory 402, and the processor 401 executes the application programs stored in the memory 402 so as to implement the following functions:
displaying a graphical user interface, wherein the graphical user interface comprises at least part of virtual scene pictures and game function controls, and the virtual scene pictures are pictures obtained by collecting virtual scenes by a virtual lens along a designated collection direction; in response to a first touch operation for a first response area in the graphical user interface, shielding a functional response of the game function control in the graphical user interface, and generating a second response area in the graphical user interface; and responding to a second touch operation for a second response area in the graphical user interface, and controlling the virtual lens to move according to the designated acquisition direction.
For the specific implementation of each of the above operations, reference may be made to the previous embodiments, and details are not repeated here.
Optionally, as shown in fig. 3, the computer device 1000 further includes: a touch display 403, a radio frequency circuit 404, an audio circuit 405, an input unit 406, and a power supply 407. The processor 401 is electrically connected to the touch display 403, the radio frequency circuit 404, the audio circuit 405, the input unit 406, and the power supply 407, respectively. Those skilled in the art will appreciate that the computer device structure shown in fig. 3 does not limit the computer device, and the device may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
The touch display 403 may be used to display a graphical user interface and receive operation instructions generated by a user acting on the graphical user interface. The touch display screen 403 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as various graphical user interfaces of the computer device, which may be composed of graphics, text, icons, video, and any combination thereof. Alternatively, the display panel may be configured in the form of a liquid crystal display (LCD, Liquid Crystal Display), an Organic Light-Emitting Diode (OLED) display, or the like. The touch panel may be used to collect touch operations by the user on or near it (such as operations performed by the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory) and to generate corresponding operation instructions that trigger the corresponding programs. Alternatively, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 401, and can receive and execute commands sent by the processor 401. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, it passes the operation to the processor 401 to determine the type of touch event, and the processor 401 then provides a corresponding visual output on the display panel according to the type of touch event.
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 403 to implement the input and output functions. In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions respectively. That is, the touch display 403 may also implement an input function as part of the input unit 406.
In the embodiment of the application, the processor 401 executes the game application program to generate a graphical user interface on the touch display screen 403, where the virtual scene on the graphical user interface includes at least one skill control area, and the skill control area includes at least one skill control. The touch display 403 is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface.
The radio frequency circuitry 404 may be used to transceive radio frequency signals to establish wireless communications with a network device or other computer device via wireless communications.
The audio circuitry 405 may be used to provide an audio interface between the user and the computer device through a speaker, a microphone, and the like. The audio circuit 405 may convert received audio data into an electrical signal and transmit it to the speaker, which converts it into a sound signal for output; conversely, the microphone converts a collected sound signal into an electrical signal, which the audio circuit 405 receives and converts into audio data; the audio data is then output to the processor 401 for processing and sent via the radio frequency circuit 404 to, for example, another computer device, or output to the memory 402 for further processing. The audio circuit 405 may also include an earphone jack to provide communication between a peripheral earphone and the computer device.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 407 is used to power the various components of the computer device 1000. Alternatively, the power supply 407 may be logically connected to the processor 401 through a power management system, so as to implement functions of managing charging, discharging, and power consumption management through the power management system. The power supply 407 may also include one or more of any of a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown in fig. 3, the computer device 1000 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described herein.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
As can be seen from the above, the computer device provided in this embodiment displays a graphical user interface that includes at least a partial virtual scene picture and game function controls, where the virtual scene picture is a picture obtained by a virtual lens collecting the virtual scene along a specified collection direction; in response to a first touch operation on a first response area in the graphical user interface, it shields the functional response of the game function controls in the graphical user interface and generates a second response area in the graphical user interface; and in response to a second touch operation on the second response area, it controls the virtual lens to move according to the specified collection direction. Thus, the first touch operation on the first response area shields the functional response of the game function controls before the virtual lens is controlled, which avoids mistakenly triggering other game function controls while the virtual lens is being controlled.
Those of ordinary skill in the art will appreciate that all or part of the steps of the various methods in the above embodiments may be completed by instructions, or by instructions controlling the associated hardware, and the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer-readable storage medium storing a plurality of computer programs that can be loaded by a processor to perform the steps in any of the virtual lens control methods provided by the embodiments of the present application. For example, the computer program may perform the following steps:
displaying a graphical user interface, wherein the graphical user interface comprises at least part of virtual scene pictures and game function controls, and the virtual scene pictures are pictures obtained by collecting virtual scenes by a virtual lens along a designated collection direction; in response to a first touch operation for a first response area in the graphical user interface, shielding a functional response of the game function control in the graphical user interface, and generating a second response area in the graphical user interface; and responding to a second touch operation for a second response area in the graphical user interface, and controlling the virtual lens to move according to the designated acquisition direction.
For the specific implementation of each of the above operations, reference may be made to the previous embodiments, and details are not repeated here.
The storage medium may include: a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and the like.
Since the computer program stored in the storage medium can execute the steps in any of the virtual lens control methods provided in the embodiments of the present application, it can achieve the beneficial effects that any of those methods can achieve, as detailed in the previous embodiments and not repeated here.
The control method, apparatus, storage medium and computer device for a virtual lens provided in the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method and core ideas of the present application. Meanwhile, those skilled in the art may make changes to the specific implementations and application scope in light of the ideas of the present application. In view of the above, the contents of this description should not be construed as limiting the present application.
Claims (21)
1. A method for controlling a virtual lens, comprising:
displaying a graphical user interface, wherein the graphical user interface comprises at least part of virtual scene pictures and game function controls, and the virtual scene pictures are pictures obtained by collecting virtual scenes by a virtual lens along a designated collection direction;
in response to a first touch operation for a first response area in the graphical user interface, shielding a functional response of the game function control in the graphical user interface, and generating a second response area in the graphical user interface;
and responding to a second touch operation for a second response area in the graphical user interface, and controlling the virtual lens to move according to the designated acquisition direction.
2. The method according to claim 1, wherein the second touch operation is a click operation, and the step of controlling the virtual lens to move according to the designated collection direction in response to the second touch operation for the second response area in the graphical user interface includes:
and responding to the clicking operation aiming at the second response area, and controlling the virtual lens to move along the same direction as the designated acquisition direction.
3. The method according to claim 2, wherein the second touch operation is a long press operation, and the step of controlling the virtual lens to move according to the specified collection direction in response to the second touch operation for the second response area in the graphical user interface includes:
and responding to the long-press operation aiming at the second response area, and controlling the virtual lens to move along the direction opposite to the designated acquisition direction.
4. The method according to claim 2, wherein the step of controlling the virtual lens to move in the same direction as the specified collection direction in response to the click operation for the second response area, comprises:
and responding to the clicking operation aiming at the second response area, determining the movement parameter of the virtual lens according to the operation parameter of the clicking operation, and controlling the virtual lens to move along the same direction as the designated acquisition direction by the movement parameter.
5. The method according to claim 4, wherein the step of determining the movement parameter of the virtual lens according to the operation parameter of the click operation includes:
Acquiring a first maximum moving distance of the virtual lens along the appointed acquisition direction;
acquiring the clicking times of the clicking operation, and determining the clicking times as operation parameters of the clicking operation;
and determining the movement parameters of the virtual lens based on the click times and the first maximum movement distance.
6. The method according to claim 5, wherein the step of determining the movement parameter of the virtual lens based on the number of clicks and the first maximum movement distance includes:
acquiring a first current position of the virtual lens along the appointed acquisition direction;
calculating the product of the first maximum moving distance and a first preset distribution proportion to obtain a first reference moving distance;
and calculating the product of the first reference moving distance and the clicking times to obtain a first target moving distance, and determining the first target moving distance as a moving parameter of the virtual lens.
7. A control method of a virtual lens according to claim 3, wherein the step of controlling the virtual lens to move in a direction opposite to the specified collection direction in response to the long press operation for the second response area includes:
And responding to the long-press operation aiming at the second response area, determining the movement parameter of the virtual lens according to the operation parameter of the long-press operation, and controlling the virtual lens to move along the direction opposite to the designated acquisition direction by the movement parameter.
8. The method according to claim 7, wherein the step of determining the movement parameter of the virtual lens according to the operation parameter of the long press operation includes:
acquiring a second maximum moving distance of the virtual lens along a target acquisition direction, wherein the target acquisition direction is opposite to the designated acquisition direction;
acquiring the pressing duration of the long press operation, and determining the pressing duration as an operation parameter of the long press operation;
and determining the movement parameter of the virtual lens based on the pressing duration and the second maximum movement distance.
9. The method according to claim 8, wherein the step of determining the movement parameter of the virtual lens based on the pressing time period and the second maximum movement distance includes:
acquiring a second current position of the virtual lens along the target acquisition direction;
Calculating the product of the second maximum moving distance and a second preset distribution proportion to obtain a second reference moving distance;
calculating the ratio of the pressing time length to the unit time length to obtain the time length ratio;
and calculating the product of the second reference moving distance and the time length proportion to obtain a second target moving distance, and determining the second target moving distance as the moving parameter of the virtual lens.
10. The method according to claim 1, wherein the second touch operation is a click operation, and the step of controlling the virtual lens to move according to the designated collection direction in response to the second touch operation for the second response area in the graphical user interface includes:
and responding to the clicking operation aiming at the second response area, and controlling the virtual lens to move in the direction opposite to the designated acquisition direction.
11. The method according to claim 10, wherein the second touch operation is a long press operation, and the step of controlling the virtual lens to move according to the specified collection direction in response to the second touch operation for the second response area in the graphical user interface includes:
And responding to the long press operation aiming at the second response area, and controlling the virtual lens to move along the same direction as the designated acquisition direction.
12. The method according to claim 10, wherein the step of controlling the virtual lens to move in a direction opposite to the specified collection direction in response to the click operation for the second response area, comprises:
and responding to the clicking operation aiming at the second response area, determining the movement parameter of the virtual lens according to the operation parameter of the clicking operation, and controlling the virtual lens to move along the direction opposite to the designated acquisition direction by the movement parameter.
13. The method according to claim 12, wherein the step of determining the movement parameter of the virtual lens according to the operation parameter of the click operation includes:
acquiring a third maximum moving distance of the virtual lens along a target acquisition direction, wherein the target acquisition direction is opposite to the designated acquisition direction;
acquiring the clicking times of the clicking operation, and determining the clicking times as operation parameters of the clicking operation;
And determining the movement parameters of the virtual lens based on the click times and the third maximum movement distance.
14. The method according to claim 13, wherein the step of determining the movement parameter of the virtual lens based on the number of clicks and the third maximum movement distance includes:
acquiring a third current position of the virtual lens along the appointed acquisition direction;
calculating the product of the third maximum moving distance and a third preset distribution proportion to obtain a third reference moving distance;
and calculating the product of the third reference moving distance and the clicking times to obtain a third target moving distance, and determining the third target moving distance as the moving parameter of the virtual lens.
15. The method according to claim 11, characterized in that the step of controlling the virtual lens to move in the same direction as the specified collection direction in response to the long press operation for the second response area, comprises:
and responding to the long-press operation aiming at the second response area, determining the movement parameter of the virtual lens according to the operation parameter of the long-press operation, and controlling the virtual lens to move along the same direction as the designated acquisition direction by the movement parameter.
16. The method according to claim 15, wherein the step of determining the movement parameter of the virtual lens according to the operation parameter of the long press operation includes:
acquiring a fourth maximum moving distance of the virtual lens along the appointed acquisition direction;
acquiring the pressing duration of the long press operation, and determining the pressing duration as an operation parameter of the long press operation;
and determining the movement parameters of the virtual lens based on the pressing duration and the fourth maximum movement distance.
17. The method according to claim 16, wherein the step of determining the movement parameter of the virtual lens based on the pressing time period and the fourth maximum movement distance includes:
acquiring a fourth current position of the virtual lens along the target acquisition direction;
calculating the product of the fourth maximum moving distance and a fourth preset distribution proportion to obtain a fourth reference moving distance;
calculating the ratio of the pressing time length to the unit time length to obtain the time length ratio;
and calculating the product of the fourth reference moving distance and the time length proportion to obtain a fourth target moving distance, and determining the fourth target moving distance as the moving parameter of the virtual lens.
18. The method according to claim 1, wherein the first touch operation is a persistent operation, and further comprising, after the step of controlling the virtual lens to approach or separate from the designated collection direction in response to the first touch operation for the first response area in the graphical user interface:
and if the continuous interruption of the operation of the first touch operation is detected, hiding the second response area, and recovering the functional response of the game functional control in the graphical user interface.
19. A control device for a virtual lens, comprising:
the display module is used for displaying a graphical user interface, wherein the graphical user interface comprises at least partial virtual scene pictures and game function controls, and the virtual scene pictures are pictures obtained by collecting virtual scenes by a virtual lens along a designated collection direction;
a shielding module, configured to shield a functional response of the game function control in the graphical user interface in response to a first touch operation for a first response area in the graphical user interface, and generate a second response area in the graphical user interface;
And the control module is used for responding to a second touch operation aiming at a second response area in the graphical user interface and controlling the virtual lens to move according to the appointed acquisition direction.
20. A computer readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps in the method of controlling a virtual lens according to any one of claims 1 to 18.
21. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps in the method of controlling a virtual lens according to any one of claims 1 to 18 when the program is executed.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202310239205.4A | 2023-03-09 | 2023-03-09 | Virtual lens control method and device, storage medium and computer equipment |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| CN116474367A | 2023-07-25 |
Legal Events

| Date | Code | Title |
|---|---|---|
| | PB01 | Publication |
| | SE01 | Entry into force of request for substantive examination |