CN115430145A - Target position interaction method and device, electronic equipment and readable storage medium - Google Patents
Info
- Publication number
- CN115430145A (Application CN202211111993.0A)
- Authority
- CN
- China
- Prior art keywords
- game
- target
- game interface
- visual angle
- interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
- A63F13/5258—Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiments of this application disclose an interaction method and apparatus for a target position, an electronic device, and a computer-readable storage medium. The method displays a first game interface showing the initial viewing angle of a currently running game; in response to a continuous touch operation in a viewing angle switching area of the first game interface, displays a crosshair at a preset screen position to obtain a second game interface with the initial viewing angle; in response to a drag operation in the viewing angle switching area of the second game interface, switches the viewing angle of the second game interface and the position the crosshair points at, obtaining a third game interface with a target viewing angle; and takes the position the crosshair points at in the third game interface as the target interaction position and executes a target instruction. The embodiments can improve the accuracy and speed of marking a target position while preserving convenience of operation.
Description
Technical Field
The application relates to the technical field of game interface interaction, and in particular to a target position interaction method and apparatus, an electronic device, and a computer-readable storage medium.
Background
Existing games, such as MOBA (Multiplayer Online Battle Arena) games, usually need to mark a target position in a scene in order to interact with that position.
In the prior art, target position marking is generally achieved either by dragging a signal button to a position in the scene, or by tapping a position on a mini-map.
However, neither method achieves both convenience of the marking operation and accuracy of the marked position. The first method can locate the target position accurately, but if the distance between the signal button and the target position to be marked in the scene is large, the button must be dragged a long way, possibly pulling the finger well outside its original touch range, so operation convenience is low. The second method can mark a target position conveniently within a small range of movement, but because locations on the mini-map are densely packed, it is difficult to mark the target position accurately.
Disclosure of Invention
The embodiments of this application provide an interaction method and apparatus for a target position, an electronic device, and a computer-readable storage medium, which can improve the accuracy and speed of marking a target position while preserving convenience of operation.
In a first aspect, an embodiment of the present application provides an interaction method for a target location, including:
displaying a first game interface showing an initial viewing angle of a currently running game, wherein the first game interface comprises a viewing angle switching area;
in response to a continuous touch operation in the viewing angle switching area of the first game interface, displaying a crosshair on the first game interface at a preset screen position to obtain a second game interface with the initial viewing angle;
in response to a drag operation in the viewing angle switching area of the second game interface, switching the viewing angle of the second game interface and the position the crosshair points at, to obtain a third game interface with a target viewing angle, wherein the crosshair remains displayed on the third game interface at the preset screen position, and the position the crosshair points at switches from the scene position under the preset screen position in the second game interface to the scene position under the preset screen position in the third game interface;
taking the position the crosshair points at in the third game interface as a target interaction position, and executing a target instruction.
In a second aspect, an embodiment of the present application further provides an interaction apparatus for a target location, including:
a first display unit, configured to display a first game interface showing an initial viewing angle of a currently running game, wherein the first game interface comprises a viewing angle switching area;
a second display unit, configured to, in response to a continuous touch operation in the viewing angle switching area of the first game interface, display a crosshair on the first game interface at a preset screen position to obtain a second game interface with the initial viewing angle;
a switching unit, configured to, in response to a drag operation in the viewing angle switching area of the second game interface, switch the viewing angle of the second game interface and the position the crosshair points at, to obtain a third game interface with a target viewing angle, wherein the crosshair remains displayed on the third game interface at the preset screen position, and the position the crosshair points at switches from the scene position under the preset screen position in the second game interface to the scene position under the preset screen position in the third game interface;
an interaction unit, configured to take the position the crosshair points at in the third game interface as a target interaction position and execute a target instruction.
In a third aspect, an embodiment of the present application further provides an electronic device, comprising a memory storing a plurality of instructions, and a processor that loads the instructions from the memory to perform the steps of any of the target position interaction methods provided by the embodiments of the present application.
In a fourth aspect, the present application further provides a computer-readable storage medium storing a plurality of instructions suitable for being loaded by a processor to perform the steps of any of the target position interaction methods provided by the embodiments of the present application.
In the embodiments of this application, in the first aspect, a viewing angle switching area is provided in the game interface of the currently running game for switching the viewing angle, and a crosshair is used to aim at the target interaction position, so the viewing angle can be switched with only a small drag inside the viewing angle switching area. This avoids having to drag a signal button all the way to the target position to be marked, and therefore avoids the long drags, which can pull the finger well outside its original touch range, that occur when the button and the target position in the scene are far apart; this improves the convenience of target position interaction to a certain extent. In the second aspect, in response to a drag operation in the viewing angle switching area of the second game interface, the viewing angle of the second game interface and the position the crosshair points at are switched, so that the second game interface with the initial viewing angle becomes the third game interface with the target viewing angle while the crosshair remains displayed at the preset screen position. As the game interface switches from the initial viewing angle to the target viewing angle, the position the crosshair points at changes with it, moving the crosshair to the specific position to be interacted with, so the target position can be located accurately.
In the third aspect, because the crosshair remains displayed on the third game interface at the preset screen position, its screen position never changes and does not need to be re-detected under the target viewing angle; the position the crosshair points at can therefore be determined quickly, which increases the speed of marking the target position. In summary, the accuracy and speed of marking a target position can be improved while preserving convenience of operation.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed for describing the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of an interactive system for target location provided by an embodiment of the present application;
fig. 2 is a flowchart of an embodiment of the target position interaction method provided by an embodiment of the present application;
FIG. 3 is a schematic comparison of the game interface before and after the crosshair is displayed according to an embodiment of the present application;
FIG. 4 is a schematic diagram illustrating a game interface before and after switching of a viewing angle according to an embodiment of the present application;
FIG. 5 is a flowchart illustrating an embodiment of executing a target instruction according to the present embodiment;
FIG. 6 is a schematic structural diagram of an interaction apparatus for a target location according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the drawings in the embodiments of the present application. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments that a person skilled in the art can derive from them without creative effort fall within the protection scope of the present application. In the description of the embodiments, the terms "first", "second", and the like are used only to distinguish the descriptions and are not to be construed as indicating or implying relative importance; thus, a feature qualified by "first" or "second" may explicitly or implicitly include one or more such features. Unless specifically defined otherwise, "a plurality" means two or more.
The embodiment of the application provides an interaction method and device for a target position, electronic equipment and a computer-readable storage medium. Specifically, the target location interaction method according to the embodiment of the present application may be executed by an electronic device, where the electronic device may be a terminal or a server. The terminal can be a terminal device such as a smart phone, a tablet Computer, a notebook Computer, a touch screen, a game machine, a Personal Computer (PC), a Personal Digital Assistant (PDA), and the like, and the terminal can also include a client, which can be a game application client, a browser client carrying a game program, or an instant messaging client, and the like. The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud service, a cloud database, cloud computing, cloud functions, cloud storage, network service, cloud communication, middleware service, domain name service, security service, CDN, and a big data and artificial intelligence platform.
For example, when the interaction method of the target position is operated on the terminal, the terminal device stores a game application program and is used for presenting a virtual scene in a game picture. The terminal device is used for interacting with a user through a graphical user interface, for example, downloading and installing a game application program through the terminal device and running the game application program. The manner in which the terminal device provides the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including a game screen and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for executing the game, generating the graphical user interface, responding to the operation instructions, and controlling display of the graphical user interface on the touch display screen.
For example, when the target position interaction method runs on a server, it can be applied to a cloud game. Cloud gaming is a game mode based on cloud computing: the entity that runs the game application is separated from the entity that presents the game picture, and the storage and execution of the target position interaction method are completed on a cloud game server. The game picture is presented by a cloud game client, which mainly receives and sends game data and renders the picture; the client may be any display device with data transmission capability near the user, such as a mobile terminal, a television, a computer, a handheld computer, or a personal digital assistant, while the device that actually processes the game data is the cloud game server. During play, the user operates the cloud game client to send operation instructions to the cloud game server; the server runs the game according to those instructions, encodes and compresses the game pictures and other data, and returns them over the network to the client, which finally decodes and outputs the game pictures.
Referring to fig. 1, fig. 1 is a schematic diagram of an interactive system for a target location according to an embodiment of the present disclosure. The system may include at least one terminal 1000, at least one server 2000, at least one database 3000, and a network 4000. The terminal 1000 held by the user can be connected to servers of different games through the network 4000. Terminal 1000 can be any device having computing hardware capable of supporting and executing a software product corresponding to a game. In addition, terminal 1000 can have one or more multi-touch sensitive screens for sensing and obtaining user input through touch or slide operations performed at multiple points on one or more touch sensitive display screens. In addition, when the system includes a plurality of terminals 1000, a plurality of servers 2000, and a plurality of networks 4000, different terminals 1000 may be connected to each other through different networks 4000 and through different servers 2000. The network 4000 may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, etc. In addition, different terminals 1000 may be connected to other terminals or a server using their own bluetooth network or hotspot network. For example, a plurality of users may be online through different terminals 1000 to be connected and synchronized with each other through a suitable network to support multiplayer games. In addition, the system may include a plurality of databases 3000, the plurality of databases 3000 being coupled to different servers 2000, and information related to the game environment may be continuously stored in the databases 3000 when different users play the multiplayer game online.
The target position interaction method provided by the embodiment of the application can be executed by a terminal or a server. The embodiment of the present application is described by taking an example in which the interaction method of the target location is executed by the terminal. The terminal comprises a touch display screen and a processor, wherein the touch display screen is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface. When a user operates the graphical user interface through the touch display screen, the graphical user interface can control the local content of the terminal through responding to the received operation instruction, and can also control the content of the opposite-end server through responding to the received operation instruction. For example, the operating instructions generated by the user acting on the graphical user interface include instructions for launching the game application, and the processor is configured to launch the game application after receiving the instructions provided by the user to launch the game application. Further, the processor is configured to render and draw a graphical user interface associated with the game on the touch display screen. A touch display screen is a multi-touch sensitive screen capable of sensing a touch or slide operation performed simultaneously at a plurality of points on the screen. The user uses a finger to perform touch operation on the graphical user interface, and when the graphical user interface detects the touch operation, different virtual objects in the graphical user interface of the game are controlled to perform actions corresponding to the touch operation. For example, the game may be any one of a leisure game, an action game, a role playing game, a strategy game, a sports game, an educational game, and the like. 
Wherein the game may include a virtual scene of the game drawn on a graphical user interface. In addition, one or more virtual objects, such as virtual characters, controlled by the user (or player) may be included in the virtual scene of the game. Optionally, the virtual scene of the game also includes one or more elements, such as skills, points, character health, energy, etc., to provide assistance to the player, provide virtual services, increase points related to player performance, etc. In addition, the graphical user interface may also present one or more indicators to provide instructional information to the player. For example, a game may include a player-controlled virtual object and one or more other virtual objects (such as enemy characters). In one embodiment, one or more other virtual objects are controlled by other players of the game. For example, one or more other virtual objects may be computer controlled, such as a robot using Artificial Intelligence (AI) algorithms, to implement a human-machine fight mode. For example, the virtual objects possess various skills or capabilities that the game player uses to achieve the goal. For example, the virtual object possesses one or more weapons, props, tools, etc. that may be used to eliminate other objects from the game. Such skills or capabilities may be activated by a player of the game using one of a plurality of preset touch operations with a touch display screen of the terminal. The processor may be configured to present a corresponding game screen in response to an operation instruction generated by a touch operation of a user.
The following detailed description is given with reference to the accompanying drawings. The order in which the embodiments are described is not intended to limit them to a preferred order. Although the flowcharts show a logical order, in some cases the steps shown or described may be performed in a different order.
As shown in fig. 2, a specific flow of the target location interaction method may include the following steps 201 to 204, where:
201. A first game interface showing an initial viewing angle of the currently running game is displayed.
The first game interface comprises a viewing angle switching area.
Wherein the currently running game includes, but is not limited to, a MOBA game.
In this embodiment, the target position is aimed at through a crosshair, so that it can be selected as the target interaction position.
The first game interface is the game interface of the currently running game before the crosshair display is triggered.
The initial viewing angle is the viewing angle of the virtual scene of the currently running game presented by its game interface before the drag operation in the viewing angle switching area of the second game interface. The initial viewing angle may observe the virtual scene from the first-person perspective of the virtual object, or from a third-person perspective.
The viewing angle switching area is the area used to switch, by touch operation, the viewing angle of the virtual scene presented by the game interface of the currently running game.
Taking a terminal with a touch display screen as an example, in step 201 the terminal renders the running game application on the touch display screen to generate a first game interface with the initial viewing angle. The virtual scene on the first game interface includes a viewing angle switching area, which is used both to trigger the crosshair and to switch the initial viewing angle presented in the first game interface, so that the crosshair can aim at a target position and interaction with the selected target position can be realized.
202. In response to a continuous touch operation in the viewing angle switching area of the first game interface, display a crosshair on the first game interface at a preset screen position to obtain a second game interface with the initial viewing angle.
The second game interface is the game interface of the currently running game after the crosshair display is triggered.
The continuous touch operation is a touch operation performed on the viewing angle switching area to trigger the crosshair display. Continuous touch operations include, but are not limited to, two consecutive taps in the viewing angle switching area within a short time, e.g., 2 seconds.
Taking two consecutive taps in the viewing angle switching area within a short time as an example of the continuous touch operation, before step 202 it is further detected whether a continuous touch operation exists in the viewing angle switching area of the first game interface. This detection specifically includes the following steps A1 to A3:
A1. Receive a first click operation in the viewing angle switching area of the first game interface.
The first click operation is the first of two consecutive click operations in the viewing angle switching area of the first game interface.
A2. Receive a second click operation in the viewing angle switching area of the first game interface.
The second click operation is the second of two consecutive click operations in the viewing angle switching area of the first game interface.
For example, suppose the player clicks the viewing angle switching area of the first game interface three times in succession, recorded as click 1, click 2, and click 3. If click 1 and click 2 are two consecutive click operations in the viewing angle switching area, the first click operation is click 1 and the second is click 2. If click 2 and click 3 are two consecutive click operations, the first click operation is click 2 and the second is click 3.
As shown in fig. 3, fig. 3 compares the game interface before and after the crosshair is displayed according to an embodiment of the present application. Fig. 3 (a) is a schematic diagram of the first game interface. As shown in fig. 3 (a), the crosshair display may be triggered by two consecutive taps in the viewing angle switching area (the dashed box in fig. 3 (a)) within a short time: the player quickly taps, lifts, then taps and holds in the viewing angle switching area of the first game interface. The first tap is the first click operation, and its time is recorded; the tap-and-hold is the second click operation, and its time is also recorded.
A3. If the interval between the time of the first click operation and the time of the second click operation is smaller than a preset time interval, determine that a continuous touch operation exists in the viewing angle switching area of the first game interface.
After the first and second click operations in the viewing angle switching area of the first game interface are received, the interval between their times is compared with the preset time interval. If the interval is smaller than the preset time interval, it is determined that a continuous touch operation exists in the viewing angle switching area, and the method proceeds to step 202 to trigger the crosshair display, shown as "+" in fig. 3 (b); fig. 3 (b) is a schematic diagram of the second game interface.
If the interval is greater than or equal to the preset time interval, it is determined that no continuous touch operation exists in the viewing angle switching area of the first game interface, and processing either stops or continues to detect whether a continuous touch operation exists (based on the next pair of consecutive click operations).
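The double-tap detection of steps A1 to A3 can be sketched as follows. This is a minimal illustration, not the patent's implementation; the class name and the 0.3-second threshold are assumptions.

```python
# Minimal sketch of the double-tap detection in steps A1-A3.
# PRESET_INTERVAL and all names are illustrative assumptions.
PRESET_INTERVAL = 0.3  # preset time interval, in seconds


class DoubleTapDetector:
    def __init__(self, preset_interval=PRESET_INTERVAL):
        self.preset_interval = preset_interval
        self.last_tap_time = None  # time of the "first click operation"

    def on_tap(self, tap_time):
        """Return True when two consecutive taps arrive within the preset interval."""
        if (self.last_tap_time is not None
                and tap_time - self.last_tap_time < self.preset_interval):
            self.last_tap_time = None  # consume the pair
            return True  # continuous touch operation detected -> show crosshair
        # Otherwise treat this tap as a new "first click operation".
        self.last_tap_time = tap_time
        return False
```

A tap that arrives too late simply becomes the new first click operation, matching the "continue to detect based on the next pair" behavior described above.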
The preset screen position is a screen position set in advance for displaying the crosshair. It may be, for example, the center of the touch display screen or its lower-left corner; the specific setting of the preset screen position is not limited here.
For example, taking the center of the touch display screen as the preset screen position, when a continuous touch operation in the viewing angle switching area of the first game interface is detected, a crosshair is displayed at the center of the touch display screen in the first game interface, as shown in fig. 3 (b).
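Resolving a named preset position to pixel coordinates can be sketched as below; the preset names are illustrative assumptions, since the patent only requires that the position be fixed in advance.

```python
def preset_screen_position(screen_w, screen_h, preset="center"):
    """Resolve a named preset position to pixel coordinates for the crosshair.

    Assumes screen coordinates with the origin at the top-left corner and
    y growing downward; the preset names here are illustrative only.
    """
    presets = {
        "center": (screen_w // 2, screen_h // 2),
        "bottom_left": (0, screen_h - 1),
    }
    return presets[preset]
```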
203. In response to a drag operation in the viewing angle switching area of the second game interface, switch the viewing angle of the second game interface and the position the crosshair points at, to obtain a third game interface with a target viewing angle.
The crosshair remains displayed on the third game interface at the preset screen position, and the position the crosshair points at switches from the scene position under the preset screen position in the second game interface to the scene position under the preset screen position in the third game interface.
The target viewing angle is the viewing angle of the virtual scene of the currently running game presented by its game interface after the drag operation in the viewing angle switching area of the second game interface.
The drag operation is a touch operation performed on the viewing angle switching area to switch the viewing angle of the virtual scene presented by the game interface of the currently running game. The drag operation includes, but is not limited to, pressing and holding in the viewing angle switching area and sliding up, down, left, or right.
For example, when the player presses in the viewing angle switching area and slides upward, the viewing angle of the virtual scene presented by the game interface moves from the initial viewing angle toward the scene below it; sliding downward moves it toward the scene above. Likewise, sliding to the left moves the viewing angle toward the scene to the right of the initial viewing angle, and sliding to the right moves it toward the scene on the left.
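The inverted drag-to-viewing-angle mapping described above (slide up looks down, slide left looks right) can be sketched as a yaw/pitch update. The sensitivity value and clamping range are illustrative assumptions, not taken from the patent.

```python
# Sketch of the drag-to-viewing-angle mapping described above.
# SENSITIVITY and the pitch clamp are illustrative assumptions.
SENSITIVITY = 0.1  # degrees of camera rotation per pixel of drag


def apply_drag(yaw, pitch, dx, dy, sensitivity=SENSITIVITY):
    """Update camera yaw/pitch (degrees) from a drag delta (pixels).

    Screen y grows downward, so sliding up gives dy < 0, which lowers the
    pitch (the view moves toward the scene below the initial viewing angle),
    and sliding left gives dx < 0, which raises the yaw (the view moves to
    the right), matching the inverted mapping in the description.
    """
    yaw = (yaw - dx * sensitivity) % 360.0   # slide left -> look right
    pitch = pitch + dy * sensitivity         # slide up -> look down
    pitch = max(-89.0, min(89.0, pitch))     # avoid flipping over the pole
    return yaw, pitch
```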
The viewing-angle pointing position of the crosshair is the position at which the crosshair aims within the virtual-scene viewing angle presented by the game interface of the currently running game.
As shown in fig. 4, fig. 4 is a schematic diagram comparing the game interfaces before and after the viewing angle is switched according to an embodiment of the present application, where (a) in fig. 4 shows the second game interface with the initial viewing angle and (b) in fig. 4 shows the third game interface with the target viewing angle. In response to the drag operation in the view-switching area of the second game interface shown in (a) of fig. 4, the viewing angle of the virtual scene of the currently running game presented by the game interface is moved from the initial viewing angle to the target viewing angle, yielding the third game interface with the target viewing angle shown in (b) of fig. 4.
Because the drag operation in the view-switching area of the second game interface moves the viewing angle from the initial viewing angle to the target viewing angle, the viewing-angle pointing position of the crosshair changes accordingly, which drives the crosshair toward the specific position to be interacted with.
204. Take the viewing-angle pointing position of the crosshair in the third game interface as the target interaction position, and execute a target instruction.
A target instruction is an instruction executed at the target interaction position, including, but not limited to: marking the target interaction position, attacking toward it, retreating to it, or going to it.
For example, a drag operation by the player in the view-switching area of the second game interface may be received; when the drag operation ends and the finger is released, the virtual-scene viewing angle presented by the game interface of the currently running game is switched from the initial viewing angle to the target viewing angle, yielding the third game interface with the target viewing angle.
At this point, the viewing-angle pointing position of the crosshair in the third game interface (that is, the viewing-angle position at the preset screen position in the third game interface) may be taken as the target interaction position.
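Because the crosshair stays at the preset screen position, its viewing-angle pointing position can be derived from the camera pose alone. A minimal sketch, assuming a centered crosshair, a yaw/pitch camera, and a flat ground plane at y = 0; these are illustrative simplifications, since the patent does not prescribe any particular scene model:

```python
import math

def crosshair_target_on_ground(cam_pos, yaw_deg, pitch_deg):
    """Return the world point the centered crosshair points at, as the
    intersection of the camera's forward ray with the ground plane y = 0.
    Returns None when the ray points at or above the horizon.
    """
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    # Forward direction for yaw about the +Y axis, pitch above the horizon.
    dx = math.cos(pitch) * math.sin(yaw)
    dy = math.sin(pitch)
    dz = math.cos(pitch) * math.cos(yaw)
    if dy >= 0:
        return None  # no intersection with the ground plane
    t = -cam_pos[1] / dy
    return (cam_pos[0] + t * dx, 0.0, cam_pos[2] + t * dz)
```

With this simplification, no per-frame detection of the crosshair's screen position is needed, which matches the speed argument made in the summary of this section.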
Furthermore, to improve operating convenience and the marking accuracy of the target interaction position, the player may also move the crosshair manually in the game interface so as to aim at the target position quickly and precisely. That is, the target position interaction method may further include: in response to a movement operation on the crosshair in the third game interface, determining the screen position of the crosshair after the movement; and obtaining the viewing-angle position of the moved screen position in the third game interface as the viewing-angle pointing position of the crosshair in the third game interface. Finally, the viewing-angle pointing position of the moved crosshair in the third game interface is taken as the target interaction position for the subsequent execution of the target instruction.
In some embodiments, the player's interaction objective (i.e., the target instruction) for the selected target position is to mark it. In this case, the target interaction position may be marked once the crosshair is aimed at it. That is, step 204 may specifically include: when the drag operation terminates, taking the viewing-angle pointing position of the crosshair in the third game interface as the target interaction position; and marking the target interaction position. For example, as shown in fig. 3, the player taps twice in quick succession in the view-switching area to trigger the crosshair display, keeps the finger pressed after the second tap, and drags to move the viewing-angle pointing position of the crosshair; when the finger is released, the viewing-angle pointing position of the crosshair is taken as the target interaction position and marked, for instance with a simple mark at that position in the game map of the currently running game.
In some embodiments, the player's interaction objective (target instruction) for the selected target position is to execute a specific instruction there (such as attack, retreat, or automatically go to). In this case, when the continuous touch operation in the view-switching area of the first game interface is detected, the crosshair is displayed and game operation controls for the specific instructions are additionally displayed; once the crosshair is aimed at the target interaction position, the player taps the game operation control of the desired instruction and releases the finger, and the specific instruction is executed at the target interaction position.
That is, step 202 may specifically include: in response to the continuous touch operation in the view-switching area of the first game interface, additionally displaying the crosshair on the first game interface at the preset screen position and additionally displaying at least one game operation control on the first game interface, to obtain the second game interface with the initial viewing angle.
Step 204 may specifically include: taking the viewing-angle pointing position of the crosshair in the third game interface as the target interaction position; and, in response to a touch operation on a target game operation control among the at least one game operation control of the third game interface, executing the operation instruction corresponding to the target game operation control at the target interaction position. The target instruction is the operation instruction corresponding to the target game operation control.
Illustratively, as shown in fig. 3 (b) and fig. 4, in step 202, after a continuous touch operation by the player in the view-switching area of the first game interface is received, the crosshair is additionally displayed on the first game interface at the preset screen position (e.g., at the center of the terminal's touch display screen), and at least one game operation control is additionally displayed on the first game interface (e.g., three game operation controls, "attack", "retreat", and "go to", on the left side of the touch display screen). In step 203, after the player's drag operation in the view-switching area of the second game interface is received, the viewing angle of the second game interface and the viewing-angle pointing position of the crosshair are switched, yielding the third game interface with the target viewing angle. In step 204, the viewing-angle pointing position of the crosshair in the third game interface is taken as the target interaction position; after a touch operation by the player on a target game operation control (such as "attack", "retreat", or "go to") among the at least one game operation control of the third game interface is received, the operation instruction corresponding to that control is executed at the target interaction position.
In the prior art, by contrast, the signal key for marking a position is limited to the current viewing angle: to mark a target position outside the current viewing-angle scene, the player must first move the scene viewing angle so that the target position falls within it, and only then drag the signal key onto the target position to mark it; the operation is cumbersome and inconvenient. In the present scheme, in response to the continuous touch operation in the view-switching area of the first game interface, the crosshair is additionally displayed at the preset screen position and at least one game operation control is additionally displayed, yielding the second game interface with the initial viewing angle; the viewing-angle pointing position of the crosshair in the third game interface is then taken as the target interaction position, and, in response to the touch operation on a target game operation control among the at least one game operation control of the third game interface, the corresponding operation instruction is executed at the target interaction position. The target position can thus be marked and a specific operation selected while the viewing angle of the game interface is being switched, avoiding the cumbersome process of first moving the scene viewing angle and then dragging the signal key, and improving the convenience and efficiency of human-computer interaction during the game.
For example, when the operation instruction corresponding to the target game operation control is an attack, retreat, or go-to instruction, after a touch operation by the player on a target game operation control (such as "attack", "retreat", or "go to") among the at least one game operation control of the third game interface is received, the corresponding operation instruction is executed at the target interaction position to control the target virtual character of the currently running game to attack toward, retreat to, or go from its current position to the target interaction position.
Further, taking the case where the operation instruction corresponding to the target game operation control is a go-to instruction, in order to speed up the target virtual character's travel to the target interaction position, a target route from the character's current position to the target interaction position may be generated automatically and displayed in the game map of the currently running game. As shown in fig. 5, the step "in response to the touch operation on the target game operation control in the at least one game operation control of the third game interface, executing the operation instruction corresponding to the target game operation control at the target interaction position" may specifically include the following steps 501 to 503:
501. In response to the touch operation on the target game operation control among the at least one game operation control of the third game interface, determining a target route from the current position of the target virtual character of the currently running game to the target interaction position.
A virtual object is a movable object in the virtual scene. Movable objects include, but are not limited to: virtual characters, virtual animals, and cartoon characters.
The target virtual character may specifically be the virtual object controlled by the player in the currently running game.
The target route is a route from the current position of the target virtual character to the target interaction position; it may be, for example, the shortest route or the safest route between the two.
502. Displaying the target route in the game map of the third game interface.
503. Controlling the target virtual character to move from its current position to the target interaction position along the target route.
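Step 501 leaves the route computation open ("the shortest route, the safest route, etc."). One plausible realization on a walkable grid is a breadth-first search, which yields a shortest route; the grid representation (0 = walkable, 1 = blocked) is an assumption for illustration:

```python
from collections import deque

def shortest_route(grid, start, goal):
    """BFS shortest route on a grid of 0 (walkable) / 1 (blocked) cells.

    Returns the route as a list of (row, col) cells from start to goal,
    or None when the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # also serves as the visited set
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:
            route = []
            while cur is not None:  # walk predecessors back to start
                route.append(cur)
                cur = prev[cur]
            return route[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and nxt not in prev:
                prev[nxt] = cur
                queue.append(nxt)
    return None
```

The returned cell list could then back steps 502 and 503: it is drawn into the game map and fed to the character's movement controller.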
By switching the viewing angle of the second game interface and the viewing-angle pointing position of the crosshair through the drag operation in the view-switching area of the second game interface, the viewing-angle pointing position of the crosshair in the third game interface can be adopted as the target interaction position once the crosshair reaches the target position; a route from the current position of the target virtual character to the target interaction position is then generated automatically and displayed in the game map of the third game interface, so that the player can control the target virtual character to travel to the target interaction position accurately, quickly, and conveniently, improving the accuracy and convenience of human-computer interaction during the game.
Further, in order to let other players (for example, the players controlling teammate characters of the target virtual character) see the target interaction position and the target instruction, and thereby facilitate cooperation among teammates, the target interaction position and the identification information of the operation instruction corresponding to the target game operation control may be highlighted in the game map of the currently running game. That is, the target position interaction method may further include: obtaining the identification information of the operation instruction corresponding to the target game operation control; and highlighting the target interaction position and the identification information in the game map of the currently running game. For example, the game map of the currently running game is displayed on the game interface of a player controlling a teammate character of the target virtual character, and the target interaction position and the identification information are highlighted on that map.
Further, in order to support service scenarios such as cancelling a mis-operation or suspending the target position interaction, a cancel operation control (for example, the "cancel" text control shown in fig. 3 (b)) may be additionally displayed at the same time as the crosshair is triggered on the game interface in step 202; after the crosshair for aiming at the target position is displayed in step 202, the current aiming operation can be suspended or cancelled when a touch operation on the cancel operation control of the game interface is received. That is, step 202 may specifically include: in response to the continuous touch operation in the view-switching area of the first game interface, additionally displaying the crosshair on the first game interface at the preset screen position and additionally displaying a cancel operation control on the first game interface, to obtain the second game interface with the initial viewing angle.
Correspondingly, the target position interaction method may further include: in response to a touch operation on the cancel operation control in the target game interface, cancelling the display of the crosshair in the target game interface, where the target game interface is the second game interface or the third game interface.
For example, after a continuous touch operation by the player in the view-switching area of the first game interface is received, the crosshair and a cancel operation control are additionally displayed on the first game interface at the preset screen position, yielding the second game interface with the initial viewing angle. If a touch operation on the cancel operation control of the second game interface is then received, the operation is cancelled: the display of the crosshair and the cancel operation control in the second game interface is removed, and the game interface of the currently running game is restored to the first game interface. The player's drag operation in the view-switching area of the second game interface is no longer received, and the viewing angle of the second game interface and the viewing-angle pointing position of the crosshair are not switched.
For another example, after the second game interface with the initial viewing angle is obtained as above, the player's drag operation in the view-switching area of the second game interface is received and the viewing angle of the second game interface and the viewing-angle pointing position of the crosshair are switched, yielding the third game interface with the target viewing angle. If a touch operation on the cancel operation control of the third game interface is then received, the operation may be cancelled: the display of the crosshair and the cancel operation control in the third game interface is removed, and the game interface of the currently running game is restored to a fourth game interface with the target viewing angle. The target instruction is not executed with the viewing-angle pointing position of the crosshair in the third game interface as the target interaction position.
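The cancel flow described in the two examples above can be condensed into a small state sketch; class and method names are illustrative. Hiding the crosshair while keeping whatever viewing angle is current reproduces both cases: cancelling before any drag restores the first game interface, and cancelling after a drag yields the fourth game interface with the target viewing angle:

```python
class AimingSession:
    """Illustrative state sketch of the crosshair / cancel-control flow."""

    def __init__(self, view):
        self.view = view              # current viewing angle label
        self.crosshair_visible = False

    def begin_aiming(self):
        # Continuous touch in the view-switching area: show the
        # crosshair and the cancel control (second game interface).
        self.crosshair_visible = True

    def drag_to(self, target_view):
        # Drag operation: switch to the target viewing angle
        # (third game interface).
        self.view = target_view

    def cancel(self):
        # Tap the cancel control: hide the crosshair and the cancel
        # control; the current viewing angle is kept as-is.
        self.crosshair_visible = False
```

A real implementation would also tear down the additionally displayed game operation controls here; the sketch tracks only the crosshair and the viewing angle.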
Furthermore, in order to improve the operating convenience of human-computer interaction, the view-switching area and the at least one game operation control additionally displayed on the game interface may be placed near the left-hand and right-hand sides of the game interface where the user holds the terminal's touch display screen, making it easier for the user to touch the view-switching area and the newly added game operation controls at the same time. For example, the view-switching area may be placed on the left side of the game interface and the newly added game operation controls on the right side, so that the user operates the view-switching area with the left hand and the controls with the right hand. Conversely, the view-switching area may be placed on the right side and the newly added game operation controls on the left side, so that the user operates the view-switching area with the right hand and the controls with the left hand.
As can be seen from the above, in this embodiment, in the first aspect, by providing a view-switching area in the game interface of the currently running game for switching the viewing angle and using the crosshair to aim at the target interaction position, the viewing angle can be switched with a drag of only small amplitude within the view-switching area. This avoids having to drag the position-marking signal key all the way to the target position to be marked, which, when the on-screen distance between the signal key and the target position is large, requires a large-amplitude drag and may even force the finger far outside its original touch range; the convenience of the target position interaction is thereby improved. In the second aspect, by responding to the drag operation in the view-switching area of the second game interface, the viewing angle of the second game interface and the viewing-angle pointing position of the crosshair are switched, so that the second game interface with the initial viewing angle becomes the third game interface with the target viewing angle while the crosshair remains displayed at the preset screen position; switching the game interface from the initial to the target viewing angle drives the viewing-angle pointing position of the crosshair to change and moves the crosshair to the specific position where interaction is needed, so that the target position can be located accurately.
In the third aspect, since the crosshair remains displayed on the third game interface at the preset screen position, its screen position is unchanged; the crosshair's screen position under the target viewing angle therefore need not be re-detected, the viewing-angle pointing position of the crosshair can be determined quickly, and the marking speed of the target position is increased. The marking accuracy and the marking speed of the target position can thus be improved while preserving convenience of operation.
In order to better implement the above method, an embodiment of the present application further provides a target position interaction apparatus, which may be integrated in an electronic device, for example a computer device; the computer device may be a terminal, a server, or the like.
The terminal may be a mobile phone, a tablet computer, a smart Bluetooth device, a notebook computer, a personal computer, or another such device; the server may be a single server or a server cluster composed of a plurality of servers.
In this embodiment, the method of the embodiment of the present application is described in detail by taking as an example a target position interaction apparatus integrated in a smartphone.
For example, as shown in fig. 6, the target position interaction apparatus may include:
a first display unit 601, configured to display a first game interface of a currently running game with an initial viewing angle, where the first game interface includes a view-switching area;
a second display unit 602, configured to, in response to a continuous touch operation in the view-switching area of the first game interface, additionally display a crosshair on the first game interface at a preset screen position, to obtain a second game interface with the initial viewing angle;
a switching unit 603, configured to, in response to a drag operation in the view-switching area of the second game interface, switch the viewing angle of the second game interface and the viewing-angle pointing position of the crosshair, to obtain a third game interface with a target viewing angle, where the crosshair remains displayed on the third game interface at the preset screen position and its viewing-angle pointing position is switched from the viewing-angle position at the preset screen position in the second game interface to the viewing-angle position at the preset screen position in the third game interface;
and an interaction unit 604, configured to take the viewing-angle pointing position of the crosshair in the third game interface as a target interaction position and execute a target instruction.
In some embodiments, the third game interface further comprises at least one game operation control; the second display unit 602 is specifically configured to:
in response to a continuous touch operation in the view-switching area of the first game interface, additionally display a crosshair on the first game interface at a preset screen position and additionally display at least one game operation control on the first game interface, to obtain a second game interface with the initial viewing angle;
in some embodiments, the interaction unit 604 is specifically configured to:
take the viewing-angle pointing position of the crosshair in the third game interface as a target interaction position;
and, in response to a touch operation on a target game operation control among the at least one game operation control of the third game interface, execute the operation instruction corresponding to the target game operation control at the target interaction position, where the target instruction is the operation instruction corresponding to the target game operation control.
In some embodiments, the operation instruction corresponding to the target game operation control is a go-to instruction, and the third game interface further includes a game map of the currently running game; the interaction unit 604 is specifically configured to:
in response to the touch operation on the target game operation control among the at least one game operation control of the third game interface, determine a target route from the current position of the target virtual character of the currently running game to the target interaction position;
display the target route in the game map of the third game interface;
and control the target virtual character to move from its current position to the target interaction position along the target route.
In some embodiments, the operation instruction corresponding to the target game operation control is an attack instruction, a retreat instruction, or a go-to instruction, and the interaction unit 604 is specifically configured to:
in response to the touch operation on the target game operation control among the at least one game operation control, control the target virtual character of the currently running game to attack toward, retreat to, or go from its current position to the target interaction position.
In some embodiments, the target position interaction apparatus further includes a third display unit (not shown in the figure), where the third display unit is specifically configured to:
obtain identification information of the operation instruction corresponding to the target game operation control;
and highlight the target interaction position and the identification information in the game map of the currently running game.
In some embodiments, before the continuous touch operation in the view-switching area of the first game interface is responded to and the crosshair is additionally displayed at the preset screen position to obtain the second game interface with the initial viewing angle, the first display unit 601 is specifically configured to:
receive a first click operation in the view-switching area of the first game interface;
receive a second click operation in the view-switching area of the first game interface;
and, if the interval between the time of the first click operation and the time of the second click operation is smaller than a preset time interval, determine that a continuous touch operation in the view-switching area of the first game interface exists.
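The double-click test described above reduces to a time-interval comparison. A minimal sketch, with a 0.3 s preset time interval chosen purely for illustration (the text only requires "smaller than a preset time interval"):

```python
def is_continuous_touch(t_first, t_second, preset_interval=0.3):
    """Return True when two clicks in the view-switching area are close
    enough in time to count as one continuous touch operation.

    t_first and t_second are timestamps in seconds, with
    t_second >= t_first; preset_interval is an illustrative value.
    """
    return 0 <= t_second - t_first < preset_interval
```

On detection, the second display unit would then trigger the additional display of the crosshair at the preset screen position.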
In some embodiments, the switching unit 603 is specifically configured to:
in response to a movement operation on the crosshair in the third game interface, determine the screen position of the crosshair after the movement;
and obtain the viewing-angle position of the moved screen position in the third game interface as the viewing-angle pointing position of the crosshair in the third game interface.
In some embodiments, the second display unit 602 is specifically configured to:
in response to a continuous touch operation in the view-switching area of the first game interface, additionally display a crosshair on the first game interface at a preset screen position and additionally display a cancel operation control on the first game interface, to obtain a second game interface with the initial viewing angle;
in some embodiments, the target location interaction apparatus further includes a cancellation unit (not shown in the figure), where the cancellation unit is specifically configured to:
and when the touch operation of the cancel operation control in the target game interface is responded, canceling the display of the sight in the target game interface, wherein the target game interface is the second game interface or the third game interface.
In some embodiments, the interaction unit 604 is specifically configured to:
when the drag operation terminates, take the viewing-angle pointing position of the crosshair in the third game interface as a target interaction position;
and mark the target interaction position.
As can be seen from the above, the target position interaction apparatus of this embodiment may display, via the first display unit 601, a first game interface of the currently running game with an initial viewing angle, the first game interface including a view-switching area; respond, via the second display unit 602, to a continuous touch operation in the view-switching area of the first game interface by additionally displaying a crosshair at a preset screen position, obtaining a second game interface with the initial viewing angle; switch, via the switching unit 603 and in response to a drag operation in the view-switching area of the second game interface, the viewing angle of the second game interface and the viewing-angle pointing position of the crosshair, obtaining a third game interface with a target viewing angle, where the crosshair remains displayed at the preset screen position and its viewing-angle pointing position is switched from the viewing-angle position at the preset screen position in the second game interface to the viewing-angle position at the preset screen position in the third game interface; and, via the interaction unit 604, take the viewing-angle pointing position of the crosshair in the third game interface as a target interaction position and execute a target instruction.
Accordingly, the target position interaction apparatus provided in the embodiment of the present application can bring the following technical effects. In the first aspect, by providing a view-switching area in the game interface of the currently running game for switching the viewing angle and using the crosshair to aim at the target interaction position, the viewing angle can be switched with a drag of only small amplitude within the view-switching area; this avoids having to drag the position-marking signal key all the way to the target position to be marked, which, when the on-screen distance between the signal key and the target position is large, requires a large-amplitude drag and may even force the finger far outside its original touch range, and thereby improves the convenience of the target position interaction. In the second aspect, by responding to the drag operation in the view-switching area of the second game interface, the viewing angle of the second game interface and the viewing-angle pointing position of the crosshair are switched, so that the second game interface with the initial viewing angle becomes the third game interface with the target viewing angle while the crosshair remains displayed at the preset screen position; this drives the viewing-angle pointing position of the crosshair to change and moves the crosshair to the specific position where interaction is needed, so that the target position can be located accurately.
In the third aspect, since the crosshair remains displayed on the third game interface at the preset screen position, its screen position is unchanged; the crosshair's screen position under the target viewing angle therefore need not be re-detected, the viewing-angle pointing position of the crosshair can be determined quickly, and the marking speed of the target position is increased. The marking accuracy and the marking speed of the target position can thus be improved while preserving convenience of operation.
Correspondingly, an embodiment of the present application further provides an electronic device. The electronic device may be a terminal such as a smart phone, a tablet computer, a notebook computer, a touch-screen device, a game console, a Personal Computer (PC), or a Personal Digital Assistant (PDA). As shown in fig. 7, fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device 700 includes a processor 701 having one or more processing cores, a memory 702 having one or more computer-readable storage media, and a computer program stored on the memory 702 and executable on the processor. The processor 701 is electrically connected to the memory 702. Those skilled in the art will appreciate that the electronic device structure shown in the figure does not constitute a limitation on the electronic device, which may include more or fewer components than shown, combine some components, or arrange the components differently.
The processor 701 is a control center of the electronic device 700, connects various parts of the entire electronic device 700 using various interfaces and lines, and performs various functions of the electronic device 700 and processes data by running or loading software programs and/or modules stored in the memory 702 and calling data stored in the memory 702, thereby performing overall monitoring of the electronic device 700.
In this embodiment, the processor 701 in the electronic device 700 loads instructions corresponding to processes of one or more application programs into the memory 702 according to the following steps, and the processor 701 executes the application programs stored in the memory 702, thereby implementing various functions:
displaying a first game interface with an initial view angle of a current running game, wherein the first game interface comprises a view angle switching area;
responding to continuous touch operation in a visual angle switching area of the first game interface, and additionally displaying a front sight on the first game interface according to a preset screen position to obtain a second game interface with the initial visual angle;
responding to dragging operation in a visual angle switching area of the second game interface, switching a visual angle of the second game interface and a visual angle pointing position of the sight bead to obtain a third game interface with a target visual angle, wherein the sight bead is kept to be displayed on the third game interface according to the preset screen position, and the visual angle pointing position of the sight bead is switched from the visual angle position at the preset screen position in the second game interface to the visual angle position at the preset screen position in the third game interface;
and taking the visual angle pointing position of the sight bead in the third game interface as a target interaction position, and executing a target instruction.
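The four steps above can be sketched roughly as follows. This is an illustrative Python model only — the class name, the normalised screen coordinates, and the drag sensitivity value are all assumptions, not taken from the application; its point is that the sight stays at a fixed preset screen position while the drag changes only the camera's view angles, and the target interaction position is wherever the view points when the drag ends:

```python
from dataclasses import dataclass


@dataclass
class Camera:
    yaw: float = 0.0    # horizontal view angle, degrees
    pitch: float = 0.0  # vertical view angle, degrees


class TargetInteraction:
    """Sketch of the four-step flow: the front sight is displayed at a
    preset screen position; a drag in the switching area rotates the
    camera while the sight's screen position never changes; the view
    direction at drag end is the target interaction position."""

    PRESET_SCREEN_POS = (0.5, 0.5)  # screen centre, normalised (assumed)
    DRAG_SENSITIVITY = 0.25         # degrees per pixel dragged (assumed)

    def __init__(self):
        self.camera = Camera()
        self.sight_visible = False

    def on_continuous_touch(self):
        # Step 2: additionally display the front sight at the preset position.
        self.sight_visible = True

    def on_drag(self, dx_px, dy_px):
        # Step 3: the drag switches the visual angle; only the camera
        # moves, the sight keeps its preset screen position.
        self.camera.yaw += dx_px * self.DRAG_SENSITIVITY
        self.camera.pitch -= dy_px * self.DRAG_SENSITIVITY

    def on_drag_end(self):
        # Step 4: take the current view direction as the target
        # interaction position (reduced here to the camera angles).
        return (self.camera.yaw, self.camera.pitch)
```

Note how a drag of a few tens of pixels inside the switching area is enough to retarget the view, which is the convenience gain the description claims over dragging a marker key across the whole screen.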
In some embodiments, the third game interface further comprises at least one game operation control;
the responding to the continuous touch operation in the visual angle switching area of the first game interface, and additionally displaying a front sight on the first game interface according to a preset screen position to obtain a second game interface with the initial visual angle, includes:
responding to continuous touch operation in a visual angle switching area of the first game interface, additionally displaying a front sight on the first game interface according to a preset screen position, and additionally displaying at least one game operation control on the first game interface to obtain a second game interface with the initial visual angle;
taking the visual angle pointing position of the front sight in the third game interface as a target interaction position, and executing a target instruction, wherein the method comprises the following steps:
taking the visual angle pointing position of the sight bead in the third game interface as a target interaction position;
responding to touch operation of a target game operation control in at least one game operation control of the third game interface, and executing an operation instruction corresponding to the target game operation control at the target interaction position, wherein the target instruction is an operation instruction corresponding to the target game operation control.
In some embodiments, the operation instruction corresponding to the target game operation control is a go-to instruction, and the third game interface further includes a game map of the currently running game;
the executing, in response to the touch operation on the target game operation control in the at least one game operation control of the third game interface, the operation instruction corresponding to the target game operation control at the target interaction position includes:
responding to touch operation of a target game operation control in at least one game operation control of the third game interface, and determining a target line from the current position of the target virtual character of the currently running game to the target interaction position;
displaying the target route in a game map of the third game interface;
and controlling the target virtual role to move to the target interaction position from the current position according to the target line.
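The go-to behaviour above — determine a target route, display it on the map, then move the character along it — might be sketched as below. The `go_to` helper is hypothetical, and route planning is reduced to straight-line interpolation; a real game would search the scene, for example with A* over a navigation mesh:

```python
import math


def go_to(character_pos, target_pos, step=1.0):
    """Plan a route from the character's current position to the target
    interaction position and return its waypoints. Straight-line
    interpolation stands in for real pathfinding here."""
    route = [character_pos]
    x, y = character_pos
    tx, ty = target_pos
    dist = math.hypot(tx - x, ty - y)
    steps = max(1, int(dist // step))
    for i in range(1, steps + 1):
        t = i / steps
        route.append((x + (tx - x) * t, y + (ty - y) * t))
    if route[-1] != target_pos:
        route.append(target_pos)  # ensure the route ends exactly at the target
    return route
```

The returned waypoint list is what would be drawn onto the game map and then consumed by the character's movement controller.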
In some embodiments, the operation instruction corresponding to the target game operation control is an attack instruction, a quit instruction, or a go-to instruction, and the executing, in response to the touch operation on the target game operation control in the at least one game operation control of the third game interface, the operation instruction corresponding to the target game operation control at the target interaction position includes:
and responding to the touch operation of a target game operation control in the at least one game operation control, and controlling the target virtual character of the currently running game to attack, withdraw or go to the target interaction position from the current position.
In some embodiments, the method further comprises:
acquiring identification information of an operation instruction corresponding to the target game operation control;
and highlighting the target interaction position and the identification information in a game map of the current running game.
In some embodiments, before the step of, in response to the continuous touch operation in the view angle switching area of the first game interface, additionally displaying a front sight on the first game interface according to a preset screen position to obtain a second game interface with the initial view angle, the method further includes:
receiving a first click operation in a visual angle switching area of the first game interface;
receiving a second click operation in a visual angle switching area of the first game interface;
and if the interval between the time of the first click operation and the time of the second click operation is smaller than a preset time interval, determining that the continuous touch operation of the visual angle switching area of the first game interface exists.
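The double-click check above can be sketched as follows; the class name and the 0.3-second preset time interval are assumptions for illustration only:

```python
class ContinuousTouchDetector:
    """Treats two clicks in the view-switching area as one continuous
    touch operation when the second arrives within a preset interval
    of the first (i.e. a double-tap)."""

    def __init__(self, preset_interval=0.3):  # seconds; assumed value
        self.preset_interval = preset_interval
        self.last_click_time = None

    def on_click(self, t):
        # Compare this click's time against the previous one; a gap
        # smaller than the preset interval counts as continuous touch.
        is_continuous = (
            self.last_click_time is not None
            and (t - self.last_click_time) < self.preset_interval
        )
        self.last_click_time = t
        return is_continuous
```

Only when `on_click` returns true would the front sight be additionally displayed, which keeps an ordinary single tap in the switching area from triggering the marking flow.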
In some embodiments, the method further comprises:
responding to the movement operation of the front sight in the third game interface, and determining the screen position of the front sight after movement;
and acquiring the visual angle position of the screen position after the movement in the third game interface as the visual angle pointing position of the sight bead in the third game interface.
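The mapping from the moved screen position of the front sight to its visual angle pointing position might look like this simplified sketch: the camera angles are offset by the sight's distance from the screen centre, scaled by an assumed field of view. The function name and FOV values are hypothetical, and a real engine would instead unproject through the camera's projection matrix:

```python
def screen_to_view_direction(screen_x, screen_y, cam_yaw, cam_pitch,
                             fov_h=90.0, fov_v=60.0):
    """Map a normalised screen position (0..1 in each axis) of the
    front sight to the view angle it points at, relative to the
    camera's current yaw/pitch. fov_h and fov_v are assumed
    horizontal and vertical fields of view in degrees."""
    yaw = cam_yaw + (screen_x - 0.5) * fov_h
    pitch = cam_pitch - (screen_y - 0.5) * fov_v
    return yaw, pitch
```

When the sight sits at the preset centre position, the result is just the camera direction itself, which is why, in the unmoved case, no extra detection of the sight's screen position is needed.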
In some embodiments, the responding to the continuous touch operation in the view angle switching area of the first game interface, and additionally displaying a front sight on the first game interface according to a preset screen position to obtain a second game interface with the initial view angle, includes:
responding to continuous touch operation in a visual angle switching area of the first game interface, additionally displaying a front sight on the first game interface according to a preset screen position, and additionally displaying a cancel operation control on the first game interface to obtain a second game interface with the initial visual angle;
the method further comprises the following steps:
and in response to a touch operation on the cancel operation control in a target game interface, canceling the display of the front sight in the target game interface, wherein the target game interface is the second game interface or the third game interface.
In some embodiments, the executing the target instruction by using the perspective pointing position of the sight in the third game interface as the target interaction position includes:
when the dragging operation is terminated, taking the visual angle pointing position of the sight bead in the third game interface as a target interaction position;
and marking the target interaction position.
Therefore, the electronic device 700 provided by this embodiment can bring the following technical effects. In the first aspect, a visual angle switching area for switching the visual angle of the game interface is arranged in the game interface of the currently running game, and the switching area cooperates with the front sight to aim at the target interaction position, so that the visual angle of the game interface can be switched with a drag of only a small amplitude within the visual angle switching area. This avoids having to drag a position-marking signal key all the way to the position to be marked; when that position is far away in the scene, such a drag is large in amplitude and may even take the finger far outside its original touch range. The convenience of the interactive operation on the target position is therefore improved to a certain extent. In the second aspect, in response to the dragging operation in the visual angle switching area of the second game interface, the visual angle of the second game interface and the visual angle pointing position of the front sight are switched, so that the second game interface with the initial visual angle is switched to the third game interface with the target visual angle, while the front sight remains displayed on the third game interface according to the preset screen position. Switching the game interface from the initial visual angle to the target visual angle thus drives the visual angle pointing position of the front sight to change, moving the front sight to the specific position where interaction is required, so that the target position can be located accurately.
In the third aspect, because the front sight remains displayed on the third game interface according to the preset screen position, its screen position is unchanged; the screen position of the front sight under the target visual angle does not need to be detected, so the visual angle direction of the front sight can be determined quickly, and the marking speed of the target position is increased. Accordingly, the marking accuracy and the marking speed of the target position can both be improved while the operation remains convenient.
For specific implementations of the above operations, reference may be made to the foregoing embodiments; details are not described herein again.
Optionally, as shown in fig. 7, the electronic device 700 further includes: a touch display screen 703, a radio frequency circuit 704, an audio circuit 705, an input unit 706, and a power supply 707. The processor 701 is electrically connected to the touch display screen 703, the radio frequency circuit 704, the audio circuit 705, the input unit 706, and the power source 707. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 7 does not constitute a limitation of the electronic device and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components.
The touch display screen 703 may be used to display a graphical user interface and to receive operation instructions generated by a user acting on the graphical user interface. The touch display screen 703 may include a display panel and a touch panel. The display panel may be used to display information input by, or provided to, the user, as well as various graphical user interfaces of the electronic device, which may be composed of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The touch panel may be used to collect touch operations of the user on or near it (for example, operations performed by the user with a finger, a stylus, or any other suitable object or accessory) and to generate corresponding operation instructions, according to which corresponding programs are executed. Optionally, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 701; it also receives and executes commands sent by the processor 701. The touch panel may cover the display panel; when the touch panel detects a touch operation on or near it, it transmits the operation to the processor 701 to determine the type of the touch event, and the processor 701 then provides a corresponding visual output on the display panel according to the type of the touch event.
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 703 to implement the input and output functions. However, in some embodiments, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions, respectively. That is, the touch display screen 703 may also serve as a part of the input unit 706 to implement the input function.
The radio frequency circuit 704 may be used to transmit and receive radio frequency signals, so as to establish wireless communication with a network device or another electronic device and to exchange signals with the network device or the other electronic device.
The audio circuit 705 may be used to provide an audio interface between the user and the electronic device through a speaker and a microphone. On one hand, the audio circuit 705 may transmit the electrical signal converted from received audio data to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 705 and converted into audio data. The audio data is then processed by the processor 701 and sent to another electronic device via the radio frequency circuit 704, or output to the memory 702 for further processing. The audio circuit 705 may also include an earphone jack to provide communication between peripheral earphones and the electronic device.
The input unit 706 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 707 is used to supply power to the various components of the electronic device 700. Optionally, the power supply 707 may be logically connected to the processor 701 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system. The power supply 707 may further include one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other such components.
Although not shown in fig. 7, the electronic device 700 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described in detail herein.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, which may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, the present application provides a computer-readable storage medium, in which a plurality of computer programs are stored, where the computer programs can be loaded by a processor to execute the steps in any one of the target location interaction methods provided by the embodiments of the present application. For example, the computer program may perform the steps of:
displaying a first game interface with an initial view angle of a current running game, wherein the first game interface comprises a view angle switching area;
responding to continuous touch operation in a visual angle switching area of the first game interface, and additionally displaying a front sight on the first game interface according to a preset screen position to obtain a second game interface with the initial visual angle;
responding to dragging operation in a view angle switching area of the second game interface, switching a view angle of the second game interface and a view angle pointing position of the sight bead to obtain a third game interface with a target view angle, wherein the sight bead is kept to be displayed on the third game interface according to the preset screen position, and the view angle pointing position of the sight bead is switched from the view angle position at the preset screen position in the second game interface to the view angle position at the preset screen position in the third game interface;
and taking the visual angle pointing position of the sight bead in the third game interface as a target interaction position, and executing a target instruction.
In some embodiments, the third game interface further comprises at least one game operation control;
the responding to the continuous touch operation in the visual angle switching area of the first game interface, and additionally displaying a front sight on the first game interface according to a preset screen position to obtain a second game interface with the initial visual angle, includes:
responding to continuous touch operation in a visual angle switching area of the first game interface, additionally displaying a front sight on the first game interface according to a preset screen position, and additionally displaying at least one game operation control on the first game interface to obtain a second game interface with the initial visual angle;
the step of executing a target instruction by taking the view pointing position of the sight bead in the third game interface as a target interaction position includes:
taking the visual angle pointing position of the sight bead in the third game interface as a target interaction position;
responding to touch operation on a target game operation control in at least one game operation control of the third game interface, and executing an operation instruction corresponding to the target game operation control at the target interaction position, wherein the target instruction is an operation instruction corresponding to the target game operation control.
In some embodiments, the operation instruction corresponding to the target game operation control is a go-to instruction, and the third game interface further includes a game map of the currently running game;
the executing, in response to the touch operation on the target game operation control in the at least one game operation control of the third game interface, the operation instruction corresponding to the target game operation control at the target interaction position includes:
responding to the touch operation of a target game operation control in at least one game operation control of the third game interface, and determining a target line from the current position of the target virtual character of the current running game to the target interaction position;
displaying the target route in a game map of the third game interface;
and controlling the target virtual role to move to the target interaction position from the current position according to the target line.
In some embodiments, the operation instruction corresponding to the target game operation control is an attack instruction, a quit instruction, or a go-to instruction, and the executing, in response to the touch operation on the target game operation control in the at least one game operation control of the third game interface, the operation instruction corresponding to the target game operation control at the target interaction position includes:
and responding to the touch operation of a target game operation control in the at least one game operation control, and controlling the target virtual character of the currently running game to attack, withdraw or go to the target interaction position from the current position.
In some embodiments, the method further comprises:
acquiring identification information of an operation instruction corresponding to the target game operation control;
and highlighting the target interaction position and the identification information in a game map of the current running game.
In some embodiments, before the step of, in response to the continuous touch operation in the view angle switching area of the first game interface, additionally displaying a front sight on the first game interface according to a preset screen position to obtain a second game interface with the initial view angle, the method further includes:
receiving a first click operation in a visual angle switching area of the first game interface;
receiving a second click operation in a visual angle switching area of the first game interface;
and if the interval between the time of the first click operation and the time of the second click operation is smaller than a preset time interval, determining that the continuous touch operation of the visual angle switching area of the first game interface exists.
In some embodiments, the method further comprises:
responding to the movement operation of the front sight in the third game interface, and determining the screen position of the front sight after movement;
and acquiring the visual angle position of the screen position after the movement in the third game interface as the visual angle pointing position of the sight bead in the third game interface.
In some embodiments, the responding to the continuous touch operation in the view angle switching area of the first game interface, and additionally displaying a front sight on the first game interface according to a preset screen position to obtain a second game interface with the initial view angle, includes:
responding to continuous touch operation in a visual angle switching area of the first game interface, additionally displaying a front sight on the first game interface according to a preset screen position, and additionally displaying a cancel operation control on the first game interface to obtain a second game interface with the initial visual angle;
the method further comprises the following steps:
and in response to a touch operation on the cancel operation control in a target game interface, canceling the display of the front sight in the target game interface, wherein the target game interface is the second game interface or the third game interface.
In some embodiments, the taking the perspective pointing position of the front sight in the third game interface as the target interaction position, and executing the target instruction includes:
when the dragging operation is terminated, taking the visual angle pointing position of the sight bead in the third game interface as a target interaction position;
and marking the target interaction position.
It can be seen that the computer program can be loaded by a processor to execute the steps in any target position interaction method provided in the embodiments of the present application, so as to achieve the following technical effects. In the first aspect, a visual angle switching area for switching the visual angle of the game interface is arranged in the game interface of the currently running game, and the switching area cooperates with the front sight to aim at the target interaction position, so that the visual angle of the game interface can be switched with a drag of only a small amplitude within the visual angle switching area. This avoids having to drag a position-marking signal key all the way to the position to be marked; when that position is far away in the scene, such a drag is large in amplitude and may even take the finger far outside its original touch range. The convenience of the interactive operation on the target position is therefore improved to a certain extent.
In the second aspect, in response to the dragging operation in the visual angle switching area of the second game interface, the visual angle of the second game interface and the visual angle pointing position of the front sight are switched, so that the second game interface with the initial visual angle is switched to the third game interface with the target visual angle, while the front sight remains displayed on the third game interface according to the preset screen position. Switching the game interface from the initial visual angle to the target visual angle thus drives the visual angle pointing position of the front sight to change, moving the front sight to the specific position where interaction is required, so that the target position can be located accurately. In the third aspect, because the front sight remains displayed on the third game interface according to the preset screen position, its screen position is unchanged; the screen position of the front sight under the target visual angle does not need to be detected, so the visual angle direction of the front sight can be determined quickly, and the marking speed of the target position is increased. Accordingly, the marking accuracy and the marking speed of the target position can both be improved while the operation remains convenient.
For specific implementations of the above operations, reference may be made to the foregoing embodiments; details are not described herein again.
The computer-readable storage medium may include a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disc, or the like.
Since the computer program stored in the computer-readable storage medium can execute the steps in the method for interacting with any target location provided in the embodiment of the present application, the beneficial effects that can be achieved by the method for interacting with any target location provided in the embodiment of the present application can be achieved, for details, see the foregoing embodiments, and are not described herein again.
The target position interaction method, apparatus, electronic device, and computer-readable storage medium provided by the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method of the present application and its core idea. Meanwhile, those skilled in the art may, according to the idea of the present application, make changes to the specific implementations and the application scope. In summary, the content of this specification should not be construed as limiting the present application.
Claims (12)
1. An interaction method for target positions is characterized by comprising the following steps:
displaying a first game interface with an initial view angle of a currently running game, wherein the first game interface comprises a view angle switching area;
responding to continuous touch operation in a visual angle switching area of the first game interface, and additionally displaying a front sight on the first game interface according to a preset screen position to obtain a second game interface with the initial visual angle;
responding to dragging operation in a view angle switching area of the second game interface, switching a view angle of the second game interface and a view angle pointing position of the sight bead to obtain a third game interface with a target view angle, wherein the sight bead is kept to be displayed on the third game interface according to the preset screen position, and the view angle pointing position of the sight bead is switched from the view angle position at the preset screen position in the second game interface to the view angle position at the preset screen position in the third game interface;
and taking the visual angle pointing position of the sight bead in the third game interface as a target interaction position, and executing a target instruction.
2. The target position interaction method of claim 1, wherein the third game interface further comprises at least one game operation control;
the responding to the continuous touch operation in the visual angle switching area of the first game interface, and additionally displaying a front sight on the first game interface according to a preset screen position to obtain a second game interface with the initial visual angle, includes:
responding to continuous touch operation in a visual angle switching area of the first game interface, additionally displaying a front sight on the first game interface according to a preset screen position, and additionally displaying at least one game operation control on the first game interface to obtain a second game interface with the initial visual angle;
taking the visual angle pointing position of the front sight in the third game interface as a target interaction position, and executing a target instruction, wherein the method comprises the following steps:
taking the visual angle pointing position of the sight bead in the third game interface as a target interaction position;
responding to touch operation of a target game operation control in at least one game operation control of the third game interface, and executing an operation instruction corresponding to the target game operation control at the target interaction position, wherein the target instruction is an operation instruction corresponding to the target game operation control.
3. The target position interaction method according to claim 2, wherein the operation instruction corresponding to the target game operation control is a go-to instruction, and the third game interface further comprises a game map of the currently running game;
the executing, at the target interaction position, the operation instruction corresponding to the target game operation control in response to the touch operation on the target game operation control among the at least one game operation control of the third game interface comprises:
in response to the touch operation on the target game operation control among the at least one game operation control of the third game interface, determining a target route from the current position of a target virtual character of the currently running game to the target interaction position;
displaying the target route in the game map of the third game interface;
and controlling the target virtual character to move from the current position to the target interaction position along the target route.
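The route-determination step can be illustrated with any standard pathfinding algorithm; a breadth-first search over a walkable grid is used below purely as a stand-in (the patent does not specify how the target route is computed).

```python
# Sketch of the route step: find a route from the character's current
# position to the target interaction position. BFS on a 2D grid
# (0 = walkable, 1 = blocked) is an illustrative choice, not the
# patent's method.
from collections import deque

def find_route(grid, start, goal):
    """Return the shortest list of grid cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            route = []
            while cell is not None:   # walk predecessors back to start
                route.append(cell)
                cell = prev[cell]
            return route[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and nxt not in prev:
                prev[nxt] = cell
                queue.append(nxt)
    return None  # target unreachable

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
route = find_route(grid, (0, 0), (2, 0))
```

The returned cell list is what would be drawn in the game map and then followed by the virtual character.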
4. The target position interaction method according to claim 2, wherein the operation instruction corresponding to the target game operation control is an attack instruction, a retreat instruction, or a go-to instruction, and the executing, at the target interaction position, the operation instruction corresponding to the target game operation control in response to the touch operation on the target game operation control among the at least one game operation control of the third game interface comprises:
in response to the touch operation on the target game operation control among the at least one game operation control, controlling a target virtual character of the currently running game to attack, retreat to, or go to the target interaction position from the current position.
5. The target position interaction method according to claim 2, further comprising:
acquiring identification information of the operation instruction corresponding to the target game operation control;
and highlighting the target interaction position and the identification information in a game map of the currently running game.
6. The target position interaction method according to claim 1, wherein before the responding to the continuous touch operation in the viewing-angle switching area of the first game interface and additionally displaying a crosshair on the first game interface according to a preset screen position to obtain a second game interface with the initial viewing angle, the method further comprises:
receiving a first click operation in the viewing-angle switching area of the first game interface;
receiving a second click operation in the viewing-angle switching area of the first game interface;
and if the interval between the time of the first click operation and the time of the second click operation is less than a preset time interval, determining that a continuous touch operation exists in the viewing-angle switching area of the first game interface.
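The two-click recognition above amounts to a double-tap detector. A minimal sketch follows; the 0.3 s threshold is chosen for illustration only, since the patent leaves the preset time interval unspecified.

```python
# Sketch: treat two taps in the viewing-angle switching area as a
# "continuous touch operation" when their timestamps are closer than a
# preset interval. The 0.3 s default is an assumption for illustration.

class DoubleTapDetector:
    def __init__(self, max_interval=0.3):
        self.max_interval = max_interval  # seconds between the two taps
        self._last_tap_time = None

    def on_tap(self, timestamp):
        """Return True when this tap completes a double tap."""
        is_double = (self._last_tap_time is not None
                     and timestamp - self._last_tap_time < self.max_interval)
        # after a double tap, reset so a third tap starts a new sequence
        self._last_tap_time = None if is_double else timestamp
        return is_double

detector = DoubleTapDetector()
first = detector.on_tap(10.00)   # single tap: no continuous touch yet
second = detector.on_tap(10.20)  # within 0.3 s of the first tap
```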
7. The target position interaction method according to claim 1, further comprising:
in response to a movement operation on the crosshair in the third game interface, determining the screen position of the crosshair after the movement;
and acquiring the viewing-angle position of the moved screen position in the third game interface as the viewing-angle pointing position of the crosshair in the third game interface.
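Mapping the crosshair's screen position to a viewing-angle position can be done in several ways; one simple reading (an assumption, since the patent does not specify the projection) converts the screen offset from the centre into yaw/pitch offsets within the camera's field of view:

```python
# Sketch: convert a crosshair screen position to yaw/pitch offsets from
# the camera's forward direction. The FOV values and the linear mapping
# are illustrative assumptions.

def screen_to_view_angles(screen_x, screen_y, width, height,
                          fov_h_deg=90.0, fov_v_deg=60.0):
    """Return (yaw_offset, pitch_offset) in degrees for a screen point."""
    # normalise to [-0.5, 0.5] with (0, 0) at the screen centre
    nx = screen_x / width - 0.5
    ny = screen_y / height - 0.5
    yaw = nx * fov_h_deg
    pitch = -ny * fov_v_deg  # screen y grows downward; pitch grows upward
    return yaw, pitch

# crosshair at the exact screen centre points straight ahead
centre = screen_to_view_angles(960, 540, 1920, 1080)
# crosshair moved to the right edge points half the horizontal FOV right
right = screen_to_view_angles(1920, 540, 1920, 1080)
```

In an engine, these offsets would then be applied to the camera orientation and ray-cast into the scene to obtain the world-space pointing position.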
8. The target position interaction method according to claim 1, wherein the responding to the continuous touch operation in the viewing-angle switching area of the first game interface and additionally displaying a crosshair on the first game interface according to a preset screen position to obtain a second game interface with the initial viewing angle comprises:
in response to the continuous touch operation in the viewing-angle switching area of the first game interface, additionally displaying a crosshair on the first game interface according to the preset screen position, and additionally displaying a cancel operation control on the first game interface, to obtain a second game interface with the initial viewing angle;
the method further comprises:
in response to a touch operation on the cancel operation control in a target game interface, canceling display of the crosshair in the target game interface, wherein the target game interface is the second game interface or the third game interface.
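The crosshair/cancel-control lifecycle described above is a small piece of interface state; a minimal sketch, with all names hypothetical:

```python
# Sketch: the continuous touch shows the crosshair together with a cancel
# control; touching the cancel control dismisses the crosshair again.
# Class and method names are illustrative, not from the patent.

class GameInterface:
    def __init__(self):
        self.crosshair_visible = False
        self.cancel_control_visible = False

    def on_continuous_touch(self):
        # entering the second game interface: crosshair + cancel control
        self.crosshair_visible = True
        self.cancel_control_visible = True

    def on_cancel_touched(self):
        # the crosshair (and the cancel control itself) are dismissed
        self.crosshair_visible = False
        self.cancel_control_visible = False

ui = GameInterface()
ui.on_continuous_touch()
shown = ui.crosshair_visible
ui.on_cancel_touched()
hidden = not ui.crosshair_visible
```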
9. The target position interaction method according to any one of claims 1 to 8, wherein the taking the viewing-angle pointing position of the crosshair in the third game interface as a target interaction position and executing a target instruction comprises:
when the drag operation ends, taking the viewing-angle pointing position of the crosshair in the third game interface as the target interaction position;
and marking the target interaction position.
10. A target position interaction apparatus, comprising:
a first display unit, configured to display a first game interface with an initial viewing angle of a currently running game, wherein the first game interface comprises a viewing-angle switching area;
a second display unit, configured to, in response to a continuous touch operation in the viewing-angle switching area of the first game interface, additionally display a crosshair on the first game interface according to a preset screen position, to obtain a second game interface with the initial viewing angle;
a switching unit, configured to, in response to a drag operation in the viewing-angle switching area of the second game interface, switch the viewing angle of the second game interface and the viewing-angle pointing position of the crosshair, to obtain a third game interface with a target viewing angle, wherein the crosshair remains displayed on the third game interface according to the preset screen position, and the viewing-angle pointing position of the crosshair is switched from the viewing-angle position at the preset screen position in the second game interface to the viewing-angle position at the preset screen position in the third game interface;
and an interaction unit, configured to take the viewing-angle pointing position of the crosshair in the third game interface as a target interaction position and execute a target instruction.
11. An electronic device, comprising a processor and a memory, wherein the memory stores a plurality of instructions, and the processor loads the instructions from the memory to perform the steps of the target position interaction method according to any one of claims 1 to 9.
12. A computer-readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps of the target position interaction method according to any one of claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211111993.0A CN115430145A (en) | 2022-09-13 | 2022-09-13 | Target position interaction method and device, electronic equipment and readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115430145A true CN115430145A (en) | 2022-12-06 |
Family
ID=84248136
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211111993.0A Pending CN115430145A (en) | 2022-09-13 | 2022-09-13 | Target position interaction method and device, electronic equipment and readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115430145A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108671543A (en) * | 2018-05-18 | 2018-10-19 | 腾讯科技(深圳)有限公司 | Labelled element display methods, computer equipment and storage medium in virtual scene |
CN110115838A (en) * | 2019-05-30 | 2019-08-13 | 腾讯科技(深圳)有限公司 | Method, apparatus, equipment and the storage medium of mark information are generated in virtual environment |
CN111773705A (en) * | 2020-08-06 | 2020-10-16 | 网易(杭州)网络有限公司 | Interaction method and device in game scene |
CN113262489A (en) * | 2021-04-28 | 2021-08-17 | 网易(杭州)网络有限公司 | Game route generation method, game route generation device, nonvolatile storage medium, and electronic device |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108671543A (en) * | 2018-05-18 | 2018-10-19 | 腾讯科技(深圳)有限公司 | Labelled element display methods, computer equipment and storage medium in virtual scene |
CN110115838A (en) * | 2019-05-30 | 2019-08-13 | 腾讯科技(深圳)有限公司 | Method, apparatus, equipment and the storage medium of mark information are generated in virtual environment |
US20210286446A1 (en) * | 2019-05-30 | 2021-09-16 | Tencent Technology (Shenzhen) Company Limited | Method and apparatus for generating mark information in virtual environment, electronic device, and storage medium |
CN111773705A (en) * | 2020-08-06 | 2020-10-16 | 网易(杭州)网络有限公司 | Interaction method and device in game scene |
CN113262489A (en) * | 2021-04-28 | 2021-08-17 | 网易(杭州)网络有限公司 | Game route generation method, game route generation device, nonvolatile storage medium, and electronic device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111760274A (en) | Skill control method and device, storage medium and computer equipment | |
CN113082712A (en) | Control method and device of virtual role, computer equipment and storage medium | |
CN113398590B (en) | Sound processing method, device, computer equipment and storage medium | |
CN113633963A (en) | Game control method, device, terminal and storage medium | |
CN113332721B (en) | Game control method, game control device, computer equipment and storage medium | |
CN113426124A (en) | Display control method and device in game, storage medium and computer equipment | |
CN116115991A (en) | Aiming method, aiming device, computer equipment and storage medium | |
CN115970284A (en) | Attack method and device of virtual weapon, storage medium and computer equipment | |
CN114053714A (en) | Virtual object control method and device, computer equipment and storage medium | |
WO2024103623A1 (en) | Method and apparatus for marking virtual item, and computer device and storage medium | |
CN115382221A (en) | Method and device for transmitting interactive information, electronic equipment and readable storage medium | |
CN115212567A (en) | Information processing method, information processing device, computer equipment and computer readable storage medium | |
CN115193046A (en) | Game display control method, device, computer equipment and storage medium | |
CN115430145A (en) | Target position interaction method and device, electronic equipment and readable storage medium | |
CN116139483A (en) | Game function control method, game function control device, storage medium and computer equipment | |
CN113426115A (en) | Game role display method and device and terminal | |
CN118477305A (en) | Game card control method, device, computer equipment and storage medium | |
CN117482516A (en) | Game interaction method, game interaction device, computer equipment and computer readable storage medium | |
CN116920384A (en) | Information display method and device in game, computer equipment and storage medium | |
CN119633369A (en) | Game control method, device, computer equipment and storage medium | |
CN118179012A (en) | Game interaction method, game interaction device, computer equipment and computer readable storage medium | |
CN117919694A (en) | Game control method, game control device, computer equipment and storage medium | |
CN117482523A (en) | Game interaction method, game interaction device, computer equipment and computer readable storage medium | |
CN115193062A (en) | Game control method, device, storage medium and computer equipment | |
CN118576971A (en) | Game control method, device, computer equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||