
CN116459509A - Method, device, computer equipment and storage medium for controlling movement of virtual object - Google Patents

Method, device, computer equipment and storage medium for controlling movement of virtual object

Info

Publication number
CN116459509A
Authority
CN
China
Prior art keywords
control
area
virtual object
movement
functional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310401376.2A
Other languages
Chinese (zh)
Inventor
江阳晨
粟鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202310401376.2A priority Critical patent/CN116459509A/en
Publication of CN116459509A publication Critical patent/CN116459509A/en
Priority to PCT/CN2023/113071 priority patent/WO2024212412A1/en
Pending legal-status Critical Current


Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55Controlling game characters or game objects based on the game progress
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/0418Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a method, a device, computer equipment and a storage medium for controlling movement of a virtual object. The method comprises: obtaining the current position of a controlled virtual object; when the current position is located in a non-driving area in the game scene, controlling the controlled virtual object to move in the game scene in response to a touch operation acting on a movement control area, and displaying a first functional control; and when the current position is located in a driving area in the game scene, displaying the first functional control and at least one second functional control in response to a touch operation on the movement control area. In the embodiment of the application, without adding operation controls, a new functional control is called out through a touch operation on the existing movement control area of the graphical user interface, thereby realizing a new function, simplifying the interactive interface, and improving interaction efficiency.

Description

Method, device, computer equipment and storage medium for controlling movement of virtual object
Technical Field
The present application relates to the field of game technologies, and in particular, to a method and apparatus for controlling movement of a virtual object, a computer device, and a storage medium.
Background
In recent years, with the development of game technology, game content has become increasingly diversified and the daily lives of users increasingly enriched. In some game scenes, a user may control a virtual object to perform a variety of virtual actions. In particular, in the game scene of a competitive game, a user may control a virtual object to drive a vehicle.
In the prior art, controls for a virtual vehicle are added directly to the game interface, so the interactive interface contains too many controls and accidental touches occur easily. In addition, the user can drive the virtual vehicle only by performing multiple interaction steps based on these controls: for example, the user first searches for and selects the virtual vehicle to be driven from those present in the game scene, and then touches the control for driving it so that the virtual vehicle moves. Interaction efficiency is therefore low.
Disclosure of Invention
The embodiment of the application provides a method, a device, computer equipment and a storage medium for controlling the movement of a virtual object, which can simplify an interactive interface and improve interactive efficiency.
The embodiment of the application provides a method for controlling movement of a virtual object, in which a graphical user interface is provided through a terminal, and the content displayed by the graphical user interface at least partially comprises a game scene and a controlled virtual object located in the game scene. The method comprises: acquiring the current position of the controlled virtual object; when the current position is located in a non-driving area in the game scene, controlling the controlled virtual object to move in the game scene in response to a touch operation acting on a movement control area, and displaying a first functional control, wherein the first functional control is configured to control, in response to a triggering operation, the controlled virtual object to move according to a first movement state corresponding to the first functional control; and when the current position is located in a driving area in the game scene, controlling display of the first functional control and at least one second functional control in response to a touch operation acting on the movement control area, wherein the movement state corresponding to the second functional control is different from the movement state corresponding to the first functional control, the second functional control is configured to control, in response to a triggering operation, the controlled virtual object to move according to a second movement state corresponding to the second functional control, and the second movement state corresponds to the driving area.
The embodiment of the application also provides a movement control device for a virtual object, in which a graphical user interface is provided through a terminal, and the content displayed by the graphical user interface at least partially comprises a game scene and a controlled virtual object located in the game scene. The device comprises: an acquisition unit configured to acquire the current position of the controlled virtual object; and a control unit configured to, when the current position is located in a non-driving area in the game scene, control the controlled virtual object to move in the game scene in response to a touch operation acting on a movement control area, and display a first functional control, wherein the first functional control is configured to control, in response to a triggering operation, the controlled virtual object to move according to a first movement state corresponding to the first functional control. The control unit is further configured to, when the current position is located in a driving area in the game scene, control display of the first functional control and at least one second functional control in response to a touch operation acting on the movement control area, wherein the movement state corresponding to the second functional control is different from the movement state corresponding to the first functional control, and the second functional control is configured to control, in response to a triggering operation, the controlled virtual object to move according to a second movement state corresponding to the second functional control, the second movement state corresponding to the driving area.
The embodiment of the application also provides computer equipment, which comprises a processor and a memory, wherein the memory stores a plurality of instructions; the processor loads instructions from the memory to execute steps in any of the virtual object movement control methods provided in the embodiments of the present application.
The embodiment of the application also provides a computer readable storage medium, which stores a plurality of instructions adapted to be loaded by a processor to perform the steps in any of the virtual object movement control methods provided in the embodiment of the application.
In the embodiment of the application, the current position of the controlled virtual object can be acquired; when the current position is located in a non-driving area in the game scene, the controlled virtual object is controlled to move in the game scene in response to a touch operation acting on a movement control area, and a first functional control is displayed, the first functional control being configured to control, in response to a triggering operation, the controlled virtual object to move according to a first movement state corresponding to the first functional control; and when the current position is located in a driving area in the game scene, the first functional control and at least one second functional control are displayed in response to a touch operation acting on the movement control area, the movement state corresponding to the second functional control being different from the movement state corresponding to the first functional control, and the second functional control being configured to control, in response to a triggering operation, the controlled virtual object to move according to a second movement state corresponding to the second functional control, the second movement state corresponding to the driving area.
In the application, without adding operation controls, a new functional control can be called out through a touch operation on the existing movement control area of the graphical user interface, thereby realizing a new function, simplifying the interactive interface, and avoiding the accidental touches caused by too many controls on the interface. In addition, when the controlled virtual object is located in different areas, such as a non-driving area and a driving area, different functional controls can be called out by a touch operation on the movement control area, so that the user can control the controlled virtual object to adopt different movement states. When the controlled virtual object is located in the driving area, the second functional control corresponding to the driving area can be called out, so that the second functional control is triggered and the second movement state is entered in a manner that players naturally understand, improving interaction efficiency.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1a is a schematic view of a scenario of a method for controlling movement of a virtual object according to an embodiment of the present application;
fig. 1b is a flow chart of a method for controlling movement of a virtual object according to an embodiment of the present application;
fig. 1c is an interface schematic diagram of a controlled virtual object in a driving area and in a non-driving area, respectively, according to an embodiment of the present application;
FIG. 1d is a schematic diagram of a functionality control and a functionality response area provided by an embodiment of the present application;
fig. 1e is a schematic diagram of another interface of a method for controlling movement of a virtual object according to an embodiment of the present application;
FIG. 1f is a schematic diagram of another interface of a method for controlling movement of a virtual object according to an embodiment of the present application;
fig. 1g is a schematic diagram of a touch display function control provided in an embodiment of the present application;
FIG. 1h is a schematic diagram of an adjustment functionality control provided by an embodiment of the present application;
FIG. 1i is yet another schematic diagram of an adjustment functionality control provided by an embodiment of the present application;
FIG. 1j is a further schematic diagram of an adjustment functionality control provided by an embodiment of the present application;
fig. 2 is a flow chart of a method for controlling movement of a virtual object according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a movement control device for a virtual object according to an embodiment of the present application;
Fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
Before explaining the embodiments of the present application in detail, some terms related to the embodiments of the present application are explained.
The terms "first," "second," and the like used herein may describe various concepts, but the concepts are not limited by these terms unless otherwise specified; these terms are only used to distinguish one concept from another. "At least one" means one or more; for example, at least one user may be any integer number of users of one or more, such as one user, two users, or three users. "A plurality" means two or more; for example, a plurality of users may be any integer number of users of two or more, such as two users or three users.
Game scene: the scene that an application displays (or provides) while running on a terminal. The game scene may be a simulation of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. The game scene may be any one of a two-dimensional, 2.5-dimensional, or three-dimensional game scene; the dimensionality of the game scene is not limited in the embodiments of the present application. For example, a game scene may include sky, land, sea, and the like, where the land may include environmental elements such as deserts and cities, and the user may control the virtual object to move there.
Virtual object: an object used in a game scene to simulate a character or animal, such as a virtual character, a virtual animal, or a cartoon character displayed in the game scene. A virtual object may be an avatar representing the user in the game scene. A game scene may include a plurality of virtual objects, each having its own shape and volume and occupying a portion of the space in the game scene. The activities of a virtual object may include: adjusting body posture, crawling, walking, running, riding, flying, jumping, aiming with a virtual sight, shooting, driving, picking up, attacking, throwing, releasing skills, and the like.
In some embodiments, the content displayed in the graphical user interface comprises, at least in part, a game scene, wherein the game scene comprises at least one virtual object.
In some embodiments, the virtual objects in a game scene include user-operated game characters (player characters) and system-controlled game characters that are not operated by the user (non-player characters, NPCs).
Virtual vehicle: a vehicle used to transport operated virtual objects in a game scene. The specific form of the virtual vehicle in the game scene may include, but is not limited to, at least one of the following: automobiles, bicycles, motorcycles, ships, airplanes, trains, animals, skateboards, and the like.
Game interface: the interface corresponding to the application program, provided or displayed through the graphical user interface; it includes the graphical user interface for user interaction and the game picture, where the game picture is a picture of the game scene.
In some embodiments, game controls (e.g., skill controls, movement rockers, character control controls, and functionality controls such as backpack controls, chat controls, system setup controls, etc.), indication identifiers (e.g., direction indication identifiers, character indication identifiers, etc.), information presentation areas (e.g., number of taps, time of play, etc.) may be included in the game interface.
The embodiment of the application provides a method, a device, computer equipment and a storage medium for controlling movement of a virtual object.
The movement control device of the virtual object may be integrated in an electronic device, and the electronic device may be a terminal, a server, or another device. The terminal may be a mobile phone, a tablet computer, a smart Bluetooth device, a notebook computer, a personal computer (PC), or the like; the server may be a single server or a server cluster composed of multiple servers.
In some embodiments, the movement control device of the virtual object may run on a terminal device or a server. The terminal device may be a local terminal device. When the method for controlling movement of a virtual object runs on a server, it can be implemented and executed based on a cloud interaction system, where the cloud interaction system comprises a server and a client device.
In an alternative embodiment, various cloud applications may run under the cloud interaction system, for example, cloud games. Taking a cloud game as an example, a cloud game is a game mode based on cloud computing. In the running mode of a cloud game, the body that runs the game program is separated from the body that presents the game picture: the storage and execution of the movement control method of the virtual object are completed on the cloud game server, while the client device only receives and sends data and presents the game picture. For example, the client device may be a display device with data transmission capability close to the user side, such as a terminal, a television, a computer, or a handheld computer, while the terminal device that performs the character control is the cloud game server. When playing, the user operates the client device to send an operation instruction, such as an instruction of a touch operation, to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as the game pictures, and returns the data to the client device through the network; finally, the client device decodes the data and outputs the game pictures.
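As a rough sketch of the exchange described above (purely illustrative and not part of the patent; the function names are hypothetical, and zlib compression stands in for real video encoding):

```python
import zlib

def cloud_server_step(operation_instruction: str) -> bytes:
    """Server side: run the game according to the received operation
    instruction, then encode and compress the resulting game picture
    before returning it over the network."""
    frame = f"frame after '{operation_instruction}'"   # stand-in for game logic
    return zlib.compress(frame.encode("utf-8"))        # encode/compress the picture

def client_step(encoded_frame: bytes) -> str:
    """Client side: decode the received data and output the game picture."""
    return zlib.decompress(encoded_frame).decode("utf-8")
```

A round trip then looks like `client_step(cloud_server_step("touch"))`, mirroring the instruction-out, picture-back loop described above.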
In some embodiments, the server may also be implemented in the form of a terminal.
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores a game program and is used to present a game screen. The local terminal device is used for interacting with a user through a graphical user interface, namely, conventionally downloading and installing a game program through the electronic device and running the game program. The way in which the local terminal device provides the graphical user interface to the user may include a variety of ways, for example, it may be rendered for display on a display screen of the terminal, or provided to the user by holographic projection. For example, the local terminal device may include a display screen for presenting a graphical user interface including game visuals, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen. A user can operate on the interface through an input device such as a touch screen, mouse, keyboard, or handle.
For example, referring to fig. 1a, a schematic view of a scenario of a movement control system of a virtual object is provided, which may implement a movement control method of the virtual object. In this scenario, a terminal and a game server may be included. The terminal provides a graphical user interface, and the content displayed by the graphical user interface at least partially comprises a game scene and a controlled virtual object positioned in the game scene.
The terminal can be used for acquiring the current position of the controlled virtual object; when the current position is located in a non-driving area in the game scene, controlling the controlled virtual object to move in the game scene in response to a touch operation acting on the movement control area, and displaying a first functional control, where the first functional control is configured to control, in response to a triggering operation, the controlled virtual object to move according to a first movement state corresponding to the first functional control; and when the current position is located in a driving area in the game scene, controlling display of the first functional control and at least one second functional control in response to a touch operation acting on the movement control area, where the movement state corresponding to the second functional control is different from the movement state corresponding to the first functional control, the second functional control is configured to control, in response to a triggering operation, the controlled virtual object to move according to a second movement state corresponding to the second functional control, and the second movement state corresponds to the driving area.
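As a minimal sketch (not part of the patent; the names are illustrative), the branch the terminal performs when a touch operation acts on the movement control area can be written as:

```python
def controls_to_display(in_driving_area: bool) -> list:
    """Decide which functional controls to show on a touch in the
    movement control area: the first functional control is always shown,
    and the second functional control(s) are added only when the
    controlled virtual object is inside a driving area."""
    controls = ["first_functional_control"]
    if in_driving_area:
        # e.g. a control that puts the object into the driving movement state
        controls.append("second_functional_control")
    return controls
```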
The game server may be used to obtain data of a game played by a user at the terminal.
The details are described below; the order of the following embodiments is not a limitation on their preferred order. It should be understood that the specific embodiments of the present application involve user-related data such as touch operations, triggering operations, sliding operations, adjusting operations, virtual vehicles, virtual backpacks, and movement parameters. When the above embodiments are applied to specific products or technologies, the user's permission or consent needs to be obtained, and the collection, use, and processing of the related data need to comply with the relevant laws, regulations, and standards of the relevant countries and regions.
In this embodiment, a method for controlling movement of a virtual object is provided, a graphical user interface is provided through a terminal, and content displayed by the graphical user interface at least partially includes a game scene and a controlled virtual object located in the game scene, as shown in fig. 1b, a specific flow of the method for controlling movement of a virtual object may be as follows:
110. and acquiring the current position of the controlled virtual object.
The controlled virtual object refers to a virtual object controlled by operations on the terminal, such as a game character controlled by the user through the terminal. In some implementations, the virtual object may be a virtual character active in the game scene. In some embodiments, the number of virtual objects participating in the interaction in the game scene may be preset, or may be dynamically determined according to the number of terminals participating in the interaction.
The current position refers to the position of the controlled virtual object in the game scene at the current moment.
For example, when a user controls the corresponding controlled virtual object to move in the game scene, the game server may monitor the position of the controlled virtual object in the game scene in real time, and determine whether that position lies in a driving area or a non-driving area of the game scene.
120. When the current position is located in a non-driving area in the game scene, controlling the controlled virtual object to move in the game scene in response to touch operation on the movement control area, and displaying a first functional control.
The first functional control is configured to respond to the triggering operation and control the controlled virtual object to move according to a first movement state corresponding to the first functional control.
The driving area refers to an area preset in the game scene in which a virtual vehicle can be driven; the embodiments of the present application do not limit its specific form. For example, the driving area may have a regular shape such as a circle or a rectangle, may have an irregular shape, or may be an area preset in the game scene such as a town or wilderness. The non-driving area refers to an area preset in the game scene in which a virtual vehicle cannot be driven. For example, one or more open areas in the game scene may be preset as driving areas according to the game settings, i.e., the driving areas are fixed areas preset on the game map. Alternatively, a driving area may be an area set according to the current position or movement range of the controlled virtual object in the game (i.e., a non-fixed area); for example, the driving area may be an area of a specified shape determined with the real-time position of the controlled virtual object in the game scene as its center, at a specified time before the start of the game, for example, 10 minutes before the start. In any driving area, the user can control the controlled virtual object to move in the second movement state; the areas of the game scene other than the driving areas are non-driving areas.
The movement control area refers to an area on the graphical user interface for triggering and displaying the first functional control and/or the second functional control.
In some implementations, the movement control region may include a response region (hereinafter referred to as a call-out area) for triggering display of the first and/or second functional controls. For example, the controlled virtual object may be controlled to move by walking, running, crawling, or the like when the movement rocker is dragged, and the first and/or second functional controls are called out when the movement rocker is dragged to the call-out area.
In some embodiments, the first functional control and/or the second functional control is called out when the distance by which the movement rocker is dragged in the specified direction is greater than a preset threshold distance.
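The drag-distance condition can be expressed as a comparison of the joystick displacement against the threshold. A hedged sketch, assuming screen coordinates where y grows downward and the specified direction is upward; the threshold value and the 45-degree cone test are illustrative assumptions, not the patent's definitive rule:

```python
def should_call_out(start, current, threshold=40.0):
    """Call out the functional control(s) once the movement rocker has
    been dragged upward farther than the preset threshold distance."""
    dx = current[0] - start[0]
    dy = start[1] - current[1]      # upward drag yields positive dy
    # require the drag to be roughly upward (within a 45-degree cone)
    return dy > threshold and abs(dx) < dy

print(should_call_out((100, 300), (100, 200)))  # 100 px straight up
print(should_call_out((100, 300), (200, 300)))  # sideways drag
```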
In some embodiments, the movement control region may further include a control region (hereinafter referred to as an initial control region) for controlling the controlled virtual object to move in a normal state, such as a control region corresponding to a joystick. It should be noted that, through the initial touch operation of the initial control area, the controlled virtual object may be controlled to move in the game scene in a normal state other than the first moving state and the second moving state, for example, the normal state may be to control the controlled virtual object to move in a walking, running, crawling or other manner. Alternatively, the initial touch operation may be a sliding operation. For example, the user may control the controlled virtual object to move in the game scene in the form of walking, running, or the like by an initial sliding operation in the initial control area until the controlled virtual object moves into the driving area. The initial sliding operation may be a sliding operation in an arbitrary direction or a specified direction.
In some embodiments, the initial control region and the movement control region may be two independent regions.
The first functional control is a functional control for controlling the controlled virtual object to move in the first movement state. The first movement state may be a movement state preset according to the application scenario or actual needs; for example, the first movement state may be controlling the controlled virtual object to move in a manner such as running.
For example, when the game server monitors that the controlled virtual object is located in a non-driving area in the game scene (i.e., the current position of the controlled virtual object is located in the non-driving area in the game scene), the user may touch the movement control area, and in response to the touch operation, display the first functional control to control the controlled virtual object to move in the first movement state by triggering the first functional control, as shown in an interface schematic diagram of the controlled virtual object located in the non-driving area in (1) in fig. 1 c. Therefore, the first functional control is triggered and displayed through the touch operation of the movement control area, on one hand, the function of controlling the virtual object to move in the first movement state can be newly added on the basis of the original function of the movement control area, on the other hand, a user can autonomously control whether the first functional control is displayed or not, and false touch caused by excessive control on the interface can be avoided.
In some embodiments, the first functionality control is a running control and the first movement state is a running state.
The running control may refer to a control for controlling the virtual object to run. For example, the control may be identified by text, graphics, or the like, where the graphics may include asterisks, dots, triangles, icons, and the like.
In some embodiments, the touch operation for displaying the first functional control may be a first touch operation, and the initial touch operation and the operation mode of the first touch operation are different. For example, the initial touch operation may be a sliding operation in any direction acting in the initial control area, and the first touch operation may be a sliding operation in a specified direction acting in the movement control area.
In some implementations, the first functionality control corresponds to a first functionality response area.
The function response area is an area for responding to a triggering operation of the function control. For example, as shown in the schematic diagrams of the functional control and the functional response area shown in fig. 1d, the first functional response area corresponding to the first functional control may be an area located above the moving rocker, in which the first functional control is displayed, that is, when the user drags the rocker upward to the first functional response area, the first functional control is triggered, and if the user drags the moving rocker in other directions or the user does not drag the moving rocker to the first functional response area, the first functional control is not triggered.
130. And when the current position is located in the driving area in the game scene, controlling to display the first functional control and at least one second functional control in response to the touch operation on the movement control area.
The movement state corresponding to the second functional control is different from the movement state corresponding to the first functional control, the second functional control is configured to respond to the triggering operation, and the controlled virtual object is controlled to move according to the second movement state corresponding to the second functional control, wherein the second movement state corresponds to the driving area.
The second functional control is a functional control for controlling the controlled virtual object to move in a second moving state. The second movement state refers to a movement state corresponding to the driving region, and for example, the second movement state may be a driving state.
The triggering operation may include, but is not limited to, operations such as touching, dragging, pressing, long pressing, short pressing, double clicking, ending dragging, sliding, etc., where the user can perform the triggering operation through an input device such as a touch screen, a mouse, a keyboard, or a handle, and the specific operation mode depends on the game operation method or the specific game setting.
For example, when the user controls the controlled virtual object to enter the driving area in the game scene (i.e., the current position of the controlled virtual object is located in the driving area in the game scene), as shown in the interface schematic diagram of the controlled virtual object located in the driving area in (2) in fig. 1c, the first functional control and the second functional control may be displayed above the movement rocker when the movement rocker is dragged. Therefore, without adding operation controls, the embodiments of the application can call out new functional controls through a touch operation on the original movement control area of the graphical user interface and realize new functions, so that the interactive interface is simplified and false touches caused by excessive controls on the interface are avoided. In addition, when the controlled virtual object is located in different areas such as the non-driving area and the driving area, different functional controls can be called out by a touch operation on the movement control area, so that the user can control the controlled virtual object to enter different movement states. When the controlled virtual object is located in the driving area, the first functional control and the second functional control corresponding to the driving area can be called out, so that the second functional control is triggered and the second movement state is entered in a manner that players naturally understand, improving interaction efficiency.
Optionally, after the second functional control is displayed, if the controlled virtual object has not left the driving area and has not started moving in the second movement state corresponding to the second functional control, the second functional control remains displayed until the controlled virtual object leaves the driving area or the second functional control is triggered, which avoids repeated operations to display the second functional control within the driving area and simplifies the interaction process. Specifically, the method for controlling movement of the virtual object may further include: if the controlled virtual object leaves the driving area or moves according to the second movement state corresponding to the second functional control, hiding the second functional control.
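The display-persistence rule above reduces to simple state logic: the second functional control stays visible until the object leaves the driving area or the control is triggered. A minimal sketch with assumed names:

```python
def second_control_visible(in_driving_area, control_triggered):
    """The second functional control remains displayed while the
    controlled virtual object stays in the driving area and the
    control has not yet been triggered; otherwise it is hidden."""
    return in_driving_area and not control_triggered

print(second_control_visible(True, False))   # still in area, untriggered
print(second_control_visible(False, False))  # left the driving area
print(second_control_visible(True, True))    # control was triggered
```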
In some embodiments, the graphical user interface further comprises a movement rocker, the movement control area is a contact area of the movement rocker, the triggering operation of the target functionality control is a release operation, and the target functionality control comprises at least one of a first functionality control and a second functionality control.
The movement rocker is used for controlling the virtual object to move, and the contact area of the movement rocker is used for responding to touch operations on the movement rocker. For example, the contact area of the movement rocker may include, but is not limited to, a wheel-type control composed of a chassis and a stick, and the virtual object may be controlled to move in different directions in the game scene by dragging the stick in the movement rocker control, or by dragging the movement rocker displayed on the graphical user interface. The functional response area of the target functional control may be located in an associated area of the movement rocker; for example, the first functional response area and/or the second functional response area may be located near the movement rocker, such as above the wheel-type control composed of the chassis and the stick.
The release operation refers to the operation of ending the touch on the functional control. For example, the user may touch the first functional control displayed on the touch screen with a finger; when the finger leaves the touch screen, the touch on the first functional control ends, and the operation of the finger leaving the touch screen is the release operation.
For example, the controlled virtual object may be controlled to move in the first movement state or the second movement state after the triggering operation of the first function control or the second function control is finished. Therefore, after the touch is finished, the user can perform other operations, such as driving operation on the virtual carrier, in the first moving state or the second moving state, so that user experience is improved.
In some implementations, the first functional control and/or the second functional control may be displayed in an associated area of the movement rocker. The associated area of the movement rocker may refer to an area in the graphical user interface set in correspondence with the movement rocker according to actual needs; for example, it may be an area near the movement rocker, such as an area above the movement rocker, or an area within the movement control area.
In some embodiments, when the controlled virtual object enters the driving area, related prompt information may be displayed on the game interface to prompt the user to perform related operations of the driving area. Specifically, the method further comprises:
And when the controlled virtual object is detected to enter the driving area, displaying driving prompt information on the graphical user interface.
The driving prompt information may refer to information prompting that the controlled virtual object has entered the driving area, and may be displayed in at least one form among text, graphics, images, and the like. For example, when it is detected that the controlled virtual object enters the driving area, as shown in the interface schematic diagram of the controlled virtual object located in the driving area in (2) in fig. 1c, a popup prompt such as "a vehicle can be driven in the current area" may be displayed in the center of the game interface (graphical user interface), for example repeated at least three times.
Optionally, the driving prompt information and the second functional control may also be displayed subject to limits in the game settings. For example, it may be set that the driving prompt information and the second functional control are displayed only within the first 10 minutes after the game starts, which increases the diversity of the game and improves user experience.
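Such a time-limited display can be gated on elapsed match time. A sketch assuming the limit is the first 10 minutes of the game; the 600-second constant and the function name are illustrative assumptions:

```python
def may_show_driving_ui(entered_driving_area, elapsed_seconds, limit=600):
    """Show the driving prompt and second functional control only if the
    controlled virtual object entered a driving area within the first
    `limit` seconds of the game (assumed rule)."""
    return entered_driving_area and elapsed_seconds <= limit

print(may_show_driving_ui(True, 120))   # 2 minutes in
print(may_show_driving_ui(True, 900))   # 15 minutes in, past the limit
```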
Optionally, the driving prompt information may include at least one of a popup prompt and a control prompt, and the control prompt may be a prompt, located near the movement control area, for the triggering operation related to the second functional control. For example, as shown in the interface diagrams of fig. 1e and 1f, the control prompt may be an upward arrow on the movement rocker, the arrow pointing to the second call-out area or the second functional response area of the second functional control, and the first functional control and the second functional control may be triggered along the direction of the arrow.
In some embodiments, the touch operation for displaying the first functional control and the at least one second functional control may be a second touch operation, and the operation modes of the first touch operation and the second touch operation may be the same or different. Therefore, when the movement control area includes the initial control area, the movement control area can realize different control functions in response to the initial touch operation, the first touch operation, the second touch operation, and the like. Giving the movement control area multiple functions in this way can simplify the game interface and avoid the false touches that easily occur when the interface has too many controls, thereby improving user experience, increasing user retention, and reducing server consumption.
Alternatively, the first touch operation may be a first sliding operation, and the second touch operation may be a second sliding operation. The sliding operation may include operations such as touch, drag, and end touch. For example, the process of the sliding operation of the moving stick may include touching the moving stick displayed on the touch screen by a finger, dragging the finger to an arbitrary position in a state where the touch is maintained, stopping the dragging, and moving the finger away from the touch screen to end the touch.
Alternatively, the initial sliding operation and the first sliding operation may be continuous operation or intermittent operation. The initial sliding operation and the second sliding operation may be continuous operation or intermittent operation.
Optionally, to avoid false triggering of the display of the first functionality control and/or the second functionality control, the first sliding operation and/or the second sliding operation may be a sliding operation in a specified direction. The designated directions corresponding to the first sliding operation and the second sliding operation may be the same or different.
For example, when the current position of the controlled virtual object is located in the non-driving area in the game scene, if a sliding operation from the movement control area in the specified direction is detected, the first functional control is displayed; when the current position is located in the driving area in the game scene, if a sliding operation from the movement control area in the specified direction is detected, the first functional control and at least one second functional control are displayed. The detection of the sliding operation may be detecting that the movement rocker is dragged upward, or detecting that the distance by which the rocker is dragged upward satisfies a preset distance.
Alternatively, the first sliding operation and/or the second sliding operation may be operations in which a sliding distance of the sliding operation in the specified direction is greater than a preset threshold distance. For example, the display of the first functionality control and/or the second functionality control may be triggered when the slide up exceeds a certain threshold distance.
Alternatively, the first sliding operation and the second sliding operation may both be sliding operations in a first specified direction, that is, the specified directions corresponding to the first sliding operation and the second sliding operation are the same. The first specified direction may be set according to the game operation method or specific game settings; for example, the first specified direction may be upward, that is, when the user slides upward in the movement control area, the first functional control is displayed if the controlled virtual object is located in the non-driving area, and the first functional control and the second functional control are displayed if the controlled virtual object is located in the driving area; if the user slides in other directions, the first functional control and/or the second functional control are not displayed.
Alternatively, the initial sliding operation may be the same as or different from the first sliding operation and/or the second sliding operation.
For example, the initial sliding operation may be the same as the first sliding operation and/or the second sliding operation, for example a sliding operation from the movement control area (movement rocker) in an arbitrary direction. That is, when the user touches the movement rocker and drags it in any direction (i.e., performs a sliding operation in any direction), the controlled virtual object is controlled to move in response to the sliding operation (the initial sliding operation), and if the controlled virtual object is located in the driving area at this time, the first functional control and the second functional control are displayed in response to the sliding operation (the second sliding operation).
For another example, the initial sliding operation may be different from the first sliding operation and/or the second sliding operation. The movement rocker may include a wheel response area (i.e., the wheel area of the movement rocker); the initial sliding operation may be a sliding operation of touching the movement rocker and dragging within the wheel response area, and the first sliding operation and/or the second sliding operation may be a sliding operation of touching the movement rocker and dragging to the call-out area. If the user drags the rocker from the wheel response area to the call-out area, the first sliding operation or the second sliding operation is considered to have been performed, that is, the first functional control and the second functional control are displayed. During the first sliding operation or the second sliding operation, the controlled virtual object can be controlled to move continuously until the touch is released.
Optionally, in order to avoid false triggering of the display of the first functional control and/or the second functional control, the call-out area may include a first call-out area and/or a second call-out area; the first sliding operation may be a sliding operation that slides to the first call-out area, the second sliding operation may be a sliding operation that slides to the second call-out area, and the first call-out area and the second call-out area may be the same area or different areas. For example, when it is detected that the user drags the movement rocker beyond the boundary of the movement rocker (i.e., into the first call-out area), the first functional control is displayed.
Alternatively, the first sliding operation may be a sliding operation in a specified direction that slides to the first call-out area, and the second sliding operation may be a sliding operation in a specified direction that slides to the second call-out area. For example, the first call-out area and the second call-out area may be set according to the game operation method or specific game settings. As shown in the schematic diagram of calling out functional controls by touch in fig. 1g, the second call-out area may be an area located above the movement rocker; that is, when the user drags the rocker upward to the second call-out area, the first functional control and the second functional control are displayed, and if the user drags the rocker in another direction or does not drag the rocker to the second call-out area, the first functional control and the second functional control are not displayed.
Alternatively, the first functional control may be displayed when the first sliding operation drags in the specified direction and/or reaches the first call-out area, or may be displayed when the first sliding operation ends, that is, when the touch is released.
Optionally, the first functional control and the second functional control may be displayed when the second sliding operation drags in the specified direction and/or reaches the second call-out area, or when the second sliding operation ends, that is, when the touch is released.
In some implementations, the second functionality control includes a vehicle call control configured to display the virtual vehicle in the game scene in response to the triggering operation and to control the controlled virtual object to drive the virtual vehicle to move.
The vehicle calling control is a control for calling out and displaying a virtual vehicle. For example, the control may be displayed in the form of text, graphics, or the like, where the graphics may include asterisks, dots, triangles, icons, and the like.
For example, when a user controls a controlled virtual object to enter a driving area in a game scene, a virtual carrier is not displayed in the driving area, and after the user triggers a carrier calling control through touch operation, the virtual carrier is displayed in the driving area. Therefore, a new mechanism for triggering and displaying the driving control can be provided for the user, and the diversity experience is improved. As shown in the interface schematic diagram of fig. 1f, after the user touches the control, the virtual vehicle may be displayed near the controlled virtual object in the driving area.
It should be noted that the virtual carrier may include various types of carriers such as automobiles, bicycles, motorcycles, ships, planes, trains, animals, surfboards, skateboards, and the like. Controlling the controlled virtual object to drive the virtual vehicle to move can refer to controlling the virtual vehicle to carry the controlled virtual object to move by using a control method corresponding to the virtual vehicle, and the control methods corresponding to different types of virtual vehicles are different. For example, the control method of the virtual vehicle of the type of an automobile, a bicycle, a motorcycle, a ship, an airplane, a train, etc. may be driving, the corresponding control method of the virtual vehicle of the type of an animal, etc. may be riding, and the corresponding control method of the virtual vehicle of the type of a surfboard, a skateboard, etc. may be sliding.
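The mapping from vehicle type to control method described above can be kept in a simple lookup table. A sketch of that mapping; the dictionary name and the fallback default are assumptions for illustration:

```python
CONTROL_METHOD = {
    "car": "driving", "bicycle": "driving", "motorcycle": "driving",
    "ship": "driving", "plane": "driving", "train": "driving",
    "animal": "riding",
    "surfboard": "sliding", "skateboard": "sliding",
}

def control_method(vehicle_type):
    # fall back to "driving" for unlisted vehicle types (assumption)
    return CONTROL_METHOD.get(vehicle_type, "driving")

print(control_method("animal"))      # riding
print(control_method("skateboard"))  # sliding
```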
It can be appreciated that the virtual vehicle in the embodiment of the present application is not an existing virtual vehicle in the driving area in the game scene, but a virtual vehicle that is summoned by the triggering operation of the vehicle summoning control. In practical application, the game can automatically allocate and display the virtual carriers for the controlled virtual objects according to the application scene or game setting, and can call and display the corresponding virtual carriers from the virtual carriers obtained by the carrier calling control. The method for calling and displaying the virtual carrier through the triggering operation of the driving control can call and display the virtual carrier which does not exist in the game scene near the controlled virtual object, does not need a series of operations of searching, going to the virtual carrier in the game scene and the like, can promote and simplify the interaction process of the controlled virtual object using the virtual carrier, and improves the efficiency.
In some embodiments, after the triggering operation is performed on the vehicle calling control, a virtual vehicle in the virtual backpack configured for the controlled virtual object can be acquired and displayed, so that the displayed virtual vehicle has an association with the controlled virtual object and the user can drive it more proficiently. Specifically, the virtual vehicle is acquired and displayed from the virtual backpack corresponding to the controlled virtual object.
For example, the virtual backpack of the controlled virtual object stores the medicines, virtual weapons, virtual vehicles, and the like acquired by the controlled virtual object, and when the user touches the vehicle calling control, a virtual vehicle in the virtual backpack can be displayed near the controlled virtual object.
Optionally, if there are multiple virtual vehicles in the virtual backpack of the controlled virtual object, the virtual vehicle with the highest priority may be acquired and displayed according to the priorities of the virtual vehicles. The priorities may be determined by, but are not limited to, the storage order of the virtual vehicles in the virtual backpack, the speed parameters of the virtual vehicles, the degree of damage of the virtual vehicles, and the like.
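The priority-based selection can be sketched as a keyed minimum over the backpack contents. This assumes each vehicle is a dict with damage, speed, and storage-slot fields; the field names and the ordering of the three criteria are illustrative, not mandated by the text:

```python
def pick_vehicle(backpack):
    """Return the highest-priority virtual vehicle: least damaged first,
    then fastest, then earliest stored (assumed ordering)."""
    return min(backpack, key=lambda v: (v["damage"], -v["speed"], v["slot"]))

backpack = [
    {"name": "bike",  "damage": 0.2, "speed": 20, "slot": 0},
    {"name": "car",   "damage": 0.0, "speed": 80, "slot": 1},
    {"name": "truck", "damage": 0.0, "speed": 60, "slot": 2},
]
print(pick_vehicle(backpack)["name"])  # car: undamaged and fastest
```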
In some embodiments, when the controlled virtual object is located in the non-driving area, the first functional control may be displayed through a touch operation on the movement control area, so as to control the controlled virtual object to move in the first movement state, and when the controlled virtual object enters the driving area, the second functional control is displayed. The method for controlling the movement of the virtual object further comprises the following steps:
when the current position is located in a non-driving area in the game scene, controlling the controlled virtual object to move in the game scene in response to touch operation acting on the movement control area, and displaying a first functional control;
Responding to triggering operation for the first functional control, and controlling the controlled virtual object to move according to a first movement state corresponding to the first functional control;
and controlling to display the second functional control in response to detecting that the controlled virtual object enters the driving area from the non-driving area.
For example, as shown in the interface schematic diagram of the controlled virtual object in the non-driving area in (1) in fig. 1c, when the controlled virtual object controlled by the user is in the non-driving area, the user may drag the movement rocker in the movement control area (i.e., the user performs a touch operation on the movement control area), and in response to the operation, the first functional control is displayed. When the user touches the first functional control, the controlled virtual object is controlled to move in the game scene in a first movement state such as a running state. As shown in the interface schematic diagram of the controlled virtual object in the driving area in (2) in fig. 1c, when it is detected that the controlled virtual object enters the driving area, the second functional control is displayed; at this time, the first functional control and the second functional control are displayed simultaneously.
In some embodiments, the first and second functionality controls may correspond to first and second functionally responsive regions, respectively.
The function response area is an area for responding to a triggering operation of the function control. The size and shape of the functional response area can be set according to actual needs or game settings, for example, the functional response area can be circular, fan-shaped or annular.
Optionally, the first functional response area and the second functional response area are different areas, so that the two functional controls can be distinguished conveniently, and false touch is avoided. For example, as shown in the schematic diagram of the functionality controls and the functionality responsive zones of fig. 1d, the first functionality responsive zone may be a semi-annular zone remote from the movement control zone (movement rocker) and the second functionality responsive zone may be a semi-annular zone close to the movement control zone (movement rocker).
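The two semi-annular response areas can be hit-tested by distance from the rocker center plus a check that the contact lies above the rocker. A sketch assuming screen coordinates with y growing downward; the radii are illustrative assumptions:

```python
import math

def hit_response_area(contact, center, inner=60.0, mid=100.0, outer=140.0):
    """Classify a contact point against the second functional response
    area (near half-ring) and the first one (far half-ring), both above
    the movement rocker; returns "second", "first", or None."""
    dx = contact[0] - center[0]
    dy = center[1] - contact[1]     # positive when contact is above
    if dy <= 0:
        return None                 # only the upper half responds
    r = math.hypot(dx, dy)
    if inner <= r < mid:
        return "second"
    if mid <= r < outer:
        return "first"
    return None

c = (200.0, 400.0)
print(hit_response_area((200.0, 320.0), c))  # r=80, near half-ring
print(hit_response_area((200.0, 280.0), c))  # r=120, far half-ring
print(hit_response_area((200.0, 460.0), c))  # below the rocker
```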
Alternatively, the triggering operation of the function response area may include, but is not limited to, a triggering operation of touching, dragging, pressing, long pressing, short pressing, double clicking, ending dragging, sliding, and the like, of the function response area.
Alternatively, the triggering operation on the first functional response area and the triggering operation on the second functional response area may be a continuous operation or intermittent operations.
Alternatively, the touch operation on the movement control area and the trigger operation on the second function response area may be continuous operation or intermittent operation. For example, after the user performs the second sliding operation, the first functional control and the second functional control may be displayed, when the user releases the touch of the second sliding operation, the user may touch again and drag the moving rocker to the functional response area of the second functional control, and when it is detected that the user releases the touch, the virtual carrier may be displayed in the driving area of the game scene.
Optionally, the touch operation on the movement control area and the triggering operation on the second functional response area may be a continuous operation, so that the process of displaying the second functional control and triggering it can be completed in a single sliding operation, improving interaction efficiency. For example, when the user touches and starts to drag the movement rocker (i.e., starts to perform the second sliding operation), the first functional control and the second functional control are displayed above the movement rocker in response to the sliding operation; if the movement rocker continues to be dragged to the functional response area of the first functional control, the controlled virtual object can be controlled to run, and if it continues to be dragged to the functional response area of the second functional control, the virtual vehicle can be displayed in the driving area. For another example, after the first functional control and the second functional control are displayed, the drag may bypass the functional response area of the first functional control and move to the functional response area of the second functional control, so that the controlled virtual object can be controlled more precisely.
In some embodiments, triggering operations may be performed in different functional response areas to trigger the corresponding functional controls. The method for controlling the movement of the virtual object further comprises the following steps:
Responding to the triggering operation in the first function response area, triggering a first function control corresponding to the first function response area, and controlling the controlled virtual object to move according to a first movement state corresponding to the first function control;
and in response to the contact point of the triggering operation moving from the first functional response area to the second functional response area such that the triggering operation is executed in the second functional response area, triggering the second functional control and controlling the controlled virtual object to move according to the second movement state corresponding to the second functional control.
For example, as shown in the interface schematic of fig. 1e, the moving rocker may be moved to the first functional control (first functional response area) by clicking or dragging to trigger the first functional control. As shown in the interface schematic of fig. 1f, when the first functional response area is touched, the contact may be dragged onward to the second functional response area to trigger the second functional control; that is, the moving rocker may be dragged through the first functional response area into the second functional response area. In this way, the second functional control is triggered through a continuous operation from the first functional response area to the second functional response area, which prevents accidental touches of the second functional control and improves triggering efficiency.
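The release-based triggering described above can be sketched as a simple hit test against two stacked distance bands. This is an illustrative sketch only; the band boundaries and control names are invented, not taken from the patent.

```python
# A minimal sketch of resolving which control fires when a drag starting on
# the moving rocker is released: the release point is hit-tested against the
# stacked response areas, so a drag may pass through the first area on its
# way to the second. All thresholds and names are illustrative assumptions.

FIRST_AREA = (360.0, 560.0)   # vertical-distance band from the rocker, in px
SECOND_AREA = (560.0, 760.0)  # band stacked above the first area


def in_area(dist, area):
    """True if a vertical distance falls inside a [lo, hi) band."""
    lo, hi = area
    return lo <= dist < hi


def control_on_release(drag_distances):
    """drag_distances: vertical distances of the contact from the rocker,
    sampled over the drag; the last sample is the release point."""
    if not drag_distances:
        return None
    release = drag_distances[-1]
    if in_area(release, SECOND_AREA):
        return "second"  # e.g. the vehicle calling control
    if in_area(release, FIRST_AREA):
        return "first"   # e.g. the running control
    return None
```

A drag through the first band into the second (e.g. distances `[100, 400, 600]`) resolves to the second control, while releasing inside the first band resolves to the first.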
Optionally, when the touch point of the triggering operation moves into the first functional response area or the second functional response area, the identifier of the first functional control or the second functional control may be highlighted to give the user visual feedback that the corresponding control has been reached.
Optionally, if the end position (release position) of the touch point of the triggering operation is not located in any functional response area and/or the end position is located between the first functional control and the second functional control, the functional control closest to the end position may be triggered. For example, if the second functional control is closer to the end position of the contact than the first functional control, the second functional control is triggered.
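The nearest-control fallback can be sketched as a distance comparison between the release point and each control's centre. The coordinates below are illustrative assumptions.

```python
# Sketch of the fallback described above: if the release point lies outside
# every response area, trigger the control whose centre is closest.
# Control names and centre coordinates are invented for illustration.

def nearest_control(release_pos, control_centres):
    """control_centres: mapping of control name -> (x, y) centre point."""
    def dist_sq(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(control_centres, key=lambda name: dist_sq(release_pos, control_centres[name]))


centres = {"first": (0.0, 368.0), "second": (0.0, 560.0)}
```

With these centres, a release at `(0, 520)` is closer to the second control, while a release at `(0, 400)` falls back to the first.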
In some embodiments, the first functional response area and the second functional response area are positioned in the same direction relative to the movement control area, and the first functional response area is located between the second functional response area and the movement control area.
For example, as shown in the schematic diagram of the functional controls and the functional response area in fig. 1d, the first functional response area is located above the moving rocker, and the second functional response area is located above the first functional response area, so that the control layout on the interface is optimized, and continuous triggering of multiple functional controls is also facilitated.
In some embodiments, the shape, position, layout, etc. of the functional response area can be custom modified by adjustment operations to provide a diversified triggering method. The method for controlling the movement of the virtual object further comprises the following steps:
adjusting the shape and/or size of a target function response area in response to a first adjustment operation on the target function response area, the target function response area including at least one of the first function response area and the second function response area;
and adjusting the area layout of the first function response area and the second function response area in response to a second adjustment operation on the target function response area.
Wherein the region layout includes at least one of an adjoining arrangement and a spacing arrangement.
The first adjustment operation refers to an operation for adjusting the shape and/or size of the target functional response area. The second adjustment operation refers to an operation for adjusting the region layout of the first functional response area and the second functional response area. The adjustment operations may include, but are not limited to, touching, dragging, pressing, long pressing, short pressing, double clicking, ending a drag, sliding, etc., and the user may perform them through an input device such as a touch screen, mouse, keyboard, or handle, depending on the game operation method or specific game settings.
Here, the adjacent arrangement means that the first functional response area adjoins the second functional response area, and the interval arrangement means that other areas, such as the movement control area, lie between the first functional response area and the second functional response area.
In some implementations, the region layout includes adjusting the location of the functional region.
Optionally, the second adjustment operation includes a first position adjustment operation to customize a response speed of the different functionality controls. Specifically, adjusting the region layout of the first function response region and the second function response region in response to a second adjustment operation on the target function response region includes:
responding to a first position adjustment operation aiming at the first functional response area and the second functional response area, taking the first functional response area as a functional response area corresponding to the second functional control, and taking the second functional response area as a functional response area corresponding to the first functional control.
Optionally, when the functional response area of a functional control is adjusted through the second adjustment operation, the control identifier displayed for that control in the graphical user interface also changes with the adjustment. Specifically, the positions of the first functional control and the second functional control are exchanged in response to the second adjustment operation for the first and second functional response areas.
For example, a user may enter a custom setting page provided by the game system and click the rocker control on the page to display the first functional control and the second functional control. As shown in the schematic diagram of adjusting the functional controls in fig. 1h, the user may touch and drag the first functional control up to the position of the second functional control to exchange the upper and lower positions of the control identifiers of the first functional control and the second functional control in the game interface; the functional response areas corresponding to the two controls are exchanged along with their positions.
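The swap of control positions together with their response-area bindings can be sketched as exchanging entries in a control-to-area mapping. Names and area labels below are invented for illustration.

```python
# Illustrative sketch of the first position-adjustment operation: the
# control -> response-area binding is held in a dict, and the swap exchanges
# the two bindings (mirroring the exchanged on-screen positions).

def swap_response_areas(binding):
    """binding: {"first_control": area, "second_control": area} (names invented)."""
    binding["first_control"], binding["second_control"] = (
        binding["second_control"], binding["first_control"])
    return binding


binding = {"first_control": "lower_area", "second_control": "upper_area"}
swap_response_areas(binding)
```

After the swap, the first control responds in the area that previously belonged to the second control, and vice versa.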
Optionally, the second adjustment operation includes a second position adjustment operation, and the distances between the running control and the driving control in the game interface can be customized and modified through the second position adjustment operation so as to avoid false touch. Specifically, adjusting the region layout of the first functional response region and the second functional response region includes:
and adjusting the distance between the target control and the movement control area in the designated direction in response to the second position adjustment operation for the target functional control.
For example, either of the first functional control and the second functional control can be touched and dragged on the custom setting page of the game, so that the functional response area of the first functional control and/or the second functional control in the game interface is moved farther from or closer to other areas; in this way the response positions of the two controls can be customized and accidental touches avoided.
In some embodiments, the response speed corresponding to the first functional response area is related to a first distance, and the response speed corresponding to the second functional response area is related to a second distance, where the first distance is the distance between the first functional response area and the movement control area, and the second distance is the distance between the second functional response area and the movement control area. For example, different response coefficients can be set according to the distance between a functional control and the movement control area; the shorter the distance, the larger the response coefficient, and the larger the response coefficient, the faster the response speed.
In some implementations, the position of the target functionality control may be adjusted by adjusting the vertical distance of the target functionality control from the movement control area in a specified direction. The shorter the vertical distance between the first functional control and/or the second functional control and the movement control area (movement rocker) is, the faster the trigger response is, and the longer the vertical distance is, the slower the trigger response is, so that the response efficiency of the functional control can be adjusted by adjusting the distance between the target functional control and the movement rocker.
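One way to realise "closer control, faster trigger response" is a response delay that grows with the vertical distance. The numbers below are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch: a trigger delay that grows linearly with the vertical
# distance between a functional control and the moving rocker, so closer
# controls respond faster. base_ms and ms_per_100px are invented parameters.

def trigger_delay_ms(vertical_distance_px, base_ms=50.0, ms_per_100px=10.0):
    """Smaller distance -> smaller delay -> faster trigger response."""
    return base_ms + ms_per_100px * vertical_distance_px / 100.0
```

With the 368PX and 560PX distances mentioned later in the text, the nearer control would respond measurably faster than the farther one.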
Optionally, when the distance between the target functional control and the movement control area in the designated direction is adjusted, the functional response area corresponding to the target functional control is adjusted according to the adjusted distance. For example, the vertical distances between the initially displayed first and second functional controls and the moving rocker are 368PX and 560PX respectively; the first functional control and/or the second functional control can be dragged upwards or downwards on the custom setting page of the game to adjust the vertical distance between its control identifier and the control identifier of the moving rocker, and the functional response area of the first functional control and/or the second functional control grows or shrinks synchronously with the adjusted vertical distance. As shown in the schematic diagram of the adjustment function control in fig. 1i, the first functional control is touched and dragged upwards to increase the vertical distance between the first functional control and the moving rocker. As shown in the schematic diagram of the adjustment function control in fig. 1j, the first functional control is dragged upwards until its vertical distance is 622PX, and the first functional response area is correspondingly adjusted from a semi-annular shape with an outer radius of 560PX to a semi-annular shape with an outer radius of 622PX.
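The 560PX-to-622PX example can be sketched as a semi-annular region whose outer radius tracks the dragged control. The inner radius of 368 px below is an assumption for illustration; the patent does not state it for this area.

```python
# Sketch of the fig. 1i/1j example: a semi-annular (upper-half) response
# area whose outer radius follows the control's vertical distance from the
# rocker. The 368 px inner radius is an invented assumption.
import math
from dataclasses import dataclass


@dataclass
class SemiAnnulus:
    inner_r: float
    outer_r: float

    def contains(self, dx, dy):
        """Upper-half annulus test; dy is the upward offset from the rocker."""
        r = math.hypot(dx, dy)
        return dy >= 0 and self.inner_r <= r <= self.outer_r


area = SemiAnnulus(inner_r=368.0, outer_r=560.0)
area.outer_r = 622.0  # control dragged from 560 px up to 622 px
```

After the drag, a contact at 600 px straight above the rocker now falls inside the enlarged area, while it would have missed the original 560 px outer radius.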
In some embodiments, when the controlled virtual object is located in the driving area, the first functional control and the at least one second functional control may be called out in stages to optimize the display effect of the functional controls. Specifically, when the current position is located in a driving area in the game scene, controlling to display a first functional control and at least one second functional control in response to a touch operation acting on the movement control area includes:
when the current position is located in a driving area in the game scene, calling out the first functional control and the at least one second functional control in stages in the movement control area.
Here, calling out in stages refers to displaying the functional controls in stages.
For example, calling out in stages may refer to calling out the first functional control and the second functional control sequentially according to the displacement of the contact point of the touch operation acting on the movement control area. In this way, the first functional control and the second functional control are not displayed simultaneously during the touch operation; instead, they are displayed in stages according to the contact position of the touch operation. For example, when the contact point of the touch operation slides upwards by a preset distance A, the first functional control is called out first; when the contact point continues sliding upwards to a preset distance B, the second functional control is called out, where the preset distance B is greater than the preset distance A.
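The staged call-out above can be sketched with two displacement thresholds. The preset distances A and B are illustrative values; the patent only requires B to be greater than A.

```python
# Sketch of staged call-out: sliding past preset distance A shows the first
# control; sliding past preset distance B (> A) additionally shows the
# second. The concrete values are invented for illustration.

PRESET_A = 120.0
PRESET_B = 240.0  # must be greater than PRESET_A


def called_out_controls(upward_displacement_px):
    """Controls shown so far, given how far the contact has slid upwards."""
    shown = []
    if upward_displacement_px >= PRESET_A:
        shown.append("first")
    if upward_displacement_px >= PRESET_B:
        shown.append("second")
    return shown
```

A short slide shows nothing, a medium slide shows only the first control, and a long slide shows both.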
In some embodiments, a preset number of first functional controls and second functional controls may be called out according to preset priorities of the functional controls, so that high-priority functional controls are called out first, optimizing the display effect of the functional controls; the called-out functional controls may be displayed inside the movement control area or outside it.
In some embodiments, the second movement state may be a state in which the controlled virtual object is controlled to drive the virtual vehicle to move. The method for controlling the movement of the virtual object further comprises the following steps:
and controlling the controlled virtual object to drive the virtual carrier.
For example, controlling the controlled virtual object to drive the virtual vehicle may be controlling the controlled virtual object to sit in the driving position of the virtual vehicle. At this point the virtual vehicle is stationary, and the user would need to execute the triggering operation on the second functional control again, such as dragging the moving rocker upwards to the second functional response area once more, to make the vehicle travel forwards; this manner may cause pauses and breaks in the experience.
In some embodiments, the controlling the controlled virtual object to drive the virtual vehicle may be controlling the controlled virtual object to drive the virtual vehicle to move according to the driving movement parameter, so as to improve the smoothness of the game operation and improve the user experience.
In some embodiments, the driving movement parameters of the virtual vehicle can be set based on the object movement parameters of the controlled virtual object, so that the difficulty of driving the virtual vehicle by the user is reduced, and the user experience is improved. The step of controlling the controlled virtual object to move according to the second movement state corresponding to the second functional control comprises the following steps: the controlled virtual object is controlled to move with a driving movement parameter, and the driving movement parameter is associated with an object movement parameter of the controlled virtual object.
Here, a movement parameter refers to a parameter related to the movement of a virtual object in the game, and may include at least one of speed, direction, etc. The driving movement parameter may refer to at least one of the speed, direction, etc. of the virtual vehicle, and the object movement parameter may refer to at least one of the speed, direction, etc. of the controlled virtual object.
Because the moving speed of a virtual vehicle is generally higher than that of the controlled virtual object, if the initial moving speed (driving movement parameter) of the virtual vehicle were set to the vehicle's normal speed, the controlled virtual object would switch directly from a lower moving speed to a much higher one, increasing the difficulty of driving the virtual vehicle. Setting the initial movement parameter of the virtual vehicle based on the object movement parameter of the controlled virtual object instead smooths the speed transition, lets the user adapt to the vehicle's movement state more quickly when starting to drive, reduces the difficulty of driving the virtual vehicle, and improves the user experience.
Optionally, the movement parameters include parameters of speed and/or direction.
For example, when the second functional control is triggered, if the moving speed of the controlled virtual object is X and its moving direction is A, the controlled virtual object can be controlled to automatically sit in the driving position of the virtual vehicle, with the initial moving speed of the virtual vehicle set to Y = X × 120% and the initial moving direction set to A.
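The numeric example can be transcribed directly: the vehicle inherits the object's direction and starts at 120% of the object's speed.

```python
# Transcription of the Y = X * 120% example above: the vehicle's initial
# motion is derived from the controlled object's current motion.

def initial_vehicle_motion(object_speed, object_direction):
    """Return (initial vehicle speed, initial vehicle direction)."""
    return object_speed * 1.2, object_direction
```

For instance, an object moving at speed 5.0 in direction "A" yields a vehicle starting at speed 6.0 in direction "A".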
In some embodiments, the driving movement parameter includes a movement speed, and the step of controlling the controlled virtual object to move according to a second movement state corresponding to the second function control further includes:
in response to a speed control operation acting on the speed control area, the moving speed is adjusted according to the speed control operation, and the controlled virtual object is controlled to move at the adjusted moving speed.
The speed control area is a response area for controlling the moving speed. Further, the response area for controlling the moving direction is a direction control area. In the embodiment of the present application, the movement control area, the initial control area (the control area corresponding to the rocker, which is used to control the controlled virtual object to move in the normal state), the speed control area, and the direction control area may be the same area or may be different areas. For example, the speed control region may be an acceleration control in the graphical user interface other than a rocker; the directional control region may be a directional adjustment control in the graphical user interface other than a joystick.
In some embodiments, the initial control region, the speed control region, and the direction control region may also be sub-regions of the movement control region.
The speed control operation is an operation for adjusting a driving movement parameter of the vehicle. For example, the speed control operation may be accelerating the virtual vehicle by touching the speed control area.
In some embodiments, the method for controlling movement of a virtual object further includes:
in response to a direction control operation acting on the direction control area, the movement direction of the controlled virtual object is adjusted according to the direction control operation, and the controlled virtual object is controlled to move in the adjusted movement direction according to the second movement state.
The direction control operation is an operation for adjusting the movement direction of the virtual vehicle. For example, the direction control operation may be steering the virtual vehicle by touching the direction control area.
For example, while the virtual object is being controlled to drive the virtual vehicle (i.e., move in the second movement state), the user may touch the movement rocker to take over movement control of the virtual vehicle, such as by dragging the movement rocker to accelerate or steer the virtual vehicle.
Optionally, when the controlled virtual object drives the virtual vehicle, a vehicle control may be displayed on the graphical user interface, where the vehicle control is used to control the moving speed and direction of the vehicle, and so on. The user can adjust the moving speed and direction of the carrier through the carrier control on the touch graphical user interface.
Optionally, if an obstacle exists in the moving direction of the virtual vehicle before the user adjusts the vehicle's moving speed and direction, the virtual vehicle can be automatically controlled to avoid the obstacle, which reduces the difficulty of driving the virtual vehicle and improves the user experience.
The virtual object movement control scheme provided by the embodiment of the application can be applied to various game scenes. For example, taking shooting game as an example, the current position of the controlled virtual object is obtained; when the current position is located in a non-driving area in the game scene, controlling the controlled virtual object to move in the game scene in response to touch operation acting on the movement control area, and displaying a first functional control, wherein the first functional control is configured to respond to the triggering operation and control the controlled virtual object to move according to a first movement state corresponding to the first functional control; when the current position is located in a driving area in the game scene, controlling and displaying a first functional control and at least one second functional control in response to touch operation acting on the movement control area, wherein the movement state corresponding to the second functional control is different from the movement state corresponding to the first functional control, and the second functional control is configured to respond to triggering operation to control the controlled virtual object to move according to a second movement state corresponding to the second functional control, and the second movement state corresponds to the driving area.
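The core branching of the scheme above can be condensed into a small dispatch: a touch on the movement control area calls out the first functional control everywhere, and additionally the second one inside a driving area. Control names are invented for illustration.

```python
# Condensed sketch of the scheme summarised above (control names are
# illustrative assumptions): which controls a touch on the movement control
# area calls out depends on whether the controlled object's current position
# lies in a driving area.

def controls_for_touch(in_driving_area):
    controls = ["first"]           # e.g. the running control, always shown
    if in_driving_area:
        controls.append("second")  # e.g. the vehicle calling control
    return controls
```

In a non-driving area only the first control appears; in a driving area both appear.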
As can be seen from the above, in the embodiments of the present application, new functional controls are called out through touch operations on the existing movement control area of the graphical user interface without adding operation controls, and new functions are realized, thereby simplifying the interactive interface and avoiding accidental touches caused by too many controls on the interface. In addition, when the controlled virtual object is located in different areas, such as a non-driving area and a driving area, touch operations on the movement control area can call out different functional controls, so that the user can control the controlled virtual object to execute different movement states. When the controlled virtual object is located in the driving area, the second functional control corresponding to the driving area can be called out, so that the second functional control is triggered and the second movement state is entered in a manner the player naturally understands, improving interaction efficiency.
The method described in the above embodiments will be described in further detail below.
In this embodiment, the method of the embodiments of the present application will be described in detail by taking an FPS (First-Person Shooter) mobile game as an example.
As shown in fig. 2, a specific flow of a method for controlling movement of a virtual object is as follows:
210. In response to an initial sliding operation for moving the joystick, the controlled virtual object is controlled to enter the driving area.
For example, in the FPS mobile game, a specific area is defined as a driving area in the game scene; in the initial state, there is no virtual vehicle in this area. The user can control the controlled virtual object to enter the driving area through a sliding operation on the moving rocker in any direction. When the game background detects that the controlled virtual object has entered the driving area, a popup prompt "a vehicle can be driven in the current area" can be displayed on the game interface, and this prompt may be displayed three times.
220. And in response to a first sliding operation for the mobile rocker, displaying a vehicle calling control and a running control in an associated area of the mobile rocker.
For example, the first sliding operation and the initial sliding operation may be continuous operations. After the controlled virtual object enters the driving area, the user can continue to drag the moving rocker in the designated direction to the calling area above the rocker; in response to the operation of dragging into the calling area, the vehicle calling control and the running control are displayed above the moving rocker. As shown in interface diagrams (1) and (2) in fig. 1c, in the FPS mobile game the game interface may display a reload control and two shooting controls located on the two sides, and the user can control the controlled virtual object to perform reload and shooting actions by touching the reload control and the shooting controls. When the user operates the moving rocker on the left side and keeps pushing towards the top of the screen, two guide buttons appear: [Sprint Lock] (the running control) and [Drive Vehicle] (the vehicle calling control). The running control corresponds to the first functional response area, and the vehicle calling control corresponds to the second functional response area.
For another example, the first sliding operation and the initial sliding operation may be intermittent operations. After the controlled virtual object enters the driving area, the user can release the touch, touch the moving rocker again and drag it to the calling area above the rocker, and in response to the operation of dragging into the calling area, the vehicle calling control and the running control are displayed above the moving rocker.
230. And responding to the second sliding operation, displaying the virtual carrier in the driving area, and controlling the controlled virtual object to drive the virtual carrier.
For example, the first sliding operation and the second sliding operation may be continuous operations. After the vehicle calling control and the running control are displayed, the user can continue to drag the moving rocker to the second functional response area above the rocker; in response to the drag reaching the second functional response area and the touch being released, the virtual vehicle can be summoned from the backpack of the controlled virtual object and displayed, and the controlled virtual object is controlled to sit in the driving position of the virtual vehicle.
For another example, the first sliding operation and the second sliding operation may be intermittent operations. The user can release the touch, touch the moving rocker again and drag it to the second functional response area above the rocker; in response to the drag reaching the second functional response area and the touch being released, the virtual vehicle can be summoned from the backpack of the controlled virtual object and displayed, and the controlled virtual object is controlled to sit in the driving position of the virtual vehicle.
When the controlled virtual object is seated at the driving position of the virtual vehicle, the virtual vehicle can move with an initial moving speed Y = X × 120% and an initial moving direction A, where X is the moving speed of the controlled virtual object and A is its moving direction. After the controlled virtual object is seated at the driving position of the virtual vehicle, the moving speed and direction of the vehicle can be adjusted through the vehicle controls on the graphical user interface.
It should be noted that the user may freely choose the drag path of the second sliding operation. For example, the drag path may pass through the running control: the user drags quickly upwards to the second functional response area, touching the first functional response area of the running control along the way. Alternatively, the drag path may bypass the running control: the user drags slowly along an arc-shaped path around the first functional response area, so the running control is not touched during the drag. The user can also drag the moving rocker upwards a certain distance into the first functional response area of the running control and trigger the running function after releasing the touch, then drag the moving rocker upwards a certain distance again into the second functional response area of the vehicle calling control and trigger the function of driving the virtual vehicle after releasing the touch.
In practical applications, in order to enable the user to trigger the corresponding function quickly, as shown in the schematic diagram of the functional response areas in fig. 1h, a vertical distance and two corresponding fan-shaped response areas may be set for the functional response areas. The vertical distance allows the user to customize the operating distance, and the fan-shaped areas reduce accidental touches between the two controls to a certain extent, so that the functions respond at various operation angles, improving fault tolerance.
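The fan-shaped response test can be sketched as a combined distance-band and angle check. The 60-degree half-angle below is an assumption for illustration; the patent does not give the fan angle.

```python
import math

# Sketch of the fan-shaped response test: a contact counts as inside a
# control's response area when its distance from the rocker falls in the
# control's band AND its direction lies inside the upward fan.
# The 60-degree half-angle is an invented assumption.

def in_fan_area(dx, dy, r_min, r_max, half_angle_deg=60.0):
    """dx, dy: offset from the rocker centre, with dy pointing up the screen."""
    r = math.hypot(dx, dy)
    if not (r_min <= r <= r_max) or dy <= 0:
        return False
    # angle measured from the straight-up direction
    angle_from_vertical = math.degrees(math.atan2(abs(dx), dy))
    return angle_from_vertical <= half_angle_deg
```

A contact straight above the rocker inside the band responds; one too far away, or at too wide an angle, does not, which is how the fan reduces accidental touches while tolerating varied operation angles.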
In addition, the positions and order of the vehicle calling control and the running control can be adjusted by the user. The user can enter the game's system-settings-customization interface and click the rocker control, and the running control and the vehicle calling control are displayed, as in the schematic diagrams of the adjustment function controls shown in fig. 1h and fig. 1j; the user can drag the functional controls up and down (changing their on-screen distance from the moving rocker). The closer the distance, the faster the trigger response (though too close a distance may cause accidental touches for novice users). The user can also exchange the upper and lower positions of the vehicle calling control and the running control, meeting the operation requirements of different play mechanisms and different types of users, improving the user's action experience in the game, and making the game more free and personalized.
As can be seen from the above, in the embodiments of the present application, by summoning and displaying the virtual vehicle through a control called out by the moving rocker, a virtual vehicle that does not yet exist in the game scene can be summoned and displayed near the controlled virtual object, without requiring the user to perform a series of operations such as searching for and travelling to a virtual vehicle in the game scene, thereby simplifying the interaction process of having the controlled virtual object use a virtual vehicle and improving efficiency.
In order to better implement the above method, the embodiment of the present application further provides a movement control device for a virtual object, where the movement control device for a virtual object may be specifically integrated in an electronic device, and the electronic device may be a device such as a terminal or a server. The terminal can be a mobile phone, a tablet personal computer, an intelligent Bluetooth device, a notebook computer, a personal computer and other devices; the server may be a single server or a server cluster composed of a plurality of servers.
For example, in the present embodiment, a method of the embodiment of the present application will be described in detail by taking a specific integration of a mobile control device of a virtual object in a terminal as an example.
For example, as shown in fig. 3, the movement control device for a virtual object provides a graphical user interface through a terminal, where the content displayed by the graphical user interface at least partially includes a game scene and a controlled virtual object located in the game scene, and the movement control device for a virtual object may include an acquisition unit 310 and a control unit 320, as follows:
(I) Acquisition unit 310
For acquiring a current position of the controlled virtual object.
(II) control unit 320
And when the current position is positioned in the non-driving area in the game scene, controlling the controlled virtual object to move in the game scene in response to the touch operation acted on the movement control area, and displaying a first functional control, wherein the first functional control is configured to respond to the triggering operation and control the controlled virtual object to move according to a first movement state corresponding to the first functional control.
The control unit 320 is further configured to, when the current position is located in a driving area in the game scene, control display of the first functional control and at least one second functional control in response to a touch operation acting on the movement control area, where the movement state corresponding to the second functional control is different from the movement state corresponding to the first functional control, and the second functional control is configured to, in response to a triggering operation, control the controlled virtual object to move according to a second movement state corresponding to the second functional control, the second movement state corresponding to the driving area.
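The zone-dependent display logic described above can be sketched as follows. This is a minimal illustration only; the function name, zone strings, and control identifiers are hypothetical and not taken from the patent:

```python
def controls_to_display(current_zone: str) -> list[str]:
    """Return the functional controls to show, given the zone that
    contains the controlled virtual object's current position."""
    if current_zone == "driving":
        # In a driving area, a touch on the movement control area calls
        # out both the first control and at least one second control
        # (e.g. a vehicle call control).
        return ["first_functional_control", "vehicle_call_control"]
    # In a non-driving area, only the first functional control is shown.
    return ["first_functional_control"]
```

Under this sketch, the same touch gesture yields different call-out results purely as a function of the controlled object's position, which is the core of the embodiment.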
In some implementations, the second functional control includes a vehicle call control, and the vehicle call control is configured to display the virtual vehicle in the game scene in response to a triggering operation and to control the controlled virtual object to drive the virtual vehicle to move.
In some embodiments, the virtual vehicle is acquired from a virtual backpack corresponding to the controlled virtual object and displayed.
In some embodiments, the control unit 320 may also be configured to:
when the current position is located in a non-driving area in the game scene, controlling the controlled virtual object to move in the game scene in response to touch operation acting on a movement control area, and displaying a first functional control;
in response to a triggering operation on the first functional control, controlling the controlled virtual object to move according to the first movement state corresponding to the first functional control;
and in response to detecting that the controlled virtual object enters the driving area from the non-driving area, controlling to display the second functional control.
In some embodiments, the step of controlling the controlled virtual object to move according to the second movement state corresponding to the second function control includes controlling the controlled virtual object to move with a driving movement parameter, the driving movement parameter being associated with an object movement parameter of the controlled virtual object.
In some embodiments, the driving movement parameter includes a movement speed, and the step of controlling the controlled virtual object to move according to a second movement state corresponding to the second function control further includes:
in response to a speed control operation acting on a speed control area, adjusting the moving speed according to the speed control operation, and controlling the controlled virtual object to move at the adjusted moving speed.
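A minimal sketch of such a speed adjustment; the drag-to-speed mapping, the scaling factor, and the numeric speed limits are assumptions for illustration and are not specified by the patent:

```python
def adjust_speed(current_speed: float, drag_delta: float,
                 min_speed: float = 0.0, max_speed: float = 30.0) -> float:
    """Map a drag distance in the speed control area to a new movement
    speed, clamped to an assumed speed range for the driving state."""
    # A positive drag accelerates, a negative drag decelerates; the
    # 0.1 factor converting drag distance to speed is illustrative.
    return max(min_speed, min(max_speed, current_speed + drag_delta * 0.1))
```

The clamping step reflects that the driving movement parameter remains within the range permitted by the second movement state, however that range is configured.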
In some embodiments, the control unit 320 is further configured to:
in response to a direction control operation acting on a direction control area, adjusting the moving direction of the controlled virtual object according to the direction control operation, and controlling the controlled virtual object to move according to the second movement state and the adjusted moving direction.
In some embodiments, the control unit 320 is further configured to:
in response to a triggering operation in a first functional response area, triggering the first functional control corresponding to the first functional response area, and controlling the controlled virtual object to move according to the first movement state corresponding to the first functional control;
in response to the contact point of the triggering operation moving from the first functional response area into a second functional response area and the triggering operation continuing in the second functional response area, triggering the second functional control, and controlling the controlled virtual object to move according to the second movement state corresponding to the second functional control.
In some embodiments, the first functional response area and the second functional response area are located in the same orientation relative to the movement control area, and the first functional response area is between the second functional response area and the movement control area.
In some embodiments, the control unit 320 is further configured to:
adjusting the shape and/or size of a target functional response area in response to a first adjustment operation on the target functional response area, the target functional response area including at least one of the first functional response area and the second functional response area;
adjusting an area layout of the first functional response area and the second functional response area in response to a second adjustment operation on the target functional response area, the area layout including at least one of an adjacency setting and a spacing setting.
In some embodiments, the response speed corresponding to the first functional response area is related to a first distance, the response speed corresponding to the second functional response area is related to a second distance, the first distance is a distance between the first functional response area and the movement control area, and the second distance is a distance between the second functional response area and the movement control area.
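The patent states only that each response speed is related to the area's distance from the movement control area, without fixing the direction of the relationship. The sketch below shows one plausible mapping under that assumption; all names and constants are hypothetical:

```python
def response_delay(distance_to_movement_area: float,
                   base_delay: float = 0.05,
                   delay_per_unit: float = 0.001) -> float:
    """One plausible mapping: the farther a functional response area is
    from the movement control area, the longer its response delay, so
    nearer areas react faster during an outward drag."""
    return base_delay + delay_per_unit * distance_to_movement_area
```

An inverse mapping (farther areas responding faster) would equally satisfy the claim language; which is preferable depends on the intended feel of the drag gesture.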
In some embodiments, the graphical user interface further includes a movement rocker, the movement control area is the contact area of the movement rocker, the triggering operation of a target functional control is a release operation, and the target functional control includes at least one of the first functional control and the second functional control.
In some embodiments, the control unit 320 may be specifically configured to:
and when the current position is located in a driving area in the game scene, calling out, in stages, the first functional control and at least one second functional control in the movement control area.
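One possible reading of calling out the controls "in stages" is that they appear progressively as the touch is held, which can be sketched as follows. The hold-time trigger and its threshold are assumptions; the patent does not define the staging mechanism:

```python
def staged_callout(hold_time_s: float, stage_threshold_s: float = 0.3) -> list[str]:
    """Sketch of a staged call-out: the first functional control appears
    as soon as the touch begins, and the second functional control(s)
    appear only after the touch is held past an assumed threshold."""
    controls = ["first_functional_control"]
    if hold_time_s >= stage_threshold_s:
        controls.append("vehicle_call_control")
    return controls
```

Staging the call-out this way keeps a brief tap from cluttering the interface with controls the player may not want.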
In specific implementations, the above units may be implemented as independent entities, or may be arbitrarily combined and implemented as the same entity or several entities. For the specific implementation of each unit, reference may be made to the foregoing method embodiments, and details are not repeated here.
Therefore, without adding operation controls, the embodiment of the present application can call out a new functional control through a touch operation on the original movement control area of the graphical user interface and realize a new function, thereby simplifying the interactive interface and avoiding false touches caused by too many controls on the interface. In addition, when the controlled virtual object is located in different areas, such as a non-driving area and a driving area, different functional controls can be called out by a touch operation on the movement control area, so that the user can control the controlled virtual object to execute different movement states. When the controlled virtual object is located in the driving area, the second functional control corresponding to the driving area can be called out, so that the second functional control is triggered and the second movement state is entered in a manner that players naturally understand well, thereby improving interaction efficiency.
Correspondingly, the embodiment of the application also provides a computer device, which can be a terminal or a server, wherein the terminal can be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer, a personal digital assistant (Personal Digital Assistant, PDA) and the like.
As shown in FIG. 4, FIG. 4 is a schematic structural diagram of a computer device provided in an embodiment of the present application. The computer device 400 includes a processor 410 with one or more processing cores, a memory 420 with one or more computer-readable storage media, and a computer program stored on the memory 420 and executable on the processor. The processor 410 is electrically connected to the memory 420. Those skilled in the art will appreciate that the computer device structure shown in the figure does not constitute a limitation on the computer device, and the computer device may include more or fewer components than shown, or combine certain components, or adopt a different arrangement of components.
Processor 410 is a control center of computer device 400, connects various portions of the entire computer device 400 using various interfaces and lines, and performs various functions of computer device 400 and processes data by running or loading software programs and/or modules stored in memory 420, and invoking data stored in memory 420, thereby performing overall monitoring of computer device 400.
In the embodiment of the present application, the processor 410 in the computer device 400 loads instructions corresponding to the processes of one or more application programs into the memory 420, and the processor 410 runs the application programs stored in the memory 420, thereby implementing various functions as follows:
acquiring the current position of a controlled virtual object; when the current position is located in a non-driving area in the game scene, controlling the controlled virtual object to move in the game scene in response to touch operation acting on the movement control area, and displaying a first functional control, wherein the first functional control is configured to respond to the triggering operation and control the controlled virtual object to move according to a first movement state corresponding to the first functional control; when the current position is located in a driving area in the game scene, controlling and displaying a first functional control and at least one second functional control in response to touch operation acting on the movement control area, wherein the movement state corresponding to the second functional control is different from the movement state corresponding to the first functional control, and the second functional control is configured to respond to triggering operation to control the controlled virtual object to move according to a second movement state corresponding to the second functional control, and the second movement state corresponds to the driving area.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Optionally, as shown in FIG. 4, the computer device 400 further includes: a touch display 430, a radio frequency circuit 440, an audio circuit 450, an input unit 460, and a power supply 470. The processor 410 is electrically connected to the touch display 430, the radio frequency circuit 440, the audio circuit 450, the input unit 460, and the power supply 470, respectively. Those skilled in the art will appreciate that the computer device structure shown in FIG. 4 does not constitute a limitation on the computer device, and the computer device may include more or fewer components than shown, or combine certain components, or adopt a different arrangement of components.
The touch display 430 may be used to display a graphical user interface and receive operation instructions generated by a user acting on the graphical user interface. The touch display 430 may include a display panel and a touch panel. The display panel may be used to display information entered by the user or provided to the user, as well as various graphical user interfaces of the computer device, which may be composed of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a liquid crystal display (LCD, Liquid Crystal Display), an organic light-emitting diode (OLED), or the like. The touch panel may be used to collect touch operations of the user on or near it (such as operations of the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory) and generate corresponding operation instructions, and the operation instructions execute corresponding programs. Optionally, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 410, and can also receive commands from the processor 410 and execute them. The touch panel may cover the display panel; when the touch panel detects a touch operation on or near it, it passes the operation to the processor 410 to determine the type of the touch event, and the processor 410 then provides a corresponding visual output on the display panel according to the type of the touch event. In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display 430 to implement the input and output functions.
In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to implement the input and output functions. That is, the touch display 430 may also serve as part of the input unit 460 to implement an input function.
In this embodiment, the processor 410 executes a game application to generate a graphical user interface on the touch display 430, where the virtual scene on the graphical user interface includes at least one skill control area, and the skill control area includes at least one skill control. The touch display 430 is used for presenting the graphical user interface and receiving operation instructions generated by the user acting on the graphical user interface.
The radio frequency circuit 440 may be used to transmit and receive radio frequency signals, so as to establish wireless communication with a network device or another computer device.
The audio circuit 450 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. On one hand, the audio circuit 450 may convert received audio data into an electrical signal and transmit it to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts collected sound signals into electrical signals, which are received by the audio circuit 450 and converted into audio data; the audio data is then output to the processor 410 for processing and sent, for example, to another computer device via the radio frequency circuit 440, or output to the memory 420 for further processing. The audio circuit 450 may also include an earphone jack to provide communication between a peripheral earphone and the computer device.
The input unit 460 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 470 is used to supply power to the various components of the computer device 400. Optionally, the power supply 470 may be logically connected to the processor 410 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system. The power supply 470 may also include one or more of a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other arbitrary components.
Although not shown in fig. 4, the computer device 400 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., and will not be described herein.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
As can be seen from the above, the computer device provided in this embodiment can, without adding operation controls, call out a new functional control through a touch operation on the original movement control area of the graphical user interface and realize a new function, thereby simplifying the interactive interface and avoiding false touches caused by too many controls on the interface. In addition, when the controlled virtual object is located in different areas, such as a non-driving area and a driving area, different functional controls can be called out by a touch operation on the movement control area, so that the user can control the controlled virtual object to execute different movement states. When the controlled virtual object is located in the driving area, the second functional control corresponding to the driving area can be called out, so that the second functional control is triggered and the second movement state is entered in a manner that players naturally understand well, thereby improving interaction efficiency.
Those of ordinary skill in the art will appreciate that all or part of the steps of the various methods in the above embodiments may be completed by instructions, or by instructions controlling relevant hardware; the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, the embodiments of the present application provide a computer readable storage medium in which a plurality of computer programs are stored, the computer programs being capable of being loaded by a processor to perform steps in any of the virtual object movement control methods provided in the embodiments of the present application. For example, the computer program may perform the steps of:
acquiring the current position of a controlled virtual object; when the current position is located in a non-driving area in the game scene, controlling the controlled virtual object to move in the game scene in response to touch operation acting on the movement control area, and displaying a first functional control, wherein the first functional control is configured to respond to the triggering operation and control the controlled virtual object to move according to a first movement state corresponding to the first functional control; when the current position is located in a driving area in the game scene, controlling and displaying a first functional control and at least one second functional control in response to touch operation acting on the movement control area, wherein the movement state corresponding to the second functional control is different from the movement state corresponding to the first functional control, and the second functional control is configured to respond to triggering operation to control the controlled virtual object to move according to a second movement state corresponding to the second functional control, and the second movement state corresponds to the driving area.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
The storage medium may include: a read-only memory (ROM, Read Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk, an optical disc, or the like.
Since the computer program stored in the storage medium can execute the steps in any virtual object movement control method provided in the embodiments of the present application, it can achieve the beneficial effects achievable by any virtual object movement control method provided in the embodiments of the present application; for details, see the previous embodiments, which are not repeated here.
The foregoing describes in detail the method, apparatus, computer device, and storage medium for controlling movement of a virtual object provided in the embodiments of the present application. Specific examples are used herein to illustrate the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method and core ideas of the present application. Meanwhile, those skilled in the art may make changes to the specific embodiments and the application scope in light of the ideas of the present application. In summary, the contents of this specification should not be construed as limiting the present application.

Claims (16)

1. A movement control method of a virtual object, characterized in that a graphical user interface is provided by a terminal, the content displayed by the graphical user interface at least partially containing a game scene, a controlled virtual object located in the game scene, the method comprising:
acquiring the current position of the controlled virtual object;
when the current position is located in a non-driving area in the game scene, controlling the controlled virtual object to move in the game scene in response to touch operation acting on a movement control area, and displaying a first functional control, wherein the first functional control is configured to respond to triggering operation and control the controlled virtual object to move according to a first movement state corresponding to the first functional control;
when the current position is located in a driving area in the game scene, controlling and displaying the first functional control and at least one second functional control in response to touch operation acted on the movement control area, wherein a movement state corresponding to the second functional control is different from a movement state corresponding to the first functional control, and the second functional control is configured to respond to triggering operation to control the controlled virtual object to move according to a second movement state corresponding to the second functional control, and the second movement state corresponds to the driving area.
2. The method of claim 1, wherein the second functionality control comprises a vehicle call control configured to display a virtual vehicle in the game scene in response to a trigger operation and control the controlled virtual object to drive the virtual vehicle to move.
3. The method of claim 2, wherein the virtual vehicle is acquired from a virtual backpack corresponding to the controlled virtual object and displayed.
4. The method for controlling movement of a virtual object according to claim 1, further comprising:
when the current position is located in a non-driving area in the game scene, controlling the controlled virtual object to move in the game scene in response to touch operation acted on a movement control area, and displaying the first functional control;
responding to triggering operation for the first functional control, and controlling the controlled virtual object to move according to a first movement state corresponding to the first functional control;
and in response to detecting that the controlled virtual object enters the driving area from the non-driving area, controlling to display the second functional control.
5. The method for controlling movement of a virtual object according to claim 2, wherein the step of controlling movement of the controlled virtual object according to a second movement state corresponding to the second function control comprises:
and controlling the controlled virtual object to move according to driving movement parameters, wherein the driving movement parameters are associated with object movement parameters of the controlled virtual object.
6. The method for controlling movement of a virtual object according to claim 5, wherein the driving movement parameter includes a movement speed, and the step of controlling the controlled virtual object to move according to a second movement state corresponding to the second function control further includes:
and adjusting the moving speed according to the speed control operation in response to the speed control operation acting on the speed control area, and controlling the controlled virtual object to move at the adjusted moving speed.
7. The method for controlling movement of a virtual object according to claim 5, further comprising:
and responding to a direction control operation acted on a direction control area, adjusting the moving direction of the controlled virtual object according to the direction control operation, and controlling the controlled virtual object to move according to the second moving state and the adjusted moving direction.
8. The method for controlling movement of a virtual object according to claim 1, further comprising:
responding to a triggering operation in a first function response area, triggering a first function control corresponding to the first function response area, and controlling the controlled virtual object to move according to the first movement state corresponding to the first function control;
and the contact points responding to the triggering operation move from the first functional response area to the second functional response area, the triggering operation is executed in the second functional response area, the second functional control is triggered, and the controlled virtual object is controlled to move according to the second movement state corresponding to the second functional control.
9. The method of movement control of a virtual object according to claim 8, wherein the first function response area and the second function response area are located at the same orientation of the movement control area, and the first function response area is between the second function response area and the movement control area.
10. The method of movement control of a virtual object according to claim 8, wherein the method further comprises:
adjusting the shape and/or size of a target function response area in response to a first adjustment operation on the target function response area, the target function response area including at least one of the first function response area and the second function response area;
and adjusting a region layout of the first functional response region and the second functional response region in response to a second adjustment operation on the target functional response region, the region layout including at least one of an adjacency setting and a spacing setting.
11. The method according to claim 8, wherein a response speed corresponding to the first function response area is related to a first distance, and a response speed corresponding to the second function response area is related to a second distance, the first distance being a distance between the first function response area and the movement control area, and the second distance being a distance between the second function response area and the movement control area.
12. The method of claim 1, wherein the graphical user interface further comprises a movement rocker, the movement control area is a contact area of the movement rocker, the triggering operation of a target function control is a release operation, and the target function control comprises at least one of the first function control and the second function control.
13. The method for controlling movement of a virtual object according to claim 1, wherein controlling display of the first function control and at least one second function control in response to a touch operation applied to the movement control area when the current position is located in a driving area in the game scene, comprises:
and when the current position is located in a driving area in the game scene, calling out, in stages, the first functional control and at least one second functional control in the movement control area.
14. A movement control device for virtual objects, characterized in that a graphical user interface is provided by a terminal, the content displayed by the graphical user interface at least partially comprising a game scene, controlled virtual objects located in the game scene, the device comprising:
an acquisition unit configured to acquire a current position of the controlled virtual object;
the control unit is used for responding to touch operation acted on a movement control area when the current position is located in a non-driving area in the game scene, controlling the controlled virtual object to move in the game scene and displaying a first functional control, wherein the first functional control is configured to respond to the triggering operation and control the controlled virtual object to move according to a first movement state corresponding to the first functional control;
The control unit is further configured to, when the current position is located in a driving area in the game scene, control and display the first functional control and at least one second functional control in response to a touch operation acting on the movement control area, where a movement state corresponding to the second functional control is different from a movement state corresponding to the first functional control, and the second functional control is configured to, in response to a trigger operation, control the controlled virtual object to move according to a second movement state corresponding to the second functional control, where the second movement state corresponds to the driving area.
15. A computer device comprising a processor and a memory, the memory storing a plurality of instructions; the processor loads instructions from the memory to perform the steps in the method of movement control of a virtual object as claimed in any one of claims 1 to 13.
16. A computer readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps in the method of controlling movement of a virtual object according to any one of claims 1 to 13.
CN202310401376.2A 2023-04-13 2023-04-13 Method, device, computer equipment and storage medium for controlling movement of virtual object Pending CN116459509A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310401376.2A CN116459509A (en) 2023-04-13 2023-04-13 Method, device, computer equipment and storage medium for controlling movement of virtual object
PCT/CN2023/113071 WO2024212412A1 (en) 2023-04-13 2023-08-15 Movement control method and apparatus for virtual object, and computer device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310401376.2A CN116459509A (en) 2023-04-13 2023-04-13 Method, device, computer equipment and storage medium for controlling movement of virtual object

Publications (1)

Publication Number Publication Date
CN116459509A true CN116459509A (en) 2023-07-21

Family

ID=87172991

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310401376.2A Pending CN116459509A (en) 2023-04-13 2023-04-13 Method, device, computer equipment and storage medium for controlling movement of virtual object

Country Status (2)

Country Link
CN (1) CN116459509A (en)
WO (1) WO2024212412A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024212412A1 (en) * 2023-04-13 2024-10-17 网易(杭州)网络有限公司 Movement control method and apparatus for virtual object, and computer device and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6588177B1 (en) * 2019-03-07 2019-10-09 株式会社Cygames Information processing program, information processing method, information processing apparatus, and information processing system
CN112933591B (en) * 2021-03-15 2024-07-09 网易(杭州)网络有限公司 Game virtual character control method and device, storage medium and electronic equipment
CN114377395A (en) * 2022-01-13 2022-04-22 腾讯科技(深圳)有限公司 Virtual carrier and virtual object control method, device, equipment and medium
CN116459509A (en) * 2023-04-13 2023-07-21 网易(杭州)网络有限公司 Method, device, computer equipment and storage medium for controlling movement of virtual object


Also Published As

Publication number Publication date
WO2024212412A1 (en) 2024-10-17

Similar Documents

Publication Publication Date Title
CN113398590B (en) Sound processing method, device, computer equipment and storage medium
CN112546627B (en) Route guiding method, device, storage medium and computer equipment
CN113413600B (en) Information processing method, information processing device, computer equipment and storage medium
CN113332721B (en) Game control method, game control device, computer equipment and storage medium
CN116459509A (en) Method, device, computer equipment and storage medium for controlling movement of virtual object
CN113521724B (en) Method, device, equipment and storage medium for controlling virtual character
CN114377395A (en) Virtual carrier and virtual object control method, device, equipment and medium
CN114522429B (en) Virtual object control method, device, storage medium and computer equipment
CN116510287B (en) Game control method, game control device, electronic equipment and storage medium
CN116585701A (en) Game control method, game control device, computer equipment and storage medium
CN115888101A (en) Virtual role state switching method and device, storage medium and electronic equipment
CN117122925A (en) Virtual vehicle control method, device, equipment and medium
CN116650963A (en) Game information display method, game information display device, computer equipment and storage medium
CN116139483A (en) Game function control method, game function control device, storage medium and computer equipment
CN116999852B (en) Training method, device and medium for AI model for controlling virtual character
CN115569380A (en) Game role control method, device, computer equipment and storage medium
CN117861213A (en) Game skill processing method, game skill processing device, computer equipment and storage medium
CN117160031A (en) Game skill processing method, game skill processing device, computer equipment and storage medium
CN116999847A (en) Virtual character control method, device, computer equipment and storage medium
CN117462949A (en) Game skill processing method, game skill processing device, computer equipment and storage medium
CN117323665A (en) Information processing method and device in game, computer equipment and storage medium
CN118179012A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium
CN117482523A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium
CN115430145A (en) Target position interaction method and device, electronic equipment and readable storage medium
CN118976241A (en) Game control method, device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination