CN113559520B - Interaction control method and device in game, electronic equipment and readable storage medium - Google Patents
- Publication number
- CN113559520B CN113559520B CN202110850549.XA CN202110850549A CN113559520B CN 113559520 B CN113559520 B CN 113559520B CN 202110850549 A CN202110850549 A CN 202110850549A CN 113559520 B CN113559520 B CN 113559520B
- Authority
- CN
- China
- Prior art keywords
- game
- virtual character
- interactive
- sliding operation
- chat window
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 77
- 230000003993 interaction Effects 0.000 title claims description 44
- 230000002452 interceptive effect Effects 0.000 claims abstract description 155
- 230000004044 response Effects 0.000 claims abstract description 91
- 230000000007 visual effect Effects 0.000 claims abstract description 67
- 230000000694 effects Effects 0.000 claims abstract description 49
- 230000009471 action Effects 0.000 claims description 34
- 238000004590 computer program Methods 0.000 claims description 4
- 230000000977 initiatory effect Effects 0.000 abstract description 7
- 238000010586 diagram Methods 0.000 description 21
- 230000008569 process Effects 0.000 description 18
- 238000007667 floating Methods 0.000 description 7
- 230000014509 gene expression Effects 0.000 description 6
- 238000004891 communication Methods 0.000 description 4
- 230000008878 coupling Effects 0.000 description 3
- 238000010168 coupling process Methods 0.000 description 3
- 238000005859 coupling reaction Methods 0.000 description 3
- 230000006870 function Effects 0.000 description 3
- 238000003825 pressing Methods 0.000 description 3
- 238000013473 artificial intelligence Methods 0.000 description 2
- 230000006399 behavior Effects 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 230000010365 information processing Effects 0.000 description 2
- 238000006467 substitution reaction Methods 0.000 description 2
- 230000000386 athletic effect Effects 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000007613 environmental effect Effects 0.000 description 1
- PCHJSUWPFVWCPO-UHFFFAOYSA-N gold Chemical compound [Au] PCHJSUWPFVWCPO-UHFFFAOYSA-N 0.000 description 1
- 239000010931 gold Substances 0.000 description 1
- 229910052737 gold Inorganic materials 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 238000012545 processing Methods 0.000 description 1
- 238000003672 processing method Methods 0.000 description 1
- 238000012549 training Methods 0.000 description 1
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/85—Providing additional services to players
- A63F13/87—Communicating with other players during game play, e.g. by e-mail or chat
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/70—Game security or game management aspects
- A63F13/79—Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
- A63F13/795—Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for finding other players; for building a team; for providing a buddy list
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1068—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
- A63F2300/1075—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad using a touch screen
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- Computer Security & Cryptography (AREA)
- General Business, Economics & Management (AREA)
- User Interface Of Digital Computer (AREA)
- Information Transfer Between Computers (AREA)
Abstract
The present application provides an in-game interaction control method and apparatus, an electronic device, and a readable storage medium, wherein a graphical user interface on which a game interface is displayed is provided through a first terminal, and the game interface displays a chat window and part or all of a game scene; a second virtual character controlled by a second terminal exists in the game scene. In response to a trigger operation on the second virtual character, the second virtual character is controlled to enter an interactive response state, and a first interactive visual effect is displayed. In response to a first sliding operation on the second virtual character while it is in the interactive response state, a second interactive visual effect is displayed when the first sliding operation slides to an area associated with the chat window. If the first sliding operation ends in the area associated with the chat window, the chat window is set as a window for exchanging messages with the second terminal. In this way, a private chat with the second virtual character can be initiated quickly from the game interface, reducing the steps needed to initiate a private chat.
Description
Technical Field
The present application relates to the field of game interaction technologies, and in particular, to a method and apparatus for interaction control in a game, an electronic device, and a readable storage medium.
Background
With the development of technology, more and more types of games appear in people's lives. These include card games, real-time strategy (RTS) games, multiplayer online battle arena (MOBA) games, massively multiplayer online role-playing games (MMOs), and the like.
At present, in most MMO games, although a chat window is always displayed in the main interface for the player to chat, it usually defaults to the team channel or world channel the player belongs to. If the player wants to initiate a private chat with a nearby virtual character, the chat window has to be opened through a cumbersome sequence of operations; if the player is in the middle of a game at that moment, the player cannot talk with other virtual characters in time, which affects operation efficiency during play.
Disclosure of Invention
Accordingly, an object of the present application is to provide an in-game interaction control method and apparatus, an electronic device, and a readable storage medium, which can quickly initiate a private chat with a second virtual character in the game scene displayed on the graphical user interface, reduce the operation steps needed to initiate a private chat during play, and simplify game operation.
An embodiment of the present application provides an in-game interaction control method, in which a graphical user interface is provided through a first terminal. The graphical user interface includes a game interface, and the content displayed by the game interface includes a chat window and part or all of a game scene; a first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal exist in the game scene. The method comprises the following steps:
in response to a trigger operation on the second virtual character, controlling the second virtual character to enter an interactive response state and displaying a first interactive visual effect;
in response to a first sliding operation on the second virtual character while the second virtual character is in the interactive response state, displaying a second interactive visual effect when the first sliding operation slides to an area associated with the chat window;
in response to the first sliding operation ending in the area associated with the chat window, setting the chat window as a window for exchanging messages with the second terminal.
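Read together, the three steps above form a small touch-gesture state machine: a trigger enters the interactive response state, a slide toggles the second visual effect, and a release either binds the chat window or cancels. The sketch below is purely illustrative — every name, and the rectangular hit area standing in for "the area associated with the chat window", is a hypothetical assumption, not taken from the patent:

```python
from enum import Enum, auto

class Phase(Enum):
    IDLE = auto()
    RESPONSIVE = auto()    # second character is in the interactive response state
    PRIVATE_CHAT = auto()  # chat window bound to the second terminal

class InteractionController:
    """Illustrative sketch of the claimed trigger -> slide -> release flow."""

    def __init__(self, chat_window_area):
        # (x, y, width, height) rectangle standing in for the chat-window area
        self.chat_window_area = chat_window_area
        self.phase = Phase.IDLE
        self.first_effect_shown = False
        self.second_effect_shown = False

    def on_trigger(self, hit_second_character: bool) -> None:
        # Step 1: trigger operation on the second virtual character
        if hit_second_character:
            self.phase = Phase.RESPONSIVE
            self.first_effect_shown = True  # display the first interactive visual effect

    def on_slide(self, x: float, y: float) -> None:
        # Step 2: while responsive, show the second effect once the slide
        # reaches the chat-window-associated area
        if self.phase is Phase.RESPONSIVE:
            self.second_effect_shown = self._in_chat_area(x, y)

    def on_release(self, x: float, y: float) -> None:
        # Step 3: slide ends inside the area -> set the chat window for
        # private messages with the second terminal; otherwise exit the state
        if self.phase is Phase.RESPONSIVE and self._in_chat_area(x, y):
            self.phase = Phase.PRIVATE_CHAT
        else:
            self.phase = Phase.IDLE
        self.first_effect_shown = False  # hide the first effect either way
        self.second_effect_shown = False

    def _in_chat_area(self, x: float, y: float) -> bool:
        ax, ay, w, h = self.chat_window_area
        return ax <= x <= ax + w and ay <= y <= ay + h
```

For example, with the chat window occupying `(0, 800, 400, 200)`, a trigger on the second character followed by a slide-and-release at `(100, 850)` ends in `Phase.PRIVATE_CHAT`, while a release anywhere else falls back to `Phase.IDLE`.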
In one possible implementation, the trigger operation is a long-press operation on the second virtual character, and the first sliding operation is a sliding operation continuous with the long-press operation.
In one possible implementation, the trigger operation is a click operation on the second virtual character.
In one possible embodiment, the first sliding operation is a sliding operation starting from the display position of the second virtual character.
In one possible embodiment, the method further comprises:
executing a second game action in response to a second sliding operation initiated outside the display position of the second virtual character, the second game action including any one of:
controlling the first virtual character to move in the game scene; or
adjusting the display view angle of the game scene.
In one possible embodiment, the method further comprises:
executing a third game action in response to a third sliding operation starting from the display position of the second virtual character while the second virtual character is not in the interactive response state, the third game action including any one of:
controlling the first virtual character to move in the game scene; or
adjusting the display view angle of the game scene.
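The embodiments above distinguish slides by where they start and by whether the second virtual character is in the interactive response state. A hypothetical dispatch rule (function and label names are illustrative assumptions) could look like:

```python
def classify_slide(starts_on_second_character: bool, responsive: bool) -> str:
    """Route a sliding operation per the embodiments above (illustrative only).

    - starts on the second character while it IS responsive  -> first sliding
      operation, which drives the chat-window interaction
    - starts elsewhere (second sliding operation)            -> ordinary game action
    - starts on the second character while NOT responsive
      (third sliding operation)                              -> ordinary game action
    """
    if starts_on_second_character and responsive:
        return "first_slide"   # chat-window interaction
    return "game_action"       # move the player character or pan the camera
```

This keeps an accidental drag across another character from hijacking normal movement or camera control: only the explicit trigger-then-slide sequence reaches the chat flow.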
In one possible implementation, displaying the first interactive visual effect includes: displaying the first interactive visual effect in an area associated with the position where the trigger operation acts;
the first interactive visual effect comprises at least one of: a graphic identifier and a text identifier.
In one possible implementation, displaying the second interactive visual effect includes:
displaying the chat window itself as the second interactive visual effect.
In a possible implementation, after the chat window is set as a window for exchanging messages with the second terminal, the method further includes:
controlling the second virtual character to exit the interactive response state, and hiding the first interactive visual effect.
In one possible embodiment, the method further comprises:
in response to the first sliding operation on the second virtual character, generating an interactive icon and controlling the interactive icon to move following the first sliding operation.
In one possible embodiment, the method further comprises:
in response to the first sliding operation ending outside the area associated with the chat window, controlling the second virtual character to exit the interactive response state, and hiding the first interactive visual effect.
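The interactive icon of the preceding embodiments appears when the first sliding operation begins, follows the touch point, and disappears when the slide ends. A minimal sketch (class and method names are hypothetical assumptions):

```python
class DragIcon:
    """Illustrative icon that follows the first sliding operation."""

    def __init__(self):
        self.visible = False
        self.pos = None  # current (x, y) of the icon, or None when hidden

    def on_slide_start(self, x: float, y: float) -> None:
        self.visible = True   # generate the icon when the slide begins
        self.pos = (x, y)

    def on_slide_move(self, x: float, y: float) -> None:
        if self.visible:
            self.pos = (x, y)  # the icon moves following the slide

    def on_slide_end(self) -> None:
        self.visible = False  # the icon disappears when the slide ends
        self.pos = None
```

Tying the icon's lifetime to the slide gives the player continuous feedback that a drag-to-chat gesture is in progress, whether it ends inside or outside the chat-window area.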
An embodiment of the present application also provides an in-game interaction control apparatus, in which a graphical user interface is provided through a first terminal. The graphical user interface includes a game interface, and the content displayed by the game interface includes a chat window and part or all of a game scene; a first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal exist in the game scene. The apparatus comprises:
a state response module, configured to control the second virtual character to enter an interactive response state in response to a trigger operation on the second virtual character, and to display a first interactive visual effect;
an effect display module, configured to display a second interactive visual effect when a first sliding operation on the second virtual character, performed while the second virtual character is in the interactive response state, slides to an area associated with the chat window;
a window setting module, configured to set the chat window as a window for exchanging messages with the second terminal in response to the first sliding operation ending in the area associated with the chat window.
In one possible implementation, the trigger operation is a long-press operation on the second virtual character, and the first sliding operation is a sliding operation continuous with the long-press operation.
In one possible implementation, the trigger operation is a click operation on the second virtual character.
In one possible embodiment, the first sliding operation is a sliding operation starting from the display position of the second virtual character.
In a possible implementation, the interaction control apparatus further comprises a first adjustment module configured to:
execute a second game action in response to a second sliding operation initiated outside the display position of the second virtual character, the second game action including any one of:
controlling the first virtual character to move in the game scene; or
adjusting the display view angle of the game scene.
In a possible implementation, the interaction control apparatus further comprises a second adjustment module configured to:
execute a third game action in response to a third sliding operation starting from the display position of the second virtual character while the second virtual character is not in the interactive response state, the third game action including any one of:
controlling the first virtual character to move in the game scene; or
adjusting the display view angle of the game scene.
In one possible implementation, the state response module, when displaying the first interactive visual effect, is configured to:
display the first interactive visual effect in an area associated with the position where the trigger operation acts;
the first interactive visual effect comprises at least one of: a graphic identifier and a text identifier.
In a possible implementation, the effect display module, when displaying the second interactive visual effect, is configured to:
display the chat window itself as the second interactive visual effect.
In one possible implementation, the interaction control apparatus further includes a first state exit module configured to:
control the second virtual character to exit the interactive response state, and hide the first interactive visual effect.
In one possible implementation, the interaction control apparatus further comprises an icon generation module configured to:
generate an interactive icon in response to the first sliding operation on the second virtual character, and control the interactive icon to move following the first sliding operation.
In a possible implementation, the interaction control apparatus further comprises a second state exit module configured to:
control the second virtual character to exit the interactive response state and hide the first interactive visual effect in response to the first sliding operation ending outside the area associated with the chat window.
An embodiment of the present application also provides an electronic device, comprising a processor, a memory, and a bus. The memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate through the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the in-game interaction control method described above.
An embodiment of the present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the in-game interaction control method described above.
According to the in-game interaction control method and apparatus, the electronic device, and the readable storage medium provided by the present application, a graphical user interface is provided through a first terminal. The graphical user interface includes a game interface whose displayed content includes a chat window and part or all of a game scene; a first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal exist in the game scene. The method comprises: in response to a trigger operation on the second virtual character, controlling the second virtual character to enter an interactive response state and displaying a first interactive visual effect; in response to a first sliding operation on the second virtual character while it is in the interactive response state, displaying a second interactive visual effect when the first sliding operation slides to an area associated with the chat window; and, in response to the first sliding operation ending in the area associated with the chat window, setting the chat window as a window for exchanging messages with the second terminal. In this way, a private chat can be quickly initiated with a second virtual character in the game scene displayed on the graphical user interface through a trigger operation and a sliding operation, which reduces the operation steps needed to initiate a private chat and simplifies game operation.
In order to make the above objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered limiting of the scope; other related drawings can be obtained from these drawings without inventive effort by a person skilled in the art.
FIG. 1a is a first schematic diagram of a graphical user interface according to an embodiment of the present application;
FIG. 1b is a second schematic diagram of a graphical user interface according to an embodiment of the present application;
FIG. 1c is a third schematic diagram of a graphical user interface according to an embodiment of the present application;
FIG. 1d is a fourth schematic diagram of a graphical user interface according to an embodiment of the present application;
FIG. 2 is a flowchart of an in-game interaction control method according to an embodiment of the present application;
FIG. 3 is a first schematic diagram of a chat window according to an embodiment of the present application;
FIG. 4 is a second schematic diagram of a chat window according to an embodiment of the present application;
FIG. 5 is a first schematic structural diagram of an in-game interaction control apparatus according to an embodiment of the present application;
FIG. 6 is a second schematic structural diagram of an in-game interaction control apparatus according to an embodiment of the present application;
FIG. 7 is a third schematic structural diagram of an in-game interaction control apparatus according to an embodiment of the present application;
FIG. 8 is a fourth schematic structural diagram of an in-game interaction control apparatus according to an embodiment of the present application;
FIG. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions are described below with reference to the accompanying drawings. Clearly, the described embodiments are only some, not all, of the embodiments of the present application. The components of the embodiments, as generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of configurations. Thus, the following detailed description is not intended to limit the scope of the claimed application, but merely represents selected embodiments. All other embodiments obtained by a person skilled in the art based on the embodiments of the present application without inventive effort fall within the scope of protection of the present application.
Virtual object:
Refers to a dynamic object that can be controlled in a virtual scene. The dynamic object may be a virtual character, a virtual animal, a cartoon character, or the like. A virtual object is a character controlled by a player through an input device, an artificial intelligence (AI) trained for combat in the virtual environment, or a non-player character (NPC) placed in the virtual-environment combat. Optionally, the virtual object is a virtual character competing in the virtual scene. Optionally, the number of virtual objects in a battle is preset or is dynamically determined according to the number of clients joining the battle, which is not limited by the embodiments of the present application. In one possible implementation, the user can control a virtual object to move in the virtual scene, e.g., run, jump, or crawl, and can also control the virtual object to fight other virtual objects using the skills, virtual props, and so on provided by the application.
Game scene:
Refers to the interface provided or displayed through the graphical user interface of the application program, which includes a UI interface and a game screen with which the player interacts. In alternative embodiments, the UI interface may include game controls (e.g., skill controls, movement controls, function controls), indication identifiers (e.g., direction indicators, character indicators), information presentation areas (e.g., number of kills, game time), or game setting controls (e.g., system settings, stores, gold coins). In an alternative embodiment, the game screen is the display screen corresponding to the virtual scene displayed by the terminal device, and may include virtual objects that execute the game logic in the virtual scene, such as game characters, NPC characters, and AI characters. Optionally, the game scene is a simulation of the real world, a semi-simulated, semi-fictional environment, or a purely fictional environment. The game scene may be two-dimensional or three-dimensional, and may comprise sky, land, ocean, and the like, where the land may include environmental elements such as deserts and cities. The game scene is the scene in which user-controlled virtual objects and the like act, that is, the scene that carries the complete game logic.
It should be noted that the game scene in the present application refers to a scene involved during play, for example, while a game task is being executed, rather than a scene involved before the game begins, for example, on the player's home page.
The in-game interaction control method in an embodiment of the present disclosure may run on a terminal device or on a server. The terminal device may be a local terminal device. When the method runs on a server, it can be implemented and executed based on a cloud interaction system, which comprises the server and a client device.
In an alternative embodiment, various cloud applications, such as cloud games, may run under the cloud interaction system. Taking cloud games as an example, a cloud game is a game mode based on cloud computing. In the cloud game operation mode, the body that runs the game program is separated from the body that presents the game picture: the storage and execution of the interaction control method are completed on a cloud game server, while the client device only receives and sends data and presents the game picture. For example, the client device may be a display device with data transmission functions close to the user side, such as a mobile terminal, a television, a computer, or a handheld computer, while the terminal device that performs the information processing is the cloud game server. During play, the player operates the client device to send an operation instruction to the cloud game server; the cloud game server runs the game according to the instruction, encodes and compresses data such as game pictures, and returns the data to the client device through the network; finally, the client device decodes the data and outputs the game picture.
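The cloud-game round trip described above — operation instruction up, compressed picture down — can be illustrated with a toy exchange. JSON and zlib below are mere stand-ins for a real input protocol and video codec, and all function names are hypothetical:

```python
import json
import zlib

def client_send_op(op: dict) -> bytes:
    # The client device forwards the player's operation instruction to the server.
    return json.dumps(op).encode()

def server_step(op_bytes: bytes) -> bytes:
    # The cloud game server runs the game logic for this instruction, then
    # encodes and compresses the resulting picture before returning it.
    op = json.loads(op_bytes.decode())
    frame = {"echo": op, "frame_id": 1}  # stand-in for a rendered game picture
    return zlib.compress(json.dumps(frame).encode())

def client_present(payload: bytes) -> dict:
    # The client decodes the compressed data and outputs the game picture.
    return json.loads(zlib.decompress(payload).decode())
```

For instance, `client_present(server_step(client_send_op({"move": "left"})))` round-trips one instruction through the "server" and back to a decoded frame, mirroring the send-run-encode-return-decode sequence in the text.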
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores the game program and presents the game screen. The local terminal device interacts with the player through the graphical user interface; that is, the game program is conventionally downloaded, installed, and run on the electronic device. The local terminal device may provide the graphical user interface to the player in various ways, for example, by rendering it on a display screen of the terminal or through holographic projection. For example, the local terminal device may include a display screen for presenting a graphical user interface that includes the game visuals, and a processor for running the game, generating the graphical user interface, and controlling its display on the display screen.
It has been found that in most MMO games, although a chat window for the player to chat is always displayed in the main interface of the game scene while the player is playing, that chat window is set up for the player to publish information to the team channel or the world channel to which the player belongs. In the chat window the player can view information published to those channels by other players, and if the player inputs information into the chat window, the information is published to the team channel to which the player belongs, or to the world channel of the game. As shown in fig. 1a, which is a first schematic diagram of a graphical user interface provided by an embodiment of the present application, a chat window 12, a first virtual character 13, and a second virtual character 14 are displayed in a graphical user interface 11; if the player inputs information in the chat window 12, the information is published to the team channel or the world channel to which the player belongs.
Because the player is in the middle of a game, if the player wants to initiate a private chat with a nearby virtual character in the current game scene, a chat window must be opened through a complex sequence of operations. For example, as shown in fig. 1a to 1d: first, as shown in fig. 1a, the player clicks, with a finger, a mouse, or a button, the second virtual character 14 displayed in the graphical user interface 11, so that business card information 15 of the second virtual character 14 is displayed in the graphical user interface 11 (as shown in fig. 1b, a second schematic diagram of the graphical user interface provided by an embodiment of the present application). Then, the player clicks the business card information 15 displayed in the graphical user interface 11 (as shown in fig. 1b), so that an interactive list 16 for the second virtual character 14 is displayed in the graphical user interface 11 (as shown in fig. 1c, a third schematic diagram of the graphical user interface). Finally, the player clicks the "send message" interactive control 17 in the interactive list 16 displayed in the graphical user interface 11, which opens a private chat window 18 to the second terminal that controls the second virtual character 14 (as shown in fig. 1d, a fourth schematic diagram of the graphical user interface). Because the player is playing the game at this time, needing multiple click operations before being able to talk with another virtual character reduces the player's communication efficiency during the game.
Based on the above, an embodiment of the present application provides an in-game interaction control method that can reduce the operation steps for initiating, during a game, a private chat to the second terminal used by the player to whom a second virtual character displayed in the game scene belongs, thereby reducing the operation difficulty of the game.
Referring to fig. 2, fig. 2 is a flowchart of an in-game interaction control method provided by an embodiment of the present application. A graphical user interface is provided through a first terminal; the graphical user interface includes a game interface, and the content displayed by the game interface includes a chat window and part or all of a game scene. In the game scene there are a first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal; the first virtual character can interact with the second virtual character in the game scene according to control instructions issued by the first user. As shown in fig. 2, the in-game interaction control method provided by the embodiment of the present application includes:
S201: in response to a trigger operation for the second virtual character, controlling the second virtual character to enter an interactive response state, and displaying a first interactive visual effect.
S202: when the second virtual character is in the interactive response state, in response to a first sliding operation for the second virtual character, displaying a second interactive visual effect when the first sliding operation slides to the area associated with the chat window.
S203: in response to the first sliding operation ending in the area associated with the chat window, setting the chat window as a window for exchanging messages with the second terminal.
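As an illustrative aid only, the three steps S201 to S203 can be sketched as a small gesture state machine. The patent does not prescribe an implementation; all class and attribute names below, and the rectangle-based hit test for the chat-window-associated area, are assumptions.

```python
from dataclasses import dataclass


@dataclass
class Rect:
    """Axis-aligned rectangle used as the chat-window-associated area."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


class PrivateChatGesture:
    """Tracks trigger -> slide -> release for one second virtual character."""

    def __init__(self, chat_area: Rect):
        self.chat_area = chat_area
        self.responsive = False    # interactive response state (S201)
        self.hint_shown = False    # second interactive visual effect (S202)
        self.chat_target = None    # terminal bound to the chat window (S203)
        self.character_id = None

    def on_trigger(self, character_id):
        # S201: the trigger operation puts the character into the response state
        self.responsive = True
        self.character_id = character_id

    def on_slide(self, x, y):
        # S202: while responsive, show the hint once the slide enters the area
        if self.responsive:
            self.hint_shown = self.chat_area.contains(x, y)

    def on_release(self, x, y):
        # S203: ending inside the area binds the chat window to the target;
        # either way the character exits the interactive response state
        if self.responsive and self.chat_area.contains(x, y):
            self.chat_target = self.character_id
        self.responsive = False
        self.hint_shown = False
```

A release outside `chat_area` simply clears the state without binding a chat target, matching the "give up private chat" branch described later.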
The terminal mentioned in the present application mainly refers to a smart device that displays game pictures and on which virtual characters can be controlled and operated; the terminal may be any of the following devices: a smartphone, a tablet computer, a notebook computer, a desktop computer, and the like.
The graphical user interface refers to the interface of the game picture displayed on the display screen of the terminal. The game scene refers to the virtual game space that carries the virtual characters during the game (for example, while the virtual characters perform game tasks); a virtual character can be controlled, through operation instructions issued by the user to the smart terminal, to perform actions such as moving and releasing skills in the game scene. The game scene may include any one or more of the following elements: game background elements, game virtual character elements, game prop elements, and the like. A game picture refers to the partial picture of the virtual world observed from a specified viewing angle (for example, the viewing angle of the eyes of the first virtual character controlled by the first user) and presented on the graphical user interface.
What is displayed in the graphical user interface is the game picture that the player's eyes can see. A game scene refers to a scene used in the normal course of a game, in which there may be a large number of virtual characters controlled by different users (a first virtual character controlled by a first user and a second virtual character controlled by a second user). Common game scenes include combat scenes, interaction scenes, and the like, where an interaction scene is, for example, a scene in which at least two virtual characters hold a conversation, or a scene in which at least two virtual characters engage in behaviours such as exchanging items. Of course, in some cases the first virtual character and the second virtual character may both be controlled by the same user.
For the trigger operation or the sliding operation in the solution provided by the present application, a player can apply the operation by pressing a touch key corresponding to it, or can issue the trigger or sliding instruction by pressing a preset key combination. Specifically, the player can operate the touch keys with a finger, a mouse, and the like, or use preset key combinations on the keyboard, for example the Ctrl key, the Alt key, the A key, and other keys; the preset keys can be set manually according to the player's needs.
The chat window is used for one virtual character to communicate with another in language and/or emoticons during the game; that is, a virtual character can send text, voice, or emoticons to another virtual character through the chat window to express its own views, attitudes, and so on. An information input box is provided in the chat window; the information input box is where the virtual character inputs text, voice, or emoticons.
In step S201, in response to a trigger operation applied to the second virtual character in the graphical user interface with a finger, a mouse, or a button, the second virtual character selected by the trigger operation is controlled to enter an interactive response state.
Here, so that the user can clearly know that the second virtual character displayed in the game scene has entered the interactive response state, a first interactive visual effect may be displayed in the game scene shown in the graphical user interface. The first interactive visual effect may be displayed in an area associated with the action position of the trigger operation, for example around the touch position of the trigger operation, around the second virtual character targeted by the trigger operation, or superimposed above the second virtual character targeted by the trigger operation. The first interactive visual effect may include at least one of a graphic identifier and a text identifier, and can be used to prompt the player that, at the current moment, the second virtual character is in the interactive response state. It should be noted that, besides reminding the player that the second virtual character is in the interactive response state, the text identifier may also prompt the player how to initiate a private chat to the second terminal of the player to whom the second virtual character belongs. Specifically, the text identifier can float within a preset range around the second virtual character, or be superimposed at any position in the graphical user interface; for example, the text identifier "drag to the chat window to initiate a private chat" is displayed at any position in the graphical user interface.
As for the graphic identifier, it may float within a preset range around the second virtual character in the graphical user interface, for example a hand-shaped icon floating within a preset range around the second virtual character, or a graphic identifier floating at the touch position of the trigger operation; and/or the graphic identifier may be displayed superimposed on the second virtual character in the graphical user interface, for example an aperture displayed below the second virtual character (a mark may also be displayed around the second virtual character, and so on), or an aperture superimposed on the outline of the displayed second virtual character; a highly transparent copy of the second virtual character may also be displayed superimposed over the second virtual character already displayed, and so on.
In step S202, on the premise that the second virtual character has entered the interactive response state, a corresponding first game behaviour is executed by applying a first sliding operation to the second virtual character in the game scene. Specifically, the sliding position of the first sliding operation is determined; when the sliding position indicates that the first sliding operation has slid into the area associated with the chat window, a second interactive visual effect is displayed in the graphical user interface, so as to prompt the player through the second interactive visual effect that, if the first sliding operation ends at this moment, messages will be exchanged with the second terminal.
Here, the displayed second interactive visual effect may include at least one of a graphic identifier and a text identifier. The text identifier can be used to prompt the user that ending the first sliding operation at the current position will initiate a private chat to the second terminal of the player to whom the second virtual character belongs. Specifically, the text identifier can float within a preset range around the second virtual character, or be superimposed at any position in the graphical user interface; for example, the text identifier "releasing the drag at the current position will start a private chat" is displayed at any position in the graphical user interface.
As for the graphic identifier, it may float within a preset range around the chat window in the graphical user interface, for example a conversation-bubble icon floating within a preset range around the chat window; and/or the graphic identifier may be displayed superimposed on the chat window displayed in the graphical user interface, for example an aperture superimposed on the outline of the displayed chat window; or the chat window may be enlarged for display, and so on.
Here, the second interactive visual effect may also be the display of the chat window itself. For example, if the original graphical user interface does not display a chat window for the player to exchange messages, then when the first sliding operation slides into the area associated with the chat window, the chat window can be displayed in the graphical user interface as the second interactive visual effect.
A chat window set up in a game generally means a window, set for the player in the game scene displayed by the graphical user interface, for sending messages to the virtual world constructed by the game or to the team to which the player belongs; such a window cannot be used by the player to initiate a private chat with other players. In other cases, the chat window displayed in the graphical user interface is the chat window used when the player holds a private chat with the players corresponding to other virtual characters.
Here, whether the first sliding operation ends in the area associated with the chat window may be determined from the touch position at the end of the first sliding operation: if that touch position lies in the area associated with the chat window, the first sliding operation can be determined to have ended there.
When the chat window is the one used when the player holds private chats with the terminals controlling other virtual characters, then if the first sliding operation ends in the area associated with the chat window, that chat window can be set as the window for exchanging messages with the second terminal of the player to whom the second virtual character belongs.
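One way to read this is that the single on-screen chat window is re-targeted rather than a new window being opened. The following is a minimal sketch under that assumption; the class name, the default `"world_channel"` target, and the terminal identifier are all illustrative and not taken from the patent.

```python
class ChatWindow:
    """One on-screen chat window whose message target can be re-bound."""

    def __init__(self):
        # By default the window publishes to a broadcast channel
        self.target = "world_channel"

    def bind_private(self, terminal_id):
        # After the drag ends in the associated area, messages go to
        # the second terminal of the selected second virtual character
        self.target = terminal_id

    def send(self, text):
        # Return (destination, payload) so the routing is visible
        return (self.target, text)
```

Before the gesture completes, `send` publishes to the broadcast channel; after `bind_private`, the same window delivers messages to the bound terminal.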
Here, in one embodiment, the trigger operation may be a long-press operation on the second virtual character, and the first sliding operation a sliding operation continuous with the long press; that is, with one continuous gesture the player can both select the second virtual character with whom a private chat is wanted and establish message exchange with the second terminal of the player to whom that character belongs.
In another embodiment, the first sliding operation is a sliding operation starting from the display position of the second virtual character. For example, after the second virtual character enters the interactive response state, the user may perform a second action, namely the first sliding operation; by starting the first sliding operation from the display position of the second virtual character, the chat window can be invoked for the second virtual character.
In one embodiment, the trigger operation is a click operation on the second virtual character; that is, the player preselects, by clicking, the second virtual character with whom a private chat is wanted, and then establishes message exchange with the second terminal of the player to whom that character belongs by performing the first sliding operation.
It should be noted that the touch duration of a click operation is far shorter than that of a long-press operation, which is how the two are distinguished.
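Distinguishing the two triggers typically reduces to comparing the touch duration against a threshold. A one-line sketch follows; the 0.5-second threshold is an assumed value, not one stated in the text.

```python
LONG_PRESS_THRESHOLD = 0.5  # seconds; assumed value, not specified in the text


def classify_touch(duration_seconds):
    """Distinguish the click trigger from the long-press trigger by duration."""
    return "long_press" if duration_seconds >= LONG_PRESS_THRESHOLD else "click"
```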
The initial position of the first sliding operation is the touch position at which the first sliding operation begins. When a user applies a click or sliding operation by touching the graphical user interface with a finger or the like, the position at which the user's finger presses in the graphical user interface is the initial touch position of the first sliding operation. When the user applies the first sliding operation by pressing a preset key combination on the keyboard, the position of the touch icon (for example, a mouse icon) in the graphical user interface at the moment the combination is pressed is the initial touch position of the first sliding operation.
In one embodiment, in order to prompt the user with the real-time touch position of the first sliding operation, the method further includes: in response to the first sliding operation for the second virtual character, generating an interactive icon and controlling the interactive icon to move following the first sliding operation.
In this step, during the sliding of the first sliding operation, an interactive icon representing the real-time position of the first sliding operation is generated in the graphical user interface, and the generated icon is controlled to move along the sliding track of the operation, thereby prompting the user with the current touch position of the first sliding operation.
Specifically, when the user applies the first sliding operation by touching the graphical user interface with a finger or the like, the interactive icon moves with the position of the finger; when the user applies the sliding operation by pressing a preset key combination on the keyboard, the interactive icon moves with the position of the touch icon displayed in the graphical user interface.
Here, the chat window may be displayed in the graphical user interface in a pop-up manner or in a slide-out manner; this is not limited.
The chat window may include a text input area, an emoticon selection control, and a voice input control. When the text input area is clicked, a virtual keyboard can pop up for the player to input text; likewise, when the emoticon selection control is touched, an emoticon list can pop up for the player to select from; and when the voice input control is touched, a voice input prompt mark pops up to prompt the player to input voice information.
As an example, as shown in fig. 3, which is a first display schematic diagram of a chat window provided by an embodiment of the present application, part of a game scene, a first virtual character 3b, a second virtual character 3c, and a chat window 3d are displayed in a graphical user interface 3a. In response to a trigger operation for the second virtual character 3c in the game scene, the second virtual character 3c is controlled to enter an interactive response state; at the same time, a first interactive visual effect is displayed in the graphical user interface 3a, for example a mark 3e displayed around the second virtual character 3c to inform the player that the second virtual character 3c has entered the interactive response state.
When the second virtual character 3c is in the interactive response state, in response to the first sliding operation applied to the second virtual character 3c, an icon 3f is displayed around the second virtual character 3c, and during the sliding the icon 3f is controlled to move along the sliding track of the operation. (In fig. 3 the icon 3f is, as an example, a highly transparent image of the second virtual character superimposed on the displayed second virtual character 3c; in other embodiments the interactive mark may be another image or mark, which is not limited here.)
In response to the end of the applied first sliding operation: if the first sliding operation ends in the chat-window-associated area 3g, the chat window 3d is set in the graphical user interface 3a as a window for exchanging messages with the second terminal of the player to whom the second virtual character 3c belongs. (In this embodiment the chat-window-associated area 3g coincides with the chat window 3d displayed in the graphical user interface 3a; in other embodiments the area occupied by 3g may deviate to some degree from that occupied by 3d, for example be slightly larger, and can be set according to the actual situation without limitation.) In the course of this setting, the original display form of the chat window may be changed appropriately; for example, the display form of the chat window may be changed and the character name of the second virtual character 3c displayed in the chat window (as shown in fig. 3).
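The note that the associated area 3g may be slightly larger than the chat window 3d amounts to growing the window rectangle by a tolerance margin before hit-testing the release position. A sketch under that reading, with rectangles as `(x, y, w, h)` tuples and an assumed margin value:

```python
def expand(rect, margin):
    """Grow a (x, y, w, h) rectangle outward by `margin` on every side,
    yielding the chat-window-associated area."""
    x, y, w, h = rect
    return (x - margin, y - margin, w + 2 * margin, h + 2 * margin)


def in_area(rect, px, py):
    """Point-in-rectangle test for the release position of the slide."""
    x, y, w, h = rect
    return x <= px <= x + w and y <= py <= y + h
```

A release just outside the window but inside the expanded area still counts as ending "in the area associated with the chat window", which makes the drop target more forgiving on small touch screens.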
It should be noted that the display form of the original chat window may also be left unchanged during the setting; this may be determined according to circumstances.
In one embodiment, after the chat window is set as a window for exchanging messages with the second terminal, the method further includes: controlling the second virtual character to exit the interactive response state, and hiding the first interactive visual effect.
After the message exchange has taken place, the player needs to go on controlling the virtual character to play, so the displayed chat window needs to be closed. When the player closes the chat window, the player can be considered to have chosen to end the message exchange and re-enter the game state; at this moment the second virtual character can be controlled to exit the interactive response state, and the first interactive visual effect displayed for the player is hidden at the same time, so as not to block the player's view during the game. Of course, in the embodiment of the invention the first virtual character can also be moved while the chat window is displayed; when movement of the first virtual character is detected, the chat window can be automatically closed or hidden, so that limited screen resources are fully utilized, the second virtual character is controlled to exit the interactive response state, and the first interactive visual effect is hidden.
In one embodiment, the method further includes: in response to the first sliding operation ending outside the area associated with the chat window, controlling the second virtual character to exit the interactive response state, and hiding the first interactive visual effect.
Here, a player may give up a private chat for various reasons while initiating it. When the second virtual character is in the interactive response state, the player can control the second virtual character to exit the interactive response state by dragging the first sliding operation so that it ends outside the area associated with the chat window; after the second virtual character exits the interactive response state, the first interactive visual effect displayed in the graphical user interface is hidden.
As shown in fig. 4, which is a second display schematic diagram of a chat window provided by an embodiment of the present application, part of a game scene, a first virtual character 4b, a second virtual character 4c, and a chat window 4d are displayed in the graphical user interface 4a. At this moment the second virtual character 4c in the game scene has already entered the interactive response state, and accordingly a first interactive visual effect is displayed around the second virtual character 4c, namely a mark 4e indicating that the second virtual character 4c is in the interactive response state. Meanwhile, in response to the first sliding operation, an interactive icon 4f representing the sliding position of the first sliding operation is also displayed around the second virtual character 4c in the graphical user interface 4a; the interactive icon 4f moves with the first sliding operation. When the position of the interactive icon 4f indicates that the end position of the first sliding operation lies in an area 4h outside the area 4g associated with the chat window, it is determined that the player has given up the message exchange; at this moment the second virtual character 4c can be controlled to exit the interactive response state, and the first interactive visual effect, namely the mark 4e, is hidden.
In one embodiment, the method further includes: executing a second game behaviour in response to a second sliding operation initiated outside the display position of the second virtual character, the second game behaviour including any one of the following: controlling the second virtual character to move in the game scene; or adjusting the display viewing angle of the game scene.
Here, the first sliding operation is a sliding operation applied over the display position of the second virtual character. If, during the game, a second sliding operation is applied starting outside the display position of the second virtual character, a corresponding second game behaviour is executed in response: for example, the selected second virtual character is controlled to move in the game scene, or the display viewing angle of the game scene in the current graphical user interface is adjusted. The adjustment of the viewing angle may be a switch from a first-person to a third-person viewing angle, or an adjustment of the display angle of the viewing angle: for example, if the viewing angle is originally behind the second virtual character (the back of the second virtual character is displayed in the graphical user interface), the second sliding operation may adjust it to the left of the second virtual character (the left side of the character is displayed), or to the front of the second virtual character (the front of the character is displayed).
In one embodiment, the method further includes: executing a third game behaviour in response to a third sliding operation starting from the display position of the second virtual character while the second virtual character is not in the interactive response state, the third game behaviour including any one of the following: controlling the second virtual character to move in the game scene; or adjusting the display viewing angle of the game scene.
Here, the first sliding operation is a sliding operation applied over the display position of the second virtual character while the second virtual character is in the interactive response state. If, during the game, a third sliding operation is applied over the display position of a second virtual character that is not in the interactive response state, a corresponding third game behaviour is executed in response: for example, the selected second virtual character is controlled to move in the game scene, or the display viewing angle of the game scene in the current graphical user interface is adjusted. As above, the adjustment of the viewing angle may be a switch from a first-person to a third-person viewing angle, or an adjustment of the display angle, for example from behind the second virtual character to its left or to its front.
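The first, second, and third sliding operations are thus distinguished by two facts: where the slide starts and whether the character is in the interactive response state. A dispatch sketch under that reading (labels and function name are illustrative):

```python
def classify_slide(starts_on_character: bool, in_response_state: bool) -> str:
    """Route a sliding operation to the behaviour the text describes."""
    if starts_on_character and in_response_state:
        return "first"   # drag toward the chat window to start a private chat
    if not starts_on_character:
        return "second"  # move the character or adjust the camera angle
    return "third"       # same game behaviours, character not responsive
```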
With the in-game interaction control method provided by the embodiment of the present application, a graphical user interface is provided through the first terminal; the graphical user interface includes a game interface, and the content displayed by the game interface includes a chat window and part or all of a game scene in which there are a first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal. In response to a trigger operation for the second virtual character, the second virtual character is controlled to enter an interactive response state and a first interactive visual effect is displayed. When the second virtual character is in the interactive response state, in response to a first sliding operation for the second virtual character, a second interactive visual effect is displayed when the first sliding operation slides into the area associated with the chat window; and in response to the first sliding operation ending in that area, the chat window is set as a window for exchanging messages with the second terminal. In this way, a private chat can be quickly initiated to a second virtual character in the game scene displayed by the graphical user interface through a trigger operation and a sliding operation, which reduces the operation steps of initiating a private chat and simplifies the operation of the game.
Referring to fig. 5 to 8: fig. 5 is a first schematic diagram of an in-game interaction control apparatus provided by an embodiment of the present application, fig. 6 is a second schematic diagram of the apparatus, fig. 7 is a third schematic diagram, and fig. 8 is a fourth schematic diagram.
As shown in fig. 5, the interaction control apparatus 500 may be applied to a first terminal, through which a graphical user interface is provided; the graphical user interface includes a game interface, and the content displayed on the game interface includes a chat window and part or all of a game scene. The interaction control apparatus 500 includes:
a state response module 510, configured to control the second virtual character to enter an interactive response state and display a first interactive visual special effect in response to a trigger operation for the second virtual character;
an effect display module 520 for responding to a first sliding operation for the second virtual character when the second virtual character is in the interactive response state, and displaying a second interactive visual effect when the first sliding operation slides to the area associated with the chat window;
a window setting module 530, configured to set the chat window as a window for exchanging messages with the second terminal in response to the first sliding operation ending in the area associated with the chat window.
Further, as shown in fig. 6, the interaction control apparatus 500 further includes a first adjustment module 540, where the first adjustment module 540 is configured to:
execute a second game action in response to a second sliding operation initiated outside the display position of the second virtual character, the second game action including either of the following:
controlling the second virtual character to move in the game scene; or
adjusting the display view angle of the game scene.
Further, as shown in fig. 6, the interaction control apparatus 500 further includes a second adjustment module 550, where the second adjustment module 550 is configured to:
execute a third game action in response to a third sliding operation starting from the display position of the second virtual character when the second virtual character is not in the interactive response state, the third game action including either of the following:
controlling the second virtual character to move in the game scene; or
adjusting the display view angle of the game scene.
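The adjustment modules 540 and 550 together imply a dispatch rule: only a slide that both starts on the character's display position and occurs while the character is in the interactive response state becomes the private-chat gesture; any other slide falls through to ordinary scene input. A hedged sketch of that rule, with handler names invented here for illustration:

```python
def classify_slide(starts_on_character: bool, in_response_state: bool) -> str:
    """Decide which handler a sliding operation should be routed to.

    A slide starting from the second virtual character's display position
    while the character is in the interactive response state is the first
    sliding operation (private-chat gesture). A slide starting elsewhere
    (second sliding operation), or starting on the character outside the
    response state (third sliding operation), is treated as normal scene
    input: character movement or view-angle adjustment.
    """
    if starts_on_character and in_response_state:
        return "private_chat_gesture"
    return "move_or_adjust_view"
```

This keeps the chat gesture from stealing ordinary movement and camera slides, which is the point of gating it on the response state.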
Further, as shown in fig. 7, the interaction control apparatus 500 further includes a first state exit module 560, where the first state exit module 560 is configured to:
control the second virtual character to exit the interactive response state, and hide the first interactive visual effect.
Further, as shown in fig. 7, the interaction control apparatus 500 further includes a second state exit module 570, where the second state exit module 570 is configured to:
control the second virtual character to exit the interactive response state, and hide the first interactive visual effect, in response to the first sliding operation ending in an area outside the area associated with the chat window.
Further, as shown in fig. 8, the interaction control apparatus 500 further includes an icon generating module 580, where the icon generating module 580 is configured to:
generate an interactive icon in response to the first sliding operation on the second virtual character, and control the interactive icon to move following the first sliding operation.
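The icon generating module 580 amounts to spawning a UI node when the slide begins and repositioning it on every move event. A minimal sketch, assuming a simple `DragIcon` stand-in for whatever UI object the engine actually uses (the names here are not from the patent):

```python
class DragIcon:
    """Illustrative stand-in for the interactive icon UI node."""

    def __init__(self, x, y):
        self.pos = (x, y)
        self.visible = True

_icon = None  # module-level handle to the currently dragged icon, if any

def on_slide_start(x, y):
    """Generate the interactive icon at the point where the slide begins."""
    global _icon
    _icon = DragIcon(x, y)
    return _icon

def on_slide_move(x, y):
    """Reposition the icon so it follows the first sliding operation."""
    if _icon is not None:
        _icon.pos = (x, y)

def on_slide_end():
    """Hide and release the icon when the sliding operation ends."""
    global _icon
    if _icon is not None:
        _icon.visible = False
        _icon = None
```

In a real engine the move handler would be wired to touch-move/pointer-move events rather than called directly.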
Further, the trigger operation is a long-press operation on the second virtual character, and the first sliding operation is a sliding operation continuous with the long-press operation.
Further, the trigger operation is a click operation on the second virtual character.
Further, the first sliding operation is a sliding operation starting from the display position of the second virtual character.
Further, when displaying the first interactive visual effect, the state response module 510 is configured to:
display the first interactive visual effect in an area associated with the position where the trigger operation acts;
where the first interactive visual effect includes at least one of a graphic identifier and a text identifier.
Further, when displaying the second interactive visual effect, the window setting module 530 is configured to:
display the chat window itself as the second interactive visual effect.
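In this variant no separate effect is drawn; the chat window itself is highlighted while the slide hovers over its associated area. A trivial sketch, with `ChatWindow` and its fields assumed for illustration:

```python
class ChatWindow:
    """Illustrative chat window whose highlight state doubles as the
    second interactive visual effect."""

    def __init__(self):
        self.highlighted = False

    def update_highlight(self, slide_in_area: bool):
        # Highlight while the first sliding operation is over the window's
        # associated area; clear the highlight when it leaves.
        self.highlighted = slide_in_area
```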
With the above in-game interaction control apparatus, a graphical user interface is provided through the first terminal; the graphical user interface includes a game interface, and the content displayed on the game interface includes a chat window and part or all of a game scene in which a first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal are present. In response to a trigger operation on the second virtual character, the second virtual character is controlled to enter an interactive response state and a first interactive visual effect is displayed; in response to a first sliding operation on the second virtual character while the second virtual character is in the interactive response state, a second interactive visual effect is displayed when the first sliding operation slides into the area associated with the chat window; and in response to the first sliding operation ending in the area associated with the chat window, the chat window is set as a window for exchanging messages with the second terminal. In this way, a private chat with a second virtual character displayed in the game scene can be initiated quickly within the game interface through a trigger operation and a sliding operation, which reduces the steps needed to initiate a private chat and simplifies operation of the game.
Referring to fig. 9, fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 9, the electronic device 900 includes a processor 910, a memory 920, and a bus 930.
The memory 920 stores machine-readable instructions executable by the processor 910. When the electronic device 900 runs, the processor 910 communicates with the memory 920 through the bus 930, and when the machine-readable instructions are executed by the processor 910, the steps of the in-game interaction control method of the method embodiment shown in fig. 2 may be performed. For a specific implementation, refer to the method embodiment; details are not repeated here.
An embodiment of the present application further provides a computer-readable storage medium storing a computer program. When the computer program is executed by a processor, the steps of the in-game interaction control method of the method embodiment shown in fig. 2 may be performed. For a specific implementation, refer to the method embodiment; details are not repeated here.
It will be clear to those skilled in the art that, for convenience and brevity of description, for the specific working processes of the systems, apparatuses, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not repeated here.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other manners. The apparatus embodiments described above are merely illustrative. For example, the division of the units is merely a logical functional division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some communication interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a processor-executable non-volatile computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Finally, it should be noted that the above embodiments are merely specific implementations of the present application and are not intended to limit its protection scope. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person skilled in the art may still modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions of some technical features within the technical scope disclosed in the present application; such modifications, changes, or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application, and shall all fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (14)
1. An interaction control method in a game, wherein a graphical user interface is provided through a first terminal, the graphical user interface comprises a game interface, and content displayed on the game interface comprises a chat window and part or all of a game scene; by default, the chat window is used for sending messages to a virtual world constructed by the game or to a team; a first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal are present in the game scene; and the method comprises:
controlling, in response to a trigger operation on the second virtual character, the second virtual character to enter an interactive response state, and displaying a first interactive visual effect;
responding to a first sliding operation on the second virtual character when the second virtual character is in the interactive response state, and displaying a second interactive visual effect when the first sliding operation slides into an area associated with the chat window, wherein the first sliding operation and the trigger operation are continuous operations; and
setting, in response to the first sliding operation ending in the area associated with the chat window, the chat window as a window for exchanging messages with the second terminal.
2. The interaction control method according to claim 1, wherein the trigger operation is a long-press operation on the second virtual character, and the first sliding operation is a sliding operation continuous with the long-press operation.
3. The interaction control method according to claim 1, wherein the trigger operation is a click operation on the second virtual character.
4. The interaction control method according to claim 1, wherein the first sliding operation is a sliding operation starting from a display position of the second virtual character.
5. The interaction control method according to claim 4, further comprising:
executing a second game action in response to a second sliding operation initiated outside the display position of the second virtual character, the second game action including either of the following:
controlling the second virtual character to move in the game scene; or
adjusting the display view angle of the game scene.
6. The interaction control method according to claim 4, further comprising:
executing a third game action in response to a third sliding operation starting from the display position of the second virtual character when the second virtual character is not in the interactive response state, the third game action including either of the following:
controlling the second virtual character to move in the game scene; or
adjusting the display view angle of the game scene.
7. The interaction control method according to claim 1, wherein displaying the first interactive visual effect comprises: displaying the first interactive visual effect in an area associated with a position where the trigger operation acts;
wherein the first interactive visual effect comprises at least one of a graphic identifier and a text identifier.
8. The interaction control method according to claim 1, wherein displaying the second interactive visual effect comprises:
displaying the chat window as the second interactive visual effect.
9. The interaction control method according to claim 1, wherein after the chat window is set as the window for exchanging messages with the second terminal, the method further comprises:
controlling the second virtual character to exit the interactive response state, and hiding the first interactive visual effect.
10. The interaction control method according to claim 1, further comprising:
generating an interactive icon in response to the first sliding operation on the second virtual character, and controlling the interactive icon to move following the first sliding operation.
11. The interaction control method according to claim 1, further comprising:
controlling the second virtual character to exit the interactive response state, and hiding the first interactive visual effect, in response to the first sliding operation ending in an area outside the area associated with the chat window.
12. An interaction control apparatus in a game, wherein a graphical user interface is provided through a first terminal, the graphical user interface comprises a game interface, and content displayed on the game interface comprises a chat window and part or all of a game scene; by default, the chat window is used for sending messages to a virtual world constructed by the game or to a team; a first virtual character controlled by the first terminal and a second virtual character controlled by a second terminal are present in the game scene; and the apparatus comprises:
a state response module, configured to control, in response to a trigger operation on the second virtual character, the second virtual character to enter an interactive response state and display a first interactive visual effect;
an effect display module, configured to respond to a first sliding operation on the second virtual character when the second virtual character is in the interactive response state, and display a second interactive visual effect when the first sliding operation slides into an area associated with the chat window, wherein the first sliding operation and the trigger operation are continuous operations; and
a window setting module, configured to set, in response to the first sliding operation ending in the area associated with the chat window, the chat window as a window for exchanging messages with the second terminal.
13. An electronic device, comprising: a processor, a memory, and a bus, wherein the memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor communicates with the memory through the bus; and when the machine-readable instructions are executed by the processor, the steps of the interaction control method in a game according to any one of claims 1 to 11 are performed.
14. A computer-readable storage medium, having stored thereon a computer program that, when executed by a processor, performs the steps of the interaction control method in a game according to any one of claims 1 to 11.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110850549.XA CN113559520B (en) | 2021-07-27 | 2021-07-27 | Interaction control method and device in game, electronic equipment and readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113559520A CN113559520A (en) | 2021-10-29 |
CN113559520B true CN113559520B (en) | 2024-07-19 |
Family
ID=78167930
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110850549.XA Active CN113559520B (en) | 2021-07-27 | 2021-07-27 | Interaction control method and device in game, electronic equipment and readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113559520B (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114272618A (en) * | 2021-11-22 | 2022-04-05 | 腾讯科技(深圳)有限公司 | Interactive method, device, electronic device and storage medium based on virtual character |
CN114217711A (en) * | 2021-12-08 | 2022-03-22 | 北京字跳网络技术有限公司 | Virtual role control method, terminal, electronic device and storage medium |
CN114632328B (en) * | 2022-03-29 | 2024-12-03 | 广州博冠信息科技有限公司 | A method, device, terminal and storage medium for displaying special effects in a game |
CN117654062A (en) * | 2022-08-23 | 2024-03-08 | 腾讯科技(成都)有限公司 | Virtual character display method, device, equipment and storage medium |
CN115888098A (en) * | 2022-12-06 | 2023-04-04 | 网易(杭州)网络有限公司 | Information processing method and device for game, computer equipment and storage medium |
CN118259746A (en) * | 2022-12-28 | 2024-06-28 | 腾讯科技(深圳)有限公司 | Method, device, equipment and storage medium for joining a multimedia interactive room |
CN118903835A (en) * | 2023-05-06 | 2024-11-08 | 腾讯科技(深圳)有限公司 | Interaction method, device, equipment, storage medium and product for non-player character |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106575196A (en) * | 2014-07-31 | 2017-04-19 | 三星电子株式会社 | Electronic device and method for displaying user interface thereof |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102050814B1 (en) * | 2013-04-02 | 2019-12-02 | 삼성전자주식회사 | Apparatus and method for private chatting in group chats |
CN104461299B (en) * | 2014-12-05 | 2019-01-18 | 蓝信移动(北京)科技有限公司 | A kind of method and apparatus for chat to be added |
CN110233742B (en) * | 2018-03-06 | 2022-04-01 | 阿里巴巴集团控股有限公司 | Group establishing method, system, terminal and server |
KR102383973B1 (en) * | 2019-08-12 | 2022-04-07 | 주식회사 엔씨소프트 | Appartus and method for providing user interface |
CN110691027A (en) * | 2019-08-29 | 2020-01-14 | 维沃移动通信有限公司 | Information processing method and device, electronic equipment and medium |
CN111298436B (en) * | 2020-02-26 | 2023-04-18 | 网易(杭州)网络有限公司 | Message sending method and device in game |
CN112346636A (en) * | 2020-11-05 | 2021-02-09 | 网易(杭州)网络有限公司 | In-game information processing method and device and terminal |
- 2021-07-27 CN CN202110850549.XA patent/CN113559520B/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106575196A (en) * | 2014-07-31 | 2017-04-19 | 三星电子株式会社 | Electronic device and method for displaying user interface thereof |
Also Published As
Publication number | Publication date |
---|---|
CN113559520A (en) | 2021-10-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113559520B (en) | Interaction control method and device in game, electronic equipment and readable storage medium | |
CN111729306A (en) | Game character transmission method, device, electronic equipment and storage medium | |
KR102734373B1 (en) | Adaptive display method and device for virtual scenes, electronic device, storage medium and computer program product | |
US20250018294A1 (en) | Method and apparatus for displaying information of virtual object, electronic device, and storage medium | |
CN111913624B (en) | Interaction method and device for objects in virtual scene | |
KR102756415B1 (en) | Method and apparatus for controlling virtual objects, devices, storage media and computer programs | |
US12048878B2 (en) | Method and apparatus for controlling virtual object, device, storage medium, and program product | |
CN113350779A (en) | Game virtual character action control method and device, storage medium and electronic equipment | |
WO2022042435A1 (en) | Method and apparatus for displaying virtual environment picture, and device and storage medium | |
CN113476825B (en) | Role control method, role control device, equipment and medium in game | |
CN113680062B (en) | Information viewing method and device in game | |
CN115634450A (en) | Control method, control device, equipment and medium for virtual role | |
CN115708956A (en) | Game picture updating method and device, computer equipment and medium | |
CN114570030A (en) | Method and device for processing virtual equipment in game, electronic equipment and storage medium | |
US12366956B2 (en) | Virtual object control method and apparatus, device, storage medium, and computer program product | |
US20240424393A1 (en) | Virtual world-based character interaction method and apparatus, device, and medium | |
CN114210068A (en) | Configuration method and device for virtual game scene, storage medium and electronic device | |
CN117861205A (en) | Interaction control method and device in game, electronic equipment and readable storage medium | |
CN119587963A (en) | A method, device, electronic device and storage medium for processing information in a game | |
CN118161864A (en) | Virtual skill processing method and device, electronic equipment and storage medium | |
CN118161861A (en) | Method and device for controlling virtual object in game, electronic equipment and storage medium | |
WO2023231557A9 (en) | Interaction method for virtual objects, apparatus for virtual objects, and device, storage medium and program product | |
CN117582672A (en) | Data processing method, device, electronic equipment and storage medium | |
CN115089968A (en) | An operation guidance method, device, electronic device and storage medium in a game | |
CN116392807A (en) | Method and device for sending message in game and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||