CN114191814A - Information processing method, information processing device, computer equipment and storage medium
- Publication number
- CN114191814A (application CN202111530343.5A)
- Authority
- CN
- China
- Prior art keywords
- target
- game
- segment
- game interface
- training
- Prior art date
- Legal status
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/70—Game security or game management aspects
- A63F13/79—Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
- A63F13/798—Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for assessing skills or for ranking players, e.g. for generating a hall of fame
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/45—Controlling the progress of the video game
- A63F13/49—Saving the game status; Pausing or ending the game
- A63F13/497—Partially or entirely replaying previous game actions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- Computer Security & Cryptography (AREA)
- General Business, Economics & Management (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiments of the application disclose an information processing method, an information processing device, computer equipment and a storage medium. A game interface is displayed; in response to a segment identifier display instruction, candidate segment identifiers are displayed on the game interface; when a selection operation for a target segment identifier among the candidate segment identifiers is detected, a target training segment corresponding to the target segment identifier is displayed on the game interface. The target training segment includes a target action indication icon provided with a plurality of detection areas, the detection areas are used to determine whether the player performs a touch operation within them, and the target training segment is used to train the player on the touch operation corresponding to the target action indication icon. By entering a training mode after a game level ends and generating training segments targeted at the erroneous operations in that level, the player is given targeted training, which improves the player's game efficiency.
Description
Technical Field
The present application relates to the field of game technologies, and in particular, to an information processing method and apparatus, a computer device, and a storage medium.
Background
With the continuous development of computer communication technology, terminals such as smart phones, tablet computers and notebook computers have been widely popularized and applied. Terminals are developing toward diversification and personalization and have increasingly become indispensable in life and work. To meet people's pursuit of a richer spiritual life, entertainment games that run on terminals have emerged; for example, multiplayer online battle arena (MOBA) games, massively multiplayer online (MMO) games and music games developed on client or server architectures are deeply favored by users for their high fluency, good operating feel and instant game matching.
At present, after a music game level ends, only the player's total score and statistics of the various operations performed during the level are displayed on the game interface. The player therefore cannot accurately know which specific operations lost points, nor the factors that caused a failed clearance or a low score, so the player's game efficiency is low.
Disclosure of Invention
The embodiments of the application provide an information processing method, an information processing device, computer equipment and a storage medium, which can enter a training mode after a game level ends and generate training segments targeted at the erroneous operations in the game level, so as to provide the player with targeted training and thereby improve the player's game efficiency.
The embodiment of the application provides an information processing method, which comprises the following steps:
displaying a game interface, wherein the game interface is used for displaying a game picture and an action indication icon positioned in the game picture, and the action indication icon is used for indicating touch operation corresponding to an action of the game interface;
displaying candidate segment identifications on the game interface in response to a segment identification display instruction, wherein the candidate segment identifications are associated with candidate training segments;
when selection operation aiming at a target segment identifier in the candidate segment identifiers is detected, displaying a target training segment corresponding to the target segment identifier on the game interface, wherein the target training segment comprises a target action indication icon, the target action indication icon is provided with a plurality of detection areas, the detection areas are used for judging whether the player executes touch operation in the detection areas, and the target training segment is used for training the player on the touch operation corresponding to the target action indication icon.
Optionally, when a selection operation for a target segment identifier in the candidate segment identifiers is detected, the displaying of a target training segment corresponding to the target segment identifier on the game interface includes:
when the selection operation aiming at the target segment identification in the candidate segment identification is detected, acquiring a target training segment associated with the target segment identification;
determining display positions of all target action indication icons in the target training segment;
in response to an icon display instruction for the target action indication icon, generating a plurality of detection regions based on a display position of the target action indication icon.
Optionally, the generating, in response to an icon display instruction, a plurality of detection regions based on the display position of the target action indication icon includes:
in response to an icon display instruction for the target action indication icon, determining a display position of the target action indication icon on the game interface;
and generating the plurality of detection areas on the game interface around the display position by taking the display position as a center.
Optionally, after displaying the target training segment corresponding to the target segment identifier on the game interface, the method further includes:
when the target training segment is detected to be finished, displaying prompt information in a preset information display area on the game interface, wherein the prompt information is determined based on the number of touch operations detected in all detection areas and detection marks corresponding to the detection areas.
Optionally, before displaying the prompt information in the preset information display area on the game interface, the method further includes:
determining a trigger state based on the trigger time aiming at the target action indication icon and the specified time period corresponding to the target action indication icon;
when the target training segment is detected to be finished, generating prompt information based on the trigger states of all target action indication icons;
and displaying the prompt information in a preset information display area on the game interface.
Optionally, the determining a trigger state based on the trigger time for the target action indication icon and the specified time period corresponding to the target action indication icon includes:
if the trigger time is before the specified time period, determining that the trigger state is premature trigger;
if the trigger time falls within the specified time period, determining that the trigger state is on-time trigger;
and if the trigger time is after the specified time period, determining that the trigger state is delayed trigger.
Optionally, a first function control is displayed in the preset information display area;
after the prompt message is displayed in the preset message display area on the game interface, the method further comprises the following steps:
and responding to the touch operation of the first function control, and displaying the target training segment corresponding to the target segment identification on the game interface again.
Optionally, a second function control is displayed in the preset information display area;
after the prompt message is displayed in the preset message display area on the game interface, the method further comprises the following steps:
and responding to the touch operation of the second function control, canceling the display of the prompt message, and displaying the candidate segment identification on the game interface.
Optionally, before responding to the segment identifier display instruction, the method further includes:
when the end of the current game stage is detected, acquiring an action indication icon of operation failure in the current game stage;
and generating a candidate training segment based on the action indication icon with the operation failure and a first preset time length.
Optionally, before the generating of a candidate training segment based on the failed action indication icon and the first preset duration, the method further includes:
generating a target playback video and a playback identifier associated with the target playback video based on the action indication icon failing to operate and the first preset duration, wherein the playback identifier is used for playing the target playback video on the game interface;
and displaying the playback identifier on the game interface.
Optionally, after the playback identifier is displayed on the game interface, the method further includes:
when the triggering operation aiming at the playback identifier is detected, the target playback video is played on the game interface;
and when the target playback video is detected to have finished playing, displaying a video replay identifier and a training mode trigger identifier on the game interface, wherein the video replay identifier is used for playing the target playback video, and the training mode trigger identifier is used for generating a segment identifier display instruction.
Optionally, after the video replay identifier and the training mode trigger identifier are displayed on the game interface, the method further includes:
when the triggering operation for the video replay identifier is detected, the target playback video is replayed on the game interface.
Optionally, after the video replay identifier and the training mode trigger identifier are displayed on the game interface, the method further includes:
when a trigger operation for the training mode trigger identifier is detected, generating a segment identifier display instruction;
and responding to a segment identification display instruction, and displaying at least one candidate segment identification on the game interface.
Optionally, before responding to the segment identifier display instruction, the method further includes:
responding to a historical game play display instruction, and displaying a historical game stage on the game interface;
when the selection operation of a target historical game stage in the historical game stages is detected, acquiring an action indication icon of operation failure in the target historical game stage;
and generating candidate training segments of the target historical game stage based on the action indication icon of operation failure in the target historical game stage and a second preset time length.
Correspondingly, an embodiment of the present application further provides an information processing apparatus, including:
the game device comprises a first display unit, a second display unit and a display unit, wherein the first display unit is used for displaying a game interface, the game interface is used for displaying a game picture and an action indication icon positioned in the game picture, and the action indication icon is used for indicating touch operation corresponding to an action of the game interface;
a second display unit, configured to display at least one candidate segment identifier on the game interface in response to a segment identifier display instruction, where the candidate segment identifier is associated with a candidate training segment;
a third display unit, configured to display, on the game interface, a target training segment corresponding to a target segment identifier in the candidate segment identifiers when a selection operation for the target segment identifier is detected, where the target training segment includes a target action indication icon, the target action indication icon is provided with a plurality of detection areas, the detection areas are used to determine whether the player performs a touch operation in the detection areas, and the target training segment is used for the player to train the touch operation corresponding to the target action indication icon.
Optionally, the information processing apparatus further includes:
a first obtaining unit, configured to obtain a target training segment associated with a target segment identifier when a selection operation for the target segment identifier in the candidate segment identifiers is detected;
the first determining unit is used for determining the display positions of all target action indication icons in the target training segment;
a first generation unit configured to generate a plurality of detection regions based on a display position of the target action indication icon in response to an icon display instruction for the target action indication icon.
Optionally, the information processing apparatus further includes:
a second obtaining unit, configured to determine, in response to an icon display instruction for the target action indication icon, a display position of the target action indication icon on the game interface;
and the second generating unit is used for generating the plurality of detection areas on the game interface by taking the display position as a center and surrounding the display position.
Optionally, the information processing apparatus further includes:
the first display subunit is configured to display prompt information in a preset information display area on the game interface when the end of the target training segment is detected, where the prompt information is determined based on the number of touch operations detected in all detection areas and detection identifiers corresponding to the detection areas.
Optionally, the information processing apparatus further includes:
a second determination unit, configured to determine a trigger state based on a trigger time for the target action indication icon and a specified time period corresponding to the target action indication icon;
a third generating unit, configured to generate, when it is detected that the target training segment ends, prompt information based on trigger states of all target action indication icons;
and the second display subunit is used for displaying the prompt information in a preset information display area on the game interface.
Optionally, the information processing apparatus further includes a third determining unit, where the third determining unit is configured to:
if the trigger time is before the specified time period, determining that the trigger state is premature trigger;
if the trigger time falls within the specified time period, determining that the trigger state is on-time trigger;
and if the trigger time is after the specified time period, determining that the trigger state is delayed trigger.
Optionally, the information processing apparatus further includes:
and the third display subunit is used for responding to the touch operation of the first function control and displaying the target training segment corresponding to the target segment identifier on the game interface again.
Optionally, the information processing apparatus further includes:
and the response unit is used for responding to the touch operation of the second function control, canceling the display of the prompt message and displaying the candidate segment identification on the game interface.
Optionally, the information processing apparatus further includes:
the third acquisition unit is used for acquiring an action indication icon which is failed in operation in the current game level when the current game level is detected to be finished;
and the fourth generating unit is used for generating a candidate training segment based on the action indication icon with the operation failure and the first preset time length.
Optionally, the information processing apparatus further includes:
a fifth generating unit, configured to generate a target playback video and a playback identifier associated with the target playback video based on the operation-failed action indication icon and the first preset duration, where the playback identifier is used to play the target playback video on the game interface;
and the fourth display subunit is used for displaying the playback identifier on the game interface.
Optionally, the information processing apparatus further includes:
the first processing unit is used for playing the target playback video on the game interface when the triggering operation aiming at the playback identifier is detected;
and the fifth display subunit is configured to display, on the game interface, a video replay identifier and a training mode trigger identifier after the target playback video playing is detected to be finished, where the video replay identifier is used to play the target playback video, and the training mode trigger identifier is used to generate a segment identifier display instruction.
Optionally, the information processing apparatus further includes:
and the second processing unit is used for replaying the target playback video on the game interface when the triggering operation aiming at the video replay identifier is detected.
Optionally, the information processing apparatus further includes:
a sixth generating unit, configured to generate a segment identifier display instruction when a trigger operation for the training mode trigger identifier is detected;
and the sixth display subunit is used for responding to a segment identifier display instruction and displaying at least one candidate segment identifier on the game interface.
Optionally, the information processing apparatus further includes:
the seventh display subunit is used for responding to a historical game play display instruction and displaying a historical game stage on the game interface;
a fourth acquisition unit configured to acquire an action indication icon in which an operation in a target history game level has failed when a selection operation for the target history game level in the history game levels is detected;
and the seventh generating unit is used for generating candidate training segments of the target historical game level based on the action indication icon which is failed in operation in the target historical game level and a second preset time length.
Accordingly, an embodiment of the present application further provides a computer device, which includes a processor, a memory, and a computer program stored on the memory and capable of running on the processor, and when the computer program is executed by the processor, the computer program implements the steps of any of the information processing methods described above.
In addition, an embodiment of the present application further provides a storage medium, where a computer program is stored on the storage medium, and the computer program, when executed by a processor, implements the steps of any one of the information processing methods described above.
The embodiments of the application provide an information processing method, an information processing device, computer equipment and a storage medium, which can enter a training mode after a game level ends and generate training segments targeted at the erroneous operations in the game level for the player's targeted training, thereby improving the player's game efficiency.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art from these drawings without creative effort.
Fig. 1 is a schematic view of a scenario of an information processing system according to an embodiment of the present application.
Fig. 2 is a schematic flowchart of an information processing method according to an embodiment of the present application.
Fig. 3 is a schematic view of an application scenario of the information processing method according to the embodiment of the present application.
Fig. 4 is a schematic view of another application scenario of the information processing method according to the embodiment of the present application.
Fig. 5 is a schematic view of another application scenario of the information processing method according to the embodiment of the present application.
Fig. 6 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of a computer device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides an information processing method, an information processing device, computer equipment and a storage medium. Specifically, the information processing method of the embodiment of the present application may be executed by a computer device, where the computer device may be a terminal or a server or other devices. The terminal may be a terminal device such as a smart phone, a tablet Computer, a notebook Computer, a touch screen, a game machine, a Personal Computer (PC), a Personal Digital Assistant (PDA), and the like, and may further include a client, which may be a game application client, a browser client carrying a game program, or an instant messaging client, and the like. The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud service, a cloud database, cloud computing, a cloud function, cloud storage, network service, cloud communication, middleware service, domain name service, security service, CDN, and a big data and artificial intelligence platform.
For example, when the information processing method is operated on a terminal, the terminal device stores a game application and is used for presenting a virtual scene in a game screen. The terminal device is used for interacting with a user through a graphical user interface, for example, downloading and installing a game application program through the terminal device and running the game application program. The manner in which the terminal device provides the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including a game screen and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for executing the game, generating the graphical user interface, responding to the operation instructions, and controlling display of the graphical user interface on the touch display screen.
For example, when the information processing method is executed on a server, it may be applied to a cloud game. Cloud gaming refers to a game mode based on cloud computing. In the running mode of a cloud game, the entity that runs the game application program is separated from the entity that presents the game picture: the storage and execution of the information processing method are completed on the cloud game server, while the game picture is presented by a cloud game client. The cloud game client is mainly used for receiving and sending game data and presenting the game picture; for example, it may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, a palm computer or a personal digital assistant, while the terminal device that performs the game data processing is the cloud game server at the cloud end. During play, the user operates the cloud game client to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as the game picture, returns the data to the cloud game client through the network, and finally the cloud game client decodes the data and outputs the game picture.
Referring to fig. 1, fig. 1 is a schematic view of a scenario of an information processing system according to an embodiment of the present disclosure. The system may include at least one terminal, at least one server, at least one database, and a network. The terminal held by the user can be connected to servers of different games through a network. A terminal is any device having computing hardware capable of supporting and executing a software product corresponding to a game. In addition, when the system includes a plurality of terminals, a plurality of servers, and a plurality of networks, different terminals may be connected to each other through different networks and through different servers. The network may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, etc. In addition, different terminals may be connected to other terminals or to a server using their own bluetooth network or hotspot network. For example, multiple users may be online through different terminals to connect and synchronize with each other over a suitable network to support multiplayer gaming. Additionally, the system may include a plurality of databases coupled to different servers and in which information relating to the gaming environment may be stored continuously as different users play the multiplayer game online.
The embodiment of the application provides an information processing method, which can be executed by a terminal or a server. The embodiment of the present application is described as an example in which the information processing method is executed by a terminal. The terminal may include a touch display screen and a processor (of course, the terminal may also use a mouse, a keyboard, and other peripheral devices as input devices, and here, only the touch display screen is taken as an example for description), where the touch display screen is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface. When a user operates the graphical user interface through the touch display screen, the graphical user interface can control the local content of the terminal through responding to the received operation instruction, and can also control the content of the opposite-end server through responding to the received operation instruction. For example, the operation instruction generated by the user acting on the graphical user interface comprises an instruction for starting a game application, and the processor is configured to start the game application after receiving the instruction provided by the user for starting the game application. Further, the processor is configured to render and draw a graphical user interface associated with the game on the touch display screen. A touch display screen is a multi-touch sensitive screen capable of sensing a touch or slide operation performed at a plurality of points on the screen at the same time. The user uses a finger to perform touch operation on the graphical user interface, and when the graphical user interface detects the touch operation, different virtual objects in the graphical user interface of the game are controlled to perform actions corresponding to the touch operation. For example, the game may be any one of a leisure game, an action game, a role-playing game, a strategy game, a sports game, a game of chance, and the like. Wherein the game may include a virtual scene of the game drawn on a graphical user interface. The processor may be configured to present a corresponding game screen in response to an operation instruction generated by a touch operation of a user.
It should be noted that the scenario diagram of the information processing system shown in fig. 1 is only an example, and the information processing system and the scenario described in the embodiment of the present application are for more clearly illustrating the technical solution of the embodiment of the present application, and do not form a limitation on the technical solution provided in the embodiment of the present application.
In view of the foregoing problems, embodiments of the present application provide an information processing method, an information processing apparatus, a computer device, and a storage medium, which can improve clearance efficiency of a player in a game. The following are detailed below. It should be noted that the following description of the embodiments is not intended to limit the preferred order of the embodiments.
The embodiment of the present application provides an information processing method, which may be executed by a terminal or a server, and is described as an example in which the information processing method is executed by the terminal.
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating an information processing method according to an embodiment of the present disclosure. The specific flow of the information processing method may be as follows, step 101 to step 103:
101, displaying a game interface, wherein the game interface is used for displaying a game picture and an action indication icon located in the game picture, and the action indication icon is used for indicating a touch operation corresponding to an action of the game interface.
In the embodiment of the application, a game screen is displayed on the game interface; the game screen is the screen displayed (or provided) when the application program runs on the terminal. Specifically, the computer device may display a game screen on its game interface, and the game screen may include a plurality of function icons, each of which has a corresponding function control. The user can trigger a target function icon on the game screen, thereby generating a trigger operation on the function control corresponding to the target function icon. Specifically, the target function icon may be an action indication icon, and the player may perform, on the game interface, the touch operation indicated by the action indication icon. For example, if the action indication icon indicates a 5-second long press on the game interface, the player may perform a 5-second long-press operation on the game interface according to the indication of the action indication icon.
Specifically, the embodiment of the application can be applied to a music game or a rhythm game, in which the player performs touch operations on the game interface while listening to music. For example, the player provides input suited to the input device according to the music tempo reproduced in real time, thereby producing corresponding musical or visual effects, such as sound effects or a presentation that combines sound effects with a plurality of visual actions. The action indication icon is used to guide the time point and touch position of the player's input operation on the game interface and is associated with the music generated in the music game; different shapes of the action indication icon correspond to a click, a long press or a slide on the game interface.
Alternatively, when the action indication icon is displayed on the game interface, the player may perform the indicated action by clicking or long-pressing the action indication icon. The action indication icon may be represented as a circle icon, a note-shaped icon, or the like. The computer device may determine whether the touch operation corresponding to each action indication icon succeeds or fails, and may output a specific operation success indication or a success sound effect when the touch operation succeeds.
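The description above ties each action indication icon to a touch type (click, long press or slide), a screen position and a time window in the music. As a purely illustrative aid, the sketch below shows one possible way to represent such an icon in code; the application itself discloses no data structures, so every class, enum and field name here is an assumption.

```python
# Illustrative sketch only: the application discloses no concrete data
# structures, so the class, enum and field names below are assumptions.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Tuple


class TouchType(Enum):
    """Touch operations an action indication icon may ask for."""
    CLICK = auto()
    LONG_PRESS = auto()
    SLIDE = auto()


@dataclass
class ActionIcon:
    """One action indication icon placed on the game screen."""
    icon_id: int
    touch_type: TouchType          # operation the icon indicates to the player
    position: Tuple[float, float]  # display position on the game interface
    start_time: float              # start of the specified (valid) time period, seconds
    end_time: float                # end of the specified time period, seconds
    hold_duration: float = 0.0     # only meaningful for LONG_PRESS icons
```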
For example, please refer to fig. 3, which is a schematic view of a scene of the information processing method according to an embodiment of the present disclosure. Specifically, a game screen is displayed on the game interface, and an action indication icon and a time limit indication icon are displayed on the game screen. The action indication icon is used to indicate the touch operation corresponding to an action on the game interface, that is, to prompt the player where the touch operation needs to be performed; the time limit indication icon represents the time limit for triggering the action indication icon, is parallel to the long edge of the screen, and translates parallel to the long edge of the screen according to a preset rule. During the game level, the player is required to perform the touch operation at the position of the action indication icon when the time limit indication icon intersects the action indication icon. If the player performs the touch operation at the position of the action indication icon after the time limit indication icon first intersects the action indication icon and before the two first separate again, the operation is determined to be correct; if the player performs the touch operation at that position before the first intersection, or only after the time limit indication icon has separated from the action indication icon, the operation is determined to be erroneous.
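A minimal sketch of the timing rule just described, assuming the valid window is bounded by the moment the time limit indication icon first intersects the action indication icon and the moment the two first separate again; the function name and signature are assumptions, not the application's own API.

```python
# Hypothetical sketch of the timing judgment described above; all names are
# assumptions. A touch inside [intersect_time, separate_time] counts as a
# correct operation, anything outside that window counts as an error.
def judge_touch_timing(touch_time: float,
                       intersect_time: float,
                       separate_time: float) -> str:
    """Return "correct" or "error" for a touch at the action icon's position."""
    if intersect_time <= touch_time <= separate_time:
        return "correct"   # touched while the time limit icon overlaps the action icon
    return "error"         # touched too early or only after the icons have separated
```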
It should be noted that the trigger operation in the embodiment of the present application may be an operation performed on the game interface by the user through the touch display screen, for example, a touch operation generated by the user clicking or touching the game interface with a finger. The trigger operation may also be generated by the user clicking on the game interface with a mouse button, for example by pressing the middle mouse button.
And 102, responding to a segment identification display instruction, and displaying a candidate segment identification on the game interface, wherein the candidate segment identification is associated with a candidate training segment.
In a specific embodiment, before the step "responding to the segment identification display instruction", the method may include:
when the end of the current game stage is detected, acquiring an action indication icon of operation failure in the current game stage;
and generating a candidate training segment based on the action indication icon with the operation failure and a first preset time length.
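As a rough illustration of the two steps above, the sketch below builds one candidate training segment per failed action indication icon, spanning a window of the first preset duration centred on the icon's specified time period. The segment type, the one-segment-per-icon rule and all names are assumptions rather than details disclosed by the application.

```python
# Hypothetical sketch: one candidate training segment per failed action icon,
# centred on the icon's specified time period and spanning the first preset
# duration. The TrainingSegment type and the grouping rule are assumptions.
from dataclasses import dataclass
from typing import List


@dataclass
class TrainingSegment:
    segment_id: int
    start_time: float   # start of the segment on the level timeline, seconds
    end_time: float     # end of the segment on the level timeline, seconds
    icons: list         # the failed action indication icon(s) inside the segment


def build_candidate_segments(failed_icons, first_preset_duration: float) -> List[TrainingSegment]:
    """Create one candidate training segment around each failed action icon."""
    segments = []
    for idx, icon in enumerate(failed_icons):
        centre = (icon.start_time + icon.end_time) / 2
        half = first_preset_duration / 2
        segments.append(TrainingSegment(
            segment_id=idx,
            start_time=max(0.0, centre - half),
            end_time=centre + half,
            icons=[icon],
        ))
    return segments
```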
Optionally, before the step "generating a candidate training segment based on the action indication icon failing to operate and a preset time", the method may include:
generating a target playback video and a playback identifier associated with the target playback video based on the action indication icon failing to operate and the first preset duration, wherein the playback identifier is used for playing the target playback video on the game interface;
and displaying the playback identifier on the game interface.
In another embodiment, after the step of "presenting the playback identification on the game interface", the method may comprise:
when the triggering operation aiming at the playback identifier is detected, the target playback video is played on the game interface;
and when the target playback video is detected to be played, displaying a video replay identifier and a training mode trigger identifier on the game interface, wherein the video replay identifier is used for playing the target playback video, and the training mode trigger identifier is used for generating a segment identifier display instruction.
Specifically, after the step "displaying the video replay identifier and the training mode trigger identifier on the game interface", the method may include:
when the triggering operation for the video replay identifier is detected, the target playback video is replayed on the game interface.
Specifically, after the step "displaying the video replay identifier and the training mode trigger identifier on the game interface", the method may include:
when the trigger operation aiming at the training mode trigger is detected, generating a fragment identifier display instruction;
and responding to a segment identification display instruction, and displaying at least one candidate segment identification on the game interface.
In order to allow the user to personally select a game round that requires training, before the step "responding to a segment identification display instruction", the method may include:
responding to a historical game play display instruction, and displaying a historical game stage on the game interface;
when the selection operation of a target historical game stage in the historical game stages is detected, acquiring an action indication icon of operation failure in the target historical game stage;
and generating candidate training segments of the target historical game stage based on the action indication icon of operation failure in the target historical game stage and a second preset time length.
In a particular embodiment, a method may include:
when the end of the current game stage is detected, acquiring an action indication icon of operation failure in the current game stage;
generating a training segment based on the action indication icon of the operation failure and preset time;
when first operation aiming at the game interface is detected, displaying the segment identification corresponding to the training segment on the game interface.
For example, referring to fig. 4, fig. 4 is a schematic view of another scenario of an information processing method according to an embodiment of the present application. When the end of the current game level is detected, function controls are displayed on the game interface, which may include a "restart" function control and a "training mode" function control. When a touch operation by the player on the "restart" function control is detected, the current game level can be restarted and played again; when a touch operation by the player on the "training mode" function control is detected, a plurality of candidate segment identifiers are displayed on the game interface. The training segment corresponding to a candidate segment identifier is the segment, in the current game level, of an action indication icon on which the player performed an erroneous operation, so that the player can train in a targeted manner.
Optionally, after the step "obtaining the action indication icon of operation failure in the current game level", the method may include:
acquiring a game level video corresponding to the current game level;
determining the time point, in the game level video, of the action indication icon whose operation failed, and generating, according to the time point and a preset duration, a target video and a playback identifier associated with the target video;
when the current game level is detected to be finished, displaying the playback identifier on the game interface, wherein the playback identifier is used for playing the target video on the game interface.
In order to enable the player to review the erroneous operation, the step "generating a target video and a playback identifier associated with the target video based on the failed action indication icon and a preset duration" may include:
taking the time point of the failed action indication icon in the game level video as the center, acquiring a first video of the preset duration located before the time point in the game level video, and acquiring a second video of the preset duration located after the time point in the game level video;
generating the target video based on the first video and the second video.
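A sketch of the clip-window computation described in the two steps above, assuming the failure time point is taken as the centre; clamping the window to the bounds of the game level video is an added assumption, as is the function name.

```python
# Hypothetical sketch: the first video is the preset duration before the
# failure time point, the second video is the preset duration after it, and
# the target video is their concatenation. Clamping to the bounds of the
# level video is an added assumption.
def target_video_window(fail_time: float,
                        preset_duration: float,
                        level_video_length: float) -> tuple:
    """Return (start, end) of the target playback video inside the level video."""
    start = max(0.0, fail_time - preset_duration)                # first video
    end = min(level_video_length, fail_time + preset_duration)   # second video
    return start, end
```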
103, when a selection operation for a target segment identifier in the candidate segment identifiers is detected, displaying a target training segment corresponding to the target segment identifier on the game interface, where the target training segment includes a target action indication icon, the target action indication icon is provided with a plurality of detection areas, the detection areas are used to determine whether the player performs a touch operation in the detection areas, and the target training segment is used for the player to train the touch operation corresponding to the target action indication icon.
In order to detect the accuracy of the player's touch operation on the action indication icon, the step "when a selection operation for a target segment identifier in the candidate segment identifiers is detected, displaying a target training segment corresponding to the target segment identifier on the game interface" may include:
when the selection operation aiming at the target segment identification in the candidate segment identification is detected, acquiring a target training segment associated with the target segment identification;
determining display positions of all target action indication icons in the target training segment;
in response to an icon display instruction for the target action indication icon, generating a plurality of detection regions based on a display position of the target action indication icon.
Specifically, the step "generating a plurality of detection regions based on the display position of the target motion indication icon in response to an icon display instruction" may include:
in response to an icon display instruction for the target action indication icon, determining a display position of the target action indication icon on the game interface;
and generating the plurality of detection areas on the game interface around the display position by taking the display position as a center.
For example, referring to fig. 5, fig. 5 is a schematic view of another scenario of an information processing method according to an embodiment of the present application. When the target action indication icon is displayed on the game screen, a plurality of detection areas are generated around the display position of the target action indication icon, with the icon as the center, so as to determine whether the player performs a touch operation within the detection areas.
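The scene of fig. 5 is described only qualitatively, so the sketch below is one possible reading of it: the detection areas are modelled as concentric rings around the icon's display position, and a touch is classified by the ring it falls into. The number of rings, their radii and the detection identifiers are all assumptions.

```python
# Hypothetical sketch of detection areas centred on the icon's display
# position; ring radii, labels and names are assumptions.
import math
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class DetectionArea:
    label: str           # detection identifier, e.g. "perfect", "good", "miss"
    inner_radius: float
    outer_radius: float


def build_detection_areas(radii=(30.0, 60.0, 90.0),
                          labels=("perfect", "good", "miss")) -> List[DetectionArea]:
    """Generate concentric detection areas around the icon's display position."""
    areas, inner = [], 0.0
    for outer, label in zip(radii, labels):
        areas.append(DetectionArea(label, inner, outer))
        inner = outer
    return areas


def classify_touch(touch_pos: Tuple[float, float],
                   icon_pos: Tuple[float, float],
                   areas: List[DetectionArea]) -> Optional[str]:
    """Return the detection identifier of the area the touch falls into, if any."""
    dist = math.dist(touch_pos, icon_pos)
    for area in areas:
        if area.inner_radius <= dist < area.outer_radius:
            return area.label
    return None  # touch outside all detection areas
```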
To present the training results to the player, after the step "displaying the target training segment corresponding to the target segment identifier on the game interface", the method may include:
when the target training segment is detected to be finished, displaying prompt information in a preset information display area on the game interface, wherein the prompt information is determined based on the number of touch operations detected in all detection areas and detection marks corresponding to the detection areas.
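As a sketch of how the prompt information might be assembled from the number of touch operations detected in each detection area, under the assumption that every touch has already been mapped to a detection identifier (or to none); the prompt wording and the function name are assumptions.

```python
# Hypothetical sketch: count touches per detection identifier and turn the
# counts into a prompt string; the prompt wording is an assumption.
from collections import Counter
from typing import Optional, Sequence


def build_prompt(detected_labels: Sequence[Optional[str]]) -> str:
    """detected_labels holds one detection identifier (or None) per touch."""
    counts = Counter(label for label in detected_labels if label is not None)
    missed = sum(1 for label in detected_labels if label is None)
    parts = [f"{label}: {count}" for label, count in counts.items()]
    parts.append(f"outside detection areas: {missed}")
    return ", ".join(parts)
```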
Specifically, before the step "displaying the prompt information in the preset information display area on the game interface", the method may include:
determining a trigger state based on the trigger time aiming at the target action indication icon and the specified time period corresponding to the target action indication icon;
when the target training segment is detected to be finished, generating prompt information based on the trigger states of all target action indication icons;
and displaying the prompt information in a preset information display area on the game interface.
In order to provide the player with detailed review data, the step "determining a trigger state based on the trigger time for the target action indication icon and the specified time period corresponding to the target action indication icon" may include:
if the trigger time is before the specified time period, determining that the trigger state is premature trigger;
if the trigger time falls within the specified time period, determining that the trigger state is on-time trigger;
and if the trigger time is after the specified time period, determining that the trigger state is delayed trigger.
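The three-way rule above maps directly onto a small classification routine. The sketch below is such a routine, assuming the specified time period is given as a closed interval; the enum and function names are assumptions.

```python
# Hypothetical sketch of the trigger-state rule: before the specified time
# period is premature, within it is on time, after it is delayed.
from enum import Enum, auto


class TriggerState(Enum):
    PREMATURE = auto()
    ON_TIME = auto()
    DELAYED = auto()


def classify_trigger(trigger_time: float,
                     period_start: float,
                     period_end: float) -> TriggerState:
    """Classify a trigger against the icon's specified time period."""
    if trigger_time < period_start:
        return TriggerState.PREMATURE
    if trigger_time > period_end:
        return TriggerState.DELAYED
    return TriggerState.ON_TIME
```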
Optionally, a first function control is displayed in the preset information display area. After the step of displaying the prompt message in the preset message display area on the game interface, the method may include:
and responding to the touch operation of the first function control, and displaying the target training segment corresponding to the target segment identification on the game interface again.
Optionally, a second function control is displayed in the preset information display area. After the step of displaying the prompt message in the preset message display area on the game interface, the method may include:
and responding to the touch operation of the second function control, canceling the display of the prompt message, and displaying the candidate segment identification on the game interface.
In summary, the embodiment of the present application provides an information processing method that enters a training mode after a game level ends and generates training segments targeted at the erroneous operations in the game level, so that the player can train in a targeted manner, thereby improving the player's game efficiency.
In order to better implement the information processing method provided by the embodiments of the present application, the embodiments of the present application further provide an information processing apparatus based on the information processing method. The terms are the same as those in the above-described information processing method, and details of implementation may refer to the description in the method embodiment.
Referring to fig. 6, fig. 6 is a block diagram of an information processing apparatus according to an embodiment of the present disclosure, where the apparatus includes:
the first display unit 201 is configured to display a game interface, where the game interface is configured to display a game screen and an action indication icon located in the game screen, and the action indication icon is used to indicate a touch operation corresponding to an action of the game interface;
a second display unit 202, configured to display at least one candidate segment identifier on the game interface in response to a segment identifier display instruction, where the candidate segment identifier is associated with a candidate training segment;
a third display unit 203, configured to display, on the game interface, a target training segment corresponding to a target segment identifier in the candidate segment identifiers when a selection operation for the target segment identifier is detected, where the target training segment includes a target action indication icon, the target action indication icon is provided with a plurality of detection areas, the detection areas are used to determine whether the player performs a touch operation in the detection areas, and the target training segment is used for the player to train the touch operation corresponding to the target action indication icon.
Optionally, the information processing apparatus further includes:
a first obtaining unit, configured to obtain a target training segment associated with a target segment identifier when a selection operation for the target segment identifier in the candidate segment identifiers is detected;
the first determining unit is used for determining the display positions of all target action indication icons in the target training segment;
a first generation unit configured to generate a plurality of detection regions based on a display position of the target action indication icon in response to an icon display instruction for the target action indication icon.
Optionally, the information processing apparatus further includes:
a second obtaining unit, configured to determine, in response to an icon display instruction for the target action indication icon, a display position of the target action indication icon on the game interface;
and the second generating unit is used for generating the plurality of detection areas on the game interface by taking the display position as a center and surrounding the display position.
Optionally, the information processing apparatus further includes:
the first display subunit is configured to display prompt information in a preset information display area on the game interface when the end of the target training segment is detected, where the prompt information is determined based on the number of touch operations detected in all detection areas and detection identifiers corresponding to the detection areas.
Optionally, the information processing apparatus further includes:
a second determination unit, configured to determine a trigger state based on a trigger time for the target action indication icon and a specified time period corresponding to the target action indication icon;
a third generating unit, configured to generate, when it is detected that the target training segment ends, prompt information based on trigger states of all target action indication icons;
and the second display subunit is used for displaying the prompt information in a preset information display area on the game interface.
Optionally, the information processing apparatus further includes a third determining unit, where the third determining unit is configured to:
if the trigger time is before the specified time period, determining that the trigger state is a premature trigger;
if the trigger time falls within the specified time period, determining that the trigger state is an on-time trigger;
and if the trigger time is after the specified time period, determining that the trigger state is a delayed trigger.
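In other words, the trigger state reduces to a comparison of the trigger time with the icon's specified time period. A minimal sketch of that rule (the state names mirror the three cases above):

```python
def classify_trigger(trigger_ms, specified_period):
    """specified_period is (start_ms, end_ms); returns the trigger state."""
    start_ms, end_ms = specified_period
    if trigger_ms < start_ms:
        return "premature"   # before the specified time period
    if trigger_ms > end_ms:
        return "delayed"     # after the specified time period
    return "on_time"         # within the specified time period
```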
Optionally, the information processing apparatus further includes:
and the third display subunit is used for responding to the touch operation of the first function control and displaying the target training segment corresponding to the target segment identifier on the game interface again.
Optionally, the information processing apparatus further includes:
and a response unit, configured to respond to a touch operation on the second function control by canceling the display of the prompt information and displaying the candidate segment identifiers on the game interface.
Optionally, the information processing apparatus further includes:
a third acquisition unit, configured to acquire, when the end of the current game level is detected, an action indication icon for which the operation in the current game level failed;
and a fourth generating unit, configured to generate a candidate training segment based on the action indication icon for which the operation failed and a first preset duration.
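A candidate training segment can thus be cut around each action indication icon whose operation failed, using the first preset duration as the clip length. The sketch below assumes a symmetric window centered on the icon's timestamp and merges overlapping clips; both choices are illustrative, not taken from the embodiment.

```python
def build_candidate_segments(failed_icons, preset_ms):
    """failed_icons: list of (icon_id, timestamp_ms) for icons whose operation failed.
    Returns candidate training segments as (start_ms, end_ms, [icon_ids]) tuples."""
    segments = []
    for icon_id, t in sorted(failed_icons, key=lambda item: item[1]):
        start, end = max(0, t - preset_ms // 2), t + preset_ms // 2
        if segments and start <= segments[-1][1]:
            # the clip overlaps the previous one: merge them into a single segment
            prev_start, prev_end, prev_ids = segments[-1]
            segments[-1] = (prev_start, max(prev_end, end), prev_ids + [icon_id])
        else:
            segments.append((start, end, [icon_id]))
    return segments
```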
Optionally, the information processing apparatus further includes:
a fifth generating unit, configured to generate a target playback video and a playback identifier associated with the target playback video based on the operation-failed action indication icon and the first preset duration, where the playback identifier is used to play the target playback video on the game interface;
and the fourth display subunit is used for displaying the playback identifier on the game interface.
Optionally, the information processing apparatus further includes:
the first processing unit is used for playing the target playback video on the game interface when the triggering operation aiming at the playback identifier is detected;
and a fifth display subunit, configured to display, on the game interface, a video replay identifier and a training mode trigger identifier when the end of playback of the target playback video is detected, where the video replay identifier is used to play the target playback video, and the training mode trigger identifier is used to generate a segment identifier display instruction.
Optionally, the information processing apparatus further includes:
and the second processing unit is used for replaying the target playback video on the game interface when the triggering operation aiming at the video replay identifier is detected.
Optionally, the information processing apparatus further includes:
a sixth generating unit, configured to generate a segment identifier display instruction when a trigger operation for the training mode trigger identifier is detected;
and the sixth display subunit is used for responding to a segment identifier display instruction and displaying at least one candidate segment identifier on the game interface.
Optionally, the information processing apparatus further includes:
a seventh display subunit, configured to display historical game levels on the game interface in response to a historical game play display instruction;
a fourth acquisition unit, configured to acquire, when a selection operation for a target historical game level among the historical game levels is detected, an action indication icon for which an operation in the target historical game level failed;
and a seventh generating unit, configured to generate candidate training segments of the target historical game level based on the action indication icon for which the operation failed in the target historical game level and a second preset duration.
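Under the same assumptions, the construction for a target historical game level is identical except that the second preset duration is used; for instance, reusing the hypothetical `build_candidate_segments` helper sketched earlier:

```python
# failed action indication icons recorded for the selected historical game level
history_failed_icons = [(7, 15_300), (9, 21_800)]   # (icon_id, timestamp_ms), illustrative values
second_preset_ms = 4_000                            # second preset duration (assumed value)

historical_candidates = build_candidate_segments(history_failed_icons, second_preset_ms)
# each entry is a candidate training segment of the target historical game level
```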
The embodiment of the application discloses an information processing apparatus. The first display unit 201 displays a game interface, where the game interface is used to display a game screen and an action indication icon located in the game screen, and the action indication icon indicates the touch operation corresponding to an action on the game interface. The second display unit 202 displays at least one candidate segment identifier on the game interface in response to a segment identifier display instruction, where each candidate segment identifier is associated with a candidate training segment. When a selection operation for a target segment identifier among the candidate segment identifiers is detected, the third display unit 203 displays the target training segment corresponding to the target segment identifier on the game interface, where the target training segment includes a target action indication icon, the target action indication icon is provided with a plurality of detection areas, the detection areas are used to determine whether the player performs a touch operation in them, and the target training segment is used for the player to train the touch operation corresponding to the target action indication icon. In this way, the embodiment of the application allows a training mode to be entered after a game level ends, and training segments are generated in a targeted manner from the operations that failed in that level, so that the player can practice exactly those operations, which improves the player's game efficiency.
Correspondingly, the embodiment of the present application further provides a computer device. The computer device may be a terminal or a server, and the terminal may be a terminal device such as a smartphone, a tablet computer, a notebook computer, a touch screen device, a game machine, a Personal Computer (PC), or a Personal Digital Assistant (PDA). As shown in fig. 7, fig. 7 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 300 includes a processor 301 having one or more processing cores, a memory 302 having one or more computer-readable storage media, and a computer program stored on the memory 302 and executable on the processor. The processor 301 is electrically connected to the memory 302. Those skilled in the art will appreciate that the computer device structure shown in the figure does not constitute a limitation of the computer device, which may include more or fewer components than those illustrated, combine some components, or use a different arrangement of components.
The processor 301 is the control center of the computer device 300. It connects the various parts of the entire computer device 300 through various interfaces and lines, and performs the various functions of the computer device 300 and processes data by running or loading the software programs and/or modules stored in the memory 302 and calling the data stored in the memory 302, thereby monitoring the computer device 300 as a whole.
In the embodiment of the present application, the processor 301 in the computer device 300 loads instructions corresponding to processes of one or more application programs into the memory 302, and the processor 301 executes the application programs stored in the memory 302 according to the following steps, so as to implement various functions:
displaying a game interface, wherein the game interface is used for displaying a game screen and an action indication icon located in the game screen, and the action indication icon is used for indicating a touch operation corresponding to an action on the game interface;
displaying candidate segment identifiers on the game interface in response to a segment identifier display instruction, wherein the candidate segment identifiers are associated with candidate training segments;
when a selection operation for a target segment identifier among the candidate segment identifiers is detected, displaying a target training segment corresponding to the target segment identifier on the game interface, wherein the target training segment comprises a target action indication icon, the target action indication icon is provided with a plurality of detection areas, the detection areas are used for determining whether the player performs a touch operation in the detection areas, and the target training segment is used for the player to train the touch operation corresponding to the target action indication icon.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Optionally, as shown in fig. 7, the computer device 300 further includes: a touch display 303, a radio frequency circuit 304, an audio circuit 305, an input unit 306, and a power source 307. The processor 301 is electrically connected to the touch display 303, the radio frequency circuit 304, the audio circuit 305, the input unit 306, and the power source 307. Those skilled in the art will appreciate that the computer device configuration illustrated in FIG. 7 does not constitute a limitation of computer devices, and may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components.
The touch display screen 303 may be used for displaying a graphical user interface and receiving operation instructions generated by a user acting on the graphical user interface. The touch display screen 303 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user and various graphical user interfaces of the computer device, which may be made up of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The touch panel may be used to collect touch operations of the user on or near it (for example, operations performed by the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory) and to generate corresponding operation instructions so that the corresponding programs are executed. Optionally, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 301, and it can also receive and execute commands sent by the processor 301. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, it transmits the operation to the processor 301 to determine the type of the touch event, and the processor 301 then provides a corresponding visual output on the display panel according to the type of the touch event. In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 303 to realize the input and output functions. However, in some embodiments, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions. That is, the touch display screen 303 may also be used as a part of the input unit 306 to implement an input function.
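The touch path just described (touch detection, conversion to touch point coordinates, dispatch to the processor, visual response) can be summarized with a small callback sketch. This is a conceptual illustration only, not the actual driver or platform API; all names are hypothetical.

```python
class TouchController:
    """Stands in for the touch controller: converts raw touch signals into
    touch point coordinates and forwards them to the processor's handler."""

    def __init__(self, on_touch):
        self.on_touch = on_touch

    def raw_event(self, raw_signal):
        x, y, kind = raw_signal["x"], raw_signal["y"], raw_signal["type"]
        self.on_touch((x, y), kind)   # hand the coordinates to the processor

def processor_handle_touch(point, kind):
    # the processor determines the type of touch event and drives the display panel
    print(f"{kind} at {point} -> corresponding visual output on the display panel")

controller = TouchController(processor_handle_touch)
controller.raw_event({"x": 120, "y": 430, "type": "tap"})
```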
In the present embodiment, a graphical user interface is generated on the touch-sensitive display screen 303 by the processor 301 executing a game application. The touch display screen 303 is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface.
The radio frequency circuit 304 may be used to transmit and receive radio frequency signals so as to establish wireless communication with a network device or another computer device, and to exchange signals with the network device or the other computer device.
The audio circuit 305 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. On one hand, the audio circuit 305 may transmit the electrical signal converted from received audio data to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 305 and converted into audio data. The audio data is then output to the processor 301 for processing, after which it may be transmitted, for example, to another computer device via the radio frequency circuit 304, or output to the memory 302 for further processing. The audio circuit 305 may also include an earbud jack to provide communication between a peripheral headset and the computer device.
The input unit 306 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 307 is used to supply power to the various components of the computer device 300. Optionally, the power supply 307 may be logically connected to the processor 301 through a power management system, so that charging, discharging, and power consumption management are handled by the power management system. The power supply 307 may also include one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other such components.
Although not shown in fig. 7, the computer device 300 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described in detail herein.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
As can be seen from the above, the computer device provided in this embodiment displays a game interface, where the game interface is used to display a game screen and an action indication icon located in the game screen, and the action indication icon indicates the touch operation corresponding to an action on the game interface; displays candidate segment identifiers on the game interface in response to a segment identifier display instruction, where the candidate segment identifiers are associated with candidate training segments; and, when a selection operation for a target segment identifier among the candidate segment identifiers is detected, displays the target training segment corresponding to the target segment identifier on the game interface, where the target training segment includes a target action indication icon, the target action indication icon is provided with a plurality of detection areas, the detection areas are used to determine whether the player performs a touch operation in them, and the target training segment is used for the player to train the touch operation corresponding to the target action indication icon. In this way, the embodiment of the application allows the training mode to be entered after a game level ends, and training segments are generated in a targeted manner from the operations that failed in that level, so that the player can practice those operations specifically, which improves the player's game efficiency.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer-readable storage medium, in which a plurality of computer programs are stored, and the computer programs can be loaded by a processor to execute the steps in any one of the information processing methods provided by the embodiments of the present application. For example, the computer program may perform the steps of:
displaying a game interface, wherein the game interface is used for displaying a game screen and an action indication icon located in the game screen, and the action indication icon is used for indicating a touch operation corresponding to an action on the game interface;
displaying candidate segment identifiers on the game interface in response to a segment identifier display instruction, wherein the candidate segment identifiers are associated with candidate training segments;
when a selection operation for a target segment identifier among the candidate segment identifiers is detected, displaying a target training segment corresponding to the target segment identifier on the game interface, wherein the target training segment comprises a target action indication icon, the target action indication icon is provided with a plurality of detection areas, the detection areas are used for determining whether the player performs a touch operation in the detection areas, and the target training segment is used for the player to train the touch operation corresponding to the target action indication icon.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the computer program stored in the storage medium can execute the steps in any information processing method provided in the embodiments of the present application, the beneficial effects that can be achieved by any information processing method provided in the embodiments of the present application can be achieved, and detailed descriptions are omitted here for the foregoing embodiments.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The foregoing describes in detail an information processing method, an information processing apparatus, a computer device, and a storage medium provided in the embodiments of the present application. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the foregoing embodiments is only intended to help understand the technical solutions and core ideas of the present application. Those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications or substitutions do not depart from the spirit and scope of the present disclosure as defined by the appended claims.
Claims (17)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111530343.5A CN114191814A (en) | 2021-12-14 | 2021-12-14 | Information processing method, information processing device, computer equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111530343.5A CN114191814A (en) | 2021-12-14 | 2021-12-14 | Information processing method, information processing device, computer equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114191814A true CN114191814A (en) | 2022-03-18 |
Family
ID=80653768
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111530343.5A Pending CN114191814A (en) | 2021-12-14 | 2021-12-14 | Information processing method, information processing device, computer equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114191814A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1245729A (en) * | 1998-07-01 | 2000-03-01 | 科乐美股份有限公司 | Game system and readable storage medium capable of storing and executing said game program |
US6342665B1 (en) * | 1999-02-16 | 2002-01-29 | Konami Co., Ltd. | Music game system, staging instructions synchronizing control method for same, and readable recording medium recorded with staging instructions synchronizing control program for same |
US20120071238A1 (en) * | 2010-09-20 | 2012-03-22 | Karthik Bala | Music game software and input device utilizing a video player |
KR101581138B1 (en) * | 2014-12-05 | 2015-12-30 | 박남태 | The method and apparatus of Rhythm game |
CN110585730A (en) * | 2019-09-10 | 2019-12-20 | 腾讯科技(深圳)有限公司 | Rhythm sensing method and device for game and related equipment |
CN113398590A (en) * | 2021-07-14 | 2021-09-17 | 网易(杭州)网络有限公司 | Sound processing method, sound processing device, computer equipment and storage medium |
2021-12-14 — CN — application CN202111530343.5A (publication CN114191814A (en)) — active — Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1245729A (en) * | 1998-07-01 | 2000-03-01 | 科乐美股份有限公司 | Game system and readable storage medium capable of storing and executing said game program |
US6342665B1 (en) * | 1999-02-16 | 2002-01-29 | Konami Co., Ltd. | Music game system, staging instructions synchronizing control method for same, and readable recording medium recorded with staging instructions synchronizing control program for same |
US20120071238A1 (en) * | 2010-09-20 | 2012-03-22 | Karthik Bala | Music game software and input device utilizing a video player |
KR101581138B1 (en) * | 2014-12-05 | 2015-12-30 | 박남태 | The method and apparatus of Rhythm game |
CN110585730A (en) * | 2019-09-10 | 2019-12-20 | 腾讯科技(深圳)有限公司 | Rhythm sensing method and device for game and related equipment |
CN113398590A (en) * | 2021-07-14 | 2021-09-17 | 网易(杭州)网络有限公司 | Sound processing method, sound processing device, computer equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113101652A (en) | | Information display method, device, computer equipment and storage medium |
CN113398590B (en) | | Sound processing method, device, computer equipment and storage medium |
CN106303733B (en) | | Method and device for playing live special effect information |
CN113485617A (en) | | Animation display method and device, electronic equipment and storage medium |
CN114159789A (en) | | Game interaction method and device, computer equipment and storage medium |
CN113786620A (en) | | Game information recommendation method and device, computer equipment and storage medium |
US11270087B2 (en) | | Object scanning method based on mobile terminal and mobile terminal |
CN113101650A (en) | | Game scene switching method and device, computer equipment and storage medium |
CN113413600B (en) | | Information processing method, information processing device, computer equipment and storage medium |
CN113332726A (en) | | Virtual character processing method and device, electronic equipment and storage medium |
CN113332719A (en) | | Virtual article marking method, device, terminal and storage medium |
CN113082707A (en) | | Virtual object prompting method and device, storage medium and computer equipment |
CN113181632A (en) | | Information prompting method and device, storage medium and computer equipment |
CN113332721A (en) | | Game control method and device, computer equipment and storage medium |
CN115068941A (en) | | Game image quality recommendation method and device, computer equipment and storage medium |
CN115193043B (en) | | A method, device, computer equipment and storage medium for sending game information |
CN114191814A (en) | | Information processing method, information processing device, computer equipment and storage medium |
CN114225412A (en) | | Information processing method, information processing device, computer equipment and storage medium |
CN115193046A (en) | | A game display control method, device, computer equipment and storage medium |
CN115212567A (en) | | Information processing method, information processing device, computer equipment and computer readable storage medium |
CN115501581A (en) | | Game control method, device, computer equipment and storage medium |
CN115225971A (en) | | Video progress adjusting method and device, computer equipment and storage medium |
CN117101121A (en) | | Game prop repairing method, device, terminal and storage medium |
CN114632328B (en) | | A method, device, terminal and storage medium for displaying special effects in a game |
CN117504278A (en) | | Interaction method, interaction device, computer equipment and computer readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| SE01 | Entry into force of request for substantive examination | |