
CN117138346A - Game editing method, game control device and electronic equipment - Google Patents


Info

Publication number
CN117138346A
Authority
CN
China
Prior art keywords
interaction
event
game
dialogue
scenario
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310858637.3A
Other languages
Chinese (zh)
Inventor
初小宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202310858637.3A priority Critical patent/CN117138346A/en
Publication of CN117138346A publication Critical patent/CN117138346A/en
Priority to PCT/CN2024/100376 priority patent/WO2025011295A1/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/63: Generating or modifying game content before or while executing the game program by the player, e.g. authoring using a level editor
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a game editing method, a game control method, a device, and an electronic apparatus. A game editing scene is constructed, and a non-player character is created in the game editing scene; interaction attributes are configured for the non-player character; in the game running stage, after a specified scene event is triggered, the non-player character is controlled, based on the interaction attributes, to execute an interaction event in the game scene corresponding to the game editing scene; and the game editing scene in which the non-player character has been created is saved. In this manner, by constructing a game editing scene, creating a non-player character in it, and configuring interaction attributes for the character, the non-player character in the corresponding game scene can execute various interaction events during the game running stage. This enriches players' interaction modes, improves the utilization rate of player-edited game scenes, players' enthusiasm for creating game scenes, and the resource utilization rate of the game system, and avoids idling and waste of game service resources.

Description

Game editing method, game control device and electronic equipment
Technical Field
The disclosure relates to the technical field of games, and in particular relates to a game editing method, a game control device and electronic equipment.
Background
In some game systems, a player can create a scene map in an empty scene. Specifically, various components such as terrain, decorations, objects, functions, mechanisms, and combinations can be arranged in the game scene, and the arranged components form a complete scene map. After the scene map goes online, other players can execute game tasks, attempt game levels, and so on in it. However, interaction between a player and the scene map is limited to interaction with these components: the interaction mode is monotonous, the utilization rate of online scene maps is low, and players are discouraged from actively creating scene maps. As a result, the resource utilization rate of the game system is low, and game service resources run idle and are wasted.
Disclosure of Invention
Accordingly, an object of the present disclosure is to provide a game editing method, a game control method and device, and an electronic apparatus. By constructing a game editing scene, creating a non-player character in it, and configuring interaction attributes for that character, the non-player character in the game scene corresponding to the game editing scene can execute various interaction events during the game running stage. This enriches players' interaction modes, improves the utilization rate of player-edited game scenes, players' enthusiasm for creating game scenes, and the resource utilization rate of the game system, and avoids idling and waste of game service resources.
In a first aspect, an embodiment of the present disclosure provides a game editing method, including: in response to a creation instruction for a non-player character, constructing a game editing scene and creating the non-player character in the game editing scene; in response to an attribute configuration operation for the non-player character, determining an interaction attribute of the non-player character, wherein the interaction attribute is used to: associate the non-player character with a specified scene event in the game editing scene, so that after the specified scene event is triggered, an interaction event of the non-player character is determined based on the interaction attribute, and the non-player character is controlled to execute the interaction event in the game scene corresponding to the game editing scene; and in response to a save operation for the non-player character, saving the game editing scene in which the non-player character has been created.
In a second aspect, an embodiment of the present disclosure provides a game control method, including: displaying a game scene on a graphical user interface, the game scene having been edited in advance by: in response to a creation instruction for a non-player character, constructing a game editing scene and creating the non-player character in the game editing scene; in response to an attribute configuration operation for the non-player character, determining an interaction attribute of the non-player character; and in response to a save operation for the non-player character, saving the game editing scene in which the non-player character has been created; and, in response to a specified scene event being triggered, determining an interaction event of the non-player character based on the interaction attribute and controlling the non-player character to execute the interaction event in the game scene.
In a third aspect, an embodiment of the present disclosure provides a game editing apparatus, including: a creation module configured to, in response to a creation instruction for a non-player character, construct a game editing scene and create the non-player character in the game editing scene; a configuration module configured to, in response to an attribute configuration operation for the non-player character, determine an interaction attribute of the non-player character, wherein the interaction attribute is used to: associate the non-player character with a specified scene event in the game editing scene, so that after the specified scene event is triggered, an interaction event of the non-player character is determined based on the interaction attribute, and the non-player character is controlled to execute the interaction event in the game scene corresponding to the game editing scene; and a saving module configured to, in response to a save operation for the non-player character, save the game editing scene in which the non-player character has been created.
In a fourth aspect, an embodiment of the present disclosure provides a game control apparatus, including: a display module configured to display a game scene on a graphical user interface, the game scene having been edited in advance by: in response to a creation instruction for a non-player character, constructing a game editing scene and creating the non-player character in the game editing scene; in response to an attribute configuration operation for the non-player character, determining an interaction attribute of the non-player character; and in response to a save operation for the non-player character, saving the game editing scene in which the non-player character has been created; and an execution module configured to, in response to a specified scene event being triggered, determine an interaction event of the non-player character based on the interaction attribute and control the non-player character to execute the interaction event in the game scene.
In a fifth aspect, an embodiment of the present invention provides an electronic device, including a processor and a memory, the memory storing computer-executable instructions executable by the processor, the processor executing the computer-executable instructions to implement the game editing method of any one of the first aspects or to implement the game control method of any one of the second aspects.
In a sixth aspect, embodiments of the present invention provide a computer-readable storage medium storing computer-executable instructions that, when invoked and executed by a processor, cause the processor to implement the game editing method of any one of the first aspects, or the game control method of any one of the second aspects.
The embodiment of the disclosure brings the following beneficial effects:
The present disclosure provides a game editing method, a game control method, a device, and an electronic apparatus. A game editing scene is constructed, and a non-player character is created in the game editing scene; interaction attributes are configured for the non-player character; in the game running stage, after a specified scene event is triggered, the non-player character is controlled, based on the interaction attributes, to execute an interaction event in the game scene corresponding to the game editing scene; and the game editing scene in which the non-player character has been created is saved. In this manner, the non-player character in the game scene corresponding to the game editing scene can execute various interaction events during the game running stage. This enriches players' interaction modes, improves the utilization rate of player-edited game scenes, players' enthusiasm for creating game scenes, and the resource utilization rate of the game system, and avoids idling and waste of game service resources.
Additional features and advantages of the disclosure will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the disclosure. The objectives and other advantages of the disclosure will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The foregoing objects, features and advantages of the disclosure will be more readily apparent from the following detailed description of the preferred embodiments taken in conjunction with the accompanying drawings.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present disclosure, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a game editing method provided by an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a graphical user interface provided by an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of an interactive content selection window according to an embodiment of the disclosure;
FIG. 4 is a schematic diagram of another interactive content selection window according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of an interactive content configuration window according to an embodiment of the disclosure;
FIG. 6 is a schematic diagram of another interactive content configuration window according to an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of a content creation window provided by an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of another interactive content configuration window according to an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of an event configuration window provided by an embodiment of the present disclosure;
FIG. 10 is a schematic diagram of another interactive content configuration window according to an embodiment of the present disclosure;
FIG. 11 is a flow chart of a game control method provided by an embodiment of the present disclosure;
FIG. 12 is a schematic diagram of a game scene screen at a game play stage according to an embodiment of the present disclosure;
FIG. 13 is a schematic diagram of a game scene screen at another stage of game play provided by an embodiment of the present disclosure;
FIG. 14 is a schematic view of a game editing device according to an embodiment of the present disclosure;
FIG. 15 is a schematic view of a game control device according to an embodiment of the present disclosure;
fig. 16 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions of the present disclosure will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present disclosure, but not all embodiments. Based on the embodiments in this disclosure, all other embodiments that a person skilled in the art would obtain without making any inventive effort are within the scope of protection of this disclosure.
Currently, in some game systems, a player can create a scene map in an empty scene. Specifically, various components such as terrain, decorations, objects, functions, mechanisms, and combinations can be arranged in the game scene, and the arranged components form a complete scene map. After the scene map goes online, other players can execute game tasks, attempt game levels, and so on in it. However, interaction between a player and the scene map is limited to interaction with these components: the interaction mode is monotonous, the utilization rate of online scene maps is low, players are discouraged from actively creating scene maps, the resource utilization rate of the game system is low, and game service resources run idle and are wasted. On this basis, embodiments of the present disclosure provide a game editing method, a game control method and device, and an electronic device; the technique can be applied to mobile phones, computers, notebook computers, tablet computers, and other devices.
The game editing method and the game control method in one embodiment of the present disclosure may be run on a local terminal device or a server. When the game editing method and the game control method are run on a server, the method can be realized and executed based on a cloud interaction system, wherein the cloud interaction system comprises the server and the client device.
In an alternative embodiment, various cloud applications may run under the cloud interaction system, for example cloud games. Taking a cloud game as an example, a cloud game refers to a game mode based on cloud computing. In the cloud game operation mode, the entity that runs the game program is separated from the entity that presents the game picture: storage and execution of the game editing method and the game control method are completed on the cloud game server, while the client device is used for receiving and sending data and presenting the game picture. For example, the client device may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, or a palmtop computer, while the cloud game server in the cloud performs the information processing. When playing the game, the player operates the client device to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as game pictures, and returns them to the client device through the network; finally, the client device decodes the data and outputs the game picture.
In an alternative embodiment, taking a game as an example, the local terminal device stores the game program and is used to present the game picture. The local terminal device interacts with the player through a graphical user interface; that is, the game program is conventionally downloaded, installed, and run on the electronic device. The local terminal device may provide the graphical user interface to the player in a variety of ways: for example, the interface may be rendered on a display screen of the terminal, or provided to the player by holographic projection. For example, the local terminal device may include a display screen for presenting the graphical user interface, which includes the game picture, and a processor for running the game, generating the graphical user interface, and controlling display of the graphical user interface on the display screen.
In a possible implementation, an embodiment of the present disclosure provides a game editing method in which a graphical user interface is provided through a terminal device; the terminal device may be the aforementioned local terminal device or the aforementioned client device in the cloud interaction system. As shown in FIG. 1, the method includes the following steps:
Step S102: in response to a creation instruction for a non-player character, construct a game editing scene and create the non-player character in the game editing scene.
the above game editing scene refers to a game scene created by a player, and the game editing scene may be an empty game scene or a game scene including a part of initial scene objects, or may be a game scene that the player has created and saved in advance (the game scene may be a scene that has been released or a scene that has not been released). For example, an empty game editing scene may be created in which players may edit various scene content, including creating non-player characters. It is also possible to open an already constructed game editing scene and then create a non-player character in the game editing scene, that is, the player can edit again after saving the game editing scene that has been edited before. Specifically, the non-player character may be created at a target scene position of the game editing scene, where the target scene position may be a preset scene position or a scene position selected by the player.
This embodiment is implemented in the game editing stage. Specifically, a scene editing control is provided and displayed in the graphical user interface; in response to a triggering operation on the scene editing control, a game editing scene is constructed and displayed through the graphical user interface. Alternatively, option identifiers of a plurality of game scenes to be edited (the option identifiers may be thumbnails, names, or numbers of the game scenes) are displayed on the graphical user interface; a designated game scene is selected from the plurality of game scenes through its option identifier, and the game editing scene of the designated game scene is displayed on the graphical user interface. Through these operations, the game editing stage can be entered.
Illustratively, in the game editing stage, a scene display area and a scene editing area are provided on the graphical user interface. The scene display area displays the scene picture of the game editing scene, and the scene editing area provides a creation control for non-player characters. When the player clicks the creation control, a plurality of preset characters are displayed, from which the player can select the non-player character to create. The scene display area may occupy the whole graphical user interface with the scene editing area overlaid on it; alternatively, the scene display area and the scene editing area may each occupy half of the graphical user interface, arranged side by side.
In one possible manner, if non-player characters include multiple types, multiple type controls are displayed after the triggering operation on the creation control; in response to a selection operation on a first type control, a plurality of preset characters corresponding to the first type are displayed, and the player can select the non-player character to create from among them. Before attribute configuration, a non-player character generally has initial attributes, such as: a unique identifier (not modifiable and not visible to the player), an initial name, an initial avatar, an occupancy value (not editable by the player but visible), an initial appearance, an initial size, an initial standby action, an initial display position, an initial collision radius, an initial collision height, an initial generated value (a life value for combat), initial skills, an initial weight, initial physical attributes (whether the character is affected by external forces, etc.), and initial interaction attributes (whether the character can interact, an initial interaction range, etc.).
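The list of initial attributes above can be made concrete as a data schema. This is a minimal sketch under assumed field names; the patent does not prescribe any particular data model, so every name and default here is hypothetical:

```python
from dataclasses import dataclass, field
import uuid

# Hypothetical schema for the initial attributes listed above.
# Every field name and default value is an assumption.

@dataclass
class NpcInitialAttributes:
    uid: str = field(default_factory=lambda: uuid.uuid4().hex)  # not modifiable, not player-visible
    name: str = "NPC"
    occupancy_value: int = 1           # visible to the player but not editable
    standby_action: str = "idle"
    collision_radius: float = 0.5
    collision_height: float = 1.8
    hit_points: int = 100              # initial generated value (life value for combat)
    affected_by_physics: bool = True   # whether external forces apply
    interactable: bool = True          # initial interaction attribute
    interaction_range: float = 3.0

npc = NpcInitialAttributes(name="Blacksmith")
```

Using a `default_factory` for the identifier ensures each created character gets a distinct unique ID, matching the "unique identifier, not modifiable" attribute described above.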
Step S104: in response to an attribute configuration operation for the non-player character, determine an interaction attribute of the non-player character, wherein the interaction attribute is used to: associate the non-player character with a specified scene event in the game editing scene, so that after the specified scene event is triggered, an interaction event of the non-player character is determined based on the interaction attribute, and the non-player character is controlled to execute the interaction event in the game scene corresponding to the game editing scene.
the attribute configuration operation described above may be a trigger operation, a selection operation, an input or editing operation for an input box, or the like for various controls. The above-mentioned interaction properties generally include whether the non-player character is allowed to interact with other scene objects (such as virtual characters controlled by the player), and a specific interaction range, i.e. how the other scene objects trigger interaction events with the non-player character.
The step of controlling the non-player character to execute the interaction event in the game scene corresponding to the game editing scene is usually implemented in the game running stage. Specifically, after the game editing stage is finished, that is, after the game editing scene in which the non-player character has been created is saved, a scene release control for the game editing scene is displayed on the graphical user interface, and the release of the game editing scene is completed in response to a triggering operation on the scene release control.
Released game editing scenes are stored on the game server. When a player needs to use a game scene edited by himself or by other players, a scene run control of the game scene (i.e., a previously constructed game editing scene) can be displayed on the graphical user interface; in response to a triggering operation on the scene run control, the game scene, including the pre-created non-player character, is displayed on the graphical user interface. Through these operations, the game running stage can be entered.
A run trigger control for running the game editing scene can also be provided in the graphical user interface of the game editing interface, allowing the player to trial-run the edited game scene; when the run trigger control is triggered, the game running stage is entered.
There may be one or more interaction events. An interaction event may be performing a dance action, playing audio and/or video, displaying specified scene content, entering a mini-game, displaying a scenario dialogue, and so on. The specified scene event may be another virtual character entering the specified interaction range of the non-player character, clicking an interaction control corresponding to the non-player character, and the like. For example, in the game running stage, the game scene corresponding to the game editing scene, including the non-player character, is displayed on the graphical user interface. After the player moves the controlled virtual character into the interaction range of the non-player character, whether the non-player character allows interaction is determined according to the interaction attribute. If it does, at least one interaction event preset for the non-player character is determined, a designated interaction event is selected, and the non-player character is controlled to execute it in the game scene corresponding to the game editing scene, for example: displaying a preset scenario dialogue, controlling the controlled virtual character to perform a dance action (such as hand turns) together with the non-player character, controlling the two characters to enter a designated game scene, or controlling them to play a game together.
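The run-stage check described in this paragraph, entering the interaction range, consulting the interaction attribute, then selecting one of the configured events, can be sketched as follows. All function and key names are illustrative assumptions, as is the policy of picking the first configured event:

```python
import math

# Hypothetical sketch: when the controlled character enters the NPC's
# interaction range and the NPC allows interaction, one of its
# preconfigured interaction events is selected for execution.

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def maybe_interact(npc, player_pos):
    """Return the interaction event to execute, or None if no interaction."""
    if not npc["interactable"]:
        return None  # interaction attribute forbids interaction
    if distance(npc["position"], player_pos) > npc["interaction_range"]:
        return None  # specified scene event not triggered
    events = npc.get("events", [])
    # Selection policy is an assumption; here we pick the first event.
    return events[0] if events else None

npc = {"interactable": True, "position": (0.0, 0.0), "interaction_range": 3.0,
       "events": ["show_dialogue", "dance_together"]}
```

For instance, a player at position (1, 1) is within range 3.0 of this NPC and triggers the dialogue event, while a player at (5, 5) is outside the range and triggers nothing.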
Specifically, after the non-player character is created, a plurality of attribute configuration controls are displayed directly on the graphical user interface, or displayed after the non-player character in the game scene is clicked. Different attribute configuration controls provide different attribute configuration functions; the player can perform attribute configuration operations on them to configure different interaction attributes for the non-player character. In this way, during the game running stage the non-player character has a personalized interaction effect in the game scene corresponding to the game editing scene, giving the game scene richer interaction modes and highly creative gameplay.
Step S106, in response to the save operation for the non-player character, saving the game editing scene created with the non-player character.
Because the game editing scene is edited autonomously by the player, after the non-player character is created in it and its interaction attributes are configured, the interaction attributes need to be saved along with the non-player character itself, so that the character persists in the game editing scene; finally, the game editing scene containing the non-player character is saved.
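The save step above must persist both the NPC and its configured interaction attributes. A minimal serialization sketch, assuming a JSON layout of our own invention (the patent specifies no storage format):

```python
import json

# Hypothetical serialization sketch: saving the edit scene persists the
# NPC together with its configured interaction attributes, so both
# survive a save/reload round trip. The JSON layout is an assumption.

def save_scene(scene):
    return json.dumps({
        "objects": scene["objects"],
        "npcs": [{"preset": n["preset"],
                  "position": n["position"],
                  "interaction": n["interaction"]} for n in scene["npcs"]],
    })

scene = {"objects": ["terrain"],
         "npcs": [{"preset": "villager", "position": [1, 2],
                   "interaction": {"interactable": True, "range": 3.0}}]}
saved = save_scene(scene)
```

The point of the sketch is the coupling: the interaction attributes live inside the NPC record, so a single save of the scene captures everything needed to reproduce the character's behavior in the running game scene.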
In the above manner, in the game editing stage, a game editing scene is constructed, a non-player character is created in it, and interaction attributes are configured for the character; in the game running stage, the non-player character in the game scene corresponding to the game editing scene can execute various interaction events. This enriches players' interaction modes, improves the utilization rate of player-edited game scenes, players' enthusiasm for creating game scenes, and the resource utilization rate of the game system, and avoids idling and waste of game service resources.
For step S104, the step of determining the interaction attribute of the non-player character in response to the attribute configuration operation may be implemented as follows: providing a plurality of configuration controls in the graphical user interface, the configuration controls including a first configuration control for interaction events; and determining an interaction event of the non-player character in response to a first configuration operation on the first configuration control.
The first configuration control provides functions for configuring the number of interaction events and the interaction content of each interaction event. The first configuration operation includes a click operation, a selection operation, an editing operation, and the like. In this embodiment, the player may configure at least one interaction event for the non-player character.
In one manner, at least one preset interaction event is displayed in response to a triggering operation on the first configuration control, and the preset interaction events chosen through a selection operation are determined as the interaction events of the non-player character. In another manner, an event creation control is displayed in response to a triggering operation on the first configuration control; in response to a creation operation on the event creation control, at least one created interaction event is determined, and these created interaction events are determined as the interaction events of the non-player character.
For example, the player may click the first configuration control to directly display a plurality of preset interaction events whose interaction content is preconfigured; the player may then select one or more of them as the interaction events of the non-player character. As another example, the player may click the first configuration control to add one or more interaction events to the non-player character, and then perform configuration operations on the added events (e.g., creating interaction content or selecting from preset interaction content) to determine the interaction events of the non-player character.
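The two configuration paths just described, choosing from presets versus creating new events, can be sketched as follows. The preset list, function names, and validation behavior are all illustrative assumptions:

```python
# Hypothetical sketch of the two configuration paths above: selecting
# from preset interaction events, or creating a new event and attaching
# it to the NPC. Names and the preset set are assumptions.

PRESET_EVENTS = {"show_dialogue", "dance", "play_audio", "enter_minigame"}

def select_preset_events(npc, chosen):
    # Path 1: pick one or more events whose content is preconfigured.
    unknown = set(chosen) - PRESET_EVENTS
    if unknown:
        raise ValueError(f"not preset events: {unknown}")
    npc.setdefault("events", []).extend(chosen)

def create_event(npc, name, content=None):
    # Path 2: a player-created event; content can be configured now or later.
    npc.setdefault("events", []).append({"name": name, "content": content})

npc = {}
select_preset_events(npc, ["show_dialogue"])
create_event(npc, "custom_greeting", content={"dialogue": "Hello!"})
```

Both paths end in the same place: the NPC's event list, which the run stage later draws from when a scene event triggers.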
The first configuration control includes an interaction event adding control. Determining the interaction event of the non-player character in response to the first configuration operation for the first configuration control may be implemented as follows:
Step 1: in response to a triggering operation for the interaction event adding control, configure a first interaction event for the non-player character, and display an event configuration control of the first interaction event.
For example, as shown in FIG. 2, when the player clicks "add interaction" (corresponding to the interaction event adding control), "interaction 1" (corresponding to the event configuration control of the first interaction event) is displayed above "add interaction". At this time, no interactive content has been configured for the first interaction event of the non-player character. That is, during the game play phase, if the first interaction event is triggered, it is executed without displaying any content.
It should be noted that the interaction event adding control displays both the number of interaction events already configured for the non-player character and the number of interaction events the non-player character is allowed to configure. Illustratively, as shown in FIG. 2, in "Add interaction 1/10", "1" is the number of interaction events already configured for the non-player character, and "10" is the number of interaction events the non-player character is allowed to configure. In response to the number of interaction events already configured for the non-player character becoming equal to the number of interaction events the non-player character is allowed to configure, the interaction event adding control is updated from a first display mode to a second display mode, wherein the first display mode indicates that the interaction event adding control is triggerable, and the second display mode indicates that the interaction event adding control is non-triggerable.
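The counter and display-mode switch described above can be sketched as follows. The limit of 10 follows the "Add interaction 1/10" example in FIG. 2; the function name and the mode strings are illustrative assumptions.

```python
# Sketch of the "Add interaction N/MAX" label and the triggerable /
# non-triggerable display modes of the interaction event adding control.

MAX_EVENTS = 10  # per FIG. 2: at most 10 interaction events per NPC

def adding_control_state(configured_count, max_events=MAX_EVENTS):
    label = f"Add interaction {configured_count}/{max_events}"
    # When the configured count reaches the allowed count, the control
    # switches to its second (non-triggerable) display mode.
    mode = "triggerable" if configured_count < max_events else "non-triggerable"
    return label, mode
```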
Configuring a first interaction event for the non-player character in response to a triggering operation for the interaction event adding control may also be implemented as follows: in response to the triggering operation for the interaction event adding control, an interactive content selection window is displayed; a first interaction event is configured for the non-player character through the interactive content selection window, and the interactive content of the first interaction event is determined.
Step 2: in response to a designated operation for the event configuration control, display an interactive content selection window.
the above-mentioned designating operation may be a clicking operation for the event configuration control, directly displaying the interactive content selection window, or displaying the content configuration control (corresponding to the "interactive content" in fig. 9), and displaying the interactive content selection window after clicking the content configuration control.
Step 3: determine the interactive content of the first interaction event through the interactive content selection window.
The interactive content selection window includes a plurality of preset interactive contents, such as various scenario dialogues, dance actions, and designated game scenes, and further includes a content creation control; the content creation control can be clicked to create new interactive content or to edit and update previously created interactive content. Specifically, the interactive content of the first interaction event may be selected directly from the plurality of preset interactive contents, or created for the first interaction event through the content creation control. The first interaction event may have multiple interactive contents; for example, if the first interaction event is a scenario dialogue, it may include a plurality of scenario dialogues.
The interactive content selection window includes an alternative content creation control. Step 3 of determining the interactive content of the first interaction event through the interactive content selection window may be implemented as follows:
in response to a triggering operation for the alternative content creation control, display an interactive content configuration window;
determine the alternative interactive content of the first interaction event through the interactive content configuration window;
display the alternative interactive content in the interactive content selection window, and in response to a selection operation for the alternative interactive content, determine the selected alternative interactive content as the interactive content of the first interaction event.
Illustratively, as shown in FIG. 3, an interactive content selection window is displayed in the graphical user interface, and it includes an alternative content creation control; when the player clicks the control, an interactive content configuration window is displayed. For example, the interactive content configuration window may display a plurality of preconfigured interactive contents, from which the player may select one or more specified interactive contents and edit them; the specified interactive contents (or their edited versions) are finally determined as alternative interactive contents of the first interaction event and displayed in the interactive content selection window. For another example, the interactive content configuration window may display a content creation control; the player may create new interactive content through the content creation control, and the new interactive content is finally determined as alternative interactive content of the first interaction event and displayed in the interactive content selection window.
Illustratively, as shown in FIG. 4, the interactive content selection window displays a plurality of alternative interactive contents. The player can select one or more of them; after the confirmation control is clicked, the selected alternative interactive contents are determined as the interactive content of the first interaction event.
The interactive content configuration window includes a plurality of preset interactive contents. The step of determining the alternative interactive content of the first interaction event through the interactive content configuration window may be implemented as follows: in response to a selection operation for the plurality of preset interactive contents, the selected preset interactive content is determined as the alternative interactive content of the first interaction event.
Illustratively, as shown in FIG. 5, the interactive content configuration window includes preset interactive contents a, b, c, and d. The player may select one or more preset interactive contents; after "save and leave" is clicked, the selected preset interactive contents are determined as the alternative interactive content of the first interaction event, and the interactive content selection window is displayed.
Specifically, the preset interactive content included in the interactive content configuration window is a scenario file that is preconfigured and stored in a designated storage space, the scenario file including at least one scenario dialogue. The step of determining the selected preset interactive content as the alternative interactive content of the first interaction event in response to the selection operation for the plurality of preset interactive contents may be implemented as follows: in response to a triggering operation for a target scenario file, acquire at least one target scenario dialogue included in the target scenario file from the designated storage space, and display the at least one target scenario dialogue; in response to a selection operation for the at least one target scenario dialogue, determine the selected target scenario dialogue as the alternative interactive content of the first interaction event.
The designated storage space may be a storage space in the game server or in a terminal device running the game client. The scenario file may be preconfigured by any player and saved to the designated storage space. Illustratively, as shown in FIG. 6, a plurality of scenario files (e.g., scenario 1 to scenario 4) are displayed on the left side of the graphical user interface; when the player clicks a target scenario file (e.g., scenario 1), the target scenario dialogues included in it, such as dialogue 1 and dialogue 2 in FIG. 6, are displayed in the graphical user interface. The player can select one or more target scenario dialogues and click "save and leave" to determine the selected target scenario dialogues as the alternative interactive content of the first interaction event and display the interactive content selection window.
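The fetch-and-select flow above can be modeled briefly. The in-memory dictionary stands in for the designated storage space (server-side or device-side); the function names and data layout are illustrative assumptions.

```python
# Illustrative model of acquiring the target scenario dialogues from the
# designated storage space when a scenario file is clicked, and of confirming
# the selection via "save and leave".

STORAGE = {  # stand-in for the designated storage space
    "scenario 1": ["dialogue 1", "dialogue 2"],
    "scenario 2": ["dialogue A"],
}

def on_scenario_clicked(scenario_name, storage=STORAGE):
    # Acquire (and display) the dialogues of the target scenario file.
    return storage.get(scenario_name, [])

def on_save_and_leave(selected_dialogues):
    # The selected target scenario dialogues become the alternative
    # interactive content of the first interaction event.
    return {"alternative_interactive_content": list(selected_dialogues)}
```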
After the step of acquiring at least one item of the target scenario dialogue included in the target scenario file from the designated storage space in response to the triggering operation for the target scenario file and displaying the at least one item of the target scenario dialogue, the method further includes: displaying a dialogue editing control of the target scenario dialogue; the dialog edit control is used to provide editing functionality for editing at least a portion of the following: the dialogue sequence of the target scenario dialogue in the target scenario file, the dialogue content of the target scenario dialogue, the display mode of the target scenario dialogue, the option information related to the target scenario dialogue and the jump information of the option information; and updating the target scenario dialogue in response to the editing operation aiming at the dialogue editing control.
For example, as shown in FIG. 6, currently displayed are "dialogue 1" and "dialogue 2" (i.e., the target scenario dialogues) included in the target scenario file (i.e., "scenario 1"), and the dialogue content of "dialogue 1" is displayed in the dialogue edit box of "dialogue 1": specifically, the dialogue content of "dialogue 1", the option information associated with it, and the jump information of the option information. The "order" control sets the dialogue order of the currently selected "dialogue 1" in the target scenario file; if it is edited to "2", "dialogue 1" becomes the second dialogue in the target scenario file. In the input box below "speaking content", the dialogue content of the currently selected "dialogue 1" is displayed in the initial state, and the player can edit it in the input box. The player can also set the display mode of the target scenario dialogue (the currently selected "dialogue 1") through "dialogue one-time display".
In addition, in the initial state, the dialogue edit box of "dialogue 1" in the graphical user interface also displays the option information associated with "dialogue 1", including "option 1" and "option 2", as well as jump information 1 of "option 1" and jump information 2 of "option 2". The player can click the ">" control on the right side of "option 1" in the display area of the dialogue editing control to edit the jump information of "option 1", and click the ">" control on the right side of "option 2" to edit the jump information of "option 2". Option information associated with the target scenario dialogue may also be added: as shown in FIG. 6, "dialogue 1" is associated with two pieces of option information (option 1 and option 2); if the player clicks "add option", a new piece of option information is displayed in the dialogue edit box of "dialogue 1", and an editing control corresponding to the new option information is also displayed in the display area of the dialogue editing control. Through this editing control, the player can edit the information content of the new option information and its jump information.
In addition, the player can click the new dialogue control displayed in the display area of the dialogue editing control; after the click, a new dialogue is added to the target scenario file, and a dialogue edit box of the new dialogue is displayed on the graphical user interface.
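The editable fields of one scenario dialogue described above (order, content, display mode, options, and per-option jump information) can be sketched as a small data model. The dataclass layout and the jump-string format are assumptions made for the example.

```python
# Sketch of the editable fields of one scenario dialogue, matching the
# dialogue edit box: order, speaking content, display mode, and options
# with jump information.

from dataclasses import dataclass, field

@dataclass
class Option:
    text: str
    jump: str  # jump information, e.g. "goto:dialogue 2" or "end"

@dataclass
class ScenarioDialogue:
    order: int                        # the "order" control
    content: str                      # the "speaking content" input box
    display_mode: str = "one-time"    # per "dialogue one-time display"
    options: list = field(default_factory=list)

d1 = ScenarioDialogue(order=1, content="Welcome!")
d1.options.append(Option("option 1", jump="goto:dialogue 2"))
d1.options.append(Option("option 2", jump="end"))
d1.order = 2  # editing "order" to "2" makes it the second dialogue
```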
The interactive content configuration window includes an interactive content creation control. The step of determining the alternative interactive content of the first interaction event through the interactive content configuration window may also be implemented as follows: in response to a triggering operation for the interactive content creation control, a content creation window is displayed; in response to a content input operation for the content creation window, the created interactive content is determined and is taken as the alternative interactive content of the first interaction event.
Illustratively, as shown in FIG. 5, the interactive content configuration window includes an interactive content creation control. Specifically, the player can click on the interactive content creation control to display a content creation window; the player may edit specific interactive content in the content creation window, determine the created interactive content, and determine the created interactive content as an alternative interactive content for the first interactive event.
Specifically, when the interactive content is a scenario file, determining the created interactive content as the alternative interactive content of the first interaction event in response to the content input operation for the content creation window may be implemented as follows: a scenario creation control is displayed in the content creation window; in response to a creation operation for the scenario creation control, the created new scenario file, at least one new scenario dialogue included in the new scenario file, and the dialogue information of the new scenario dialogue are determined; in response to a selection operation for the at least one new scenario dialogue, the selected new scenario dialogue is determined as the alternative interactive content of the first interaction event.
Illustratively, the content creation window shown in FIG. 7 includes scenario creation controls, i.e., "+", in FIG. 7. The creation operation may be a click operation, a selection operation, an editing operation, or the like.
The step of determining the created new scenario file, the at least one new scenario dialogue included in the new scenario file, and the dialogue information of the new scenario dialogue in response to the creation operation for the scenario creation control may be implemented as follows: in response to a first creation operation for the scenario creation control, determine the scenario name of the created new scenario file and the number of scenario dialogues, and display a dialogue creation control of the new scenario file; in response to a second creation operation for the dialogue creation control, create a new scenario dialogue for the new scenario file together with the dialogue information of the new scenario dialogue, wherein the virtual character uttering the new scenario dialogue is the non-player character. The dialogue information includes at least part of the following: the dialogue order of the new scenario dialogue in the new scenario file, the dialogue content of the new scenario dialogue, the display mode of the new scenario dialogue, the option information associated with the new scenario dialogue, and the jump information of the option information.
Specifically, in response to a triggering operation for the scenario creation control, a file initial identifier of the new scenario file is displayed together with a file editing control; in response to an editing operation for the file editing control, the scenario name of the created new scenario file and the number of scenario dialogues are determined, and a dialogue creation control of the new scenario file is displayed; in response to a triggering operation for the dialogue creation control, dialogue edit boxes are displayed, the number of dialogue edit boxes being the same as the number of dialogues; in response to an editing operation for a dialogue edit box, a new scenario dialogue is created for the new scenario file, together with the dialogue order of the new scenario dialogue in the new scenario file, the dialogue content of the new scenario dialogue, the display mode of the new scenario dialogue, the option information associated with the new scenario dialogue, and the jump information of the option information.
Illustratively, as shown in FIG. 7, when the player clicks the scenario creation control, i.e., "+" in FIG. 7, a file initial identifier "scenario" of the created new scenario file is displayed to the right of "+", followed by a file editing control including "number of dialogues", "file name", and "open" (corresponding to the dialogue creation control described above). After the player finishes setting the number of dialogues and the file name, clicking "open" displays the dialogue edit boxes. For example, as shown in FIG. 8, multiple dialogues are included; each dialogue edit box contains a dialogue content input box, the option information associated with the dialogue content, and a jump information input box for the option information. The player can input the dialogue content of "dialogue 1" either in the dialogue edit box of "dialogue 1" or under "speaking content" displayed in the display area of the dialogue editing control.
The display area of the dialogue editing control further includes an "add option" control. When the player clicks the "add option" control, new option information can be associated with the dialogue content of "dialogue 1": an option input box is newly displayed in the dialogue edit box of "dialogue 1", and at the same time an option editing control is newly displayed in the display area of the dialogue editing control, specifically in the area above "add option". The player can input the option information in the option input box of the dialogue edit box of "dialogue 1", and can also edit the option information and its jump information through the option editing control in the display area of the dialogue editing control. The dialogue order of "dialogue 1" in the new scenario file may also be edited via the "order" control, and the player can set the display mode of the dialogue content of "dialogue 1" through "dialogue one-time display".
In addition, the player can click the new dialogue control displayed in the display area of the dialogue editing control; after the click, a new scenario dialogue is added to the new scenario file, and a dialogue edit box of the new dialogue is displayed on the graphical user interface.
The option information is used to provide, based on its jump information, one or more of the following game events: ending the dialogue, displaying another scenario dialogue in the current scenario file, displaying a scenario dialogue in a scenario file other than the current scenario file, playing a designated video, executing a designated game task, obtaining a designated game reward, and displaying designated scene content.
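Dispatching these game events from jump information can be sketched as a small lookup. The string codes ("end", "goto", etc.) and the handler names are invented for illustration; the disclosure does not specify a jump-information format.

```python
# Hedged sketch: map a piece of jump information to one of the game events
# listed above. Format assumed here: "<code>" or "<code>:<argument>".

def resolve_jump(jump):
    handlers = {
        "end":    lambda arg: ("end_dialogue", None),
        "goto":   lambda arg: ("show_dialogue_in_current_file", arg),
        "file":   lambda arg: ("show_dialogue_in_other_file", arg),
        "video":  lambda arg: ("play_video", arg),
        "task":   lambda arg: ("execute_game_task", arg),
        "reward": lambda arg: ("grant_game_reward", arg),
        "scene":  lambda arg: ("display_scene_content", arg),
    }
    key, _, arg = jump.partition(":")
    return handlers[key](arg or None)
```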
After the step of determining the created new scenario file, the at least one new scenario dialogue included in the new scenario file, and the dialogue information of the new scenario dialogue in response to the creation operation for the scenario creation control, the method further includes: in response to a save operation for the new scenario file, creating the new scenario file in the designated storage space, and saving the new scenario dialogue and its dialogue information into the new scenario file.
Illustratively, as shown in fig. 8, clicking "save and leave" can create a new scenario file in the designated storage space, and save the new scenario dialogue and the new scenario dialogue information into the new scenario file.
The step of displaying the interactive content selection window in response to the designated operation for the event configuration control may be implemented as follows: in response to a triggering operation for the event configuration control, an event editing window for the first interaction event is displayed, the event editing window including a content adding control for the first interaction event; in response to a triggering operation for the content adding control, first interactive content is configured for the first interaction event, and a content configuration control of the first interactive content is displayed, wherein no information is yet configured in the first interactive content; in response to a triggering operation for the content configuration control, the interactive content selection window is displayed.
The player clicking "interaction 1" as shown in FIG. 2 corresponds to the triggering operation for the event configuration control described above; an event editing window for the first interaction event is then displayed. Illustratively, FIG. 9 shows the event editing window for "interaction 1". The event editing window includes a content adding control for the first interaction event ("interaction 1"); clicking the content adding control displays a content configuration control (corresponding to the "interactive content" control in FIG. 9) above the content adding control. In general, multiple interactive contents may be set for the first interaction event. In "interactive contents 1/4" shown in FIG. 9, "1" indicates that one interactive content has been set for the first interaction event, and "4" indicates the total number of interactive contents allowed to be set for it.
The event editing window comprises an event type configuration control for a first interaction event; the method further comprises the following steps: and responding to the type configuration operation of the event type configuration control, and determining the interaction type of the first interaction event.
Illustratively, as shown in FIG. 9, the player may click on the selection box to the right of "type of interaction," displaying multiple types of interactions, such as a scenario dialog, a game scene, a scene video, a game action, and so forth.
The event editing window comprises an interaction name configuration control for a first interaction event; the method further comprises the following steps: and responding to the name configuration operation aiming at the interaction name configuration control, and determining the interaction name of the first interaction event.
Illustratively, as shown in FIG. 9, the player may click on a selection box to the right of "interactive names" to display a plurality of interactive names, such as chat, scene, video, action, task, etc.
The event editing window includes an interaction identifier configuration control for the first interaction event. The method further includes: in response to an identifier configuration operation for the interaction identifier configuration control, determining the interaction identifier of the first interaction event.
For example, as shown in FIG. 9, the player may click the identifier picture on the right side of "interaction identifier" to display a plurality of interaction identifiers; the player selects one of them, and it is determined as the interaction identifier of the first interaction event.
Displaying the alternative interactive content in the interactive content selection window may be implemented as follows: determine whether a first alternative interactive content among the alternative interactive contents has already been configured in the first interaction event; if so, display the first alternative interactive content in the interactive content selection window in a first display mode, the first display mode indicating that the first alternative interactive content is in a non-selectable state; if not, display the first alternative interactive content in a second display mode, the second display mode indicating that the first alternative interactive content is in a selectable state.
Illustratively, as shown in FIG. 10, the alternate interactive content displayed in the gray box has been configured to the first interactive event and cannot be selected again. The alternative interactive content which is normally displayed is not configured to the first interactive event, and can be selected.
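This display-mode decision is a simple partition of the alternatives, sketched below; the "gray"/"normal" labels mirror FIG. 10 and are otherwise illustrative.

```python
# Sketch: already-configured alternatives are grayed out (non-selectable);
# the rest are shown normally (selectable).

def display_modes(alternatives, configured):
    configured = set(configured)
    return {
        item: ("gray", "non-selectable") if item in configured
        else ("normal", "selectable")
        for item in alternatives
    }
```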
After the step of configuring the first interaction event for the non-player character in response to the triggering operation for the interaction event adding control and displaying the event configuration control of the first interaction event, the method further includes: in response to a deletion operation for the event configuration control of the first interaction event, canceling the first interaction event configured for the non-player character and deleting the event configuration control of the first interaction event.
Illustratively, as shown in FIG. 2, clicking the "—" control on the left side of "interaction 1" cancels both the first interaction event configured for the non-player character and the interactive content configured for the first interaction event, and deletes the event configuration control of the first interaction event.
After the steps of configuring the first interactive content for the first interactive event and displaying the content configuration control of the first interactive content in response to the triggering operation for the content addition control, the method further includes: and responding to the deleting operation of the content configuration control for the first interactive content, canceling the first interactive content configured for the first interactive event, and deleting the content configuration control of the first interactive content.
Illustratively, as shown in FIG. 9, clicking the "—" control on the right side of "interactive content" (i.e., the content configuration control of the first interactive content) cancels the interactive content configured for the first interaction event and deletes the content configuration control of that interactive content.
The step of determining the interaction attribute of the non-player character in response to the attribute configuration operation for the non-player character may also be implemented as follows: a plurality of configuration controls are provided at the graphical user interface, including a second configuration control regarding the interaction range; in response to a second configuration operation for the second configuration control, the interaction range of the non-player character is determined.
The second configuration control includes a distance configuration control. One possible implementation is: in response to a distance configuration operation for the distance configuration control, determine the interaction distance of the non-player character; then determine the interaction range of the non-player character according to the interaction distance and a preset interaction center, where the preset interaction center is the geometric center of the non-player character.
For example, as shown in FIG. 2, the interaction distance is input in the input box on the right side of "interaction radius", and the circular area centered on the preset interaction center with the interaction distance as its radius is determined as the interaction range of the non-player character.
The second configuration control includes a center configuration control. The method further includes: in response to a center configuration operation for the center configuration control, adjusting the preset interaction center.
Specifically, in the initial state, the preset interaction center is located at the center point of the capsule body corresponding to the non-player character (the center point being [0,0]). In response to a triggering operation for the center configuration control, offset input boxes for the three-dimensional coordinate axes are displayed; in response to an input operation for an input box, the offset along the corresponding coordinate axis is determined, and the preset interaction center is adjusted based on the offset. For example, as shown in FIG. 2, the player may click the center configuration control to display the offset input boxes of the three-dimensional coordinate axes and input the desired offset value in each box, thereby adjusting the preset interaction center.
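The geometry described above (center plus per-axis offset, circular range with the configured radius) can be sketched as follows; the function names are illustrative assumptions.

```python
# Sketch of the interaction range: the preset interaction center is the
# character's geometric center plus a player-entered per-axis offset, and a
# position is in range when its distance to the center is within the radius.

import math

def interaction_center(base_center, offset=(0.0, 0.0, 0.0)):
    # Apply the offsets entered in the three-dimensional input boxes.
    return tuple(b + o for b, o in zip(base_center, offset))

def in_interaction_range(center, radius, position):
    # Circular (spherical) area: center = preset interaction center,
    # radius = configured interaction distance.
    return math.dist(center, position) <= radius
```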
The method further includes: providing a first interactive control through the graphical user interface; in response to a triggering operation for the first interactive control, controlling generation of game scene information corresponding to the game editing scene, where the game scene information includes component information of edited components in the game editing scene, the edited components being those of the editable components included in the editing window that have responded to editing operations; and controlling transmission of the game scene information to a server. The server is communicatively connected with terminal devices; a terminal device is configured with a game program, acquires the game scene information from the server, and generates a corresponding game scene from the game scene information through the game program.
The first interactive control is used to publish the game editing scene so that other players can open it from the game interface. In practical application, the graphical user interface includes the first interactive control; after the first interactive control is triggered on the terminal device, game scene information corresponding to the game editing scene can be generated and stored at a preset location. The preset location may be a map file, which can store not only the game scene information but also other map information (including but not limited to a screenshot, a map name, a log, and the like). After storing the game scene information, the map file is uploaded to the server. After the server's review, the game scene generated from the game scene information can be released into a preset map pool, so that terminal devices connected to the server can download the corresponding game scene information, generate the corresponding game scene from it through the game program, and then play in that game scene. In this way, game scene information can be released from the game editor and experienced by other players, realizing a rapid UGC (User Generated Content) function.
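The publish flow can be sketched end to end: bundle the edited component information and map metadata into a "map file", upload it, and let a stubbed server release it into the map pool for download. The server class and its methods are stand-ins invented for the example, not a real API.

```python
# Minimal sketch of the publish flow: map file -> upload -> review/release
# into the map pool -> download by another terminal device.

import json

def build_map_file(scene_components, map_name, screenshot="shot.png", log=""):
    # The map file stores the game scene information plus other map
    # information (screenshot, map name, log, ...).
    return json.dumps({
        "game_scene_info": scene_components,
        "map_name": map_name,
        "screenshot": screenshot,
        "log": log,
    })

class StubServer:
    def __init__(self):
        self.map_pool = {}

    def upload(self, map_file):
        data = json.loads(map_file)
        # After review, the scene is released into the preset map pool.
        self.map_pool[data["map_name"]] = data["game_scene_info"]

    def download(self, map_name):
        # A connected terminal device fetches the scene info to rebuild
        # the game scene through its game program.
        return self.map_pool[map_name]
```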
The embodiment of the invention provides a game control method, as shown in fig. 11, comprising the following steps:
Step S1102: displaying a game scene on a graphical user interface. The game scene is edited in advance in the following manner: in response to a creation instruction for a non-player character, constructing a game editing scene and creating the non-player character in the game editing scene; in response to an attribute configuration operation for the non-player character, determining interaction attributes of the non-player character; and in response to a save operation for the non-player character, saving the game editing scene in which the non-player character has been created.
The game scene may be a game editing scene created and released in advance by any player. It may also be a scene created by the current user: if a game editing scene has been created but not yet released, it can be opened through the preview function, and a scene picture of the game editing scene is displayed on the graphical user interface.
In step S1104, in response to the designated scene event being triggered, an interaction event of the non-player character is determined based on the interaction attribute, and the non-player character is controlled to execute the interaction event in the game scene.
There may be one or more interaction events. An interaction event may be performing a dance action, playing audio and/or video, displaying specified scene content, entering game play, displaying a scenario dialogue, and the like. The designated scene event may be another virtual character entering the designated interaction range of the non-player character, clicking an interaction control corresponding to the non-player character, and so on. For example, in the game running stage, a game scene corresponding to the game editing scene is displayed on the graphical user interface, and the game scene includes a non-player character. After the player controls the controlled virtual character to move into the interaction range of the non-player character, whether the non-player character allows interaction is determined according to the interaction attributes. If interaction is allowed, at least one interaction event preconfigured for the non-player character is determined, a designated interaction event is selected, and the non-player character is controlled to execute that interaction event in the game scene corresponding to the game editing scene, for example: displaying a preset scenario dialogue; controlling the controlled virtual character to perform a dance action (such as hand turns) together with the non-player character; or controlling the controlled virtual character and the non-player character to enter a designated game scene and play a game together.
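The dispatch logic above (check the interaction attributes, then select one of the NPC's preconfigured interaction events) can be sketched in a few lines. The `NPC` fields and event names below are illustrative assumptions, not the patent's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class NPC:
    name: str
    allow_interaction: bool = True                          # interaction attribute
    interaction_events: list = field(default_factory=list)  # preconfigured events

def on_scene_event(npc: NPC, chosen_index: int = 0):
    """Called when the designated scene event fires (e.g., the controlled
    character enters the NPC's interaction range).

    Returns the interaction event the NPC will execute, or None when the
    interaction attributes forbid interaction or no event is configured."""
    if not npc.allow_interaction or not npc.interaction_events:
        return None
    return npc.interaction_events[chosen_index]

npc = NPC("villager", interaction_events=["show_dialogue", "dance", "mini_game"])
assert on_scene_event(npc, 1) == "dance"
assert on_scene_event(NPC("statue", allow_interaction=False)) is None
```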
In the above manner, in the game editing stage, a game editing scene is constructed, a non-player character is created in it, and interaction attributes are configured for the non-player character; in the game running stage, the non-player character in the game scene corresponding to the game editing scene can execute various interaction events. This enriches players' interaction modes, improves the utilization rate of player-edited game scenes, players' enthusiasm for creating game scenes, and the resource utilization rate of the game system, and avoids idling and waste of game service resources.
One possible implementation of the step of, in response to the designated scene event being triggered, determining an interaction event of the non-player character based on the interaction attributes and controlling the non-player character to execute the interaction event in the game scene is: in response to an interaction operation between a target virtual character and the non-player character, determining at least one interaction event preconfigured for the non-player character and displaying trigger controls corresponding to the interaction events, wherein a trigger control comprises the interaction identifier and interaction name of its interaction event; and in response to a triggering operation on a target trigger control, executing the interaction event corresponding to the target trigger control.
The interaction operation may be that the target virtual character moves to a designated range of the non-player character, or that the player controls the target virtual character to contact the non-player character, or that a trigger operation is performed on the interaction control, etc. The target virtual character is usually a controlled virtual character controlled by a player, and may be a non-player character.
The interaction attributes comprise an interaction range. In one possible implementation: in response to the target virtual character moving into the interaction range of the non-player character, the interaction event of the non-player character is determined based on the interaction attributes, and the non-player character is controlled to execute the interaction event in the game scene.
As shown in fig. 12, the interaction event is a scenario dialogue. After the target virtual character moves into the designated range of the non-player character, trigger controls corresponding to the non-player character's at least one preconfigured interaction event are displayed; in this example four trigger controls are shown, each comprising the interaction identifier and interaction name (chat) of its interaction event. The player can slide through the trigger controls to view the interaction events, and click a target trigger control to display the scenario dialogue corresponding to it.
The interaction event corresponding to the target trigger control is a scenario dialogue. One possible implementation of executing the interaction event corresponding to the target trigger control is: acquiring a target scenario file corresponding to the interaction event from a designated storage space, wherein the target scenario file comprises at least one scenario dialogue; determining a scenario dialogue from the target scenario file and acquiring its dialogue information, wherein the dialogue information comprises at least part of the following: the dialogue order of the scenario dialogue in the target scenario file, the dialogue content of the target dialogue, the display mode of the target dialogue, option information associated with the target dialogue, and jump information of the option information; and controlling display of the scenario dialogue according to the dialogue information until the scenario dialogues in the target scenario file have all been displayed.
After the target scenario file is triggered and starts to play, the scenario dialogues in the target scenario file are played in turn. Dialogue information is preset for each scenario dialogue, and a scenario animation dialogue corresponding to each scenario dialogue is displayed in the graphical user interface according to its dialogue information. When the scenario animation dialogue is displayed, the first target dialogue (a scenario dialogue included in the target scenario file) may be displayed in association with the non-player character, and a character identification of the non-player character, for example its character name, character avatar, character appearance, and the like, may also be displayed.
Determining a scenario dialogue from a target scenario file, one possible implementation: determining a first scenario dialogue based on the dialogue sequence of scenario dialogues in the target scenario file; or determining the next scenario dialogue based on the jump information corresponding to the selected option information in the current scenario dialogue.
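The selection logic above (first dialogue chosen by dialogue order, next dialogue chosen by the jump information of the selected option) can be sketched as follows. The dictionary layout of the scenario file and the `"end"` sentinel are illustrative assumptions.

```python
# Illustrative target scenario file: dialogues keyed by id, each with a
# dialogue order, dialogue content, and options mapping to jump targets.
scenario_file = {
    "d1": {"order": 1, "content": "Welcome, traveler.",
           "options": {"Who are you?": "d2", "Goodbye.": "end"}},
    "d2": {"order": 2, "content": "I guard this village.",
           "options": {"Goodbye.": "end"}},
}

def first_dialogue(scenario: dict) -> str:
    """Pick the first scenario dialogue based on the dialogue order."""
    return min(scenario, key=lambda d: scenario[d]["order"])

def next_dialogue(scenario: dict, current: str, chosen_option: str):
    """Follow the jump information of the selected option; 'end' closes the dialogue."""
    target = scenario[current]["options"][chosen_option]
    return None if target == "end" else target

cur = first_dialogue(scenario_file)
assert cur == "d1"
cur = next_dialogue(scenario_file, cur, "Who are you?")
assert cur == "d2"
assert next_dialogue(scenario_file, cur, "Goodbye.") is None
```

In this sketch a jump target is simply another dialogue id; the jump information described later (playing a video, granting a reward, and so on) would map option keys to richer game events instead.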
The step of controlling and displaying the scenario dialogue according to the dialogue information, and one possible implementation mode is as follows: displaying character identifications and/or character names of non-player characters; and displaying dialogue contents of the scenario dialogue according to the display mode of the scenario dialogue.
After the step of displaying the dialogue content of the scenario dialogue according to its display mode, the method further comprises: determining the option information associated with the scenario dialogue and the jump information of the option information, and displaying them; and in response to a triggering operation on designated option information among the option information, triggering a game event corresponding to the designated option information based on the jump information of the designated option information, wherein the game event comprises one or more of: ending the dialogue, displaying a scenario dialogue other than the target dialogue in the target scenario file, displaying a scenario dialogue outside the target scenario file, playing a designated video, executing a designated game task, obtaining a designated game reward, and displaying designated scene content.
In the game running stage, when the scenario dialogue animation corresponding to the first target dialogue is displayed, the scenario dialogue animation comprises a non-player character identification corresponding to the non-player character information, for example the character name, character avatar, and so on of the non-player character. If the non-player character corresponding to the non-player character information is located in the game scene, the first target dialogue may also be displayed in association with the scene location where the non-player character is located.
As shown in fig. 13, during the game running stage, a scenario dialogue animation of the first target dialogue is displayed in the graphical user interface, including the non-player character identification (i.e., the avatar and name of character 1), the dialogue content of the first target dialogue, and the associated option information.
Corresponding to the above method embodiment, an embodiment of the present invention provides a game editing device, as shown in fig. 14, including:
a creation module 141 for constructing a game editing scene in response to a creation instruction of the non-player character, and creating the non-player character in the game editing scene;
a configuration module 142 for determining interaction properties of the non-player character in response to a property configuration operation for the non-player character; wherein, the interactive attribute is used for: the method comprises the steps that the method is related to a designated scene event in a game editing scene, after the designated scene event is triggered, an interaction event of a non-player character is determined based on interaction attributes, and the non-player character is controlled to execute the interaction event in a game scene corresponding to the game editing scene;
A saving module 143 for saving the game edit scene created with the non-player character in response to the save operation for the non-player character.
The game editing device constructs a game editing scene and creates a non-player character in it; configures the interaction attributes of the non-player character; in the game stage, after the designated scene event is triggered, controls the non-player character based on the interaction attributes to execute the interaction event in the game scene corresponding to the game editing scene; and saves the game editing scene in which the non-player character has been created. In this manner, the non-player character in the game scene corresponding to the game editing scene can execute various interaction events, which enriches players' interaction modes, improves the utilization rate of player-edited game scenes, players' enthusiasm for creating game scenes, and the resource utilization rate of the game system, and avoids idling and waste of game service resources.
The configuration module is further used for: providing a plurality of configuration controls at a graphical user interface; the configuration control comprises: a first configuration control for an interaction event; an interaction event for the non-player character is determined in response to a first configuration operation for the first configuration control.
The first configuration control comprises: adding a control for the interaction event; the configuration module is further used for: responding to the triggering operation of adding a control for an interaction event, configuring a first interaction event for a non-player character, and displaying an event configuration control of the first interaction event; responding to the appointed operation of the event configuration control, and displaying an interactive content selection window; and determining the interactive content of the first interactive event through the interactive content selection window.
The interactive content selection window comprises an alternative content new control; the configuration module is further used for: responding to the triggering operation of the new control for the interactive content, and displaying an interactive content configuration window; determining alternative interactive contents of the first interactive event through an interactive content configuration window; displaying alternative interactive contents in the interactive content selection window, and responding to the selected operation aiming at the alternative interactive contents, and determining the selected alternative interactive contents as the interactive contents of the first interactive event.
The interactive content configuration window comprises a plurality of preset interactive contents; the configuration module is further used for: and responding to the selection operation aiming at the plurality of preset interactive contents, and determining the selected preset interactive contents as the alternative interactive contents of the first interactive event.
The interactive content configuration window comprises preset interactive content, which is scenario files preconfigured and stored in the designated storage space, each scenario file comprising at least one scenario dialogue. The configuration module is further used for: in response to a triggering operation on a target scenario file, acquiring at least one target scenario dialogue included in the target scenario file from the designated storage space and displaying it; and in response to a selection operation on the at least one target scenario dialogue, determining the selected target scenario dialogue as alternative interactive content of the first interaction event.
The device also comprises a dialogue updating module for: displaying a dialogue editing control of the target scenario dialogue; the dialog edit control is used to provide editing functionality for editing at least a portion of the following: the dialogue sequence of the target scenario dialogue in the target scenario file, the dialogue content of the target scenario dialogue, the display mode of the target scenario dialogue, the option information related to the target scenario dialogue and the jump information of the option information; and updating the target scenario dialogue in response to the editing operation aiming at the dialogue editing control.
The interactive content configuration window comprises an interactive content creation control; the configuration module is further used for: responding to the triggering operation of the interactive content creation control, and displaying a content creation window; in response to a content input operation for the content creation window, the created interactive content is determined, and the created interactive content is determined as an alternative interactive content to the first interactive event.
The interactive content is a scenario file; the configuration module is further used for: displaying scenario creation controls in the content creation window; responding to the creation operation aiming at the scenario creation control, and determining the created new scenario file, at least one new scenario dialogue included in the new scenario file and dialogue information of the new scenario dialogue; in response to a selection operation for at least one new scenario dialogue, the selected new scenario dialogue is determined as an alternative interactive content of the first interactive event.
The configuration module is further used for: responding to a first creation operation aiming at the scenario creation control, determining the scenario name of the created new scenario file and the number of dialogs of the scenario dialog, and displaying the dialog creation control of the new scenario file; responding to a second creation operation aiming at the dialogue creation control, creating a new scenario dialogue for the new scenario file and dialogue information of the new scenario dialogue; wherein, the virtual character sending out the new scenario dialogue is a non-player character; the dialog information includes at least part of the following: the conversation sequence of the new scenario conversation in the new scenario file, the conversation content of the new scenario conversation, the display mode of the new scenario conversation, the option information related to the new scenario conversation and the jump information of the option information.
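The dialogue information fields enumerated above (dialogue order, dialogue content, display mode, option information and its jump information) can be modeled with a simple data structure. The field names and the `"typewriter"` display-mode value are illustrative assumptions only.

```python
from dataclasses import dataclass, field

@dataclass
class ScenarioDialogue:
    """One dialogue entry of a new scenario file; the speaker is the NPC."""
    order: int                                    # dialogue order within the file
    content: str                                  # dialogue content
    display_mode: str = "typewriter"              # how the dialogue is shown (assumed)
    options: dict = field(default_factory=dict)   # option text -> jump target

@dataclass
class ScenarioFile:
    name: str                                     # scenario name
    dialogues: list = field(default_factory=list)

    def add_dialogue(self, dialogue: ScenarioDialogue) -> None:
        # Keep dialogues sorted by their configured dialogue order.
        self.dialogues.append(dialogue)
        self.dialogues.sort(key=lambda d: d.order)

sf = ScenarioFile("village_intro")
sf.add_dialogue(ScenarioDialogue(2, "Safe travels!", options={"Bye": "end"}))
sf.add_dialogue(ScenarioDialogue(1, "Hello there.", options={"Hi": "2"}))
assert [d.order for d in sf.dialogues] == [1, 2]
```

Saving such a file to the designated storage space (as described below) would then amount to serializing this structure.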
The option information is used for providing one or more of the following game events based on the jump information of the option information: ending dialogue, displaying other scenario dialogue in the current scenario file, displaying scenario dialogue in scenario files other than the current scenario file, playing appointed video, executing appointed game task, obtaining appointed game rewards, and displaying appointed scene content.
The device also comprises a file storage module for: and responding to the preservation operation aiming at the new scenario file, creating the new scenario file in the appointed storage space, and preserving the new scenario dialogue and the new scenario dialogue information into the new scenario file.
The configuration module is further used for: responding to triggering operation of an event configuration control, and displaying an event editing window aiming at a first interaction event; the event editing window comprises a content adding control for the first interaction event; responding to triggering operation aiming at the content adding control, configuring first interactive content for a first interactive event, and displaying a content configuration control of the first interactive content; wherein, no information is configured in the first interactive content; and responding to the triggering operation of the content configuration control, and displaying an interactive content selection window.
The event editing window comprises an event type configuration control for a first interaction event; the device further comprises an interaction type determining module, which is used for: and responding to the type configuration operation of the event type configuration control, and determining the interaction type of the first interaction event.
The event editing window comprises an interaction name configuration control for a first interaction event; the device further comprises an interaction name determining module, which is used for: and responding to the name configuration operation aiming at the interaction name configuration control, and determining the interaction name of the first interaction event.
The event editing window comprises an interaction identifier configuration control for the first interaction event; the device further comprises an interaction identifier determining module for: in response to an identifier configuration operation on the interaction identifier configuration control, determining the interaction identifier of the first interaction event.
The configuration module is further used for: determining whether first alternative interactive content among the alternative interactive contents has already been configured for the first interaction event; if so, displaying the first alternative interactive content in the interactive content selection window in a first display mode, the first display mode indicating that the first alternative interactive content is in an unselectable state; if not, displaying the first alternative interactive content in the interactive content selection window in a second display mode, the second display mode indicating that the first alternative interactive content is in a selectable state.
The device further comprises an interaction event deleting module, which is used for: and canceling the first interaction event configured for the non-player character in response to the deletion operation of the event configuration control for the first interaction event, and deleting the event configuration control of the first interaction event.
The device further comprises an interactive content deleting module, which is used for: and responding to the deleting operation of the content configuration control for the first interactive content, canceling the first interactive content configured for the first interactive event, and deleting the content configuration control of the first interactive content.
The configuration module is further used for: providing a plurality of configuration controls at a graphical user interface; the configuration control comprises: a second configuration control regarding the interaction range; responsive to a second configuration operation for the second configuration control, an interaction range of the non-player character is determined.
The second configuration control comprises a distance configuration control; the configuration module is further used for: in response to a distance configuration operation on the distance configuration control, determining the interaction distance of the non-player character; and determining the interaction range of the non-player character according to the interaction distance and a preset interaction center, wherein the preset interaction center is the geometric center of the non-player character.
The second configuration control comprises a center configuration control; the configuration module is further used for: in response to a center configuration operation on the center configuration control, adjusting the preset interaction center.
The configuration module is further used for: determining a circular area centered on the preset interaction center, with the interaction distance as its radius, as the interaction range of the non-player character.
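The circular interaction range above reduces to a point-in-circle test at runtime: a character is in range when its distance to the preset interaction center is at most the interaction distance. The function name below is illustrative.

```python
import math

def in_interaction_range(npc_center, interaction_distance, character_pos):
    """True if the character is inside the circle whose center is the preset
    interaction center and whose radius is the interaction distance."""
    dx = character_pos[0] - npc_center[0]
    dy = character_pos[1] - npc_center[1]
    return math.hypot(dx, dy) <= interaction_distance

# NPC centered at (0, 0) with an interaction distance of 3 units.
assert in_interaction_range((0, 0), 3.0, (2, 2))      # distance ~2.83, inside
assert not in_interaction_range((0, 0), 3.0, (3, 3))  # distance ~4.24, outside
```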
The device further comprises an information sending module, used for: providing a first interactive control through a graphical user interface; responding to triggering operation aiming at the first interaction control, and controlling and generating game scene information corresponding to the game editing scene; the game scene information comprises component information of edited components in a game editing scene, wherein the edited components are editing components responding to editing operations in a plurality of editable components included in an editing window; controlling to transmit game scene information to a server; the server is configured to be in communication connection with the terminal equipment, the terminal equipment is configured with a game program, the terminal equipment is configured to acquire game scene information from the server, and a corresponding game scene is generated according to the game scene information through the game program.
The game editing device provided by the embodiment of the disclosure has the same technical characteristics as the game editing method provided by the embodiment, so that the same technical problems can be solved, and the same technical effects can be achieved.
In accordance with the above embodiment of the game control method, an embodiment of the present invention provides a game control device, as shown in fig. 15, including:
a display module 151 for displaying a game scene at a graphical user interface; the game scene is edited in advance by the following modes: responding to a creation instruction of the non-player character, constructing a game editing scene, and creating the non-player character in the game editing scene; determining interaction properties of the non-player character in response to the property configuration operation for the non-player character; responding to the save operation aiming at the non-player character, and saving the game editing scene created with the non-player character;
the execution module 152 is configured to determine an interaction event of the non-player character based on the interaction attribute in response to the designated scene event being triggered, and control the non-player character to execute the interaction event in the game scene.
The game control device constructs a game editing scene, creates a non-player character in it, and configures interaction attributes for the non-player character; in the game stage, the non-player character in the game scene corresponding to the game editing scene can execute various interaction events. This enriches players' interaction modes, improves the utilization rate of player-edited game scenes, players' enthusiasm for creating game scenes, and the resource utilization rate of the game system, and avoids idling and waste of game service resources.
The execution module is further used for: responding to the interaction operation of the target virtual character and the non-player character, determining at least one interaction event pre-configured by the non-player character, and displaying a trigger control corresponding to the interaction event; the triggering control comprises the following steps: an interaction identifier and an interaction name of the interaction event; and responding to the triggering operation aiming at the target triggering control, and executing the interaction event corresponding to the target triggering control.
The interactive event corresponding to the target trigger control is a scenario dialogue; the execution module is further used for: acquiring a target scenario file corresponding to the interaction event from the appointed storage space; wherein, the target scenario file comprises at least one scenario dialogue; determining a scenario dialogue from a target scenario file, and acquiring dialogue information of the scenario dialogue; wherein, the dialogue information comprises at least part of the following contents: the dialogue sequence of the scenario dialogue in the target scenario file, the dialogue content of the target dialogue, the display mode of the target dialogue, the option information related to the target dialogue and the jump information of the option information; and controlling and displaying the scenario dialogue according to the dialogue information until the scenario dialogue in the target scenario file is displayed.
The execution module is further used for: determining a first scenario dialogue based on the dialogue sequence of scenario dialogues in the target scenario file; or determining the next scenario dialogue based on the jump information corresponding to the selected option information in the current scenario dialogue.
The execution module is further used for: displaying character identifications and/or character names of non-player characters; and displaying dialogue contents of the scenario dialogue according to the display mode of the scenario dialogue.
The device also comprises a game event triggering module for: determining associated option information and jump information of the option information in the scenario dialogue, and displaying the option information and the jump information of the option information; responding to triggering operation aiming at appointed option information in the option information, and triggering a game event corresponding to the appointed option information based on jump information of the appointed option information; wherein the game event includes one or more of: ending dialogue, displaying scenario dialogue outside target dialogue in target scenario file, displaying scenario dialogue outside target scenario file, playing appointed video, executing appointed game task, obtaining appointed game rewards, displaying appointed scene content.
The interaction attributes include: an interaction range; the execution module is further used for: and responding to the movement of the target virtual character into the interaction range of the non-player character, determining the interaction event of the non-player character based on the interaction attribute, and controlling the non-player character to execute the interaction event in the game scene.
The game control device provided by the embodiment of the disclosure has the same technical characteristics as the game control method provided by the embodiment, so that the same technical problems can be solved, and the same technical effects can be achieved.
The present embodiment also provides an electronic device including a processor and a memory storing machine-executable instructions executable by the processor, the processor executing the machine-executable instructions to implement the above-described game editing method and game control method. The electronic device may be a server or a terminal device.
Referring to fig. 16, the electronic device includes a processor 100 and a memory 101, the memory 101 storing machine executable instructions that can be executed by the processor 100, the processor 100 executing the machine executable instructions to implement the above-described game editing method and game control method.
Further, the electronic device shown in fig. 16 further includes a bus 102 and a communication interface 103, and the processor 100, the communication interface 103, and the memory 101 are connected through the bus 102.
The memory 101 may include a high-speed random access memory (RAM, Random Access Memory), and may further include non-volatile memory, such as at least one magnetic disk memory. The communication connection between this system network element and at least one other network element is implemented via at least one communication interface 103 (which may be wired or wireless), and may use the internet, a wide area network, a local area network, a metropolitan area network, etc. Bus 102 may be an ISA bus, a PCI bus, an EISA bus, or the like, and buses may be classified into address buses, data buses, control buses, etc. For ease of illustration, only one bi-directional arrow is shown in fig. 16, but this does not mean that there is only one bus or only one type of bus.
The processor 100 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 100 or by instructions in the form of software. The processor 100 may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU for short), a network processor (Network Processor, NP for short), etc.; it may also be a digital signal processor (Digital Signal Processor, DSP for short), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC for short), a field-programmable gate array (Field-Programmable Gate Array, FPGA for short) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The various methods, steps, and logic blocks disclosed in the embodiments of the disclosure may be implemented or performed. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor, etc. The steps of a method disclosed in connection with the embodiments of the present disclosure may be embodied directly in hardware, in a decoding processor, or in a combination of hardware and software modules in a decoding processor. The software modules may be located in random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, registers, or other storage media well known in the art. The storage medium is located in the memory 101, and the processor 100 reads the information in the memory 101 and, in combination with its hardware, performs the steps of the method of the previous embodiments.
The processor in the electronic device may implement the following operations in the game editing method by executing machine-executable instructions:
responding to a creation instruction of the non-player character, constructing a game editing scene, and creating the non-player character in the game editing scene; determining an interaction attribute of the non-player character in response to an attribute configuration operation for the non-player character; wherein the interaction attribute is used to: be associated with a designated scene event in the game editing scene, so that after the designated scene event is triggered, an interaction event of the non-player character is determined based on the interaction attribute, and the non-player character is controlled to execute the interaction event in a game scene corresponding to the game editing scene; in response to a save operation for the non-player character, saving the game editing scene created with the non-player character.
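Although the disclosure does not prescribe an implementation, the editing flow above can be sketched in Python as follows (all class, field, and event names here are illustrative assumptions, not part of the disclosure): a game editing scene is built, a non-player character is created in it, and an interaction attribute associating a designated scene event with an interaction event is configured on the character.

```python
from dataclasses import dataclass, field


@dataclass
class InteractionAttribute:
    # Interaction events keyed by the designated scene event that triggers them.
    events: dict = field(default_factory=dict)


@dataclass
class NonPlayerCharacter:
    name: str
    interaction: InteractionAttribute = field(default_factory=InteractionAttribute)


@dataclass
class GameEditScene:
    npcs: list = field(default_factory=list)

    def create_npc(self, name: str) -> NonPlayerCharacter:
        # Create the non-player character inside the editing scene.
        npc = NonPlayerCharacter(name)
        self.npcs.append(npc)
        return npc


scene = GameEditScene()                    # construct the game editing scene
npc = scene.create_npc("villager")         # create the NPC in it
# Attribute configuration: when the (assumed) "player_approach" scene event
# fires, the (assumed) "greeting_event" interaction event is executed.
npc.interaction.events["player_approach"] = "greeting_event"
```

Saving the scene would then simply serialize `scene` together with the configured `interaction` attributes of each NPC.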
Responsive to an attribute configuration operation for the non-player character, determining an interactive attribute for the non-player character, comprising: providing a plurality of configuration controls at a graphical user interface; the configuration control comprises: a first configuration control for an interaction event; an interaction event for the non-player character is determined in response to a first configuration operation for the first configuration control.
The first configuration control comprises: adding a control for the interaction event; responsive to a first configuration operation for a first configuration control, determining an interaction event for the non-player character, comprising: responding to the triggering operation of adding a control for an interaction event, configuring a first interaction event for a non-player character, and displaying an event configuration control of the first interaction event; responding to the appointed operation of the event configuration control, and displaying an interactive content selection window; and determining the interactive content of the first interactive event through the interactive content selection window.
The interactive content selection window comprises an alternative content creation control; the step of determining the interactive content of the first interactive event through the interactive content selection window comprises the following steps: responding to a triggering operation for the alternative content creation control, and displaying an interactive content configuration window; determining alternative interactive content of the first interactive event through the interactive content configuration window; displaying the alternative interactive content in the interactive content selection window, and in response to a selection operation for the alternative interactive content, determining the selected alternative interactive content as the interactive content of the first interactive event.
The interactive content configuration window comprises a plurality of preset interactive contents; the step of determining the alternative interactive content of the first interactive event through the interactive content configuration window comprises the following steps: and responding to the selection operation aiming at the plurality of preset interactive contents, and determining the selected preset interactive contents as the alternative interactive contents of the first interactive event.
The interactive content configuration window comprises a preset scenario file that is pre-configured and stored in a specified storage space, wherein the scenario file comprises at least one target scenario dialogue; the step of responding to the selection operation for the plurality of preset interactive contents and determining the selected preset interactive content as the alternative interactive content of the first interactive event comprises: responding to a triggering operation for the target scenario file, acquiring at least one target scenario dialogue included in the target scenario file from the specified storage space, and displaying the at least one target scenario dialogue; and responding to a selection operation for the at least one target scenario dialogue, and determining the selected target scenario dialogue as the alternative interactive content of the first interactive event.
After the steps of responding to the triggering operation for the target scenario file, acquiring at least one target scenario dialogue included in the target scenario file from the specified storage space, and displaying the at least one target scenario dialogue, the method further comprises: displaying a dialogue editing control of the target scenario dialogue; the dialogue editing control is used to provide an editing function for editing at least part of the following: the dialogue sequence of the target scenario dialogue in the target scenario file, the dialogue content of the target scenario dialogue, the display mode of the target scenario dialogue, the option information related to the target scenario dialogue, and the jump information of the option information; and updating the target scenario dialogue in response to an editing operation for the dialogue editing control.
The interactive content configuration window comprises an interactive content creation control; the step of determining the alternative interactive content of the first interactive event through the interactive content configuration window comprises the following steps: responding to the triggering operation of the interactive content creation control, and displaying a content creation window; in response to a content input operation for the content creation window, the created interactive content is determined, and the created interactive content is determined as an alternative interactive content to the first interactive event.
The interactive content is a scenario file; in response to a content input operation for a content creation window, determining the created interactive content, the step of determining the created interactive content as an alternative interactive content to the first interactive event, comprising: displaying scenario creation controls in the content creation window; responding to the creation operation aiming at the scenario creation control, and determining the created new scenario file, at least one new scenario dialogue included in the new scenario file and dialogue information of the new scenario dialogue; in response to a selection operation for at least one new scenario dialogue, the selected new scenario dialogue is determined as an alternative interactive content of the first interactive event.
Responding to the creation operation for the scenario creation control, determining the created new scenario file, the at least one new scenario dialogue included in the new scenario file, and the dialogue information of the new scenario dialogue, comprises: responding to a first creation operation for the scenario creation control, determining the scenario name of the created new scenario file and the number of scenario dialogues, and displaying a dialogue creation control of the new scenario file; responding to a second creation operation for the dialogue creation control, and creating a new scenario dialogue for the new scenario file and the dialogue information of the new scenario dialogue; wherein the virtual character delivering the new scenario dialogue is the non-player character; the dialogue information includes at least part of the following: the dialogue sequence of the new scenario dialogue in the new scenario file, the dialogue content of the new scenario dialogue, the display mode of the new scenario dialogue, the option information related to the new scenario dialogue, and the jump information of the option information.
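The scenario file described above can be modeled, purely as an illustrative assumption (the disclosure does not prescribe any data format, and all field names below are hypothetical), as a named container of ordered dialogues, each carrying dialogue content, a display mode, and optional option information with jump information:

```python
from dataclasses import dataclass, field


@dataclass
class DialogueOption:
    # One selectable option and its jump information, e.g. "end" or "dialogue:2".
    text: str
    jump: str


@dataclass
class ScenarioDialogue:
    order: int                # dialogue sequence within the scenario file
    content: str              # dialogue content
    display_mode: str = "bubble"   # assumed default display mode
    options: list = field(default_factory=list)


@dataclass
class ScenarioFile:
    name: str
    dialogues: list = field(default_factory=list)


# Creating a new scenario file with two dialogues, as in the flow above.
new_file = ScenarioFile("welcome")
new_file.dialogues.append(ScenarioDialogue(1, "Hello, traveler!"))
new_file.dialogues.append(ScenarioDialogue(
    2, "Need anything?",
    options=[DialogueOption("No, thanks.", "end")]))
```

Saving the file to the specified storage space would then amount to serializing `new_file` under its scenario name.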
The option information is used to provide one or more of the following game events based on the skip information of the option information: ending dialogue, displaying other scenario dialogue in the current scenario file, displaying scenario dialogue in scenario files other than the current scenario file, playing appointed video, executing appointed game task, obtaining appointed game rewards, and displaying appointed scene content.
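A minimal dispatcher sketches how jump information might map to the game events listed above (the `kind:argument` string encoding and the returned event names are assumptions made for illustration, not part of the disclosure):

```python
def handle_jump(jump: str) -> str:
    # Split an assumed "kind:argument" encoding, e.g. "dialogue:3" or "end".
    kind, _, arg = jump.partition(":")
    if kind == "end":
        return "end_dialogue"
    if kind == "dialogue":
        return f"show_dialogue_{arg}"        # another dialogue in this file
    if kind == "file":
        return f"open_scenario_{arg}"        # a dialogue in another file
    if kind == "video":
        return f"play_video_{arg}"
    if kind == "task":
        return f"start_task_{arg}"
    if kind == "reward":
        return f"grant_reward_{arg}"
    # Fallback: display designated scene content.
    return "show_scene_content"
```

For example, `handle_jump("end")` yields the end-dialogue event, while `handle_jump("dialogue:3")` would advance to the third dialogue.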
After the step of determining the created new scenario file, the at least one new scenario dialogue included in the new scenario file, and the dialogue information of the new scenario dialogue in response to the creation operation for the scenario creation control, the method further comprises: responding to a save operation for the new scenario file, creating the new scenario file in the specified storage space, and saving the new scenario dialogue and its dialogue information into the new scenario file.
Responsive to a designated operation for the event configuration control, displaying an interactive content selection window, comprising: responding to triggering operation of an event configuration control, and displaying an event editing window aiming at a first interaction event; the event editing window comprises a content adding control for the first interaction event; responding to triggering operation aiming at the content adding control, configuring first interactive content for a first interactive event, and displaying a content configuration control of the first interactive content; wherein, no information is configured in the first interactive content; and responding to the triggering operation of the content configuration control, and displaying an interactive content selection window.
The event editing window comprises an event type configuration control for the first interaction event; the method further comprises the steps of: and responding to the type configuration operation of the event type configuration control, and determining the interaction type of the first interaction event.
The event editing window comprises an interaction name configuration control for the first interaction event; the method further comprises the steps of: and responding to the name configuration operation aiming at the interaction name configuration control, and determining the interaction name of the first interaction event.
The event editing window comprises an interaction identifier configuration control for the first interaction event; the method further comprises the steps of: responding to an identifier configuration operation for the interaction identifier configuration control, and determining the interaction identifier of the first interaction event.
The step of displaying the alternative interactive content in the interactive content selection window comprises the following steps: determining whether the alternative interactive content includes first alternative interactive content that has already been configured in the first interaction event; if yes, displaying the first alternative interactive content in the interactive content selection window in a first display mode; the first display mode is used for indicating that the first alternative interactive content is in a non-selectable state; if not, displaying the first alternative interactive content in the interactive content selection window in a second display mode; the second display mode is used for indicating that the first alternative interactive content is in a selectable state.
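The display-mode decision above reduces to a membership check; a minimal sketch, with the state names assumed for illustration:

```python
def display_mode(content: str, configured_contents: set) -> str:
    # Content already configured in the first interaction event is shown
    # in the first display mode (not selectable again); all other content
    # is shown in the second display mode (selectable).
    return "non_selectable" if content in configured_contents else "selectable"
```

For instance, with `{"greeting"}` already configured, `"greeting"` renders non-selectable while `"farewell"` remains selectable.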
After the steps of configuring the first interaction event for the non-player character and displaying the event configuration control of the first interaction event in response to the triggering operation of the event adding control, the method further comprises: and canceling the first interaction event configured for the non-player character in response to the deletion operation of the event configuration control for the first interaction event, and deleting the event configuration control of the first interaction event.
After the steps of configuring the first interactive content for the first interactive event and displaying the content configuration control of the first interactive content in response to the triggering operation for the content addition control, the method further includes: and responding to the deleting operation of the content configuration control for the first interactive content, canceling the first interactive content configured for the first interactive event, and deleting the content configuration control of the first interactive content.
Responsive to an attribute configuration operation for the non-player character, determining an interactive attribute for the non-player character, comprising: providing a plurality of configuration controls at a graphical user interface; the configuration control comprises: a second configuration control regarding the interaction range; responsive to a second configuration operation for the second configuration control, an interaction range of the non-player character is determined.
The second configuration control comprises: a distance configuration control; responding to the second configuration operation for the second configuration control, determining the interaction range of the non-player character, comprises: responding to a distance configuration operation for the distance configuration control, and determining the interaction distance of the non-player character; determining the interaction range of the non-player character according to the interaction distance and a preset interaction center; wherein the geometric center of the non-player character is the preset interaction center.
The second configuration control comprises: a control is configured in the center; and responding to the center configuration operation aiming at the center configuration control, and adjusting a preset interaction center.
According to the interaction distance and the preset interaction center, determining the interaction range of the non-player character comprises the following steps: and determining a circular area formed by taking the preset interaction center as a circle center and the interaction distance as a radius as the interaction range of the non-player character.
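The circular interaction range above reduces to a Euclidean distance test: a position lies within the range if and only if its distance to the preset interaction center does not exceed the interaction distance (the radius). A minimal sketch in 2D coordinates (function and parameter names are illustrative):

```python
import math


def in_interaction_range(center: tuple, radius: float, position: tuple) -> bool:
    # Inside the circle iff distance(center, position) <= radius.
    dx = position[0] - center[0]
    dy = position[1] - center[1]
    return math.hypot(dx, dy) <= radius


# NPC centered at (0, 0) with an interaction distance of 5:
# (3, 4) lies exactly on the boundary (distance 5), so it is in range;
# (4, 4) lies outside (distance about 5.66).
```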
The method further comprises the following steps: providing a first interactive control through the graphical user interface; responding to a triggering operation for the first interactive control, and controlling generation of game scene information corresponding to the game editing scene; the game scene information comprises component information of edited components in the game editing scene, wherein the edited components are those editable components, among the plurality of editable components included in the editing window, that have responded to editing operations; controlling transmission of the game scene information to a server; the server is configured to be in communication connection with a terminal device, the terminal device is configured with a game program, and the terminal device is configured to acquire the game scene information from the server and generate a corresponding game scene according to the game scene information through the game program.
The processor in the electronic device may also implement the following operations in the above game control method by executing machine-executable instructions:
Displaying a game scene on a graphical user interface; the game scene is edited in advance by the following modes: responding to a creation instruction of the non-player character, constructing a game editing scene, and creating the non-player character in the game editing scene; determining interaction properties of the non-player character in response to the property configuration operation for the non-player character; responding to the save operation aiming at the non-player character, and saving the game editing scene created with the non-player character; in response to the specified scene event being triggered, an interaction event for the non-player character is determined based on the interaction attribute, and the non-player character is controlled to perform the interaction event in the game scene.
Responsive to the specified scene event being triggered, determining an interaction event for the non-player character based on the interaction attribute, the step of controlling the non-player character to perform the interaction event in the game scene, comprising: responding to the interaction operation of the target virtual character and the non-player character, determining at least one interaction event pre-configured by the non-player character, and displaying a trigger control corresponding to the interaction event; the triggering control comprises the following steps: an interaction identifier and an interaction name of the interaction event; and responding to the triggering operation aiming at the target triggering control, and executing the interaction event corresponding to the target triggering control.
The interaction event corresponding to the target trigger control is a scenario dialogue; the step of executing the interaction event corresponding to the target trigger control comprises the following steps: acquiring a target scenario file corresponding to the interaction event from the specified storage space; wherein the target scenario file comprises at least one scenario dialogue; determining a scenario dialogue from the target scenario file, and acquiring dialogue information of the scenario dialogue; wherein the dialogue information comprises at least part of the following contents: the dialogue sequence of the scenario dialogue in the target scenario file, the dialogue content of the scenario dialogue, the display mode of the scenario dialogue, the option information related to the scenario dialogue, and the jump information of the option information; and controlling display of the scenario dialogue according to the dialogue information until all scenario dialogues in the target scenario file have been displayed.
The step of determining a scenario dialogue from a target scenario file comprises: determining a first scenario dialogue based on the dialogue sequence of scenario dialogues in the target scenario file; or determining the next scenario dialogue based on the jump information corresponding to the selected option information in the current scenario dialogue.
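The two selection paths above (by dialogue sequence, or by the jump information of a selected option) can be sketched as follows; the dictionary keys and the `dialogue:N` jump encoding are illustrative assumptions:

```python
def next_dialogue(dialogues: list, jump: str = None) -> dict:
    # Dialogues carry an "order" field giving their sequence in the file.
    ordered = sorted(dialogues, key=lambda d: d["order"])
    if jump is None:
        # Path 1: determine the first scenario dialogue by dialogue sequence.
        return ordered[0]
    # Path 2: follow the jump information of the selected option,
    # assumed to be encoded as "dialogue:<order>".
    target = int(jump.split(":", 1)[1])
    return next(d for d in ordered if d["order"] == target)


dialogues = [
    {"order": 2, "content": "B"},
    {"order": 1, "content": "A"},
]
```

Calling `next_dialogue(dialogues)` starts from the first dialogue in sequence, while `next_dialogue(dialogues, "dialogue:2")` jumps directly to the second.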
According to the dialogue information, controlling the step of displaying the scenario dialogue, comprising the following steps: displaying character identifications and/or character names of non-player characters; and displaying dialogue contents of the scenario dialogue according to the display mode of the scenario dialogue.
After the step of displaying the dialogue content of the scenario dialogue according to the display mode of the scenario dialogue, the method further comprises the following steps: determining the option information associated with the scenario dialogue and the jump information of the option information, and displaying the option information and the jump information of the option information; responding to a triggering operation for designated option information in the option information, and triggering a game event corresponding to the designated option information based on the jump information of the designated option information; wherein the game event includes one or more of: ending the dialogue, displaying scenario dialogues other than the target dialogue in the target scenario file, displaying scenario dialogues in scenario files other than the target scenario file, playing a designated video, executing a designated game task, obtaining a designated game reward, and displaying designated scene content.
The interaction attributes include: an interaction range; responsive to the specified scene event being triggered, determining an interaction event for the non-player character based on the interaction attribute, the step of controlling the non-player character to perform the interaction event in the game scene, comprising: and responding to the movement of the target virtual character into the interaction range of the non-player character, determining the interaction event of the non-player character based on the interaction attribute, and controlling the non-player character to execute the interaction event in the game scene.
In the above manner, a game editing scene is constructed, a non-player character is created in the game editing scene, and an interaction attribute is configured for the non-player character, so that in the game stage the non-player character can execute various interaction events in the game scene corresponding to the game editing scene. This enriches the interaction modes available to players, improves the utilization rate of player-edited game scenes, the players' enthusiasm for creating game scenes, and the resource utilization rate of the game system, and avoids idling and waste of game service resources.
The present embodiment also provides a machine-readable storage medium storing machine-executable instructions that, when invoked and executed by a processor, cause the processor to implement the above-described game editing method and game control method.
When invoked and executed by a processor, the machine-executable instructions stored on the machine-readable storage medium may implement the following operations in the game editing method:
responding to a creation instruction of the non-player character, constructing a game editing scene, and creating the non-player character in the game editing scene; determining an interaction attribute of the non-player character in response to an attribute configuration operation for the non-player character; wherein the interaction attribute is used to: be associated with a designated scene event in the game editing scene, so that after the designated scene event is triggered, an interaction event of the non-player character is determined based on the interaction attribute, and the non-player character is controlled to execute the interaction event in a game scene corresponding to the game editing scene; in response to a save operation for the non-player character, saving the game editing scene created with the non-player character.
Responsive to an attribute configuration operation for the non-player character, determining an interactive attribute for the non-player character, comprising: providing a plurality of configuration controls at a graphical user interface; the configuration control comprises: a first configuration control for an interaction event; an interaction event for the non-player character is determined in response to a first configuration operation for the first configuration control.
The first configuration control comprises: adding a control for the interaction event; responsive to a first configuration operation for a first configuration control, determining an interaction event for the non-player character, comprising: responding to the triggering operation of adding a control for an interaction event, configuring a first interaction event for a non-player character, and displaying an event configuration control of the first interaction event; responding to the appointed operation of the event configuration control, and displaying an interactive content selection window; and determining the interactive content of the first interactive event through the interactive content selection window.
The interactive content selection window comprises an alternative content creation control; the step of determining the interactive content of the first interactive event through the interactive content selection window comprises the following steps: responding to a triggering operation for the alternative content creation control, and displaying an interactive content configuration window; determining alternative interactive content of the first interactive event through the interactive content configuration window; displaying the alternative interactive content in the interactive content selection window, and in response to a selection operation for the alternative interactive content, determining the selected alternative interactive content as the interactive content of the first interactive event.
The interactive content configuration window comprises a plurality of preset interactive contents; the step of determining the alternative interactive content of the first interactive event through the interactive content configuration window comprises the following steps: and responding to the selection operation aiming at the plurality of preset interactive contents, and determining the selected preset interactive contents as the alternative interactive contents of the first interactive event.
The interactive content configuration window comprises a preset scenario file that is pre-configured and stored in a specified storage space, wherein the scenario file comprises at least one target scenario dialogue; the step of responding to the selection operation for the plurality of preset interactive contents and determining the selected preset interactive content as the alternative interactive content of the first interactive event comprises: responding to a triggering operation for the target scenario file, acquiring at least one target scenario dialogue included in the target scenario file from the specified storage space, and displaying the at least one target scenario dialogue; and responding to a selection operation for the at least one target scenario dialogue, and determining the selected target scenario dialogue as the alternative interactive content of the first interactive event.
After the steps of responding to the triggering operation for the target scenario file, acquiring at least one target scenario dialogue included in the target scenario file from the specified storage space, and displaying the at least one target scenario dialogue, the method further comprises: displaying a dialogue editing control of the target scenario dialogue; the dialogue editing control is used to provide an editing function for editing at least part of the following: the dialogue sequence of the target scenario dialogue in the target scenario file, the dialogue content of the target scenario dialogue, the display mode of the target scenario dialogue, the option information related to the target scenario dialogue, and the jump information of the option information; and updating the target scenario dialogue in response to an editing operation for the dialogue editing control.
The interactive content configuration window comprises an interactive content creation control; the step of determining the alternative interactive content of the first interactive event through the interactive content configuration window comprises the following steps: responding to the triggering operation of the interactive content creation control, and displaying a content creation window; in response to a content input operation for the content creation window, the created interactive content is determined, and the created interactive content is determined as an alternative interactive content to the first interactive event.
The interactive content is a scenario file; in response to a content input operation for a content creation window, determining the created interactive content, the step of determining the created interactive content as an alternative interactive content to the first interactive event, comprising: displaying scenario creation controls in the content creation window; responding to the creation operation aiming at the scenario creation control, and determining the created new scenario file, at least one new scenario dialogue included in the new scenario file and dialogue information of the new scenario dialogue; in response to a selection operation for at least one new scenario dialogue, the selected new scenario dialogue is determined as an alternative interactive content of the first interactive event.
Responding to the creation operation for the scenario creation control, determining the created new scenario file, the at least one new scenario dialogue included in the new scenario file, and the dialogue information of the new scenario dialogue, comprises: responding to a first creation operation for the scenario creation control, determining the scenario name of the created new scenario file and the number of scenario dialogues, and displaying a dialogue creation control of the new scenario file; responding to a second creation operation for the dialogue creation control, and creating a new scenario dialogue for the new scenario file and the dialogue information of the new scenario dialogue; wherein the virtual character delivering the new scenario dialogue is the non-player character; the dialogue information includes at least part of the following: the dialogue sequence of the new scenario dialogue in the new scenario file, the dialogue content of the new scenario dialogue, the display mode of the new scenario dialogue, the option information related to the new scenario dialogue, and the jump information of the option information.
The option information is used to provide one or more of the following game events based on the skip information of the option information: ending dialogue, displaying other scenario dialogue in the current scenario file, displaying scenario dialogue in scenario files other than the current scenario file, playing appointed video, executing appointed game task, obtaining appointed game rewards, and displaying appointed scene content.
After the step of determining the created new scenario file, the at least one new scenario dialogue included in the new scenario file, and the dialogue information of the new scenario dialogue in response to the creation operation for the scenario creation control, the method further comprises: responding to a save operation for the new scenario file, creating the new scenario file in the specified storage space, and saving the new scenario dialogue and its dialogue information into the new scenario file.
Responsive to a designated operation for the event configuration control, displaying an interactive content selection window, comprising: responding to triggering operation of an event configuration control, and displaying an event editing window aiming at a first interaction event; the event editing window comprises a content adding control for the first interaction event; responding to triggering operation aiming at the content adding control, configuring first interactive content for a first interactive event, and displaying a content configuration control of the first interactive content; wherein, no information is configured in the first interactive content; and responding to the triggering operation of the content configuration control, and displaying an interactive content selection window.
The event editing window comprises an event type configuration control for the first interaction event; the method further comprises the steps of: and responding to the type configuration operation of the event type configuration control, and determining the interaction type of the first interaction event.
The event editing window comprises an interaction name configuration control for the first interaction event; the method further comprises the steps of: and responding to the name configuration operation aiming at the interaction name configuration control, and determining the interaction name of the first interaction event.
The event editing window comprises an interaction identifier configuration control for the first interaction event; the method further comprises the steps of: responding to an identifier configuration operation for the interaction identifier configuration control, and determining the interaction identifier of the first interaction event.
The step of displaying the alternative interactive contents in the interactive content selection window comprises the following steps: determining, for a first alternative interactive content among the alternative interactive contents, whether the first alternative interactive content has been configured in the first interaction event; if yes, displaying the first alternative interactive content in the interactive content selection window in a first display mode; the first display mode is used for indicating that the first alternative interactive content is in an unselected state; if not, displaying the first alternative interactive content in the interactive content selection window in a second display mode; the second display mode is used for indicating that the first alternative interactive content is in a selectable state.
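A minimal sketch of this display-mode decision, assuming alternative interactive contents are identified by ids and the contents already configured in the first interaction event are tracked as a set (all names illustrative, not from the disclosure):

```python
def display_modes(alternative_contents, configured_ids):
    # First display mode: the content is already configured in the first
    # interaction event; second display mode: the content is selectable.
    return {
        content_id: ("first" if content_id in configured_ids else "second")
        for content_id in alternative_contents
    }
```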
After the steps of configuring the first interaction event for the non-player character and displaying the event configuration control of the first interaction event in response to the triggering operation of the event adding control, the method further comprises: and canceling the first interaction event configured for the non-player character in response to the deletion operation of the event configuration control for the first interaction event, and deleting the event configuration control of the first interaction event.
After the steps of configuring the first interactive content for the first interactive event and displaying the content configuration control of the first interactive content in response to the triggering operation for the content addition control, the method further includes: and responding to the deleting operation of the content configuration control for the first interactive content, canceling the first interactive content configured for the first interactive event, and deleting the content configuration control of the first interactive content.
Responsive to an attribute configuration operation for the non-player character, determining an interactive attribute for the non-player character, comprising: providing a plurality of configuration controls at a graphical user interface; the configuration control comprises: a second configuration control regarding the interaction range; responsive to a second configuration operation for the second configuration control, an interaction range of the non-player character is determined.
The second configuration control comprises: a distance configuration control; responsive to a second configuration operation for the second configuration control, determining the interaction range of the non-player character, comprising: responding to the distance configuration operation for the distance configuration control, and determining the interaction distance of the non-player character; determining the interaction range of the non-player character according to the interaction distance and a preset interaction center; the geometric center of the non-player character is the preset interaction center.
The second configuration control comprises: a center configuration control; and responding to the center configuration operation for the center configuration control, and adjusting the preset interaction center.
According to the interaction distance and the preset interaction center, determining the interaction range of the non-player character comprises the following steps: and determining a circular area formed by taking the preset interaction center as a circle center and the interaction distance as a radius as the interaction range of the non-player character.
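A minimal sketch of this circular range check, assuming 2D coordinates (the function and parameter names are illustrative; the disclosure does not specify an implementation):

```python
import math

def in_interaction_range(preset_center, interaction_distance, position):
    # The interaction range is the circular area whose center is the
    # preset interaction center and whose radius is the interaction
    # distance; a position is in range when its distance from the
    # center does not exceed the radius.
    dx = position[0] - preset_center[0]
    dy = position[1] - preset_center[1]
    return math.hypot(dx, dy) <= interaction_distance
```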
The method further comprises the following steps: providing a first interactive control through a graphical user interface; responding to triggering operation aiming at the first interaction control, and controlling and generating game scene information corresponding to the game editing scene; the game scene information comprises component information of edited components in a game editing scene, wherein the edited components are editing components responding to editing operations in a plurality of editable components included in an editing window; controlling to transmit game scene information to a server; the server is configured to be in communication connection with the terminal equipment, the terminal equipment is configured with a game program, the terminal equipment is configured to acquire game scene information from the server, and a corresponding game scene is generated according to the game scene information through the game program.
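Purely as an illustration of the data flow (the component structure and the transport are assumptions, not specified by the disclosure), collecting the component information of edited components and handing it to a transport toward the server might be sketched as:

```python
import json

def build_scene_info(edited_components):
    # Keep only the component information of the components that
    # actually received edit operations in the editing window.
    return {"components": [
        {"id": c["id"], "type": c["type"], "transform": c["transform"]}
        for c in edited_components
    ]}

def upload_scene_info(scene_info, send):
    # `send` stands in for the transport toward the server; the terminal
    # device later fetches this payload from the server and rebuilds the
    # corresponding game scene through its game program.
    send(json.dumps(scene_info))
```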
The game control method provided in the embodiments of the present disclosure may further implement the following operations:
displaying a game scene on a graphical user interface; the game scene is edited in advance by the following modes: responding to a creation instruction of the non-player character, constructing a game editing scene, and creating the non-player character in the game editing scene; determining interaction properties of the non-player character in response to the property configuration operation for the non-player character; responding to the save operation aiming at the non-player character, and saving the game editing scene created with the non-player character; in response to the specified scene event being triggered, an interaction event for the non-player character is determined based on the interaction attribute, and the non-player character is controlled to perform the interaction event in the game scene.
Responsive to the specified scene event being triggered, determining an interaction event for the non-player character based on the interaction attribute, the step of controlling the non-player character to perform the interaction event in the game scene, comprising: responding to the interaction operation of the target virtual character and the non-player character, determining at least one interaction event pre-configured by the non-player character, and displaying a trigger control corresponding to the interaction event; the triggering control comprises the following steps: an interaction identifier and an interaction name of the interaction event; and responding to the triggering operation aiming at the target triggering control, and executing the interaction event corresponding to the target triggering control.
The interactive event corresponding to the target trigger control is a scenario dialogue; the step of executing the interaction event corresponding to the target trigger control comprises the following steps: acquiring a target scenario file corresponding to the interaction event from the designated storage space; wherein the target scenario file comprises at least one scenario dialogue; determining a scenario dialogue from the target scenario file, and acquiring dialogue information of the scenario dialogue; wherein the dialogue information comprises at least part of the following contents: the dialogue sequence of the scenario dialogue in the target scenario file, the dialogue content of the target dialogue, the display mode of the target dialogue, the option information related to the target dialogue, and the jump information of the option information; and controlling and displaying the scenario dialogue according to the dialogue information until all the scenario dialogues in the target scenario file have been displayed.
The step of determining a scenario dialogue from a target scenario file comprises: determining a first scenario dialogue based on the dialogue sequence of scenario dialogues in the target scenario file; or determining the next scenario dialogue based on the jump information corresponding to the selected option information in the current scenario dialogue.
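A minimal sketch of these two selection rules, assuming each scenario dialogue carries an order field and an option's jump information names the order of the next dialogue (all field names are illustrative assumptions):

```python
def first_dialogue(scenario_file):
    # The first scenario dialogue is the one with the smallest order
    # value in the target scenario file's dialogue sequence.
    return min(scenario_file["dialogues"], key=lambda d: d["order"])

def next_dialogue(scenario_file, selected_option):
    # The selected option's jump information names the order of the
    # next scenario dialogue to display; None ends the conversation.
    target = selected_option.get("jump_to")
    if target is None:
        return None
    return next(d for d in scenario_file["dialogues"] if d["order"] == target)
```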
According to the dialogue information, controlling the step of displaying the scenario dialogue, comprising the following steps: displaying character identifications and/or character names of non-player characters; and displaying dialogue contents of the scenario dialogue according to the display mode of the scenario dialogue.
After the step of displaying the dialogue content of the scenario dialogue according to the display mode of the scenario dialogue, the method further comprises the following steps: determining that associated option information and jump information of the option information exist in the scenario dialogue, and displaying the option information and the jump information of the option information; responding to a triggering operation for specified option information in the option information, and triggering a game event corresponding to the specified option information based on the jump information of the specified option information; wherein the game event includes one or more of: ending the dialogue, displaying a scenario dialogue other than the target dialogue in the target scenario file, displaying a scenario dialogue outside the target scenario file, playing a specified video, executing a specified game task, obtaining a specified game reward, and displaying specified scene content.
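A minimal dispatch sketch for these option-triggered game events (the handler names and payload fields are illustrative placeholders; the disclosure does not specify an implementation):

```python
def trigger_option(jump_info):
    # Map an option's jump information onto one of the game events
    # enumerated in the description; the handlers here only return a
    # marker string in place of real game logic.
    handlers = {
        "end_dialogue": lambda info: "dialogue ended",
        "show_dialogue": lambda info: f"show dialogue {info['target']}",
        "play_video": lambda info: f"play {info['video']}",
        "grant_reward": lambda info: f"grant {info['reward']}",
    }
    return handlers[jump_info["event"]](jump_info)
```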
The interaction attributes include: an interaction range; responsive to the specified scene event being triggered, determining an interaction event for the non-player character based on the interaction attribute, the step of controlling the non-player character to perform the interaction event in the game scene, comprising: and responding to the movement of the target virtual character into the interaction range of the non-player character, determining the interaction event of the non-player character based on the interaction attribute, and controlling the non-player character to execute the interaction event in the game scene.
In the above manner, a game editing scene is constructed, a non-player character is created in the game editing scene, and an interaction attribute is configured for the non-player character, so that the non-player character can execute various interaction events in the game scene corresponding to the game editing scene during the game stage. This enriches the interaction modes available to players, improves the utilization rate of player-edited game scenes, the enthusiasm of players for creating game scenes, and the resource utilization rate of the game system, and avoids idling and waste of game service resources.
The computer program product of the game editing method, the game control device, and the game system provided in the embodiments of the present disclosure includes a computer-readable storage medium storing program code. The instructions included in the program code may be used to execute the methods described in the foregoing method embodiments; for specific implementation, reference may be made to the method embodiments, which are not repeated herein.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system and apparatus may refer to corresponding procedures in the foregoing method embodiments, which are not described herein again.
In addition, in the description of the embodiments of the present disclosure, unless explicitly stated and limited otherwise, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the terms in this disclosure will be understood by those skilled in the art in the specific case.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present disclosure, in essence, the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present disclosure. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In the description of the present disclosure, it should be noted that the directions or positional relationships indicated by the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are based on the directions or positional relationships shown in the drawings, are merely for convenience of describing the present disclosure and simplifying the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present disclosure. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the foregoing embodiments are merely specific implementations of the present disclosure, used to illustrate rather than limit its technical solutions, and the protection scope of the present disclosure is not limited thereto. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that any person familiar with the art may, within the technical scope disclosed herein, still modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions for some of the technical features thereof; such modifications, changes, or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present disclosure, and shall all be covered by the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (35)

1. A game editing method, the method comprising:
responding to a creation instruction of the non-player character, constructing a game editing scene, and creating the non-player character in the game editing scene;
determining an interaction attribute of the non-player character in response to an attribute configuration operation for the non-player character; wherein the interaction attribute is for: the interaction event of the non-player character is determined based on the interaction attribute after the specified scene event is triggered, and the non-player character is controlled to execute the interaction event in a game scene corresponding to the game editing scene;
And saving the game editing scene created with the non-player character in response to a save operation for the non-player character.
2. The method of claim 1, wherein the step of determining the interaction attribute of the non-player character in response to an attribute configuration operation for the non-player character comprises:
providing a plurality of configuration controls at a graphical user interface; the configuration control comprises: a first configuration control for the interaction event;
responsive to a first configuration operation for the first configuration control, an interaction event for the non-player character is determined.
3. The method of claim 2, wherein the first configuration control comprises: an interaction event addition control;
responsive to a first configuration operation for the first configuration control, determining an interaction event for the non-player character, comprising:
responding to the triggering operation for the interaction event addition control, configuring a first interaction event for the non-player character, and displaying an event configuration control of the first interaction event;
responding to the appointed operation of the event configuration control, and displaying an interactive content selection window;
And determining the interactive content of the first interactive event through the interactive content selection window.
4. The method of claim 3, wherein the interactive content selection window comprises an alternate content creation control;
the step of determining the interactive content of the first interactive event through the interactive content selection window comprises the following steps:
responding to the triggering operation for the alternative content creation control, and displaying an interactive content configuration window;
determining alternative interactive contents of the first interactive event through the interactive content configuration window;
displaying the alternative interactive content in the interactive content selection window, and responding to the selected operation aiming at the alternative interactive content, and determining the selected alternative interactive content as the interactive content of the first interactive event.
5. The method of claim 4, wherein the interactive content configuration window comprises a plurality of preset interactive contents;
the step of determining the alternative interactive content of the first interactive event through the interactive content configuration window comprises the following steps:
and responding to the selection operation of the plurality of preset interactive contents, and determining the selected preset interactive contents as the alternative interactive contents of the first interactive event.
6. The method of claim 5, wherein the interactive content configuration window includes preset interactive content that is a scenario file that is preconfigured and stored in a designated storage space, and the scenario file includes at least one scenario dialogue;
responsive to a selection operation for the plurality of preset interactive contents, determining the selected preset interactive contents as alternative interactive contents of the first interactive event, including:
responding to a triggering operation for a target scenario file, acquiring at least one target scenario dialogue included in the target scenario file from the designated storage space, and displaying the at least one target scenario dialogue;
and responding to the selection operation for the at least one target scenario dialogue, and determining the selected target scenario dialogue as the alternative interaction content of the first interaction event.
7. The method of claim 6, wherein, in response to a trigger operation for a target scenario file, after the steps of acquiring at least one target scenario dialogue included in the target scenario file from the designated storage space and displaying the at least one target scenario dialogue, the method further comprises:
Displaying a dialogue editing control of the target scenario dialogue; the dialog edit control is used to provide editing functionality for editing at least part of the following: the dialogue sequence of the target scenario dialogue in the target scenario file, the dialogue content of the target scenario dialogue, the display mode of the target scenario dialogue, the option information related to the target scenario dialogue and the jump information of the option information;
and responding to the editing operation aiming at the dialogue editing control, and updating the target scenario dialogue.
8. The method of claim 4, wherein the interactive content configuration window comprises an interactive content creation control;
the step of determining the alternative interactive content of the first interactive event through the interactive content configuration window comprises the following steps:
responding to the triggering operation of the interactive content creation control, and displaying a content creation window;
and responding to the content input operation aiming at the content creation window, determining the created interactive content, and determining the created interactive content as the alternative interactive content of the first interactive event.
9. The method of claim 8, wherein the interactive content is a scenario file;
The step of determining the created interactive content as the alternative interactive content of the first interactive event in response to the content input operation for the content creation window includes:
displaying a scenario creation control in the content creation window;
responding to the creation operation aiming at the scenario creation control, and determining a created new scenario file, at least one new scenario dialogue included in the new scenario file and dialogue information of the new scenario dialogue;
and responding to the selection operation of the at least one new scenario dialogue, and determining the selected new scenario dialogue as the alternative interaction content of the first interaction event.
10. The method of claim 9, wherein the step of determining the created new scenario file, at least one new scenario dialog included in the new scenario file, and dialog information of the new scenario dialog in response to a creation operation for the scenario creation control, comprises:
responding to a first creation operation aiming at the scenario creation control, determining the scenario name of the created new scenario file and the number of dialogs of the scenario dialog, and displaying the dialog creation control of the new scenario file;
Responding to a second creation operation aiming at the dialogue creation control, creating a new scenario dialogue for the new scenario file and dialogue information of the new scenario dialogue; wherein, the virtual character sending out the new scenario dialogue is the non-player character; the dialogue information includes at least part of the following: the conversation sequence of the new scenario conversation in the new scenario file, the conversation content of the new scenario conversation, the display mode of the new scenario conversation, the option information related to the new scenario conversation and the jump information of the option information.
11. The method according to claim 7 or 10, wherein the option information is used to provide one or more of the following game events based on the skip information of the option information: ending dialogue, displaying other scenario dialogue in the current scenario file, displaying scenario dialogue in scenario files other than the current scenario file, playing appointed video, executing appointed game task, acquiring appointed game rewards, and displaying appointed scene content.
12. The method of claim 9, wherein after the step of determining a new scenario file created, at least one new scenario dialogue included in the new scenario file, and dialogue information of the new scenario dialogue in response to a creation operation for the scenario creation control, the method further comprises:
And responding to the preservation operation aiming at the new scenario file, creating the new scenario file in a designated storage space, and preserving the new scenario dialogue and the new scenario dialogue information into the new scenario file.
13. A method according to claim 3, wherein the step of displaying an interactive content selection window in response to a designated operation for the event configuration control comprises:
responding to the triggering operation of the event configuration control, and displaying an event editing window for the first interaction event; the event editing window comprises a content adding control for the first interactive event;
responding to the triggering operation of the content adding control, configuring first interactive content for the first interactive event, and displaying a content configuration control of the first interactive content; wherein, the first interactive content is not configured with information;
and responding to the triggering operation of the content configuration control, and displaying an interactive content selection window.
14. The method of claim 13, wherein the event editing window includes an event type configuration control for the first interactive event;
the method further comprises the steps of:
And responding to the type configuration operation aiming at the event type configuration control, and determining the interaction type of the first interaction event.
15. The method of claim 13, wherein the event editing window includes an interaction name configuration control for the first interaction event;
the method further comprises the steps of:
and responding to the name configuration operation aiming at the interaction name configuration control, and determining the interaction name of the first interaction event.
16. The method of claim 13, wherein the event editing window includes an interactive identification configuration control for the first interactive event;
the method further comprises the steps of:
and responding to the identification configuration operation aiming at the interaction identification configuration control, and determining the interaction identification of the first interaction event.
17. The method of claim 4, wherein displaying the alternative interactive content in the interactive content selection window comprises:
determining, for a first alternative interactive content among the alternative interactive content, whether the first alternative interactive content has been configured in the first interactive event;
if yes, displaying the first alternative interactive content in the interactive content selection window in a first display mode; the first display mode is used for indicating that the first alternative interactive content is in an unselected state;
If not, displaying the first alternative interactive content in the interactive content selection window in a second display mode; the second display mode is used for indicating that the first alternative interactive content is in a selectable state.
18. The method of claim 3, wherein, in response to a triggering operation for the event addition control, after the steps of configuring a first interactive event for the non-player character and displaying an event configuration control for the first interactive event, the method further comprises:
and canceling the first interaction event configured for the non-player character in response to the deletion operation of the event configuration control for the first interaction event, and deleting the event configuration control for the first interaction event.
19. The method of claim 13, wherein, in response to the triggering operation for the content addition control, configuring first interactive content for the first interactive event and displaying the content configuration control for the first interactive content, the method further comprises:
and canceling the first interactive content configured for the first interactive event and deleting the content configuration control of the first interactive content in response to the deleting operation of the content configuration control of the first interactive content.
20. The method of claim 1, wherein the step of determining the interaction attribute of the non-player character in response to an attribute configuration operation for the non-player character comprises:
providing a plurality of configuration controls at a graphical user interface; the configuration control comprises: a second configuration control regarding the interaction range;
and responding to a second configuration operation aiming at the second configuration control, and determining the interaction range of the non-player character.
21. The method of claim 20, wherein the second configuration control comprises: a distance configuration control;
responsive to a second configuration operation for the second configuration control, determining an interaction range of the non-player character, comprising:
responding to the distance configuration operation aiming at the distance configuration control, and determining the interaction distance of the non-player character;
determining the interaction range of the non-player character according to the interaction distance and a preset interaction center; the geometric center of the non-player character is the preset interaction center.
22. The method of claim 21, wherein the second configuration control comprises: a center configuration control; the method further comprises the steps of:
And responding to the center configuration operation aiming at the center configuration control, and adjusting the preset interaction center.
23. The method of claim 21, wherein the step of determining the interaction range of the non-player character according to the interaction distance and the preset interaction center comprises:
and determining a circular area formed by taking the preset interaction center as a circle center and the interaction distance as a radius as an interaction range of the non-player character.
24. The method according to claim 1, wherein the method further comprises:
providing a first interactive control through a graphical user interface;
responding to triggering operation aiming at a first interaction control, and controlling and generating game scene information corresponding to the game editing scene; the game scene information comprises component information of edited components in the game editing scene, wherein the edited components are editing components responding to editing operations in a plurality of editable components included in an editing window;
controlling to transmit the game scene information to a server; the server is configured to be in communication connection with the terminal equipment, the terminal equipment is configured with a game program, the terminal equipment is configured to acquire the game scene information from the server, and a corresponding game scene is generated according to the game scene information through the game program.
25. A game control method, the method comprising:
displaying a game scene on a graphical user interface; the game scene is edited in advance by the following modes: responding to a creation instruction of the non-player character, constructing a game editing scene, and creating the non-player character in the game editing scene; determining an interaction attribute of the non-player character in response to an attribute configuration operation for the non-player character; storing the game editing scene created with the non-player character in response to a save operation for the non-player character;
responsive to a specified scene event being triggered, determining an interaction event for the non-player character based on the interaction attribute, controlling the non-player character to execute the interaction event in the game scene.
26. The method of claim 25, wherein responsive to a specified scene event being triggered, determining an interaction event for the non-player character based on the interaction attribute, the step of controlling the non-player character to perform the interaction event in the game scene comprises:
responding to the interaction operation of the target virtual character and the non-player character, determining at least one interaction event preconfigured by the non-player character, and displaying a trigger control corresponding to the interaction event; the trigger control comprises: the interaction identification and the interaction name of the interaction event;
And responding to the triggering operation aiming at the target triggering control, and executing the interaction event corresponding to the target triggering control.
27. The method of claim 26, wherein the interaction event corresponding to the target trigger control is a scenario dialogue;
the step of executing the interaction event corresponding to the target trigger control comprises:
acquiring a target scenario file corresponding to the interaction event from a designated storage space; wherein the target scenario file comprises at least one scenario dialogue;
determining a scenario dialogue from the target scenario file and acquiring dialogue information of the scenario dialogue; wherein the dialogue information comprises at least some of the following: the dialogue order of the scenario dialogue in the target scenario file, the dialogue content of the scenario dialogue, the display mode of the scenario dialogue, option information associated with the scenario dialogue, and jump information for the option information;
and controlling display of the scenario dialogue according to the dialogue information, until the scenario dialogues in the target scenario file have been displayed.
28. The method of claim 27, wherein the step of determining a scenario dialog from the target scenario file comprises:
determining the first scenario dialogue based on the dialogue order of the scenario dialogues in the target scenario file;
or determining the next scenario dialogue based on the jump information corresponding to the option information selected in the current scenario dialogue.
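Claims 27–28 describe a scenario file whose dialogues carry an order and, optionally, options with jump information. A minimal sketch of both selection rules, with a hypothetical file layout (the claims do not fix any format):

```python
# Hypothetical scenario file: dialogues keyed by id, each with an order
# field and options mapping an option label to its jump target.
scenario_file = {
    "d1": {"order": 1, "content": "Welcome!",
           "options": {"Ask": "d2", "Leave": None}},
    "d2": {"order": 2, "content": "It's dangerous outside.", "options": {}},
}

def first_dialogue(scenario):
    # Rule 1: the first scenario dialogue is determined by dialogue order.
    return min(scenario, key=lambda k: scenario[k]["order"])

def next_dialogue(scenario, current_id, chosen_option):
    # Rule 2: the next dialogue follows the jump info of the selected option.
    return scenario[current_id]["options"].get(chosen_option)
```

Here `first_dialogue(scenario_file)` is `"d1"`, choosing `"Ask"` jumps to `"d2"`, and choosing `"Leave"` yields `None` (no further dialogue).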
29. The method of claim 27, wherein the step of controlling display of the scenario dialogue according to the dialogue information comprises:
displaying a character identifier and/or a character name of the non-player character;
and displaying the dialogue content of the scenario dialogue according to the display mode of the scenario dialogue.
30. The method of claim 29, wherein after the step of displaying the dialogue content of the scenario dialogue in accordance with the display mode of the scenario dialogue, the method further comprises:
when it is determined that the scenario dialogue has associated option information and jump information for the option information, displaying the option information and the jump information of the option information;
and, in response to a trigger operation on specified option information among the option information, triggering a game event corresponding to the specified option information based on the jump information of the specified option information; wherein the game event comprises one or more of: ending the dialogue, displaying a scenario dialogue other than the target dialogue in the target scenario file, displaying a scenario dialogue outside the target scenario file, playing a specified video, executing a specified game task, acquiring a specified game reward, and displaying specified scene content.
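The jump-info dispatch of claim 30 can be sketched as a single switch over the enumerated game events. The `kind` labels and return strings below are hypothetical stand-ins; only the set of event types follows the claim's enumeration.

```python
# Hypothetical dispatch: each option's jump info names one game event.
def trigger_game_event(jump_info):
    kind = jump_info["kind"]
    if kind == "end_dialogue":
        return "dialogue ended"
    if kind == "goto_dialogue":        # another dialogue in the same file
        return f"show dialogue {jump_info['target']}"
    if kind == "goto_file":            # a dialogue outside the target file
        return f"open file {jump_info['target']}"
    if kind == "play_video":
        return f"play {jump_info['target']}"
    if kind == "start_task":
        return f"start task {jump_info['target']}"
    if kind == "grant_reward":
        return f"grant {jump_info['target']}"
    if kind == "show_scene":
        return f"show scene content {jump_info['target']}"
    raise ValueError(f"unknown game event: {kind}")
```

For example, `trigger_game_event({"kind": "end_dialogue"})` returns `"dialogue ended"`, while an unrecognized kind raises `ValueError`.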
31. The method of claim 25, wherein the interaction attribute comprises: an interaction range;
responsive to a specified scene event being triggered, the step of determining an interaction event for the non-player character based on the interaction attribute and controlling the non-player character to execute the interaction event in the game scene comprises:
in response to a target virtual character moving into the interaction range of the non-player character, determining the interaction event of the non-player character based on the interaction attribute, and controlling the non-player character to execute the interaction event in the game scene.
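The range trigger of claim 31 reduces to a distance check against the configured interaction range. A minimal sketch, assuming 2D positions and a hypothetical NPC record (the claim does not prescribe either):

```python
import math

def in_interaction_range(npc_pos, char_pos, interaction_range):
    # The event fires when the target character is within range of the NPC.
    return math.dist(npc_pos, char_pos) <= interaction_range

def on_character_moved(npc, char_pos):
    if in_interaction_range(npc["pos"], char_pos, npc["interaction_range"]):
        return npc["interaction_event"]   # e.g. start the scenario dialogue
    return None

npc = {"pos": (0.0, 0.0), "interaction_range": 2.0,
       "interaction_event": "dialogue"}
```

Moving to `(1.0, 1.0)` (distance ≈ 1.41) triggers `"dialogue"`, while `(3.0, 0.0)` (distance 3.0) triggers nothing.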
32. A game editing device, the device comprising:
a creation module configured to construct, in response to a creation instruction for a non-player character, a game editing scene and to create the non-player character in the game editing scene;
a configuration module configured to determine, in response to an attribute configuration operation for the non-player character, an interaction attribute of the non-player character; wherein the interaction attribute is used to: determine, after a specified scene event is triggered, an interaction event of the non-player character based on the interaction attribute, and control the non-player character to execute the interaction event in a game scene corresponding to the game editing scene;
and a storage module configured to store, in response to a save operation for the non-player character, the game editing scene in which the non-player character has been created.
33. A game control device, the device comprising:
a display module configured to display a game scene on a graphical user interface; wherein the game scene is edited in advance by: in response to a creation instruction for a non-player character, constructing a game editing scene and creating the non-player character in the game editing scene; in response to an attribute configuration operation for the non-player character, determining an interaction attribute of the non-player character; and in response to a save operation for the non-player character, storing the game editing scene in which the non-player character has been created;
and an execution module configured to determine, in response to a specified scene event being triggered, the interaction event of the non-player character based on the interaction attribute, and to control the non-player character to execute the interaction event in the game scene.
34. An electronic device comprising a processor and a memory, the memory storing computer-executable instructions executable by the processor to implement the game editing method of any of claims 1-24 or the game control method of any of claims 25-31.
35. A computer-readable storage medium storing computer-executable instructions which, when invoked and executed by a processor, cause the processor to implement the game editing method of any one of claims 1 to 24 or the game control method of any one of claims 25 to 31.
CN202310858637.3A 2023-07-12 2023-07-12 Game editing method, game control device and electronic equipment Pending CN117138346A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310858637.3A CN117138346A (en) 2023-07-12 2023-07-12 Game editing method, game control device and electronic equipment
PCT/CN2024/100376 WO2025011295A1 (en) 2023-07-12 2024-06-20 Game editing method and apparatus, game control method and apparatus, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310858637.3A CN117138346A (en) 2023-07-12 2023-07-12 Game editing method, game control device and electronic equipment

Publications (1)

Publication Number Publication Date
CN117138346A true CN117138346A (en) 2023-12-01

Family

ID=88903412

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310858637.3A Pending CN117138346A (en) 2023-07-12 2023-07-12 Game editing method, game control device and electronic equipment

Country Status (1)

Country Link
CN (1) CN117138346A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2025011295A1 (en) * 2023-07-12 2025-01-16 网易(杭州)网络有限公司 Game editing method and apparatus, game control method and apparatus, and electronic device
CN119701360A (en) * 2024-12-04 2025-03-28 网易(杭州)网络有限公司 Virtual character control method and device, electronic equipment and readable storage medium
CN119896859A (en) * 2024-12-30 2025-04-29 网易(杭州)网络有限公司 Method, device, program product and electronic device for editing character components in games
WO2025124141A1 (en) * 2023-12-13 2025-06-19 网易(杭州)网络有限公司 Game event editing methods and game event editing apparatus, and storage medium and electronic device
WO2025223046A1 (en) * 2024-04-24 2025-10-30 网易(杭州)网络有限公司 Game logic processing method and apparatus for scene component, and device and storage medium


Similar Documents

Publication Publication Date Title
CN117138346A (en) Game editing method, game control device and electronic equipment
JP7707499B2 (en) Barrage processing method, device, electronic device, and program
CN111124401B (en) Method, device, terminal equipment and storage medium for editing multimedia in game
CN116899218B (en) Game plot generation method, device and electronic device
US20150026573A1 (en) Media Editing and Playing System and Method Thereof
CN112306321B (en) Information display method, device and equipment and computer readable storage medium
WO2024239435A1 (en) Sky background switching method and apparatus in game, and electronic device
CN117138345A (en) Game editing method, game control device and electronic equipment
WO2024239465A1 (en) Method and apparatus for switching sky light in game, and electronic device
CN114130011B (en) Method, device, storage medium and program product for selecting objects in virtual scenes
CN117462955A (en) Game editing method, device and electronic device
WO2025236901A1 (en) Operation control method and apparatus for game map, and electronic device
CN113893531A (en) Game role creating method and device, storage medium and computer equipment
WO2025119000A1 (en) Interaction method and apparatus in game, and electronic device
CN118341096A (en) Game matching method, game matching device, storage medium and electronic equipment
CN118217638A (en) Fight command method and device in game and electronic equipment
CN117138357A (en) Message processing methods, devices, electronic equipment and storage media in virtual scenes
WO2025011295A1 (en) Game editing method and apparatus, game control method and apparatus, and electronic device
US11921973B2 (en) Content playback program and content playback device
WO2024239433A1 (en) Method and apparatus for sky background switching in game, and electronic device
CN118543112A (en) Skill editing method and device in game and electronic equipment
CN119909381A (en) Game character editing method, device and electronic device
JP6871715B2 (en) Event control method and event control program
CN117061814A (en) Video playing method, device, equipment, storage medium and program product
HK40038706A (en) Information display method, device and apparatus, and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination