
CN115814405A - Video recording method and device in game, electronic equipment and storage medium - Google Patents


Info

Publication number
CN115814405A
CN115814405A (application CN202211422527.4A)
Authority
CN
China
Prior art keywords: lens, virtual, user interface, graphical user, game
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211422527.4A
Other languages
Chinese (zh)
Inventor
耿备
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202211422527.4A priority Critical patent/CN115814405A/en
Publication of CN115814405A publication Critical patent/CN115814405A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the invention provide a video recording method, a video recording apparatus, an electronic device, and a storage medium for a game. The method includes: in response to a game video recording mode being triggered, cancelling display of the interactive controls in the graphical user interface, controlling a virtual lens of the game to enter a video recording mode, and providing, in the graphical user interface, a video recording control area for controlling the virtual lens, where the video recording mode is used to capture and record the content displayed by the graphical user interface in real time; in response to a touch operation acting on the video recording control area, controlling the virtual lens to perform a corresponding lens adjustment operation according to the touch operation, so that the graphical user interface displays in real time a game scene picture determined based on the motion of the virtual lens; and generating corresponding recording data based on the game scene picture displayed in real time.

Description

Video recording method and device in game, electronic equipment and storage medium
Technical Field
The present invention relates to the field of interface interaction technologies, and in particular, to a video recording method in a game, a video recording apparatus in a game, an electronic device, and a computer-readable storage medium.
Background
With the development of game technology, many games support image capture of game scenes. During such capture, the player can capture images of game content under the guidance of User Interface (UI) controls displayed on a graphical user interface. However, only static capture is supported; that is, the player can only take still pictures through the image capture function and cannot record video.
Disclosure of Invention
The embodiment of the invention provides a video recording method and device in a game, electronic equipment and a computer readable storage medium, which are used for solving or partially solving the problem that a player cannot record video in the game.
The embodiment of the invention discloses a video recording method in a game, in which a graphical user interface is provided through a terminal device, the content displayed by the graphical user interface including part of a game scene, a virtual character located in the game scene, and an interactive control, and the method comprises the following steps:
responding to the triggering of a game video recording mode, canceling the display of the interactive control in the graphical user interface, controlling a virtual lens of the game to enter a video recording mode, and providing a video recording control area for controlling the virtual lens in the graphical user interface, wherein the video recording mode is used for acquiring and recording the content displayed by the graphical user interface in real time;
responding to touch operation acting on the video recording control area, and controlling the virtual lens to execute corresponding lens adjustment operation according to the touch operation so as to enable the graphical user interface to display a game scene picture determined based on the motion of the virtual lens in real time;
and generating corresponding recording data based on the game scene picture displayed in real time.
The embodiment of the invention also discloses a video recording device in the game, which provides a graphical user interface through an electronic terminal, wherein the content displayed by the graphical user interface comprises part of game scenes, virtual characters positioned in the game scenes and interactive controls, and the video recording device comprises:
the video recording mode triggering module is used for responding to the triggering of a game video recording mode, canceling the display of the interactive control in the graphical user interface, controlling the virtual lens of the game to enter a video recording mode, and providing a video recording control area for controlling the virtual lens in the graphical user interface, wherein the video recording mode is used for acquiring and recording the content displayed by the graphical user interface in real time;
the video control module is used for responding to touch operation acting on the video control area, controlling the virtual lens to execute corresponding lens adjustment operation according to the touch operation, so that the graphical user interface displays a game scene picture determined based on the motion of the virtual lens in real time;
and the recording module is used for generating corresponding recording data based on the game scene picture displayed in real time.
The embodiment of the invention also discloses electronic equipment which comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory finish mutual communication through the communication bus;
the memory is used for storing a computer program;
the processor is configured to implement the method according to the embodiment of the present invention when executing the program stored in the memory.
Also disclosed is a computer-readable storage medium having instructions stored thereon, which, when executed by one or more processors, cause the processors to perform a method according to an embodiment of the invention.
The embodiment of the invention has the following advantages:
in the embodiment of the invention, when a player wants to record game content during play, a corresponding instruction can be input to trigger the game video recording mode. In response, the terminal cancels the display of the interactive controls in the graphical user interface, controls the virtual lens of the game to enter the video recording mode, and provides, in the graphical user interface, a video recording control area for controlling the virtual lens, where the video recording mode is used to capture and record the content displayed by the graphical user interface in real time. The terminal then responds to a touch operation acting on the video recording control area by controlling the virtual lens to perform the corresponding lens adjustment operation, so that the graphical user interface displays in real time a game scene picture determined based on the motion of the virtual lens, and generates corresponding recording data based on the game scene picture displayed in real time. Thus, by providing a corresponding camera control area, the player can control the virtual lens through that area to record the game picture, which enriches the image capture modes available during play.
Drawings
FIG. 1 is a flowchart illustrating steps of a video recording method in a game according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a graphical user interface in an embodiment of the invention;
FIG. 3 is a schematic view of a game interface provided in an embodiment of the present invention;
fig. 4 is a schematic diagram of a lens movement locus provided in an embodiment of the present invention;
fig. 5 is a schematic diagram of a lens movement locus provided in an embodiment of the present invention;
fig. 6 is a schematic diagram of a lens movement locus provided in an embodiment of the present invention;
fig. 7 is a schematic diagram of a lens movement locus provided in an embodiment of the present invention;
fig. 8 is a schematic diagram of lens steering control provided in an embodiment of the present invention;
fig. 9 is a schematic diagram of lens zoom control provided in the embodiment of the present invention;
fig. 10 is a schematic view of lens panning control provided in an embodiment of the present invention;
FIG. 11 is a schematic view of a lens zoom control provided in an embodiment of the present invention;
fig. 12 is a schematic diagram of lens reset control provided in an embodiment of the present invention;
FIG. 13 is a block diagram showing a video recording apparatus in a game according to an embodiment of the present invention;
fig. 14 is a block diagram of an electronic device provided in an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
As an example, as games have become an important form of everyday entertainment, players like to record game content and to collect, distribute, and otherwise share the recorded videos. In the related art, game content can be recorded through a third-party application or through the screen recording function built into the terminal, but the game itself can only provide a picture-taking function and cannot realize video recording. Moreover, these recording approaches have two problems: on the one hand, the interactive controls provided in the game interface for executing game functions cannot be removed, so the recorded picture carries controls that obscure the game content and degrade the quality of the captured images; on the other hand, interaction based on the game itself is cumbersome to operate.
In view of this, one of the core inventive points of the present invention is as follows. During play, when a player wants to record game content, a corresponding instruction can be input to trigger the game video recording mode. In response, the terminal cancels the display of the interactive controls in the graphical user interface, controls the virtual lens of the game to enter the video recording mode, and provides, in the graphical user interface, a video recording control area for controlling the virtual lens, where the video recording mode is used to capture and record the content displayed by the graphical user interface in real time. The terminal then responds to a touch operation applied to the video recording control area by controlling the virtual lens to perform the corresponding lens adjustment operation, so that the graphical user interface displays in real time a game scene picture determined based on the motion of the virtual lens, and generates corresponding recording data from that picture. Thus, by providing a corresponding camera control area during recording, the player can control the virtual lens through that area to record the game scene picture, enriching the image capture modes available during play.
Referring to fig. 1, a flowchart illustrating steps of a video recording method in a game provided in an embodiment of the present invention is shown, where a graphical user interface is provided through an electronic terminal, and content displayed by the graphical user interface includes a part of a game scene, a virtual character located in the game scene, and an interactive control, and the method specifically includes the following steps:
step 101, in response to a game video recording mode trigger, cancelling the display of the interactive control in the graphical user interface, controlling a virtual lens of the game to enter a video recording mode, and providing, in the graphical user interface, a video recording control area for controlling the virtual lens, wherein the video recording mode is used for capturing and recording the content displayed by the graphical user interface in real time;
in the game process, the electronic terminal may run a corresponding game application program and display a corresponding game interface in the graphical user interface, where the game interface may include a part of a game scene, a virtual character located in the game scene, and an interactive control for operating the virtual character, such as a move control, a jump control, a skill control, and the like.
As shown in fig. 2, which is a schematic diagram illustrating a graphical user interface according to an embodiment of the present invention, a graphical user interface 220 may be obtained by executing a software application on a processor of the mobile terminal 210 and rendering the software application on a touch display of the mobile terminal 210, the graphical user interface 220 may include at least one operation control 230, and the graphical user interface 220 may further include a virtual object 240 and a virtual joystick 250.
In a specific implementation, the virtual joystick 250 may be disposed at the lower left of the graphical user interface 220 and may control the game character 240 to move and/or rotate in the game scene according to the operations it receives, while a plurality of operation controls 230 are provided at the lower right of the graphical user interface 220 to offer the player different control operations. In this way, the player's left hand can conveniently control the character's movement and viewing angle in the game scene while the right hand operates the different controls on the virtual object. Besides the control icons displayed on the graphical user interface (virtual joystick, operation controls, and the like), the player can also control the game character using terminal features such as the gyroscope and 3D Touch.
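As a minimal illustrative sketch only (the names `joystick_to_velocity` and `dead_zone` are hypothetical and not part of this disclosure), the mapping from a virtual-joystick offset to a character movement velocity described above might look as follows:

```python
import math

def joystick_to_velocity(dx, dy, speed=1.0, dead_zone=0.1):
    """Map a virtual-joystick offset (dx, dy), each in [-1, 1],
    to a movement velocity for the game character.

    Offsets inside the dead zone produce no movement, which keeps
    small accidental touches from moving the character."""
    magnitude = math.hypot(dx, dy)
    if magnitude < dead_zone:
        return (0.0, 0.0)
    # Normalize so diagonal input is not faster than axis-aligned input.
    scale = min(magnitude, 1.0) * speed / magnitude
    return (dx * scale, dy * scale)
```

Normalizing by the offset magnitude keeps diagonal movement from exceeding axis-aligned movement, and the dead zone filters accidental touches near the joystick center.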
When shooting and recording game content, keeping the game interface clean effectively improves the quality of the captured data; otherwise the various interactive controls displayed in the interface easily occlude the game scene and reduce the effective presentation of the game content. Therefore, in the embodiment of the present invention, in response to the game video recording mode being triggered, the terminal may adjust the transparency of each interactive control in the graphical user interface until the controls are no longer visually displayed, while masking the control functions of all interactive controls except the virtual joystick used to move the virtual character in the game scene, and divide the graphical user interface into a plurality of invisible video recording control areas for performing different video recording control operations. Thus, when the player triggers the game video recording mode, the terminal hides (no longer visually displays) all displayed interactive controls and divides the graphical user interface into several video recording control areas for different control operations.
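The hide-and-mask step described above can be sketched as follows (illustrative Python; the control representation and the region names are assumptions, not the patent's actual implementation):

```python
def enter_recording_mode(controls):
    """Make every interactive control fully transparent and disable
    every control's function except the virtual joystick's, then
    return a set of invisible recording-control regions expressed
    as (x_min, x_max) fractions of the screen width.

    `controls` maps a control name to a dict with 'alpha' and
    'enabled' keys, standing in for real UI widgets."""
    for name, ctrl in controls.items():
        ctrl["alpha"] = 0.0                      # no longer visually displayed
        ctrl["enabled"] = (name == "joystick")   # only movement is retained
    return {
        "lens_auxiliary": (0.0, 0.25),   # hypothetical layout
        "lens_main": (0.25, 0.75),
        "record_end": (0.75, 1.0),
    }
```

Note that the joystick is made invisible like everything else, but keeps its control function, matching the behaviour described in the text.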
Optionally, for the triggering of the game video recording mode, the terminal may display a corresponding video recording control in the graphical user interface, so that the player triggers the game video recording mode through the video recording control; in addition, the game video mode may also be triggered by inputting a corresponding touch operation in the graphical user interface, and in the game video mode, the terminal may acquire and record the content displayed in the graphical user interface in real time to obtain corresponding recording data, which is not limited by the present invention.
In an example, referring to fig. 3, a schematic diagram of a game interface provided in an embodiment of the present invention is shown. The terminal runs a corresponding game application and displays a game interface in the graphical user interface. When the player inputs a control instruction that triggers the game video recording mode, the terminal may hide all interactive controls displayed in the game interface and divide the graphical user interface into a plurality of video recording control areas for executing different control operations, such as the "lens auxiliary operation area", "lens main adjustment operation area", "moving wheel area (i.e., movement control area)" and "photo confirmation area (i.e., video recording end area)" shown in fig. 3, with different areas implementing different video recording control functions. On the one hand, hiding the interactive controls reduces the content displayed in the interface, reduces interference from unnecessary content during recording, and improves the cleanness of the recorded picture; on the other hand, dividing the interface into different control areas simplifies the operation mode and improves the convenience of video recording control.
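Dispatching a touch to one of the invisible areas of fig. 3 amounts to a simple hit test; the sketch below (illustrative Python, with made-up region bounds) shows one way to do it:

```python
def region_for_touch(x_frac, regions):
    """Return the name of the invisible video recording control
    region containing a touch at horizontal screen fraction
    x_frac (0.0 to 1.0), or None if no region matches.

    `regions` maps a region name to an (x_min, x_max) pair of
    screen-width fractions."""
    for name, (lo, hi) in regions.items():
        if lo <= x_frac < hi:
            return name
    return None
```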
step 102, in response to a touch operation acting on the video recording control area, controlling the virtual lens to execute a corresponding lens adjustment operation according to the touch operation, so that the graphical user interface displays in real time a game scene picture determined based on the motion of the virtual lens;
in the embodiment of the invention, after the game video recording mode is entered, the terminal can control the virtual lens to move around the virtual character according to the touch operation as the player inputs the corresponding touch operation in the video recording control area, so that the corresponding game scene picture is displayed in the graphical user interface according to the movement of the virtual lens, and the changed game scene picture is recorded to obtain the corresponding recording data.
For the game video, dynamic content may be recorded, for example, when the player controls the virtual character to move in the game scene, the moving process of the virtual character and the change of the game scene may be recorded, and a process that the player controls the virtual character to execute a corresponding game task may also be recorded, which is not limited in the present invention.
In the embodiment of the present invention, after all interactive controls are hidden, a movement control area for moving the virtual character may still be provided so that the player can control the character. Through this movement control area the player can move the virtual character in the game scene while video recording proceeds through the video recording control areas. Optionally, the movement control area may be the area corresponding to the virtual joystick: the terminal adjusts the joystick's transparency so that it is no longer visually displayed on the graphical user interface, but retains its control function, so the player can move the virtual character while recording the game video. Specifically, in response to a touch operation on the virtual joystick, the terminal may control the virtual character to move in the game scene according to the touch operation and display, in real time, a game scene picture determined based on the character's movement.
In a specific implementation, the terminal may respond to a touch operation on the virtual joystick and control the virtual character to move in the game scene accordingly. While the character moves, the terminal acquires the real-time position and real-time orientation of the virtual lens during the movement, and then determines from them the game scene picture to display, so that the graphical user interface shows in real time the picture determined by the character's movement. This realizes simultaneous control and recording and enriches the image capture modes available during play.
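One plausible way to derive the lens pose from the character's real-time position and orientation, as described above, is a fixed offset behind and above the character (illustrative Python; the offsets and the name `camera_pose_following` are assumptions):

```python
import math

def camera_pose_following(char_pos, char_yaw, distance=5.0, height=2.0):
    """Place the virtual lens `distance` units behind the character
    along its facing direction (yaw in radians) and `height` units
    above it, keeping the lens orientation equal to the character's,
    so the displayed scene picture follows the character's movement."""
    x, y, z = char_pos
    cam = (x - distance * math.cos(char_yaw),
           y - distance * math.sin(char_yaw),
           z + height)
    return cam, char_yaw
```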
It should be noted that, in a game, the game picture is presented through a virtual lens whose position moves with the virtual character in the game scene. By default the orientation of the virtual lens matches the orientation of the virtual character and changes as the player changes the character's orientation; for example, under normal game control the player can change the character's orientation through an orientation control area, a view-angle switching control, and the like. In the game video recording mode, by contrast, the player can control the position, orientation, and other parameters of the virtual lens through the video recording control area.
In the embodiment of the present invention, after entering the game video recording mode, the terminal may divide a video control area in addition to a movement control area corresponding to the virtual joystick on the graphical user interface, where the video control area may include at least a lens control area, a first lens auxiliary area, a second lens auxiliary area, and the like, where the lens control area may be used to control lens parameters of the virtual lens, and the first lens auxiliary area and the second lens auxiliary area may be matched with the lens control area, so as to implement control of the lens parameters of the virtual lens through a composite control operation.
In a specific implementation, the terminal may respond to a lens steering operation on the lens control area by determining the corresponding lens rotation direction and controlling the virtual lens to rotate in that direction, so that the graphical user interface displays in real time a game scene picture determined based on the rotation of the virtual lens. It may respond to a focal length control operation on the lens control area by determining the corresponding target focal length and adjusting the virtual lens from its current focal length to the target focal length, so that the interface displays in real time a picture determined based on the zooming of the virtual lens. While a touch signal on the first lens auxiliary area is continuously acquired, it may respond to a first lens translation operation on the lens control area by determining the corresponding first translation direction and translating the virtual lens in a plane perpendicular to the horizontal plane. While a touch signal on the second lens auxiliary area is continuously acquired, it may respond to a second lens translation operation by determining the corresponding second translation direction and translating the virtual lens in a plane parallel to the horizontal plane. It may also respond to a lens reset operation on the first and second lens auxiliary areas by restoring the virtual lens to its default state, so that the interface displays in real time the picture determined by the reset lens. In addition, when the player wants to end recording, the current game video recording can be ended through the recording end area of the video recording control area; specifically, the terminal may end recording of the game scene picture in response to a double-tap touch operation on the recording end area.
For example, the virtual character may serve as the sphere center, and the player controls the virtual lens to rotate through the lens control area so as to display the corresponding game scene picture in the graphical user interface. Referring to fig. 4, which shows a schematic diagram of a lens motion trajectory provided in an embodiment of the present invention, with a certain object (e.g., a virtual character) as the sphere center, as the player inputs a control instruction in the lens control area the terminal controls the virtual lens to rotate around the object and displays the corresponding game scene picture.
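The sphere-centered rotation of fig. 4 can be sketched with spherical coordinates (illustrative Python; `orbit_lens` and its angle conventions are assumptions, not values from the disclosure):

```python
import math

def orbit_lens(center, radius, yaw, pitch):
    """Return the virtual lens position on a sphere of the given
    radius around the character (the sphere center), for yaw and
    pitch angles in radians. Steering the lens changes yaw and
    pitch while the distance to the character stays fixed."""
    cx, cy, cz = center
    return (cx + radius * math.cos(pitch) * math.cos(yaw),
            cy + radius * math.cos(pitch) * math.sin(yaw),
            cz + radius * math.sin(pitch))
```

Whatever yaw and pitch the player's swipes produce, the lens stays exactly `radius` away from the character, which is the defining property of the spherical trajectory in fig. 4.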
As for the focal length, the smaller the focal length of the virtual lens, the larger the field of view of the game scene picture that the terminal can display in the graphical user interface, and the richer the game content shown. For example, with the virtual character as the reference object, the player controls the virtual lens to zoom through the lens control area so as to display a corresponding game scene picture. Referring to fig. 5, which shows a schematic diagram of a lens motion trajectory provided in an embodiment of the present invention, with a certain object (e.g., a virtual character) as the reference object, as the player inputs a control instruction in the lens control area the terminal controls the virtual lens to move closer to or farther from the object to implement zooming, and displays the corresponding game scene picture according to the zoom result.
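Zooming by moving the lens toward or away from the reference object, as in fig. 5, can be sketched as movement along the lens-to-object line (illustrative Python; `zoom_lens` and its signature are assumptions):

```python
import math

def zoom_lens(lens_pos, target, step):
    """Move the virtual lens `step` units toward the reference
    object (step > 0) or away from it (step < 0) along the straight
    line joining them, which is how the text above realizes zooming."""
    d = math.dist(lens_pos, target)
    if d <= 1e-9:
        return lens_pos          # already at the target; nothing to do
    t = step / d
    return tuple(p + (q - p) * t for p, q in zip(lens_pos, target))
```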
As for lens translation, translations may occur in different coordinate planes. Specifically, the virtual lens may be located in a spatial coordinate system in which xoz forms the horizontal plane and xoy forms a plane perpendicular to the horizontal plane; the virtual lens can then perform different translation operations according to different control instructions, and the different translations correspond to changes in the content framed by the viewfinder. Keeping the focal length unchanged, as the virtual lens translates the terminal may move the viewfinder up, down, left and right to capture the corresponding content from the game scene and display the corresponding game scene picture in the graphical user interface; the terminal may also move the viewfinder forward or backward along the depth direction of the picture, pushing or pulling the lens to capture the corresponding content. Referring to figs. 6 and 7, which show schematic diagrams of lens movement trajectories provided in an embodiment of the present invention: in one case, as the player inputs a first lens translation operation through cooperation between the first lens auxiliary area and the lens control area, the terminal may move the virtual lens up, down, left and right in the xoy plane; in the other case, as the player inputs a second lens translation operation through cooperation between the second lens auxiliary area and the lens control area, the terminal may move the virtual lens forward and backward in depth in the xoz plane, displaying the corresponding game scene picture according to the translation result.
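The two translation modes of figs. 6 and 7, vertical-plane movement while the first auxiliary area is held and horizontal-plane depth movement while the second is held, can be sketched as follows (illustrative Python; the direction names and plane conventions are assumptions):

```python
def translate_lens(pos, direction, step, plane):
    """Translate the virtual lens by `step` within the given plane:
    'xoy' (perpendicular to the horizontal plane; first auxiliary
    area held) moves it up/down/left/right, while 'xoz' (parallel
    to the horizontal plane; second auxiliary area held) moves it
    left/right and forward/backward in depth."""
    x, y, z = pos
    if plane == "xoy":
        moves = {"up": (0, step, 0), "down": (0, -step, 0),
                 "left": (-step, 0, 0), "right": (step, 0, 0)}
    elif plane == "xoz":
        moves = {"forward": (0, 0, step), "backward": (0, 0, -step),
                 "left": (-step, 0, 0), "right": (step, 0, 0)}
    else:
        raise ValueError("unknown plane: " + plane)
    dx, dy, dz = moves.get(direction, (0, 0, 0))
    return (x + dx, y + dy, z + dz)
```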
It should be noted that in the lens rotation, translation, and zooming described above, the virtual lens may be adjusted with the virtual character as the reference object. When the player moves the virtual character through the movement control area, the character and the virtual lens may be kept synchronized; that is, the virtual lens may move along with the virtual character as the player moves it in the game scene. The present invention is not limited in this respect.
With this control-instruction input scheme, during game video recording the terminal can, on the one hand, hide the interactive controls used for executing control operations, reducing the content displayed in the interface and the interference from unnecessary content, and thereby improving the cleanness of the recorded picture. On the other hand, by dividing the interface into different control areas, the player can both move the virtual character and control the recording while a video is being made, which simplifies operation and improves the convenience of video recording control. At the same time, realizing simultaneous character control and recording enriches the image capture modes available during play.
Optionally, after the game video recording mode is triggered, the terminal may provide corresponding video recording teaching contents to the player first, so that the player can know the control functions corresponding to the control areas through the video recording teaching contents conveniently, and the convenience of video recording operation is improved.
When video recording is triggered or the game is in the video recording mode, the player can input different touch operations on the lens control area to perform different control operations on the virtual lens, such as zooming, translating, and pushing or pulling. In a specific implementation, when the lens control area is not cooperating with a lens auxiliary area, it can implement rotation and zooming of the virtual lens. Specifically, in response to a first sliding operation on the lens control area: if the operation is an upward slide, an upward steering direction is generated; if a downward slide, a downward steering direction; if a leftward slide, a leftward steering direction; and if a rightward slide, a rightward steering direction. Referring to fig. 8, which shows a schematic diagram of lens steering control provided in an embodiment of the present invention: when the player slides up in the lens control area, the virtual lens rotates upward; sliding down rotates it downward; sliding left rotates it leftward; and sliding right rotates it rightward. Correspondingly, the terminal displays the corresponding game scene picture in the graphical user interface according to the rotation of the virtual lens.
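The four-way steering mapping above reduces to picking the dominant axis of the swipe (illustrative Python; `swipe_to_turn` is a made-up name, not from the disclosure):

```python
def swipe_to_turn(dx, dy):
    """Map a sliding gesture in the lens control area, given as a
    displacement (dx, dy) with y pointing up the screen, to a lens
    steering direction: the axis with the larger displacement wins."""
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"
```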
In addition, the terminal may also determine a target focal length in response to a two-finger zoom operation on the lens control area. Referring to fig. 9, which shows a schematic diagram of lens zoom control provided in an embodiment of the present invention, when the player inputs a two-finger pinch operation (moving the fingers together) in the lens control area, the terminal can reduce the focal length and increase the distance between the virtual lens and the reference object, zooming the picture out and presenting a larger game view to the player; when the player inputs a two-finger spread operation (moving the fingers apart) in the lens control area, the terminal can increase the focal length and reduce the distance between the virtual lens and the reference object, zooming the picture in and presenting more detailed game content to the player.
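One common way to derive the target focal length from a two-finger zoom gesture is to scale the current focal length by the ratio of the finger separations; this is an assumed sketch (the function name, default range, and ratio-based mapping are illustrative, not prescribed by the embodiment):

```python
def target_focal_length(current_f: float, d0: float, d1: float,
                        f_min: float = 18.0, f_max: float = 135.0) -> float:
    """Compute the target focal length for a two-finger zoom operation.

    d0 is the distance between the two touch points when the gesture
    starts and d1 is their current distance. Spreading the fingers
    (d1 > d0) increases the focal length (zoom in); pinching them
    together (d1 < d0) decreases it (zoom out). The result is clamped
    to an assumed valid focal-length range.
    """
    return max(f_min, min(f_max, current_f * (d1 / d0)))
```

The terminal would then adjust the virtual lens from its current focal length to the returned target, as step 102 describes.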
To enable more lens control, the video recording control area can further include a first lens auxiliary area and a second lens auxiliary area in addition to the lens control area, and the player can input a composite touch operation simultaneously in the first lens auxiliary area and the lens control area, or in the second lens auxiliary area and the lens control area, to control different parameters of the virtual lens. Specifically, the terminal may respond to a long-press touch operation on the first lens auxiliary area and acquire a first touch signal corresponding to the long-press touch operation; while the first touch signal is continuously detected, the terminal responds to a second sliding operation on the lens control area: if the second sliding operation is an upward sliding operation, an upward translation direction is generated; if it is a downward sliding operation, a downward translation direction is generated; if it is a leftward sliding operation, a leftward translation direction is generated; and if it is a rightward sliding operation, a rightward translation direction is generated. In this way, the player can translate the virtual lens by long-pressing the first lens auxiliary area while inputting the corresponding sliding operation in the lens control area. Referring to fig. 10, which shows a schematic diagram of lens translation control provided in an embodiment of the present invention, when the player long-presses the first lens auxiliary area and inputs an upward sliding operation in the lens control area, the virtual lens translates upward, so that the viewfinder frame translates upward; a downward sliding operation translates the virtual lens downward, so that the viewfinder frame translates downward; a leftward sliding operation translates it to the left; and a rightward sliding operation translates it to the right. Correspondingly, according to the composite operation input by the player in the first lens auxiliary area and the lens control area, the terminal can move the viewfinder frame of the virtual lens in the corresponding direction to realize lens translation, and display the corresponding game scene picture in the graphical user interface.
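As an illustrative sketch of the translation described above (names, the coordinate convention, and the sensitivity value are assumptions): the lens position moves in the vertical plane of the camera, with the screen-space swipe mapped so that sliding up moves the lens up.

```python
def pan_camera(pos, slide, sensitivity=0.01):
    """Translate the virtual lens in the plane perpendicular to the
    horizontal plane (x: right, y: up, z: depth), driven by a second
    sliding operation while the first lens auxiliary area is held.

    slide = (dx, dy) screen deltas; screen y grows downward, so dy is
    negated to move the lens (and viewfinder frame) upward on an
    upward swipe.
    """
    x, y, z = pos
    dx, dy = slide
    return (x + dx * sensitivity, y - dy * sensitivity, z)
```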
In addition, the terminal can also respond to a long-press touch operation on the second lens auxiliary area and acquire a second touch signal corresponding to the long-press touch operation; while the second touch signal is continuously detected, the terminal responds to a second sliding operation on the lens control area: if the second sliding operation is an upward sliding operation, a forward translation direction is generated; if it is a downward sliding operation, a backward translation direction is generated. In this way, the player can push and pull the virtual lens by long-pressing the second lens auxiliary area while inputting the corresponding sliding operation in the lens control area. Referring to fig. 11, which shows a schematic diagram of lens push-pull control provided in an embodiment of the present invention, when the player long-presses the second lens auxiliary area and inputs an upward sliding operation in the lens control area, the virtual lens moves forward, and the viewfinder frame moves forward along the depth direction; a downward sliding operation moves the virtual lens backward, so that the viewfinder frame moves backward along the depth direction. Correspondingly, according to the composite operation input by the player in the second lens auxiliary area and the lens control area, the terminal can move the viewfinder frame of the virtual lens in the corresponding direction to realize lens push-pull, and display the corresponding game scene picture in the graphical user interface.
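The push-pull (dolly) movement can be sketched in the same assumed coordinate convention as the pan example, with z increasing toward the scene; again, the names and sensitivity are hypothetical:

```python
def dolly_camera(pos, dy, sensitivity=0.01):
    """Move the virtual lens along the depth axis, driven by a second
    sliding operation while the second lens auxiliary area is held.

    dy is the vertical screen delta (y grows downward): sliding up
    (dy < 0) pushes the lens forward along the depth direction,
    sliding down (dy > 0) pulls it backward.
    """
    x, y, z = pos
    return (x, y, z - dy * sensitivity)
```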
In addition, the first lens auxiliary area and the second lens auxiliary area can each cooperate with the lens control area, and can also cooperate with each other. Specifically, the terminal can reset the virtual lens to a default state in response to touch operations directed simultaneously at the first lens auxiliary area and the second lens auxiliary area, and display the corresponding game scene picture in the graphical user interface. Referring to fig. 12, which shows a schematic diagram of lens reset control provided in an embodiment of the present invention, the player can click the first lens auxiliary area and the second lens auxiliary area at the same time to reset the virtual lens; the reset can restore the virtual lens to a default focal length, a default orientation, a default position, and the like. Correspondingly, after the virtual lens is reset, the terminal can display the corresponding game scene picture in the graphical user interface.
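A minimal sketch of the reset behavior, assuming a dictionary-shaped lens state (the default values and field names are made up for illustration; the embodiment only says the reset restores a default focal length, orientation, and position):

```python
# Hypothetical default state of the virtual lens.
DEFAULT_STATE = {
    "focal_length": 50.0,
    "position": (0.0, 5.0, -10.0),
    "yaw": 0.0,
    "pitch": 0.0,
}

def reset_lens(state, first_area_touched, second_area_touched):
    """Return the default lens state when both auxiliary areas are
    touched at the same time; otherwise leave the state unchanged."""
    if first_area_touched and second_area_touched:
        return dict(DEFAULT_STATE)
    return state
```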
Through the cooperation between the lens control area and the lens auxiliary areas, the player can perform operations such as rotating, translating, zooming, and pushing and pulling the lens through different control areas during video recording. Meanwhile, during game video recording, the terminal can hide the interactive controls used for executing control operations, which reduces the content displayed in the interface, reduces the interference of unnecessary content in the recording process, and improves the cleanness of the recorded picture.
Step 103, generating corresponding recording data based on the game scene picture displayed in real time.
When the player triggers the game video recording mode, the terminal can record the game scene picture displayed in the graphical user interface in real time; when the player touches the video recording ending area, the terminal can end the recording and generate the corresponding video recording data. Thus, in the process of game video recording, the terminal can hide the interactive controls used for executing control operations, which reduces the content displayed in the interface, reduces the interference of unnecessary content in the recording process, and improves the cleanness of the recorded picture.
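The recording lifecycle described above (start on mode trigger, capture the displayed picture in real time, stop and emit recording data when the ending area is touched) can be sketched as a minimal state holder; the class and method names are illustrative only:

```python
class Recorder:
    """Illustrative recording lifecycle: frames captured between
    start() and stop() become the recording data."""

    def __init__(self):
        self.recording = False
        self.frames = []

    def start(self):
        # Triggered by entering the game video recording mode.
        self.recording = True
        self.frames = []

    def capture(self, frame):
        # Called per displayed game scene picture; ignored when idle.
        if self.recording:
            self.frames.append(frame)

    def stop(self):
        # Triggered by touching the video recording ending area;
        # returns the generated recording data.
        self.recording = False
        return list(self.frames)
```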
It should be noted that the embodiment of the present invention includes but is not limited to the above examples, and it is understood that, under the guidance of the idea of the embodiment of the present invention, a person skilled in the art may also set the method according to actual requirements, and the present invention is not limited to this.
In the embodiment of the invention, when a player wants to record game content during a game, a corresponding instruction can be input to trigger the game video recording mode. In response to the trigger, the terminal cancels the display of the interactive control in the graphical user interface, controls the virtual lens of the game to enter the video recording mode, and provides, in the graphical user interface, a video recording control area for controlling the virtual lens, where the video recording mode is used to collect and record the content displayed by the graphical user interface in real time. The terminal then responds to a touch operation acting on the video recording control area and controls the virtual lens to execute the corresponding lens adjustment operation according to the touch operation, so that the graphical user interface displays in real time a game scene picture determined based on the motion of the virtual lens, and generates the corresponding recording data based on the game scene picture displayed in real time. In this way, by providing the corresponding video recording control area, the player can control the virtual lens during the game to record the game picture, which enriches the image acquisition modes in the game process.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those of skill in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the embodiments of the invention.
Referring to fig. 13, a block diagram of a video recording apparatus in a game provided in an embodiment of the present invention is shown, where a graphical user interface is provided through an electronic terminal, where content displayed by the graphical user interface includes a part of a game scene, a virtual character located in the game scene, and an interaction control for operating the virtual character, and the video recording apparatus may specifically include the following modules:
a video recording mode triggering module 1301, configured to cancel displaying the interactive control in the graphical user interface in response to a game video recording mode trigger, control a virtual lens of the game to enter a video recording mode, and provide a video recording control area for controlling the virtual lens in the graphical user interface, where the video recording mode is used to collect and record content displayed by the graphical user interface in real time;
the video control module 1302 is configured to respond to a touch operation applied to the video control area, and control the virtual lens to execute a corresponding lens adjustment operation according to the touch operation, so that the graphical user interface displays a game scene picture determined based on the motion of the virtual lens in real time;
and a recording module 1303, configured to generate corresponding recording data based on the game scene picture displayed in real time.
In an optional embodiment, the interaction control includes a virtual joystick for controlling the virtual character to move, and the video recording mode triggering module 1301 is specifically configured to:
respond to the game video recording mode trigger and adjust the transparency of the virtual joystick so that the virtual joystick is no longer visually displayed on the graphical user interface.
In an alternative embodiment, the apparatus further comprises:
a movement control module, configured to respond to a touch operation on the virtual joystick and control the virtual character to move in the game scene according to the touch operation;
and the picture display module is used for displaying the game scene picture determined based on the movement of the virtual character on the graphical user interface in real time.
In an alternative embodiment, the video recording control area further includes a lens control area, and the apparatus further includes:
and the lens steering control module is used for responding to lens steering operation aiming at the lens control area, determining a lens rotating direction corresponding to the lens steering operation, and controlling the virtual lens to rotate according to the lens rotating direction so that the graphical user interface displays a game scene picture determined based on the rotation of the virtual lens in real time.
In an alternative embodiment, further comprising:
and the focal length control module is used for responding to the focal length control operation aiming at the lens control area, determining a target focal length corresponding to the focal length control operation, and adjusting the virtual lens from the current focal length to the target focal length so that the graphical user interface displays a game scene picture determined based on the zooming of the virtual lens in real time.
In an alternative embodiment, the video recording control area further includes a lens control area and a first lens auxiliary area, and the apparatus further includes:
and a lens translation control module, configured to, while a touch signal for the first lens auxiliary area is continuously acquired, respond to a first lens translation operation on the lens control area, determine a first lens translation direction corresponding to the first lens translation operation, and control the virtual lens to translate in a plane perpendicular to the horizontal plane according to the first lens translation direction, so that the graphical user interface displays in real time a game scene picture determined based on the movement of the virtual lens.
In an alternative embodiment, the video recording control area further includes a lens control area and a second lens auxiliary area, and the apparatus further includes:
and the lens push-pull control module is used for responding to a second lens translation operation aiming at the lens control area under the condition of continuously acquiring a touch signal aiming at the second lens auxiliary area, determining a second lens translation direction corresponding to the second lens translation operation, and controlling a virtual lens to translate in a plane parallel to a horizontal plane according to the second lens translation direction so that the graphical user interface displays a game scene picture determined based on the movement of the virtual lens in real time.
In an alternative embodiment, the video recording control area includes a first lens auxiliary area and a second lens auxiliary area, and the apparatus further includes:
and the lens resetting module is used for responding to lens resetting operation aiming at the first lens auxiliary area and the second lens auxiliary area, resetting the virtual lens in a default state according to the lens resetting operation, and enabling the graphical user interface to display a game scene picture determined based on the reset virtual lens in real time.
In an alternative embodiment, the video recording control area includes a video recording end area, and the apparatus further includes:
a recording ending module, configured to end the recording of the game scene picture in response to a double-click touch operation on the video recording ending area.
In an optional embodiment, the lens steering control module is specifically configured to:
responding to a first sliding operation aiming at the lens control area, and if the first sliding operation is an upward sliding operation, generating an upper steering direction corresponding to the upward sliding operation;
if the first sliding operation is a downward sliding operation, generating a lower steering direction corresponding to the downward sliding operation;
if the first sliding operation is a leftward sliding operation, generating a leftward turning direction corresponding to the leftward sliding operation;
and if the first sliding operation is a rightward sliding operation, generating a rightward steering direction corresponding to the rightward sliding operation.
In an optional embodiment, the focus control module is specifically configured to:
and in response to the double-finger zooming operation aiming at the lens control area, determining a target focal length corresponding to the double-finger zooming operation.
In an optional embodiment, the lens translation control module is specifically configured to:
responding to a long-press touch operation aiming at the first lens auxiliary area, and acquiring a first touch signal corresponding to the long-press touch operation;
under the condition that the first touch signal is continuously detected, responding to a second sliding operation aiming at the lens control area, and if the second sliding operation is an upward sliding operation, generating an upward translation direction corresponding to the upward sliding operation;
if the second sliding operation is a downward sliding operation, generating a downward translation direction corresponding to the downward sliding operation;
if the second sliding operation is a leftward sliding operation, generating a leftward translation direction corresponding to the leftward sliding operation;
and if the second sliding operation is a rightward sliding operation, generating a rightward translation direction corresponding to the rightward sliding operation.
In an optional embodiment, the lens push-pull control module is specifically configured to:
responding to a long-press touch operation aiming at the second lens auxiliary area, and acquiring a second touch signal corresponding to the long-press touch operation;
under the condition that the second touch signal is continuously detected, responding to a second sliding operation aiming at the lens control area, and if the second sliding operation is an upward sliding operation, generating a forward translation direction corresponding to the upward sliding operation;
and if the second sliding operation is a downward sliding operation, generating a backward translation direction corresponding to the downward sliding operation.
In an optional embodiment, the lens resetting module is specifically configured to:
and responding to the touch operation aiming at the first lens auxiliary area and the second lens auxiliary area at the same time, resetting the virtual lens in a default state, and displaying a corresponding game scene picture in the graphical user interface.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
In addition, an embodiment of the present invention further provides an electronic device, including a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements each process of the above in-game video recording method embodiment and can achieve the same technical effects, which are not described again here to avoid repetition.
The embodiment of the invention also provides a computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and when being executed by a processor, the computer program realizes each process of the video recording method embodiment in the game, and can achieve the same technical effect, and in order to avoid repetition, the description is omitted here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
Fig. 14 is a schematic diagram of a hardware structure of an electronic device implementing various embodiments of the present invention.
The electronic device 1400 includes, but is not limited to: radio frequency unit 1401, network module 1402, audio output unit 1403, input unit 1404, sensor 1405, display unit 1406, user input unit 1407, interface unit 1408, memory 1409, processor 1410, and power supply 1411. It will be understood by those skilled in the art that the electronic device configurations involved in the embodiments of the present invention are not intended to be limiting, and that an electronic device may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 1401 may be configured to receive and transmit signals during a message transmission or call process; specifically, it receives downlink data from a base station and forwards the received downlink data to the processor 1410 for processing, and it also transmits uplink data to the base station. In general, the radio frequency unit 1401 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. The radio frequency unit 1401 may also communicate with a network and other devices via a wireless communication system.
The electronic device provides wireless broadband internet access to the user through the network module 1402, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 1403 can convert audio data received by the radio frequency unit 1401 or the network module 1402 or stored in the memory 1409 into an audio signal and output as sound. Also, the audio output unit 1403 may also provide audio output related to a specific function performed by the electronic device 1400 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 1403 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1404 is configured to receive an audio or video signal. The input Unit 1404 may include a Graphics Processing Unit (GPU) 14041 and a microphone 14042; the graphics processor 14041 processes image data of a still picture or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 1406. The image frames processed by the graphics processor 14041 may be stored in the memory 1409 (or other storage medium) or transmitted via the radio frequency unit 1401 or the network module 1402. The microphone 14042 may receive sounds and process them into audio data. In the case of a telephone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 1401 and output.
The electronic device 1400 also includes at least one sensor 1405, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 14061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 14061 and/or the backlight when the electronic device 1400 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 1405 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 1406 is used to display information input by the user or information provided to the user. The Display unit 1406 may include a Display panel 14061, and the Display panel 14061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 1407 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 1407 includes a touch panel 14071 and other input devices 14072. The touch panel 14071, also referred to as a touch screen, may collect touch operations by a user (e.g., operations by a user on or near the touch panel 14071 using a finger, a stylus, or any other suitable object or attachment). The touch panel 14071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 1410, and receives and executes commands from the processor 1410. In addition, the touch panel 14071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 14071, the user input unit 1407 may include other input devices 14072. In particular, the other input devices 14072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described here.
Further, the touch panel 14071 may be overlaid on the display panel 14061, and when the touch panel 14071 detects a touch operation on or near the touch panel 14071, the touch operation is transmitted to the processor 1410 to determine the type of the touch event, and then the processor 1410 provides a corresponding visual output on the display panel 14061 according to the type of the touch event. It is understood that in one embodiment, the touch panel 14071 and the display panel 14061 are two independent components to implement the input and output functions of the electronic device, but in some embodiments, the touch panel 14071 and the display panel 14061 can be integrated to implement the input and output functions of the electronic device, and is not limited herein.
The interface unit 1408 is an interface for connecting an external device to the electronic apparatus 1400. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 1408 may be used to receive input from an external device (e.g., data information, power, etc.) and transmit the received input to one or more elements within the electronic apparatus 1400 or may be used to transmit data between the electronic apparatus 1400 and the external device.
The memory 1409 may be used to store software programs as well as various data. The memory 1409 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. In addition, the memory 1409 can include high speed random access memory and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 1410 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by operating or executing software programs and/or modules stored in the memory 1409 and calling data stored in the memory 1409, thereby performing overall monitoring of the electronic device. Processor 1410 may include one or more processing units; preferably, the processor 1410 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1410.
The electronic device 1400 may further include a power source 1411 (e.g., a battery) for supplying power to various components, and preferably, the power source 1411 may be logically connected to the processor 1410 via a power management system, so as to implement functions of managing charging, discharging, and power consumption via the power management system.
In addition, the electronic device 1400 includes some functional modules that are not shown, and are not described in detail herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of another identical element in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into units is only one form of logical division, and other divisions may be adopted in practice; a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through interfaces, devices, or units, and may be electrical, mechanical, or in another form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one place or distributed across a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (17)

1. An in-game video recording control method, wherein a graphical user interface is provided through a terminal device, content displayed by the graphical user interface comprises a part of a game scene, a virtual character located in the game scene, and an interactive control, and the method comprises the following steps:
responding to the triggering of a game video recording mode, canceling the display of the interactive control in the graphical user interface, controlling a virtual lens of the game to enter a video recording mode, and providing a video recording control area for controlling the virtual lens in the graphical user interface, wherein the video recording mode is used for acquiring and recording the content displayed by the graphical user interface in real time;
responding to touch operation acting on the video recording control area, and controlling the virtual lens to execute corresponding lens adjustment operation according to the touch operation so as to enable the graphical user interface to display a game scene picture determined based on the motion of the virtual lens in real time;
and generating corresponding recording data based on the game scene picture displayed in real time.
2. The method of claim 1, wherein the interactive control comprises a virtual joystick for controlling movement of the virtual character, and wherein the canceling the display of the interactive control in the graphical user interface in response to the triggering of the game video recording mode comprises:
and responding to the trigger of the game video recording mode, and controlling and adjusting the transparency of the virtual rocker so that the virtual rocker is not visually displayed on the graphical user interface any more.
3. The method of claim 2, further comprising:
responding to touch operation of the virtual rocker, and controlling the virtual character to move in the game scene according to the touch operation;
displaying, in real time, a game scene screen determined based on the movement of the virtual character on the graphic user interface.
4. The method of claim 1, wherein the video recording control area comprises a lens control area, and the controlling the virtual lens to perform a corresponding lens adjustment operation according to the touch operation in response to the touch operation applied to the video recording control area, so that the graphical user interface displays a game scene picture determined based on the motion of the virtual lens in real time comprises:
and responding to the lens steering operation aiming at the lens control area, determining a lens rotating direction corresponding to the lens steering operation, and controlling the virtual lens to rotate according to the lens rotating direction so that the graphical user interface displays a game scene picture determined based on the rotation of the virtual lens in real time.
5. The method according to claim 4, wherein the controlling the virtual lens to perform a corresponding lens adjustment operation according to the touch operation in response to the touch operation applied to the video recording control area, so that the graphical user interface displays a game scene picture determined based on the motion of the virtual lens in real time comprises:
and responding to the focal length control operation aiming at the lens control area, determining a target focal length corresponding to the focal length control operation, and adjusting the virtual lens from the current focal length to the target focal length so that the graphical user interface displays a game scene picture determined based on the zooming of the virtual lens in real time.
6. The method of claim 1, wherein the video recording control area comprises a lens control area and a first lens auxiliary area, and before generating corresponding recorded data based on the game scene picture displayed in real time, the method further comprises:
under the condition that a touch signal for the first lens auxiliary area is continuously acquired, responding to a first lens translation operation for the lens control area, determining a first lens translation direction corresponding to the first lens translation operation, and controlling a virtual lens to translate in a plane perpendicular to a horizontal plane according to the first lens translation direction, so that the graphical user interface displays a game scene picture determined based on the movement of the virtual lens in real time.
7. The method of claim 1, wherein the video recording control area comprises a lens control area and a second lens auxiliary area, and before generating corresponding recorded data based on the game scene picture displayed in real time, the method further comprises:
under the condition that a touch signal for the second lens auxiliary area is continuously acquired, responding to a second lens translation operation for the lens control area, determining a second lens translation direction corresponding to the second lens translation operation, and controlling a virtual lens to translate in a plane parallel to a horizontal plane according to the second lens translation direction, so that the graphical user interface displays a game scene picture determined based on the movement of the virtual lens in real time.
8. The method of claim 1, wherein the video recording control area comprises a first lens auxiliary area and a second lens auxiliary area, and wherein the method further comprises, before generating corresponding recorded data based on the game scene picture displayed in real time:
and in response to a lens resetting operation aiming at the first lens auxiliary area and the second lens auxiliary area, resetting the virtual lens in a default state according to the lens resetting operation, so that the graphical user interface displays a game scene picture determined based on the reset virtual lens in real time.
9. The method of claim 2, wherein the video recording control area comprises a recording end area, the method further comprising:
and responding to the double-click touch operation aiming at the recording end area, and ending the recording of the game scene picture.
10. The method according to claim 4, wherein the determining a lens turning direction corresponding to the lens turning operation in response to the lens turning operation for the lens control region comprises:
responding to a first sliding operation aiming at the lens control area, and if the first sliding operation is an upward sliding operation, generating an upper steering direction corresponding to the upward sliding operation;
if the first sliding operation is a downward sliding operation, generating a lower steering direction corresponding to the downward sliding operation;
if the first sliding operation is a leftward sliding operation, generating a leftward steering direction corresponding to the leftward sliding operation;
and if the first sliding operation is a rightward sliding operation, generating a rightward steering direction corresponding to the rightward sliding operation.
11. The method of claim 5, wherein the determining the target focal length corresponding to the focal length control operation in response to the focal length control operation for the lens control region comprises:
and in response to the double-finger zooming operation aiming at the lens control area, determining a target focal length corresponding to the double-finger zooming operation.
12. The method according to claim 6, wherein the determining a first lens translation direction corresponding to a first lens translation operation in response to the first lens translation operation for the lens control area while continuously acquiring the touch signal for the first lens auxiliary area comprises:
responding to a long-press touch operation aiming at the first lens auxiliary area, and acquiring a first touch signal corresponding to the long-press touch operation;
under the condition that the first touch signal is continuously detected, responding to a second sliding operation aiming at the lens control area, and if the second sliding operation is an upward sliding operation, generating an upward translation direction corresponding to the upward sliding operation;
if the second sliding operation is a downward sliding operation, generating a downward translation direction corresponding to the downward sliding operation;
if the second sliding operation is a leftward sliding operation, generating a leftward translation direction corresponding to the leftward sliding operation;
and if the second sliding operation is a rightward sliding operation, generating a rightward translation direction corresponding to the rightward sliding operation.
13. The method according to claim 7, wherein the determining a second lens translation direction corresponding to a second lens translation operation in response to the second lens translation operation for the lens control area while continuously acquiring the touch signal for the second lens auxiliary area comprises:
responding to a long-press touch operation aiming at the second lens auxiliary area, and acquiring a second touch signal corresponding to the long-press touch operation;
under the condition that the second touch signal is continuously detected, responding to a second sliding operation aiming at the lens control area, and if the second sliding operation is an upward sliding operation, generating a forward translation direction corresponding to the upward sliding operation;
and if the second sliding operation is a downward sliding operation, generating a backward translation direction corresponding to the downward sliding operation.
14. The method of claim 8, wherein the resetting a virtual lens to a default state according to a lens resetting operation in response to the lens resetting operation for the first lens auxiliary region and the second lens auxiliary region and displaying a corresponding game scene picture in the graphical user interface comprises:
and responding to the touch operation aiming at the first lens auxiliary area and the second lens auxiliary area at the same time, resetting the virtual lens in a default state, and displaying a corresponding game scene picture in the graphical user interface.
15. An in-game video recording apparatus, wherein a graphical user interface is provided through an electronic terminal, the content displayed by the graphical user interface includes a part of a game scene, a virtual character located in the game scene, and an interaction control, the apparatus comprising:
the video recording mode triggering module is used for responding to the triggering of a game video recording mode, canceling the display of the interactive control in the graphical user interface, controlling the virtual lens of the game to enter a video recording mode, and providing a video recording control area for controlling the virtual lens in the graphical user interface, wherein the video recording mode is used for acquiring and recording the content displayed by the graphical user interface in real time;
the video recording control module is used for responding to touch operation acting on the video recording control area, and controlling the virtual lens to execute corresponding lens adjustment operation according to the touch operation, so that the graphical user interface displays a game scene picture determined based on the motion of the virtual lens in real time;
and the recording module is used for generating corresponding recording data based on the game scene picture displayed in real time.
16. An electronic device, comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other via the communication bus;
the memory is used for storing a computer program;
the processor is configured to implement the method of any one of claims 1-14 when executing the program stored on the memory.
17. A computer-readable storage medium having stored thereon instructions, which when executed by one or more processors, cause the processors to perform the method of any one of claims 1-14.
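The claims above recite a mapping from touch gestures to virtual-lens operations: directional slides in the lens control area turn the lens (claim 10), a two-finger pinch changes the focal length (claim 11), the same slides pan the lens vertically or horizontally while the first or second auxiliary area is held (claims 12–13), and touching both auxiliary areas resets the lens (claim 14). As a minimal illustrative sketch only — the patent prescribes no concrete implementation, and every class, method, and constant name below is hypothetical — that mapping could be modeled as:

```python
# Hypothetical sketch of the gesture-to-lens mapping in claims 10-14.
# Not from the patent; all names and numeric defaults are assumptions.

class VirtualLens:
    """Minimal virtual-camera state: rotation, position, focal length."""

    DEFAULT_FOCAL_LENGTH = 50.0

    def __init__(self):
        self.rotation = [0.0, 0.0]       # [yaw, pitch] in degrees
        self.position = [0.0, 0.0, 0.0]  # [x, y, z] in scene units
        self.focal_length = self.DEFAULT_FOCAL_LENGTH

    def reset(self):
        # Claims 8/14: restore the lens to its default state.
        self.__init__()


# Claim 10: each slide direction maps to a turn (yaw, pitch) sign pair.
SLIDE_TO_TURN = {
    "up": (0.0, 1.0),     # upward slide -> upper steering direction
    "down": (0.0, -1.0),  # downward slide -> lower steering direction
    "left": (-1.0, 0.0),  # leftward slide -> leftward steering direction
    "right": (1.0, 0.0),  # rightward slide -> rightward steering direction
}


def apply_slide(lens, direction, first_aux_held=False, second_aux_held=False,
                degrees_per_slide=5.0, units_per_slide=0.5):
    """Dispatch one slide gesture according to which auxiliary area is held.

    - no auxiliary area held -> rotate the lens (claim 10)
    - first auxiliary held   -> pan in the vertical plane (claims 6/12)
    - second auxiliary held  -> pan in the horizontal plane (claims 7/13)
    """
    if first_aux_held:
        # Claim 12: translate in a plane perpendicular to the horizontal plane.
        dx, dy = SLIDE_TO_TURN[direction]
        lens.position[0] += dx * units_per_slide
        lens.position[1] += dy * units_per_slide
    elif second_aux_held:
        # Claim 13: up/down slides map to forward/backward translation.
        if direction == "up":
            lens.position[2] += units_per_slide
        elif direction == "down":
            lens.position[2] -= units_per_slide
    else:
        yaw, pitch = SLIDE_TO_TURN[direction]
        lens.rotation[0] += yaw * degrees_per_slide
        lens.rotation[1] += pitch * degrees_per_slide


def apply_pinch(lens, scale, min_focal=10.0, max_focal=200.0):
    """Claim 11: a two-finger pinch scales the focal length, clamped to a range."""
    lens.focal_length = max(min_focal, min(max_focal, lens.focal_length * scale))
```

In a real engine the slide deltas would arrive per-frame from the touch subsystem rather than as discrete "up"/"down" events, and the rotation and translation would be applied to the engine's camera transform; the table-driven dispatch above only illustrates the claimed direction correspondences.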
CN202211422527.4A 2022-11-14 2022-11-14 Video recording method and device in game, electronic equipment and storage medium Pending CN115814405A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211422527.4A CN115814405A (en) 2022-11-14 2022-11-14 Video recording method and device in game, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211422527.4A CN115814405A (en) 2022-11-14 2022-11-14 Video recording method and device in game, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115814405A (en) 2023-03-21

Family

ID=85528052

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211422527.4A Pending CN115814405A (en) 2022-11-14 2022-11-14 Video recording method and device in game, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115814405A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024255606A1 (en) * 2023-06-16 2024-12-19 网易(杭州)网络有限公司 In-game video recording control method and apparatus, electronic device, and storage medium
WO2025039686A1 (en) * 2023-08-24 2025-02-27 腾讯科技(深圳)有限公司 Virtual object control method and apparatus, electronic device, computer-readable storage medium and computer product
WO2025055694A1 (en) * 2023-09-13 2025-03-20 网易(杭州)网络有限公司 Game editing method and apparatus, and electronic device

Similar Documents

Publication Publication Date Title
CN108958615B (en) Display control method, terminal and computer readable storage medium
CN108737904B (en) Video data processing method and mobile terminal
CN111010510B (en) Shooting control method and device and electronic equipment
CN107707817B (en) video shooting method and mobile terminal
CN107809658A (en) A kind of barrage content display method and terminal
CN109499061B (en) Game scene picture adjusting method and device, mobile terminal and storage medium
CN107786827B (en) Video shooting method, video playing method and device and mobile terminal
CN115814405A (en) Video recording method and device in game, electronic equipment and storage medium
CN110933306A (en) Method for sharing shooting parameters and electronic equipment
CN110300274B (en) Video file recording method, device and storage medium
CN110174993B (en) Display control method, terminal equipment and computer readable storage medium
WO2019174628A1 (en) Photographing method and mobile terminal
CN112817453A (en) Virtual reality equipment and sight following method of object in virtual reality scene
CN108628515B (en) Multimedia content operation method and mobile terminal
CN111104029B (en) Shortcut identifier generation method, electronic device and medium
CN110213440B (en) Image sharing method and terminal
CN110602565A (en) Image processing method and electronic equipment
CN108920069B (en) Touch operation method and device, mobile terminal and storage medium
CN112044065B (en) Virtual resource display method, device, equipment and storage medium
CN110769174B (en) Video viewing method and electronic equipment
CN108124059B (en) Recording method and mobile terminal
CN111025889B (en) Wearable device and control method
CN108132749B (en) Image editing method and mobile terminal
CN111654755B (en) Video editing method and electronic equipment
CN110941378B (en) Video content display method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination