Detailed Description
The present application is described in further detail below with reference to the accompanying drawings, so that its objects, technical solutions, and advantages become more apparent. The described embodiments should not be construed as limiting the present application; all other embodiments obtained by those skilled in the art without inventive effort fall within the scope of the present application.
In the following description, reference is made to "some embodiments", which describe a subset of all possible embodiments. It is to be understood that "some embodiments" may refer to the same subset or to different subsets of all possible embodiments, and that these may be combined with one another where no conflict arises.
In the following description, the terms "first", "second", "third", and the like are used merely to distinguish similar objects and do not denote a specific ordering of those objects. It is to be understood that "first", "second", and "third" may, where permitted, be interchanged in a specific order or sequence, so that the embodiments of the application described herein can be practiced in orders other than those illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the embodiments of the application is for the purpose of describing embodiments of the application only and is not intended to be limiting of the application.
Before the embodiments of the present application are described in further detail, the terms and terminology involved in those embodiments are explained; the explanations below apply throughout the following description.
1) Client: an application program running in a terminal to provide various services, for example a client supporting a virtual scene (e.g., a game client).
2) "In response to": used to indicate the condition or state on which a performed operation depends. When the condition or state on which it depends is satisfied, the one or more operations performed may occur in real time or with a set delay. Unless otherwise specified, no limitation is placed on the execution order of multiple operations performed.
3) Cloud game, also called gaming on demand: an online gaming technology based on cloud computing. Cloud gaming technology enables lightweight devices (thin clients) with relatively limited graphics processing and data computing capabilities to run high-quality games. In a cloud game scenario, the game does not run on the player's game terminal but in a cloud server; the cloud server renders the game scene into video and audio streams, which are transmitted to the player's game terminal over a network. The player's game terminal therefore does not need strong graphics and data processing capabilities; it only needs basic streaming media playback capability and the ability to acquire player input instructions and send them to the cloud server.
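As an illustrative, non-limiting sketch, the division of labor just described, in which the thin client only captures player input and plays back streams while the cloud server runs and renders the game, may be modeled as follows; all class and method names here are assumptions introduced for illustration and are not part of the present application.

```python
# Sketch of the cloud-game loop: the thin client forwards player input
# instructions to the cloud server; the cloud server advances the game
# state and renders each frame into a stream chunk for playback.
# All names are hypothetical.

class CloudServer:
    """Runs the game and renders each frame into a stream chunk."""

    def __init__(self):
        self.state = {"x": 0}  # minimal stand-in for game state

    def apply_input(self, command):
        # Advance the game state from a player input instruction.
        if command == "move_right":
            self.state["x"] += 1

    def render_stream_chunk(self):
        # Render the current scene into an audio/video chunk (stubbed).
        return f"frame(x={self.state['x']})"


class ThinClient:
    """Player game terminal: forwards input, plays the returned stream."""

    def __init__(self, server):
        self.server = server
        self.played = []  # stream chunks "played back" so far

    def send_input(self, command):
        self.server.apply_input(command)
        self.played.append(self.server.render_stream_chunk())


server = CloudServer()
client = ThinClient(server)
client.send_input("move_right")
client.send_input("move_right")
print(client.played[-1])  # frame(x=2)
```

The sketch shows why the terminal needs only input capture and stream playback: all state and rendering stay on the server side.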
4) Applet: a program developed in a front-end-oriented language (e.g., JavaScript) that implements services within a Hypertext Markup Language (HTML) page. An applet can be interpreted and executed in a client immediately after being downloaded by the client, thus saving the step of installing it in the client.
5) Virtual scene: a simulated environment of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. In the embodiments of the present application, the virtual scene may be a game.
The embodiments of the present application provide a live broadcast method and apparatus for a virtual scene, an electronic device, a computer-readable storage medium, and a computer program product, which can improve device resource utilization by starting a live broadcast of the virtual scene from a device serving as a virtual handle. Each of these is described below.
It should be noted that, in applications of the present application, the relevant data collection process should strictly obtain the informed consent or separate consent of the personal information subject in accordance with the requirements of the relevant national laws and regulations, and subsequent data use and processing should be carried out within the scope authorized by the laws, regulations, and the personal information subject.
The following describes a live broadcast system for a virtual scene provided by an embodiment of the present application. Referring to fig. 1, fig. 1 is a schematic architecture diagram of a live broadcast system 100 of a virtual scene according to an embodiment of the present application. To support an exemplary application, a first terminal 400-1 (corresponding to the hosting end of the virtual scene), a third terminal 400-2 (corresponding to the viewer end of the virtual scene), and a server 200 are communicatively connected via a network 300. The network 300 may be a wide area network, a local area network, or a combination of the two, and data transmission is implemented over wireless or wired links.
The first terminal 400-1 is configured to display a virtual handle interface, where the virtual handle interface is configured to control a virtual object in a virtual scene displayed by a second terminal and includes a live control; and, in response to a triggering operation on the live control, to send a live instruction for the virtual scene to the server 200;
the server 200 is configured to receive the live instruction for the virtual scene sent by the first terminal 400-1; to start, in response to the live instruction, a live broadcast of the virtual scene; and to transmit a live picture of the virtual scene to the third terminal 400-2;
the third terminal 400-2 is configured to receive the live picture of the virtual scene sent by the server 200, and to display the live picture of the virtual scene. In some examples, the live picture of the virtual scene may include at least one of: a scene picture of the virtual scene, comment information published for the live broadcast of the virtual scene, and live content of the anchor object corresponding to the live broadcast of the virtual scene.
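As an illustrative, non-limiting sketch, the three-party flow of fig. 1, in which the first terminal (hosting end) sends a live instruction, the server starts the live broadcast, and the third terminal (viewer end) receives and displays the live picture, may be modeled as follows; all names are assumptions for illustration only.

```python
# Sketch of the fig. 1 architecture: first terminal -> server -> third
# terminal. The server starts the live broadcast on instruction and
# pushes live pictures to registered viewers. All names are hypothetical.

class Server:
    def __init__(self):
        self.live_started = False
        self.viewers = []

    def handle_live_instruction(self):
        # Start the live broadcast of the virtual scene on instruction.
        self.live_started = True

    def push_live_picture(self, picture):
        for viewer in self.viewers:
            viewer.receive(picture)


class FirstTerminal:
    """Hosting end: shows the virtual handle interface with a live control."""

    def __init__(self, server):
        self.server = server

    def trigger_live_control(self):
        # In response to the triggering operation on the live control,
        # send a live instruction for the virtual scene to the server.
        self.server.handle_live_instruction()


class ThirdTerminal:
    """Viewer end: displays each live picture it receives."""

    def __init__(self):
        self.displayed = []

    def receive(self, picture):
        self.displayed.append(picture)


server = Server()
host = FirstTerminal(server)
viewer = ThirdTerminal()
server.viewers.append(viewer)

host.trigger_live_control()
server.push_live_picture("live-picture-001")
print(server.live_started, viewer.displayed)  # True ['live-picture-001']
```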
In some embodiments, the live broadcast method for a virtual scene provided by the embodiments of the present application may be implemented by various electronic devices, for example by a terminal alone (e.g., the first terminal 400-1 and the third terminal 400-2), or by a terminal and a server cooperatively. The method can be applied to various scenarios, including but not limited to cloud technology, artificial intelligence, intelligent transportation, assisted driving, and games.
In some embodiments, the server (e.g., server 200) may be a stand-alone physical server, or may be a server cluster or a distributed system of multiple physical servers. The terminals (e.g., the first terminal 400-1, the third terminal 400-2) may be, but are not limited to, a notebook computer, a tablet computer, a desktop computer, a smart phone, a smart home appliance (e.g., a smart television), a smart watch, a vehicle-mounted terminal, a wearable device, a Virtual Reality (VR) device, etc. The terminal and the server may be directly or indirectly connected through wired or wireless communication, which is not limited by the embodiment of the present application.
In some embodiments, the live broadcast method for a virtual scene provided by the embodiments of the present application can be implemented by means of cloud technology. Cloud technology refers to a hosting technology that unifies a series of resources, such as hardware, software, and networks, within a wide area network or a local area network to realize the computation, storage, processing, and sharing of data. It is a general term for the network technology, information technology, integration technology, management platform technology, application technology, and the like applied on the basis of the cloud computing business model; these resources can form a resource pool to be used on demand, flexibly and conveniently. Cloud computing technology will become an important support, because the background services of networked technical systems require large amounts of computing and storage resources. As an example, a server (e.g., the server 200) may also be a cloud server that provides basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, web services, cloud communications, middleware services, domain name services, security services, Content Delivery Networks (CDNs), and big data and artificial intelligence platforms.
In some embodiments, multiple servers may be organized into a blockchain, with each server acting as a node on the blockchain; information connections may exist between the nodes, and information may be transferred between nodes via these connections. Data related to the live broadcast method for a virtual scene provided by the embodiments of the present application (for example, scene data of the virtual scene) may be stored on the blockchain.
In some embodiments, the terminal or the server may implement the live broadcast method for a virtual scene provided by the embodiments of the present application by running various computer-executable instructions or computer programs. For example, the computer-executable instructions may be commands at the microprogram level, machine instructions, or software instructions. The computer program may be a native program or a software module in an operating system; a native application (APP), i.e., a program that must be installed in an operating system to run, such as a game client or a live client; or an applet, i.e., a program that can run simply by being downloaded into a browser environment and that can be embedded in any APP. In general, the computer-executable instructions may be instructions of any form, and the computer program may be an application, module, or plug-in of any form.
The electronic device for implementing the live broadcasting method of the virtual scene provided by the embodiment of the application is described below. Referring to fig. 2, fig. 2 is a schematic structural diagram of an electronic device 500 implementing a live method of a virtual scene according to an embodiment of the present application. The electronic device 500 provided in the embodiment of the present application may be a terminal (for example, the first terminal 400-1 described above) or a server. The electronic device 500 provided in the embodiment of the application includes: at least one processor 510, a memory 550, at least one network interface 520, and a user interface 530. The various components in electronic device 500 are coupled together by bus system 540. It is appreciated that the bus system 540 is used to enable connected communications between these components. The bus system 540 includes a power bus, a control bus, and a status signal bus in addition to the data bus. The various buses are labeled as bus system 540 in fig. 2 for clarity of illustration.
The processor 510 may be an integrated circuit chip with signal processing capabilities, such as a general-purpose processor, a Digital Signal Processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, where the general-purpose processor may be a microprocessor, any conventional processor, or the like.
The user interface 530 includes one or more output devices 531 that enable presentation of media content, including one or more speakers and/or one or more visual displays. The user interface 530 also includes one or more input devices 532, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
The memory 550 may be removable, non-removable, or a combination thereof. The memory 550 may include one or more storage devices physically located away from the processor 510. The memory 550 includes volatile memory, non-volatile memory, or both: the non-volatile memory may be a Read-Only Memory (ROM), and the volatile memory may be a Random Access Memory (RAM). The memory 550 described in the embodiments of the present application is intended to comprise any suitable type of memory.
In some embodiments, memory 550 is capable of storing data to support various operations, examples of which include programs, modules and data structures, or subsets or supersets thereof, as exemplified below.
an operating system 551, including system programs, such as a framework layer, a core library layer, and a driver layer, for handling various basic system services and performing hardware-related tasks;
a network communication module 552, used to reach other electronic devices via one or more (wired or wireless) network interfaces 520; exemplary network interfaces 520 include Bluetooth, Wireless Fidelity (Wi-Fi), Universal Serial Bus (USB), and the like;
a presentation module 553 for enabling presentation of information (e.g., a user interface for operating a peripheral device and displaying content and information) via one or more output devices 531 (e.g., a display screen, speakers, etc.) associated with the user interface 530;
The input processing module 554 is configured to detect one or more user inputs or interactions from one of the one or more input devices 532 and translate the detected inputs or interactions.
In some embodiments, the live broadcast apparatus for a virtual scene provided by the embodiments of the present application may be implemented in software. Fig. 2 shows a live broadcast apparatus 555 of a virtual scene stored in the memory 550, which may be software in the form of a program, a plug-in, or the like, including the following software modules: a display module 5551 and a first transmitting module 5552. These modules are logical divisions and may therefore be combined or further split arbitrarily according to the functions implemented; the functions of each module are described below.
The following describes a live broadcast method of a virtual scene provided by the embodiment of the application. Referring to fig. 3, fig. 3 is a flow chart of a live broadcast method of a virtual scene according to an embodiment of the present application, where the live broadcast method of a virtual scene according to the embodiment of the present application includes:
Step 101: the first terminal displays a virtual handle interface.
The virtual handle interface is used for controlling virtual objects in a virtual scene displayed by the second terminal, and comprises a live control.
Here, the first terminal may be provided with an application program supporting a virtual handle, such as a virtual handle client or a virtual handle applet. When the user needs to use the virtual handle, the user can trigger a run instruction for the application program on the first terminal; in response to the run instruction, the first terminal runs the application program and displays the virtual handle interface, so that the first terminal can serve as a virtual handle. The virtual handle interface may display the operation controls of the virtual handle. As an example, referring to fig. 4A, the virtual handle interface includes a plurality of operation controls, for example operation control 1, operation control 2, and so on. Through the operation controls displayed in the virtual handle interface, the user can control the virtual object in the virtual scene.
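As an illustrative, non-limiting sketch, the mapping from operation controls in the virtual handle interface to commands for the virtual object may be modeled as follows; the particular control names and commands are assumptions for illustration and are not prescribed by the present application.

```python
# Sketch: a trigger on an operation control in the virtual handle
# interface is translated into a control command for the virtual object
# and handed to a sender (towards the second terminal or a server).
# The mapping below is hypothetical.

OPERATION_CONTROLS = {
    "operation control 1": "jump",
    "operation control 2": "attack",
}


def on_control_triggered(control_name, send):
    """Translate a triggering operation on an interface control into a
    command sent toward the virtual scene."""
    command = OPERATION_CONTROLS.get(control_name)
    if command is not None:
        send(command)


sent = []
on_control_triggered("operation control 1", sent.append)
print(sent)  # ['jump']
```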
It should be noted that, in some embodiments, the second terminal may be configured to display the virtual scene while the virtual scene actually runs on a server (such as a cloud server). After the server finishes rendering a scene picture of the virtual scene, it sends the scene picture to the second terminal, and the second terminal displays the virtual scene based on the received scene picture; the virtual scene may thus be a virtual scene in a cloud service mode, such as a cloud game. In practical applications, the first terminal and the second terminal may establish a communication connection, so that the first terminal controls the virtual object in the virtual scene through the virtual handle interface. Alternatively, the first terminal and the second terminal may each establish a communication connection with the server: the first terminal obtains, through the virtual handle interface, operation information triggered by the user for the virtual object in the virtual scene and sends the operation information to the server; the server renders the scene data of the virtual scene according to the operation information and sends the resulting scene picture to the second terminal; and the second terminal displays the virtual scene after the user has controlled the virtual object through the virtual handle interface.
In other embodiments, the virtual scene may also be run on a second terminal that has established a communication connection with the first terminal. On the premise that the first terminal and the second terminal establish communication connection, the first terminal can be used as a virtual handle of a virtual scene to control a virtual object of the virtual scene running on the second terminal. In practical application, the first terminal may be a smart phone, and the second terminal may be a smart television, a tablet computer, or the like.
In some embodiments, the first terminal may perform the following processing before displaying the virtual handle interface: receiving a scanning operation for a graphic code displayed by a second terminal; in response to the scanning operation, a communication connection between the first terminal and the second terminal is established. Here, the user may establish a communication connection between the first terminal and the second terminal by scanning the graphic code displayed by the second terminal. Therefore, the user operation can be simplified, the establishment efficiency of the communication connection can be improved, and the user experience can be improved.
In some embodiments, in response to the scanning operation, the first terminal may establish the communication connection with the second terminal as follows: in response to the scanning operation, a virtual handle applet is run, and the communication connection between the first terminal and the second terminal is established through the virtual handle applet. That is, when the first terminal receives a scanning operation for the graphic code displayed by the second terminal, it runs the virtual handle applet in response to the scanning operation, and the applet establishes the connection. Because an applet is a lightweight application program that saves the step of installing a client and starts quickly, this improves the efficiency of establishing the communication connection, reduces the occupation of the first terminal's hardware processing and storage resources, and improves the user experience.
In some embodiments, the first terminal may display the virtual handle interface by: and when the communication connection between the first terminal and the second terminal is successfully established, the first terminal displays a virtual handle interface. When the communication connection between the first terminal and the second terminal is successfully established, the first terminal directly displays the virtual handle interface, so that the purpose of rapidly displaying the virtual handle interface is achieved. As an example, referring to fig. 5, fig. 5 is a schematic view of a display flow of a virtual handle interface according to an embodiment of the present application. Here, the second terminal displays a graphic code as shown in (1) of fig. 5; the first terminal receives a scanning operation for the graphic code as shown in (2) of fig. 5; the first terminal establishes a communication connection between the first terminal and the second terminal in response to a scanning operation for the graphic code, and displays a virtual handle interface when the communication connection between the first terminal and the second terminal is established successfully, as shown in (3) of fig. 5.
In some embodiments, the first terminal may perform the following processing before displaying the virtual handle interface: displaying a device connection control for the second terminal; responding to triggering operation for the equipment connection control, and establishing communication connection between the first terminal and the second terminal; and when the communication connection between the first terminal and the second terminal is successfully established, the first terminal displays a virtual handle interface. Here, the first terminal may provide a device connection control for the second terminal, through which a communication connection between the first terminal and the second terminal may be established. As an example, referring to fig. 6, fig. 6 is a schematic flow chart of establishing a communication connection according to an embodiment of the present application. Here, the first terminal displays a device connection control for the second terminal, as shown in (1) in fig. 6; in response to a trigger operation for the device connection control, displaying the identified connectable device including the second terminal therein, as shown in (2) in fig. 6; in response to an acknowledgement selection operation for the second terminal, a communication connection between the first terminal and the second terminal is established. In practical application, after the communication connection between the first terminal and the second terminal is established, the first terminal can directly display the virtual handle interface, so that the purpose of rapidly displaying the virtual handle interface is achieved.
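As an illustrative, non-limiting sketch, establishing the connection from a scanned graphic code may proceed as follows. The payload format (a host, a port, and a pairing token) is an assumption introduced for illustration; the present application does not prescribe the content of the graphic code.

```python
# Sketch: the second terminal displays a graphic code; the first
# terminal scans it, parses the connection information, and runs the
# connection step (e.g., inside the virtual handle applet). The payload
# scheme "handle://host:port/token" is hypothetical.

def parse_graphic_code(payload):
    """Parse a scanned payload like 'handle://192.168.1.10:7000/abc123'."""
    prefix = "handle://"
    if not payload.startswith(prefix):
        raise ValueError("not a virtual handle graphic code")
    address, token = payload[len(prefix):].split("/", 1)
    host, port = address.split(":")
    return {"host": host, "port": int(port), "token": token}


def establish_connection(payload, connect):
    """Parse the graphic code and attempt the connection; on success the
    caller can then display the virtual handle interface."""
    info = parse_graphic_code(payload)
    return connect(info["host"], info["port"], info["token"])


ok = establish_connection(
    "handle://192.168.1.10:7000/abc123",
    lambda host, port, token: token == "abc123",  # stubbed handshake
)
print(ok)  # True
```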
In the embodiments of the present application, the virtual handle interface includes a live control, through which the user can start a live broadcast of the virtual scene. Referring to fig. 4A, a live control "one-key live" for the virtual scene is displayed in the virtual handle interface. In this way, the first terminal can not only serve as a virtual handle to control the virtual object of the virtual scene displayed by the second terminal, but also control the starting of the live broadcast of the virtual scene; this enriches the functions of the device serving as a virtual handle and improves the utilization rate of its device resources. Moreover, since the live broadcast of the virtual scene is started with a single tap on the live control, the operation is simple, which improves the efficiency of starting the live broadcast of the virtual scene.
Step 102: and responding to the triggering operation for the live control, and sending a live instruction for the virtual scene.
The live broadcast instruction is used for indicating to start live broadcast aiming at the virtual scene.
Here, when the first terminal receives a trigger operation for the live control, a live instruction for the virtual scene is transmitted in response to the trigger operation. The live instruction is used for indicating to start live broadcast aiming at the virtual scene. In some examples, the first terminal may send a live instruction to the second terminal, and the second terminal may initiate live broadcasting for the virtual scene and display a live screen of the virtual scene in response to the live instruction.
In other examples, the first terminal may send the live instruction to a server, and the server controls the starting of the live broadcast of the virtual scene. Specifically, the first terminal may send the live instruction to a live server, and the live server starts the live broadcast of the virtual scene. After the live server starts the live broadcast, the first terminal may send live content of the virtual scene (such as the anchor's picture and voice) to the live server; the live server may then send the live content to the cloud server of the virtual scene; after receiving the live content, the cloud server composites the live content with the scene picture of the virtual scene to obtain the live picture of the virtual scene, and pushes the live picture to the second terminal for display. Specifically, the cloud server may send the live picture to the second terminal directly, or forward it to the second terminal through the live server.
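As an illustrative, non-limiting sketch, the relay-and-composite path just described, from the first terminal through the live server to the cloud server, which composites the live content with the scene picture and pushes the result, may be modeled as follows; the compositing here is a trivial stand-in, and all names are assumptions for illustration.

```python
# Sketch: first terminal -> live server -> cloud server. The cloud
# server composites the anchor's live content with the scene picture of
# the virtual scene and pushes the resulting live picture onward.
# All names are hypothetical.

class CloudServer:
    def __init__(self, push):
        self.push = push                     # pushes toward a terminal
        self.scene_frame = "scene-frame-001" # rendered scene picture

    def on_live_content(self, content):
        # Composite the live content with the scene picture (stand-in:
        # string concatenation) to obtain the live picture.
        live_picture = f"{self.scene_frame}+{content}"
        self.push(live_picture)


class LiveServer:
    """Receives live content from the first terminal and forwards it."""

    def __init__(self, cloud):
        self.cloud = cloud

    def forward(self, content):
        self.cloud.on_live_content(content)


pushed = []
cloud = CloudServer(push=pushed.append)
live_server = LiveServer(cloud)

# The first terminal sends live content (e.g., anchor video and voice).
live_server.forward("anchor-cam-001")
print(pushed)  # ['scene-frame-001+anchor-cam-001']
```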
In practical applications, the live picture of the virtual scene includes at least one of the following: a scene picture of the virtual scene, comment information published for the live broadcast of the virtual scene, and live content of the anchor object corresponding to the live broadcast of the virtual scene.
In some embodiments, the first terminal may send the live instruction for the virtual scene in response to a trigger operation for the live control by: responding to the triggering operation for the live broadcast control, and displaying an opening prompt control of the live broadcast acquisition equipment; the live broadcast acquisition equipment comprises at least one of image acquisition equipment and sound acquisition equipment; and responding to a confirmation starting instruction triggered based on the starting prompt control, starting the live broadcast acquisition equipment, and sending a live broadcast instruction aiming at the virtual scene.
Here, when a live broadcast of the virtual scene is performed, a live acquisition device may additionally be turned on to acquire live content during the live broadcast. The live acquisition device includes at least one of an image acquisition device, for acquiring video images during the live broadcast (such as video images of the anchor), and a sound acquisition device, for acquiring audio information during the live broadcast (such as the anchor's voice information). In practical applications, when the first terminal receives a triggering operation on the live control, it displays an opening prompt control of the live acquisition device in response to the triggering operation, so that whether to turn on the live acquisition device is confirmed through the opening prompt control. When a confirmation opening instruction triggered through the opening prompt control is received, the first terminal turns on the live acquisition device in response to the instruction and, at the same time, sends the live instruction for the virtual scene.
As an example, referring to fig. 7, fig. 7 is a schematic diagram of the opening flow of a live acquisition device according to an embodiment of the present application. Here, the first terminal displays the live control, as shown in (1) in fig. 7. In response to a triggering operation on the live control, the first terminal displays the opening prompt control of the live acquisition device, which includes a confirmation control for triggering a confirmation opening instruction, a cancel control for triggering a cancellation of the opening, and the prompt information "please open the live broadcast acquisition device". In response to the confirmation opening instruction triggered through the confirmation control "confirm", the live acquisition device is turned on, as shown in (2) in fig. 7.
In some embodiments, the first terminal may turn on the live acquisition device as follows: a device opening control of the live acquisition device is displayed in the virtual handle interface, where the live acquisition device includes at least one of an image acquisition device and a sound acquisition device; and, in response to a triggering operation on the device opening control, the live acquisition device is turned on. Here, turning on the live acquisition device may be independent of sending the live instruction, so that a live broadcast of the virtual scene alone can be started without the anchor providing live content. In this case, the device opening control can be displayed in the virtual handle interface, so that the user can turn on the live acquisition device through this control at any moment during the live broadcast, according to personal needs. When the first terminal receives a triggering operation on the device opening control, it turns on the live acquisition device in response to the triggering operation. The live acquisition device is used to acquire live content during the live broadcast and includes at least one of an image acquisition device, for acquiring video images (such as video images of the anchor), and a sound acquisition device, for acquiring audio information during the live broadcast (such as the anchor's voice information). As an example, referring to fig. 8, fig. 8 is a schematic display diagram of a virtual handle interface provided by an embodiment of the present application, in which a device opening control "start live acquisition device" of the live acquisition device is displayed.
In some embodiments, the first terminal may live by: acquiring live broadcast content acquired based on live broadcast acquisition equipment; and sending the live broadcast content to a server, wherein the server is used for sending the live broadcast content to a third terminal of the live broadcast watching object so as to enable the third terminal to output the live broadcast content. Here, the live broadcast content acquired by the live broadcast acquisition device may be sent to the third terminal of the live broadcast viewing object, so that the live broadcast viewing object views the live broadcast content of the live broadcast main broadcasting object while viewing the scene picture of the live broadcast virtual scene, thereby improving the live broadcast effect. Specifically, the live broadcast content may be sent to a live broadcast server, the live broadcast server forwards the live broadcast content to a cloud server running the virtual scene, after the cloud server receives the live broadcast content, the live broadcast content and a scene picture of the virtual scene are synthesized to obtain a live broadcast picture of the virtual scene, and the live broadcast picture of the virtual scene is pushed to a third terminal for display. Thus, the third terminal can output the live content.
In some embodiments, the first terminal may turn off the live acquisition device by: displaying a device closing control of the live broadcast acquisition device in the virtual handle interface; and closing the live broadcast acquisition device in response to a triggering operation for the device closing control. Here, after the live acquisition device is turned on, it may also be turned off through the device closing control. Thus, the anchor object can conveniently open or close the live broadcast acquisition device as required, which improves the degree of freedom of live broadcasting and the live broadcast experience of the anchor object. As an example, with continued reference to fig. 8, a device closing control "close live acquisition device" of the live acquisition device is displayed in the virtual handle interface.
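The opening and closing controls of the two embodiments above can be sketched as a small state tracker on the first terminal. This is a hypothetical illustration, assuming the two device types named in the text; it is not the actual applet code.

```python
# Hypothetical sketch of the first terminal's acquisition-device state.
class LiveAcquisitionDevices:
    """Tracks the on/off state of the image and sound acquisition devices."""

    def __init__(self):
        self.state = {"image": False, "sound": False}

    def on_open_control_triggered(self, devices=("image", "sound")):
        # Opening is independent of the live instruction: it may be
        # triggered at any moment during the live broadcast.
        for device in devices:
            self.state[device] = True

    def on_close_control_triggered(self, devices=("image", "sound")):
        for device in devices:
            self.state[device] = False


devices = LiveAcquisitionDevices()
devices.on_open_control_triggered()            # "start live broadcast acquisition device"
assert devices.state == {"image": True, "sound": True}
devices.on_close_control_triggered(("image",)) # close only the camera
assert devices.state == {"image": False, "sound": True}
```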
In some embodiments, the first terminal may close the live broadcast for the virtual scene by: displaying a live broadcast closing control of the virtual scene in the virtual handle interface; and sending, in response to a triggering operation for the live broadcast closing control, a live broadcast closing instruction for the virtual scene, wherein the live broadcast closing instruction is used for indicating to close the live broadcast for the virtual scene. Here, in the virtual handle interface, a live close control of the virtual scene may also be provided, through which the live broadcast for the virtual scene may be closed. Thus, the anchor can start the live broadcast through the live control and close it through the live close control, so that the functions of the device serving as a virtual handle are further enriched and user experience is improved. As an example, referring to fig. 4B, after a triggering operation for the live control in the virtual handle interface is triggered and the live broadcast is started, the virtual handle interface switches from displaying the live control "one-key on live" to displaying the live close control "off live". In some examples, the first terminal may send the live close instruction to the second terminal, and the second terminal closes the live broadcast for the virtual scene and displays the scene picture of the virtual scene in response to the live close instruction. In other examples, the first terminal may send the live close instruction to the server, and the server controls closing the live broadcast for the virtual scene and controls the second terminal to display the scene picture of the virtual scene.
For example, when the virtual scene is a cloud game, the first terminal may send a live broadcast closing instruction to the cloud game server, and the cloud game server closes live broadcast for the virtual scene and pushes a scene picture of the virtual scene to the second terminal for display.
In some embodiments, the first terminal may share live broadcast by: displaying a sharing control for live broadcasting of the virtual scene in the virtual handle interface; based on the sharing control, receiving a live sharing instruction aiming at a target object; and responding to the live broadcast sharing instruction, sending live broadcast invitation information to the server, so that the server sends the live broadcast invitation information to the fourth terminal of the target object, wherein the live broadcast invitation information is used for inviting the target object to watch live broadcast of the virtual scene.
Here, in the virtual handle interface, a sharing control for the live broadcast of the virtual scene may also be provided. Through the sharing control, the live broadcast for the virtual scene can be shared with other users so as to invite them to watch the live broadcast of the virtual scene. In practical applications, a live sharing instruction for a target object may be triggered through the sharing control; for example, the first terminal displays at least one object for selection in response to a triggering operation for the sharing control, where the at least one object includes the target object, and receives the live sharing instruction for the target object in response to a confirmed selection operation for the target object. Then, the first terminal responds to the live sharing instruction by sending live invitation information to a server (such as a server of an instant messaging client); after receiving the live invitation information, the server sends it to the fourth terminal of the target object, and the target object can then watch the live broadcast of the virtual scene based on the live invitation information.
In practical applications, when a target object is selected based on the sharing control, a sharing manner may also be selected, such as short message sharing, instant messaging application sharing, social friend circle sharing, and the like. As an example, with continued reference to fig. 8, a sharing control "live share" for the live broadcast of the virtual scene is displayed in the virtual handle interface. Therefore, the live broadcast can be shared with other users through the sharing control, attracting more users to watch and improving the anchor's enthusiasm and initiative for live broadcasting, so that the functions of the device serving as a virtual handle are further enriched and user experience is improved.
In some embodiments, the first terminal may release information in the live broadcast process by: displaying an information release control for the live broadcast of the virtual scene in the virtual handle interface; receiving, based on the information release control, target information input for the live broadcast of the virtual scene; and sending, in response to a release instruction for the target information, the target information to a server, wherein the server is used for sending the target information to the third terminal of the live broadcast viewing object, so that the third terminal displays the target information.
Here, in the virtual handle interface, an information release control for the live broadcast of the virtual scene may also be provided, based on which a user (e.g., the anchor) can release information during the live broadcast. In this way, the target information input for the live broadcast of the virtual scene can be received based on the information release control; when a release instruction for the target information is received, the target information is sent to the server (such as the live broadcast server) in response to the release instruction, and the live broadcast server distributes the target information to the third terminal of the live broadcast viewing object, so that the third terminal displays the target information. As an example, with continued reference to fig. 8, an information release control "info post" for the live broadcast of the virtual scene is displayed in the virtual handle interface. Therefore, the content the anchor wants to express can be sent to the live viewing object through the information release control, improving interactivity between the anchor and the audience and thereby the anchor's enthusiasm and initiative for live broadcasting, so that the functions of the device serving as a virtual handle are further enriched and user experience is improved.
It should be noted that, the triggering operation mentioned in the embodiment of the present application may be any operation capable of triggering a corresponding function, such as clicking, double clicking, dragging, etc., and the control mentioned in the embodiment of the present application may be equivalent to a function item, and may have various presentation forms, such as a graphic button, a progress bar, a menu, a list, etc., which is not limited in the embodiment of the present application.
By applying the embodiment of the present application, the first terminal displays a virtual handle interface used for controlling a virtual object in the virtual scene displayed by the second terminal, and the virtual handle interface includes a live control through which a live instruction for the virtual scene can be triggered, the live instruction being used for indicating to start the live broadcast for the virtual scene. Here, the first terminal, serving as a virtual handle, can not only control the virtual object in the virtual scene displayed by the second terminal, but also start the live broadcast for the virtual scene. Thus, 1) the functions of the device serving as a virtual handle are enriched, improving the utilization rate of device resources; 2) the live broadcast for the virtual scene is started with one tap of the live control, which is a simple operation that improves the efficiency of starting the live broadcast of the virtual scene and further improves the utilization rate of device resources.
The following continues to describe the live broadcast method of the virtual scene provided by the embodiment of the application. Referring to fig. 9, fig. 9 is a flow chart of a live broadcast method of a virtual scene according to an embodiment of the present application, where the live broadcast method of a virtual scene according to an embodiment of the present application includes:
step 201: the server receives live content of the virtual scene.
The virtual scene is displayed on the second terminal, and virtual objects in the virtual scene are controlled through a virtual handle interface displayed on the first terminal, wherein the virtual handle interface comprises a live control.
The live broadcast content is sent after live broadcast aiming at the virtual scene is started according to a live broadcast instruction triggered by the first terminal through the live broadcast control.
In practical applications, the server may be a server for running a virtual scene, for example, the virtual scene is a cloud game, and the server is a cloud game server. The server can render the scene data of the virtual scene to obtain a scene picture of the virtual scene, and then the scene picture of the virtual scene is pushed to the second terminal for display.
In the embodiment of the application, a live control is displayed in a virtual handle interface displayed on the first terminal, and a user can trigger a live instruction aiming at a virtual scene based on the live control at the first terminal so as to control and start live aiming at the virtual scene.
Step 202: and synthesizing the live broadcast content with the scene picture of the virtual scene to obtain the live broadcast picture of the virtual scene.
In practical application, after a user triggers a live broadcast instruction for a virtual scene based on the live broadcast control at a first terminal, the first terminal sends the live broadcast instruction to a live broadcast server. At the moment, the live broadcast server responds to the live broadcast instruction to start live broadcast aiming at the virtual scene, and meanwhile, the live broadcast server can also return prompt information of successful live broadcast start to the first terminal. After receiving prompt information of successful live broadcast starting, the first terminal confirms that live broadcast aiming at the virtual scene is started, and at the moment, the collected live broadcast content is sent to a live broadcast server. And after receiving the live content, the live server forwards the live content to a server running the virtual scene. The server running the virtual scene can synthesize the scene picture of the virtual scene with the live broadcast content to obtain the live broadcast picture of the virtual scene.
Step 203: and sending the live broadcast picture of the virtual scene to the second terminal so that the second terminal displays the live broadcast picture of the virtual scene.
In practical application, after the server running the virtual scene obtains the live broadcast picture of the virtual scene, the live broadcast picture of the virtual scene is sent to the second terminal, specifically, the server may directly send the live broadcast picture to the second terminal, or may forward the live broadcast picture to the second terminal through the live broadcast server. And after the second terminal receives the live broadcast picture of the virtual scene, displaying the live broadcast picture of the virtual scene.
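The synthesis in step 202 can be sketched as overlaying the (smaller) live content onto the scene picture, picture-in-picture style. This is an illustrative sketch that treats both pictures as plain 2-D grids of pixel values; a real server would operate on decoded video frames.

```python
# Sketch of step 202: composite the live content onto the scene picture.
def composite(scene_picture, live_content, top=0, left=0):
    """Overlay live_content onto a copy of scene_picture at (top, left)."""
    frame = [row[:] for row in scene_picture]  # copy, leave the original intact
    for r, row in enumerate(live_content):
        for c, pixel in enumerate(row):
            frame[top + r][left + c] = pixel
    return frame

scene = [[0] * 4 for _ in range(3)]  # 4x3 scene picture of the virtual scene
cam = [[9, 9], [9, 9]]               # 2x2 live content of the anchor
live_frame = composite(scene, cam, top=1, left=2)
assert live_frame[1][2] == 9 and live_frame[0][0] == 0
```

The resulting `live_frame` is what step 203 would then push to the second terminal for display.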
With continued reference to fig. 14, fig. 14 is a flowchart of a live broadcast method of a virtual scene according to an embodiment of the present application, where the live broadcast method of a virtual scene according to an embodiment of the present application includes:
step 301: the first terminal displays a virtual handle interface including a live control.
Here, the first terminal corresponds to a live anchor.
Step 302: and the first terminal responds to the triggering operation for the live control and sends a live instruction for the virtual scene to the live server.
Step 303: and the live broadcast server responds to the received live broadcast instruction, starts live broadcast aiming at the virtual scene, and returns a live broadcast start success notification to the first terminal.
Step 304: and the first terminal receives the live broadcast starting success notification and sends live broadcast contents of the virtual scene to the live broadcast server.
Step 305: the live broadcast server receives live broadcast content of the virtual scene and sends the live broadcast content of the virtual scene to the virtual scene server running the virtual scene.
Here, when the virtual scene is a cloud game, the virtual scene server is a cloud game server.
Step 306: the virtual scene server acquires scene pictures of the virtual scene obtained through rendering, and synthesizes live broadcast contents and the scene pictures to obtain live broadcast pictures of the virtual scene.
Step 307: and the virtual scene server sends the live broadcast picture to the live broadcast server.
Step 308: the live broadcast server distributes the live broadcast picture to the second terminal and a third terminal of an audience member watching the live broadcast.
Here, the second terminal is the anchor-side terminal for displaying the virtual scene.
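The message flow of steps 301-308 can be sketched with plain function calls standing in for the network requests. The terminal and server names follow the text above; the payloads are illustrative.

```python
# Sketch of the flow in steps 301-308, one function per server role.
def live_server_start_live(instruction):
    # Step 303: start live for the virtual scene, notify the first terminal.
    return {"status": "live_started", "scene": instruction["scene"]}

def virtual_scene_server_composite(live_content, scene_picture):
    # Step 306: synthesize the live content with the rendered scene picture.
    return {"scene_picture": scene_picture, "live_content": live_content}

def live_server_distribute(live_picture, terminals):
    # Step 308: distribute the live picture to the second and third terminals.
    return {t: live_picture for t in terminals}

notice = live_server_start_live({"scene": "cloud_game_1"})              # steps 302-303
assert notice["status"] == "live_started"
picture = virtual_scene_server_composite("anchor video", "game frame")  # steps 304-306
screens = live_server_distribute(picture, ["second", "third"])          # steps 307-308
assert screens["second"]["live_content"] == "anchor video"
```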
The second terminal will be described next. After the second terminal receives the live broadcast picture of the virtual scene, the live broadcast picture of the virtual scene is displayed. In some embodiments, before displaying the live broadcast picture of the virtual scene, the second terminal may further perform the following processing: determining, in response to a display instruction for the virtual scene, whether the second terminal has established a communication connection with the first terminal; when the communication connection has been established, determining to perform the operation of running the virtual scene; and when no communication connection has been established, displaying connection guiding information, wherein the connection guiding information indicates a manner of establishing a communication connection between the second terminal and the first terminal.
Here, the virtual scene may be run by triggering a display instruction for the virtual scene at the second terminal. In response to the display instruction, the second terminal may first determine whether it has established a communication connection with the first terminal. If the communication connection has been established, the virtual scene is displayed; if not, connection guiding information may be displayed, which indicates a manner of establishing a communication connection between the second terminal and the first terminal. For example, the connection guiding information may include a graphic code (such as a two-dimensional code) and corresponding hint information "scan two-dimensional code to connect"; the connection guiding information may also include only a hint, such as "open cell phone XXX to connect".
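The branch just described can be sketched as a single decision on the second terminal. This is an illustrative sketch assuming the terminal keeps a list of connected handle terminals; the names are not actual client code.

```python
# Sketch of the second terminal's response to a display instruction.
def on_display_instruction(connected_handles):
    """Decide what to do when a display instruction for the virtual scene arrives."""
    if connected_handles:
        # Communication connection established: run the virtual scene.
        return "run_virtual_scene"
    # No connection yet: show the connection guiding information.
    return "show_connection_guiding_info"

assert on_display_instruction([]) == "show_connection_guiding_info"
assert on_display_instruction(["first_terminal"]) == "run_virtual_scene"
```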
In some embodiments, the connection guiding information includes a graphic code, where the graphic code is used for establishing a communication connection between the first terminal and the second terminal when the first terminal triggers a scanning operation for the graphic code, and for jumping the interface of the first terminal to the virtual handle interface. Accordingly, the second terminal may further perform the following processing: acquiring a link corresponding to the virtual handle interface; and encoding the link to obtain the graphic code. Here, when the connection guiding information includes a graphic code, the link corresponding to the virtual handle interface may be acquired first, and then encoded to obtain the graphic code, which is displayed. On this basis, the first terminal can establish a communication connection with the second terminal by scanning the graphic code displayed by the second terminal, and after the communication connection is successfully established, the virtual handle interface is displayed.
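Preparing the graphic code can be sketched in two steps: build the link corresponding to the virtual handle interface, then encode it. The URL and the `session` parameter below are hypothetical, and the actual two-dimensional code generation (which would use a QR encoder library) is elided.

```python
# Sketch of the second terminal's link preparation; names are illustrative.
from urllib.parse import urlencode

def build_handle_link(session_id, base="https://example.com/virtual-handle"):
    # Link the first terminal opens after scanning, carrying a session
    # identifier so it can be paired with this second terminal.
    return f"{base}?{urlencode({'session': session_id})}"

link = build_handle_link("abc123")
assert link == "https://example.com/virtual-handle?session=abc123"
# The link would then be encoded into the graphic code for display,
# e.g. with a QR library (elided here).
```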
As an example, referring to fig. 10, fig. 10 is a schematic display diagram of connection guiding information provided in an embodiment of the present application. Here, in response to a selection operation for the virtual scene "virtual scene 1", a display instruction for the virtual scene is triggered, as shown in (1) in fig. 10; at this time, since it is determined that the second terminal has not established a communication connection with the first terminal, the second terminal displays connection guiding information including a graphic code (such as a two-dimensional code) and corresponding hint information "scan two-dimensional code to connect", as shown in (2) of fig. 10.
In some embodiments, the second terminal may display the live broadcast picture of the virtual scene by: displaying a live broadcast picture of the virtual scene including at least one of the following information: a scene picture of the virtual scene, comment information published for the live broadcast of the virtual scene, and the live content of the anchor object corresponding to the live broadcast of the virtual scene. As an example, referring to fig. 11, fig. 11 is a schematic display diagram of a live broadcast picture of a virtual scene provided by an embodiment of the present application. Here, the live broadcast picture of the virtual scene includes: the scene picture of the virtual scene, comment information published for the live broadcast of the virtual scene, and the live content of the anchor object.
In some embodiments, the second terminal may display the live broadcast of a target live room by: displaying a live room entry control; receiving, based on the live room entry control, a live room entry instruction for a target live room, wherein the target live room is used for live broadcasting a target virtual scene; and displaying, in response to the live room entry instruction, a live broadcast picture of the target virtual scene. Here, the user may also watch the live broadcast of a virtual scene through the live room entry control provided by the second terminal. When a live room entry instruction for the target live room is received based on the entry control, the live broadcast picture of the target virtual scene is displayed in response to the instruction. Therefore, the user can not only live broadcast but also watch live broadcasts of other virtual scenes at the second terminal, improving user experience.
In some embodiments, the second terminal may interact in the target live room by: displaying a live interaction control for the target live room; and executing, in response to a live interaction instruction triggered based on the live interaction control, the live interaction operation indicated by the live interaction instruction for the target live room. Here, the live interaction control may include at least one of: a comment control (for posting comment information for the target live room), a hide-comment control (for hiding and showing comment information of the target live room), a resource giving control (for sending virtual resources, such as virtual gifts, virtual gold coins, etc., to the anchor of the target live room), a praising control (for liking the target live room), a sharing control (for sharing the target live room with other users), and the like.
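Dispatching these interaction controls to their operations can be sketched as a simple table of handlers on the viewer side. The control names and room state below are illustrative, not the actual client code.

```python
# Sketch of dispatching live interaction controls to operations.
def make_dispatcher():
    room = {"comments": [], "comments_hidden": False, "likes": 0, "gifts": []}
    handlers = {
        "comment": lambda payload: room["comments"].append(payload),
        "hide_comments": lambda payload: room.update(comments_hidden=True),
        "praise": lambda payload: room.update(likes=room["likes"] + 1),
        "gift": lambda payload: room["gifts"].append(payload),
    }

    def dispatch(control, payload=None):
        # Execute the live interaction operation indicated by the control.
        handlers[control](payload)
        return room

    return dispatch

dispatch = make_dispatcher()
dispatch("comment", "great game!")
dispatch("praise")
state = dispatch("gift", "virtual gold coin")
assert state["comments"] == ["great game!"] and state["likes"] == 1
```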
As an example, referring to fig. 12, fig. 12 is a schematic view of a target live room according to an embodiment of the present application. Here, the second terminal displays a live room entry control "live room", as shown in (1) in fig. 12; in response to a triggering operation for the live room entry control "live room", at least one live room for selection is displayed, the at least one live room including the target live room, as shown in fig. 12 (2); in response to a selection operation for the target live room, a live room entry instruction for the target live room is received, and a live broadcast picture of the target virtual scene broadcast in the target live room is displayed in response to the instruction, as shown in (3) in fig. 12. Here, the live broadcast picture of the target virtual scene includes a scene picture of the target virtual scene, comment information posted for the target live room, and the live content of the anchor object of the target live room; meanwhile, the comment control, the hide-comment control, the resource giving control, and the like are displayed.
By applying the embodiment of the present application, the live content received by the server is sent after the live broadcast for the virtual scene is started according to the live instruction triggered by the first terminal through the live control. The first terminal, serving as a virtual handle, can not only control the virtual object in the virtual scene displayed by the second terminal, but also control starting the live broadcast for the virtual scene. Thus, 1) the functions of the device serving as a virtual handle are enriched, improving the utilization rate of device resources; 2) the live broadcast for the virtual scene is started with one tap of the live control, which is a simple operation that improves the efficiency of starting the live broadcast of the virtual scene and further improves the utilization rate of device resources.
An exemplary application of the embodiment of the present application in an actual application scenario will be described below by taking a virtual scenario as an example of a cloud game. In the related art, a first terminal (for example, a smart phone) is used as a virtual handle to control a game object of a cloud game of a second terminal (for example, a smart television), but the first terminal can only be used as the virtual handle in the experience process of the cloud game.
Based on this, an embodiment of the present application provides a live broadcasting method of a virtual scene, so as to at least solve the above-mentioned problems. Next, taking the first terminal being a smart phone and the second terminal being a smart television as an example, the live broadcasting method of the virtual scene provided by the embodiment of the present application is described. In the embodiment of the present application, a live control is provided on the mobile phone terminal, so that a user can control starting the live broadcast for the cloud game through the live control, which shortens the user's operation path for live broadcasting, improves the user's willingness to live broadcast, and increases both the probability that the user live broadcasts the cloud game using the television terminal and the retention of the cloud game. In practical applications, the user can scan, through the mobile phone terminal, a graphic code (such as a two-dimensional code) of the virtual handle applet displayed by the television terminal, enter the virtual handle interface of the virtual handle applet, click the live control on the virtual handle interface, and control starting the live broadcast for the cloud game; meanwhile, the mobile phone terminal can also start the camera. At this time, the television terminal enters the cloud game and displays the live broadcast picture of the cloud game; meanwhile, comment information can be displayed so as to conduct live interaction with the live audience.
Wherein, the live broadcasting procedure of the anchor may include:
1) As shown in fig. 10 (1), at the television terminal, the anchor may trigger a selection operation for the cloud game through a remote control device (such as a remote controller of the television terminal), and enter the cloud game.
2) As shown in fig. 10 (2), the system detects that the television terminal is not connected to an external device (e.g., a mobile phone terminal supporting a virtual handle, or a handle device), and displays connection guiding information to guide the user to connect to the television terminal through the mobile phone terminal.
3) As shown in fig. 5 (2), the anchor scans the two-dimensional code of the virtual handle applet displayed by the television terminal through the mobile phone terminal.
4) As shown in fig. 5 (3), after scanning of the two-dimensional code is completed, the mobile phone terminal runs the virtual handle applet to display a virtual handle interface, and displays a live broadcast control "one-key live broadcast start" in the virtual handle interface.
5) As shown in fig. 4A and fig. 4B, after the triggering operation for the live control is triggered at the mobile phone terminal and the live broadcast is started, the virtual handle interface switches from displaying the live control "one-key on live" to displaying the live close control "off live"; meanwhile, the camera of the mobile phone terminal is started.
6) As shown in fig. 11, the television terminal displays live broadcast pictures of the cloud game, including game pictures of the cloud game, comment information, and live broadcast content of the host.
The process of watching live broadcast by the audience comprises the following steps:
1) As shown in fig. 12 (1), a live room entry control "live room" is displayed at the television terminal, and a user can trigger a trigger operation for the live room entry control through a remote controller of the television terminal.
2) As shown in fig. 12 (2), the television terminal displays at least one live room for selection in response to a trigger operation for the live room entry control "live room"; the user can trigger the selection operation aiming at the target living broadcast room through the remote controller of the television terminal.
3) As shown in fig. 12 (3), the television terminal enters the target live room in response to a selection operation for the target live room; in the target live room, the user can perform interactive operations such as commenting, hiding comments, giving virtual resources, and the like.
Next, a method for live broadcasting of a virtual scene provided by an embodiment of the present application will be described in detail, referring to fig. 13, where the method for live broadcasting of a virtual scene provided by the embodiment of the present application includes:
(1) The user selects the specified cloud game at the television terminal through the remote controller and triggers a start instruction for the cloud game, so as to start the cloud game at the cloud game client of the television terminal.
(2) Before a cloud game client of a television terminal starts a cloud game, detecting whether the television terminal is in communication connection with a mobile phone terminal or not; if not, executing (3), if yes, executing (4).
(3) The cloud game client of the television terminal acquires, from the cloud game server, a two-dimensional code picture encoding the link of the virtual handle applet; the two-dimensional code is displayed on the television terminal, the user scans it, and the mobile phone terminal is connected with the cloud game client of the television terminal.
(4) The mobile phone terminal displays a virtual handle picture of the virtual handle applet, and displays a live control in the virtual handle interface, and at the moment, the television terminal enters a game picture of the cloud game;
(5) The mobile phone terminal determines whether the user clicks the live control; if so, (6) is executed, and if not, (10) is executed.
(6) The virtual handle applet of the mobile phone terminal applies for opening the camera authority to the user.
(7) After user consent, the virtual handle applet notifies the cloud game server of the live event.
(8) The virtual handle applet transmits the live video stream data acquired in real time by the camera to the cloud game server through a socket connection.
(9) After receiving the live event, the cloud game server renders a game picture of the cloud game, receives a live video stream of a user from the virtual handle applet, splices the video stream of the game picture with the live video stream so as to splice live data of the user on the game picture, and sends the spliced live stream of the cloud game to the television terminal.
(10) The virtual handle applet of the mobile phone terminal informs the cloud game server that live broadcast is not needed, and the cloud game server directly renders the game picture and pushes the video stream of the game picture to the television terminal.
(11) After receiving the video stream data from the cloud game server, the television terminal directly decodes, renders and displays without performing other operations.
(12) If the user wants to terminate the live broadcast, the user clicks the live close control displayed on the virtual handle interface; the virtual handle applet notifies the cloud game server of the live close event, closes the camera, and stops pushing the live video stream to the cloud game server, and the cloud game server continues to render the game picture without splicing the video streams.
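Step (8) above pushes camera frames to the cloud game server over a socket connection. A minimal sketch of such streaming with length-prefixed framing is shown below; `socketpair()` stands in for the real link, and the 4-byte framing format is an illustrative assumption, not the actual protocol.

```python
# Sketch of length-prefixed frame streaming over a socket connection.
import socket
import struct

def send_frame(sock, frame: bytes):
    # Prefix each frame with its 4-byte big-endian length so the receiver
    # can split the continuous byte stream back into frames.
    sock.sendall(struct.pack(">I", len(frame)) + frame)

def recv_frame(sock) -> bytes:
    header = sock.recv(4, socket.MSG_WAITALL)
    (length,) = struct.unpack(">I", header)
    return sock.recv(length, socket.MSG_WAITALL)

# socketpair() plays the roles of the applet side and the server side.
applet_side, server_side = socket.socketpair()
send_frame(applet_side, b"camera-frame-1")
assert recv_frame(server_side) == b"camera-frame-1"
applet_side.close()
server_side.close()
```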
When watching the live broadcast of a cloud game, the server generates a live room link, and the cloud game client of the television terminal displays the live room link. After other users click into the live room, the cloud game server pushes the live video data of the user to these users simultaneously; the other users can select interactive operations such as commenting or giving virtual resources (such as gifts), the cloud game server pushes events such as comments to the clients of all viewers, and the clients of the viewers render and display the comments, virtual resources (such as gifts), and the like.
In practical applications, the second terminal may be a non-television terminal, such as a mobile cloud game terminal or a tablet computer terminal, which is not limited herein.
By applying the embodiment of the application, a one-key live broadcast function is provided on the terminal supporting the virtual handle, so that a user can directly play the cloud game on the television terminal and live broadcast his or her own picture. This shortens the user's operation path for starting a live broadcast, which can greatly improve the user's willingness to live broadcast and prolong the user's active time on the cloud game client of the television terminal. Meanwhile, the live broadcast picture can be opened to all users, improving the user's popularity in the game area and enabling the propagation and popularization of the cloud game through the relation chain among users.
The following continues to describe an exemplary structure of the live broadcasting device 555 for a virtual scene, implemented as software modules, provided by an embodiment of the present application. The live broadcasting device for a virtual scene provided by the embodiment of the application is applied to the first terminal. In some embodiments, as shown in Fig. 2, the software modules of the live broadcasting device 555 for a virtual scene stored in the memory 550 may include: a display module 5551, configured to display a virtual handle interface, where the virtual handle interface is used to control a virtual object in a virtual scene displayed by the second terminal and includes a live control; and a first sending module 5552, configured to send, in response to a trigger operation for the live control, a live instruction for the virtual scene, where the live instruction is used to instruct starting a live broadcast for the virtual scene.
In some embodiments, the first sending module 5552 is further configured to send, after sending the live instruction for the virtual scene, live content of the virtual scene once the live broadcast for the virtual scene has been started based on the live instruction; the live content is used for synthesis with the scene picture of the virtual scene to obtain the live picture of the virtual scene.
In some embodiments, the display module 5551 is further configured to receive, before displaying the virtual handle interface, a scan operation for the graphic code displayed by the second terminal, and to establish a communication connection between the first terminal and the second terminal in response to the scan operation; the display module 5551 is further configured to display the virtual handle interface when the communication connection between the first terminal and the second terminal is established successfully.
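The scan-to-connect step can be sketched as follows, assuming (purely for illustration) that the graphic code displayed by the second terminal encodes a `host:port` address that the first terminal connects to after scanning; the function name and payload format are hypothetical, not specified by the application.

```python
import socket


def connect_from_graphic_code(code_payload: str, timeout: float = 5.0):
    """Parse a scanned payload like 'host:port' and open a socket to it.

    Assumes the graphic code simply encodes the second terminal's address;
    a real implementation might instead carry a session token resolved
    through a server.
    """
    host, _, port = code_payload.rpartition(":")
    return socket.create_connection((host, int(port)), timeout=timeout)
```

Once the connection succeeds, the display module would proceed to show the virtual handle interface.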
In some embodiments, the display module 5551 is further configured to run a virtual handle applet in response to the scanning operation, and establish a communication connection between the first terminal and the second terminal through the virtual handle applet.
In some embodiments, the display module 5551 is further configured to display, before displaying the virtual handle interface, a device connection control for the second terminal, and to establish a communication connection between the first terminal and the second terminal in response to a trigger operation for the device connection control; the display module 5551 is further configured to display the virtual handle interface when the communication connection between the first terminal and the second terminal is established successfully.
In some embodiments, the first sending module 5552 is further configured to display an open prompt control for the live capture device in response to a trigger operation for the live control, where the live capture device includes at least one of an image capture device and a sound capture device, and the open prompt control is used to confirm whether to open the live capture device; and, in response to a confirm-open instruction triggered based on the open prompt control, to open the live capture device and send the live instruction for the virtual scene.
In some embodiments, the display module 5551 is further configured to display, in the virtual handle interface, a device open control for the live capture device, where the live capture device includes at least one of an image capture device and a sound capture device; and, in response to a trigger operation for the device open control, to open the live capture device.
In some embodiments, the first sending module 5552 is further configured to obtain live content captured by the live capture device and send the live content to a server, where the server is configured to send the live content to a third terminal of a live viewing object so that the third terminal outputs the live content.
In some embodiments, the display module 5551 is further configured to display, in the virtual handle interface, a device close control for the live capture device; the first sending module 5552 is further configured to close the live capture device in response to a trigger operation for the device close control.
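The capture-device open/close controls and the forwarding of captured live content can be sketched together as below. The class name `VirtualHandleApplet` and the callback used in place of a real socket send are illustrative assumptions, not the application's actual module structure.

```python
class VirtualHandleApplet:
    """Illustrative client-side handling of the capture-device controls."""

    def __init__(self, send_to_server):
        # send_to_server: assumed delivery callback, e.g. a socket write.
        self.send_to_server = send_to_server
        self.device_open = False

    def on_open_control(self, user_confirmed: bool):
        # The open prompt control asks the user to confirm before capturing.
        if user_confirmed:
            self.device_open = True

    def on_close_control(self):
        # The device close control stops capture immediately.
        self.device_open = False

    def on_captured(self, content: bytes):
        # Captured live content is forwarded only while the device is open.
        if self.device_open:
            self.send_to_server(content)
```

Gating the forward on `device_open` mirrors the described behavior: nothing is pushed to the server before the user confirms, and pushing stops as soon as the close control is triggered.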
In some embodiments, the display module 5551 is further configured to display, in the virtual handle interface, a live close control for the virtual scene; the first sending module 5552 is further configured to send, in response to a trigger operation for the live close control, a live closing instruction for the virtual scene, where the live closing instruction is used to instruct closing the live broadcast for the virtual scene.
In some embodiments, the display module 5551 is further configured to display, in the virtual handle interface, a sharing control for the live broadcast of the virtual scene, and to receive, based on the sharing control, a live sharing instruction for a target object; the first sending module 5552 is further configured to send, in response to the live sharing instruction, live invitation information to a server, so that the server sends the live invitation information to a fourth terminal of the target object, where the live invitation information is used to invite the target object to watch the live broadcast of the virtual scene.
In some embodiments, the display module 5551 is further configured to display, in the virtual handle interface, a live information publishing control for the virtual scene, and to receive, based on the information publishing control, target information input for the live broadcast of the virtual scene; the first sending module 5552 is further configured to send the target information to a server in response to a publishing instruction for the target information, where the server is configured to send the target information to a third terminal of a live viewing object so that the third terminal displays the target information.
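The server-side relay of sharing invitations and published target information can be sketched as follows. The class `LiveMessagingServer`, the inbox dictionary, and the message shapes are hypothetical names introduced for illustration only.

```python
class LiveMessagingServer:
    """Sketch of server-side relay for invitations and published messages."""

    def __init__(self):
        self.inbox = {}  # terminal id -> list of delivered messages

    def deliver(self, terminal_id: str, message: dict):
        self.inbox.setdefault(terminal_id, []).append(message)

    def share_live(self, target_id: str, room_link: str):
        # Live invitation relayed to the fourth terminal of the target object.
        self.deliver(target_id, {"type": "invitation", "room": room_link})

    def publish_info(self, viewer_ids, text: str):
        # Target information fanned out to the third terminals of the viewers.
        for viewer_id in viewer_ids:
            self.deliver(viewer_id, {"type": "info", "text": text})
```

In this sketch the server treats both features as simple message relays; the receiving terminals are responsible for displaying the invitation or target information.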
By applying the embodiment of the application, the first terminal displays a virtual handle interface that is used to control a virtual object in the virtual scene displayed by the second terminal and that includes a live control; the live control can trigger a live instruction for the virtual scene, and the live instruction is used to instruct starting a live broadcast for the virtual scene. Here, the first terminal, acting as a virtual handle, can not only control the virtual object in the virtual scene displayed by the second terminal but also start a live broadcast for the virtual scene. Thus, 1) the functions of the device serving as the virtual handle are enriched, improving the utilization rate of device resources; and 2) the live broadcast for the virtual scene is started with one key through the live control, which is simple to operate, improves the operation efficiency of starting the live broadcast of the virtual scene, and further improves the utilization rate of device resources.
The embodiment of the application also provides a live broadcasting device for a virtual scene, applied to the server, which includes: a receiving module, configured to receive live content of the virtual scene, where the virtual scene is displayed on the second terminal, a virtual object in the virtual scene is controlled through a virtual handle interface displayed on the first terminal, the virtual handle interface includes a live control, and the live content is sent after the live broadcast for the virtual scene is started according to a live instruction triggered by the first terminal through the live control; and a synthesis module, configured to synthesize the live content with the scene picture of the virtual scene to obtain a live picture of the virtual scene, where the live picture is sent to the second terminal so that the second terminal displays the live picture.
By applying the embodiment of the application, the live content received by the server is sent after the live broadcast for the virtual scene is started according to the live instruction triggered by the first terminal through the live control. The first terminal, acting as a virtual handle, can control the virtual object in the virtual scene displayed by the second terminal and can also start a live broadcast for the virtual scene. Thus, 1) the functions of the device serving as the virtual handle are enriched, improving the utilization rate of device resources; and 2) the live broadcast for the virtual scene is started with one key through the live control, which is simple to operate, improves the operation efficiency of starting the live broadcast of the virtual scene, and further improves the utilization rate of device resources.
Embodiments of the present application also provide a computer program product comprising computer-executable instructions or a computer program stored in a computer-readable storage medium. The processor of the electronic device reads the computer executable instructions or the computer program from the computer readable storage medium, and the processor executes the computer executable instructions or the computer program, so that the electronic device executes the live broadcast method of the virtual scene provided by the embodiment of the application.
The embodiment of the application also provides a computer readable storage medium, wherein computer executable instructions or a computer program are stored in the computer readable storage medium, and when the computer executable instructions or the computer program are executed by a processor, the processor is caused to execute the live broadcast method of the virtual scene provided by the embodiment of the application.
In some embodiments, the computer-readable storage medium may be a RAM, a ROM, a flash memory, a magnetic surface memory, an optical disc, or a CD-ROM, or may be any of various devices including one of, or any combination of, the above memories.
In some embodiments, computer-executable instructions may be written in any form of programming language, including compiled or interpreted languages, or declarative or procedural languages, in the form of programs, software modules, scripts, or code, and they may be deployed in any form, including as stand-alone programs or as modules, components, subroutines, or other units suitable for use in a computing environment.
As an example, computer-executable instructions may, but need not, correspond to files in a file system, may be stored in a portion of a file that holds other programs or data, such as in one or more scripts in a hypertext markup language (Hyper Text Markup Language, HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
As an example, computer-executable instructions may be deployed to be executed on one electronic device or on multiple electronic devices located at one site or distributed across multiple sites and interconnected by a communication network.
The foregoing describes merely exemplary embodiments of the present application and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, and the like made within the spirit and scope of the present application are included in the protection scope of the present application.