CN112399189B - Delay output control method, device, system, equipment and medium - Google Patents
- Publication number
- CN112399189B (application CN201910763740.3A)
- Authority
- CN
- China
- Prior art keywords
- control information
- processed
- information
- video stream
- live video
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/218—Source of audio or video content, e.g. local disk arrays
- H04N21/2187—Live feed
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04J—MULTIPLEX COMMUNICATION
- H04J3/00—Time-division multiplex systems
- H04J3/02—Details
- H04J3/06—Synchronising arrangements
- H04J3/0635—Clock or time synchronisation in a network
- H04J3/0638—Clock or time synchronisation among nodes; Internode synchronisation
- H04J3/0658—Clock or time synchronisation among packet nodes
- H04J3/0661—Clock or time synchronisation among packet nodes using timestamps
- H04J3/0667—Bidirectional timestamps, e.g. NTP or PTP for compensation of clock drift and for compensation of propagation delays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/231—Content storage operation, e.g. caching movies for short term storage, replicating data over plural servers, prioritizing data for deletion
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
Abstract
The application discloses a delay output control method, apparatus, system, device and medium. The method comprises the following steps: acquiring log information of a live video stream to be processed from a data server, and generating time sequence control information according to the log information, wherein the time sequence control information comprises a time axis and a time sequence event list; processing the time sequence control information according to an input operation instruction to obtain target control information; and sending the target control information to the data server, wherein the target control information is used for editing the live video stream to be processed and controlling its output when the stream is broadcast. In this way, the delayed video output can be edited promptly and flexibly as needed during the live broadcast, content can be added to the live picture flexibly, broadcast accidents can be reduced, and broadcast errors can be corrected in time.
Description
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, a system, a device, and a medium for controlling delay output.
Background
With the development of the internet and the television industry, live broadcasting is used more and more widely in television program production. Live broadcasts can encounter various sudden problems, such as real-time data errors, operation errors by broadcast control personnel, errors in the graphics or packaging information of the video, and errors in the shot pictures. Delayed broadcasting is therefore used in this link, as the safest and simplest solution.
Currently, delay is mainly implemented by adding a delay device to the live channel: recording starts before the live program is broadcast, the content is stored in the delay device, the pre-stored content is played when the broadcast start time is reached, and the delay device keeps recording the live program, so that a time difference is created between capture and broadcast. However, the delay device is expensive and requires recording in advance, which increases cost. Moreover, the content of the live video generally has to be completely recorded, the recorded video is then transferred over the network to the storage of the video editing department, and the video undergoes further packaging and editing in editing software, such as adding mosaics or special effects, so timeliness is poor.
Disclosure of Invention
The application provides a delay output control method, apparatus, system, device and medium, which can control the delayed output of a live video stream and edit the delayed live video promptly and flexibly as needed during the live broadcast, so as to reduce broadcast accidents and correct broadcast errors in time.
In a first aspect, a method for controlling a delay output is provided, including:
acquiring log information of a live video stream to be processed from a data server, and generating time sequence control information according to the log information, wherein the time sequence control information comprises a time axis and a time sequence event list;
processing the time sequence control information according to an input operation instruction to obtain target control information;
and sending the target control information to the data server, wherein the target control information is used for editing the live video stream to be processed and controlling the live video stream to be processed to be output when the stream is broadcast.
In a second aspect, another delay output control method is provided, which is applied to a data server, and includes:
acquiring log information of a live video stream to be processed, and sending the log information of the live video stream to be processed to service equipment, wherein the log information is used for the service equipment to generate target control information;
receiving and forwarding target control information from the service equipment to a rendering server, wherein the target control information is used for indicating the rendering server to edit the to-be-processed live video stream;
and controlling, according to the target control information, the output for broadcast of the live video processed by the rendering server.
In a third aspect, there is provided a delay output control apparatus comprising:
the system comprises an acquisition module, a processing module and a processing module, wherein the acquisition module is used for acquiring log information of a to-be-processed live video stream from a data server and generating time sequence control information according to the log information, and the time sequence control information comprises a time axis and a time sequence event list;
the control module is used for processing the time sequence control information according to an input operation instruction to obtain target control information;
and the transmission module is used for sending the target control information to the data server, and the target control information is used for editing the live video stream to be processed and controlling the live video stream to be processed to be output when the stream is broadcast.
In a fourth aspect, an embodiment of the present application provides another delay output control apparatus, including:
the acquisition module is used for acquiring log information of a live video stream to be processed;
the transmission module is used for sending log information of a live video stream to be processed to service equipment, wherein the log information is used for the service equipment to generate target control information;
the transmission module is further configured to receive and forward target control information from the service device to a rendering server, where the target control information is used to instruct the rendering server to edit the live video stream to be processed;
and the control module is used for controlling, according to the target control information, the output for broadcast of the live video processed by the rendering server.
In a fifth aspect, a delay output control system is provided, which includes a rendering server, the service device according to the third aspect, and the data server according to the fourth aspect, where the rendering server is configured to edit the live video stream to be processed according to the target control information, and obtain a processed live video stream.
In a sixth aspect, an embodiment of the present application provides a service device, including an input device and an output device, further including:
a processor adapted to implement one or more instructions; and
a computer storage medium storing one or more instructions adapted to be loaded by the processor and to perform the method of delayed output control according to the first aspect.
In a seventh aspect, embodiments of the present application provide a computer storage medium, where one or more instructions are stored, and the one or more instructions are adapted to be loaded by a processor and to perform the steps of the foregoing first aspect and any possible implementation manner thereof.
According to the embodiments of the application, log information of a live video stream to be processed is acquired from a data server, and time sequence control information comprising a time axis and a time sequence event list is generated according to the log information; the time sequence control information is processed according to an input operation instruction to obtain target control information; and the target control information is used for editing the live video stream to be processed and controlling its output when the stream is broadcast. It is no longer necessary to completely record the content of the live video, transfer the recording over the network to the storage of the video editing department, and then perform further packaging and editing in editing software, such as adding mosaics or special effects. Instead, the delayed live video stream can be edited as needed during the live broadcast, so timeliness is higher, content can be added to the live picture flexibly, broadcast accidents can be reduced, and broadcast errors can be corrected in time.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the background art of the present application, the drawings required to be used in the embodiments or the background art of the present application will be described below.
Fig. 1 is a schematic flowchart of a delay output control method according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of another delay output control method according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a delay output control system according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of another delay output control system according to an embodiment of the present application;
fig. 5 is a schematic view of an operation interface of a delay output control according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a delay output control apparatus according to an embodiment of the present disclosure;
FIG. 7 is a block diagram of a delay output control system according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of a service device according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The embodiments of the present application will be described below with reference to the drawings.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating a delay output control method according to an embodiment of the present disclosure.
101. The method comprises the steps of obtaining log information of a live video stream to be processed from a data server, and generating time sequence control information according to the log information, wherein the time sequence control information comprises a time axis and a time sequence event list.
In a specific implementation, the service device may interact with the server through a deployed data processing platform. The service device may be a terminal device, including but not limited to portable devices such as a mobile phone, laptop computer, or tablet computer with a touch-sensitive surface (e.g., a touch screen display and/or a touch pad), or a server. It should also be understood that in some embodiments the device is not a portable communication device but a desktop computer with a touch-sensitive surface (e.g., a touch screen display and/or a touch pad). A server is a device that provides computing services; since it needs to respond to and process service requests, it generally has the capability to undertake and secure the service.
Specifically, the service device in the embodiments of the application may be the service device of a delay recording platform, and the delay recording platform can introduce a certain offset between the shooting time and the broadcast time, so as to implement live broadcast delay control.
The data of the live video stream can be stored in a data server, and log information generated during the generation and transmission of the live video stream can be acquired and stored by the data server in real time. When live broadcast delay processing is performed and the video needs to be edited, log information of the live video stream to be processed can be acquired from the data server. The log information can be understood as information about the live video stream to be processed on a time sequence; generally, each log records a timestamp, the relevant device name, the user, the operation behavior and other descriptions, and through the logs, operations and development personnel can learn about software and hardware information and trace errors in the configuration process and their causes.
In the service device, the time sequence control information can be generated from the log information so as to perform editing control operations on the live video stream to be processed. The time sequence control information may include a time axis and a time sequence event list.
In an alternative embodiment, time processing points of the time axis may be determined according to the log information, and a time axis corresponding to the log information is generated, where the time processing points are distributed on the time axis;
and determining a list template corresponding to the type identifier of the log information, and generating the time sequence event list based on the list template and the content of the log information.
Because each log carries a time-sequence timestamp, a corresponding time processing point can be located on the time axis according to the order of the timestamps; such a point may be called a Log point. Meanwhile, the content of the log information reflects the time sequence events at the time processing points, so an event list containing the different time sequence events can be created.
Alternatively, different event list templates may be selected as needed. The log information of different live video streams can carry different type identifiers, and the correspondence between type identifiers and list templates can be preset, so that the corresponding list template is determined according to the type identifier of the log information, and the time sequence events are written based on that template to generate the time sequence event list.
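The steps above can be sketched in Python. This is a minimal illustration under assumptions not stated in the patent: `LogEntry`, its fields, and the `LIST_TEMPLATES` mapping are hypothetical names invented here to show how a time axis and a time sequence event list could be derived from timestamped logs and a type identifier.

```python
from dataclasses import dataclass

# Hypothetical log entry: the description says each log records a
# timestamp, the relevant device name, the user, and the operation.
@dataclass
class LogEntry:
    timestamp: float   # position on the stream's time axis, in seconds
    device: str
    user: str
    operation: str
    type_id: str       # type identifier used to pick a list template

# Hypothetical list templates keyed by the log type identifier.
LIST_TEMPLATES = {
    "broadcast": ["timestamp", "device", "operation"],
    "edit":      ["timestamp", "user", "operation"],
}

def build_timing_control_info(logs):
    """Sort logs by timestamp to place time processing (Log) points on a
    time axis, and render each entry into a time sequence event list
    using the template matching its type identifier."""
    logs = sorted(logs, key=lambda e: e.timestamp)
    time_axis = [e.timestamp for e in logs]
    template = (LIST_TEMPLATES.get(logs[0].type_id, ["timestamp", "operation"])
                if logs else [])
    event_list = [{f: getattr(e, f) for f in template} for e in logs]
    return {"time_axis": time_axis, "event_list": event_list}
```

The returned dictionary plays the role of the time sequence control information: the time axis gives the orderable Log points, and the event list carries the per-point descriptions used for later editing.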
After obtaining the timing control information, step 102 may be performed.
102. And processing the time sequence control information according to the input operation instruction to obtain target control information.
The corresponding time point or time period can be accurately located and edited through the time axis and the time processing points, so as to edit the live video stream to be processed, that is, the delayed live video stream that has not yet been played.
The input operation instruction can be triggered by the user's operation on the service device and implemented by the service device through the delay recording platform, as needed. Specifically, different time processing points and the time sequence events corresponding to them can be searched, added, changed or deleted to edit parameters of the live video stream to be processed; time processing points can also be deleted in groups from the time axis and the time sequence event list, with the log information of the corresponding time period deleted along with the corresponding signals.
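The group deletion described above can be sketched as follows; `delete_log_points` and its signature are hypothetical, showing only how points inside a time span could be removed in lockstep from the time axis and the event list:

```python
def delete_log_points(time_axis, event_list, t_start, t_end):
    """Remove every time processing point in [t_start, t_end] from the
    time axis together with its entry in the time sequence event list.
    In the described system, the log information (and the corresponding
    video segment reference) for that period would be deleted as well."""
    keep = [i for i, t in enumerate(time_axis) if not (t_start <= t <= t_end)]
    return ([time_axis[i] for i in keep],
            [event_list[i] for i in keep])
```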
In an optional implementation manner, the package information corresponding to the time processing point may be added, changed or deleted based on the timing control information, so as to generate a package editing instruction. The package editing instruction is used for the rendering server to determine package information for video package processing, and the package editing instruction can be recorded in the target control information so as to change the package information of the live video stream to be processed.
The above packaging information can be understood as the information in the video signal other than the original clean-picture video signal; various kinds of packaging information can be added to the original video signal through video packaging processing, including but not limited to: subtitles, mosaics, special effects, filters, and various marks.
In this way, video packaging processing can be performed on the video stream of a specific clip. Relying on the time axis and the time processing points, time positioning is accurate, packaging information can be conveniently added, deleted or modified, and flexible, rich video picture content can be obtained for the output of the processed live video stream.
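As a hedged sketch of the add/change/delete operations on packaging information keyed to time processing points — the instruction dictionary layout and the `apply_package_edit` helper are illustrative assumptions, not the patent's actual package editing instruction format:

```python
def apply_package_edit(packages, instruction):
    """Apply one package editing instruction of the (assumed) form
    {"op": "add"|"change"|"delete", "time_point": float, "info": {...}}
    to a map from time processing point to a list of packaging info
    (subtitles, mosaics, special effects, filters, marks). The applied
    instructions would be recorded in the target control information
    for the rendering server to act on."""
    op, t = instruction["op"], instruction["time_point"]
    if op == "add":
        packages.setdefault(t, []).append(instruction["info"])
    elif op == "change":
        packages[t] = [instruction["info"]]
    elif op == "delete":
        packages.pop(t, None)
    else:
        raise ValueError(f"unknown op: {op}")
    return packages
```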
Optionally, when the original video signal needs to be replaced, for situations such as an error in the shot picture, a prepared (pre-recorded) video of the corresponding node may be obtained for editing and replacement.
And sending the target control information to the data server, wherein the target control information is used for editing the live video stream to be processed and controlling the live video stream to be processed to be output when the stream is broadcast.
The service device can send the target control information to the data server, and the data server can receive and store the target control information and process the live video stream to be processed through the delay recording platform. Optionally, a delay broadcast time threshold, that is, the time difference between the live video stream generated in real time and the live video stream actually broadcast, may also be set through the delay recording platform, for example 5 minutes, or 2 minutes and 30 seconds. The delay control in the embodiments of the application may rely on the Network Time Protocol (NTP), a protocol used to synchronize the clocks of the computers in a network. Delayed broadcasting is realized by storing the real-time live data and synchronizing time through NTP.
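The delay mechanism described above (a configured delay broadcast time threshold plus NTP-synchronized clocks) can be illustrated with a small buffer sketch; the function name, buffer layout and the 150-second threshold are assumptions made for the example, not details from the patent:

```python
from collections import deque

DELAY_THRESHOLD_S = 150  # e.g. "2 minutes and 30 seconds"

def frames_ready_for_output(buffer, now):
    """Pop and return all buffered (capture_ts, frame) pairs whose
    delayed broadcast time has been reached: a frame captured at t is
    released once the NTP-synchronized clock `now` reaches
    t + DELAY_THRESHOLD_S. `buffer` is assumed ordered by capture time."""
    out = []
    while buffer and buffer[0][0] + DELAY_THRESHOLD_S <= now:
        out.append(buffer.popleft())
    return out
```

With both the ingest side and the output side reading the same NTP-disciplined clock, the released stream trails the real-time stream by exactly the configured threshold.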
The live video stream to be processed can be edited according to the target control information, and its output is controlled when the stream is broadcast. The specific editing operations can be performed by a switcher and/or rendering server controlled by the data server, or within the data server itself; this is not limited here. Finally, a delayed live video stream processed by video packaging is obtained, whose delay duration is consistent with the configured delay broadcast time threshold.
In an alternative embodiment, the data server and/or the rendering server may be an Elastic Compute Service (ECS) instance performing the corresponding functions. ECS is a simple, efficient, safe, reliable and elastic computing service whose processing capability can be managed more simply and efficiently than a physical server; a user can rapidly create or release any number of cloud servers without purchasing hardware in advance.
Optionally, the live video stream may be a program live signal or an original video signal, where the program live signal is a video signal that has undergone video packaging processing.
In digital television production and broadcast control systems, the broadcast switcher carries two buses: the Program bus (PGM) carries the program video signal, i.e., the program live signal mentioned above, while the original video signal (which may be called the clean signal) is the video signal of the captured clean pictures.
The switcher in the embodiments of the application is a device used in multi-camera studio or outdoor production; it connects selected videos through cuts, overlays and wipes, and creates and embeds other special effects to complete program production. Its main function is to facilitate timely editing by selecting various video materials and connecting them in turn through transition techniques.
The signal control processing based on the delay recording platform and the data server, implemented on the service device, can perform delayed output control of the live video stream in combination with different software. Compared with traditional broadcast control, it adapts quickly to new software and is fully compatible with the rendering and packaging instruction formats. It can interact with a third-party broadcast control platform through a control plug-in, so real-time data can be accessed and processed more quickly, more time is left for scene production and development, and it can flexibly suit the broadcast forms of different programs.
According to the embodiments described above, log information of a live video stream to be processed is acquired from a data server, time sequence control information comprising a time axis and a time sequence event list is generated according to the log information, the time sequence control information is processed according to an input operation instruction to obtain target control information, and the target control information is sent to the data server, where it is used for editing the live video stream to be processed and controlling its output when the stream is broadcast. The delayed output of the live video stream can thus be controlled without completely recording the video, transferring the recording over the network to the storage of the video editing department, and performing further packaging and editing in editing software, such as adding mosaics or special effects.
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating another delay output control method according to an embodiment of the present disclosure. The embodiment shown in fig. 2 is further optimized based on the embodiment shown in fig. 1, and the method may include:
The data server sends log information of the live video stream to be processed to the service device.
The data of the live video stream can be stored in a data server, and log information generated during the generation and transmission of the live video stream can be acquired and stored by the data server in real time. Specifically, the data server may execute step 201 in response to an acquisition instruction from the service device. When live broadcast delay processing is performed and the video needs to be edited, the service device can acquire log information of the live video stream to be processed from the data server; as noted above, the log information records timestamps, device names, users, operation behaviors and other descriptions on a time sequence.
202. The service device acquires log information of the to-be-processed live video stream from the data server and generates timing control information according to the log information, wherein the timing control information comprises a time axis and a timing event list.
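As an illustrative sketch of step 202, the derivation of a time axis and timing-event list from log information could proceed as below; the names (`LogEntry`, `build_timing_control`) and the chosen log fields are assumptions for illustration, not part of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class LogEntry:
    timestamp: float  # seconds from stream start (assumed unit)
    device: str       # related device name
    user: str         # operating user
    action: str       # operation behavior / description

@dataclass
class TimingControl:
    timeline: list = field(default_factory=list)  # ordered time processing points
    events: list = field(default_factory=list)    # timing event list

def build_timing_control(log_entries):
    """Derive a time axis and a timing-event list from raw log information."""
    ctrl = TimingControl()
    for entry in sorted(log_entries, key=lambda e: e.timestamp):
        ctrl.timeline.append(entry.timestamp)          # Log point on the time axis
        ctrl.events.append((entry.timestamp, entry.action))
    return ctrl
```

Each log entry contributes one time processing point (Log point) on the axis and one row in the event list.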
In an optional implementation, processing the timing control information according to the input operation instruction to obtain the target control information includes:
searching, adding, changing, or deleting the timing events corresponding to time processing points according to an input first operation instruction for the time processing points, and determining the delayed broadcast duration threshold; and
adding, changing, or deleting the packaging information corresponding to a time processing point according to an input second operation instruction for the time processing point to generate a packaging editing instruction, thereby obtaining the target control information.
A user accesses the delayed recording platform through the service device and can trigger the first operation instruction by operation, so as to edit the to-be-processed live video stream based on time points. For example:
Searching Log points: positions on the time axis and in the event list can be located quickly, specifically by keyword; for example, the content of the event list is searched according to an input keyword to determine the corresponding Log point.
Deleting Log points: Log points are deleted in groups from the time axis and the event list, and the log information of the corresponding time period is deleted together with that segment of the to-be-processed live video stream.
Modifying Log point information: a modification page is added to the system, in which the user can enter data for editing.
Adding Log points: an executable file is embedded in the system, and the added Log points and timing events are received through window value transfer.
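The search, delete, and add operations on Log points listed above can be sketched as simple event-list manipulations; the function names and the `(timestamp, description)` tuple representation are hypothetical conveniences, not the disclosed data format.

```python
def find_log_points(events, keyword):
    """Locate Log points whose event description matches a keyword."""
    return [t for t, desc in events if keyword in desc]

def delete_range(events, start, end):
    """Remove Log points (and their events) inside a time period,
    mirroring the grouped deletion from the time axis and event list."""
    return [(t, d) for t, d in events if not (start <= t <= end)]

def add_log_point(events, timestamp, description):
    """Insert a new Log point, keeping the event list time-ordered."""
    events.append((timestamp, description))
    events.sort(key=lambda e: e[0])
    return events
```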
Optionally, the user accesses the delayed recording platform through the service device and may trigger the second operation instruction by operation, so as to edit the video packaging of the to-be-processed live video stream based on time points. For example, a packaging information template and content to be added may be selected and added to the database of the data server, and the delayed recording platform may be notified to refresh the time axis, update the log information of the event list, and so on, which is not limited in the embodiment of the present application.
In an optional implementation, the data server may store a plurality of packaging information templates, which can be obtained and used directly, or whose specific packaging content can be added to, deleted, or modified. The service device may further obtain a packaging information template from the data server, process the to-be-processed live video stream with the (possibly pre-edited) template to obtain preview video data, and play the preview video data, thereby implementing a preview function. Specifically, an executable file is embedded in the delayed recording platform and Log point information is received through window value transfer, so that the corresponding packaging template can be played and previewed.
The window value transfer involved in the embodiment of the present application is a message-passing mechanism of Windows software, which passes values between two forms. In VB programming, a Form object is a window or dialog box that forms part of an application's user interface.
203. The service device processes the timing control information according to the input operation instruction to obtain the target control information.
204. The service device transmits the target control information to the data server.
The above steps 202 to 204 may refer to the detailed descriptions in steps 101 to 103 of the embodiment shown in fig. 1, and are not described herein again.
205. The data server receives the target control information from the service device and forwards it to the rendering server.
The target control information is used for instructing the rendering server to edit the to-be-processed live video stream.
A user can edit and modify the content of the rendering server on the service device through the broadcast control platform; all modifications and edits are recorded in the data server, which performs the forwarding-control function and supplies modifications according to the correct content and information.
206. And the rendering server edits the live video stream to be processed according to the target control information to obtain a processed live video stream.
Video processing inevitably involves complex special effects, which, given current computing capability, are difficult to display in real time during editing and playback. Therefore, after editing is finished, the required final effect is produced by rendering, yielding the live video stream that is actually broadcast.
The rendering server can be used to edit the packaging information of the to-be-processed live video stream according to the packaging editing instruction to obtain the processed live video stream; specifically, the packaging information is added, deleted, or modified according to the content of the packaging editing instruction.
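The add/delete/modify handling of packaging information under a packaging editing instruction might, under assumed data shapes (a dict keyed by time processing point and an instruction dict), be sketched as follows; none of these names appear in the disclosure.

```python
def apply_package_edit(packages, instruction):
    """Apply one packaging editing instruction to the packaging state.

    packages:    {time_point: package_info} (assumed representation)
    instruction: {"op": "add"|"modify"|"delete",
                  "time_point": float, "content": ...}
    """
    op = instruction["op"]
    t = instruction["time_point"]
    if op in ("add", "modify"):
        packages[t] = instruction["content"]   # add or overwrite packaging info
    elif op == "delete":
        packages.pop(t, None)                  # remove packaging at this Log point
    return packages
```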
207. The data server controls the output of the live video stream processed by the rendering server according to the delayed broadcast duration threshold in the target control information.
Specifically, the data server may check the live video stream processed by the rendering server against the target control information, confirm that it meets the requirements, and control its delayed output as configured in the target control information. Because video processing generally runs ahead of the actual broadcast time, the data server can, after rendering is finished, broadcast the stream according to the preset delayed broadcast duration threshold. The data server can send an output instruction to the rendering server according to the delayed broadcast duration threshold or an output instruction from the service device, so that the rendered video is output. The output instruction from the service device may contain output time information selected by the user; the video modified or replaced during the delay can be selected from excerpted videos, and the splice time between that video and the actually broadcast video can be determined from the output time information. Both the output time information and the splice time can be selected accurately based on the time axis and the time processing points.
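Output controlled by a delayed broadcast duration threshold can be sketched as a buffer that only releases material once the threshold has elapsed since capture; the frame representation and function name below are assumptions for illustration.

```python
from collections import deque

def release_frames(buffer, now, delay_threshold):
    """Pop all buffered frames whose delay threshold has elapsed.

    buffer holds (capture_time, frame) pairs in capture order; a frame
    captured at time t may be broadcast once now - t >= delay_threshold.
    """
    out = []
    while buffer and now - buffer[0][0] >= delay_threshold:
        out.append(buffer.popleft()[1])
    return out
```

Frames still inside the delay window remain buffered, which is the interval during which the editing described above can take place.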
Optionally, the live video stream processed by the rendering server may be transmitted to the switching station, and the final live video stream may be output through the switching station. Through the network, other terminal equipment can receive the live broadcast signal to watch live broadcast, and the live broadcast is displayed as an edited live broadcast picture in a display interface of the terminal equipment.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a delay output control system according to an embodiment of the present disclosure. Live video data may be collected in real time by a camera 31 or other devices 32 and processed by a switching station. The content of the rendering server 33 may be edited and modified through a broadcast control platform a, with all modifications and edits recorded in a data server B and supplied according to the correct content and information; the switching station finally realizes unified output after rendering (3).
In a digital television production and broadcast control system, two buses are arranged on the broadcast switching console: the Program bus (PGM), which carries the program video signal, i.e. the signal that is finally broadcast; and the Preview bus (PVW), which carries the program preview signal.
Specifically, the program bus has a plurality of buttons, each representing a different video input; these buttons route the selected signal onto the PGM line bus. On many switching consoles, an additional BLACK button at the head of the PGM bus is used to darken the screen.
If the switching console is required to perform functions such as superimposition, two buses (usually the program bus and the preview bus) and a push rod (fader) are required; the push rod controls the speed of the superimposition. If the push rod is pushed to the bottom, the picture on the program bus enters while the picture on the preview bus is drawn out; if the push rod is stopped in the middle, the superimposition is held, so that the two materials are superimposed.
The preview bus has exactly the same number, type, and arrangement of buttons as the PGM bus and works similarly, except that its output is the PVW output, typically connected to a preview monitor (hereinafter, the pre-monitor) for viewing signals about to be switched on air. For example, if the director presses the camera-1 button on the preview bus, the picture of camera 1 appears on the pre-monitor without affecting the PGM output; if the director dislikes that picture and wants to switch to camera 2, only the camera-2 button on the preview bus needs to be pressed.
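The push-rod superimposition between the program and preview buses can be modeled, very roughly, as a linear mix controlled by the fader position; this is a simplification of real switcher behavior, and the fader convention (0 keeps the outgoing picture, 1 completes the transition) is assumed.

```python
def mix_output(outgoing_pixel, incoming_pixel, fader):
    """Linear superimposition of two bus pictures.

    fader in [0, 1]: 0 keeps the outgoing picture, 1 completes the
    transition to the incoming picture; an intermediate position
    holds the superimposed state, as when the push rod is stopped
    in the middle.
    """
    return (1.0 - fader) * outgoing_pixel + fader * incoming_pixel
```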
The bold lines in fig. 3 indicate the processing flow of the live video stream, while the non-bold lines indicate the interaction of control information. Under the control of the switching console, the PGM picture can be previewed in real time on the delayed recording platform, and both the Clean signal 1 and the PGM signal 2 undergo delay processing. The aforementioned Log point editing also takes place in this module, and the PGM signal 2 can be edited (the data interaction and processing of a, b, and c in the figure), such as the video packaging processing described above, which is not repeated here.
Fig. 4 is a schematic structural diagram of another delay output control system according to an embodiment of the present application. In an alternative embodiment, some packaging information of a live program can be determined in advance to be free of errors, i.e. it requires no modification. Specifically, referring to fig. 4, on the basis of fig. 3, the broadcast control platform a can, based on the data server, arrange for that part of the packaging information plus the Clean signal not to undergo editing processing; the combined signal can itself be regarded as a Clean signal, that is, fixed packaging content that does not need updating, and it can be handled without the delay system, i.e. the flow direction shown as d-e-4-3 in fig. 4.
The broadcast control platform and the delayed recording platform in the embodiment of the present application can be accessed from the service device, and the data processing steps of the embodiment are carried out with the help of a data processor, forming a delay output control system that controls the delayed output of a live video stream. The delayed video output can therefore be edited promptly and flexibly as needed during a live broadcast, specifically by editing at each time point or time period. There is no need to wait until the live content has been completely recorded, transmit the recording over a network to the storage of a video editing department, and apply further packaging edits in editing software, such as adding mosaics or broadcast special effects. The scheme offers strong real-time performance, flexibly adds content to the live picture, reduces broadcasting accidents, and corrects errors in time. Meanwhile, video content that no longer needs changing (such as content already played) can be deleted or backed up to a cloud database, saving local storage space and helping keep video processing smooth.
Further, referring to fig. 5, fig. 5 is a schematic view of an operation interface of the delay output control according to the embodiment of the present application.
As shown in fig. 5, the operation interface may be an operation interface on the service device. Area 50 is a video detail display window, which can clearly display the live picture of the to-be-processed live video stream. Area 51 is an editing settings area for editing the to-be-processed live video stream, in which area 511 holds the specific modification items and area 512 holds the specific modification content after an item is selected, supporting add, delete, and change operations. Area 52 is a template selection window, which can display different packaging information templates for video packaging processing for the user to choose from. Area 53 is a rundown window containing the multiple rundown entries used during live playout, i.e. segmented videos that can be linked in a fixed order. The segmentation of the to-be-processed live video stream may be based on the time processing points (Log points) of the foregoing embodiment; that is, the Log point information is previewed on the time axis, packaging content can be added, deleted, or modified, PGM signal content can be modified, and a rundown list can be generated automatically for broadcast, which is not repeated here.
Specifically, the dashed box 531 indicates one selected rundown entry; an entry in the selected state can be edited. That is, the video of the entry can be displayed and previewed, the packaging template it uses is selected through area 52, and the specific editing operations are performed in area 51; the edited video is then generated, rendered, and output. Problem segments can thus be edited in real time during the live broadcast, flexibly adapting to various live scenarios. In a live broadcast there is a delay between the shooting time and the broadcast time, so when a real-time data error occurs, that piece of packaging information can be deleted or the packaging content of the video can be modified manually; if the shot picture itself is wrong, a substitute picture can be selected or a standby broadcast signal switched in, providing a complete and flexible solution for live broadcasting.
Based on the description of the foregoing embodiments and an interface similar to fig. 5, the following can specifically be performed, without limitation: detecting the engine connection state, receiving and sending instructions, previewing, reading and importing in-scene data, reading and replacing materials in the database, switching logic control, controlling plug-in data to bridge different systems or platforms, hierarchical control, and calling the background to start and stop the rendering module (with server support), so as to realize comprehensive live broadcast management.
In an alternative implementation, a single processor that has the function of the rendering module may be used instead of the separate data server and rendering server to implement a similar processing method, which is not limited in this embodiment. For a live scenario with a large data volume, however, a dedicated data processor handles the storage and processing of large amounts of data, such as caching and managing video streams and system logs, while the rendering server mainly provides rendering of the video streams; the division of labor is then clear, improving the data processing efficiency and stability of the system.
In one possible implementation, the system can provide further support for augmented reality applications on top of the existing live video packaging. The Augmented Reality (AR) technology involved in the embodiment of the present application fuses virtual information with the real world: applying technical means such as multimedia, three-dimensional modeling, real-time tracking and registration, intelligent interaction, and sensing, computer-generated virtual information such as text, images, three-dimensional models, music, and video is simulated and then applied to the real world, the two kinds of information complementing each other and thereby enhancing the real world.
Specifically, similar to the processing method of the foregoing embodiments, for AR driven by tracking data, after all camera motion information has been recorded, erroneous displacement data can be corrected, solving problems such as position errors and picture errors that occur when the AR is displayed in a live broadcast.
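The correction of recorded camera displacement data can be sketched as applying an offset over the faulty interval of the motion track; the `(timestamp, (x, y, z))` track representation, the interval bounds, and a constant offset are all assumptions made for illustration.

```python
def correct_camera_track(track, t_error_start, t_error_end, offset):
    """Correct recorded camera motion for an AR overlay.

    track: list of (timestamp, (x, y, z)) camera positions.
    Samples inside the faulty interval get a displacement offset
    applied, fixing position errors observed in the live AR display.
    """
    fixed = []
    for t, (x, y, z) in track:
        if t_error_start <= t <= t_error_end:
            x, y, z = x + offset[0], y + offset[1], z + offset[2]
        fixed.append((t, (x, y, z)))
    return fixed
```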
The embodiment of the present application further provides another delay output control method, which is applied to a data server, and includes:
acquiring log information of a live video stream to be processed, and sending the log information of the live video stream to be processed to service equipment, wherein the log information is used for the service equipment to generate target control information;
receiving and forwarding target control information from the service equipment to a rendering server, wherein the target control information is used for instructing the rendering server to edit the to-be-processed live video stream;
and controlling the output of the live video stream processed by the rendering server according to the target control information.
The method executed by the data server has been described in the foregoing embodiments, and is not described herein again.
For video packaging in live broadcasting, existing schemes serve a fixed type of program (for example, live broadcasts of sports events such as basketball and football, or variety shows): the live video stream is packaged through preset logic, flexibility is low, and the packaging content can only be set in advance. The system of the present application can support the corresponding data processing in plug-in form, is not limited to a certain type of live broadcast, can capture data in real time, customizes broadcast control logic according to the program format, and flexibly adapts to various live scenarios.
Based on the description of the embodiment of the delay output control method, the embodiment of the application also discloses a delay output control device, which can correspond to the service equipment.
Referring to fig. 6, the delay output control device 600 includes:
an obtaining module 610, configured to obtain log information of a live video stream to be processed from a data server, and generate timing control information according to the log information, where the timing control information includes a time axis and a timing event list;
a control module 620, configured to process the timing control information according to an input operation instruction, and obtain target control information;
a transmission module 630, configured to send the target control information to the data server, where the target control information is used to edit the to-be-processed live video stream and to control its output during broadcast.
In a possible implementation, the target control information includes a packaging editing instruction and a delayed broadcast duration threshold, where the packaging editing instruction is used by the rendering server to determine the packaging information for video packaging processing;
the control module 620 is specifically configured to:
search, add, change, or delete the timing events corresponding to time processing points according to an input first operation instruction for the time processing points, and determine the delayed broadcast duration threshold; and
add, change, or delete the packaging information corresponding to a time processing point according to an input second operation instruction for the time processing point to generate the packaging editing instruction, thereby obtaining the target control information.
In a possible implementation manner, the obtaining module 610 is specifically configured to:
determining time processing points of the time axis according to the log information, and generating the time axis corresponding to the log information, wherein the time processing points are distributed on the time axis;
and determining a list template corresponding to the type identifier of the log information, and generating the time sequence event list based on the list template and the content of the log information.
In a possible implementation manner, the system further includes a preview module 640, configured to obtain a package information template from the data server, and process the to-be-processed live video stream by using the package information template to obtain preview video data; and playing the preview video data.
Optionally, the live video stream is a live program signal or an original video signal, where the program signal is a video signal that has undergone video packaging processing.
According to an embodiment of the present application, the steps involved in the methods shown in fig. 1 and fig. 2 may be performed by the respective modules in the delay output control apparatus 600 shown in fig. 6.
Based on the description of the foregoing embodiment of the delay output control method, an embodiment of the present application further discloses a delay output control apparatus, which can correspond to the foregoing data server, and includes:
the acquisition module is used for acquiring log information of a live video stream to be processed;
the transmission module is used for sending log information of a live video stream to be processed to service equipment, wherein the log information is used for the service equipment to generate target control information;
the transmission module is further configured to receive and forward target control information from the service device to a rendering server, where the target control information is used to instruct the rendering server to edit the to-be-processed live video stream;
and a control module, configured to control the output of the live video stream processed by the rendering server according to the target control information.
The delay output control device may perform any steps that may be performed by the data server as described in the foregoing embodiments, and details are not described here.
According to another embodiment of the present application, the modules in the delay output control apparatus 600 shown in fig. 6 may be combined, individually or entirely, into one or several other modules, or one or more of the modules may be further split into multiple functionally smaller modules; either way the same operations can be implemented without affecting the technical effect of the embodiment of the present application. The modules are divided based on logical function; in practical applications, the function of one module may be realized by several modules, or the functions of several modules by one. In other embodiments of the present application, the delay output control apparatus may also include other modules, and in practical applications these functions may likewise be realized with the assistance of other modules and through the cooperation of multiple modules.
According to another embodiment of the present application, the delayed output control apparatus 600 as shown in fig. 6 may be constructed by running a computer program (including program codes) capable of executing the steps involved in the respective methods as shown in fig. 1 and/or fig. 2 on a general-purpose computing device such as a computer including a processing element such as a Central Processing Unit (CPU), a random access storage medium (RAM), a read only storage medium (ROM), and a storage element, and implementing the delayed output control method of the embodiment of the present application. The computer program may be recorded on a computer-readable recording medium, for example, and loaded into and executed by the computing apparatus via the computer-readable recording medium.
The delay output control device 600 of the embodiment of the present application may acquire log information of a to-be-processed live video stream from a data server and generate timing control information according to the log information, where the timing control information includes a time axis and a timing event list; process the timing control information according to an input operation instruction to obtain target control information; and send the target control information to the data server, where it is used to edit the to-be-processed live video stream and to control its output during broadcast. The delayed video output can thus be edited flexibly as needed during the live process, content can be added to the live picture flexibly, broadcasting accidents are reduced, and broadcast errors are corrected in time.
Based on the above description of the embodiments of the delay output control method, please refer to fig. 7, and fig. 7 is a schematic diagram of a framework of a delay output control system according to an embodiment of the present application, in which the delay output control system 700 includes:
a rendering server 710, a service device 720 as described above in the foregoing embodiment, and a data server 730 as described above in the foregoing embodiment;
the rendering server 710 may be configured to edit the to-be-processed live video stream according to the target control information, and obtain a processed live video stream. The system may further include other terminal devices, such as a camera device for acquiring a live video signal, a switching station linked with a rendering server, a conversion device (for converting a video signal and a digital signal), and the like, which is not limited in this embodiment of the present application.
The functions and execution methods of the above devices have been described in the foregoing embodiments, and are not described again here.
Based on the description of the method embodiment and the device embodiment, the embodiment of the application also provides a service device. Referring to fig. 8, the service device 800 includes at least a processor 801, an input device 802, an output device 803, and a computer storage medium 804. The processor 801, the input device 802, the output device 803, and the computer storage medium 804 within the terminal may be connected by a bus or other means.
A computer storage medium 804 may be stored in the memory of the terminal; the computer storage medium 804 is configured to store a computer program comprising program instructions, and the processor 801 is configured to execute the program instructions stored in the computer storage medium 804. The processor 801 (or CPU) is the computing and control core of the terminal, adapted to implement one or more instructions and, specifically, to load and execute the one or more instructions so as to realize the corresponding method flow or function. In one embodiment, the processor 801 described above may be configured to perform a series of processes, including any steps of the method performed by the service device in the foregoing embodiments.
An embodiment of the present application further provides a computer storage medium (Memory), where the computer storage medium is a Memory device in a terminal and is used to store programs and data. It is understood that the computer storage medium herein may include a built-in storage medium in the terminal, and may also include an extended storage medium supported by the terminal. The computer storage medium provides a storage space that stores an operating system of the terminal. Also stored in this memory space are one or more instructions, which may be one or more computer programs (including program code), suitable for loading and execution by processor 801. The computer storage medium may be a high-speed RAM memory, or may be a non-volatile memory (non-volatile memory), such as at least one disk memory; and optionally at least one computer storage medium located remotely from the processor.
In one embodiment, one or more instructions stored in a computer storage medium may be loaded and executed by the processor 801 to implement the respective steps of the method in the above-described embodiments; in particular implementations, one or more instructions in a computer storage medium may be loaded and executed by processor 801 to perform any of the steps of fig. 1 and 2.
The service device 800 of the embodiment of the present application may obtain log information of a to-be-processed live video stream from a data server and generate timing control information according to the log information, where the timing control information includes a time axis and a timing event list; process the timing control information according to an input operation instruction to obtain target control information; and send the target control information to the data server, where it is used to edit the to-be-processed live video stream and to control its output during broadcast. The delayed video output can thus be edited flexibly as needed during the live process, content can be added to the live picture flexibly, broadcasting accidents are reduced, and broadcast errors are corrected in time.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and modules may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the division of the module is only one logical division, and other divisions may be possible in actual implementation, for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not performed. The shown or discussed mutual coupling, direct coupling or communication connection may be an indirect coupling or communication connection of devices or modules through some interfaces, and may be in an electrical, mechanical or other form.
Modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When software is used, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of this application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in or transmitted via a computer-readable storage medium, for example from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic medium (e.g., a floppy disk, hard disk, magnetic tape, or magnetic disk), an optical medium (e.g., a digital versatile disc (DVD)), or a semiconductor medium (e.g., a solid-state drive (SSD)).
Claims (9)
1. A delay output control method, applied to a service device, characterized by comprising:
acquiring log information of a to-be-processed live video stream from a data server, and generating timing control information according to the log information, wherein the timing control information comprises a time axis and a timing event list;
acquiring a packaging information template from the data server, and processing the to-be-processed live video stream with the packaging information template to obtain preview video data;
playing the preview video data;
processing the timing control information according to an input operation instruction to obtain target control information;
and sending the target control information to the data server, wherein the target control information is used for editing the to-be-processed live video stream and for controlling output of the to-be-processed live video stream when the stream is cast.
2. The method of claim 1, wherein the target control information comprises a packaging editing instruction and a delayed play-out duration threshold, the packaging editing instruction being used by a rendering server to determine packaging information for video packaging processing;
wherein processing the timing control information according to the input operation instruction to obtain the target control information comprises:
searching for, adding, changing, or deleting a time processing point on the time axis and a timing event corresponding to the time processing point according to an input first operation instruction for the time processing point, and determining the delayed play-out duration threshold;
and adding, changing, or deleting packaging information corresponding to the time processing point according to an input second operation instruction for the time processing point on the time axis to generate the packaging editing instruction, thereby obtaining the target control information.
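A minimal sketch of how the two kinds of operation instructions in claim 2 might act on the timing control information. The dict-based structures and function names below are illustrative assumptions, not the patent's data model:

```python
def apply_first_instruction(timing, op, time_point, event=None):
    """Search for, add, change, or delete the timing event at a
    time processing point (claim 2's first operation instruction)."""
    events = timing.setdefault("events", {})
    if op == "search":
        return events.get(time_point)
    if op in ("add", "change"):
        events[time_point] = event
    elif op == "delete":
        events.pop(time_point, None)
    return events

def apply_second_instruction(packaging, time_point, package_info):
    """Attach packaging information to a time processing point,
    yielding one packaging editing instruction for the rendering
    server (claim 2's second operation instruction)."""
    packaging[time_point] = package_info
    return {"time_point": time_point, "package": package_info}
```

Keeping the two instruction types separate mirrors the claim's split between editing the event timeline and editing the on-screen packaging.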
3. The method of claim 2, wherein generating the timing control information according to the log information, the timing control information including the time axis and the timing event list, comprises:
determining time processing points of the time axis according to the log information, and generating the time axis corresponding to the log information, wherein the time processing points are distributed on the time axis;
and determining a list template corresponding to a type identifier of the log information, and generating the timing event list based on the list template and the content of the log information.
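The timeline-and-template generation of claim 3 could be sketched as follows. The log types, template strings, and log-entry fields ("minute", "event") are hypothetical, assumed only for illustration:

```python
# Hypothetical list templates keyed by the log's type identifier.
LIST_TEMPLATES = {
    "football": "{minute}' {event}",
    "esports": "[{minute}m] {event}",
}

def generate_timeline(log_entries):
    """Time processing points, deduplicated and distributed
    in order along the time axis."""
    return sorted({e["minute"] for e in log_entries})

def generate_event_list(log_type, log_entries):
    """Fill the template matching the log's type identifier
    with the content of each log entry."""
    template = LIST_TEMPLATES[log_type]
    return [template.format(**e) for e in log_entries]
```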
4. A delay output control method, applied to a data server, characterized by comprising:
acquiring log information of a live video stream to be processed, and sending the log information of the live video stream to be processed to service equipment, wherein the log information is used for the service equipment to generate target control information;
sending a package information template to the service device, wherein the package information template is used by the service device for processing the live video stream to be processed to obtain and play preview video data;
receiving target control information from the service device and forwarding it to a rendering server, wherein the target control information is used to instruct the rendering server to edit the to-be-processed live video stream;
and controlling the live video stream processed by the rendering server to be output according to the target control information when the stream is cast.
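One way to picture the data server's relay-and-gate role in claim 4: it forwards the target control information to the rendering server and then releases only frames older than the delayed play-out duration threshold, which is what leaves time for edits and corrections before play-out. The class names, the frame field "ts", and the control-info keys below are illustrative assumptions:

```python
class RenderingServer:
    """Stand-in for the rendering server: records the packaging
    edits it is instructed to apply."""
    def __init__(self):
        self.edits = []

    def apply(self, target_control):
        self.edits.extend(target_control.get("package_edits", []))

class DataServer:
    def __init__(self):
        self.target_control = None

    def forward_control(self, target_control, rendering_server):
        # Receive target control information from the service device
        # and forward it to the rendering server.
        self.target_control = target_control
        rendering_server.apply(target_control)

    def frames_to_output(self, processed_frames, now):
        # Gate play-out: only frames at least `delay_threshold`
        # seconds old are released, keeping an editing window open.
        delay = self.target_control["delay_threshold"]
        return [f for f in processed_frames if now - f["ts"] >= delay]
```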
5. A delay output control apparatus, comprising:
an acquisition module configured to acquire log information of a to-be-processed live video stream from a data server and generate timing control information according to the log information, wherein the timing control information comprises a time axis and a timing event list;
a preview module configured to acquire a packaging information template from the data server, process the to-be-processed live video stream with the packaging information template to obtain preview video data, and play the preview video data;
a control module configured to process the timing control information according to an input operation instruction to obtain target control information;
and a transmission module configured to send the target control information to the data server, the target control information being used for editing the to-be-processed live video stream and for controlling output of the to-be-processed live video stream when the stream is cast.
6. A delay output control apparatus, comprising:
an acquisition module configured to acquire log information of a to-be-processed live video stream;
a transmission module configured to send the log information of the to-be-processed live video stream to a service device, wherein the log information is used by the service device to generate target control information;
the transmission module is further configured to: sending a package information template to the service device, wherein the package information template is used by the service device for processing the live video stream to be processed to obtain and play preview video data;
the transmission module is further configured to receive and forward target control information from the service device to a rendering server, where the target control information is used to instruct the rendering server to edit the to-be-processed live video stream;
and a control module configured to control the live video stream processed by the rendering server to be output according to the target control information.
7. A delay output control system, comprising a rendering server, a service device, and a data server, wherein the service device is the delay output control apparatus according to claim 5 and the data server is the delay output control apparatus according to claim 6, and wherein the rendering server is configured to edit the to-be-processed live video stream according to the target control information to obtain a processed live video stream.
8. A service device comprising an input device and an output device, characterized by further comprising:
a processor adapted to implement one or more instructions; and
a computer storage medium having stored thereon one or more instructions adapted to be loaded by the processor to execute the delay output control method according to any one of claims 1 to 4.
9. A computer-readable storage medium having stored thereon one or more instructions adapted to be loaded by a processor to execute the delay output control method according to any one of claims 1 to 4.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201910763740.3A CN112399189B (en) | 2019-08-19 | 2019-08-19 | Delay output control method, device, system, equipment and medium |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN112399189A CN112399189A (en) | 2021-02-23 |
| CN112399189B true CN112399189B (en) | 2022-05-17 |
Family
ID=74603440
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201910763740.3A Active CN112399189B (en) | 2019-08-19 | 2019-08-19 | Delay output control method, device, system, equipment and medium |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN112399189B (en) |
Families Citing this family (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113099130A (en) * | 2021-04-15 | 2021-07-09 | 北京字节跳动网络技术有限公司 | Collaborative video processing method and device, electronic equipment and storage medium |
| CN113347458A (en) * | 2021-06-04 | 2021-09-03 | 广州博冠信息科技有限公司 | Live broadcast method, live broadcast device, live broadcast system, storage medium and electronic equipment |
| CN113660505B (en) * | 2021-08-31 | 2024-03-22 | 天津泰讯视动科技有限责任公司 | Direct broadcasting method, controller and system based on ultrahigh-definition direct broadcasting and direct broadcasting integrated machine |
| CN113905270B (en) * | 2021-11-03 | 2024-04-09 | 广州博冠信息科技有限公司 | Program broadcasting control method and device, readable storage medium and electronic equipment |
| CN114363644B (en) * | 2021-12-15 | 2022-09-06 | 广州波视信息科技股份有限公司 | Time delay method, time delay device and storage medium |
| CN116112702B (en) * | 2023-01-17 | 2025-08-05 | 北京达佳互联信息技术有限公司 | Live broadcast method, device, electronic device and storage medium |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2016018787A1 (en) * | 2014-07-31 | 2016-02-04 | Dolby Laboratories Licensing Corporation | Audio processing systems and methods |
| EP3110158A1 (en) * | 2015-06-22 | 2016-12-28 | AD Insertion Platform Sarl | Method and platform for automatic selection of video sequences to fill a break in a broadcasted program |
| WO2017166499A1 (en) * | 2016-03-30 | 2017-10-05 | 乐视控股(北京)有限公司 | Live broadcast delay method and device |
| WO2018187318A1 (en) * | 2017-04-04 | 2018-10-11 | Qualcomm Incorporated | Segment types as delimiters and addressable resource identifiers |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7698728B2 (en) * | 2003-11-12 | 2010-04-13 | Home Box Office, Inc. | Automated playlist chaser |
| CN116612731A (en) * | 2016-07-22 | 2023-08-18 | 杜比实验室特许公司 | Network-based processing and distribution of multimedia content for live musical performances |
| US11082734B2 (en) * | 2018-12-21 | 2021-08-03 | Turner Broadcasting System, Inc. | Publishing a disparate live media output stream that complies with distribution format regulations |
2019-08-19: application CN201910763740.3A filed in CN; granted as patent CN112399189B (status: Active)
Non-Patent Citations (1)
| Title |
|---|
| 何伟 (He Wei), "电视直播过程中延时技术的应用策略分析" ["Analysis of application strategies for delay technology in live television broadcasting"], 《西部广播电视》 [West China Broadcasting TV], 2018-01-25, full text * |
Also Published As
| Publication number | Publication date |
|---|---|
| CN112399189A (en) | 2021-02-23 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN112399189B (en) | Delay output control method, device, system, equipment and medium | |
| CN112291627B (en) | Video editing method and device, mobile terminal and storage medium | |
| US11943512B2 (en) | Content structure aware multimedia streaming service for movies, TV shows and multimedia contents | |
| US8515241B2 (en) | Real-time video editing | |
| CA2943975C (en) | Method for associating media files with additional content | |
| JP4278189B2 (en) | Digital multimedia editing and data management system | |
| EP2439650A2 (en) | Method and system for providing distributed editing and storage of digital media over a network | |
| CN108632676A (en) | Display methods, device, storage medium and the electronic device of image | |
| US20240107087A1 (en) | Server, terminal and non-transitory computer-readable medium | |
| CN112333536A (en) | Audio and video editing method, equipment and computer readable storage medium | |
| US20210264686A1 (en) | Method implemented by computer for the creation of contents comprising synthesis images | |
| US20070106680A1 (en) | Digital media asset management system and method for supporting multiple users | |
| CN113365093B (en) | Live broadcast method, device, system, electronic equipment and storage medium | |
| US12212883B2 (en) | Information processing devices, methods, and computer-readable medium for performing information processing to output video content using video from mutiple video sources | |
| US20250227325A1 (en) | Server, method and computer program | |
| CN112383790A (en) | Live broadcast screen recording method and device, electronic equipment and storage medium | |
| US20030214605A1 (en) | Autokeying method, system, and computer program product | |
| JP2004126637A (en) | Contents creation system and contents creation method | |
| KR20030062315A (en) | Method, system, and program for creating, recording and distributing digital stream contents | |
| WO2006001238A1 (en) | Archive management device, archive management system, and archive management program | |
| JP4129162B2 (en) | Content creation demonstration system and content creation demonstration method | |
| CN104837061A (en) | Method and device for modifying and managing video playlist | |
| CN103618913A (en) | Method and device for playing 3D film source in intelligent television | |
| US7675827B2 (en) | Information processing apparatus, information processing method, and program | |
| KR20060035033A (en) | Customized video production system and method using video sample |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |