CN110177294A - Player audio and video synchronization method and system, storage medium and terminal
- Publication number
- CN110177294A (application number CN201910501166.4A)
- Authority
- CN
- China
- Prior art keywords
- video
- audio
- data
- player
- duration
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
- H04N21/4307—Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/433—Content storage operation, e.g. storage operation in response to a pause request, caching operations
- H04N21/4331—Caching operations, e.g. of an advertisement for later insertion during playback
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/845—Structuring of content, e.g. decomposing content into time segments
- H04N21/8456—Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/854—Content authoring
- H04N21/8547—Content authoring involving timestamps for synchronizing content
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Security & Cryptography (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
The present invention provides a player audio and video synchronization method and system, a storage medium and a terminal. The method comprises the following steps: when the video data in the player is ahead of the audio data in time, the video data is stored in a cache until it is synchronized with the audio data, at which point video is constructed from the video data; when the video data in the player lags behind the audio data in time, video is constructed from the video data immediately, and the audio data is played at once when the lag duration is not greater than a first threshold, played after waiting a first duration when the lag duration is greater than the first threshold but not greater than a second threshold, and played after waiting a second duration when the lag duration is greater than the second threshold. The player audio and video synchronization method and system, storage medium and terminal of the present invention take audio as the reference and synchronize video to audio, so that audio and video are played back in sync and playback remains within an acceptable range under a variety of conditions.
Description
Technical field
The present invention relates to the technical field of media players, and more particularly to a player audio and video synchronization method and system, a storage medium and a terminal.
Background art
In an audio/video stream, the frame rate (Frame Rate) of the video indicates the number of frames (pictures) displayed per second, and the sample rate (Sample Rate) of the audio indicates the number of samples (Sample) played per second. From these values, the play time of a given audio sample and the play time of a given video frame can be obtained with a simple calculation. Audio-video synchronization means that every frame the player is rendering is strictly matched to the piece of sound being played, with no deviation that the human ear or eye can distinguish. Under ideal conditions, the audio stream and the video stream stay synchronized without any deviation.
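The relationship between frame or sample index and play time can be written out directly. The following sketch shows the simple calculation mentioned above; the frame rate, sample rate and index values are illustrative assumptions, not taken from the patent:

```python
def video_frame_time(frame_index: int, frame_rate: float) -> float:
    """Presentation time in seconds of a video frame, given the frame rate (frames per second)."""
    return frame_index / frame_rate


def audio_sample_time(sample_index: int, sample_rate: float) -> float:
    """Presentation time in seconds of an audio sample, given the sample rate (samples per second)."""
    return sample_index / sample_rate


# Example: the 250th frame of 25 fps video and the 441000th sample of 44.1 kHz audio
# both fall on the 10-second mark.
print(video_frame_time(250, 25.0))         # 10.0
print(audio_sample_time(441000, 44100.0))  # 10.0
```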
In practice, however, audio and video data are captured at the acquisition end, transmitted over the network, and finally played back at the playback end. During this process, factors such as the encoding/decoding scheme, the packetization scheme, the network transport protocol, the data handling at the terminal, network delay and jitter, and terminal hardware performance all introduce deviations between the audio and video data, causing audio and video playback to fall out of sync. In the prior art, as shown in Fig. 1, an audio/video stream arriving at the playback end is demultiplexed and decoded to obtain the original audio and video data, which are then used for audio playback and video rendering. During this process, the following situations can cause audio and video to fall out of sync:
(1) At the start of playback, when video frames are hardware-decoded, the decoder may buffer some of them, so video frame output lags behind the audio frames and audio and video fall out of sync.
(2) After a network stall empties the audio and video buffers and the network then recovers, the decoder buffers video frames again, so video frame output lags behind the audio frames and audio and video fall out of sync.
(3) When the hardware decoding capability of the playback end is insufficient, hardware video decoding cannot keep up with audio decoding, and audio and video fall out of sync.
A reference quantity is therefore needed so that the playback speeds of both video and audio can be adjusted at any time against that reference, guaranteeing that audio and video stay synchronized. In the prior art, audio-video synchronization is mainly achieved in three ways:
(1) Synchronizing video to audio: the video is synchronized against the audio playback speed. If the video is slower than the audio, video playback is accelerated; if it is faster, video playback is delayed.
(2) Synchronizing audio to video: the audio is synchronized against the video playback speed.
(3) Synchronizing both video and audio to an external clock: an external clock is selected, and the playback speeds of both video and audio follow that clock.
Keeping audio and video synchronized during playback is in fact a dynamic process: being in sync is temporary, and drifting out of sync is the normal state. In the prior art, the audio-video timestamp deviation is used to describe how well an audio/video stream is synchronized. According to ITU-R BT.1359-1, when the audio-video timestamp deviation lies between -125 ms (audio lags video by 125 ms) and +45 ms (audio leads video by 45 ms), the deviation is imperceptible to viewers; this range can be regarded as the synchronized region. When the deviation falls outside -185 ms to +90 ms, the timestamp deviation between audio and video reaches an unacceptable level; this range can be regarded as the out-of-sync region.
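The ITU-R BT.1359-1 ranges quoted above can be read as a simple classifier over the timestamp deviation. The sketch below uses the -125 ms/+45 ms and -185 ms/+90 ms bounds from the text; the label given to the region between the two ranges is an assumption added only for illustration:

```python
def sync_region(deviation_ms: float) -> str:
    """Classify an audio-video timestamp deviation in ms (positive = audio leads video)."""
    if -125.0 <= deviation_ms <= 45.0:
        return "synchronized"   # imperceptible to the viewer
    if -185.0 <= deviation_ms <= 90.0:
        return "in between"     # perceptible, treated here as still tolerable
    return "out of sync"        # unacceptable deviation


print(sync_region(30.0))    # synchronized
print(sync_region(-150.0))  # in between
print(sync_region(120.0))   # out of sync
```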
How to keep audio and video synchronized in a player has therefore become an urgent problem to be solved.
Summary of the invention
In view of the above shortcomings of the prior art, the purpose of the present invention is to provide a player audio and video synchronization method and system, a storage medium and a terminal that take audio as the reference and synchronize video to audio, so that audio and video are played back in sync and playback remains within an acceptable range under a variety of conditions.
To achieve the above and other related objects, the present invention provides a player audio and video synchronization method comprising the following steps: when the video data in the player is ahead of the audio data in time, the video data is stored in a cache until it is synchronized with the audio data, at which point video is constructed from the video data; when the video data in the player lags behind the audio data in time, video is constructed from the video data, and the audio data is played immediately when the lag duration is not greater than a first threshold, played after waiting a first duration when the lag duration is greater than the first threshold but not greater than a second threshold, and played after waiting a second duration when the lag duration is greater than the second threshold.
In one embodiment of the invention, timestamps are set on both the video data and the audio data, and whether the video data is ahead of or lags behind the audio data in time is determined from the video data timestamp and the audio data timestamp.
In one embodiment of the invention, the first threshold is 45ms.
In one embodiment of the invention, the second threshold is 90ms.
In one embodiment of the invention, the first duration is 1ms-5ms.
In one embodiment of the invention, the second duration is 20ms-30ms.
In one embodiment of the invention, the method is applied to a player based on the HLS protocol.
Correspondingly, the present invention provides a player audio and video synchronization system comprising an ahead synchronization module and a lag synchronization module.
The ahead synchronization module is configured to, when the video data in the player is ahead of the audio data in time, store the video data in a cache until it is synchronized with the audio data, and then construct video from the video data.
The lag synchronization module is configured to, when the video data in the player lags behind the audio data in time, construct video from the video data, and to play the audio data when the lag duration is not greater than a first threshold, play the audio data after waiting a first duration when the lag duration is greater than the first threshold but not greater than a second threshold, and play the audio data after waiting a second duration when the lag duration is greater than the second threshold.
The present invention provides a storage medium on which a computer program is stored; when the program is executed by a processor, the above player audio and video synchronization method is implemented.
The present invention provides a terminal comprising a processor and a memory.
The memory is configured to store a computer program.
The processor is configured to execute the computer program stored in the memory, so that the terminal performs the above player audio and video synchronization method.
As described above, the player audio and video synchronization method and system, storage medium and terminal of the present invention have the following beneficial effects:
(1) Taking audio as the reference and synchronizing video to audio keeps audio and video playing in sync, so that playback remains within an acceptable range under a variety of conditions.
(2) The invention can be applied to all kinds of audio and video playback scenarios to realize high-quality audio and video communication services.
Brief description of the drawings
Fig. 1 is a flowchart of how the playback end processes audio and video data in the prior art;
Fig. 2 is a flowchart of the player audio and video synchronization method of the present invention in an embodiment;
Fig. 3 is a structural schematic diagram of the player audio and video synchronization system of the present invention in an embodiment;
Fig. 4 is a structural schematic diagram of the terminal of the present invention in an embodiment.
Description of reference numerals
31 Ahead synchronization module
32 Lag synchronization module
41 Processor
42 Memory
Detailed description of the embodiments
The embodiments of the present invention are described below through specific examples, and those skilled in the art can easily understand other advantages and effects of the present invention from the content disclosed in this specification. The present invention can also be implemented or applied through other different specific embodiments, and the details in this specification can be modified or changed in various ways from different viewpoints and for different applications without departing from the spirit of the present invention.
The player audio and video synchronization method and system, storage medium and terminal of the present invention take audio as the reference and, by adjusting the audio and the video, synchronize video to audio so that audio and video are played back in sync. Playback thus remains within an acceptable range under a variety of conditions, the deviation caused by out-of-sync audio and video is avoided, and the viewing experience of the user is greatly improved.
As shown in Fig. 2, in an embodiment, the player audio and video synchronization method of the present invention comprises the following steps:
Step S1: when the video data in the player is ahead of the audio data in time, store the video data in a cache until the video data is synchronized with the audio data, and then construct video from the video data.
Specifically, after the player obtains the original video data and audio data, it first determines whether the video data is synchronized with the audio data. In one embodiment of the invention, timestamps are set on both the video data and the audio data, and whether the video data is ahead of, lagging behind, or synchronized with the audio data is determined from the video data timestamp and the audio data timestamp.
When the video data and the audio data are synchronized in time, no further processing is required: video is constructed directly from the video data and the audio data is played, so that audio and video are synchronized on the player. When the video data and the audio data differ, the audio data is taken as the reference and it is determined whether the video data is ahead of or lags behind the audio data in time. When the video data is ahead of the audio data in time, the current video progress is faster than the current audio progress and the video data needs to wait for the corresponding audio data. The video data is therefore stored in a cache, and only when the video data is synchronized with the audio data is video constructed from it, so that audio and video are synchronized on the player.
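A minimal sketch of this "video ahead" branch is given below. It assumes millisecond presentation timestamps and an audio clock supplied by the caller; the class and method names are illustrative, not taken from the patent:

```python
from collections import deque


class AheadSyncModule:
    """Holds video frames that run ahead of the audio clock (step S1)."""

    def __init__(self):
        self.cache = deque()  # (pts_ms, frame) pairs, oldest first

    def on_video_frame(self, frame, video_pts_ms, audio_clock_ms):
        """Return the frame if it may be rendered now, otherwise cache it."""
        if video_pts_ms > audio_clock_ms:
            self.cache.append((video_pts_ms, frame))  # video is ahead: wait for audio
            return None
        return frame  # already synchronized with audio: render immediately

    def poll(self, audio_clock_ms):
        """Release every cached frame the audio clock has caught up with."""
        ready = []
        while self.cache and self.cache[0][0] <= audio_clock_ms:
            ready.append(self.cache.popleft()[1])
        return ready
```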
Step S2: when the video data in the player lags behind the audio data in time, construct video from the video data, and play the audio data immediately when the lag duration is not greater than a first threshold, play the audio data after waiting a first duration when the lag duration is greater than the first threshold but not greater than a second threshold, and play the audio data after waiting a second duration when the lag duration is greater than the second threshold.
Specifically, when the video data lags behind the audio data in time, the current video progress is slower than the current audio progress. Video therefore needs to be constructed from the video data immediately so that video playback can proceed. However, constructing video immediately is not by itself enough to achieve audio-video synchronization, so the audio data also needs to be handled.
When the lag duration is not greater than a first threshold, for example 45 ms, the user does not perceive the lack of synchronization, so the audio data is played without adjustment.
When the lag duration is greater than the first threshold, for example 45 ms, but not greater than a second threshold, for example 90 ms, the lack of synchronization is perceptible but still within an acceptable range, so the audio data is played after waiting a first duration. In order to reduce the time difference between audio and video more smoothly, each adjustment waits only the first duration, closing the gap step by step. In one embodiment of the invention, the first duration is 1ms-5ms, preferably 1ms.
When the lag duration is greater than the second threshold, for example 90 ms, the lack of synchronization is already unacceptable, so the audio data is played after waiting a second duration. The second duration is greater than the first duration, and the gap is narrowed gradually until audio-video synchronization is finally achieved. In one embodiment of the invention, the second duration is 20ms-30ms, preferably 25ms.
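The lag branch can be sketched as follows. The 45 ms/90 ms thresholds and the 1 ms/25 ms waits are the preferred values given in the text, while the function names and the sleep-based waiting are illustrative assumptions about how a player might realize them:

```python
import time

FIRST_THRESHOLD_MS = 45    # lag up to this is imperceptible
SECOND_THRESHOLD_MS = 90   # lag beyond this is unacceptable
FIRST_WAIT_MS = 1          # small repeated wait, closes the gap smoothly (preferred value)
SECOND_WAIT_MS = 25        # larger wait for severe lag (preferred value)


def handle_video_lag(lag_ms, render_video, play_audio):
    """Video lags behind audio: render video at once, then gate the audio by the lag size (step S2)."""
    render_video()
    if lag_ms <= FIRST_THRESHOLD_MS:
        play_audio()                           # not noticeable: play the audio immediately
    elif lag_ms <= SECOND_THRESHOLD_MS:
        time.sleep(FIRST_WAIT_MS / 1000.0)     # wait the first duration, then play the audio
        play_audio()
    else:
        time.sleep(SECOND_WAIT_MS / 1000.0)    # wait the second duration, then play the audio
        play_audio()
```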
In one embodiment of the invention, the player audio and video synchronization method of the present invention is applied to a player based on the HLS (HTTP Live Streaming) protocol. Specifically, HLS is an HTTP-based streaming media network transport protocol proposed by Apple Inc. It works by splitting the whole stream into small HTTP-based files that are downloaded a few at a time. While a media stream is playing, the client can choose to download the same resource at different rates from a number of different alternative sources, allowing the streaming session to adapt to different data rates. An HLS stream comprises an M3U8 index file, TS media segment files, and a key file for encryption.
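For illustration only, a minimal HLS media playlist of the kind referred to above might look like the following; the segment names and durations are assumed, not taken from the patent:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
#EXT-X-ENDLIST
```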
As shown in Fig. 3, in an embodiment, the player audio and video synchronization system of the present invention includes an ahead synchronization module 31 and a lag synchronization module 32.
The ahead synchronization module 31 is configured to, when the video data in the player is ahead of the audio data in time, store the video data in a cache until the video data is synchronized with the audio data, and then construct video from the video data.
Specifically, after the player obtains the original video data and audio data, it first determines whether the video data is synchronized with the audio data. In one embodiment of the invention, timestamps are set on both the video data and the audio data, and whether the video data is ahead of, lagging behind, or synchronized with the audio data is determined from the video data timestamp and the audio data timestamp.
When the video data and the audio data are synchronized in time, no further processing is required: video is constructed directly from the video data and the audio data is played, so that audio and video are synchronized on the player. When the video data and the audio data differ, the audio data is taken as the reference and it is determined whether the video data is ahead of or lags behind the audio data in time. When the video data is ahead of the audio data in time, the current video progress is faster than the current audio progress and the video data needs to wait for the corresponding audio data. The ahead synchronization module 31 therefore stores the video data in a cache, and only when the video data is synchronized with the audio data is video constructed from it, so that audio and video are synchronized on the player.
The lag synchronization module 32 is connected to the ahead synchronization module 31 and is configured to, when the video data in the player lags behind the audio data in time, construct video from the video data, and to play the audio data when the lag duration is not greater than a first threshold, play the audio data after waiting a first duration when the lag duration is greater than the first threshold but not greater than a second threshold, and play the audio data after waiting a second duration when the lag duration is greater than the second threshold.
Specifically, when the video data lags behind the audio data in time, the current video progress is slower than the current audio progress. Video therefore needs to be constructed from the video data immediately so that video playback can proceed. However, constructing video immediately is not by itself enough to achieve audio-video synchronization, so the lag synchronization module 32 also handles the audio data.
When the lag duration is not greater than a first threshold, for example 45 ms, the user does not perceive the lack of synchronization, so the audio data is played without adjustment.
When the lag duration is greater than the first threshold, for example 45 ms, but not greater than a second threshold, for example 90 ms, the lack of synchronization is perceptible but still within an acceptable range, so the audio data is played after waiting a first duration. In order to reduce the time difference between audio and video more smoothly, each adjustment waits only the first duration, closing the gap step by step. In one embodiment of the invention, the first duration is 1ms-5ms, preferably 1ms.
When the lag duration is greater than the second threshold, for example 90 ms, the lack of synchronization is already unacceptable, so the audio data is played after waiting a second duration. The second duration is greater than the first duration, and the gap is narrowed gradually until audio-video synchronization is finally achieved. In one embodiment of the invention, the second duration is 20ms-30ms, preferably 25ms.
In one embodiment of the invention, the player audio and video synchronization system of the present invention is applied to a player based on the HLS (HTTP Live Streaming) protocol. Specifically, HLS is an HTTP-based streaming media network transport protocol proposed by Apple Inc. It works by splitting the whole stream into small HTTP-based files that are downloaded a few at a time. While a media stream is playing, the client can choose to download the same resource at different rates from a number of different alternative sources, allowing the streaming session to adapt to different data rates. An HLS stream comprises an M3U8 index file, TS media segment files, and a key file for encryption.
It should be noted that the division of the above apparatus into modules is only a division by logical function. In an actual implementation the modules may be fully or partially integrated into one physical entity, or they may be physically separate. These modules may all be implemented in software invoked by a processing element, all in hardware, or partly in software invoked by a processing element and partly in hardware. For example, the x module may be a separately established processing element, or it may be integrated into a chip of the above apparatus; it may also be stored in the memory of the above apparatus in the form of program code that is called and executed by a processing element of the above apparatus to perform the function of the x module. The other modules are implemented similarly. Furthermore, these modules may be fully or partially integrated together, or implemented independently. The processing element referred to here may be an integrated circuit with signal processing capability. During implementation, the steps of the above method or the above modules can be completed by integrated logic circuits of hardware in the processor element or by instructions in the form of software.
For example, the above modules may be configured as one or more integrated circuits that implement the above method, such as one or more application specific integrated circuits (Application Specific Integrated Circuit, ASIC), one or more digital signal processors (Digital Signal Processor, DSP), or one or more field programmable gate arrays (Field Programmable Gate Array, FPGA). As another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a central processing unit (Central Processing Unit, CPU) or another processor that can call program code. As a further example, these modules may be integrated together and implemented in the form of a system on chip (system-on-a-chip, SoC).
A computer program is stored on the storage medium of the present invention, and when the program is executed by a processor the above player audio and video synchronization method is implemented. The storage medium includes any medium that can store program code, such as a ROM, a RAM, a magnetic disk, a USB flash drive, a memory card, or an optical disc.
As shown in Fig. 4, in an embodiment, the terminal of the present invention includes a processor 41 and a memory 42.
The memory 42 is configured to store a computer program.
The memory 42 includes any medium that can store program code, such as a ROM, a RAM, a magnetic disk, a USB flash drive, a memory card, or an optical disc.
The processor 41 is connected to the memory 42 and is configured to execute the computer program stored in the memory 42, so that the terminal performs the above player audio and video synchronization method.
Preferably, the processor 41 may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), or the like; it may also be a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In conclusion player audio and video synchronization method of the invention and system, storage medium and terminal using audio as
Audio video synchronization to audio is realized being played simultaneously for audio-video by benchmark, so that the broadcasting of audio-video is in varied situations
It is able to maintain within an acceptable range;Can be applied to it is various need in audio and video playing scene, thus realize high quality sound view
Frequency communication service.So the present invention effectively overcomes various shortcoming in the prior art and has high industrial utilization value.
The above embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Anyone familiar with this technology may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes completed by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed in the present invention shall be covered by the claims of the present invention.
Claims (10)
1. A player audio and video synchronization method, characterized by comprising the following steps:
when the video data in a player is ahead of the audio data in time, storing the video data in a cache until the video data is synchronized with the audio data, and then constructing video from the video data;
when the video data in the player lags behind the audio data in time, constructing video from the video data, and playing the audio data when the lag duration is not greater than a first threshold, playing the audio data after waiting a first duration when the lag duration is greater than the first threshold but not greater than a second threshold, and playing the audio data after waiting a second duration when the lag duration is greater than the second threshold.
2. The player audio and video synchronization method according to claim 1, characterized in that timestamps are set on both the video data and the audio data, and whether the video data is ahead of or lags behind the audio data in time is determined from the video data timestamp and the audio data timestamp.
3. The player audio and video synchronization method according to claim 1, characterized in that the first threshold is 45ms.
4. The player audio and video synchronization method according to claim 1, characterized in that the second threshold is 90ms.
5. The player audio and video synchronization method according to claim 1, characterized in that the first duration is 1ms-5ms.
6. The player audio and video synchronization method according to claim 1, characterized in that the second duration is 20ms-30ms.
7. The player audio and video synchronization method according to claim 1, characterized in that the method is applied to a player based on the HLS protocol.
8. A player audio and video synchronization system, characterized by comprising an ahead synchronization module and a lag synchronization module;
the ahead synchronization module is configured to, when the video data in a player is ahead of the audio data in time, store the video data in a cache until the video data is synchronized with the audio data, and then construct video from the video data;
the lag synchronization module is configured to, when the video data in the player lags behind the audio data in time, construct video from the video data, and to play the audio data when the lag duration is not greater than a first threshold, play the audio data after waiting a first duration when the lag duration is greater than the first threshold but not greater than a second threshold, and play the audio data after waiting a second duration when the lag duration is greater than the second threshold.
9. A storage medium on which a computer program is stored, characterized in that, when the program is executed by a processor, the player audio and video synchronization method according to any one of claims 1 to 7 is implemented.
10. A terminal, characterized by comprising a processor and a memory;
the memory is configured to store a computer program;
the processor is configured to execute the computer program stored in the memory, so that the terminal performs the player audio and video synchronization method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910501166.4A CN110177294A (en) | 2019-06-11 | 2019-06-11 | Player audio and video synchronization method and system, storage medium and terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910501166.4A CN110177294A (en) | 2019-06-11 | 2019-06-11 | Player audio and video synchronization method and system, storage medium and terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110177294A true CN110177294A (en) | 2019-08-27 |
Family
ID=67697316
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910501166.4A Pending CN110177294A (en) | 2019-06-11 | 2019-06-11 | Player audio and video synchronization method and system, storage medium and terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110177294A (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030128294A1 (en) * | 2002-01-04 | 2003-07-10 | James Lundblad | Method and apparatus for synchronizing audio and video data |
CN1435996A (en) * | 2002-01-31 | 2003-08-13 | 汤姆森特许公司 | Audio/video system provided with variable delay |
CN1969561A (en) * | 2004-06-18 | 2007-05-23 | 杜比实验室特许公司 | Maintaining synchronization of streaming audio and video using internet protocol |
US20080209482A1 (en) * | 2007-02-28 | 2008-08-28 | Meek Dennis R | Methods, systems. and products for retrieving audio signals |
CN101394469A (en) * | 2008-10-29 | 2009-03-25 | 北京创毅视讯科技有限公司 | Audio and video synchronization method, device and a digital television chip |
CN107801080A (en) * | 2017-11-10 | 2018-03-13 | 普联技术有限公司 | A kind of audio and video synchronization method, device and equipment |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113938781A (en) * | 2021-08-27 | 2022-01-14 | 北京声智科技有限公司 | Headphone-based holographic projection method, device, device and storage medium |
CN114630170A (en) * | 2022-03-24 | 2022-06-14 | 北京字节跳动网络技术有限公司 | Audio and video synchronization method and device, electronic equipment and storage medium |
CN114630170B (en) * | 2022-03-24 | 2023-10-31 | 抖音视界有限公司 | Audio and video synchronization method and device, electronic equipment and storage medium |
CN115942021A (en) * | 2023-02-17 | 2023-04-07 | 央广新媒体文化传媒(北京)有限公司 | Audio and video stream synchronous playing method and device, electronic equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113225598B (en) | Method, device and equipment for synchronizing audio and video of mobile terminal and storage medium | |
CN104269182B (en) | The methods, devices and systems that a kind of audio sync is played | |
US11368731B2 (en) | Method and apparatus for segmenting data | |
US10856018B2 (en) | Clock synchronization techniques including modification of sample rate conversion | |
CN106612452B (en) | Method and device for audio and video synchronization of set-top box | |
US10887646B2 (en) | Live streaming with multiple remote commentators | |
CN110177294A (en) | Player audio and video synchronization method and system, storage medium and terminal | |
CN105656616B (en) | Method, device, transmitter and receiver for data synchronization among multiple devices | |
CN107223334A (en) | The method and apparatus for being changed to MPEG 2TS for MMTP to be circulated | |
WO2016008131A1 (en) | Techniques for separately playing audio and video data in local networks | |
CN110381350A (en) | Multichannel playing back videos synchronization system and its processing method based on webrtc | |
JP7171929B2 (en) | Audio stream and video stream synchronous switching method and apparatus | |
JP2023508945A (en) | Synchronization of wireless audio with video | |
CN106331820B (en) | Audio and video synchronization processing method and device | |
CN108882010A (en) | A kind of method and system that multi-screen plays | |
US12114332B2 (en) | Early notification for transmission of encoded video data | |
CN112770165B (en) | A Distributed Synchronization Method for Audio and Video Streams | |
CN108124183B (en) | Method for synchronously acquiring video and audio to perform one-to-many video and audio streaming | |
CN116261000A (en) | Audio and video synchronization method and device in cloud conference and electronic equipment | |
US20090154347A1 (en) | Pacing of transport stream to compensate for timestamp jitter | |
TWI600319B (en) | A method for capturing video and audio simultaneous for one-to-many video streaming | |
CN112153322A (en) | Data distribution method, device, equipment and storage medium | |
CN115278858B (en) | Low-delay audio data transmission method and device | |
US20240357289A1 (en) | Wireless Surround Sound System With Common Bitstream | |
KR102251148B1 (en) | Audio-Video Synchronization Processing Method |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190827 |