US20200029130A1 - Method and apparatus for configuring content in a broadcast system - Google Patents
Method and apparatus for configuring content in a broadcast system
- Publication number
- US20200029130A1 (application US16/588,417; US201916588417A)
- Authority
- US
- United States
- Prior art keywords
- mpu
- data
- layer
- aus
- fragmentation unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/63—Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
- H04N21/643—Communication protocols
-
- H04L65/4076—
-
- H04L65/607—
-
- H04L65/608—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/61—Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
- H04L65/611—Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for multicast or broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/65—Network streaming protocols, e.g. real-time transport protocol [RTP] or real-time control protocol [RTCP]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/70—Media network packetisation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/177—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a group of pictures [GOP]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/23605—Creation or processing of packetized elementary streams [PES]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/23608—Remultiplexing multiplex streams, e.g. involving modifying time stamps or remapping the packet identifiers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/23614—Multiplexing of additional data and video streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/236—Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
- H04N21/2362—Generation or processing of Service Information [SI]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/238—Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
- H04N21/2381—Adapting the multiplex stream to a specific network, e.g. an Internet Protocol [IP] network
-
- H—ELECTRICITY
- H03—ELECTRONIC CIRCUITRY
- H03M—CODING; DECODING; CODE CONVERSION IN GENERAL
- H03M13/00—Coding, decoding or code conversion, for error detection or error correction; Coding theory basic assumptions; Coding bounds; Error probability evaluation methods; Channel models; Simulation or testing of codes
- H03M13/35—Unequal or adaptive error protection, e.g. by providing a different level of protection according to significance of source information or by adapting the coding according to the change of transmission channel characteristics
Abstract
Description
- This application is a Continuation Application of U.S. patent application Ser. No. 13/421,375, filed on Mar. 15, 2012, and claims priority under 35 U.S.C. § 119(a) to Korean Patent Application Serial No. 10-2011-0023578, which was filed in the Korean Industrial Property Office on Mar. 16, 2011, the entire content of which is incorporated herein by reference.
- The present invention relates generally to a method and an apparatus for configuring content in a broadcast system, and more particularly, to a method and an apparatus for configuring a data unit of content in a broadcast system supporting multimedia services based on an Internet Protocol (IP).
- A conventional broadcast network generally uses the Moving Picture Experts Group-2 Transport Stream (MPEG-2 TS) for transmission of multimedia content. The MPEG-2 TS is a representative transmission technique that allows a plurality of broadcast programs (a plurality of encoded video bit streams) to be transmitted as multiplexed bit streams in a transmission environment having errors. For example, the MPEG-2 TS is appropriately used in digital TeleVision (TV) broadcasting, etc.
-
FIG. 1 illustrates a layer structure supporting a conventional MPEG-2 TS. - Referring to
FIG. 1, the conventional MPEG-2 TS layer includes a media coding layer 110, a sync (synchronization) layer 120, a delivery layer 130, a network layer 140, a data link layer 150, and a physical layer 160. The media coding layer 110 and the sync layer 120 configure media data into a format usable for recording or transmission. The delivery layer 130, the network layer 140, the data link layer 150, and the physical layer 160 configure a multimedia frame for recording or transmitting a data block having the format configured by the sync layer 120 in/to a separate recording medium. The configured multimedia frame is transmitted to a subscriber terminal, etc., through a predetermined network. - Accordingly, the
sync layer 120 includes a fragment block 122 and an access unit 124, and the delivery layer 130 includes an MPEG-2 TS/MPEG-4 (MP4) Real-time Transport Protocol (RTP) Payload Format/File delivery over unidirectional transport (FLUTE) block 132, an RTP/HyperText Transfer Protocol (HTTP) block 134, and a User Datagram Protocol (UDP)/Transmission Control Protocol (TCP) block 136. - However, the MPEG-2 TS has several limitations in supporting multimedia services. Specifically, the MPEG-2 TS suffers from inefficient transmission due to unidirectional communication and a fixed frame size, and from unnecessary overhead caused by the use of a transport protocol and IP handling specialized for audio/video data, etc.
- Accordingly, the MPEG Media Transport (MMT) standard has been proposed by MPEG in order to overcome the above-described limitations of the MPEG-2 TS.
- For example, the MMT standard may be applied for the efficient transmission of complex content through heterogeneous networks. Here, complex content refers to a set of content that combines multimedia elements, such as video and audio applications, and heterogeneous networks refer to networks in which a broadcast network and a communication network coexist.
- In addition, the MMT standard attempts to define a transmission technique that is friendlier to IP, which is a basic technology of transmission networks for multimedia services.
- Accordingly, the MMT standard aims to provide efficient MPEG transmission techniques for a multimedia service environment that is shifting toward IP, and standardization of and research on the MMT standard have progressed in this respect.
-
FIG. 2 illustrates a conventional layer structure of an MMT system for transmission of a multimedia frame according to multi-service/content through heterogeneous networks. - Referring to
FIG. 2, an MMT system for configuring and transmitting a multimedia frame includes a media coding layer 210, an encapsulation layer (Layer E) 220, delivery layers (Layer D) 230 and 290, a network layer 240, a data link layer 250, a physical layer 260, and control layers (Layer C) 270 and 280. The layers span three technique areas: Layer E 220, Layers D 230 and 290, and Layers C 270 and 280. Layer E 220 controls complex content generation, Layers D 230 and 290 control the transmission of the generated complex content through the heterogeneous network, and Layers C 270 and 280 control the consumption management and the transmission management of the complex content. - Layer E 220 includes three layers, i.e., MMT E.3 222, MMT E.2 224, and MMT E.1 226. The MMT E.3 222 generates a fragment, which is a basic unit for the MMT service, based on coded multimedia data provided from the
media coding layer 210. The MMT E.2 224 generates an Access Unit (AU) for the MMT service by using the fragment generated by the MMT E.3 222. The AU is the smallest data unit having a unique presentation time. The MMT E.1 226 combines or divides the AUs provided by the MMT E.2 224 to generate a format for generation, storage, and transmission of the complex content. - Layer D includes three layers, i.e., MMT D.1 232, MMT D.2 234, and MMT D.3 290. The MMT D.1 232 operates with an Application Protocol (AP) similarly functioning to the RTP or the HTTP, the MMT D.2 234 operates with a network layer protocol similarly functioning to the UDP or the TCP, and the MMT D.3 290 controls optimization between the layers included in
Layer E 220 and the layers included in Layer D 230. - Layer C includes two layers, i.e., MMT C.1 270 and MMT C.2 280. The MMT C.1 270 provides information related to the generation and the consumption of the complex content, and the MMT C.2 280 provides information related to the transmission of the complex content.
-
FIG. 3 illustrates a conventional data transmission layer for a broadcast system. - Referring to
FIG. 3, on the transmission side, Layer E3, which is the top-level layer of Layer E, stores elements of the content, such as video and audio, that have been encoded into Network Abstraction Layer (NAL) units, fragment units, etc., by a codec encoder such as an Advanced Video Codec (AVC) or Scalable Video Codec (SVC) encoder, in units of AUs, and transmits the stored elements in units of AUs to Layer E2, which is a lower layer.
- Layer E2 structuralizes a plurality of AUs, encapsulates the structuralized AUs based on Layer E2 units, stores the encapsulated AUs in the unit of Elementary Streams (ES), and transmits the stored AUs to Layer E1, which is a next lower layer. Layer E1 instructs a relation and a construction of the elements of the content, such as the video and audio, encapsulates the elements together with the ES, and transmits the encapsulated elements to Layer D1 in units of packages.
- Layer D1 divides a received package in accordance with a form suitable for transmission of the divided package to a lower layer, and the lower layer then transmits the packet to a next lower layer.
- Layer D in a reception side collects the packets transmitted from the transmission side to configure the collected packets to the package of Layer E1. A receiver recognizes elements of the content within the package, a relation between the elements of the content, and information on construction of the elements of the content, to transfer the recognized information to a content element relation/construction processor and a content element processor. The content relation/construction processor transfers the respective elements for the proper reproduction of the entire content to the content element processor, and the content element processor controls elements to be reproduced at a set time and displayed at a set position on a screen.
- However, a conventional Layer E2 technique provides only the AU itself or information on a processing time for the AU reproduction, e.g., a Decoding Time Stamp (DTS) or a Composition Time Stamp (CTS) and a Random Access Point (RAP). Accordingly, the utilization of the conventional Layer E2 technique is limited.
- Accordingly, the present invention is designed to address at least the above-described problems and/or disadvantages occurring in the prior art, and to provide at least the advantages described below.
- An aspect of the present invention is to provide a method of configuring AUs to a data unit for efficient reproduction of the AUs in Layer E2.
- In accordance with an aspect of the present invention, a method is provided for receiving a media processing unit (MPU) including a data part and a control part, the MPU being processed independently, wherein the data part includes media data and the control part includes parameters related to the media data; and processing the received MPU, wherein the MPU comprises at least one fragmentation unit, wherein the parameters comprise a first parameter indicating a sequence number of the MPU, and wherein the sequence number of the MPU is unique to where the MPU belongs.
- The above and other aspects, features, and advantages of the present invention will be more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
-
FIG. 1 is a block diagram illustrating a layer structure for a conventional MPEG-2 TS; -
FIG. 2 is a block diagram illustrating an MMT service by a broadcast system based on a conventional MMT standard; -
FIG. 3 illustrates a conventional data transmission layer diagram in a broadcast system; -
FIG. 4 illustrates a conventional reproduction flow of a DU configured through encapsulation of AUs one by one; -
FIG. 5 illustrates a conventional process of receiving and reproducing a Data Unit (DU); -
FIG. 6 illustrates a process of receiving and reproducing a DU according to an embodiment of the present invention; -
FIG. 7A illustrates a construction of conventional AUs; -
FIG. 7B illustrates a construction of AUs according to an embodiment of the present invention; -
FIGS. 8A and 8B are diagrams illustrating a comparison of a temporal scalability according to a construction of AUs within a DU; -
FIGS. 9A and 9B are diagrams illustrating a comparison of an Application-Forward Error Control (AL-FEC) according to a construction of AUs within a DU; and -
FIG. 10 illustrates a construction of a DU according to an embodiment of the present invention. - Hereinafter, various embodiments of the present invention will be described with reference to the accompanying drawings in detail. In the following description, a detailed explanation of known related functions and constitutions may be omitted to avoid unnecessarily obscuring the subject matter of the present invention. Further, the terms used in the description are defined considering the functions of the present invention and may vary depending on the intention or usual practice of a user or operator. Therefore, the definitions should be made based on the entire content of the description.
- In accordance with an embodiment of the present invention, a method is proposed for configuring DUs by grouping a plurality of AUs. The DUs are continuously concatenated to become Elementary Streams (ES), which become data transmitted from Layer E2 to Layer E1.
- Conventionally, a DU is configured by encapsulating the AUs one by one, a DTS and a CTS are granted to each AU, and a picture type (Intra (I)-picture, Bidirectionally Predictive (B)-picture, or Predictive (P)-picture) of a corresponding AU is expressed in each AU or whether a corresponding AU is a RAP is displayed.
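- For comparison, a conventional one-AU-per-DU encapsulation can be pictured roughly as in the sketch below, which packages the per-AU information just listed (DTS, CTS, picture type, RAP indication); the structure and names are assumptions made only for illustration.

```c
#include <stdint.h>

/* Illustrative sketch of the conventional case: each DU carries exactly
 * one AU together with its own timing and picture-type information. */
typedef enum { PIC_I, PIC_P, PIC_B } PictureType;

typedef struct {
    uint64_t    dts;          /* decoding time stamp of the single AU */
    uint64_t    cts;          /* composition time stamp of the single AU */
    PictureType picture_type; /* I-, P-, or B-picture */
    int         is_rap;       /* 1 if the AU is a random access point */
    /* ...followed by the encapsulated AU payload... */
} ConventionalDuHeader;
```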
-
FIGS. 4 and 5 illustrate a reproduction flow of a conventional DU configured by encapsulating the AUs one by one, and FIG. 6 illustrates a reproduction flow of a DU configured with a plurality of AUs according to an embodiment of the present invention. - Referring to
FIG. 4 , when data begins to be received from a center of a DU string (401), because there is a probability that a corresponding DU is not the RAP, i.e., the I-picture, a receiver searches for the RAP, i.e., a DU in a type of I-picture, by continuously examining subsequent concatenated DUs (402), such that it is possible to initiate the reproduction the DU (403). - In accordance with an embodiment of the present invention, a DU is provided by grouping a plurality of AUs, and further configuring the DU in units of Group Of Pictures (GOPs), compared to the generation of a DU for each of the respective AUs. When the DU is configured in the GOPs, all DUs may be independently reproduced, without having to wait until a next DU is decoded, eliminating a complex buffer control requirement.
- Further, as illustrated in
FIG. 5 , when Layer E1 (501) instructs reproduction while limiting a part of an ES, if the DU merely includes one AU, there is no guarantee that the DU corresponding to the instructed CTS is the I-picture. Therefore, it is necessary for the receiver to search for DUs prior to the corresponding DU in an inverse direction (502), decode the DUs from the I-picture (503), and reproduce the DU (504), in order to reproduce the DU from an instructed time point. - However, in accordance with an embodiment of the present invention, as illustrated in
FIG. 6 , when the DU is configured in a unit of a GOP (as indicated by a dashed line), the reproduction of the DU from a time (601) instructed in Layer E1 does not require an inverse-directional search of the DUs (602 through 604). - In accordance with an embodiment of the present invention, a DU may be configured with a plurality of GOP units. When the DU is configured with a plurality of GOP units, the I-pictures, the P-pictures, and the B-pictures are separately grouped and stored, and the respective data may be differently stored in three places.
-
FIG. 7A illustrates a construction of a conventional AU, andFIG. 7B illustrates a construction of an AU according to an embodiment of the present invention. - As illustrated in
FIG. 7B , when the AUs are grouped according to a property and stored in the DU, even if a part of the DU fails to be transmitted during the transmission, it is possible to realize temporal scalability through a frame drop, etc. Further, because a transmission system utilizing an error correction method, such as an AL-FEC, may utilize a recoverable scope by departmentalizing a scope recoverable with the AL-FEC into a part including the collected I-pictures, a part including the collected PB-pictures, etc., grouping the AUs according to a property and stored them in the DU is also helpful for reducing transmission overhead due to the AL-FEC. -
FIGS. 8A and 8B are diagrams illustrating a comparison of a temporal scalability according to a construction of AUs within a DU between a conventional art and an embodiment of the present invention. - Referring to
FIGS. 8A and 8B , when the transmission of the DU is interrupted or an error is generated during the transmission of the DU, inFIG. 8A , it is impossible to view content after 8 seconds. However, inFIG. 8B , it is possible to view content for up to 14 seconds although it has a low temporal scalability. -
FIGS. 9A and 9B are diagrams illustrating a comparison of an AL-FEC according to a construction of AUs within a DU between the conventional art and an embodiment of the present invention. - As illustrated in
FIG. 9A , when the I-pictures, the P-pictures, and the B-pictures are arranged without any consideration to picture type, it is impossible to identify the construction of the AUs within the DU. Consequently, AL-FEC must then be applied to all durations. - However, in accordance with an embodiment of the present invention, when the AUs are arranged according to picture type, because the AUs of the I-picture and P-picture affect a picture quality, it is sufficient to apply AL-FEC only to the AUs in the I-picture and P-picture, as indicated by a thick line of
FIG. 9B . Accordingly, the overhead of the AL-FEC is decreased over the remaining durations, i.e., AUs of the B-pictures. - As described above, there are several advantages in the configuration of the DU within a unit of a GOP or a plurality of units of GOPs.
-
FIG. 10 illustrates a construction of a DU according to an embodiment of the present invention. - Referring to
FIG. 10 , the DU includes aheader 1001 and a set ofAUs 1002 included in a GOP or a plurality of GOPs. - The
header 1001 includes aDU description 1010, which includes information on the DU, anAU structure description 1020, which includes information on a construction theAUs 1002, andAU information 1030, which includes information on each AU. - For example, the
DU description 1010 may include the following information. - 1) Length 1011: This information represents a size of a DU and is a value obtained by adding a size of headers of remaining DUs and a size of a payload after a corresponding field. For example, the
Length 1011 may be represented in units of bytes. - 2) Sequence Number 1012: This information represents a sequence of a corresponding DU within the ES. Omission or duplicate reception between a plurality of continuous DUs may be identified using the
sequence number 1012. When an increase of sequence numbers between a previous DU and a continuously received DU exceeds “1”, this indicates that an error is generated in the transmission of the DU. - 3) Type of AU 1013: This information represents a type of AU included in the DU. For example, the AU may be generally classified into “timed data” or “non-timed data”, expressed with “0” or “1”, respectively. Timed data, represented by “0”, includes the CTS and/or the DTS and corresponds to multimedia elements, such as video data and audio data. Non-time data, represented by “1”, includes no CTS or DTS. The non-time data corresponds to general data, such a picture or a file.
- 4) Decoding Time of DU 1014: This information represents a time to start decoding a first AU of the DU, as a representative value.
- 5) Duration of DU 1015: This information represents a temporal length of the DU. A value obtained by adding a duration to the CTS of the first AU of the DU is the same as the time of termination of the reproduction of the finally decoded AU of the DU.
- 6) Error Correction Code of DU 1016: For example, a Cyclic Redundancy Check (CRC), a parity bit, etc., may be used as a code for error correction.
- Further, an
AU structure description 1020 may include the following information. - 1) Number of AUs 1021: This information represents the number of AUs within the DU.
- 2) Pattern of AUs 1022: This information represents a structure and an arrangement pattern of AUs. For example, the Pattern of
AUs 1022 may be indicated with values 0: open GOP, 1: closed GOP, 2: IPBIPB, 4:IIPPBB, 6: Unknown, or 8: reserved. - Each bit value is added through the OR calculation for use. For example, the construction of IPBIPB of the closed GOP is 1|2=3.
- Open GOP, represented by “0”, represents when the GOP is the open GOP. Closed GOP, represented by “1”, represents when the GOP is the closed GOP. Definitions of the open GOP and closed GOP are the same as that of the conventional art.
- IPBIPB, represented by “2”, represents when I-pictures, P-pictures, and B-pictures are collected based on each group and repeated at least two times within the DU, e.g., IPBBIPBB or IPPBBBBIPPBBBB. IIPPBB, represented by “4”, represents when I-pictures, P-pictures, and B-pictures are collected based on each group and repeated only one time within the DU, e.g., IIPPBBBB or IIPPPPBBBBBBBB. Unknown, represented by “6”, represents a failure to identify a pattern, and is used in when an order of AUs is not changed.
- Reserved, represented by “8”, represents a value reserved for a later user.
- 3) Size of Patterns 1023: This information represents a size of each duration of a repeated pattern. For example, when pattern IPBIPB is actually configured as IPPBBBBIPPBBBB, lengths of duration I, duration PP, and duration BBBB are added to be represented as three values in units of bytes.
- The size of the pattern may be expressed as:
-
- for(i=0;i<number_of_patterns,i++){Size of patterns;}:
- Further, the
AU information 1030 may include the following information. - 1) DTS of AUs 1031: This information represents the DTS of the AU, and may be expressed as “for(i=0;i<number_of_AUs;i++){Decoding timestamp of AU;}”.
- 2) CTS of AUs 1032: This information represents the CTS of the AU, and may be expressed as “for(i=0;i<number_of_AUs;i++){Composition timestamp of AU;}”.
- 3) Size of AUs 1033: This information represents a size of the AU in the unit of bytes, and may be expressed as “for(i=0;i<number_of_AUs;i++){Size of AU;}”.
- 4) Duration of AUs 1034: This information represents a temporal length of the AU, and may be expressed as “for(i=0;i<number_of_AUs;i++){Duration of AU;}”.
- 5) AU num of RAP 1035: This information represents a number of the AU, and may be expressed as “for(i=0;i<number_of_RAPs;i++){AU number;}”.
- 6) Independent and disposable AUs 1036: This information represents a relationship between a corresponding AU and a different AU, and may be expressed as “for(i=0;i<number_of_AUs;i++){Independent and disposable value of AU;}”.
- More specifically, when the corresponding AU is dependent on the different AU, a value of the Independent and
Disposable AUs 1036 is “1”, when the different AU refers to the corresponding AU, a value of the Independent andDisposable AUs 1036 is “2”, and when the corresponding AU and the different AU have duplicated information, a value of the Independent andDisposable AUs 1036 is “4”. - While the present invention has been shown and described with reference to certain embodiments and drawings thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.
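- Pulling the FIG. 10 description together, the sketch below lays out one possible in-memory form of the DU header (DU description, AU structure description, and per-AU information). Field names, integer widths, and the use of parallel arrays for the per-AU loops are assumptions made only for illustration of the construction described above.

```c
#include <stddef.h>
#include <stdint.h>

/* Illustrative sketch of a DU header following the FIG. 10 layout. */
typedef struct {
    uint32_t length;                /* size of the remaining header and payload, in bytes */
    uint32_t sequence_number;       /* position of this DU within the ES */
    uint8_t  type_of_au;            /* 0: timed data, 1: non-timed data */
    uint64_t decoding_time_of_du;   /* decoding time of the first AU of the DU */
    uint64_t duration_of_du;        /* temporal length of the DU */
    uint32_t error_correction_code; /* e.g., a CRC over the DU */
} DuDescription;

typedef struct {
    uint32_t  number_of_aus;        /* number of AUs contained in the DU */
    uint8_t   pattern_of_aus;       /* OR of 0/1 (open/closed GOP) and 2/4 (IPBIPB/IIPPBB), etc. */
    uint32_t *size_of_patterns;     /* one size per repeated pattern duration, in bytes */
    size_t    pattern_count;
} AuStructureDescription;

typedef struct {
    /* Parallel arrays of length number_of_aus, mirroring the
     * "for(i=0;i<number_of_AUs;i++){...}" loops in the text. */
    uint64_t *dts;
    uint64_t *cts;
    uint32_t *size;
    uint64_t *duration;
    uint32_t *rap_au_numbers;         /* AU numbers of the RAPs (length number_of_RAPs) */
    uint8_t  *independent_disposable; /* 1: depends on another AU, 2: referenced, 4: duplicated */
} AuInformation;

typedef struct {
    DuDescription          du;
    AuStructureDescription au_structure;
    AuInformation          au_info;
    /* ...followed by the set of AUs of one GOP or a plurality of GOPs... */
} DuHeader;
```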
Claims (7)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/588,417 US20200029130A1 (en) | 2011-03-16 | 2019-09-30 | Method and apparatus for configuring content in a broadcast system |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2011-0023578 | 2011-03-16 | ||
KR1020110023578A KR101803970B1 (en) | 2011-03-16 | 2011-03-16 | Method and apparatus for composing content |
US13/421,375 US10433024B2 (en) | 2011-03-16 | 2012-03-15 | Method and apparatus for configuring content in a broadcast system |
US16/588,417 US20200029130A1 (en) | 2011-03-16 | 2019-09-30 | Method and apparatus for configuring content in a broadcast system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/421,375 Continuation US10433024B2 (en) | 2011-03-16 | 2012-03-15 | Method and apparatus for configuring content in a broadcast system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200029130A1 true US20200029130A1 (en) | 2020-01-23 |
Family
ID=46829540
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/421,375 Active US10433024B2 (en) | 2011-03-16 | 2012-03-15 | Method and apparatus for configuring content in a broadcast system |
US16/588,417 Abandoned US20200029130A1 (en) | 2011-03-16 | 2019-09-30 | Method and apparatus for configuring content in a broadcast system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/421,375 Active US10433024B2 (en) | 2011-03-16 | 2012-03-15 | Method and apparatus for configuring content in a broadcast system |
Country Status (4)
Country | Link |
---|---|
US (2) | US10433024B2 (en) |
EP (1) | EP2687013A4 (en) |
KR (1) | KR101803970B1 (en) |
WO (1) | WO2012125001A2 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190075334A1 (en) * | 2014-01-17 | 2019-03-07 | Saturn Licensing Llc | Communication apparatus, communication data generation method, and communication data processing method |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101903443B1 (en) | 2012-02-02 | 2018-10-02 | 삼성전자주식회사 | Apparatus and method for transmitting/receiving scene composition information |
KR20130090824A (en) * | 2012-02-06 | 2013-08-14 | 한국전자통신연구원 | Mmt asset structures, structing methods and structing apparatuses supporting random access for a system transporting coded media data in heterogeneous ip network |
US9071853B2 (en) * | 2012-08-31 | 2015-06-30 | Google Technology Holdings LLC | Broadcast content to HTTP client conversion |
KR102163261B1 (en) | 2012-10-11 | 2020-10-08 | 삼성전자주식회사 | Apparatus and method for delivering and receiving multimedia data in hybrid network |
EP2907278B1 (en) * | 2012-10-11 | 2019-07-31 | Samsung Electronics Co., Ltd. | Apparatus and method for transmitting mmt packets in a broadcasting and communication system |
US10015486B2 (en) * | 2012-10-26 | 2018-07-03 | Intel Corporation | Enhanced video decoding with application layer forward error correction |
US11290510B2 (en) * | 2012-11-29 | 2022-03-29 | Samsung Electronics Co., Ltd. | Method and apparatus for encapsulation of motion picture experts group media transport assets in international organization for standardization base media files |
KR101484843B1 (en) | 2013-04-19 | 2015-01-20 | 삼성전자주식회사 | A method and apparatus for transmitting a media transport packet in a multimedia transport system |
EP3007454A4 (en) * | 2013-06-05 | 2016-06-01 | Panasonic Ip Corp America | METHOD FOR DECODING DATA, APPARATUS FOR DECODING DATA, AND METHOD FOR TRANSMITTING DATA |
JP2015015706A (en) * | 2013-07-03 | 2015-01-22 | パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America | Data transmission method, data reproduction method, data transmission device, and data reproduction device |
KR101814400B1 (en) | 2013-08-19 | 2018-01-04 | 엘지전자 주식회사 | Apparatus for transmitting broadcast signals, apparatus for receiving broadcast signals, method for transmitting broadcast signals and method for receiving broadcast signals |
US9537779B2 (en) | 2013-10-11 | 2017-01-03 | Huawei Technologies Co., Ltd. | System and method for real-time traffic delivery |
EP4054199A1 (en) | 2013-12-16 | 2022-09-07 | Panasonic Intellectual Property Corporation of America | Receiving device and reception method |
JP6652320B2 (en) * | 2013-12-16 | 2020-02-19 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | Transmission method, reception method, transmission device, and reception device |
US9218848B1 (en) * | 2014-07-01 | 2015-12-22 | Amazon Technologies, Inc. | Restructuring video streams to support random access playback |
CA2968855C (en) * | 2014-11-25 | 2021-08-24 | Arris Enterprises Llc | Filler detection during trickplay |
US11051026B2 (en) * | 2015-08-31 | 2021-06-29 | Intel Corporation | Method and system of frame re-ordering for video coding |
US10142707B2 (en) * | 2016-02-25 | 2018-11-27 | Cyberlink Corp. | Systems and methods for video streaming based on conversion of a target key frame |
CN115484498A (en) * | 2021-05-31 | 2022-12-16 | 华为技术有限公司 | Method and device for playing video |
Family Cites Families (95)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6434319B1 (en) * | 1994-01-19 | 2002-08-13 | Thomson Licensing S.A. | Digital video tape recorder for digital HDTV |
US5809201A (en) * | 1994-06-24 | 1998-09-15 | Mitsubishi Denki Kabushiki Kaisha | Specially formatted optical disk and method of playback |
US6009236A (en) * | 1994-09-26 | 1999-12-28 | Mitsubishi Denki Kabushiki Kaisha | Digital video signal record and playback device and method for giving priority to a center of an I frame |
US6064794A (en) * | 1995-03-30 | 2000-05-16 | Thomson Licensing S.A. | Trick-play control for pre-encoded video |
US6138147A (en) * | 1995-07-14 | 2000-10-24 | Oracle Corporation | Method and apparatus for implementing seamless playback of continuous media feeds |
US5926610A (en) * | 1995-11-15 | 1999-07-20 | Sony Corporation | Video data processing method, video data processing apparatus and video data recording and reproducing apparatus |
KR100593581B1 (en) * | 1997-10-17 | 2006-06-28 | 코닌클리케 필립스 일렉트로닉스 엔.브이. | How to encapsulate data into transport packets of constant size |
CA2265089C (en) * | 1998-03-10 | 2007-07-10 | Sony Corporation | Transcoding system using encoding history information |
US6483543B1 (en) * | 1998-07-27 | 2002-11-19 | Cisco Technology, Inc. | System and method for transcoding multiple channels of compressed video streams using a self-contained data unit |
US8290351B2 (en) * | 2001-04-03 | 2012-10-16 | Prime Research Alliance E., Inc. | Alternative advertising in prerecorded media |
CN100393128C (en) * | 1999-02-05 | 2008-06-04 | 索尼公司 | Encoding device and method, decoding device and method and coding system and method |
US7096487B1 (en) * | 1999-10-27 | 2006-08-22 | Sedna Patent Services, Llc | Apparatus and method for combining realtime and non-realtime encoded content |
JP4362914B2 (en) * | 1999-12-22 | 2009-11-11 | Sony Corporation | Information providing apparatus, information using apparatus, information providing system, information providing method, information using method, and recording medium |
CN1383662A (en) * | 2000-06-06 | 2002-12-04 | 皇家菲利浦电子有限公司 | Interactive processing system |
JP4361674B2 (en) * | 2000-06-26 | 2009-11-11 | Panasonic Corporation | Playback apparatus and computer-readable recording medium |
KR100640921B1 (en) | 2000-06-29 | 2006-11-02 | LG Electronics Inc. | Method of creating and sending protocol data units |
US6871006B1 (en) * | 2000-06-30 | 2005-03-22 | Emc Corporation | Processing of MPEG encoded video for trick mode operation |
US6816194B2 (en) * | 2000-07-11 | 2004-11-09 | Microsoft Corporation | Systems and methods with error resilience in enhancement layer bitstream of scalable video coding |
AU2002245609A1 (en) * | 2001-03-05 | 2002-09-19 | Intervideo, Inc. | Systems and methods of error resilience in a video decoder |
US20030046429A1 (en) * | 2001-08-30 | 2003-03-06 | Sonksen Bradley Stephen | Static data item processing |
US20030185299A1 (en) * | 2001-11-30 | 2003-10-02 | Taro Takita | Program, recording medium, and image encoding apparatus and method |
US20090282444A1 (en) * | 2001-12-04 | 2009-11-12 | Vixs Systems, Inc. | System and method for managing the presentation of video |
FI114527B (en) * | 2002-01-23 | 2004-10-29 | Nokia Corp | Grouping of picture frames in video encoding |
EP1479245A1 (en) * | 2002-01-23 | 2004-11-24 | Nokia Corporation | Grouping of image frames in video coding |
CN100471267C (en) * | 2002-03-08 | 2009-03-18 | 法国电信公司 | Method for the transmission of dependent data flows |
JP4281309B2 (en) * | 2002-08-23 | 2009-06-17 | Sony Corporation | Image processing apparatus, image processing method, image frame data storage medium, and computer program |
CA2497697C (en) * | 2002-09-12 | 2013-07-09 | Matsushita Electric Industrial Co., Ltd. | Recording medium, playback device, program, playback method, and recording method |
KR100488804B1 (en) * | 2002-10-07 | 2005-05-12 | Electronics and Telecommunications Research Institute | System for data processing of 2-view 3-dimensional moving pictures based on MPEG-4, and method thereof |
US7409702B2 (en) * | 2003-03-20 | 2008-08-05 | Sony Corporation | Auxiliary program association table |
US7313236B2 (en) * | 2003-04-09 | 2007-12-25 | International Business Machines Corporation | Methods and apparatus for secure and adaptive delivery of multimedia content |
US7567584B2 (en) * | 2004-01-15 | 2009-07-28 | Panasonic Corporation | Multiplex scheme conversion apparatus |
US8351514B2 (en) * | 2004-01-16 | 2013-01-08 | General Instrument Corporation | Method, protocol, and apparatus for transporting advanced video coding content |
US7586924B2 (en) * | 2004-02-27 | 2009-09-08 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Apparatus and method for coding an information signal into a data stream, converting the data stream and decoding the data stream |
JP2005277591A (en) * | 2004-03-23 | 2005-10-06 | Toshiba Corp | Electronic camera apparatus and imaging signal generating method |
JP2005285209A (en) * | 2004-03-29 | 2005-10-13 | Toshiba Corp | Metadata of moving image |
TWI405466B (en) * | 2004-04-16 | 2013-08-11 | Panasonic Corp | Playback device, playback program, playback method, and playback system |
CN101677382B (en) * | 2004-04-28 | 2013-01-09 | 松下电器产业株式会社 | Stream generation apparatus, stream generation method, coding apparatus, coding method, recording medium and program thereof |
US7843994B2 (en) * | 2004-04-28 | 2010-11-30 | Panasonic Corporation | Moving picture stream generation apparatus, moving picture coding apparatus, moving picture multiplexing apparatus and moving picture decoding apparatus |
TW200845724A (en) * | 2004-06-02 | 2008-11-16 | Matsushita Electric Ind Co Ltd | Multiplexing apparatus and demultiplexing apparatus |
PL1751978T3 (en) * | 2004-06-02 | 2011-07-29 | Panasonic Corp | Picture coding apparatus and picture decoding apparatus |
JP4608953B2 (en) * | 2004-06-07 | 2011-01-12 | Sony Corporation | Data recording apparatus, method and program, data reproducing apparatus, method and program, and recording medium |
JP4575129B2 (en) * | 2004-12-02 | 2010-11-04 | Sony Corporation | Data processing device, data processing method, program, and program recording medium |
KR100665102B1 (en) * | 2004-12-03 | 2007-01-04 | Electronics and Telecommunications Research Institute | Video coding rate control method considering the length of a transport packet, and video encoding apparatus using the same |
KR100651486B1 (en) | 2004-12-07 | 2006-11-29 | Samsung Electronics Co., Ltd. | Apparatus and method for transmitting multimedia content through a network |
WO2006075635A1 (en) * | 2005-01-17 | 2006-07-20 | Matsushita Electric Industrial Co., Ltd. | Image decoding method |
US7848408B2 (en) | 2005-01-28 | 2010-12-07 | Broadcom Corporation | Method and system for parameter generation for digital noise reduction based on bitstream properties |
JP4261508B2 (en) * | 2005-04-11 | 2009-04-30 | Toshiba Corporation | Video decoding device |
JP4374548B2 (en) * | 2005-04-15 | 2009-12-02 | Sony Corporation | Decoding device and method, recording medium, and program |
US7978955B2 (en) * | 2005-04-22 | 2011-07-12 | Sony Corporation | Recording device, recording method, reproducing device, reproducing method, program, and recording medium |
JP2008539638A (en) * | 2005-04-26 | 2008-11-13 | Koninklijke Philips Electronics N.V. | Apparatus and method for processing a data stream having a packet sequence and timing information about the packets |
EP1725036A1 (en) | 2005-05-20 | 2006-11-22 | Thomson Licensing | A method and a video server for embedding audiovisual packets in an IP packet |
US8055783B2 (en) | 2005-08-22 | 2011-11-08 | Utc Fire & Security Americas Corporation, Inc. | Systems and methods for media stream processing |
US8712169B2 (en) * | 2005-08-26 | 2014-04-29 | Thomson Licensing | Transcoded images for improved trick play |
FR2898754B1 (en) | 2006-03-17 | 2008-06-13 | Thales Sa | Method for protecting multimedia data using additional network abstraction layers (NAL) |
EP1845685B1 (en) | 2006-04-11 | 2012-06-27 | Alcatel Lucent | Optimised transmission of content IP packets by adding to the IP packets content-related information |
US9432433B2 (en) * | 2006-06-09 | 2016-08-30 | Qualcomm Incorporated | Enhanced block-request streaming system using signaling or block creation |
JP4207981B2 (en) * | 2006-06-13 | 2009-01-14 | Sony Corporation | Information processing apparatus, information processing method, program, and recording medium |
US7746882B2 (en) * | 2006-08-22 | 2010-06-29 | Nokia Corporation | Method and device for assembling forward error correction frames in multimedia streaming |
EP2060074A1 (en) * | 2006-09-15 | 2009-05-20 | France Telecom | Method and device for adapting a scalable data stream, corresponding computer program product and network equipment |
CN101528633A (en) * | 2006-11-01 | 2009-09-09 | 日立金属株式会社 | Semiconductor ceramic composition and process for producing the same |
US20080141091A1 (en) * | 2006-12-06 | 2008-06-12 | General Instrument Corporation | Method and Apparatus for Recovering From Errors in Transmission of Encoded Video Over a Local Area Network |
CN101622879B (en) * | 2007-01-18 | 2012-05-23 | Nokia Corporation | Carriage of SEI messages in RTP payload format |
US7890556B2 (en) * | 2007-04-04 | 2011-02-15 | Sony Corporation | Content recording apparatus, content playback apparatus, content playback system, image capturing apparatus, processing method for the content recording apparatus, the content playback apparatus, the content playback system, and the image capturing apparatus, and program |
DE112008000552B4 (en) * | 2007-05-14 | 2020-04-23 | Samsung Electronics Co., Ltd. | Method and device for broadcast reception |
US20090106807A1 (en) * | 2007-10-19 | 2009-04-23 | Hitachi, Ltd. | Video Distribution System for Switching Video Streams |
JP2009135686A (en) * | 2007-11-29 | 2009-06-18 | Mitsubishi Electric Corp | Stereoscopic video recording method, stereoscopic video recording medium, stereoscopic video reproducing method, stereoscopic video recording apparatus, and stereoscopic video reproducing apparatus |
EP2071850A1 (en) | 2007-12-10 | 2009-06-17 | Alcatel Lucent | Intelligent wrapping of video content to lighten downstream processing of video streams |
JP2009163643A (en) * | 2008-01-09 | 2009-07-23 | Sony Corp | Video retrieval device, editing device, video retrieval method and program |
US8973028B2 (en) * | 2008-01-29 | 2015-03-03 | Samsung Electronics Co., Ltd. | Information storage medium storing metadata and method of providing additional contents, and digital broadcast reception apparatus |
JP2009253675A (en) * | 2008-04-07 | 2009-10-29 | Canon Inc | Reproducing apparatus and method, and program |
US20100049865A1 (en) * | 2008-04-16 | 2010-02-25 | Nokia Corporation | Decoding Order Recovery in Session Multiplexing |
EP2129028B1 (en) * | 2008-05-06 | 2012-10-17 | Alcatel Lucent | Recovery of transmission errors |
US20100008419A1 (en) * | 2008-07-10 | 2010-01-14 | Apple Inc. | Hierarchical Bi-Directional P Frames |
US20100091841A1 (en) * | 2008-10-07 | 2010-04-15 | Motorola, Inc. | System and method of optimized bit extraction for scalable video coding |
US8301974B2 (en) | 2008-10-22 | 2012-10-30 | Samsung Electronics Co., Ltd. | System and method for low complexity raptor codes for multimedia broadcast/multicast service |
KR20110106465A (en) * | 2009-01-28 | 2011-09-28 | Nokia Corporation | Method and apparatus for video coding and decoding |
US9281847B2 (en) * | 2009-02-27 | 2016-03-08 | Qualcomm Incorporated | Mobile reception of digital video broadcasting—terrestrial services |
US20100254453A1 (en) * | 2009-04-02 | 2010-10-07 | Qualcomm Incorporated | Inverse telecine techniques |
JP4993224B2 (en) * | 2009-04-08 | 2012-08-08 | Sony Corporation | Playback apparatus and playback method |
EP2265026A1 (en) * | 2009-06-16 | 2010-12-22 | Canon Kabushiki Kaisha | Method and device for deblocking filtering of SVC type video streams during decoding |
US8310947B2 (en) * | 2009-06-24 | 2012-11-13 | Empire Technology Development Llc | Wireless network access using an adaptive antenna array |
US20110019693A1 (en) * | 2009-07-23 | 2011-01-27 | Sanyo North America Corporation | Adaptive network system with online learning and autonomous cross-layer optimization for delay-sensitive applications |
ES2607457T3 (en) * | 2009-09-09 | 2017-03-31 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Data structure for transmitting access units comprising audio-video data |
JP5540969B2 (en) * | 2009-09-11 | 2014-07-02 | Sony Corporation | Nonvolatile memory device, memory controller, and memory system |
US8731053B2 (en) * | 2009-11-18 | 2014-05-20 | Tektronix, Inc. | Method of multiplexing H.264 elementary streams without timing information coded |
US9185335B2 (en) * | 2009-12-28 | 2015-11-10 | Thomson Licensing | Method and device for reception of video contents and services broadcast with prior transmission of data |
KR101777348B1 (en) * | 2010-02-23 | 2017-09-11 | Samsung Electronics Co., Ltd. | Method and apparatus for transmitting and receiving data |
US9223643B2 (en) * | 2010-03-04 | 2015-12-29 | Microsoft Technology Licensing, Llc | Content interruptions |
CN107257326B (en) * | 2010-04-20 | 2021-04-23 | 三星电子株式会社 | Interface apparatus and method for transmitting and receiving media data |
US20110293021A1 (en) * | 2010-05-28 | 2011-12-01 | Jayant Kotalwar | Prevent audio loss in the spliced content generated by the packet level video splicer |
US9049497B2 (en) * | 2010-06-29 | 2015-06-02 | Qualcomm Incorporated | Signaling random access points for streaming video data |
US8918533B2 (en) * | 2010-07-13 | 2014-12-23 | Qualcomm Incorporated | Video switching for streaming video data |
US9319448B2 (en) * | 2010-08-10 | 2016-04-19 | Qualcomm Incorporated | Trick modes for network streaming of coded multimedia data |
US9025941B2 (en) * | 2011-02-10 | 2015-05-05 | Panasonic Intellectual Property Management Co., Ltd. | Data creation device and playback device for video picture in video stream |
US9565476B2 (en) * | 2011-12-02 | 2017-02-07 | Netzyn, Inc. | Video providing textual content system and method |
2011
- 2011-03-16 KR KR1020110023578A patent/KR101803970B1/en active Active
2012
- 2012-03-15 US US13/421,375 patent/US10433024B2/en active Active
- 2012-03-16 WO PCT/KR2012/001908 patent/WO2012125001A2/en active Application Filing
- 2012-03-16 EP EP12757912.6A patent/EP2687013A4/en not_active Ceased
2019
- 2019-09-30 US US16/588,417 patent/US20200029130A1/en not_active Abandoned
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190075334A1 (en) * | 2014-01-17 | 2019-03-07 | Saturn Licensing Llc | Communication apparatus, communication data generation method, and communication data processing method |
US10820024B2 (en) * | 2014-01-17 | 2020-10-27 | Saturn Licensing Llc | Communication apparatus, communication data generation method, and communication data processing method |
US11284135B2 (en) * | 2014-01-17 | 2022-03-22 | Saturn Licensing Llc | Communication apparatus, communication data generation method, and communication data processing method |
Also Published As
Publication number | Publication date |
---|---|
WO2012125001A3 (en) | 2012-12-27 |
US20120240174A1 (en) | 2012-09-20 |
KR101803970B1 (en) | 2017-12-28 |
EP2687013A4 (en) | 2014-09-10 |
WO2012125001A2 (en) | 2012-09-20 |
EP2687013A2 (en) | 2014-01-22 |
KR20120105875A (en) | 2012-09-26 |
US10433024B2 (en) | 2019-10-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20200029130A1 (en) | Method and apparatus for configuring content in a broadcast system | |
US11196786B2 (en) | Interface apparatus and method for transmitting and receiving media data | |
US11895357B2 (en) | Broadcasting signal transmission device, broadcasting signal reception device, broadcasting signal transmission method, and broadcasting signal reception method | |
US20200280747A1 (en) | Apparatus and method for transmitting multimedia frame in broadcast system | |
KR101972951B1 (en) | Method of delivering media data based on packet with header minimizing delivery overhead | |
JP6422527B2 (en) | Data receiving method and apparatus in multimedia system | |
US8301982B2 (en) | RTP-based loss recovery and quality monitoring for non-IP and raw-IP MPEG transport flows | |
US20160105259A1 (en) | Apparatus and method of transmitting/receiving broadcast data | |
US20170272691A1 (en) | Broadcast signal transmission device, broadcast signal reception device, broadcast signal transmission method, and broadcast signal reception method | |
US20140334504A1 (en) | Method for hybrid delivery of mmt package and content and method for receiving content | |
KR20130040090A (en) | Apparatus and method for delivering multimedia data in hybrid network | |
KR20050052531A (en) | System and method for transmitting scalable coded video over ip network | |
KR20140084142A (en) | Network streaming of media data | |
EP2667625A2 (en) | Apparatus and method for transmitting multimedia data in a broadcast system | |
WO2007045140A1 (en) | A real-time method for transporting multimedia data | |
MacAulay et al. | WHITEPAPER IP streaming of MPEG-4: Native RTP vs MPEG-2 transport stream | |
Paulsen et al. | MPEG-4/AVC versus MPEG-2 in IPTV. | |
TW202448163A (en) | Signaling media timing information from a media application to a network element | |
KR20130058539A (en) | Methods of synchronization in hybrid delivery | |
KR20150035857A (en) | Apparatus and method for delivering multimedia data in hybrid network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general | Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |