
WO2017018768A1 - Broadcast signal transmission device, broadcast signal reception device, broadcast signal transmission method, and broadcast signal reception method - Google Patents


Info

Publication number
WO2017018768A1
Authority
WO
WIPO (PCT)
Prior art keywords
service
information
broadcast
attribute
component
Prior art date
Application number
PCT/KR2016/008118
Other languages
English (en)
Korean (ko)
Inventor
이장원
곽민성
고우석
홍성룡
문경수
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사
Publication of WO2017018768A1


Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 - Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
    • H04L9/40 - Network security protocols

Description

  • the present invention relates to a broadcast signal transmission apparatus, a broadcast signal reception apparatus, and a broadcast signal transmission and reception method.
  • the digital broadcast signal may include a larger amount of video / audio data than the analog broadcast signal, and may further include various types of additional data as well as the video / audio data.
  • the digital broadcasting system may provide high definition (HD) images, multichannel audio, and various additional services.
  • HD high definition
  • data transmission efficiency for transmitting large amounts of data, robustness of the transmission / reception network, and network flexibility that takes mobile receiving devices into account should be improved.
  • the present invention proposes a system and an associated signaling scheme that can effectively support next-generation broadcast services in an environment supporting next-generation hybrid broadcasting over terrestrial broadcast networks and the Internet.
  • the present invention provides a signaling structure for supporting a hybrid broadcast system.
  • the presentation synchronization of the DASH segment can be easily achieved by adding the presentation attribute information to the MPD.
  • the DASH client may recognize the information about the presentation and process the segment effectively.
  • the transmission path information of the service component may be provided to the DASH client by adding transmission path information to the MPD.
  • FIG. 1 is a diagram illustrating a protocol stack according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a service discovery process according to an embodiment of the present invention.
  • LLS low level signaling
  • SLT service list table
  • FIG. 4 illustrates a USBD and an S-TSID delivered to ROUTE according to an embodiment of the present invention.
  • FIG. 5 is a diagram illustrating a USBD delivered to MMT according to an embodiment of the present invention.
  • FIG. 6 illustrates a link layer operation according to an embodiment of the present invention.
  • FIG. 7 illustrates a link mapping table (LMT) according to an embodiment of the present invention.
  • FIG. 8 shows a structure of a broadcast signal transmission apparatus for a next generation broadcast service according to an embodiment of the present invention.
  • FIG. 9 illustrates a writing operation of a time interleaver according to an embodiment of the present invention.
  • FIG. 10 is a block diagram of an interleaving address generator composed of a main-PRBS generator and a sub-PRBS generator according to each FFT mode included in a frequency interleaver according to an embodiment of the present invention.
  • FIG. 11 is a diagram illustrating a receiver of a next generation broadcast system according to an embodiment of the present invention.
  • FIG. 12 is a block diagram of a hybrid broadcast receiver according to an embodiment of the present invention.
  • FIG. 13 is a block diagram of a hybrid broadcast receiver according to another embodiment of the present invention.
  • FIG. 14 illustrates a timeline component for synchronization between a transport stream of a broadcasting network and a transport stream of an internet network (heterogeneous network) according to an embodiment of the present invention.
  • FIG. 15 shows a broadcast transport frame according to an embodiment of the present invention.
  • FIG. 16 shows a broadcast transport frame according to another embodiment of the present invention.
  • FIG. 17 is a diagram illustrating a state in which presentation attribute information is added to an MPD according to an embodiment of the present invention.
  • FIG. 18 is a diagram illustrating an example of a process of calculating a presentation start time of a segment according to an embodiment of the present invention.
  • FIG. 19 illustrates presentable component information according to an embodiment of the present invention.
  • FIG. 20 is a diagram illustrating a TargetScreen according to an embodiment of the present invention.
  • FIG. 21 is a diagram illustrating a value of a TargetScreen according to an embodiment of the present invention.
  • FIG. 22 is a diagram illustrating a TargetProperty according to an embodiment of the present invention.
  • FIG. 23 is a diagram illustrating a state in which presentable component information is added to a common attribute / element in an MPD according to an embodiment of the present invention.
  • FIG. 24 is a diagram illustrating a state in which presentable component information is included in RepresentationBaseType according to an embodiment of the present invention.
  • FIG. 25 is a diagram illustrating a state in which presentable component information is added to a content component attribute / element in an MPD according to an embodiment of the present invention.
  • FIG. 26 is a diagram showing the presentable component information included in a ContentComponentType according to an embodiment of the present invention.
  • FIG. 27 illustrates a continuous component and a composite component according to an embodiment of the present invention.
  • FIG. 29 is a diagram illustrating another example of adding composite information on an MPD according to an embodiment of the present invention.
  • FIG. 30 is a diagram illustrating another example of adding composite information on an MPD according to an embodiment of the present invention.
  • FIG. 31 is a diagram illustrating another example of adding composite information on an MPD according to an embodiment of the present invention.
  • FIG. 32 is a diagram illustrating an example of transmitting transmission path information on an MPD according to an embodiment of the present invention.
  • FIG. 33 is a diagram illustrating another example of transmitting transmission path information on an MPD according to an embodiment of the present invention.
  • FIG. 34 is a diagram illustrating another example of transmitting transmission path information on an MPD according to an embodiment of the present invention.
  • FIG. 35 is a diagram illustrating a state in which transmission path information is added as a lower element of a BaseURL element, and added to ExtendedBaseURL according to an embodiment of the present invention.
  • FIG. 36 is a diagram illustrating an example of transmitting transmission path information on an MPD according to another embodiment of the present invention.
  • FIG. 37 is a diagram illustrating an example of expressing transmission path information by an XML element according to another embodiment of the present invention.
  • FIG. 38 is a diagram illustrating another example of representing transmission path information using an XML element according to another embodiment of the present invention.
  • FIG. 39 is a view showing another example of expressing transmission path information by an XML element according to another embodiment of the present invention.
  • FIG. 40 illustrates a broadcast signal transmission method according to an embodiment of the present invention.
  • FIG. 41 is a view showing a broadcast signal transmission apparatus according to an embodiment of the present invention.
  • FIG. 42 is a view showing a broadcast signal receiving method according to an embodiment of the present invention.
  • FIG. 43 is a diagram illustrating a broadcast signal receiving apparatus according to an embodiment of the present invention.
  • the present invention provides an apparatus and method for transmitting and receiving broadcast signals for next generation broadcast services.
  • the next generation broadcast service includes a terrestrial broadcast service, a mobile broadcast service, a UHDTV service, and the like.
  • a broadcast signal for a next generation broadcast service may be processed through a non-MIMO (multiple input multiple output) scheme or a MIMO scheme.
  • the non-MIMO scheme according to an embodiment of the present invention may include a multiple input single output (MISO) scheme, a single input single output (SISO) scheme, and the like.
  • MISO multiple input single output
  • SISO single input single output
  • the present invention proposes a physical profile (or system) that is optimized to minimize receiver complexity while achieving the performance required for a particular application.
  • FIG. 1 is a diagram illustrating a protocol stack according to an embodiment of the present invention.
  • the service may be delivered to the receiver through a plurality of layers.
  • the transmitting side can generate service data.
  • the delivery layer on the transmitting side applies transmission processing to the service data, and the physical layer encodes it into a broadcast signal and transmits it through a broadcasting network or broadband.
  • the service data may be generated in a format according to ISO BMFF (base media file format).
  • the ISO BMFF media file may be used in broadcast network / broadband delivery, media encapsulation and / or synchronization format.
  • the service data refers to all data related to the service, and may include the service components constituting a linear service, their signaling information, non real time (NRT) data, and other files.
  • the delivery layer will be described.
  • the delivery layer may provide a transmission function for service data.
  • the service data may be delivered through a broadcast network and / or broadband.
  • the first method may be to process service data into Media Processing Units (MPUs) based on MPEG Media Transport (MMT) and transmit them using the MMT protocol (MMTP).
  • MPUs Media Processing Units
  • MMT MPEG Media Transport
  • MMTP MMT protocol
  • the service data delivered through the MMTP may include service components for linear service and / or service signaling information thereof.
  • the second method may be to process service data into DASH segments based on MPEG DASH and transmit it using Real Time Object Delivery over Unidirectional Transport (ROUTE).
  • the service data delivered through the ROUTE protocol may include service components for the linear service, service signaling information and / or NRT data thereof. That is, non-timed data such as NRT data and files may be delivered through ROUTE.
  • Data processed according to the MMTP or ROUTE protocol may be processed into IP packets via the UDP / IP layer.
  • a service list table (SLT) may also be transmitted through a broadcasting network through a UDP / IP layer.
  • the SLT may be included in the LLS (Low Level Signaling) table and transmitted. The SLT and the LLS table will be described later.
  • IP packets may be treated as link layer packets at the link layer.
  • the link layer may encapsulate data of various formats delivered from an upper layer into a link layer packet and then deliver the data to the physical layer. The link layer will be described later.
  • One or more service elements may be delivered via a broadband path.
  • the data transmitted through the broadband may include service components in a DASH format, service signaling information and / or NRT data thereof. This data can be processed via HTTP / TCP / IP, passed through the link layer for broadband transmission, and delivered to the physical layer for broadband transmission.
  • the physical layer may process data received from a delivery layer (upper layer and / or link layer) and transmit the data through a broadcast network or a broadband. Details of the physical layer will be described later.
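For orientation, the encapsulation order described above can be summarized as a small lookup. This is a purely illustrative sketch in Python; the layer names come from the text, while the path labels used as dictionary keys are made up for the example.

```python
# Illustrative only: the ordered layer lists mirror the delivery paths described above.
def protocol_stack(path: str):
    stacks = {
        # service data (ISO BMFF) packaged as MPUs and carried over MMTP
        "broadcast-mmtp": ["service data (ISO BMFF)", "MPU", "MMTP", "UDP/IP",
                           "link layer", "physical layer"],
        # service data (ISO BMFF) packaged as DASH segments and carried over ROUTE (LCT)
        "broadcast-route": ["service data (ISO BMFF)", "DASH segment", "ROUTE (LCT)",
                            "UDP/IP", "link layer", "physical layer"],
        # DASH segments, signaling and NRT data fetched over broadband
        "broadband": ["service data (ISO BMFF)", "DASH segment", "HTTP", "TCP/IP",
                      "link layer", "physical layer"],
    }
    return stacks[path]
```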
  • the service may be a collection of service components that are shown to the user as a whole; the components may be of different media types; the service may be continuous or intermittent; the service may be real time or non-real time; and a real-time service may consist of a sequence of TV programs.
  • the service may be a linear audio / video or audio only service that may have app-based enhancements.
  • the service may be an app-based service whose reproduction / configuration is controlled by the downloaded application.
  • the service may be an ESG service that provides an electronic service guide (ESG).
  • ESG electronic service guide
  • EA Emergency Alert
  • the service component may be delivered by (1) one or more ROUTE sessions or (2) one or more MMTP sessions.
  • When a linear service with app-based enhancement is delivered through a broadcast network, its service components may be delivered by (1) one or more ROUTE sessions and (2) zero or more MMTP sessions.
  • data used for app-based enhancement may be delivered through a ROUTE session in the form of NRT data or other files.
  • linear service components (streaming media components) of one service may not be allowed to be delivered using both protocols simultaneously.
  • the service component may be delivered by one or more ROUTE sessions.
  • the service data used for the app-based service may be delivered through a ROUTE session in the form of NRT data or other files.
  • some service components or some NRT data, files, etc. of these services may be delivered via broadband (hybrid service delivery).
  • the linear service components of one service may be delivered through the MMT protocol.
  • the linear service components of one service may be delivered via a ROUTE protocol.
  • the linear service component and NRT data (NRT service component) of one service may be delivered through the ROUTE protocol.
  • linear service components of one service may be delivered through the MMT protocol, and NRT data (NRT service components) may be delivered through the ROUTE protocol.
  • some service component or some NRT data of a service may be delivered over broadband.
  • the data related to the app-based service or the app-based enhancement may be transmitted through a broadcast network according to ROUTE or through broadband in the form of NRT data.
  • NRT data may also be referred to as locally cached data.
  • Each ROUTE session includes one or more LCT sessions that deliver, in whole or in part, the content components that make up the service.
  • an LCT session may deliver an individual component of a user service, such as an audio, video, or closed caption stream.
  • Streaming media is formatted into a DASH segment.
  • Each MMTP session includes one or more MMTP packet flows carrying an MMT signaling message or all or some content components.
  • the MMTP packet flow may carry a component formatted with an MMT signaling message or an MPU.
  • For delivery of an NRT user service or system metadata, an LCT session carries file-based content items.
  • These content files may consist of continuous (timed) or discrete (non-timed) media components of an NRT service, or metadata such as service signaling or ESG fragments.
  • Delivery of system metadata, such as service signaling or ESG fragments, can also be accomplished through the signaling message mode of the MMTP.
  • the tuner can scan frequencies and detect broadcast signals at specific frequencies.
  • the receiver can extract the SLT and send it to the module that processes it.
  • the SLT parser can parse the SLT, obtain data, and store it in the channel map.
  • the receiver may acquire bootstrap information of the SLT and deliver it to the ROUTE or MMT client. This allows the receiver to obtain and store the SLS. The USBD and the like can then be obtained and parsed by the signaling parser.
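The scan-and-bootstrap flow just described can be sketched roughly as follows. This is not code from the patent; the injected helpers (tune, extract_slt, fetch_sls) are hypothetical stand-ins for the tuner, SLT parser, and ROUTE/MMT client, and the dictionary keys are illustrative.

```python
# A minimal sketch of receiver-side service discovery: scan -> extract SLT -> store in the
# channel map -> follow bootstrap info to obtain the SLS (USBD and related fragments).
def discover_services(frequencies, tune, extract_slt, fetch_sls):
    channel_map = {}
    for freq in frequencies:
        broadcast_stream = tune(freq)            # detect a broadcast signal at this frequency
        slt = extract_slt(broadcast_stream)      # SLT arrives via LLS on a well-known IP/port
        if slt is None:
            continue
        for service in slt["services"]:
            # Bootstrap info tells the ROUTE or MMT client where the SLS is carried.
            sls = fetch_sls(broadcast_stream, service["bootstrap"])
            channel_map[service["serviceId"]] = {
                "slt_entry": service,
                "usbd": sls.get("USBD"),         # signaling hub referencing S-TSID / MPD or MP table
            }
    return channel_map
```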
  • FIG. 2 is a diagram illustrating a service discovery process according to an embodiment of the present invention.
  • the broadcast stream delivered by the broadcast signal frame of the physical layer may carry LLS (Low Level Signaling).
  • LLS data may be carried through the payload of an IP packet delivered to a well known IP address / port. This LLS may contain an SLT depending on its type.
  • LLS data may be formatted in the form of an LLS table. The first byte of every UDP / IP packet carrying LLS data may be the beginning of the LLS table. Unlike the illustrated embodiment, the IP stream carrying LLS data may be delivered to the same PLP along with other service data.
  • the SLT enables the receiver to generate a service list through a fast channel scan and provides access information for locating the SLS.
  • the SLT includes bootstrap information, which enables the receiver to obtain Service Layer Signaling (SLS) for each service.
  • SLS Service Layer Signaling
  • the bootstrap information may include destination IP address and destination port information of the ROUTE session including the LCT channel carrying the SLS and the LCT channel.
  • the bootstrap information may include a destination IP address and destination port information of the MMTP session carrying the SLS.
  • the SLS of service # 1 described by the SLT is delivered via ROUTE, and the SLT may include bootstrap information (sIP1, dIP1, dPort1) for the ROUTE session including the LCT channel through which the SLS is delivered.
  • SLS of service # 2 described by the SLT is delivered through MMT, and the SLT may include bootstrap information (sIP2, dIP2, and dPort2) for an MMTP session including an MMTP packet flow through which the SLS is delivered.
  • the SLS is signaling information describing characteristics of a corresponding service and may include information for acquiring the corresponding service and its service components, or may include receiver capability information for meaningfully reproducing the corresponding service. Having separate service signaling for each service allows the receiver to obtain the appropriate SLS for the desired service without having to parse the entire SLS delivered in the broadcast stream.
  • the SLS When the SLS is delivered through the ROUTE protocol, the SLS may be delivered through a dedicated LCT channel of a ROUTE session indicated by the SLT.
  • the SLS may include a user service bundle description (USBD / USD), a service-based transport session instance description (S-TSID), and / or a media presentation description (MPD).
  • USBD / USD user service bundle description
  • S-TSID service-based transport session instance description
  • MPD media presentation description
  • the USBD / USD is one of the SLS fragments and may serve as a signaling hub for describing specific technical information of a service.
  • the USBD may include service identification information, device capability information, and the like.
  • the USBD may include reference information (URI reference) to other SLS fragments (S-TSID, MPD, etc.). That is, USBD / USD can refer to S-TSID and MPD respectively.
  • the USBD may further include metadata information that enables the receiver to determine the transmission mode (broadcast network / broadband). Details of the USBD / USD will be described later.
  • the S-TSID is one of the SLS fragments, and may provide overall session description information for a transport session carrying a service component of a corresponding service.
  • the S-TSID may provide transport session description information for the ROUTE session to which the service component of the corresponding service is delivered and / or the LCT channel of the ROUTE sessions.
  • the S-TSID may provide component acquisition information of service components related to one service.
  • the S-TSID may provide a mapping between the DASH Representation of the MPD and the tsi of the corresponding service component.
  • the component acquisition information of the S-TSID may be provided in the form of tsi, an identifier of an associated DASH representation, and may or may not include a PLP ID according to an embodiment.
  • the component acquisition information enables the receiver to collect the audio / video components of a service and to perform buffering, decoding, and the like on DASH media segments.
  • the S-TSID may be referenced by the USBD as described above. Details of the S-TSID will be described later.
  • the MPD is one of the SLS fragments and may provide a description of the DASH media presentation of the service.
  • the MPD may provide a resource identifier for the media segments and may provide contextual information within the media presentation for the identified resources.
  • the MPD may describe the DASH representation (service component) delivered through the broadcast network, and may also describe additional DASH representations delivered through the broadband (hybrid delivery).
  • the MPD may be referenced by the USBD as described above.
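Assuming the Representation-to-tsi mapping provided by the S-TSID has already been parsed into a plain dictionary, a receiver-side lookup from a DASH Representation to its LCT channel could look like the following sketch. The dictionary layout is an assumption made for illustration, not a format defined by the patent.

```python
# Illustrative lookup: which LCT channel (tsi) carries the segments of a given Representation.
def lct_channel_for_representation(repr_id: str, s_tsid_mapping: dict):
    # s_tsid_mapping example (assumed): {"video-rep-1": {"tsi": 101, "plp_id": 0}, ...}
    entry = s_tsid_mapping.get(repr_id)
    if entry is None:
        return None                              # not carried over broadcast ROUTE
    return entry["tsi"], entry.get("plp_id")     # the PLP ID may or may not be included
```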
  • the SLS When the SLS is delivered through the MMT protocol, the SLS may be delivered through a dedicated MMTP packet flow of an MMTP session indicated by the SLT.
  • packet_id of MMTP packets carrying SLS may have a value of 00.
  • the SLS may include a USBD / USD and / or MMT Package (MP) table.
  • USBD is one of the SLS fragments, and may describe specific technical information of a service like that in ROUTE.
  • the USBD here may also include reference information (URI reference) to other SLS fragments.
  • the USBD of the MMT may refer to the MP table of the MMT signaling.
  • the USBD of the MMT may also include reference information on the S-TSID and / or the MPD.
  • the S-TSID may be for NRT data transmitted through the ROUTE protocol. This is because NRT data can be delivered through the ROUTE protocol even when the linear service component is delivered through the MMT protocol.
  • MPD may be for a service component delivered over broadband in hybrid service delivery. Details of the USBD of the MMT will be described later.
  • the MP table is a signaling message of the MMT for MPU components and may provide overall session description information for an MMTP session carrying a service component of a corresponding service.
  • the MP table may also contain descriptions for assets delivered via this MMTP session.
  • the MP table is streaming signaling information for MPU components, and may provide a list of assets corresponding to one service and location information (component acquisition information) of these components. Specific contents of the MP table may be in a form defined in MMT or a form in which modifications are made.
  • Asset is a multimedia data entity, which may mean a data entity associated with one unique ID and used to generate one multimedia presentation. Asset may correspond to a service component constituting a service.
  • the MP table may be used to access a streaming service component (MPU) corresponding to a desired service.
  • the MP table may be referenced by the USBD as described above.
  • MMT signaling messages may be defined. Such MMT signaling messages may describe additional information related to the MMTP session or service.
  • ROUTE sessions are identified by source IP address, destination IP address, and destination port number.
  • the LCT session is identified by a transport session identifier (TSI) that is unique within the scope of the parent ROUTE session.
  • MMTP sessions are identified by destination IP address and destination port number.
  • the MMTP packet flow is identified by a unique packet_id within the scope of the parent MMTP session.
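The four identifier scopes listed above can be captured as simple value types, as in the sketch below. The field names are illustrative and not taken from any standard schema.

```python
# Session identifiers as described above, expressed as immutable dataclasses.
from dataclasses import dataclass

@dataclass(frozen=True)
class RouteSessionId:
    source_ip: str          # ROUTE sessions are identified by source IP address,
    destination_ip: str     # destination IP address, and destination port number
    destination_port: int

@dataclass(frozen=True)
class LctSessionId:
    parent: RouteSessionId
    tsi: int                # TSI is unique within the scope of the parent ROUTE session

@dataclass(frozen=True)
class MmtpSessionId:
    destination_ip: str     # MMTP sessions are identified by destination IP address and port
    destination_port: int

@dataclass(frozen=True)
class MmtpPacketFlowId:
    parent: MmtpSessionId
    packet_id: int          # packet_id is unique within the scope of the parent MMTP session
```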
  • the S-TSID, the USBD / USD, the MPD, or the LCT session carrying them may be called a service signaling channel.
  • the MMT signaling messages or packet flow carrying them may be called a service signaling channel.
  • one ROUTE or MMTP session may be delivered through a plurality of PLPs. That is, one service may be delivered through one or more PLPs. Unlike shown, components constituting one service may be delivered through different ROUTE sessions. In addition, according to an embodiment, components constituting one service may be delivered through different MMTP sessions. According to an embodiment, components constituting one service may be delivered divided into a ROUTE session and an MMTP session. Although not shown, a component constituting one service may be delivered through a broadband (hybrid delivery).
  • LLS low level signaling
  • SLT service list table
  • An embodiment t3010 of the illustrated LLS table may include an LLS_table_id field, a provider_id field, an LLS_table_version field, and / or information according to the LLS_table_id field.
  • the LLS_table_id field may identify a type of the corresponding LLS table, and the provider_id field may identify service providers related to services signaled by the corresponding LLS table.
  • the service provider is a broadcaster using all or part of the broadcast stream, and the provider_id field may identify one of a plurality of broadcasters using the broadcast stream.
  • the LLS_table_version field may provide version information of a corresponding LLS table.
  • the corresponding LLS table may include one of the above-described SLT, a rating region table (RRT) including information related to content advisory ratings, SystemTime information providing information related to the system time, and a CAP (Common Alert Protocol) message providing information related to emergency alerts. According to an embodiment, other information may be included in the LLS table.
  • RRT rating region table
  • CAP Common Alert Protocol
  • One embodiment t3020 of the illustrated SLT may include an @bsid attribute, an @sltCapabilities attribute, a sltInetUrl element, and / or a Service element.
  • Each field may be omitted or may exist in plurality, depending on the value of the illustrated Use column.
  • the @bsid attribute may be an identifier of a broadcast stream.
  • the @sltCapabilities attribute can provide the capability information required to decode and meaningfully reproduce all services described by the SLT.
  • the sltInetUrl element may provide base URL information used to obtain ESG or service signaling information for services of the corresponding SLT through broadband.
  • the sltInetUrl element may further include an @urlType attribute, which may indicate the type of data that can be obtained through the URL.
  • the service element may be an element including information on services described by the corresponding SLT, and a service element may exist for each service.
  • the Service element may include an @serviceId attribute, an @sltSvcSeqNum attribute, an @protected attribute, an @majorChannelNo attribute, an @minorChannelNo attribute, an @serviceCategory attribute, an @shortServiceName attribute, an @hidden attribute, an @broadbandAccessRequired attribute, an @svcCapabilities attribute, a BroadcastSvcSignaling element, and / or an svcInetUrl element.
  • the @serviceId attribute may be an identifier of a corresponding service, and the @sltSvcSeqNum attribute may indicate a sequence number of SLT information for the corresponding service.
  • the @protected attribute may indicate whether at least one service component necessary for meaningful playback of the corresponding service is protected.
  • the @majorChannelNo and @minorChannelNo attributes may indicate the major channel number and the minor channel number of the corresponding service, respectively.
  • the @serviceCategory attribute can indicate the category of the corresponding service.
  • the service category may include a linear A / V service, a linear audio service, an app-based service, an ESG service, and an EAS service.
  • the @shortServiceName attribute may provide a short name of the corresponding service.
  • the @hidden attribute can indicate whether the service is for testing or proprietary use.
  • the @broadbandAccessRequired attribute may indicate whether broadband access is required for meaningful playback of the corresponding service.
  • the @svcCapabilities attribute can provide the capability information necessary for decoding and meaningful reproduction of the corresponding service.
  • the BroadcastSvcSignaling element may provide information related to broadcast signaling of a corresponding service. This element may provide information such as a location, a protocol, and an address with respect to signaling through a broadcasting network of a corresponding service. Details will be described later.
  • the svcInetUrl element may provide URL information for accessing signaling information for a corresponding service through broadband.
  • the svcInetUrl element may further include an @urlType attribute, which may indicate the type of data that can be obtained through the URL.
  • the aforementioned BroadcastSvcSignaling element may include an @slsProtocol attribute, an @slsMajorProtocolVersion attribute, an @slsMinorProtocolVersion attribute, an @slsPlpId attribute, an @slsDestinationIpAddress attribute, an @slsDestinationUdpPort attribute, and / or an @slsSourceIpAddress attribute.
  • the @slsProtocol attribute can indicate the protocol used to deliver the SLS of the service (ROUTE, MMT, etc.).
  • the @slsMajorProtocolVersion attribute and @slsMinorProtocolVersion attribute may indicate the major version number and the minor version number of the protocol used to deliver the SLS of the corresponding service, respectively.
  • the @slsPlpId attribute may provide a PLP identifier for identifying a PLP that delivers the SLS of the corresponding service.
  • this field may be omitted, and the PLP information to which the SLS is delivered may be identified by combining information in the LMT to be described later and bootstrap information of the SLT.
  • the @slsDestinationIpAddress attribute, @slsDestinationUdpPort attribute, and @slsSourceIpAddress attribute may indicate the destination IP address, the destination UDP port, and the source IP address of the transport packet carrying the SLS of the corresponding service, respectively. They can identify the transport session (ROUTE session or MMTP session) to which the SLS is delivered. These may be included in the bootstrap information.
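As a rough illustration of how a receiver might read the SLT fields described above, the following sketch walks an SLT XML instance with ElementTree and collects the per-service bootstrap information. XML namespaces and the exact serialization are assumptions; only the element and attribute names come from the description above.

```python
# A minimal sketch (namespaces omitted) of extracting SLS bootstrap information from an SLT.
import xml.etree.ElementTree as ET

def parse_slt_bootstrap(slt_xml: str):
    root = ET.fromstring(slt_xml)                # assumed root element: <SLT bsid="...">
    services = []
    for svc in root.findall("Service"):
        sig = svc.find("BroadcastSvcSignaling")
        get = (lambda name: None) if sig is None else sig.get
        services.append({
            "serviceId": svc.get("serviceId"),
            "majorChannelNo": svc.get("majorChannelNo"),
            "minorChannelNo": svc.get("minorChannelNo"),
            "slsProtocol": get("slsProtocol"),                     # e.g. ROUTE or MMT
            "slsDestinationIpAddress": get("slsDestinationIpAddress"),
            "slsDestinationUdpPort": get("slsDestinationUdpPort"),
            "slsSourceIpAddress": get("slsSourceIpAddress"),
            "slsPlpId": get("slsPlpId"),                           # may be omitted (see LMT)
        })
    return {"bsid": root.get("bsid"), "services": services}
```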
  • FIG. 4 illustrates a USBD and an S-TSID delivered to ROUTE according to an embodiment of the present invention.
  • One embodiment t4010 of the illustrated USBD may have a bundleDescription root element.
  • the bundleDescription root element may have a userServiceDescription element.
  • the userServiceDescription element may be an instance of one service.
  • the userServiceDescription element may include an @globalServiceID attribute, an @serviceId attribute, an @serviceStatus attribute, an @fullMPDUri attribute, an @sTSIDUri attribute, a name element, a serviceLanguage element, a capabilityCode element, and / or a deliveryMethod element.
  • Each field may be omitted or may exist in plurality, depending on the value of the illustrated Use column.
  • the @globalServiceID attribute is a globally unique identifier of the service and can be used to link with ESG data (Service@globalServiceID).
  • the @serviceId attribute is a reference corresponding to the corresponding service entry of the SLT and may be the same as service ID information of the SLT.
  • the @serviceStatus attribute may indicate the status of the corresponding service. This field may indicate whether the corresponding service is active or inactive.
  • the @fullMPDUri attribute can refer to the MPD fragment of the service. As described above, the MPD may provide a reproduction description for a service component delivered through a broadcast network or a broadband.
  • the @sTSIDUri attribute may refer to the S-TSID fragment of the service.
  • the S-TSID may provide parameters related to access to the transport session carrying the service as described above.
  • the name element may provide the name of the service.
  • This element may further include an @lang attribute, which may indicate the language of the name provided by the name element.
  • the serviceLanguage element may indicate the available languages of the service. That is, this element may list the languages in which the service can be provided.
  • the capabilityCode element may indicate capability or capability group information of the receiver side necessary for meaningfully playing the corresponding service. This information may be compatible with the capability information format provided by the service announcement.
  • the deliveryMethod element may provide delivery related information with respect to contents accessed through a broadcasting network or a broadband of a corresponding service.
  • the deliveryMethod element may include a broadcastAppService element and / or a unicastAppService element. Each of these elements may have a basePattern element as its child element.
  • the broadcastAppService element may include transmission related information on the DASH representations delivered through the broadcast network.
  • These DASH representations may include media components across all periods of the service media presentation.
  • the basePattern element of this element may represent a character pattern used by the receiver to match the segment URL. This can be used by the DASH client to request segments of the representation. Matching may imply that the media segment is delivered over the broadcast network.
  • the unicastAppService element may include transmission related information on the DASH representation delivered through broadband. These DASH representations may include media components across all periods of the service media presentation.
  • the basePattern element of this element may represent a character pattern used by the receiver to match the segment URL. This can be used by the DASH client to request segments of the representation. Matching may imply that the media segment is delivered over broadband.
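A minimal sketch of the basePattern matching just described is shown below, assuming the match is a simple prefix test on the segment URL; the text only says a character pattern is matched against the segment URL, so the exact matching rule is an assumption.

```python
# Decide whether a DASH client should request a segment over broadcast or broadband,
# by matching the segment URL against broadcastAppService / unicastAppService basePatterns.
def resolve_delivery_path(segment_url: str, broadcast_patterns, unicast_patterns) -> str:
    if any(segment_url.startswith(p) for p in broadcast_patterns):
        return "broadcast"   # segment is delivered over the broadcast network
    if any(segment_url.startswith(p) for p in unicast_patterns):
        return "broadband"   # segment is requested over broadband
    return "unknown"
```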
  • An embodiment t4020 of the illustrated S-TSID may have an S-TSID root element.
  • the S-TSID root element may include an @serviceId attribute and / or an RS element.
  • Each field may be omitted or may exist in plurality, depending on the value of the illustrated Use column.
  • the @serviceId attribute is an identifier of a corresponding service and may refer to a corresponding service of USBD / USD.
  • the RS element may describe information on ROUTE sessions through which service components of a corresponding service are delivered. Depending on the number of such ROUTE sessions, there may be a plurality of these elements.
  • the RS element may further include an @bsid attribute, an @sIpAddr attribute, an @dIpAddr attribute, an @dport attribute, an @PLPID attribute, and / or an LS element.
  • the @bsid attribute may be an identifier of a broadcast stream through which service components of a corresponding service are delivered. If this field is omitted, the default broadcast stream may be a broadcast stream that includes a PLP that carries the SLS of the service. The value of this field may be the same value as the @bsid attribute of SLT.
  • the @sIpAddr attribute, the @dIpAddr attribute, and the @dport attribute may indicate a source IP address, a destination IP address, and a destination UDP port of the corresponding ROUTE session, respectively. If these fields are omitted, the default values may be the source IP address, destination IP address, and destination UDP port values of the current ROUTE session, that is, the ROUTE session carrying this S-TSID (and thus the SLS). For other ROUTE sessions that carry service components of the service but are not the current ROUTE session, these fields may not be omitted.
  • the @PLPID attribute may indicate PLP ID information of a corresponding ROUTE session. If this field is omitted, the default value may be the PLP ID value of the current PLP to which the corresponding S-TSID is being delivered. According to an embodiment, this field is omitted, and the PLP ID information of the corresponding ROUTE session may be confirmed by combining information in the LMT to be described later and IP address / UDP port information of the RS element.
  • the LS element may describe information on LCT channels through which service components of a corresponding service are delivered. Depending on the number of such LCT channels, there may be a plurality of these elements.
  • the LS element may include an @tsi attribute, an @PLPID attribute, an @bw attribute, an @startTime attribute, an @endTime attribute, an SrcFlow element, and / or a RepairFlow element.
  • the @tsi attribute may represent tsi information of a corresponding LCT channel. Through this, LCT channels through which a service component of a corresponding service is delivered may be identified.
  • the @PLPID attribute may represent PLP ID information of a corresponding LCT channel. In some embodiments, this field may be omitted.
  • the @bw attribute may indicate the maximum bandwidth of the corresponding LCT channel.
  • the @startTime attribute may indicate the start time of the LCT session, and the @endTime attribute may indicate the end time of the LCT channel.
  • the SrcFlow element may describe the source flow of ROUTE.
  • the source protocol of ROUTE is used to transmit delivery objects, and at least one source flow may be established within one ROUTE session. These source flows can deliver related objects as an object flow.
  • the RepairFlow element may describe the repair flow of ROUTE. Delivery objects delivered according to the source protocol may be protected according to Forward Error Correction (FEC).
  • FEC Forward Error Correction
  • the repair protocol may define a FEC framework that enables such FEC protection.
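The RS/LS structure described above can be walked in a few lines. The sketch below lists the LCT channels (tsi values) of each ROUTE session; namespaces are omitted and the serialization is assumed, with only the element and attribute names taken from the description.

```python
# A minimal sketch of enumerating the LCT channels described by an S-TSID fragment.
import xml.etree.ElementTree as ET

def list_lct_channels(s_tsid_xml: str):
    root = ET.fromstring(s_tsid_xml)             # assumed root element: <S-TSID serviceId="...">
    channels = []
    for rs in root.findall("RS"):
        session = {
            "sIpAddr": rs.get("sIpAddr"),        # may default to the current ROUTE session if omitted
            "dIpAddr": rs.get("dIpAddr"),
            "dport": rs.get("dport"),
            "PLPID": rs.get("PLPID"),
        }
        for ls in rs.findall("LS"):
            channels.append({**session,
                             "tsi": ls.get("tsi"),   # identifies the LCT channel of a component
                             "bw": ls.get("bw")})
    return channels
```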
  • FIG. 5 is a diagram illustrating a USBD delivered to MMT according to an embodiment of the present invention.
  • One embodiment of the illustrated USBD may have a bundleDescription root element.
  • the bundleDescription root element may have a userServiceDescription element.
  • the userServiceDescription element may be an instance of one service.
  • the userServiceDescription element may include an @globalServiceID attribute, an @serviceId attribute, a Name element, a serviceLanguage element, a contentAdvisoryRating element, a Channel element, an mpuComponent element, a routeComponent element, a broadbandComponent element, and / or a ComponentInfo element.
  • Each field may be omitted or may exist in plurality, depending on the value of the illustrated Use column.
  • the @globalServiceID attribute, the @serviceId attribute, the Name element and / or the serviceLanguage element may be the same as the corresponding fields of the USBD delivered to the above-described ROUTE.
  • the contentAdvisoryRating element may indicate the content advisory rating of the corresponding service. This information may be compatible with the content advisory rating information format provided by the service announcement.
  • the channel element may include information related to the corresponding service. The detail of this element is mentioned later.
  • the mpuComponent element may provide a description for service components delivered as an MPU of a corresponding service.
  • This element may further include an @mmtPackageId attribute and / or an @nextMmtPackageId attribute.
  • the @mmtPackageId attribute may refer to an MMT package of service components delivered as an MPU of a corresponding service.
  • the @nextMmtPackageId attribute may refer to an MMT package to be used next to the MMT package referenced by the @mmtPackageId attribute in time.
  • the MP table can be referenced through the information of this element.
  • the routeComponent element may include a description of service components of the corresponding service delivered to ROUTE. Even if the linear service components are delivered in the MMT protocol, the NRT data may be delivered according to the ROUTE protocol as described above. This element may describe information about such NRT data. The detail of this element is mentioned later.
  • the broadbandComponent element may include a description of service components of the corresponding service delivered over broadband.
  • some service components or other files of a service may be delivered over broadband. This element may describe information about these data.
  • This element may further include the @fullMPDUri attribute. This attribute may refer to an MPD that describes service components delivered over broadband.
  • when the broadcast signal is weakened, for example while driving through a tunnel, this element may be needed to support handoff between the broadcast network and broadband. Service continuity may be guaranteed by acquiring the service components through broadband while the broadcast signal is weak and acquiring them through the broadcast network again when the broadcast signal becomes strong.
  • the ComponentInfo element may include information on service components of a corresponding service. Depending on the number of service components of the service, there may be a plurality of these elements. This element may describe information such as the type, role, name, identifier, and protection of each service component. Detailed information on this element will be described later.
  • the aforementioned channel element may further include an @serviceGenre attribute, an @serviceIcon attribute, and / or a ServiceDescription element.
  • the @serviceGenre attribute may indicate the genre of the corresponding service.
  • the @serviceIcon attribute may include URL information of an icon representing the corresponding service.
  • the ServiceDescription element provides a service description of the service, which may further include an @serviceDescrText attribute and / or an @serviceDescrLang attribute. Each of these attributes may indicate the text of the service description and the language used for that text.
  • the aforementioned routeComponent element may further include an @sTSIDUri attribute, an @sTSIDDestinationIpAddress attribute, an @sTSIDDestinationUdpPort attribute, an @sTSIDSourceIpAddress attribute, an @sTSIDMajorProtocolVersion attribute, and / or an @sTSIDMinorProtocolVersion attribute.
  • the @sTSIDUri attribute may refer to an S-TSID fragment. This field may be the same as the corresponding field of USBD delivered to ROUTE described above. This S-TSID may provide access related information for service components delivered in ROUTE. This S-TSID may exist for NRT data delivered according to the ROUTE protocol in the situation where linear service components are delivered according to the MMT protocol.
  • the @sTSIDDestinationIpAddress attribute, the @sTSIDDestinationUdpPort attribute, and the @sTSIDSourceIpAddress attribute may indicate a destination IP address, a destination UDP port, and a source IP address of a transport packet carrying the aforementioned S-TSID, respectively. That is, these fields may identify a transport session (MMTP session or ROUTE session) carrying the aforementioned S-TSID.
  • the @sTSIDMajorProtocolVersion attribute and the @sTSIDMinorProtocolVersion attribute may indicate a major version number and a minor version number of the transport protocol used to deliver the aforementioned S-TSID.
  • the aforementioned ComponentInfo element may further include an @componentType attribute, an @componentRole attribute, an @componentProtectedFlag attribute, an @componentId attribute, and / or an @componentName attribute.
  • the @componentType attribute may indicate the type of the corresponding component. For example, this property may indicate whether the corresponding component is an audio, video, or closed caption component.
  • the @componentRole attribute can indicate the role of the corresponding component. For example, if the corresponding component is an audio component, this attribute can indicate whether it is the main audio, music, commentary, and so on. If the corresponding component is a video component, it may indicate whether it is the primary video. If the corresponding component is a closed caption component, it may indicate whether it is a normal caption or an easy-reader type.
  • the @componentProtectedFlag attribute may indicate whether a corresponding service component is protected, for example, encrypted.
  • the @componentId attribute may represent an identifier of a corresponding service component.
  • the value of this attribute may be a value such as asset_id (asset ID) of the MP table corresponding to this service component.
  • the @componentName attribute may represent the name of the corresponding service component.
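The ComponentInfo attributes listed above map naturally onto a small record type, as in the sketch below. The exact string encodings of the type and role values are assumptions; only the roles named in the text (main audio, primary video, and so on) are used as examples.

```python
# An illustrative model of ComponentInfo plus a helper picking default-presentable components.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ComponentInfo:
    component_type: str                       # e.g. "audio", "video", "closed caption"
    component_role: str                       # e.g. "main audio", "primary video", "normal caption"
    component_id: str                         # may match the asset_id of the MP table
    component_name: Optional[str] = None
    component_protected_flag: bool = False    # whether the component is protected, e.g. encrypted

def primary_av_components(components: List[ComponentInfo]) -> List[ComponentInfo]:
    # Pick the components a receiver would present by default.
    return [c for c in components
            if (c.component_type == "video" and c.component_role == "primary video")
            or (c.component_type == "audio" and c.component_role == "main audio")]
```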
  • FIG. 6 illustrates a link layer operation according to an embodiment of the present invention.
  • the link layer may be a layer between the physical layer and the network layer.
  • the transmitter may transmit data from the network layer to the physical layer
  • the receiver may transmit data from the physical layer to the network layer (t6010).
  • the purpose of the link layer may be to encapsulate all input packet types into one format for processing by the physical layer and to ensure flexibility and future scalability for input packet types not yet defined.
  • the link layer may provide an option of compressing unnecessary information in the header of the input packet, so that the input data may be efficiently transmitted. Operations such as overhead reduction and encapsulation of the link layer may be referred to as a link layer protocol, and a packet generated using the corresponding protocol may be referred to as a link layer packet.
  • the link layer may perform functions such as packet encapsulation, overhead reduction, and / or signaling transmission.
  • the link layer (ALP) may perform an overhead reduction process on input packets and then encapsulate them into link layer packets.
  • the link layer may encapsulate the link layer packet without performing an overhead reduction process.
  • the use of the link layer protocol can greatly reduce the overhead for data transmission on the physical layer, and the link layer protocol according to the present invention can provide IP overhead reduction and / or MPEG-2 TS overhead reduction.
  • the link layer may sequentially perform IP header compression, adaptation, and / or encapsulation. In some embodiments, some processes may be omitted.
  • the RoHC module performs IP packet header compression to reduce unnecessary overhead, and context information may be extracted and transmitted out of band through an adaptation process.
  • the IP header compression and adaptation process may be collectively called IP header compression.
  • IP packets may be encapsulated into link layer packets through an encapsulation process.
  • the link layer may sequentially perform an overhead reduction and / or encapsulation process for the TS packet. In some embodiments, some processes may be omitted.
  • the link layer may provide sync byte removal, null packet deletion and / or common header removal (compression).
  • Sync byte elimination can provide overhead reduction of 1 byte per TS packet. Null packet deletion can be performed in a manner that can be reinserted at the receiving end. In addition, common information between successive headers can be deleted (compressed) in a manner that can be recovered at the receiving side. Some of each overhead reduction process may be omitted. Thereafter, TS packets may be encapsulated into link layer packets through an encapsulation process.
  • the link layer packet structure for encapsulation of TS packets may be different from other types of packets.
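The TS overhead reduction steps named above (sync byte removal and null packet deletion performed so that the receiver can reinsert what was removed) can be illustrated as follows. The bookkeeping of deleted null packet counts is an assumption made for the sketch; the actual link layer packet syntax is not reproduced here.

```python
# Illustrative TS overhead reduction: strip the 0x47 sync byte and drop null packets
# (PID 0x1FFF), remembering how many nulls to reinsert before each kept packet.
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47
NULL_PID = 0x1FFF

def reduce_ts_overhead(ts_stream: bytes):
    reduced, deleted_nulls = [], []
    null_run = 0
    for i in range(0, len(ts_stream), TS_PACKET_SIZE):
        pkt = ts_stream[i:i + TS_PACKET_SIZE]
        if len(pkt) < TS_PACKET_SIZE:
            break                               # ignore a truncated trailing packet
        assert pkt[0] == SYNC_BYTE
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]   # 13-bit PID from header bytes 1-2
        if pid == NULL_PID:
            null_run += 1                       # null packet deletion
            continue
        deleted_nulls.append(null_run)          # nulls deleted just before this packet
        null_run = 0
        reduced.append(pkt[1:])                 # sync byte removal: 1 byte saved per packet
    return reduced, deleted_nulls
```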
  • IP header compression will be described.
  • the IP packet has a fixed header format, but some information required in a communication environment may be unnecessary in a broadcast environment.
  • the link layer protocol may provide a mechanism to reduce broadcast overhead by compressing the header of the IP packet.
  • IP header compression may include a header compressor / decompressor and / or adaptation module.
  • the IP header compressor (RoHC compressor) may reduce the size of each IP packet header based on the RoHC scheme.
  • the adaptation module may then extract the context information and generate signaling information from each packet stream.
  • the receiver may parse signaling information related to the packet stream and attach context information to the packet stream.
  • the RoHC decompressor can reconstruct the original IP packet by recovering the packet header.
  • IP header compression may mean only the IP header compression performed by the header compressor, or may mean a concept in which the IP header compression and the adaptation process by the adaptation module are combined. The same applies to decompression.
  • the adaptation function may generate link layer signaling using context information and / or configuration parameters.
  • the adaptation function may periodically send link layer signaling over each physical frame using previous configuration parameters and / or context information.
  • the context information is extracted from the compressed IP packets, and various methods may be used according to the adaptation mode.
  • Mode # 1 is a mode in which no operation is performed on the compressed packet stream, and may be a mode in which the adaptation module operates as a buffer.
  • Mode # 2 may be a mode for extracting context information (static chain) by detecting IR packets in the compressed packet stream. After extraction, the IR packet is converted into an IR-DYN packet, and the IR-DYN packet can be transmitted in the same order in the packet stream by replacing the original IR packet.
  • context information static chain
  • Mode # 3 t6020 may be a mode for detecting IR and IR-DYN packets and extracting context information from the compressed packet stream.
  • Static chains and dynamic chains can be extracted from IR packets and dynamic chains can be extracted from IR-DYN packets.
  • the IR and IR-DYN packets can be converted into regular compressed packets.
  • the converted packets can be sent in the same order within the packet stream, replacing the original IR and IR-DYN packets.
  • the remaining packets after the context information is extracted may be encapsulated and transmitted according to the link layer packet structure for the compressed IP packet.
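A rough sender-side sketch of the three adaptation modes is given below. Packets are modeled abstractly as (kind, payload) tuples; the actual RoHC bit-level conversion of IR and IR-DYN packets is only indicated by relabeling, not implemented.

```python
# Illustrative adaptation processing per the modes described above.
def adapt(packet_stream, mode):
    context, out = [], []
    for kind, payload in packet_stream:
        if mode == 1:                                   # mode 1: no operation, acts as a buffer
            out.append((kind, payload))
        elif mode == 2 and kind == "IR":                # mode 2: extract static chain from IR packets
            context.append(("static", payload))
            out.append(("IR-DYN", payload))             # IR packet converted into an IR-DYN packet
        elif mode == 3 and kind in ("IR", "IR-DYN"):    # mode 3: extract static+dynamic / dynamic chains
            context.append(("static+dynamic" if kind == "IR" else "dynamic", payload))
            out.append(("COMP", payload))               # converted into a regular compressed packet
        else:
            out.append((kind, payload))                 # remaining compressed packets pass through
    return context, out   # context is carried in the RDT; out is encapsulated into link layer packets
```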
  • the context information may be transmitted by being encapsulated according to a link layer packet structure for signaling information as link layer signaling.
  • the extracted context information may be included in the RoHC-U Description Table (RDT) and transmitted separately from the RoHC packet flow.
  • the context information may be transmitted through a specific physical data path along with other signaling information.
  • a specific physical data path may mean one of the general PLPs, the PLP through which LLS (Low Level Signaling) is delivered, a dedicated PLP, or the L1 signaling path.
  • the RDT may be signaling information including context information (static chain and / or dynamic chain) and / or information related to header compression.
  • the RDT may be transmitted whenever the context information changes.
  • the RDT may be transmitted in every physical frame. In order to transmit the RDT in every physical frame, a previous RDT may be reused.
  • the receiver may first select PLP to acquire signaling information such as SLT, RDT, LMT, and the like. When the signaling information is obtained, the receiver may combine these to obtain a mapping between the service-IP information-context information-PLP. That is, the receiver can know which service is transmitted to which IP streams, which IP streams are delivered to which PLP, and can also obtain corresponding context information of the PLPs. The receiver can select and decode a PLP carrying a particular packet stream. The adaptation module can parse the context information and merge it with the compressed packets. This allows the packet stream to be recovered, which can be delivered to the RoHC decompressor. Decompression can then begin.
  • the receiver detects the IR packet and starts decompression from the first received IR packet according to the adaptation mode (mode 1), or detects the IR-DYN packet to perform decompression from the first received IR-DYN packet.
  • the link layer protocol may encapsulate all types of input packets, such as IP packets and TS packets, into link layer packets. This allows the physical layer to process only one packet format independently of the protocol type of the network layer (here, consider MPEG-2 TS packet as a kind of network layer packet). Each network layer packet or input packet is transformed into a payload of a generic link layer packet.
  • Segmentation may be utilized in the packet encapsulation process. If the network layer packet is too large to be processed by the physical layer, the network layer packet may be divided into two or more segments.
  • the link layer packet header may include fields for performing division at the transmitting side and recombination at the receiving side. Each segment may be encapsulated into a link layer packet in the same order as the original position.
  • Concatenation may also be utilized in the packet encapsulation process. If the network layer packet is small enough that the payload of the link layer packet includes several network layer packets, concatenation may be performed.
  • the link layer packet header may include fields for executing concatenation. In the case of concatenation, each input packet may be encapsulated into the payload of the link layer packet in the same order as the original input order.
  • the link layer packet may include a header and a payload, and the header may include a base header, an additional header, and / or an optional header.
  • the additional header may be added depending on concatenation or segmentation, and the additional header may include necessary fields according to the situation.
  • an optional header may be further added to transmit additional information.
  • Each header structure may be predefined. As described above, when the input packet is a TS packet, a link layer header structure different from other packets may be used.
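A simplified sketch of concatenation and segmentation during encapsulation is shown below. The link layer packets are represented as dictionaries rather than with the real base/additional/optional header layout, and the maximum payload size is an arbitrary assumption.

```python
# Illustrative encapsulation: large input packets are segmented, small ones are concatenated,
# and everything keeps its original order so the receiver can recombine or split payloads.
def encapsulate(network_packets, max_payload=4096):
    link_layer_packets, pending = [], []

    def flush():
        if pending:
            # concatenation: one link layer packet carrying several whole input packets;
            # per-packet lengths let the receiver split the payload again
            link_layer_packets.append({"type": "concat",
                                       "lengths": [len(p) for p in pending],
                                       "payload": b"".join(pending)})
            pending.clear()

    for pkt in network_packets:
        if len(pkt) > max_payload:
            flush()
            segments = [pkt[i:i + max_payload] for i in range(0, len(pkt), max_payload)]
            for idx, seg in enumerate(segments):
                # segmentation: each segment becomes its own link layer packet, in order
                link_layer_packets.append({"type": "segment", "index": idx,
                                           "last": idx == len(segments) - 1,
                                           "payload": seg})
        else:
            if sum(map(len, pending)) + len(pkt) > max_payload:
                flush()
            pending.append(pkt)
    flush()
    return link_layer_packets
```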
  • Link layer signaling may operate at a lower level than the IP layer.
  • the receiving side can acquire the link layer signaling faster than the IP level signaling such as LLS, SLT, SLS, and the like. Therefore, link layer signaling may be obtained before session establishment.
  • Link layer signaling may include internal link layer signaling and external link layer signaling.
  • Internal link layer signaling may be signaling information generated in the link layer.
  • the above-described RDT or LMT to be described later may correspond to this.
  • the external link layer signaling may be signaling information received from an external module, an external protocol, or an upper layer.
  • the link layer may encapsulate link layer signaling into a link layer packet and deliver it.
  • a link layer packet structure (header structure) for link layer signaling may be defined, and link layer signaling information may be encapsulated according to this structure.
  • FIG. 7 illustrates a link mapping table (LMT) according to an embodiment of the present invention.
  • the LMT may provide a list of higher layer sessions carried by the PLP.
  • the LMT may also provide additional information for processing link layer packets carrying higher layer sessions.
  • the higher layer session may be called multicast.
  • Information on which IP streams and which transport sessions are being transmitted through a specific PLP may be obtained through the LMT. Conversely, information on which PLP a specific transport session is delivered to may be obtained.
  • the LMT may be delivered to any PLP identified as carrying an LLS.
  • the PLP through which the LLS is delivered may be identified by the LLS flag of the L1 detail signaling information of the physical layer.
  • the LLS flag may be a flag field indicating whether LLS is delivered to the corresponding PLP for each PLP.
  • the L1 detail signaling information may correspond to PLS2 data to be described later.
  • the LMT may be delivered to the same PLP together with the LLS.
  • Each LMT may describe the mapping between PLPs and IP address / port as described above.
  • the LLS may include an SLT; in this case, the IP addresses / ports described by the LMT may be all of the IP addresses / ports associated with any service described by the SLT that is delivered to the same PLP as that LMT.
  • the PLP identifier information in the above-described SLT, SLS, etc. may be utilized, so that information on which PLP the specific transmission session indicated by the SLT, SLS is transmitted may be confirmed.
  • the PLP identifier information in the above-described SLT, SLS, etc. may be omitted, and the PLP information for the specific transport session indicated by the SLT, SLS may be confirmed by referring to the information in the LMT.
  • the receiver may identify the PLP it needs to access by combining the LMT and other IP level signaling information.
  • PLP information in SLT, SLS, and the like is not omitted, and may remain in the SLT, SLS, and the like.
  • the LMT according to the illustrated embodiment may include a signaling_type field, a PLP_ID field, a num_session field, and / or information about respective sessions.
  • a PLP loop may be added to the LMT according to an embodiment, so that information on a plurality of PLPs may be described.
  • the LMT may describe PLPs for all IP addresses / ports related to all services described by the SLTs delivered together, in a PLP loop.
  • the signaling_type field may indicate the type of signaling information carried by the corresponding table.
  • the value of the signaling_type field for the LMT may be set to 0x01.
  • the signaling_type field may be omitted.
  • the PLP_ID field may identify a target PLP to be described. When a PLP loop is used, each PLP_ID field may identify each target PLP, and the fields from the PLP_ID field onward may be included in the PLP loop.
  • the PLP_ID field mentioned below is an identifier for one PLP in a PLP loop, and the fields described below may be fields for the corresponding PLP.
  • the num_session field may indicate the number of upper layer sessions delivered to the PLP identified by the corresponding PLP_ID field. According to the number indicated by the num_session field, information about each session may be included. This information may include an src_IP_add field, a dst_IP_add field, a src_UDP_port field, a dst_UDP_port field, a SID_flag field, a compressed_flag field, a SID field, and / or a context_id field.
  • the src_IP_add field, dst_IP_add field, src_UDP_port field, and dst_UDP_port field may indicate the source IP address, destination IP address, source UDP port, and destination UDP port of the corresponding transport session among the upper layer sessions delivered to the PLP identified by the corresponding PLP_ID field.
  • the SID_flag field may indicate whether a link layer packet carrying a corresponding transport session has an SID field in its optional header.
  • a link layer packet carrying an upper layer session may have an SID field in its optional header, and the SID field value may be the same as an SID field in an LMT to be described later.
  • the compressed_flag field may indicate whether header compression has been applied to data of a link layer packet carrying a corresponding transport session.
  • the existence of the context_id field to be described later may be determined according to the value of this field.
  • the SID field may indicate a sub stream ID (SID) for link layer packets carrying a corresponding transport session.
  • These link layer packets may include an SID having the same value as this SID field in the optional header.
  • the context_id field may provide a reference to a context id (CID) in the RDT.
  • the CID information of the RDT may indicate the context ID for the corresponding compressed IP packet stream.
  • the RDT may provide context information for the compressed IP packet stream. RDT and LMT may be associated with this field.
  • each field, element, or attribute may be omitted or replaced by another field, and additional fields, elements, or attributes may be added according to an embodiment.
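  • The following Python sketch illustrates how a receiver might use LMT-style information to look up which PLP carries a given transport session; the data layout simply mirrors the fields listed above (PLP_ID, source/destination IP address and UDP port, SID, context_id) and is not an actual bitstream parser. All concrete values are hypothetical examples.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class LMTSession:
        src_ip: str
        dst_ip: str
        src_port: int
        dst_port: int
        sid: Optional[int] = None          # present when SID_flag is set
        context_id: Optional[int] = None   # present when compressed_flag is set (links to the RDT)

    @dataclass
    class LMTEntry:
        plp_id: int
        sessions: list

    def find_plp_for_session(lmt_entries, dst_ip, dst_port):
        """Return (plp_id, session) for the transport session addressed by dst_ip:dst_port."""
        for entry in lmt_entries:
            for sess in entry.sessions:
                if sess.dst_ip == dst_ip and sess.dst_port == dst_port:
                    return entry.plp_id, sess
        return None

    # Example: a single PLP loop entry carrying one header-compressed session.
    lmt = [LMTEntry(plp_id=1, sessions=[
        LMTSession("10.0.0.1", "224.0.23.60", 5000, 5000, sid=3, context_id=7)])]
    print(find_plp_for_session(lmt, "224.0.23.60", 5000))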
  • service components of one service may be delivered through a plurality of ROUTE sessions.
  • the SLS may be obtained through the bootstrap information of the SLT.
  • the SLS's USBD allows the S-TSID and MPD to be referenced.
  • the S-TSID may describe transport session description information for other ROUTE sessions to which service components are delivered, as well as a ROUTE session to which an SLS is being delivered.
  • all service components delivered through a plurality of ROUTE sessions may be collected. This may be similarly applied when service components of a service are delivered through a plurality of MMTP sessions.
  • one service component may be used simultaneously by a plurality of services.
  • bootstrapping for ESG services may be performed by a broadcast network or broadband.
  • URL information of the SLT may be utilized. ESG information and the like can be requested to this URL.
  • one service component of one service may be delivered to the broadcasting network and one to the broadband (hybrid).
  • the S-TSID may describe components delivered to a broadcasting network, so that a ROUTE client may acquire desired service components.
  • the USBD also has base pattern information, which may describe which segments (of which components) are delivered over which path. The receiver can therefore use this information to know which segments to request from the broadband server and which segments to find in the broadcast stream.
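  • A minimal sketch of how a receiver might use such base pattern information is shown below: segment URLs whose beginning matches a broadcast base pattern are taken from the broadcast ROUTE stream, and the rest are requested from the broadband server. The pattern strings and the helper function are hypothetical examples, not values defined by the USBD schema.

    # Hypothetical base patterns; in practice these would come from the USBD.
    BROADCAST_BASE_PATTERNS = ["http://example.com/service1/video/"]
    BROADBAND_BASE_PATTERNS = ["http://cdn.example.com/service1/"]

    def resolve_delivery_path(segment_url):
        """Decide whether a DASH segment should be taken from broadcast or broadband."""
        if any(segment_url.startswith(p) for p in BROADCAST_BASE_PATTERNS):
            return "broadcast"   # locate the object in the ROUTE/LCT session
        if any(segment_url.startswith(p) for p in BROADBAND_BASE_PATTERNS):
            return "broadband"   # issue an HTTP request to the server
        return "broadband"       # default when no broadcast pattern matches

    print(resolve_delivery_path("http://example.com/service1/video/seg_001.m4s"))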
  • scalable coding for a service may be performed.
  • the USBD may have all the capability information needed to render the service. For example, when a service is provided in HD or UHD, the capability information of the USBD may have a value of “HD or UHD”.
  • the receiver may know which component should be played in order to render the UHD or HD service using the MPD.
  • app components to be used for app-based enhancement / app-based service may be delivered through a broadcast network or through broadband as an NRT component.
  • app signaling for app-based enhancement may be performed by an application signaling table (AST) delivered with SLS.
  • an event, which is signaling of an operation to be performed by the app, may be delivered in the form of an event message table (EMT) together with the SLS, signaled in the MPD, or in-band signaled in a box within a DASH representation. The AST, EMT, and the like may also be delivered via broadband.
  • App-based enhancement may be provided using the collected app components and such signaling information.
  • a CAP message may be included in the aforementioned LLS table for emergency alerting. Rich media content for emergency alerts may also be provided. Rich media may be signaled by the CAP message, and if rich media is present it may be provided as an EAS service signaled by the SLT.
  • the linear service components may be delivered through a broadcasting network according to the MMT protocol.
  • NRT data (for example, an app component) for the service may be delivered through a broadcasting network according to the ROUTE protocol.
  • data on the service may be delivered through broadband.
  • the receiver can access the MMTP session carrying the SLS using the bootstrap information of the SLT.
  • the USBD of the SLS according to the MMT may refer to the MP table so that the receiver may acquire linear service components formatted with the MPU delivered according to the MMT protocol.
  • the USBD may further refer to the S-TSID to allow the receiver to obtain NRT data delivered according to the ROUTE protocol.
  • the USBD may further reference the MPD to provide a playback description for the data delivered over the broadband.
  • the receiver may transmit location URL information for obtaining a streaming component and / or a file content item (such as a file) to the companion device through a method such as a web socket.
  • An application of a companion device may request the component, data, and the like by requesting the URL through an HTTP GET.
  • the receiver may transmit information such as system time information and emergency alert information to the companion device.
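  • The following sketch, written from the companion device side, illustrates the exchange described above: a location URL is received over a web socket and the referenced content item is then requested with an HTTP GET. The web socket address and the message format are assumptions for illustration only; they are not defined by this description.

    import asyncio
    from urllib.request import urlopen
    import websockets  # third-party package

    RECEIVER_WS_URI = "ws://192.168.0.10:8900/companion"  # hypothetical receiver endpoint

    async def companion_app():
        async with websockets.connect(RECEIVER_WS_URI) as ws:
            # The primary receiver pushes a location URL (e.g., for a streaming
            # component or a file content item) over the web socket.
            location_url = await ws.recv()
            # The companion application then fetches the component/data with an HTTP GET.
            data = urlopen(location_url).read()
            print("fetched", len(data), "bytes from", location_url)

    asyncio.run(companion_app())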
  • FIG. 8 shows a structure of a broadcast signal transmission apparatus for a next generation broadcast service according to an embodiment of the present invention.
  • a broadcast signal transmission apparatus for a next generation broadcast service may include an input format block 1000, a bit interleaved coding & modulation (BICM) block 1010, a frame building block 1020, an orthogonal frequency division multiplexing (OFDM) generation block 1030, and a signaling generation block 1040. The operation of each block of the broadcast signal transmission apparatus will be described.
  • IP streams / packets and MPEG2-TS may be main input formats, and other stream types are treated as general streams.
  • the input format block 1000 can demultiplex each input stream into one or multiple data pipes to which independent coding and modulation is applied.
  • the data pipe is the basic unit for controlling robustness, which affects the quality of service (QoS).
  • One or multiple services or service components may be delivered by one data pipe.
  • a data pipe is a logical channel at the physical layer that carries service data or related metadata that can carry one or multiple services or service components.
  • the BICM block 1010 may include a processing block for a profile (or system) to which MIMO is not applied and / or a processing block for a profile (or system) to which MIMO is applied, and may include a plurality of processing blocks for processing each data pipe.
  • the processing block of the BICM block to which MIMO is not applied may include a data FEC encoder, a bit interleaver, a constellation mapper, a signal space diversity (SSD) encoding block, and a time interleaver.
  • the processing block of the BICM block to which MIMO is applied is distinguished from the processing block of BICM to which MIMO is not applied in that it further includes a cell word demultiplexer and a MIMO encoding block.
  • the data FEC encoder performs FEC encoding on the input BBF to generate the FECBLOCK, using outer coding (BCH) and inner coding (LDPC).
  • Outer coding (BCH) is an optional coding method.
  • the bit interleaver interleaves the output of the data FEC encoder to achieve optimized performance with a combination of LDPC codes and modulation schemes.
  • the constellation mapper modulates each cell word from the bit interleaver or the cell word demultiplexer using QPSK, QAM-16, non-uniform QAM (NUQ-64, NUQ-256, NUQ-1024), or non-uniform constellations (NUC-16, NUC-64, NUC-256, NUC-1024) to provide a power-normalized constellation point.
  • the NUC has an arbitrary shape, while QAM-16 and the NUQ have a square shape. Both NUQ and NUC are specifically defined for each code rate and are signaled by the parameter DP_MOD of the PLS2 data.
  • the time interleaver may operate at the data pipe level. The parameters of time interleaving can be set differently for each data pipe.
  • the time interleaver of the present invention may be located between a BICM chain block and a frame builder.
  • the time interleaver according to the present invention may selectively use a convolution interleaver (CI) and a block interleaver (BI) according to a physical layer pipe (PLP) mode, or both.
  • a PLP according to an embodiment of the present invention is a physical path used in the same concept as the above-described DP, and its name may be changed according to the designer's intention.
  • the PLP mode according to an embodiment of the present invention may include a single PLP mode or a multiple PLP mode according to the number of PLPs processed by the broadcast signal transmitter or broadcast signal transmission apparatus.
  • time interleaving using different time interleaving methods according to the PLP mode may be referred to as hybrid time interleaving.
  • the hybrid time deinterleaver may perform an operation corresponding to the reverse operation of the aforementioned hybrid time interleaver.
  • the cell word demultiplexer is used to separate a single cell word stream into a dual cell word stream for MIMO processing.
  • the MIMO encoding block can process the output of the cell word demultiplexer using the MIMO encoding scheme.
  • the MIMO encoding scheme of the present invention may be defined as full-rate spatial multiplexing (FR-SM) to provide capacity increase with a relatively small complexity increase at the receiver side.
  • MIMO processing is applied at the data pipe level.
  • the cell word pairs (e1,i and e2,i) output from the constellation mapper may be fed to the input of the MIMO encoder, and the MIMO encoder output pairs (g1,i and g2,i) may be transmitted through the respective transmit antennas.
  • the frame building block 1020 may map data cells of an input data pipe to OFDM symbols and perform frequency interleaving for frequency domain diversity within one frame.
  • a frame according to an embodiment of the present invention is divided into a preamble, one or more frame signaling symbols (FSS), and normal data symbols.
  • the preamble is a special symbol that provides a set of basic transmission parameters for efficient transmission and reception of a signal.
  • the preamble may signal a basic transmission parameter and a transmission type of the frame.
  • the preamble may indicate whether an emergency alert service (EAS) is provided in the current frame.
  • the main purpose of the FSS is to carry PLS data. For fast synchronization and channel estimation, and fast decoding of PLS data, the FSS has a higher density pilot pattern than normal data symbols.
  • the frame building block may include a delay compensation block that adjusts the timing between the data pipes and the corresponding PLS data at the transmitter so that the data pipes and the corresponding PLS data are co-timed.
  • the frame building block may also include a cell mapper and a frequency interleaver for mapping the PLS, data pipes, auxiliary streams, and dummy cells to the active carriers of the OFDM symbols in the frame.
  • the frequency interleaver may provide frequency diversity by randomly interleaving data cells received from the cell mapper.
  • the frequency interleaver uses a different interleaving-seed order to obtain the maximum interleaving gain in a single frame.
  • the frequency interleaver operates on the data corresponding to a single OFDM symbol or to an OFDM symbol pair consisting of two sequential OFDM symbols.
  • the OFDM generation block 1030 modulates the OFDM carriers with the cells generated by the frame building block, inserts pilots, and generates a time-domain signal for transmission. In addition, the block sequentially inserts guard intervals and applies a PAPR reduction process to generate the final RF signal.
  • the signaling generation block 1040 may generate physical layer signaling information used for the operation of each functional block.
  • Signaling information may include PLS data.
  • PLS provides a means by which a receiver can connect to a physical layer data pipe.
  • PLS data consists of PLS1 data and PLS2 data.
  • PLS1 data is the first set of PLS data delivered to the FSS in frames with fixed size, coding, and modulation that convey basic information about the system as well as the parameters needed to decode the PLS2 data.
  • PLS1 data provides basic transmission parameters including the parameters required to enable reception and decoding of PLS2 data.
  • PLS2 data carries more detailed PLS data about the data pipes and systems and is the second set of PLS data sent to the FSS.
  • PLS2 signaling further consists of two types of parameters: PLS2 static data (PLS2-STAT data) and PLS2 dynamic data (PLS2-DYN data).
  • PLS2 static data is PLS2 data that is static during the duration of a frame group
  • PLS2 dynamic data is PLS2 data that changes dynamically from frame to frame.
  • the PLS2 data may include FIC_FLAG information.
  • the FIC_FLAG information is a 1-bit field and indicates whether a fast information channel (FIC) is used in the current frame group. If the value of this field is set to 1, the FIC is provided in the current frame. If the value of the field is set to 0, the FIC is not transmitted in the current frame.
  • the BICM block 1010 may include a BICM block for protecting PLS data, and the BICM block for protecting PLS data may include a PLS FEC encoder, a bit interleaver, and a constellation mapper.
  • the PLS FEC encoder may include a scrambler for scrambling the PLS1 data and PLS2 data, a BCH encoding / zero insertion block for performing outer encoding on the scrambled PLS1/2 data using a shortened BCH code for PLS protection and inserting zero bits after the BCH encoding, an LDPC encoding block for performing encoding using an LDPC code, and an LDPC parity puncturing block.
  • the output bits of zero insertion can be permutated before LDPC encoding.
  • the bit interleaver interleaves the respective shortened and punctured PLS1 data and PLS2 data, and the constellation mapper maps the bit-interleaved PLS1 data and PLS2 data onto the constellation.
  • the broadcast signal receiving apparatus for the next generation broadcast service may perform a reverse process of the broadcast signal transmitting apparatus for the next generation broadcast service described with reference to FIG. 8.
  • An apparatus for receiving broadcast signals for a next generation broadcast service may include a synchronization and demodulation module for performing demodulation corresponding to the reverse of the process executed by the broadcast signal transmitting apparatus on the input signal; a frame parsing module for parsing the input signal frame and extracting the data through which the service selected by the user is transmitted; a demapping and decoding module for converting the input signal into bit-domain data, deinterleaving it as necessary, performing demapping on the mapping applied for transmission efficiency, and correcting errors that occurred in the transmission channel through decoding; an output processor for executing the reverse of the various compression / signal processing procedures applied by the broadcast signal transmission apparatus; and a signaling decoding module for obtaining and processing the PLS information from the signal demodulated by the synchronization and demodulation module.
  • the frame parsing module, the demapping and decoding module, and the output processor may execute the function by using the PLS data output from the signaling decoding module.
  • a time interleaving group according to an embodiment of the present invention is directly mapped to one frame or spread over PI frames.
  • Each time interleaving group is also divided into one or more (NTI) time interleaving blocks.
  • each time interleaving block corresponds to one use of the time interleaver memory.
  • the time interleaving block in the time interleaving group may include different numbers of XFECBLOCKs.
  • the time interleaver may also act as a buffer for data pipe data prior to the frame generation process.
  • the time interleaver according to an embodiment of the present invention is a twisted row-column block interleaver.
  • the twisted row-column block interleaver according to an embodiment of the present invention may write the first XFECBLOCK column-wise into the first column of the time interleaving memory, the second XFECBLOCK into the next column, and the remaining XFECBLOCKs of the time interleaving block in the same manner. In the interleaving array, cells may then be read diagonally from the first row to the last row, starting from the leftmost column and proceeding to the right along the rows.
  • the interleaving array for the twisted row-column block interleaver may insert the virtual XFECBLOCK into the time interleaving memory to achieve a single memory deinterleaving at the receiver side regardless of the number of XFECBLOCKs in the time interleaving block.
  • the virtual XFECBLOCK must be inserted in front of the other XFECBLOCKs to achieve a single memory deinterleaving on the receiver side.
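  • A simplified Python sketch of the column-wise write / diagonal read behaviour described above is given below. It uses a small Nr x Nc cell array and ignores virtual XFECBLOCK padding and the exact diagonal-slope definition, so it only illustrates the general twisted row-column idea rather than the normative interleaver.

    def twisted_block_interleave(cells, n_rows, n_cols):
        """Write cells column-wise into an n_rows x n_cols array, then read diagonally."""
        assert len(cells) == n_rows * n_cols
        # Column-wise write: the cell index grows down each column, then moves to the next column.
        mem = [[None] * n_cols for _ in range(n_rows)]
        for idx, cell in enumerate(cells):
            col, row = divmod(idx, n_rows)
            mem[row][col] = cell

        # Diagonal read: start a diagonal from each column of the first row (left to right)
        # and follow it from the first row to the last row, wrapping around the columns.
        out = []
        for start_col in range(n_cols):
            for row in range(n_rows):
                col = (start_col + row) % n_cols
                out.append(mem[row][col])
        return out

    # Example: 4 rows x 3 columns (e.g., three XFECBLOCKs of four cells each).
    print(twisted_block_interleave(list(range(12)), 4, 3))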
  • FIG 9 illustrates a writing operation of a time interleaver according to an embodiment of the present invention.
  • the block shown on the left side of the figure represents a TI memory address array, and the block shown on the right side of the figure represents the writing operation for two consecutive TI groups when two and one virtual FEC blocks, respectively, are inserted at the front of each TI group.
  • the frequency interleaver may include an interleaving address generator for generating an interleaving address for applying to data corresponding to a symbol pair.
  • FIG. 10 is a block diagram of an interleaving address generator composed of a main-PRBS generator and a sub-PRBS generator according to each FFT mode included in a frequency interleaver according to an embodiment of the present invention.
  • the interleaving process for an OFDM symbol pair uses one interleaving sequence and is described as follows.
  • here, x_{m,l,p} denotes the p-th cell of the l-th OFDM symbol in the m-th frame, and Ndata is the number of data cells: Ndata = CFSS for the frame signaling symbols, Ndata = Cdata for the normal data symbols, and Ndata = CFES for the frame edge symbol.
  • the interleaved data cells are given by v_{m,l,p} = x_{m,l,H_l(p)} for p = 0, ..., Ndata-1.
  • H_l(p) is the interleaving address generated based on the cyclic shift value (symbol offset) of the main-PRBS generator and the sub-PRBS generator.
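  • Given an interleaving address sequence H_l(p) (generated by the main-PRBS and sub-PRBS generators as described above), the interleaving itself reduces to a permutation of the data cells of a symbol. The sketch below assumes the address sequence is already available as a Python list; generating that sequence is outside the scope of this illustration and the values shown are hypothetical.

    def frequency_interleave(symbol_cells, addresses):
        """Permute the data cells of one OFDM symbol: v[p] = x[H(p)]."""
        assert len(symbol_cells) == len(addresses)
        return [symbol_cells[h] for h in addresses]

    # Example with Ndata = 8 and a hypothetical address sequence H.
    x = list(range(8))
    H = [3, 7, 0, 5, 1, 6, 2, 4]
    print(frequency_interleave(x, H))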
  • FIG. 11 is a diagram illustrating a receiver of a next generation broadcast system according to an embodiment of the present invention.
  • a receiving apparatus may include a receiver (not shown), a channel synchronizer (J32010), a channel equalizer (J32020), a channel decoder (J32030), a signaling decoder (J32040), a baseband operation controller (J32050), a service map DB (J32060), a transport packet interface (J32070), a broadband packet interface (J32080), a common protocol stack (J32090), a service signaling channel processing buffer & parser (J32100), an A/V processor (J32110), a service guide processor (J32120), an application processor (J32130), and / or a service guide DB (J32140).
  • a receiver (not shown) receives a broadcast signal.
  • the channel synchronizer J32010 synchronizes a symbol frequency and timing to enable decoding of a signal received in the baseband.
  • the baseband refers to an area where a broadcast signal is transmitted / received.
  • the channel equalizer J32020 performs channel equalization on the received signal.
  • the channel equalizer J32020 compensates for the received signal when it is distorted due to a multipath, a Doppler effect, or the like.
  • the channel decoder J32030 recovers the received signal into a transport frame having a meaning.
  • the channel decoder J32030 performs forward error correction (FEC) on data or transmission frames included in the received signal.
  • the signaling decoder J32040 extracts and decodes signaling data included in the received signal.
  • the signaling data includes signaling data and / or service information (SI) to be described later.
  • a baseband operation controller (J32050) controls the processing of signals in the baseband.
  • the service map database J32060 stores signaling data and / or service information.
  • the service map database J32060 may store signaling data transmitted in a broadcast signal and / or signaling data transmitted in a broadband packet.
  • the transport packet interface J32070 extracts a transport packet from a transport frame or a broadcast signal.
  • the transport packet interface J32070 extracts signaling data or an IP datagram from a transport packet.
  • the broadband packet interface J32080 receives a broadcast related packet through an internet network.
  • the broadband packet interface J32080 extracts a packet obtained through the Internet network, and combines or extracts signaling data or A / V data from the packet.
  • the Common Protocol Stack processes received packets according to the protocols contained in the protocol stack.
  • the common protocol stack J32090 may process the received packet according to the protocol stack described above.
  • the service signaling channel processing buffer and parser J32100 extracts signaling data included in the received packet.
  • the Service Signaling Channel Processing Buffer & Parser (J32100) extracts and parses signaling information related to the scan and / or acquisition of a service and / or content from an IP datagram or the like.
  • the signaling data may exist at a certain location or channel. Such a location or channel may be called a service signaling channel.
  • the service signaling channel may have a specific IP address, a UDP port number, a transport session identifier, and the like.
  • the receiver may recognize data transmitted through such a specific IP address, UDP port number, transmission session, etc. as signaling data.
  • the A / V Processor J32110 performs decoding and presentation processing on the received audio and video data.
  • the service guide processor (J32120) extracts announcement information from a received signal, manages a service guide database (J32140), and provides a service guide.
  • the application processor J32130 extracts and processes application data and / or application related information included in the received packet.
  • the service guide database (J32140) stores service guide data.
  • FIG. 12 is a block diagram of a hybrid broadcast receiver according to an embodiment of the present invention.
  • the hybrid broadcast receiver may receive a hybrid broadcast service through interlocking terrestrial broadcast and broadband in a DTV service of a next generation broadcast system.
  • the hybrid broadcast receiver may receive broadcast audio / video (Audio / Video, A / V) content transmitted through terrestrial broadcast, and receive enhancement data or a part of broadcast A / V content related thereto in real time through broadband.
  • broadcast audio / video (A / V) content may be referred to as media content.
  • the hybrid broadcast receiver may include a physical layer controller (D55010), a tuner (D55020), a physical frame parser (D55030), a link layer frame parser (D55040), an IP/UDP datagram filter (D55050), an ATSC 3.0 DTV control engine (D55060), an ALC/LCT+ client (D55070), a timing controller (D55080), a signaling parser (D55090), a DASH client (Dynamic Adaptive Streaming over HTTP client, D55100), an HTTP access client (D55110), an ISO BMFF parser (ISO Base Media File Format parser, D55120), and / or a media decoder (D55130).
  • the physical layer controller D55010 may control operations of the tuner D55020 and the physical frame parser D55030 using radio frequency (RF) information of a terrestrial broadcast channel intended to be received by the hybrid broadcast receiver.
  • the tuner D55020 may receive and process a broadcast-related signal through a terrestrial broadcast channel and convert it to an appropriate form. For example, the tuner D55020 may convert the received terrestrial broadcast signal into a physical frame.
  • the physical frame parser D55030 may obtain a link layer frame through parsing the received physical frame and processing related thereto.
  • the link layer parser D55040 may perform related operations such as obtaining link layer signaling from a link layer frame or obtaining IP / UDP datagrams.
  • the link layer parser D55040 may output at least one IP / UDP datagram.
  • the IP / UDP datagram filter D55050 may filter a specific IP / UDP datagram from the received at least one IP / UDP datagram. That is, the IP / UDP datagram filter D55050 may selectively filter the IP / UDP datagram selected by the ATSC 3.0 digital television control engine D55060 from among the at least one IP / UDP datagram output from the link layer parser D55040.
  • the IP / UDP datagram filter D55050 may output an application layer transport protocol packet such as ALC / LCT +.
  • the ATSC 3.0 digital television control engine (D55060) may be responsible for the interface between the modules included in each hybrid broadcast receiver. In addition, the ATSC 3.0 digital television control engine (D55060) transmits the parameters required for each module to each module, thereby controlling the operation of each module. In the present invention, the ATSC 3.0 digital television control engine D55060 may deliver a media presentation description (MPD) and / or an MPD URL to the DASH client D55100. Also, in the present invention, the ATSC 3.0 digital television control engine D55060 may transmit a delivery mode and / or a transport session identifier (TSI) to the ALC / LCT + client D55070.
  • the TSI may indicate the identifier of the session, for example an ALC/LCT+ session or FLUTE session of the application layer transport protocol, which carries the transport packets containing signaling messages such as the MPD or MPD URL related signaling.
  • the transport session identifier may correspond to the asset id of the MMT.
  • the ALC / LCT + client D55070 may generate one or more ISO Base Media File Format (ISOBMFF) objects by processing application layer transport protocol packets such as ALC / LCT + and collecting and processing a plurality of packets.
  • the application layer transport protocol packet may include an ALC / LCT packet, an ALC / LCT + packet, a ROUTE packet, and / or an MMTP packet.
  • the timing controller D55080 may process a packet including system time information and control the system clock accordingly.
  • the signaling parser D55090 may acquire and parse DTV broadcast service related signaling, and generate and manage a channel map or the like based on the parsed signaling.
  • the signaling parser may parse extended MPD or MPD related information from signaling information.
  • the DASH client D55100 may perform operations related to real-time streaming or adaptive streaming.
  • the DASH client D55100 may receive the DASH content from the HTTP server through the HTTP connection client D55110.
  • the DASH client D55100 may output the ISO Base Media File Format object by processing the received DASH segment.
  • the DASH client D55100 may transmit the full Representation ID or the segment URL to the ATSC 3.0 digital television control engine D55060.
  • the entire Representation ID may mean, for example, an ID combining the MPD URL, period @ id, and representation @ id.
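  • A trivial illustration of composing such a full Representation ID is shown below; the delimiter and the field order are assumptions for illustration only, not values prescribed by this description.

    def full_representation_id(mpd_url, period_id, representation_id, sep="#"):
        """Combine the MPD URL, Period@id and Representation@id into one identifier."""
        return sep.join([mpd_url, period_id, representation_id])

    print(full_representation_id("http://example.com/service1.mpd", "p0", "v_hd"))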
  • the DASH client D55100 may also receive an MPD or MPD URL from the ATSC 3.0 digital television control engine D55060.
  • the DASH client D55100 may receive a desired media stream or DASH segment from the HTTP server using the received MPD or MPD URL.
  • the DASH client D55100 may be referred to as a processor.
  • the HTTP access client D55110 may request specific information from the HTTP server, and receive and process a response from the HTTP server.
  • the HTTP server may process a request received from an HTTP connection client and provide a response thereto.
  • the ISO BMFF parser D55120 may extract audio / video data from an ISO Base Media File Format object.
  • the media decoder D55130 may decode the received audio and / or video data and perform processing for presenting the decoded audio / video data.
  • FIG. 13 is a block diagram of a hybrid broadcast receiver according to another embodiment of the present invention.
  • the hybrid broadcast receiver may receive a hybrid broadcast service through interlocking terrestrial broadcast and broadband in a DTV service of a next generation broadcast system.
  • the hybrid broadcast receiver may receive broadcast audio / video (Audio / Video, A / V) content transmitted through terrestrial broadcast, and receive enhancement data or a part of broadcast A / V content related thereto in real time through broadband.
  • broadcast audio / video (A / V) content may be referred to as media content.
  • the hybrid broadcast receiver may include a physical layer controller (H13010), a tuner (H13020), a physical frame parser (H13030), a link layer frame parser (H13040), an IP/UDP datagram filter (H13050), an ATSC 3.0 DTV control engine (H13060), an ALC/LCT+ client (H13070), a timing controller (H13080), a signaling parser (H13090), a DASH client (Dynamic Adaptive Streaming over HTTP client, H13100), an HTTP access client (H13110), an ISO BMFF parser (ISO Base Media File Format parser, H13120), and / or a media decoder (H13130).
  • the physical layer controller H13010 may control operations of the tuner H13020 and the physical frame parser H13030 using radio frequency (RF) information of the terrestrial broadcast channel that the hybrid broadcast receiver intends to receive.
  • the tuner H13020 may receive and process a broadcast-related signal through a terrestrial broadcast channel and convert it to an appropriate form. For example, the tuner H13020 may convert the received terrestrial broadcast signal into a physical frame.
  • the physical frame parser H13030 may obtain a link layer frame through parsing the received physical frame and processing related thereto.
  • the link layer parser H13040 may perform related operations such as obtaining link layer signaling from a link layer frame or obtaining IP / UDP datagrams.
  • the link layer parser H13040 may output at least one IP / UDP datagram.
  • the IP / UDP datagram filter H13050 may filter a specific IP / UDP datagram from the received at least one IP / UDP datagram. That is, the IP / UDP datagram filter H13050 may selectively filter the IP / UDP datagram selected by the ATSC 3.0 digital television control engine H13060 from among the at least one IP / UDP datagram output from the link layer parser H13040.
  • the IP / UDP datagram filter H13050 may output an application layer transport protocol packet such as ALC / LCT +.
  • the ATSC 3.0 digital television control engine H13060 may be responsible for the interface between the modules included in each hybrid broadcast receiver. In addition, the ATSC 3.0 digital television control engine (H13060) transmits the parameters required for each module to each module, thereby controlling the operation of each module. In the present invention, the ATSC 3.0 digital television control engine H13060 may deliver a media presentation description (MPD) and / or an MPD URL to the DASH client H13100. In addition, in the present invention, the ATSC 3.0 digital television control engine H13060 may transmit a delivery mode and / or a transport session identifier (TSI) to the ALC / LCT + client H13070.
  • the TSI may indicate the identifier of the session, for example an ALC/LCT+ session or FLUTE session of the application layer transport protocol, which carries the transport packets containing signaling messages such as the MPD or MPD URL related signaling.
  • the transport session identifier may correspond to the asset id of the MMT.
  • the ALC / LCT + client H13070 may generate one or more ISO Base Media File Format (ISOBMFF) objects by processing application layer transport protocol packets such as ALC / LCT + and collecting and processing a plurality of packets.
  • the application layer transport protocol packet may include an ALC / LCT packet, an ALC / LCT + packet, a ROUTE packet, and / or an MMTP packet.
  • the ALC / LCT + client H13070 may provide an ISO Base Media File Format object to the ISO BMFF parser H13120 which will be described later.
  • the timing controller H13080 may process a packet including system time information and control the system clock accordingly.
  • the signaling parser H13090 may acquire and parse DTV broadcast service related signaling, and generate and manage a channel map or the like based on the parsed signaling.
  • the signaling parser may parse extended MPD or MPD related information from signaling information.
  • the DASH client H13100 may perform an operation related to real-time streaming or adaptive streaming.
  • the DASH client H13100 may receive the DASH content from the HTTP server through the HTTP connection client H13110.
  • the DASH client H13100 may output the ISO Base Media File Format object by processing the received DASH segment.
  • the DASH client H13100 may also receive an MPD or MPD URL from the ATSC 3.0 digital television control engine H13060.
  • the DASH client H13100 may receive a desired media stream or DASH segment from the HTTP server using the received MPD or MPD URL.
  • the DASH client H13100 may be referred to as a processor.
  • the HTTP access client H13110 may request specific information from the HTTP server, and receive and process a response from the HTTP server.
  • the HTTP server may process a request received from an HTTP connection client and provide a response thereto.
  • the ISO BMFF parser H13120 may extract audio / video data from an ISO Base Media File Format object.
  • the ISO BMFF parser H13120 may receive an ISO Base Media File Format object from the ALC / LCT + client H13070.
  • the media decoder H13130 may decode the received audio and / or video data and perform processing for presenting the decoded audio / video data.
  • FIG. 14 illustrates a broadcast receiver according to an embodiment of the present invention.
  • the broadcast receiver may include a service / content acquisition controller J2010, an internet interface J2020, a broadcast interface J2030, a signaling decoder J2040, a service map database J2050, a decoder J2060, a targeting processor J2070, a processor J2080, a management unit J2090, and / or a redistribution module J2100.
  • an external management device J2110 is located outside and / or within a broadcast receiver.
  • the service / content acquisition controller J2010 receives the services and / or content and signaling data associated therewith via the broadcast / broadband channel. Alternatively, the service / content acquisition controller J2010 may perform control to receive the service and / or content and signaling data associated therewith.
  • the internet interface J2020 may include an internet access control module.
  • the internet access control module receives service, content and / or signaling data over a broadband channel.
  • the Internet access control module can control the operation of the receiver to obtain service, content and / or signaling data.
  • the broadcast interface J2030 may include a physical layer module and / or a physical layer I / F module.
  • the physical layer module receives a broadcast related signal through a broadcast channel.
  • the physical layer module processes (demodulates, decodes, etc.) the broadcast related signals received via the broadcast channel.
  • the physical layer I / F module obtains an IP datagram from information obtained from the physical layer module or converts the data into a specific frame (for example, a broadcast frame, an RS frame, or a GSE) using the obtained IP datagram.
  • the signaling decoder J2040 decodes signaling data or signaling information (hereinafter referred to as "signaling data") obtained through a broadcast channel or the like.
  • the service map database J2050 stores signaling data or decoded signaling data processed by another apparatus (eg, signaling parser) of the receiver.
  • Decoder J2060 decodes the broadcast signal or data received by the receiver. Decoder J2060 may include a scheduled streaming decoder, file decoder, file database (DB), on-demand streaming decoder, component synchronizer, alert signaling parser, targeting signaling parser, service signaling parser, and / or application signaling parser.
  • the scheduled streaming decoder extracts audio / video data for real-time audio / video (A / V) from the IP datagram and the like and decodes the extracted audio / video data.
  • the file decoder extracts file type data such as NRT data and applications from the IP datagram and decodes the extracted file type data.
  • the file DB stores the data extracted by the file decoder.
  • the on-demand streaming decoder extracts audio / video data for on-demand streaming from the IP datagram and decodes the extracted audio / video data.
  • the component synchronizer configures the content or service by performing the synchronization between the elements constituting the content or the elements constituting the service based on the data decoded by the scheduled streaming decoder, the file decoder and / or the on-demand streaming decoder.
  • the alert signaling parser extracts signaling information related to the alert from the IP datagram and the like and parses the extracted signaling information.
  • the targeting signaling parser extracts signaling information related to service / content personalization or targeting from the IP datagram and parses the extracted signaling information.
  • Targeting is an action of providing content or a service that satisfies a condition of a specific viewer.
  • That is, targeting is the action of identifying content or services that meet the conditions of a particular viewer and providing the identified content or service to the viewer.
  • the service signaling parser extracts signaling information related to service / content and / or service scan from an IP datagram and the like and parses the extracted signaling information.
  • the signaling information related to the service / content includes broadcast system information and / or broadcast signaling information.
  • the application signaling parser extracts signaling information related to the acquisition of the application from the IP datagram and the like and parses the extracted signaling information.
  • the signaling information related to the acquisition of the application may include a trigger, a TDO parameter table (TPT) and / or a TDO parameter element.
  • the targeting processor J2070 processes information related to service / content targeting parsed by the targeting signaling parser.
  • Processor J2080 performs a series of processes for displaying received data.
  • Processor J2080 may include an alert processor, an application processor, and / or an A / V processor.
  • the alert processor obtains alert data via the signaling information related to the alert, and performs a process of controlling the receiver and displaying the alert data.
  • the application processor processes the information related to the application and the display parameters associated with that application and the status of the downloaded application.
  • the A / V processor performs operations related to audio / video rendering based on the decoded audio data, video data and / or application data.
  • the management unit J2090 includes a device manager and / or a data sharing and communication unit.
  • the device manager manages external devices, such as adding / deleting / updating external devices that can be interlocked, including access and data exchange.
  • the data sharing and communication unit processes information related to data transmission and exchange between a receiver and an external device (eg, companion device) and performs operations related thereto.
  • the transportable and exchangeable data may be signaling data, PDI table, PDI user data, PDI Q & A, and / or A / V data.
  • the redistribution module J2100 performs the acquisition of information related to service / content and / or service / content data when the receiver cannot directly receive a broadcast signal.
  • the external management device J2110 is a module, such as a broadcast service / content server, that is located outside the broadcast receiver and provides the broadcast service / content.
  • the module functioning as an external management device may be provided to the broadcast receiver.
  • Hereinafter, a transport frame for transmitting a specific broadcast service will be described with reference to FIGS. 15 and 16.
  • FIG. 15 shows a broadcast transport frame according to an embodiment of the present invention.
  • the broadcast transmission frame includes a P1 part, an L1 part, a common PLP part, an interleaved PLP (scheduled & interleaved PLP's) part, and an auxiliary data part.
  • the broadcast transmission device transmits information for transport signal detection through the P1 part of the broadcast transport frame.
  • the broadcast transmission device may transmit tuning information for broadcast signal tuning through the P1 part.
  • the broadcast transmission device transmits a configuration of a broadcast transmission frame and characteristics of each PLP through an L1 part.
  • the broadcast reception device may obtain the configuration of the broadcast transmission frame and the characteristics of the PLP by decoding the L1 part based on P1.
  • the broadcast transmission device may transmit information commonly applied between PLPs through a common PLP part.
  • the broadcast transport frame may not include the common PLP part.
  • the broadcast transmission device transmits a plurality of components included in a broadcast service through an interleaved PLP part.
  • the interleaved PLP part includes a plurality of PLPs.
  • the broadcast transmission device may signal to which PLP a component constituting each broadcast service is transmitted through an L1 part or a common PLP part.
  • In order for the broadcast reception device to acquire specific broadcast service information for a broadcast service scan or the like, the plurality of PLPs of the interleaved PLP part must be decoded.
  • the broadcast transmission device may transmit a broadcast transport frame including a separate part including information about a broadcast service transmitted through a broadcast transport frame and components included in the broadcast service.
  • the broadcast reception device may quickly obtain information about a broadcast service and components included in the broadcast service through separate parts. This will be described with reference to FIG. 16.
  • FIG. 16 shows a broadcast transport frame according to another embodiment of the present invention.
  • the broadcast transmission frame includes a P1 part, an L1 part, a fast information channel (FIC) part, an interleaved PLP (scheduled & interleaved PLP's) part, and an auxiliary data part.
  • the broadcast transmission device transmits fast information through the FIC part.
  • the fast information may include configuration information of a broadcast stream transmitted through a transport frame, brief broadcast service information, and service signaling associated with a corresponding service / component.
  • the broadcast reception device may scan the broadcast service based on the FIC part.
  • the broadcast reception device may extract information about a broadcast service from the FIC part.
  • FIG. 17 is a diagram illustrating a state in which presentation attribute information is added to an MPD according to an embodiment of the present invention.
  • attribute information related to the presentation may be added on the DASH MPD. More specifically, an attribute (@presentationStartTime) indicating a presentation start time value may be added on the MPD.
  • the added attribute information may enable the DASH client to calculate the presentation start time for each segment. That is, the DASH client may calculate the presentation start time of each segment based on the corresponding @presentationStartTime value of each segment present in the MPD.
  • the attribute representing the value of the presentation start time may be expressed as a wall-clock time or UTC.
  • the presentationStartTime attribute is added to the MPD element.
  • the MPD element may mean a root element that delivers a media presentation description (MPD) for media presentation.
  • @presentationStartTime may describe an anchor for presentation time calculation for any segment in the media presentation. In one embodiment, when @type is dynamic, this attribute information must exist.
  • the presentation start time of each segment in the MPD may be calculated using various parameters.
  • the presentation start time of each segment in the MPD may be calculated using information such as MPD@presentationStartTime, MPD@suggestedPresentationDelay, the start time of the period, the presentationTimeOffset, and / or the segment start time and duration.
  • the start time of each segment of each representation in the MPD may be calculated as the sum of the durations of the previous segments, as shown in the following equation: start time of Segment[i] = duration of Segment[0] + duration of Segment[1] + ... + duration of Segment[i-1].
  • the presentation start time of each segment can be calculated by considering the period start time (Period@start) and / or the @presentationTimeOffset value in the segment base of the MPD, based on MPD@presentationStartTime.
  • the presentation start time of each segment may be calculated by the following equation, which takes into account the @presentationTimeOffset value and the like in the segment base of the MPD:
  • Presentation start time of Segment[i] = MPD@presentationStartTime + Period@start - SegmentBase@presentationTimeOffset / SegmentBase@timescale + start time of Segment[i] + MPD@suggestedPresentationDelay
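  • The equation above can be expressed as a small helper function; the sketch below simply restates the reconstruction given above, and all parameter values in the example call are hypothetical.

    def presentation_start_time(mpd_presentation_start_time, period_start,
                                presentation_time_offset, timescale,
                                segment_start_time, suggested_presentation_delay):
        """Presentation start time (in seconds) of one segment, per the equation above."""
        return (mpd_presentation_start_time
                + period_start
                - presentation_time_offset / timescale
                + segment_start_time
                + suggested_presentation_delay)

    # Example: a segment starting 20 s into a period that starts 10 s into the presentation.
    print(presentation_start_time(
        mpd_presentation_start_time=1_500_000_000.0,  # e.g., a wall-clock/UTC anchor in seconds
        period_start=10.0,
        presentation_time_offset=0,
        timescale=90_000,
        segment_start_time=20.0,
        suggested_presentation_delay=2.0))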
  • FIG. 18 is a diagram illustrating an example of a process of calculating a presentation start time of a segment according to an embodiment of the present invention.
  • This example shows a process of calculating the presentation start time of segment [2]. More specifically, this example illustrates a process of calculating the presentation start time of segment [2] using the above equation.
  • the presentation start time of Segment[2] may be calculated using the presentation start time of the MPD, the start time of the Period, the presentation time offset of the segment base, the start time of Segment[2], and the suggested presentation delay of the MPD.
  • FIG 19 illustrates presentable component information according to an embodiment of the present invention.
  • an MPD extension method may be provided so that AdaptationSet, Representation, etc. described in the MPD may represent a presentable component.
  • the presentable component may mean a continuous component scheduled to be presented to a user.
  • the presentable component information may describe the presentable component by having the following attributes and / or elements.
  • the Targeting / Personalization properties may include targeting properties, personalization properties, etc. related to the component; the Target device(s) may indicate the target screen or device(s) on which the component will be played and / or provided (e.g., primary device, companion device, inset on primary device, etc.); and the Associated component(s) may indicate information about related components (for example, a presentable audio component or a closed caption component that can be played together).
  • the TargetScreen may be represented by an XML element or a descriptor on the MPD.
  • the TargetScreen may be the same as the target device (s) included in the presentable component information described above.
  • FIG. 20 is a diagram illustrating a TargetScreen according to an embodiment of the present invention.
  • TargetScreen according to an embodiment of the present invention has a structure as shown, and may include one or more attribute information.
  • Property information included in the TargetScreen according to an embodiment of the present invention may include schemeIdUri, value, and id.
  • schemeIdUri may represent an identifier for a schema representing TargetScreen.
  • value may mean a value indicating a corresponding TargetScreen.
  • Id may mean an identifier indicating an associated TargetScreen.
  • FIG. 21 is a diagram illustrating a value of a TargetScreen according to an embodiment of the present invention.
  • the TargetScreen may include a value attribute, and the target screen may be indicated according to the value of the value attribute.
  • If TargetScreen@value is ALL, all devices are indicated as the target screen.
  • If TargetScreen@value is Primary, the primary device (PD) is indicated as the target screen.
  • If TargetScreen@value is Companion, the companion device (CD) is indicated as the target screen, and if TargetScreen@value is Inset, the inset on the primary screen is indicated as the target screen.
  • the targeting property may be expressed as an XML element or a descriptor on the MPD.
  • the targeting property may be the same as the targeting / personalization properties included in the presentable component information described above.
  • FIG. 22 is a diagram illustrating a TargetProperty according to an embodiment of the present invention.
  • TargetProperty may be represented by an XML element, as shown in the illustrated embodiment (H22100).
  • the TargetProperty has the same structure as the illustrated embodiment H22200 and may include one or more attribute information.
  • the attribute information included in the TargetProperty may include schemeIdUri, id, CriterionType, and CriterionValue.
  • schemeIdUri may represent an identifier for a schema representing a TargetProperty.
  • Id may mean an identifier indicating related targeting criteria.
  • CriterionType can indicate the type of filtering criteria, such as integer, Boolean, selection, or string.
  • CriterionValue may indicate a targeting value for the corresponding targeting criteria.
  • TargetProperty has a structure as shown and may include one or more attribute information.
  • the attribute information included in the TargetProperty may include schemeIdUri, value, and id.
  • schemeIdUri may represent an identifier for a schema representing a TargetProperty.
  • Id may mean an identifier indicating related target criteria.
  • the value may indicate a targeting value for the corresponding target criteria.
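  • As an illustrative sketch only, TargetProperty descriptors in the two forms described above might appear as follows; the schemeIdUri and the criterion identifier are hypothetical placeholders:
        <!-- form carrying CriterionType / CriterionValue -->
        <TargetProperty schemeIdUri="urn:example:targeting" id="viewer-age" CriterionType="integer" CriterionValue="19"/>
        <!-- simpler form carrying only a value -->
        <TargetProperty schemeIdUri="urn:example:targeting" id="viewer-age" value="19"/>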
  • FIG. 23 is a diagram illustrating a state in which presentable component information is added to a common attribute / element in an MPD according to an embodiment of the present invention.
  • the presentable component information described above may be added to common attributes and / or elements (common attributes / elements) in the MPD.
  • the common attribute and / or element may be included in a common adaptation set, a common representation, and / or a common sub-representation in the MPD.
  • the DASH client can recognize the attributes and / or elements added on the common attributes and / or elements on the MPD to effectively handle the associated process.
  • the MPD may include a presentable property (@presentable), an associatedTo property (@associatedTo), a TargetingProperty element, and a TargetScreen element.
  • @presentable may indicate whether a component can be presented. In one embodiment, this attribute is optional, and the default value of the attribute may be set to true.
  • @associatedTo may describe an associated adaptation set or representation. More specifically, @associatedTo may describe the related adaptation sets or representations by providing a whitespace-separated list of the @ids of the associated adaptation sets or representations.
  • the TargetingProperty element may describe a targeting property or a personalization property. More specifically, the TargetingProperty element may describe a targeting property or personalization property that is used for the associated adaptation set, representation, or sub-representation.
  • the TargetScreen element may describe a targeting screen or targeting devices. More specifically, the TargetScreen element may describe the targeting screen or device on which the associated adaptation set, representation, or sub-representation can be offered or rendered.
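  • A minimal sketch of how these attributes and elements might appear at a common attribute/element position of an MPD is shown below; the identifiers, URNs, and values are hypothetical placeholders chosen only for illustration:
        <AdaptationSet id="a0" mimeType="audio/mp4" presentable="true" associatedTo="v0 cc0">
          <TargetingProperty schemeIdUri="urn:example:targeting" value="sports-fans"/>
          <TargetScreen schemeIdUri="urn:example:target-screen" value="companion"/>
          <Representation id="a0r0" bandwidth="128000" codecs="mp4a.40.2"/>
        </AdaptationSet>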
  • the RepresentationBaseType may be represented by an XML element or a descriptor on the MPD.
  • FIG. 24 is a diagram illustrating a state in which presentable component information is included in RepresentationBaseType according to an embodiment of the present invention.
  • RepresentationBaseType may be represented by an XML element, as shown in the illustrated embodiment (H24100). This embodiment shows that the above-described presentable component information is included in a common attribute / element.
  • the RepresentationBaseType has the same structure as the illustrated embodiment H24200 and may include one or more attribute information and elements.
  • Attribute information included in the RepresentationBaseType may include profiles, width, height, sar, framerate, audioSamplerate, mimeType, segmentProfiles, codecs, maximumSAPPeriod, startWithSAP, maxPlayoutRate, codingDependency, scanType, associatedTo, and presentable.
  • elements included in the RepresentationBaseType may include FramePacking, AudioChannelConfiguration, ContentProtection, EssentialProperty, SupplementalProperty, InbandEventStream, TargetingProperty, and TargetScreen.
  • FIG. 25 is a diagram illustrating a state in which presentable component information is added to a content component attribute / element in an MPD according to an embodiment of the present invention.
  • the presentable component information described above may be added to content component attributes and / or elements (content component attributes / elements) in the MPD.
  • the content component attribute and / or element may be included in a content component in the MPD.
  • the content component attributes and / or elements may be included in an adaptation set, a representation, and / or a sub-representation.
  • the DASH client may be aware of the attributes and / or elements added on the content component attributes and / or elements on the MPD to effectively handle the associated process.
  • the MPD may include a presentable property (@presentable), an associatedTo property (@associatedTo), a TargetingProperty element, and a TargetScreen element.
  • @presentable may indicate whether the corresponding component is a presentable component. In one embodiment, this attribute is optional, and the default value of the attribute may be set to true.
  • @associatedTo may describe an associated adaptation set or representation. More specifically, @associatedTo may describe the related adaptation sets or representations by providing a whitespace-separated list of the @ids of the associated adaptation sets or representations.
  • the TargetingProperty element may describe a targeting property or a personalization property. More specifically, the TargetingProperty element may describe a targeting property or personalization property that is used for the associated adaptation set, representation, or sub-representation.
  • the TargetScreen element may describe a targeting screen or a targeting device. More specifically, the TargetScreen element may describe the targeting screen or device on which the associated adaptation set, representation, or sub-representation can be offered or rendered.
  • FIG. 26 is a diagram showing the presentable component information included in a ContentComponentType according to an embodiment of the present invention.
  • ContentComponentType may be represented by an XML element, as shown in the illustrated embodiment. This embodiment shows that the above-described presentable component information is included in a content component attribute / element.
  • the ContentComponentType may include one or more attribute information and elements. Attribute information included in the ContentComponentType may include id, lang, contentType, par, associatedTo, and presentable. In addition, elements included in ContentComponentType may include Accessibility, Role, TargetingProperty, TargetScreen, Rating, and Viewpoint.
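  • A sketch of a ContentComponent carrying the added attribute and element information might look like the following; the identifiers, URNs, and codec/bandwidth values are hypothetical placeholders:
        <AdaptationSet mimeType="video/mp4">
          <ContentComponent id="cc1" lang="en" contentType="video" presentable="true" associatedTo="a0">
            <Role schemeIdUri="urn:mpeg:dash:role:2011" value="main"/>
            <TargetScreen schemeIdUri="urn:example:target-screen" value="primary"/>
          </ContentComponent>
          <Representation id="v0" bandwidth="6000000" codecs="hev1.1.6.L93.B0"/>
        </AdaptationSet>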
  • FIG. 27 illustrates a continuous component and a composite component according to an embodiment of the present invention.
  • a continuous component and / or a composite component may be included in a next generation broadcast service.
  • the continuous component may refer to a content component represented in a continuous stream, such as audio, video, or closed captions.
  • the continuous component may be the same as the continuous component used to describe the presentable component described above.
  • the composite component may be a content component composed of a set of continuous components.
  • the continuous components may have the same content type and represent the same scene, and several of them may be combined to generate one presentation.
  • For example, a composite component may mix music, dialog, and sound effects to form complete audio, or the left and right 3D views may be merged to provide a 3D picture.
  • FIG. 28 is a diagram illustrating an example of adding composite information on an MPD according to an embodiment of the present invention.
  • According to the illustrated embodiment, a new composite element / attribute capable of representing the aforementioned composite component may be added.
  • the illustrated embodiment may represent a composite element / attribute according to an embodiment of the present invention.
  • CompositeType may be represented by an XML element, as shown in the illustrated embodiment H28100.
  • the CompositeType has the same structure as the illustrated embodiment H28200 and may include one or more attribute information and elements.
  • Property information included in the CompositeType may include contain, id, presentable, and associatedTo.
  • elements included in the CompositeType may include TargetingProperty, AtscRating, and TargetScreen.
  • the contain attribute may describe an adaptation set or a representation included in the composite component.
  • More specifically, the contain attribute may describe the relevant adaptation sets or representations by providing a whitespace-separated list of the @ids of the adaptation sets or representations included in the composite component.
  • the id attribute may describe a unique identifier for the composite component.
  • the presentable attribute may indicate whether the corresponding composite component is a presentable component.
  • the associatedTo attribute may describe an adaptation set or representation that is associated with the corresponding composite component. More specifically, the associatedTo attribute may describe the Adaptation Sets or Representations associated with that composite component by providing a whitespace-separated list of their @ids.
  • the TargetingProperty element may indicate a targeting property and / or a personalization property.
  • the TargetScreen element may represent a targeting screen or a targeting device.
  • the AtscRating element may indicate a rating property.
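  • A sketch of such a composite element might look like the following; the element name, its position in the MPD, and all identifiers and values are assumptions made only for illustration:
        <Composite id="c0" presentable="true" contain="a-music a-dialog a-effects" associatedTo="v0">
          <TargetingProperty schemeIdUri="urn:example:targeting" value="home-theater"/>
          <TargetScreen schemeIdUri="urn:example:target-screen" value="ALL"/>
        </Composite>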
  • FIG. 29 is a diagram illustrating another example of adding composite information on an MPD according to an embodiment of the present invention.
  • a new composite element / attribute capable of representing the aforementioned composite component may be added.
  • the illustrated embodiment may represent a composite element / attribute according to another embodiment of the present invention.
  • CompositeType may be represented by an XML element, as shown in the illustrated embodiment H29100.
  • the CompositeType has the same structure as the illustrated embodiment H29200 and may include one or more attribute information and elements.
  • Property information included in the CompositeType may include contain, id, presentable, and associatedTo.
  • elements included in the CompositeType may include TargetingProperty, AtscRating, and TargetScreen.
  • the contain attribute may describe an adaptation set or a representation included in the composite component.
  • More specifically, the contain attribute may describe the relevant adaptation sets or representations by providing a whitespace-separated list of the @ids of the adaptation sets or representations included in the composite component.
  • the id attribute may describe a unique identifier for the composite component.
  • the presentable attribute may indicate whether the corresponding composite component is a presentable component.
  • the associatedTo attribute may describe an adaptation set or representation that is associated with the corresponding composite component. More specifically, the associatedTo attribute may describe the Adaptation Sets or Representations associated with that composite component by providing a whitespace-separated list of their @ids.
  • the TargetingProperty element may indicate a targeting property and / or a personalization property.
  • the TargetScreen element may represent a targeting screen or a targeting device.
  • the AtscRating element may indicate a rating property.
  • The TargetingProperty, AtscRating, and TargetScreen elements may each include one or more pieces of attribute information in a type defined by DescriptorType.
  • Attribute information included in DescriptorType may include schemeIdUri, value, and id.
  • schemeIdUri may represent an identifier for a schema representing a corresponding element.
  • Id may mean an identifier indicating a corresponding element.
  • the value may indicate a value associated with the element.
  • FIG. 30 is a diagram illustrating another example of adding composite information on an MPD according to an embodiment of the present invention.
  • a new composite element / attribute capable of representing the above-described composite component may be added.
  • the illustrated embodiment may represent a composite element / attribute according to another embodiment of the present invention.
  • CompositeType may be represented by an XML element, as shown in the illustrated embodiment.
  • the CompositeType may include common elements and / or attributes of the MPD.
  • FIG. 31 is a diagram illustrating another example of adding composite information on an MPD according to an embodiment of the present invention.
  • a new composite element / attribute capable of representing the above-described composite component may be added.
  • the illustrated embodiment may represent a composite element / attribute according to another embodiment of the present invention.
  • CompositeType may be represented by an XML element, as shown in the illustrated embodiment.
  • the CompositeType may include common elements and / or attributes of the MPD.
  • the common elements and / or attributes of the MPD may include the presentable attribute, the associatedTo attribute, the TargetingProperty element, and / or the TargetScreen element.
  • a method of expressing a delivery path (delivery path) of a service component in an MPD may be provided.
  • the service component may mean a component of ATSC 3.0.
  • the transmission path (delivery path) may mean broadcast through a ROUTE / LCT session or unicast through an internet network.
  • the DASH client can recognize the delivery path (delivery path) of the service components specified in the MPD.
  • FIG. 32 is a diagram illustrating an example of transmitting transmission path information on an MPD according to an embodiment of the present invention.
  • elements and attributes are shown for representing the aforementioned transmission path on the MPD. That is, in the illustrated embodiment, transmission path information for representing the aforementioned transmission path on the MPD is illustrated.
  • DeliveryPath element is an element describing a transmission path of an ATSC 3.0 component and may be expressed in a descriptor form.
  • the schemeIdUri attribute is a URI for identifying the scheme.
  • the schemeIdUri attribute may be given a unique value for identifying the atsc: DeliveryPath element.
  • the schemeIdUri attribute may be given a value such as "urn:org:atsc:delivery-path:2015".
  • the value attribute may be a value indicating the transmission path of the component whose transmission path is to be identified. That is, the value attribute may represent the transmission path of an adaptation set, a representation, etc. whose transmission path is to be identified.
  • the value attribute may be set to "broadcast" when the component is transmitted through a broadcast network, "unicast" when it is transmitted through a unicast network, and "hybrid" when it is transmitted through both broadcast and unicast.
  • the value attribute may have a default value, where the default may be set to "broadcast" or "unicast".
  • the id attribute may indicate an identifier for the component whose transmission path is to be identified. That is, the id attribute may indicate the identifier of an adaptation set, a representation, or the like whose transmission path is to be identified. For example, if the corresponding element describes the transmission path of a representation having an id value of "r0", this attribute may be expressed as "rid r0".
  • Here, "rid" may represent that the category of the id is a representation.
  • "aid" may indicate that the category of the id is an adaptation set, and "pid" may indicate that the category of the id is a period.
  • Such category information and the actual id value may be distinguished through a space inserted between them.
  • When the DeliveryPath element described above is described as a child element of the adaptation set, representation, period, or the like whose transmission path it identifies, this id attribute may be omitted.
  • DeliveryPath element may be represented by the following XML schema.
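  • The referenced XML schema appears in the accompanying figure; as an illustrative sketch only, a DeliveryPath descriptor indicating that the representation with id "r0" is delivered over broadcast might appear as follows (the "atsc:" namespace prefix and the surrounding placement are assumptions):
        <atsc:DeliveryPath schemeIdUri="urn:org:atsc:delivery-path:2015" value="broadcast" id="rid r0"/>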
  • FIG. 33 is a diagram illustrating another example of transmitting transmission path information on an MPD according to an embodiment of the present invention.
  • transmission path information may be inserted into each hierarchy of the MPD.
  • Such an embodiment may be used to represent the transmission path of elements of various levels.
  • For example, when the transmission path information (atsc:DeliveryPath) is inserted as a child element of the MPD, the transmission paths of all components in the MPD may be expressed.
  • When the transmission path information (atsc:DeliveryPath) is inserted as a lower element of a period, the transmission paths of the components belonging to that period may be expressed.
  • When the transmission path information (atsc:DeliveryPath) is inserted as a lower element of an adaptation set, the transmission paths of the components represented by that adaptation set may be expressed, and when it is inserted as a lower element of a representation, the transmission paths of the components represented by that representation may be expressed.
  • That is, when the transmission path element (atsc:DeliveryPath) is inserted directly as a lower element of an adaptation set or a representation, or as a lower element of the common attribute element, the transmission path of the corresponding adaptation set (AdaptationSet) or representation (Representation) may be expressed.
  • FIG. 34 is a diagram illustrating another example of transmitting transmission path information on an MPD according to an embodiment of the present invention.
  • transmission path information may be inserted into each hierarchy of the MPD.
  • Such an embodiment may be used to represent the transmission path of elements of various levels.
  • the transmission path information may have the form of an attribute called @atsc:deliveryPath.
  • the transmission path information having the form of an attribute may be inserted at various locations on the MPD to represent a transmission path that meets the purpose.
  • this attribute has a string form and may have the same usage as atsc:DeliveryPath@value.
  • this attribute may take the form of an unsigned integer to identify the transmission path according to a given value. For example, when a value assigned to this attribute is 0, it may mean broadcast transmission, and when 1, it may mean unicast transmission.
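  • A minimal sketch of the string-valued attribute form on a representation might look like the following; the representation id and bandwidth are hypothetical placeholders:
        <Representation id="v1" bandwidth="2000000" atsc:deliveryPath="unicast"/>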
  • FIG. 35 is a diagram illustrating a state in which transmission path information is added as a lower element of a BaseURL element, and added to ExtendedBaseURL according to an embodiment of the present invention.
  • the above-described transmission path information may be included in the BaseURL element.
  • the illustrated embodiment shows the schema of the BaseURL element when @atsc:deliveryPath is added to the BaseURL element.
  • In another embodiment, the above-described transmission path element atsc:DeliveryPath may be included in the ExtendedBaseURL element.
  • the ExtendedBaseURL element is an extended form of the above-described BaseURL element.
  • the ExtendedBaseURL element may include atsc:DeliveryPath and baseURL as subelements.
  • Like the existing baseURL, the ExtendedBaseURL element may be included in the MPD, Period, AdaptationSet, Representation, etc., and, as a subelement of each of these, may play the role of describing the transmission path of the components of the corresponding level (for example, ATSC 3.0 components).
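  • A sketch of an ExtendedBaseURL element is shown below; the URL and the exact spelling of the sub-element names are assumptions made for illustration:
        <ExtendedBaseURL>
          <atsc:DeliveryPath schemeIdUri="urn:org:atsc:delivery-path:2015" value="broadcast"/>
          <BaseURL>http://example.com/service1/</BaseURL>
        </ExtendedBaseURL>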
  • Various embodiments of the present invention described above provide a transmission path identification method of service components (eg, ATSC 3.0 components) through extension of MPD.
  • the DASH client may recognize the transmission path of the individual representations configuring the MPD before starting the corresponding service. According to these embodiments, since unnecessary operation of the DASH client can be reduced, receiver operation efficiency can be improved.
  • For example, the DASH client may be configured with a local cache. Without the transmission path information, the DASH client would have to request a segment from the local cache or the like and confirm that broadcast reception of that segment is not valid.
  • With the transmission path information, the segment can be immediately requested through unicast without performing such an operation.
  • As described above, the transmission path information according to an embodiment of the present invention may be expressed in the form of an element (atsc:DeliveryPath element) or an attribute (@atsc:deliveryPath).
  • In addition, a method of extending an existing element to express the transmission paths of service components (e.g., ATSC 3.0 components) may be provided.
  • FIG. 36 is a diagram illustrating an example of transmitting transmission path information on an MPD according to another embodiment of the present invention.
  • an element for representing the aforementioned transmission path information on the MPD is shown.
  • This embodiment extends an existing EssentialProperty element or SupplementalProperty element to represent a transmission path of service components.
  • the EssentialProperty element and the SupplementalProperty element are subelements of the Common Attribute element.
  • the EssentialProperty element and the SupplementalProperty element can be inserted at a lower level of an AdaptationSet or Representation as shown, and may represent the transmission path of the service components (e.g., ATSC 3.0 components) represented through the corresponding AdaptationSet or Representation.
  • @value may be given a value that can distinguish the transmission path of the corresponding Representation or AdaptationSet, such as "broadcast" and "unicast". If the component is transmitted in both broadcast and unicast, @value may have the value "hybrid".
  • @id may mean the identifier (id) of the Representation whose transmission path is represented through the current element when the EssentialProperty element or SupplementalProperty element exists at a lower level of the adaptation set. This attribute may be omitted.
  • FIG. 37 is a diagram illustrating an example of expressing transmission path information by an XML element according to another embodiment of the present invention.
  • the illustrated embodiment relates to a case where the transmission path information is included in the supplementalProperty element.
  • the transmission path information may be included in the EssentialProperty element as described above.
  • the transmission path information is included at the lower level of the representation and at the lower level of the adaptation set.
  • the transmission path included in the lower level of the first representation is broadcast, and the transmission path included in the lower level of the second representation is unicast.
  • the transmission path included in the lower level of the adaptation set is broadcast.
  • the service includes three components, having Representation@id values of "v0", "v1", and "a0".
  • "v0" is transmitted via broadcast and "v1" is transmitted via unicast.
  • Each transmission path of "v0" and "v1" is represented through SupplementalProperty under the Representation.
  • "a0" is transmitted via broadcast, and the transmission path of "a0" is expressed through SupplementalProperty under the AdaptationSet.
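  • An illustrative sketch corresponding to this description is given below; the schemeIdUri reuses the delivery-path URN introduced above as an assumption, and the mimeType and bandwidth values are hypothetical placeholders:
        <AdaptationSet mimeType="video/mp4">
          <Representation id="v0" bandwidth="8000000">
            <SupplementalProperty schemeIdUri="urn:org:atsc:delivery-path:2015" value="broadcast"/>
          </Representation>
          <Representation id="v1" bandwidth="2000000">
            <SupplementalProperty schemeIdUri="urn:org:atsc:delivery-path:2015" value="unicast"/>
          </Representation>
        </AdaptationSet>
        <AdaptationSet mimeType="audio/mp4">
          <SupplementalProperty schemeIdUri="urn:org:atsc:delivery-path:2015" value="broadcast"/>
          <Representation id="a0" bandwidth="128000"/>
        </AdaptationSet>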
  • FIG. 38 is a diagram illustrating another example of representing transmission path information using an XML element according to another embodiment of the present invention.
  • the illustrated embodiment relates to a case where the transmission path information is included in the supplementalProperty element.
  • the transmission path information may be included in the EssentialProperty element as described above.
  • the illustrated embodiment is a modification of the foregoing embodiment.
  • "v0" and "v1" may also be expressed as in the present modified example by using SupplementalProperty under the adaptation set.
  • In this case, SupplementalProperty@id is matched with the id of the individual representation.
  • FIG. 39 is a view showing another example of expressing transmission path information by an XML element according to another embodiment of the present invention.
  • the illustrated embodiment relates to a case where the transmission path information is included in the supplementalProperty element.
  • the transmission path information may be included in the EssentialProperty element as described above.
  • In the illustrated embodiment, the service is composed of two representations, "v0" and "a0"; "a0" is transmitted through a broadcast network, and "v0" is transmitted through both a broadcast network and a unicast network.
  • In this case, SupplementalProperty@value for "v0" may have the value "hybrid".
  • FIG. 40 illustrates a broadcast signal transmission method according to an embodiment of the present invention.
  • the broadcast signal transmission method according to an embodiment of the present invention may include generating service data of a broadcast service (HS40100), generating service signaling information for signaling the broadcast service (HS40200), and transmitting a broadcast signal including the service data and the service signaling information (HS40300).
  • the service data may include a service component included in the broadcast service.
  • the service component may be an ATSC 3.0 component.
  • the service component may mean an audio component, a video component, a closed caption component, or the like.
  • the service component may include one or more segments. One or more segments constituting the service component may be a DASH segment.
  • the service signaling information signaling the broadcast service may include a media presentation descriptor (MPD).
  • the media presentation descriptor may mean an MPD used throughout the specification.
  • the media presentation descriptor may provide a playback description for the service components.
  • the media presentation descriptor may include presentation attribute information related to the presentation of the service component.
  • the presentation attribute information included in the media presentation descriptor may mean the presentation attribute information described with reference to FIGS. 17 to 18. Accordingly, the drawings and related descriptions can be applied as is.
  • the presentation attribute information may include a first attribute indicating a presentation start time. This first attribute may be used to calculate a presentation start time for each segment that constitutes a service component. That is, the first attribute may be used as an anchor for calculating the presentation start time for each segment. On the other hand, in calculating the presentation start time for each segment, the aforementioned various parameters can be used.
  • the media presentation descriptor may include presentable component information.
  • the presentable component information included in the media presentation descriptor may be presentable component information described with reference to FIGS. 19 to 26. Accordingly, the drawings and related descriptions can be applied as is.
  • the presentable component information may include the aforementioned target device information, targeting property information / personalization property information, and the like.
  • the presentable component information may further include the information described in FIGS. 19 to 26.
  • the presentable component information may include the above information in the form of an attribute and / or an element.
  • presentable component information may include one or more attributes and one or more elements. These attributes and elements may be added at specific locations within the media presentation descriptor.
  • an attribute / element of the presentable component may be added to a common attribute / element in the media descriptor. Specific examples of this are shown in FIGS. 23 and 24. In another embodiment, an attribute / element of the presentable component may be added to a content component attribute in the media descriptor. Specific examples of this are shown in FIGS. 25 and 26.
  • the media presentation descriptor may include composite information.
  • the composite information included in the media presentation descriptor may be composite information described with reference to FIGS. 27 to 31.
  • Composite information may describe a composite component composed of a set of service components. Accordingly, the drawings and related descriptions can be applied as is.
  • the composite information may include specific attributes / elements described in FIGS. 27 to 31.
  • the media presentation descriptor may include transmission path information.
  • the transmission path information included in the media presentation descriptor may be transmission path information described with reference to FIGS. 32 to 39. Accordingly, the drawings and related descriptions may be applied as is.
  • the transmission path information may describe the path through which the service component is transmitted. More specifically, the transmission path information may include an attribute / element, and the attribute / element may include various information for describing the transmission path.
  • the transmission path information may include a value attribute and an identifier attribute.
  • the value attribute may indicate whether the service component is transmitted in broadcast, unicast or hybrid. Here, the hybrid refers to a case in which both broadcast and unicast are transmitted.
  • the identifier attribute may provide identification information about the service component.
  • Such transmission path information may be included in various locations within the media description.
  • the media presentation descriptor forms a hierarchical structure composed of several levels, and transmission path information can be inserted in units of these levels.
  • the level may have a hierarchical structure of a period, an adaptation set, a representation, and a sub representation. Transmission path information may be inserted in each level to indicate transmission paths of elements of the corresponding level.
  • the presentation attribute information, presentable component information, composite information, and transmission path information included in the media presentation descriptor, although described in separate embodiments, may be included together in the media presentation descriptor as long as the pieces of information do not contradict one another.
  • FIG. 41 is a view showing a broadcast signal transmission apparatus according to an embodiment of the present invention.
  • the apparatus for transmitting broadcast signals according to an embodiment of the present invention may include a service data generator (H41100) for generating service data of a broadcast service, a signaling generator (H41200) for generating service signaling information for signaling the broadcast service, and a transmitter (H41300) for transmitting a broadcast signal including the service data and the service signaling information.
  • the service data may include a service component included in the broadcast service.
  • the service component may be an ATSC 3.0 component.
  • the service component may mean an audio component, a video component, a closed caption component, or the like.
  • the service component may include one or more segments. One or more segments constituting the service component may be a DASH segment.
  • the service signaling information signaling the broadcast service may include a media presentation descriptor (MPD).
  • the media presentation descriptor may mean an MPD used throughout the specification.
  • the media presentation descriptor may provide a playback description for the service components.
  • the media presentation descriptor may include presentation attribute information related to the presentation of the service component.
  • the presentation attribute information included in the media presentation descriptor may mean the presentation attribute information described with reference to FIGS. 17 to 18. Accordingly, the drawings and related descriptions can be applied as is.
  • the presentation attribute information may include a first attribute indicating a presentation start time. This first attribute may be used to calculate a presentation start time for each segment that constitutes a service component. That is, the first attribute may be used as an anchor for calculating the presentation start time for each segment. On the other hand, in calculating the presentation start time for each segment, the aforementioned various parameters can be used.
  • the media presentation descriptor may include presentable component information.
  • the presentable component information included in the media presentation descriptor may be presentable component information described with reference to FIGS. 19 to 26. Accordingly, the drawings and related descriptions can be applied as is.
  • the presentable component information may include the aforementioned target device information, targeting property information / personalization property information, and the like.
  • the presentable component information may further include the information described in FIGS. 19 to 26.
  • the presentable component information may include the above information in the form of an attribute and / or an element.
  • presentable component information may include one or more attributes and one or more elements. These attributes and elements may be added at specific locations within the media presentation descriptor.
  • an attribute / element of the presentable component may be added to a common attribute / element in the media descriptor. Specific examples of this are shown in FIGS. 23 and 24. In another embodiment, an attribute / element of the presentable component may be added to a content component attribute in the media descriptor. Specific examples of this are shown in FIGS. 25 and 26.
  • the media presentation descriptor may include composite information.
  • the composite information included in the media presentation descriptor may be composite information described with reference to FIGS. 27 to 31.
  • Composite information may describe a composite component composed of a set of service components. Accordingly, the drawings and related descriptions can be applied as is.
  • the composite information may include specific attributes / elements described in FIGS. 27 to 31.
  • the media presentation descriptor may include transmission path information.
  • the transmission path information included in the media presentation descriptor may be transmission path information described with reference to FIGS. 32 to 39. Accordingly, the drawings and related descriptions may be applied as is.
  • the transmission path information may describe the path through which the service component is transmitted. More specifically, the transmission path information may include an attribute / element, and the attribute / element may include various information for describing the transmission path.
  • the transmission path information may include a value attribute and an identifier attribute.
  • the value attribute may indicate whether the service component is transmitted in broadcast, unicast or hybrid. Here, the hybrid refers to a case in which both broadcast and unicast are transmitted.
  • the identifier attribute may provide identification information about the service component.
  • Such transmission path information may be included in various locations within the media description.
  • the media presentation descriptor forms a hierarchical structure composed of several levels, and transmission path information can be inserted in units of these levels.
  • the level may have a hierarchical structure of a period, an adaptation set, a representation, and a sub representation. Transmission path information may be inserted in each level to indicate transmission paths of elements of the corresponding level.
  • the presentation attribute information, presentable component information, composite information, and transmission path information included in the media presentation descriptor, although described in separate embodiments, may be included together in the media presentation descriptor as long as the pieces of information do not contradict one another.
  • FIG. 42 is a view showing a broadcast signal receiving method according to an embodiment of the present invention.
  • the broadcast signal receiving method according to an embodiment of the present invention may include receiving a broadcast signal including service data of a broadcast service and service signaling information signaling the broadcast service (HS42100), decoding the service signaling information from the received broadcast signal (HS42200), decoding the service component using the decoded service signaling information (HS42300), and presenting the broadcast service using the decoded service component (HS42400).
  • the service data may include a service component included in the broadcast service.
  • the service component may be an ATSC 3.0 component.
  • the service component may mean an audio component, a video component, a closed caption component, or the like.
  • the service component may include one or more segments. One or more segments constituting the service component may be a DASH segment.
  • the service signaling information signaling the broadcast service may include a media presentation descriptor (MPD).
  • the media presentation descriptor may mean an MPD used throughout the specification.
  • the media presentation descriptor may provide a playback description for the service components.
  • the media presentation descriptor may include presentation attribute information related to the presentation of the service component.
  • the presentation attribute information included in the media presentation descriptor may mean the presentation attribute information described with reference to FIGS. 17 to 18. Accordingly, the drawings and related descriptions can be applied as is.
  • the presentation attribute information may include a first attribute indicating a presentation start time. This first attribute may be used to calculate a presentation start time for each segment that constitutes a service component. That is, the first attribute may be used as an anchor for calculating the presentation start time for each segment. On the other hand, in calculating the presentation start time for each segment, the aforementioned various parameters can be used.
  • the media presentation descriptor may include presentable component information.
  • the presentable component information included in the media presentation descriptor may be presentable component information described with reference to FIGS. 19 to 26. Accordingly, the drawings and related descriptions can be applied as is.
  • the presentable component information may include the aforementioned target device information, targeting property information / personalization property information, and the like.
  • the presentable component information may further include the information described in FIGS. 19 to 26.
  • the presentable component information may include the above information in the form of an attribute and / or an element.
  • presentable component information may include one or more attributes and one or more elements. These attributes and elements may be added at specific locations within the media presentation descriptor.
  • an attribute / element of the presentable component may be added to a common attribute / element in the media descriptor. Specific examples of this are shown in FIGS. 23 and 24. In another embodiment, an attribute / element of the presentable component may be added to a content component attribute in the media descriptor. Specific examples of this are shown in FIGS. 25 and 26.
  • the media presentation descriptor may include composite information.
  • the composite information included in the media presentation descriptor may be composite information described with reference to FIGS. 27 to 31.
  • Composite information may describe a composite component composed of a set of service components. Accordingly, the drawings and related descriptions can be applied as is.
  • the composite information may include specific attributes / elements described in FIGS. 27 to 31.
  • the media presentation descriptor may include transmission path information.
  • the transmission path information included in the media presentation descriptor may be transmission path information described with reference to FIGS. 32 to 39. Accordingly, the drawings and related descriptions may be applied as is.
  • the transmission path information may describe the path through which the service component is transmitted. More specifically, the transmission path information may include an attribute / element, and the attribute / element may include various information for describing the transmission path.
  • the transmission path information may include a value attribute and an identifier attribute.
  • the value attribute may indicate whether the service component is transmitted in broadcast, unicast or hybrid. Here, the hybrid refers to a case in which both broadcast and unicast are transmitted.
  • the identifier attribute may provide identification information about the service component.
  • Such transmission path information may be included in various locations within the media description.
  • the media presentation descriptor forms a hierarchical structure composed of several levels, and transmission path information can be inserted in units of these levels.
  • the level may have a hierarchical structure of a period, an adaptation set, a representation, and a sub representation. Transmission path information may be inserted in each level to indicate transmission paths of elements of the corresponding level.
  • the presentation attribute information, presentable component information, composite information, and transmission path information included in the media presentation descriptor, although described in separate embodiments, may be included together in the media presentation descriptor as long as the pieces of information do not contradict one another.
  • the broadcast signal receiving method according to an embodiment of the present invention may be performed by some components of the receiver described with reference to FIGS. 11 to 13.
  • FIG. 43 is a diagram illustrating a broadcast signal receiving apparatus according to an embodiment of the present invention.
  • an apparatus for receiving broadcast signals according to an embodiment of the present invention may include a receiver (H43100) for receiving a broadcast signal including service data of a broadcast service and service signaling information for signaling the broadcast service, and may further include components for decoding the service signaling information from the received broadcast signal, decoding the service component using the decoded service signaling information, and presenting the broadcast service using the decoded service component.
  • the service data may include a service component included in the broadcast service.
  • the service component may be an ATSC 3.0 component.
  • the service component may mean an audio component, a video component, a closed caption component, or the like.
  • the service component may include one or more segments. One or more segments constituting the service component may be a DASH segment.
  • the service signaling information signaling the broadcast service may include a media presentation descriptor (MPD).
  • the media presentation descriptor may mean an MPD used throughout the specification.
  • the media presentation descriptor may provide a playback description for the service components.
  • the media presentation descriptor may include presentation attribute information related to the presentation of the service component.
  • the presentation attribute information included in the media presentation descriptor may mean the presentation attribute information described with reference to FIGS. 17 to 18. Accordingly, the drawings and related descriptions can be applied as is.
  • the presentation attribute information may include a first attribute indicating a presentation start time. This first attribute may be used to calculate a presentation start time for each segment that constitutes a service component. That is, the first attribute may be used as an anchor for calculating the presentation start time for each segment. On the other hand, in calculating the presentation start time for each segment, the aforementioned various parameters can be used.
  • the media presentation descriptor may include presentable component information.
  • the presentable component information included in the media presentation descriptor may be presentable component information described with reference to FIGS. 19 to 26. Accordingly, the drawings and related descriptions can be applied as is.
  • the presentable component information may include the aforementioned target device information, targeting property information / personalization property information, and the like.
  • the presentable component information may further include the information described in FIGS. 19 to 26.
  • the presentable component information may include the above information in the form of an attribute and / or an element.
  • presentable component information may include one or more attributes and one or more elements. These attributes and elements may be added at specific locations within the media presentation descriptor.
  • an attribute / element of the presentable component may be added to a common attribute / element in the media descriptor. Specific examples of this are shown in FIGS. 23 and 24. In another embodiment, an attribute / element of the presentable component may be added to a content component attribute in the media descriptor. Specific examples of this are shown in FIGS. 25 and 26.
  • the media presentation descriptor may include composite information.
  • the composite information included in the media presentation descriptor may be composite information described with reference to FIGS. 27 to 31.
  • Composite information may describe a composite component composed of a set of service components. Accordingly, the drawings and related descriptions can be applied as is.
  • the composite information may include specific attributes / elements described in FIGS. 27 to 31.
  • the media presentation descriptor may include transmission path information.
  • the transmission path information included in the media presentation descriptor may be transmission path information described with reference to FIGS. 32 to 39. Accordingly, the drawings and related descriptions may be applied as is.
  • the transmission path information may describe the path through which the service component is transmitted. More specifically, the transmission path information may include an attribute / element, and the attribute / element may include various information for describing the transmission path.
  • the transmission path information may include a value attribute and an identifier attribute.
  • the value attribute may indicate whether the service component is transmitted in broadcast, unicast or hybrid. Here, the hybrid refers to a case in which both broadcast and unicast are transmitted.
  • the identifier attribute may provide identification information about the service component.
  • Such transmission path information may be included in various locations within the media description.
  • the media presentation descriptor forms a hierarchical structure composed of several levels, and transmission path information can be inserted in units of these levels.
  • the level may have a hierarchical structure of a period, an adaptation set, a representation, and a sub representation. Transmission path information may be inserted in each level to indicate transmission paths of elements of the corresponding level.
  • the presentation attribute information, presentable component information, composite information, and transmission path information included in the media presentation descriptor, although described in separate embodiments, may be included together in the media presentation descriptor as long as the pieces of information do not contradict one another.
  • the module or unit may be processors that execute successive procedures stored in a memory (or storage unit). Each of the steps described in the above embodiments may be performed by hardware / processors. Each module / block / unit described in the above embodiments can operate as a hardware / processor.
  • the methods proposed by the present invention can be executed as code. This code can be written to a processor readable storage medium and thus read by a processor provided by an apparatus.
  • The apparatus and method according to the present invention are not limited to the configurations and methods of the embodiments described above, and all or some of the above-described embodiments may be selectively combined so that various modifications can be made.
  • the processor-readable recording medium includes all kinds of recording devices that store data that can be read by the processor.
  • Examples of the processor-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like, and may also be implemented in the form of a carrier wave such as transmission over the Internet.
  • the processor-readable recording medium can also be distributed over network coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.
  • the present invention can be used in a series of broadcast signal provision fields.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present invention relates to a method for transmitting a broadcast signal. The broadcast signal transmission method according to the present invention presents a system capable of supporting a next-generation broadcast service in an environment supporting next-generation hybrid broadcasting that uses a terrestrial broadcast network and an Internet network. The present invention further relates to an efficient signaling method that can cover both a terrestrial broadcast network and an Internet network in an environment supporting next-generation hybrid broadcasting.
PCT/KR2016/008118 2015-07-25 2016-07-25 Dispositif d'émission de signal de radiodiffusion, dispositif de réception de signal de radiodiffusion, procédé d'émission de signal de radiodiffusion et procédé de réception de signal de radiodiffusion WO2017018768A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201562196942P 2015-07-25 2015-07-25
US62/196,942 2015-07-25
US201562198648P 2015-07-29 2015-07-29
US62/198,648 2015-07-29

Publications (1)

Publication Number Publication Date
WO2017018768A1 true WO2017018768A1 (fr) 2017-02-02

Family

ID=57884717

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/008118 WO2017018768A1 (fr) 2015-07-25 2016-07-25 Dispositif d'émission de signal de radiodiffusion, dispositif de réception de signal de radiodiffusion, procédé d'émission de signal de radiodiffusion et procédé de réception de signal de radiodiffusion

Country Status (1)

Country Link
WO (1) WO2017018768A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8151342B2 (en) * 2005-01-26 2012-04-03 Eletronics And Telecommunications Research Institute Contents execution device equipped with independent authentication means and contents re-distribution method
US20130182643A1 (en) * 2012-01-16 2013-07-18 Qualcomm Incorporated Method and system for transitions of broadcast dash service receptions between unicast and broadcast
US20130291040A1 (en) * 2011-01-18 2013-10-31 Samsung Electronics Co. Ltd Transmission method and transmission apparatus for a combined broadcasting and communication service
US8793743B2 (en) * 2012-03-16 2014-07-29 Intel Corporation Multicast broadcast multimedia service-assisted content distribution
US20140250479A1 (en) * 2011-09-23 2014-09-04 Lg Electronics Inc. Method for receiving broadcast service and reception device thereof

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8151342B2 (en) * 2005-01-26 2012-04-03 Eletronics And Telecommunications Research Institute Contents execution device equipped with independent authentication means and contents re-distribution method
US20130291040A1 (en) * 2011-01-18 2013-10-31 Samsung Electronics Co. Ltd Transmission method and transmission apparatus for a combined broadcasting and communication service
US20140250479A1 (en) * 2011-09-23 2014-09-04 Lg Electronics Inc. Method for receiving broadcast service and reception device thereof
US20130182643A1 (en) * 2012-01-16 2013-07-18 Qualcomm Incorporated Method and system for transitions of broadcast dash service receptions between unicast and broadcast
US8793743B2 (en) * 2012-03-16 2014-07-29 Intel Corporation Multicast broadcast multimedia service-assisted content distribution

Similar Documents

Publication Publication Date Title
WO2017014586A1 (fr) Dispositif et procédé d'émission et de réception de signal de radiodiffusion
WO2016060422A1 (fr) Dispositif et procédé d'émission de signal de diffusion, dispositif et procédé de réception de signal de diffusion
WO2016140486A1 (fr) Appareil et procédé d'émission/réception de signal de diffusion
WO2016186407A1 (fr) Appareil et procédé d'émission ou de réception de signal de diffusion
WO2017007224A1 (fr) Dispositif d'émission de signal de radiodiffusion, dispositif de réception de signal de radiodiffusion, procédé d'émission de signal de radiodiffusion, et procédé de réception de signal de radiodiffusion
WO2016093576A1 (fr) Appareil de transmission de signal de radiodiffusion, appareil de réception de signal de radiodiffusion, procédé de transmission de signal de radiodiffusion, et procédé de réception de signal de radiodiffusion
WO2016111526A1 (fr) Dispositif d'émission de signal de radiodiffusion, dispositif de réception de signal de radiodiffusion, procédé d'émission de signal de radiodiffusion, et procédé de réception de signal de radiodiffusion
WO2016126116A1 (fr) Dispositif d'émission de signal de diffusion, dispositif de réception de signal de diffusion, procédé d'émission de signal de diffusion et procédé de réception de signal de diffusion
WO2017209514A1 (fr) Dispositif et procédé d'émission et de réception de signal de diffusion
WO2016076569A1 (fr) Appareil de transmission de signaux de diffusion, appareil de réception de signaux de diffusion, procédé de transmission de signaux de diffusion, et procédé de réception de signaux de diffusion
WO2017204546A1 (fr) Dispositif et procédé d'émission/réception de signaux de diffusion
WO2017061792A1 (fr) Dispositif et procédé d'émission/réception de signal de diffusion
WO2016060416A1 (fr) Dispositif d'émission d'un signal de diffusion, dispositif de réception d'un signal de diffusion, procédé d'émission d'un signal de diffusion, et procédé de réception d'un signal de diffusion
WO2018101566A1 (fr) Dispositif et procédé d'émission/réception de signal de radiodiffusion
WO2016068564A1 (fr) Appareil et procédé d'émission de signal de diffusion, appareil et procédé de réception de signal de diffusion
WO2016122269A1 (fr) Dispositif de transmission de signaux de radiodiffusion, dispositif de réception de signaux de radiodiffusion, procédé de transmission de signaux de radiodiffusion et procédé de réception de signaux de radiodiffusion
WO2016178549A1 (fr) Dispositif d'émission de signaux de diffusion, dispositif de réception de signaux de diffusion, procédé d'émission de signaux de diffusion, et procédé de réception de signaux de diffusion
WO2016190662A1 (fr) Dispositif de transmission de signal de diffusion, dispositif de réception de signal de diffusion, procédé de transmission de signal de diffusion, et procédé de réception de signal de diffusion
WO2016129904A1 (fr) Appareil d'émission de signal de radiodiffusion, appareil de réception de signal de radiodiffusion, procédé d'émission de signal de radiodiffusion, et procédé de réception de signal de radiodiffusion
WO2016114638A1 (fr) Appareil de transmission de signal de radiodiffusion, appareil de réception de signal de radiodiffusion, procédé de transmission de signal de radiodiffusion, et procédé de réception de signal de radiodiffusion
WO2017164595A1 (fr) Dispositif et procédé d'émission/réception de signal de diffusion
WO2016190720A1 (fr) Dispositif d'émission et de réception de signal de radiodiffusion, et procédé d'émission et de réception de signal de radiodiffusion
WO2016072725A1 (fr) Dispositif de transmission de signal de radiodiffusion, dispositif de réception de signal de radiodiffusion, procédé de transmission de signal de radiodiffusion, et procédé de réception de signal de radiodiffusion
WO2017026714A1 (fr) Dispositif d'émission de signal de radiodiffusion, dispositif de réception de signal de radiodiffusion, procédé d'émission de signal de radiodiffusion, et procédé de réception de signal de radiodiffusion
WO2017014553A1 (fr) Appareil d'émission de signal de diffusion, appareil de réception de signal de diffusion, procédé d'émission de signal de diffusion et procédé de réception de signal de diffusion

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16830801

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16830801

Country of ref document: EP

Kind code of ref document: A1