
WO2025215340A1 - Methods and drivers for providing media data from a media file to a media editing application - Google Patents

Methods and drivers for providing media data from a media file to a media editing application

Info

Publication number
WO2025215340A1
Authority
WO
WIPO (PCT)
Prior art keywords
media
driver
data
media data
remote
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/GB2025/050702
Other languages
French (fr)
Inventor
James Westland CAIN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Media Anywhere Intellectual Property BV
Original Assignee
Media Anywhere Intellectual Property BV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Media Anywhere Intellectual Property BV filed Critical Media Anywhere Intellectual Property BV
Publication of WO2025215340A1

Classifications

    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10File systems; File servers
    • G06F16/17Details of further file system functions
    • G06F16/176Support for shared access to files; File sharing support
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/11Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/835Generation of protective data, e.g. certificates
    • H04N21/8352Generation of protective data, e.g. certificates involving content or source identification data, e.g. Unique Material Identifier [UMID]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/835Generation of protective data, e.g. certificates
    • H04N21/8358Generation of protective data, e.g. certificates involving watermark
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • H04N21/8456Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments

Definitions

  • the present invention concerns methods and drivers for providing media data from a media file to a media editing application. More particularly, but not exclusively, the invention concerns drivers that provide media data to a media editing application from a remote media server.
  • Media editing applications are used to edit video and sound for film, television and the like.
  • the files of media data used by such media editing applications can be very large, often many GB or more in size.
  • Media editing applications need to be able to access large amounts of data from these media files quickly.
  • a common task performed by a user when using a media editing application is to search through a media file for a particular part it contains. This is done by displaying the media file and moving the timeline marker to quickly view frames of video from different parts of the media file. This requires the media editing application to quickly obtain large amounts of data from different parts of the media file, so that the frames can be displayed without excessive delay to the user.
  • media editing applications are generally arranged to use only files that are stored locally.
  • media files that are stored remotely.
  • footage for a sporting event may be recorded at a location remote from the location where the media editing application is used.
  • files can be very large, say multiple TB for footage of a sporting event, obtaining the files can be very time-consuming.
  • the present invention seeks to mitigate the above-mentioned problems. Alternatively or additionally, the present invention seeks to provide improved methods of providing media data from a media file to a media editing application, improved drivers for media editing applications, and improved media editing systems.
  • a media editing application is arranged to: receive a plurality of drivers, each driver being associated with a media file type; and request media data from a media file of a specified media file type using the driver associated with the specified media file type; and wherein the media editing application comprises a driver associated with a media file type, wherein each media file of the media file type is arranged to indicate a remote media server from which media data for the media file can be obtained; the method comprising the steps of: the media editing application requesting, from the driver, first media data from a media file; the driver requesting, from the remote media server indicated by the media file, the first media data; the driver receiving, from the remote media server, the first media data; and the driver sending, to the media editing application, the first media data.
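The driver arrangement described above can be sketched in code. Everything here, the class names, the method names, and the idea of a `fetch` callable standing in for the network request, is an illustrative assumption rather than part of the claimed method:

```python
# Hypothetical sketch of the driver/plugin arrangement described above.
# All class and method names are illustrative assumptions.

class MediaDriver:
    """Base interface a media editing application might expect of a driver."""
    file_extension = ""  # e.g. ".mp4" -- the media file type the driver handles

    def read_frame(self, path, frame_number):
        """Return media data (bytes) for one frame of the media file."""
        raise NotImplementedError


class RemoteDriver(MediaDriver):
    """Driver for a file type whose files indicate a remote media server.

    fetch(url) stands in for the network request to the remote media server.
    """
    file_extension = ".mea"

    def __init__(self, fetch):
        self.fetch = fetch

    def read_frame(self, path, frame_number):
        # The local file only names the server; the media data itself
        # is requested from, and returned by, the remote media server.
        with open(path) as f:
            url = f.read().strip()
        return self.fetch(f"{url}/{frame_number}")


class EditingApplication:
    """The application selects a driver purely by file extension."""
    def __init__(self, drivers):
        self.drivers = {d.file_extension: d for d in drivers}

    def request_frame(self, path, frame_number):
        ext = path[path.rfind("."):]
        return self.drivers[ext].read_frame(path, frame_number)
```

The point of the sketch is that `EditingApplication` picks a driver by file extension alone; whether `RemoteDriver` satisfies the request locally or over the network is invisible to it.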
  • Media editing applications are commonly able to receive new drivers, to allow them to work with media files of new media file types.
  • a media file type the media files of which are arranged to indicate a remote media server from which media data for the media file can be obtained
  • a media file that is stored locally can be used by the media editing application to obtain media data in the usual way, i.e. as a locally stored media file.
  • the driver for the media file type can in practice obtain any media data that the media editing application requests from the remote media server.
  • because the media editing application merely identifies the locally stored media file, and then requests and receives media data from the media file using the driver, the fact that the media data returned by the driver was obtained from a remote location is invisible to the media editing application.
  • the media editing application can obtain media data that is stored remotely, using only functionality intended for use with media files that are stored locally.
  • the media editing application can also be immediately provided with the media data it requests, without it being necessary to copy the entirety of the media data for the media file so that it is stored locally before the media file can be used by the media editing application.
  • where a clip is created that uses media data stored on the remote media server, it can be used by a second media editing application at a different location, with the second media editing application again obtaining the media data from the remote media server without the entirety of the underlying media files needing to be obtained.
  • the method may further comprise the steps of: the media editing application requesting, from the driver, second media data from the media file; the driver requesting, from the remote media server, the second media data, wherein the second media data is requested prior to the driver receiving the first media data from the remote media server; the driver receiving, from the remote media server, the second media data; and the driver sending, to the media editing application, the second media data.
  • the remote media server is indicated by the media file using a Uniform Resource Identifier (URI).
  • the URI may be a Uniform Resource Locator (URL).
  • the driver may communicate with the remote media server using a protocol that allows multiple concurrent requests for data to be made.
  • the driver may communicate with the remote media server using an HTTP protocol.
  • the HTTP protocol may be HTTP/2 or HTTP/3.
  • HTTP/2 and HTTP/3 allow multiple streams of data to be sent concurrently, rather than requiring them to be sent sequentially.
  • HTTP/2 and HTTP/3 run over TLS, and are generally encrypted by default, as is a requirement for authentication schemes such as OAuth 2.0 and OIDC mentioned below to operate.
  • the HTTP protocol may be a later protocol than HTTP/2 and HTTP/3.
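The multiplexing behaviour can be illustrated with a small sketch that issues both requests before either has completed. `fetch_frame` is a stand-in assumption; a real driver would use an HTTP/2 or HTTP/3 client so that the concurrent requests share one multiplexed connection:

```python
# Sketch of issuing a second request before the first has completed,
# as HTTP/2/HTTP/3 multiplexing permits. fetch_frame is a stand-in for
# the driver's network call; its name and behaviour are assumptions.
from concurrent.futures import ThreadPoolExecutor

def fetch_frame(server, frame_number):
    # In a real driver this would be a multiplexed HTTP request
    # to the remote media server.
    return f"{server}:frame:{frame_number}"

def request_frames_concurrently(server, frame_numbers):
    # All requests are in flight at once; none waits for another
    # to complete before being sent.
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(fetch_frame, server, n) for n in frame_numbers]
        return [f.result() for f in futures]
```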
  • the method may further comprise the step of the driver storing the first media data in a local cache.
  • the method may further comprise the steps of: the media editing application requesting, from the driver, cached media data that is stored in the local cache; the driver obtaining, from the local cache, the cached media data; and the driver sending, to the media editing application, the cached media data. In this way, media data that has been requested previously can be obtained locally, and does not need to be requested from the remote media server again.
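The caching behaviour described above can be sketched as follows, assuming a `fetch` callable that stands in for the request to the remote media server; a real driver might use an on-disk cache rather than an in-memory dictionary:

```python
# Minimal sketch of the driver-side cache described above.

class CachingDriver:
    def __init__(self, fetch):
        self.fetch = fetch  # stand-in for the remote media server request
        self.cache = {}     # in practice this could be an on-disk cache

    def read(self, key):
        # Previously requested media data is served locally, without
        # contacting the remote media server again.
        if key not in self.cache:
            self.cache[key] = self.fetch(key)
        return self.cache[key]
```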
  • the first media data may be received from the remote media server in compressed form, and the method may further comprise, prior to the driver sending the first media data to the media editing application, the step of the driver decompressing the first media data.
  • the driver may decompress the first media data using a graphics processing unit (GPU).
  • the media editing application simply receives the first media data and does not know that it was sent by the remote media server in compressed form.
  • the driver is able to use, for example, a GPU to decompress the first media data quickly.
  • the driver may use the CPU of the device on which the media editing application is running to decompress the first media data.
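The decompress-before-returning step might be illustrated as below; zlib stands in for whatever compression the remote media server actually applies (in practice a video codec, possibly decoded on a GPU), so the choice of codec here is purely an assumption:

```python
# Illustration of the decompress-before-returning step. zlib stands in
# for whatever codec the remote media server actually uses; a real
# driver might instead hand compressed frames to a GPU decoder.
import zlib

def serve_frame(compressed_frame: bytes) -> bytes:
    # The application receives plain media data; it never sees that
    # the data travelled in compressed form.
    return zlib.decompress(compressed_frame)
```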
  • the method may further comprise the step, prior to the driver receiving the first media data, of the driver authenticating with the remote media server that the media editing application is authorised to receive the first media data.
  • the remote media server can control whether the media editing application is able to receive the first media data. This can be done even where the media editing application has received the first media data on a previous occasion.
  • the authentication may be done using OAuth 2.0, OpenID Connect (OIDC), or any other desired authentication scheme.
  • the first media data may be received from the remote media server in encrypted form, and the method may further comprise, prior to the driver sending the first media data to the media editing application, the step of the driver decrypting the first media data.
  • the decryption of the first media data can be done by the driver invisibly to the media editing application.
  • the cached media data may be stored in encrypted form, and may be decrypted and sent to the media editing application by the driver only if the driver authenticates with the remote media server that the media editing application is authorised to receive the first media data. In this way, the remote media server can also control whether the media editing application is able to receive cached media data.
  • the method may further comprise the step, prior to the driver sending the first media data to the media editing application, of the driver watermarking the first media data using identification data.
  • the method may further comprise the steps of: the driver sending identification data to the remote media server; and the remote media server watermarking the first media data using the identification data prior to sending the first media data to the driver.
  • the first media data can be watermarked with identification data, for example to indicate the user of the media editing application who requested the first media data.
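As a toy illustration of watermarking with identification data, the sketch below embeds an identifier into the least significant bits of media bytes. Real watermarking schemes are far more robust; every detail here is an assumption made purely for illustration:

```python
# Toy least-significant-bit watermark. Not a real watermarking scheme;
# for illustration of embedding identification data only.

def watermark(frame: bytes, ident: bytes) -> bytes:
    out = bytearray(frame)
    for i, byte in enumerate(ident):
        for bit in range(8):
            pos = i * 8 + bit
            if pos >= len(out):
                return bytes(out)
            # overwrite the LSB of each media byte with one bit of the ID
            out[pos] = (out[pos] & 0xFE) | ((byte >> bit) & 1)
    return bytes(out)

def extract(frame: bytes, ident_len: int) -> bytes:
    ident = bytearray(ident_len)
    for i in range(ident_len):
        for bit in range(8):
            ident[i] |= (frame[i * 8 + bit] & 1) << bit
    return bytes(ident)
```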
  • the method may further comprise the steps of: the media editing application requesting, from the driver, metadata for the media file; the driver requesting, from the remote media server, metadata for the media file; the driver receiving, from the remote media server, the metadata for the media file; and the driver sending, to the media editing application, the metadata for the media file.
  • metadata for the media file can be provided by the remote media server.
  • the remote media server can dynamically change the metadata. For example, where the remotely stored media data relates to a sporting event that is being recorded live, the remote media server can adjust the metadata to reflect the amount of media data that is available.
  • the remote media server may receive media data for the media file subsequent to the remote media server sending the first media data to the driver.
  • the method may further comprise the step, prior to the driver requesting the first media data from the remote media server, of the driver determining a quality for the first media data; wherein the request from the driver for the first media data from the remote media server includes the determined quality; and wherein the remote media server sends the first media data to the driver at the determined quality.
  • the media editing application can be provided with media data at different qualities, for example at different compressions/bitrates, on different occasions.
  • the quality may be determined based on network performance, or any other desired conditions or combination of conditions.
  • when the media editing application is being used in a remote location with a poor Internet connection, the media data can be sent at a low quality/bitrate (high compression), whereas when a better connection is available, a higher quality of media data can be sent.
  • the same media data can be requested by the media editing application, and so in each case the media editing application is from its point of view editing the “same” media data, with the difference in quality being handled by the driver and invisible to the media editing application.
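The quality decision might be sketched as below; the quality ladder and its thresholds are invented for illustration, and a real driver could use any measure of network performance, or any other conditions, to make the choice:

```python
# Sketch of the driver's quality decision. The ladder of qualities and
# the thresholds are invented for illustration.

QUALITY_LADDER = [  # (minimum bandwidth in Mbit/s, quality label)
    (50.0, "high"),
    (10.0, "medium"),
    (0.0, "low"),
]

def choose_quality(measured_mbps: float) -> str:
    # Pick the highest quality the measured network performance supports.
    for minimum, quality in QUALITY_LADDER:
        if measured_mbps >= minimum:
            return quality
    return "low"
```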
  • the first media data may be one of: an image; a frame; a Group of Pictures; an audio frame; and an audio clip.
  • a driver for a media editing application arranged, in response to a request from a media editing application for first media data from a media file, to: request, from a remote media server indicated by the media file, the first media data; receive, from the remote media server, the first media data; and send the first media data to the media editing application.
  • the driver may be further arranged, in response to a request from a media editing application for second media data from a media file, to: request, from the remote media server, the second media data, wherein the driver is arranged to request the second media data prior to the driver receiving the first media data from the remote media server.
  • the remote media server may be indicated by the media file using a Uniform Resource Identifier.
  • the driver may be arranged to communicate with the remote media server using a protocol that allows multiple concurrent requests to be made.
  • the driver may communicate with the remote media server using an HTTP protocol.
  • the HTTP protocol may be HTTP/2 or HTTP/3.
  • the driver may be further arranged to store the first media data in a local cache.
  • the driver may be further arranged, in response to a request from the media editing application for media data that is stored in the local cache, to: obtain the media data from the local cache; and send it to the media editing application.
  • the driver may be further arranged to: receive the first media data from the remote media server in compressed form; and prior to sending the first media data to the media editing application, decompress the first media data.
  • the driver may be arranged to decompress the first media data using a graphics processing unit.
  • the driver may be further arranged, prior to receiving the first media data, to authenticate with the remote media server that it is authorised to receive the first media data.
  • the driver may be further arranged to: receive the first media data from the remote media server in encrypted form; and, prior to sending the first media data to the media editing application, decrypt the first media data.
  • the driver may be further arranged to send identification data to the remote media server.
  • the driver may be further arranged, in response to a request from a media editing application for metadata from a media file, to: request, from a remote media server indicated by the media file, metadata for the media file; receive, from the remote media server, the metadata for the media file; and send the metadata for the media file to the media editing application.
  • the metadata may be descriptive of media data for the media file that is available to the remote media server, for example it may include an indication of the duration of media data available.
  • the metadata may indicate a duration of media data that is not yet available to the remote media server but is expected to be, for example where the media data is in the process of being uploaded to the remote media server.
  • the remote media server can make available media data for a media file for an event such as a sporting event while the event is still in progress, and so while media data for the media file is still being generated by recording the event.
  • the first media data may be one of: an image; a frame; a Group of Pictures; an audio frame; and an audio clip.
  • a media editing system comprising: a media editing application arranged to: receive a plurality of drivers, each driver being associated with a media file type; and request media data from a media file of a specified media file type using the driver associated with the specified media file type; a driver as described above; and a media server having stored on it first media data, and arranged to: receive a request from the driver for the first media data; and send the first media data to the driver.
  • a computer program product arranged, when executed on a computing system comprising one or more processors and memory, to cause the computing system to provide a driver as described above.
  • Figure 1 is a schematic diagram of a media editing system in accordance with embodiments of the invention.
  • Figure 2 is a flow chart showing the operation of the media editing system when a media file is first opened;
  • Figure 3 is a flow chart showing the operation of the media editing system when media data from the media file is requested;
  • Figure 4 is a flow chart showing the operation of the media editing system of an embodiment in which media data is locally cached;
  • Figure 5 is a flow chart showing the operation of the media editing system of an embodiment in which media data is sent by the remote media server in compressed form;
  • Figure 6 is a flow chart showing the operation of the media editing system of an embodiment in which authentication is performed;
  • Figure 7 is a flow chart showing the operation of the media editing system of an embodiment in which media data is watermarked;
  • Figure 8 is a flow chart showing the operation of the media editing system of an embodiment in which media data is requested at a determined quality; and
  • Figure 9 is a schematic diagram of a computing device in accordance with embodiments.
  • Figure 1 is a schematic diagram of the media editing system 100.
  • the media editing system 100 comprises a media editing application 101, and local storage 102 (i.e. local to the media editing system 100) on which media files are stored.
  • the media editing application 101 is a conventional, known media editing application, provided on, for example, a personal computer, for use editing media files including the media files stored on the local storage 102.
  • the media editing application 101 is in communication with drivers 111 and 112, which are in turn in communication with the local storage 102.
  • the drivers 111 and 112 allow the media editing application 101 to work with media files of different file types, with the drivers having the required functionality to extract media data from the media files and provide it in the form required by the media editing application 101.
  • the driver 111 is used for media files of type MPEG-4.
  • when the media editing application 101 requires a particular frame from a media file of type MPEG-4, it requests it from the driver 111.
  • the driver 111 obtains the required frame from the media file and passes it back to the media editing application 101 in the form it requires.
  • the driver 112 provides similar functionality, but for files of type AVI. It will be appreciated that it is often desirable for a media editing application to be able to use files of a multiplicity of different file types, and so multiple drivers may be provided to allow this.
  • the drivers 111 and 112 allow the media editing application 101 to work with media files of different file types, without the media editing application 101 needing to have any knowledge of the file types itself, other than the file extension with which they can be identified (e.g. .mp4 for MPEG-4, .avi for AVI), and the corresponding driver to use.
  • the media editing application 101 can be provided with new drivers that provide the functionality required to work with media files of new file types.
  • new drivers are written in a language such as C++, and provided as a file that can be added to the media editing application
  • Such drivers are often referred to as “plugins”.
  • the media editing application 101 is also in communication with a further driver 121.
  • the driver 121 is not in communication with local storage. Instead, the driver 121 is in communication with the Internet 131.
  • the driver 121 is used for media files with the extension .mea, though it will be appreciated that in other embodiments other extensions could be used. The operation of the driver 121 is described in detail below.
  • the driver 121 is also in communication with a local cache 122, and a Graphics Processing Unit (GPU) 103.
  • the media editing system 100 also comprises a remote media server 141 that is in communication with the Internet 131.
  • the remote media server 141 is also in communication with remote storage 142, on which media files are stored.
  • the remote storage 142 is remote from the media editing application 101, but is local to the remote media server 141.
  • the remote media server 141 provides a REST API using which the driver 121 can request media data, amongst other things, as discussed in detail below. However, in other embodiments the remote media server 141 can communicate with the driver 121 using any other desired communication scheme.
  • the media file being opened is a file video1.mea that is stored on the local storage 102.
  • the file video1.mea is a file in JSON format, and contains only a URL, https://remoteserver.com/video1, where remoteserver.com is the domain name of the remote media server 141.
  • the file video1.mea does not contain any media data, or any metadata for any media data. Instead, the media data for the file video1.mea is stored on the remote storage 142 of the remote media server 141.
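A minimal sketch of reading such a file follows. The source says only that the file is in JSON format and contains a URL, so the exact JSON shape and key name used here are assumptions:

```python
# Sketch of reading a .mea-style file. The "url" key is a hypothetical
# assumption; the source only says the file is JSON containing a URL.
import json

def read_media_url(mea_text: str) -> str:
    document = json.loads(mea_text)
    return document["url"]
```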
  • the media editing application 101 sends a request to the driver 121 to open the file video1.mea (step 201). This can occur, for example, because a user, using the file opening functionality of the media editing application 101, has requested that the file video1.mea stored on the local storage 102 is opened.
  • the driver 121 then obtains the URL from the file video1.mea (step 202), i.e. https://remoteserver.com/video1, and uses this URL to send an HTTP request over the Internet 131 (step 203).
  • the HTTP request uses the domain name remoteserver.com for the remote media server 141, and so is passed to the remote media server 141.
  • the remote media server 141 interprets this as a request for metadata relating to the media data it has for the file video1.mea stored on the remote storage 142.
  • the remote media server 141 obtains the metadata for the media data for the file video1.mea from the remote storage 142 (step 204), and sends it to the driver 121 (step 205).
  • This metadata can include, for example, the duration of the media data, the frame rate, the size of the frames of the media data, and any other desirable metadata.
  • the driver 121 then sends the metadata received from the remote media server 141 to the media editing application 101 (step 206).
  • the behaviour of the driver 121 when asked to open a media file is the same as that of the drivers 111 and 112 when asked to open a media file stored on the local storage 102.
  • the file video1.mea appears to be a media file containing media data that is stored on the local storage 102. This is despite the fact that the media file that is opened, video1.mea, does not actually include any media data or metadata, and instead the media data and metadata for the file video1.mea is stored remotely on the remote storage 142 of the remote media server 141.
  • the media editing application 101 sends a request for first media data from the file video1.mea to the driver 121 (step 301), in the present example this first media data being the frame number 1234 of the media file. This can occur, for example, because the file video1.mea is being viewed in the media editing application 101, and the user moves the timeline marker to the position corresponding to frame 1234, and so the media editing application 101 needs to display frame 1234.
  • the driver 121 sends a request for frame 1234 for the file video1.mea to the remote media server 141 (step 302). It does this by sending an HTTP request using the URL from the media file with the frame number 1234 appended to the path. Because of the number 1234 at the end of the path, this request is interpreted by the remote media server 141 as a request for frame 1234 of the media data it has for the file video1.mea stored on the remote storage 142.
  • the remote media server 141 obtains frame 1234 for the file video1.mea from the remote storage 142 (step 303), sends it to the driver 121 (step 304), which in turn sends it to the media editing application 101 (step 305). In this way, to the media editing application 101 it appears as if frame 1234 has been obtained from the file video1.mea stored on the local storage 102, even though in practice it has been obtained from the remote media server 141.
  • the media editing application 101 sends a request for second media data from the file video1.mea to the driver 121 (step 306), in the present example this second media data being the subsequent frame number 1235 of the media file.
  • the driver 121 then sends a request for frame 1235 to the remote media server 141 (step 307), in this case by sending an HTTP request using the same URL with the frame number 1235 at the end of the path, which is interpreted by the remote media server 141 as a request for frame 1235 of the file video1.mea.
  • the driver 121 makes the request for frame 1235 before the request for frame 1234 has been satisfied. It is able to do this because the HTTP request is done using HTTP/2, which allows multiple streams of data to be sent concurrently.
  • the remote media server 141 then obtains frame 1235 for the file video1.mea from the remote storage 142 (step 308), sends it to the driver 121 (step 309), which in turn sends it to the media editing application 101 (step 310).
  • Frame 1235 could even be received by the media editing application 101 before frame 1234, if it was requested quickly enough following frame 1234 and was smaller than frame 1234, so took less time to be received in its entirety by the driver 121. This is the case even if the latency of the communication between the driver 121 and the remote media server 141 is high.
  • the driver 121 may retain received media data if necessary, so that it is always returned to the media editing application 101 in the order requested.
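The retain-and-reorder behaviour can be sketched as a small buffer: responses may arrive in any order, but media data is handed back in the order it was requested. All names here are illustrative:

```python
# Sketch of the driver retaining out-of-order responses so that media
# data is always returned in the order it was requested.

class ReorderBuffer:
    def __init__(self):
        self.requested = []   # frame numbers, in request order
        self.received = {}    # frame number -> media data, held until deliverable

    def request(self, frame_number):
        self.requested.append(frame_number)

    def deliver(self, frame_number, data):
        """Record an arrived response; return frames now deliverable in order."""
        self.received[frame_number] = data
        ready = []
        while self.requested and self.requested[0] in self.received:
            ready.append(self.received.pop(self.requested.pop(0)))
        return ready
```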
  • Bottleneck Bandwidth and Round-trip propagation time (BBR) congestion control may be used to give improved performance when receiving media data.
  • where the TCP window size would limit the rate at which media data can be received, multiple TCP connections can be opened to allow the rate at which media data can be received to be increased.
  • Characteristics such as the bandwidth-delay product of the data link between the remote media server 141 and the media editing application 101 can be used to determine the parameters for the communication, such as the number of TCP connections to open. (The bandwidth-delay product is the product of the data link’s capacity in bits per second and round-trip delay in seconds, and acts as a measure of the amount of data that is “stored” in the data link while being transmitted.)
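The bandwidth-delay product calculation described above is simple enough to show directly:

```python
# Worked example of the bandwidth-delay product: link capacity in bits
# per second times round-trip delay in seconds, divided by 8 for bytes.

def bandwidth_delay_product(bits_per_second: float, rtt_seconds: float) -> float:
    """Bytes 'stored' in the data link while being transmitted."""
    return bits_per_second * rtt_seconds / 8

# e.g. a 100 Mbit/s link with 80 ms round-trip delay holds about 1 MB
# in flight, which is data the receiver must be prepared to absorb.
```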
  • the driver 121 performs local caching of received media data, as explained below.
  • the media editing application 101 sends a request for media data from the file video1.mea to the driver 121 (step 401), in the present example this first media data being the frame number 3456 of the media file.
  • the driver 121 checks whether the requested frame 3456 is stored in the local cache 122 (step 402), as can occur under the circumstances explained below. In the present example, frame 3456 is not present in the local cache 122.
  • the driver 121 requests the frame from the remote media server 141 (step 403), the remote media server 141 obtains the frame from the remote storage 142 (step 404), and the remote media server 141 sends the frame to the driver 121 (step 405).
  • the driver stores the frame in the local cache 122 (step 406), before sending it to the media editing application 101 (step 408) in the usual way.
  • the driver 121 obtains media data from the remote media server 141 and sends it to the media editing application 101 as in the previous embodiment, but also stores any obtained media data in the local cache 122.
  • the media data may be stored in the local cache 122 subsequently to, or in parallel with, it being sent to the media editing application 101.
  • when the requested frame is already stored in the local cache 122, the operation of the driver 121 is different. This will occur when the frame 3456 has been requested on a previous occasion, and as part of that request stored in the local cache 122.
  • in that case, the driver 121 obtains the frame from the local cache 122 (step 407), and then returns the frame to the media editing application 101 as before. In this way, the need to obtain previously requested media data from a remote location is avoided.
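The caching behaviour in steps 402 to 408 can be sketched as a read-through cache; `fetch_remote` stands in for the request to the remote media server 141, and all names here are illustrative:

```python
class CachingDriver:
    """Sketch of the driver's local cache 122: frames fetched from the
    remote media server are cached, and repeat requests are served locally."""

    def __init__(self, fetch_remote):
        self._fetch_remote = fetch_remote   # stand-in for steps 403-405
        self._cache = {}                    # frame number -> frame bytes

    def get_frame(self, frame_number: int) -> bytes:
        if frame_number in self._cache:         # step 402: check the cache
            return self._cache[frame_number]    # step 407: serve from cache
        frame = self._fetch_remote(frame_number)
        self._cache[frame_number] = frame       # step 406: store locally
        return frame                            # step 408: return to application

calls = []
driver = CachingDriver(lambda n: calls.append(n) or b"frame-%d" % n)
driver.get_frame(3456)       # fetched remotely
driver.get_frame(3456)       # served from the local cache
print(calls)                 # [3456] -- only one remote request was made
```

The second call returns the same frame without any remote traffic, which is the benefit the embodiment describes.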
  • media data is sent from the remote media server 141 to the driver 121 in compressed form, as explained below.
  • the media editing application 101 sends a request for media data from the file video1.mea to the driver 121 (step 501), in the present example the first media data being the frame number 4567 of the media file.
  • the driver 121 requests the frame from the remote media server 141 (step 502), the remote media server 141 obtains the frame from the remote storage 142 (step 503), and the remote media server 141 sends the frame to the driver 121 (step 504).
  • Frame 4567 is sent to the driver 121 in compressed form.
  • Frame 4567 may be compressed by the remote media server 141 prior to sending it to the driver 121 , or alternatively may be stored on the remote storage 142 in compressed form.
  • the driver 121 decompresses frame 4567 using the GPU 103 (step 505), and then sends the decompressed frame to the media editing application 101 (step 506).
  • media data can be sent from the remote media server 141 to the driver 121 more quickly, as it is in compressed form so requires less data.
  • the media data must then be decompressed by the driver 121, which takes time.
  • the driver 121 uses the GPU 103 to decompress the file which, due to the processing power and specialised nature of the GPU 103, can be done very quickly, so no problematic delay is introduced in providing the media data to the media editing application 101 due to it being sent by the remote media server 141 in compressed form.
  • the fact that the media data is sent in compressed form is invisible to the media editing application 101 , which simply receives the required media data from the driver 121 in the uncompressed form it requires.
  • the media data may alternatively or additionally be sent from the remote media server 141 to the driver 121 in encrypted form, and decrypted prior to being sent to the media editing application 101.
  • the use of the GPU 103 to decompress the media data is just one possibility, and in other embodiments the driver 121 may use other local resources to decompress the media data.
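The decompress-before-forwarding behaviour can be sketched as follows; zlib stands in for the GPU-based video decoder, purely for illustration:

```python
import zlib

def server_send(frame: bytes) -> bytes:
    """Remote media server 141 side: the frame crosses the network
    in compressed form, so less data is sent."""
    return zlib.compress(frame)

def driver_deliver(payload: bytes) -> bytes:
    """Driver 121 side: decompress before handing the frame to the
    media editing application, which only ever sees uncompressed data."""
    return zlib.decompress(payload)

frame = bytes(16) * 4096               # a dummy, highly compressible frame
wire = server_send(frame)
assert driver_deliver(wire) == frame   # the application sees the original
assert len(wire) < len(frame)          # less data crossed the network
```

From the application's point of view nothing changed: it receives exactly the bytes it asked for, while the network carried a smaller payload.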
  • the media editing application 101 sends a request for media data from the file video1.mea to the driver 121 (step 601), in the present example this first media data being the frame number 5678 of the media file.
  • the driver 121 and the remote media server 141 perform authentication (step 602), i.e. confirm that the driver 121 is allowed to receive media data for the file video1.mea from the remote media server 141.
  • This can be done using any desired authentication system, for example OAuth 2.0 or OIDC.
  • the remote media server 141 may be authenticating that the user of the media editing application 101 is allowed to receive the media data, using credentials of the user used by the driver 121 for the authentication.
  • the driver 121 requests frame 5678 from the remote media server 141 (step 604), the remote media server 141 obtains the frame from the remote storage 142 (step 605), the remote media server 141 sends the frame to the driver 121 (step 606), and the driver 121 sends the frame to the media editing application 101 (step 607).
  • if the authentication fails, the driver 121 will be unable to obtain the requested media data from the remote media server 141.
  • in that case, instead of returning any media data, the driver 121 sends a failure message to the media editing application 101, to indicate that the requested media data, frame 5678, is not available.
  • the authentication is done when media data is requested; in other embodiments it could alternatively or additionally be done when the file video1.mea is first opened (or attempted to be opened) and its metadata requested.
  • the driver 121 could perform authentication with the remote media server 141 prior to sending the cached media data to the media editing application 101. This allows the remote media server 141 to control the use of media data even when it has been sent previously and locally cached, particularly if the media data is stored in the local cache 122 in encrypted form, so needs to be decrypted by the driver 121 before it can be used.
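A minimal sketch of the authenticate-then-fetch flow and the failure message is given below; the token check stands in for a real OAuth 2.0 / OIDC exchange, and every name here is an assumption:

```python
VALID_TOKENS = {"token-for-alice"}   # would be issued by a real OAuth 2.0 / OIDC flow

def remote_server(token: str, frame_number: int):
    """Stand-in remote media server 141: serves media data only when
    the presented token authenticates successfully."""
    if token not in VALID_TOKENS:
        return None
    return b"frame-%d" % frame_number

def driver_request(token: str, frame_number: int) -> dict:
    """Driver 121: on authentication failure, return a failure message
    to the media editing application instead of media data."""
    frame = remote_server(token, frame_number)
    if frame is None:
        return {"error": "requested media data is not available",
                "frame": frame_number}
    return {"frame": frame_number, "data": frame}

print(driver_request("token-for-alice", 5678)["data"])  # b'frame-5678'
print(driver_request("bad-token", 5678)["error"])
```

The application never sees the authentication exchange itself, only the media data or the failure message.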
  • the media editing application 101 sends a request for media data from the file video1.mea to the driver 121 (step 701), in the present example this first media data being the frame number 5678 of the media file.
  • the driver 121 requests frame 5678 from the remote media server 141 (step 702); the remote media server 141 obtains the frame from the remote storage 142 (step 703); and the remote media server 141 sends the frame to the driver 121 (step 704).
  • the driver 121 watermarks the frame 5678 with identification data identifying the user of the media editing application 101 (step 705). It obtains the identification data from the media editing application 101 , for example using licence details, login details, or other identifying data the user has that allows them to use the media editing application 101.
  • the step of watermarking the frame 5678 is performed subsequently to any decompression and/or decryption of the frame that is required when the frame is received from the remote media server 141 .
  • the driver 121 then sends the watermarked frame to the media editing application 101 (step 706).
  • any media data for the file video1.mea obtained by a user of the media editing application 101 is necessarily watermarked.
  • in the present embodiment, the watermarking is done by the driver 121.
  • in other embodiments, the remote media server 141 watermarks the media data prior to sending it to the driver 121, and prior to compressing and/or encrypting the media data, if appropriate.
  • the identification data could be provided at a different time, for example when the file video1.mea is first opened.
  • the identification data could identify something other than the user of the media editing application 101 , for example a serial number relating to the media editing application 101 itself, login details for the device on which the media editing application 101 is being used, a MAC address for the device on which the media editing application 101 and driver 121 are being run, or any other desired identification data.
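As an illustration of watermarking media data with identification data, the sketch below hides the bytes of an identifier in the least significant bits of a frame's samples; a production watermark would be far more robust and imperceptible, and every name here is an assumption:

```python
def watermark(frame: bytes, identification: bytes) -> bytes:
    """Embed identification data into the least significant bits of the
    frame's bytes -- a deliberately simple stand-in for a real watermark."""
    bits = [(byte >> shift) & 1 for byte in identification for shift in range(8)]
    out = bytearray(frame)
    for i, bit in enumerate(bits):          # one identification bit per byte
        out[i] = (out[i] & 0xFE) | bit
    return bytes(out)

def read_watermark(frame: bytes, length: int) -> bytes:
    """Recover `length` bytes of identification data from a watermarked frame."""
    out = bytearray()
    for i in range(length):
        byte = 0
        for shift in range(8):
            byte |= (frame[i * 8 + shift] & 1) << shift
        out.append(byte)
    return bytes(out)

wm = watermark(bytes(256), b"user42")
print(read_watermark(wm, 6))  # b'user42'
```

Changing only the least significant bit of each sample leaves the frame visually unchanged while tying it to the identified user.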
  • the media editing application 101 sends a request for media data from the file videol .mea to the driver 121 (step 801 ), in the present example this first media data being the frame number 6789 of the media file.
  • the driver 121 determines a quality at which to request the frame 6789 (step 802).
  • the determined quality may for example be a particular compression rate or bitrate, and may be determined based on current network performance, for example. Where current network performance is low, i.e. the connection between the driver 121 and the remote media server 141 is poor, a low quality may be determined; in contrast, where current network performance is high, i.e. the connection between the driver 121 and the remote media server 141 is good, a high quality may be determined.
  • the driver 121 requests frame 6789 from the remote media server 141 (step 803), but in the present embodiment the driver 121 requests frame 6789 at the determined quality.
  • the remote media server 141 then obtains the frame from the remote storage 142 at the determined quality (step 804); the remote media server 141 sends the frame (at the determined quality) to the driver 121 (step 805), and the driver 121 then sends the frame (at the determined quality) to the media editing application 101 (step 806).
  • the media editing application 101 can be provided with media data of a quality that is appropriate to the network connection to the remote media server 141 .
  • For example, when the network connection is poor, low quality data that is smaller in size can be sent, whereas when the network connection is good, high quality data that is larger in size can be sent.
  • from the point of view of the media editing application 101, it is receiving the “same” media data, with the difference in quality being handled by the driver 121 invisibly to the media editing application 101.
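Step 802 — determining a quality from current network performance — might be sketched as below; the thresholds, tier names, and URL scheme are illustrative assumptions, not part of the described system:

```python
def determine_quality(measured_mbps: float) -> str:
    """Map measured network throughput to a quality tier (step 802)."""
    if measured_mbps >= 50:
        return "high"    # lightly compressed, larger frames
    if measured_mbps >= 10:
        return "medium"
    return "low"         # heavily compressed, smaller frames

def frame_request_url(base: str, frame_number: int, quality: str) -> str:
    """Hypothetical URL scheme for requesting a frame at a given quality."""
    return f"{base}/frames/{frame_number}?quality={quality}"

print(determine_quality(100))   # high
print(frame_request_url("https://example.invalid/video1.mea", 6789,
                        determine_quality(4)))
```

The application asks for frame 6789 either way; only the quality parameter chosen by the driver differs between a good and a poor connection.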
  • Embodiments of the invention include at least some of the methods described above performed on a computing device, such as the computing device 1100 shown in Figure 9.
  • the computing device 1100 comprises a data interface 1101 , through which data can be sent or received, for example over a network.
  • the computing device 1100 further comprises a processor 1102 in communication with the data interface 1101 , and memory 1103 in communication with the processor 1102.
  • the computing device 1100 can receive data, such as media data, via the data interface 1101 , and the processor 1102 can store the received data in the memory 1103, and process it so as to perform the methods described herein.
  • At least some of the methods described herein may be performed by a computing system comprising one or more such computing devices 1100.
  • Each device, module, component, machine or function as described in relation to any of the examples described herein may comprise a processor and/or processing system or may be comprised in apparatus comprising a processor and/or processing system.
  • One or more aspects of the embodiments described herein comprise processes performed by apparatus.
  • the apparatus comprises one or more processing systems or processors configured to carry out these processes.
  • embodiments may be implemented at least in part by computer software stored in (non-transitory) memory and executable by the processor, or by hardware, or by a combination of tangibly stored software and hardware (and tangibly stored firmware).
  • Embodiments also extend to computer programs, particularly computer programs on or in a carrier, adapted for putting the above described embodiments into practice.
  • the program may be in the form of non-transitory source code, object code, or in any other non-transitory form suitable for use in the implementation of processes according to embodiments.
  • the carrier may be any entity or device capable of carrying the program, such as a RAM, a ROM, or an optical memory device, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Security & Cryptography (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

A method of providing media data from a media file to a media editing application, where the media editing application is arranged to receive a plurality of drivers, each driver being associated with a media file type, and request media data from a media file of a specified media file type using the driver associated with the specified media file type. The media editing application comprises a driver associated with a media file type, wherein each media file of the media file type is arranged to indicate a remote media server from which media data for the media file can be obtained. The method comprises the steps of: the media editing application requesting, from the driver, first media data from a media file; the driver requesting, from the remote media server indicated by the media file, the first media data; the driver receiving, from the remote media server, the first media data; and the driver sending, to the media editing application, the first media data.

Description

Methods and drivers for providing media data from a media file to a media editing application
Field of the Invention
The present invention concerns methods and drivers for providing media data from a media file to a media editing application. More particularly, but not exclusively, the invention concerns drivers that provide media data to a media editing application from a remote media server.
Background of the Invention
Media editing applications are used to edit video and sound for film, television and the like. The files of media data used by such media editing applications can be very large, often many GB or more in size. Media editing applications need to be able to access large amounts of data from these media files quickly. For example, a common task performed by a user when using a media editing application is to search through a media file for a particular part it contains. This is done by displaying the media file and moving the timeline marker to quickly view frames of video from different parts of the media file. This requires the media editing application to quickly obtain large amounts of data from different parts of the media file, so that the frames can be displayed without excessive delay to the user.
In order to be able to access data from the media files sufficiently quickly, media editing applications are generally arranged to use only files that are stored locally. However, it can be desirable to use media files that are stored remotely. For example, footage for a sporting event may be recorded at a location remote from the location where the media editing application is used. It may also be desirable to allow multiple users to use the same video files. In order to allow this, conventionally it is necessary for each user to obtain a local copy of the video files before they can use them for editing. As such files can be very large, say multiple TB for footage of a sporting event, obtaining the files can be very time-consuming.
To try to overcome these issues, systems exist that will create a local copy of media data that is stored remotely. In order to try to provide the required performance, such systems use techniques such as pre-fetching the data it is predicted will be needed. However, such systems still generally require the entirety of a media file to be locally stored before it can be used. Further, until an edit is finalised it is commonly stored as a “clip”, which is a small file containing the information required to construct the edit from the underlying media data files, but not containing the media data itself. It can be desirable to share a clip with a user at a remote location. However, before the user is able to use the clip, the entirety of all of the underlying media data files must be available to them locally at their remote location, even if the clip only uses a small section of them.
The present invention seeks to mitigate the above-mentioned problems. Alternatively or additionally, the present invention seeks to provide improved methods of providing media data from a media file to a media editing application, improved drivers for media editing applications, and improved media editing systems.
Summary of the Invention
In accordance with a first aspect of the invention there is provided a method of providing media data from a media file to a media editing application, wherein the media editing application is arranged to: receive a plurality of drivers, each driver being associated with a media file type; and request media data from a media file of a specified media file type using the driver associated with the specified media file type; and wherein the media editing application comprises a driver associated with a media file type, wherein each media file of the media file type is arranged to indicate a remote media server from which media data for the media file can be obtained; the method comprising the steps of: the media editing application requesting, from the driver, first media data from a media file; the driver requesting, from the remote media server indicated by the media file, the first media data; the driver receiving, from the remote media server, the first media data; and the driver sending, to the media editing application, the first media data.
Media editing applications are commonly able to receive new drivers, to allow them to work with media files of new media file types. By having a media file type, the media files of which are arranged to indicate a remote media server from which media data for the media file can be obtained, a media file that is stored locally can be used by the media editing application to obtain media data in the usual way, i.e. as a locally stored media file. However, the driver for the media file type can in practice obtain any media data that the media editing application requests from the remote media server. As the media editing application merely identifies the locally stored media file, and then requests and receives media data from the media file using the driver, the fact that the media data returned by the driver was obtained from a remote location is invisible to the media editing application. In this way, the media editing application can obtain media data that is stored remotely, using only functionality intended for use with media files that are stored locally. The media editing application can also be immediately provided with the media data it requests, without it being necessary to copy the entirety of the media data for the media file so that it is stored locally before the media file can be used by the media editing application. Where a clip is created that uses media data stored on the remote media server, it can be used by a second media editing application at a different location, with the second media editing application again obtaining the media data from the remote media server without the entirety of the underlying media files needing to be obtained.
The method may further comprise the steps of: the media editing application requesting, from the driver, second media data from the media file; the driver requesting, from the remote media server, the second media data, wherein the second media data is requested prior to the driver receiving the first media data from the remote media server; the driver receiving, from the remote media server, the second media data; and the driver sending, to the media editing application, the second media data. In this way, when multiple pieces of media data are requested, they can be provided promptly to the media editing application, as the driver does not need to wait until the first media data is received before requesting the second media data. Generally, where media data is stored remotely, latency (the time between requesting and receiving data) can be significant, whereas bandwidth (how much data can be transferred at one time) is often not limiting. While there will be a delay before the first media data is received due to latency, it is usual for there to be a similar delay with locally stored media data as well. The latency for the second media data will then be mitigated, as the second media data is requested from the remote media server immediately after the first media data is requested, rather than waiting until the first media data has been returned before doing so. Particularly where, as is common, the media editing application attempts to predict which media data will be required and request it in advance, any effects due to latency will be mitigated.
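The latency mitigation described in this paragraph can be made concrete with an idealised timing model; the numbers and the model itself are illustrative, not measurements:

```python
def time_sequential(n_frames: int, rtt: float, transfer: float) -> float:
    """Each request waits for the previous response: the round-trip
    latency is paid once per frame."""
    return n_frames * (rtt + transfer)

def time_pipelined(n_frames: int, rtt: float, transfer: float) -> float:
    """Requests are issued back-to-back, as the driver does above: the
    round-trip latency is paid once, and transfers then fill the link."""
    return rtt + n_frames * transfer

# 100 frames over a link with 80 ms round-trip latency and 5 ms of
# transfer time per frame
print(time_sequential(100, 0.080, 0.005))  # ~8.5 seconds
print(time_pipelined(100, 0.080, 0.005))   # ~0.58 seconds
```

In this idealised model, issuing requests without waiting reduces the total time by more than an order of magnitude, because the latency cost is paid once rather than per frame.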
The remote media server is indicated by the media file using a Uniform Resource Identifier (URI). The URI may be a Uniform Resource Locator (URL).
The driver may communicate with the remote media server using a protocol that allows multiple concurrent requests for data to be made. The driver may communicate with the remote media server using an HTTP protocol. The HTTP protocol may be HTTP/2 or HTTP/3. HTTP/2 and HTTP/3 allow multiple streams of data to be sent concurrently, rather than requiring them to be sent sequentially. HTTP/2 and HTTP/3 run over TLS, and are generally encrypted by default, as is a requirement for authentication schemes such as OAuth 2.0 and OIDC mentioned below to operate. The HTTP protocol may be a later protocol than HTTP/2 and HTTP/3.
The method may further comprise the step of the driver storing the first media data in a local cache. The method may further comprise the steps of: the media editing application requesting, from the driver, cached media data that is stored in the local cache; the driver obtaining, from the local cache, the cached media data; and the driver sending, to the media editing application, the cached media data. In this way, media data that has been requested previously can be obtained locally, and does not need to be requested from the remote media server again.
The first media data may be received from the remote media server in compressed form, and the method may further comprise, prior to the driver sending the first media data to the media editing application, the step of the driver decompressing the first media data. The driver may decompress the first media data using a graphics processing unit (GPU). As the operation of the driver to obtain the first media data is invisible to the media editing application, the media editing application simply receives the first media data and does not know that it was sent by the remote media server in compressed form. Further, the driver is able to use, for example, a GPU to decompress the first media data quickly. Alternatively, for example, the driver may use the CPU of the device on which the media editing application is running to decompress the first media data. It is common for modern CPUs to have specialised hardware that can perform decompression of video data quickly and efficiently.
The method may further comprise the step, prior to the driver receiving the first media data, of the driver authenticating with the remote media server that the media editing application is authorised to receive the first media data. In this way, the remote media server can control whether the media editing application is able to receive the first media data. This can be done even where the media editing application has received the first media data on a previous occasion. The authentication may be done using OAuth 2.0, OpenID Connect (OIDC), or any other desired authentication scheme.
The first media data may be received from the remote media server in encrypted form, and the method may further comprise, prior to the driver sending the first media data to the media editing application, the step of the driver decrypting the first media data. Similarly to when the first media data is received in compressed form, the decryption of the first media data can be done by the driver invisibly to the media editing application. Where locally cached media data is requested by the media editing application, the cached media data may be stored in encrypted form, and may be decrypted and sent to the media editing application by the driver only if the driver authenticates with the remote media server that the media editing application is authorised to receive the first media data. In this way, the remote media server can also control whether the media editing application is able to receive cached media data.
The method may further comprise the step, prior to the driver sending the first media data to the media editing application, of the driver watermarking the first media data using identification data. Alternatively, the method may further comprise the steps of: the driver sending identification data to the remote media server; and the remote media server watermarking the first media data using the identification data prior to sending the first media data to the driver. In this way, the first media data can be watermarked with identification data, for example to indicate the user of the media editing application who requested the first media data.
The method may further comprise the steps of: the media editing application requesting, from the driver, metadata for the media file; the driver requesting, from the remote media server, metadata for the media file; the driver receiving, from the remote media server, the metadata for the media file; and the driver sending, to the media editing application, the metadata for the media file. In this way, metadata for the media file can be provided by the remote media server. This allows, for example, the remote media server to dynamically change the metadata. For example, where the remotely stored media data relates to a sporting event that is being recorded live, the remote media server can adjust the metadata to reflect the amount of media data that is available.
The remote media server may receive media data for the media file subsequent to the remote media server sending the first media data to the driver.
The method may further comprise the step, prior to the driver requesting the first media data from the remote media server, of the driver determining a quality for the first media data; wherein the request from the driver for the first media data from the remote media server includes the determined quality; and wherein the remote media server sends the first media data to the driver at the determined quality. In this way, the media editing application can be provided with media data at different qualities, for example at different compressions/bitrates, on different occasions. The quality may be determined based on network performance, or any other desired conditions or combination of conditions. For example, when the media editing application is being used in a remote location with a poor Internet connection, the media data can be sent at a low quality/bitrate (high compression), whereas when a better connection is available, a higher quality of media data can be sent. In each case the same media data can be requested by the media editing application, and so in each case the media editing application is from its point of view editing the “same” media data, with the difference in quality being handled by the driver and invisible to the media editing application.
The first media data may be one of: an image; a frame; a Group of Pictures; an audio frame; and an audio clip.
In accordance with a second aspect of the invention there is provided a driver for a media editing application arranged, in response to a request from a media editing application for first media data from a media file, to: request, from a remote media server indicated by the media file, the first media data; receive, from the remote media server, the first media data; and send the first media data to the media editing application.
The driver may be further arranged, in response to a request from a media editing application for second media data from a media file, to: request, from the remote media server, the second media data, wherein the driver is arranged to request the second media data prior to the driver receiving the first media data from the remote media server.
The remote media server may be indicated by the media file using a Uniform Resource Identifier.
The driver may be arranged to communicate with the remote media server using a protocol that allows multiple concurrent requests to be made.
The driver may communicate with the remote media server using an HTTP protocol. The HTTP protocol may be HTTP/2 or HTTP/3.
The driver may be further arranged to store the first media data in a local cache. The driver may be further arranged, in response to a request from the media editing application for media data that is stored in the local cache, to: obtain the media data from the local cache; and send it to the media editing application.
The driver may be further arranged to: receive the first media data from the remote media server in compressed form; and prior to sending the first media data to the media editing application, decompress the first media data. The driver may be arranged to decompress the first media data using a graphics processing unit.
The driver may be further arranged, prior to receiving the first media data, to authenticate with the remote media server that it is authorised to receive the first media data.
The driver may be further arranged to: receive the first media data from the remote media server in encrypted form; and, prior to sending the first media data to the media editing application, decrypt the first media data. The driver may be further arranged to send identification data to the remote media server.
The driver may be further arranged, in response to a request from a media editing application for metadata from a media file, to: request, from a remote media server indicated by the media file, metadata for the media file; receive, from the remote media server, the metadata for the media file; and send the metadata for the media file to the media editing application. The metadata may be descriptive of media data for the media file that is available to the remote media server, for example it may include an indication of the duration of media data available. Alternatively, for example, the metadata may indicate a duration of media data that is not yet available to the remote media server but is expected to be, for example where the media data is in the process of being uploaded to the remote media server. In this way, the remote media server can make available media data for a media file for an event such as a sporting event while the event is still in progress, and so while media data for the media file is still being generated by recording the event. The first media data may be one of: an image; a frame; a Group of Pictures; an audio frame; an audio clip.
In accordance with a third aspect of the invention there is provided a media editing system, comprising: a media editing application arranged to: receive a plurality of drivers, each driver being associated with a media file type; and request media data from a media file of a specified media file type using the driver associated with the specified media file type; a driver as described above; and a media server having stored on it first media data, and arranged to: receive a request from the driver for the first media data; and send the first media data to the driver.
In accordance with a fourth aspect of the invention there is provided a computer program product arranged, when executed on a computing system comprising one or more processors and memory, to cause the computing system to provide a driver as described above.
It will of course be appreciated that features described in relation to one aspect of the present invention may be incorporated into other aspects of the present invention. For example, the method of the invention may incorporate any of the features described with reference to the apparatus of the invention and vice versa.
Description of the Drawings
Embodiments of the present invention will now be described by way of example only with reference to the accompanying schematic drawings, of which:
Figure 1 is a schematic diagram of a media editing system in accordance with embodiments of the invention;
Figure 2 is a flow chart showing the operation of the media editing system when a media file is first opened;
Figure 3 is a flow chart showing the operation of the media editing system when media data from the media file is requested;
Figure 4 is a flow chart showing the operation of the media editing system of an embodiment in which media data is locally cached;
Figure 5 is a flow chart showing the operation of the media editing system of an embodiment in which media data is sent by the remote media server in compressed form;
Figure 6 is a flow chart showing the operation of the media editing system of an embodiment in which authentication is performed;
Figure 7 is a flow chart showing the operation of the media editing system of an embodiment in which media data is watermarked;
Figure 8 is a flow chart showing the operation of the media editing system of an embodiment in which media data is requested at a determined quality; and
Figure 9 is a schematic diagram of a computing device in accordance with embodiments.
Detailed Description
A media editing system in accordance with an embodiment of the invention is now described with reference to Figures 1 to 3. Figure 1 is a schematic diagram of the media editing system 100. The media editing system 100 comprises a media editing application 101, and local storage 102 (i.e. local to the media editing system 100) on which media files are stored. The media editing application 101 is a conventional, known media editing application, provided on, for example, a personal computer, for use in editing media files including the media files stored on the local storage 102.
As can be seen in Figure 1, the media editing application 101 is in communication with drivers 111 and 112, which are in turn in communication with the local storage 102. The drivers 111 and 112 allow the media editing application 101 to work with media files of different file types, with the drivers having the required functionality to extract media data from the media files and provide it in the form required by the media editing application 101. In the present example, the driver 111 is used for media files of type MPEG-4. When the media editing application 101 requires a particular frame from a media file of type MPEG-4, it requests it from the driver 111. The driver 111 then obtains the required frame from the media file and passes it back to the media editing application 101 in the form it requires. The driver 112 provides similar functionality, but for files of type AVI. It will be appreciated that it is often desirable for a media editing application to be able to use files of a multiplicity of different file types, and so multiple drivers may be provided to allow this.
Thus, the drivers 111 and 112 allow the media editing application 101 to work with media files of different file types, without the media editing application 101 needing to have any knowledge of the file types itself, other than the file extension with which that can be identified (e.g. .mp4 for MPEG-4, .avi for AVI), and the corresponding driver to use.
It is often desirable to allow a media editing application to use a new file type, for example a file type that was not available when the media editing application was written, or for which a driver had not been implemented at the time. To allow this, as is common, the media editing application 101 can be provided with new drivers that provide the functionality required to work with media files of new file types. Such new drivers are written in a language such as C++, and provided as a file that can be added to the media editing application 101. Such drivers are often referred to as “plugins”.
The media editing application 101 is also in communication with a further driver 121. However, unlike a conventional driver, the driver 121 is not in communication with local storage. Instead, the driver 121 is in communication with the Internet 131. The driver 121 is used for media files with the extension .mea, though it will be appreciated that in other embodiments other extensions could be used. The operation of the driver 121 is described in detail below.
The driver 121 is also in communication with a local cache 122, and a Graphics Processing Unit (GPU) 103.
The media editing system 100 also comprises a remote media server 141 that is in communication with the Internet 131. The remote media server 141 is also in communication with remote storage 142, on which media files are stored. (The remote storage 142 is remote from the media editing application 101, but is local to the remote media server 141.) The remote media server 141 provides a REST API using which the driver 121 can request media data, amongst other things, as discussed in detail below. However, in other embodiments the remote media server 141 can communicate with the driver 121 using any other desired communication scheme.
The operation of the media editing system 100, and in particular the driver 121, when first opening a media file with the extension .mea is now described with reference to the flowchart of Figure 2. In the present example, the media file being opened is a file video1.mea that is stored on the local storage 102. The file video1.mea is a file in JSON format, and contains only a URL https://remoteserver.com/video1, where remoteserver.com is the domain name of the remote media server 141. In particular, the file video1.mea does not contain any media data, or any metadata for any media data. Instead, the media data for the file video1.mea is stored on the remote storage 142 of the remote media server 141. First, the media editing application 101 sends a request to the driver 121 to open the file video1.mea (step 201). This can occur, for example, because a user, using the file opening functionality of the media editing application 101, has requested that the file video1.mea stored on the local storage 102 is opened.
The driver 121 then obtains the URL from the file video1.mea (step 202), i.e. https://remoteserver.com/video1, and uses this URL to send an HTTP request over the Internet 131 (step 203). The HTTP request uses the domain name remoteserver.com for the remote media server 141, and so is passed to the remote media server 141. As the URL includes the path video1 and nothing further, the remote media server 141 interprets this as a request for metadata relating to the media data it has for the file video1.mea stored on the remote storage 142. The remote media server 141 obtains the metadata for the media data for the file video1.mea from the remote storage 142 (step 204), and sends it to the driver 121 (step 205). This metadata can include, for example, the duration of the media data, the frame rate, the size of the frames of the media data, and any other desirable metadata.
The driver 121 then sends the metadata received from the remote media server 141 to the media editing application 101 (step 206).
In this way, to the media editing application 101, the behaviour of the driver 121 when asked to open a media file is the same as that of the drivers 111 and 112 when asked to open a media file stored on the local storage 102. In other words, the file video1.mea appears to be a media file containing media data that is stored on the local storage 102. This is despite the fact that the media file that is opened, video1.mea, does not actually include any media data or metadata; instead, the media data and metadata for the file video1.mea are stored remotely on the remote media server 141.
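The file-open flow above can be sketched in a few lines. The following is a minimal illustration only, not the driver's actual implementation: the "url" key name is an assumption (the description says only that the .mea file is JSON containing a URL), and the HTTP client is injected as a callable so the sketch stays self-contained.

```python
import json

def open_mea_file(path, http_get):
    """Sketch of the driver's file-open step (steps 201-206): read the
    .mea file, extract the URL it contains, and fetch the media file's
    metadata from the remote media server that the URL points at.

    `http_get` is an injected callable (url) -> dict standing in for a
    real HTTP client; the "url" field name is an assumption.
    """
    with open(path) as f:
        url = json.load(f)["url"]  # e.g. https://remoteserver.com/video1
    # A bare path (no frame number) is interpreted by the server as a
    # request for metadata about the media data it holds for this file.
    return http_get(url)
```

Dependency-injecting `http_get` also makes the driver's behaviour testable without a network connection.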
The operation of the media editing system 100 when requesting media data from the file video1.mea is now described with reference to the flowchart of Figure 3. First, the media editing application 101 sends a request for first media data from the file video1.mea to the driver 121 (step 301), in the present example this first media data being the frame number 1234 of the media file. This can occur, for example, because the file video1.mea is being viewed in the media editing application 101, and the user moves the timeline marker to the position corresponding to frame 1234, and so the media editing application 101 needs to display frame 1234.
In response to the request from the media editing application 101, the driver 121 sends a request for frame 1234 for the file video1.mea to the remote media server 141 (step 302). It does this by sending an HTTP request using the URL https://remoteserver.com/video1/1234. Because of the number 1234 at the end of the path, this request is interpreted by the remote media server 141 as a request for frame 1234 of the media data it has for the file video1.mea stored on the remote storage 142. The remote media server 141 obtains frame 1234 for the file video1.mea from the remote storage 142 (step 303), sends it to the driver 121 (step 304), which in turn sends it to the media editing application 101 (step 305). In this way, to the media editing application 101 it appears as if frame 1234 has been obtained from the file video1.mea stored on the local storage 102, even though in practice it has been obtained from the remote media server 141.
Additionally, in the present example, after the media editing application 101 has requested frame 1234 but before it has been returned, the media editing application 101 sends a request for second media data from the file video1.mea to the driver 121 (step 306), in the present example this second media data being the subsequent frame number 1235 of the media file. However, it will be appreciated that while the media editing application will often request consecutive frames, it will not necessarily do so, and the following discussion applies equally where non-consecutive frames are requested.
Similarly to the first media data, the driver 121 then sends a request for frame 1235 to the remote media server 141 (step 307), in this case by sending an HTTP request using the URL https://remoteserver.com/video1/1235, which is interpreted by the remote media server 141 as a request for frame 1235 of the file video1.mea. However, it does this concurrently with the request for frame 1234, i.e. the driver 121 makes the request for frame 1235 before the request for frame 1234 has been satisfied. It is able to do this because the HTTP request is made using HTTP/2, which allows multiple streams of data to be sent concurrently.
As with frame 1234, the remote media server 141 then obtains frame 1235 for the file video1.mea from the remote storage 142 (step 308), sends it to the driver 121 (step 309), which in turn sends it to the media editing application 101 (step 310). This results in frame 1235 being received by the media editing application 101 only shortly after frame 1234. Frame 1235 could even be received by the media editing application 101 before frame 1234, if it was requested quickly enough following frame 1234 and was smaller than frame 1234, so that it took less time to be received in its entirety by the driver 121. This is the case even if the latency of the communication between the driver 121 and the remote media server 141 is high.
(In other embodiments, the driver 121 may retain received media data if necessary, so that it is always returned to the media editing application 101 in the order requested.)
While in the present example two frames of media data are requested concurrently, it will be appreciated that many more frames (or other pieces of media data) could be requested concurrently, limited only by the ability of the communication protocol used between the driver 121 and the remote media server 141 to handle concurrent requests. For example, HTTP/2 implementations often allow up to 100 concurrent requests to be made. In contrast, earlier versions of the HTTP protocol such as HTTP/1.1 only allow requests to be made sequentially, i.e. a new request can be made only after an existing request has been satisfied.
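The concurrent-request behaviour can be sketched as follows. This is an illustrative stand-in only: threads over an injected `http_get` callable model the effect of HTTP/2 stream multiplexing (a real driver would more likely use a single multiplexed HTTP/2 connection), and the URL scheme `<base_url>/<frame_number>` follows the example in the text.

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_frames_concurrently(base_url, frame_numbers, http_get,
                              max_in_flight=100):
    """Issue requests for several frames without waiting for earlier
    requests to complete, then collect the results keyed by frame
    number. `max_in_flight` mirrors the ~100 concurrent streams that
    HTTP/2 implementations often allow."""
    results = {}
    with ThreadPoolExecutor(max_workers=max_in_flight) as pool:
        # Submit all requests up front; none blocks on the others.
        futures = {n: pool.submit(http_get, f"{base_url}/{n}")
                   for n in frame_numbers}
        for n, fut in futures.items():
            results[n] = fut.result()  # gather replies as they complete
    return results
```

Because the results are keyed by frame number, the driver can return frames to the media editing application in the order requested even when they arrive out of order, as the text notes some embodiments do.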
Further, known algorithms such as Bottleneck Bandwidth and Round-trip propagation time (BBR), a TCP congestion control algorithm, may be used to give improved performance when receiving media data. In addition, where TCP window size would limit the rate atwhich media data can be received, multiple TCP connections can be opened to allow the rate at which media data can be received to be increased. Characteristics such as the bandwidth-delay product of the data link between the remote media server 141 and the media editing application 101 can be used to determine the parameters for the communication, such as the number of TCP connections to open. (The bandwidth-delay product is the product of the data link’s capacity in bits per second and round-trip delay in seconds, and acts as a measure of the amount of data that is “stored” in the data link while being transmitted.)
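As a worked illustration of that calculation: a 100 Mbit/s link with a 100 ms round-trip time has a bandwidth-delay product of 10 Mbit, and the classic 64 KiB TCP receive window (about 0.5 Mbit) would then cap a single connection well below the link rate, motivating multiple connections. The window size and the rounding rule below are illustrative assumptions; real TCP stacks use window scaling.

```python
import math

def bandwidth_delay_product(bits_per_second, rtt_seconds):
    """Bits 'stored' in the data link while being transmitted."""
    return bits_per_second * rtt_seconds

def tcp_connections_needed(bdp_bits, window_bytes=65535):
    """Rough sketch: how many parallel TCP connections are needed so
    that the combined receive windows cover the bandwidth-delay
    product. 65535 bytes is the pre-window-scaling TCP maximum."""
    return max(1, math.ceil(bdp_bits / (window_bytes * 8)))
```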
The operation of the media editing system 100 when requesting media data in accordance with another embodiment of the invention is now described with reference to the flowchart of Figure 4. In this embodiment, the driver 121 performs local caching of received media data, as explained below.
As in the previous embodiment, first the media editing application 101 sends a request for media data from the file video1.mea to the driver 121 (step 401), in the present example this first media data being the frame number 3456 of the media file. The driver 121 then checks whether the requested frame 3456 is stored in the local cache 122 (step 402), as can occur under the circumstances explained below. In the present example, frame 3456 is not present in the local cache 122. As a result, similarly to the previous embodiment, the driver 121 requests the frame from the remote media server 141 (step 403), the remote media server 141 obtains the frame from the remote storage 142 (step 404), and the remote media server 141 sends the frame to the driver 121 (step 405).
Next, unlike in the previous embodiment, the driver stores the frame in the local cache 122 (step 406), before sending it to the media editing application 101 (step 408) in the usual way. As can be seen, in this way the driver 121 obtains media data from the remote media server 141 and sends it to the media editing application 101 as in the previous embodiment, but also stores any obtained media data in the local cache 122. It will be appreciated that in other embodiments, the media data may be stored in the local cache 122 subsequently to, or in parallel with, it being sent to the media editing application 101.
However, if the frame 3456 is present in the local cache 122 (step 402 again), the operation of the driver 121 is different. This will occur when the frame 3456 has been requested on a previous occasion, and as part of that request stored in the local cache 122. In this case, rather than requesting frame 3456 from the remote media server 141, the driver 121 instead obtains it from the local cache 122 (step 407), and then again returns the frame to the media editing application 101. In this way, the need to obtain previously requested media data from a remote location is avoided.
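The cache-first logic of Figure 4 reduces to a familiar pattern, sketched below under the assumption that the local cache 122 can be modelled as a key-value store and the remote round trip as an injected callable:

```python
def get_frame_cached(frame_number, cache, fetch_remote):
    """Sketch of the driver's cache-first lookup (Figure 4): return
    the frame from the local cache when present; otherwise fetch it
    from the remote media server, store it in the cache, and return
    it. `cache` is a dict standing in for the local cache 122."""
    if frame_number in cache:           # steps 402/407: cache hit
        return cache[frame_number]
    frame = fetch_remote(frame_number)  # steps 403-405: remote fetch
    cache[frame_number] = frame         # step 406: store locally
    return frame
```

A second request for the same frame is then served entirely locally, which is the point of the embodiment: the remote round trip happens at most once per frame.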
The operation of the media editing system 100 when requesting media data in accordance with another embodiment of the invention is now described with reference to the flowchart of Figure 5. In this embodiment, media data is sent from the remote media server 141 to the driver 121 in compressed form, as explained below.
As with previous embodiments, first the media editing application 101 sends a request for media data from the file video1.mea to the driver 121 (step 501), in the present example the first media data being the frame number 4567 of the media file. The driver 121 then requests the frame from the remote media server 141 (step 502), the remote media server 141 obtains the frame from the remote storage 142 (step 503), and the remote media server 141 sends the frame to the driver 121 (step 504).
However, as mentioned above, in the present embodiment the frame 4567 is sent to the driver 121 in compressed form. Frame 4567 may be compressed by the remote media server 141 prior to sending it to the driver 121 , or alternatively may be stored on the remote storage 142 in compressed form.
Next, the driver 121 decompresses frame 4567 using the GPU 103 (step 505), and then sends the decompressed frame to the media editing application 101 (step 506). In this way, media data can be sent from the remote media server 141 to the driver 121 more quickly, as it is in compressed form and so requires less data. The media data must then be decompressed by the driver 121, which takes time. However, the driver 121 uses the GPU 103 to decompress the frame, and due to the processing power and specialised nature of the GPU 103 this can be done very quickly, so no problematic delay in providing the media data to the media editing application 101 is introduced by it being sent by the remote media server 141 in compressed form. Similarly to previous embodiments, the fact that the media data is sent in compressed form is invisible to the media editing application 101, which simply receives the required media data from the driver 121 in the uncompressed form it requires.
In other embodiments, the media data may be sent from the remote media server 141 to the driver 121 alternatively or additionally in encrypted form, and decrypted prior to sending to the media editing application 101 .
It will be appreciated that the use of the GPU 103 to decompress the media data is just one possibility, and in other embodiments the driver 121 may use other local resources to decompress the media data.
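The compressed-transfer path of Figure 5 can be sketched as below. CPU-side zlib stands in here for the GPU-accelerated decompression (or hardware video decode) that the driver 121 would use in practice; the sketch only illustrates that the driver receives compressed bytes and hands decompressed media data to the application.

```python
import zlib

def receive_compressed_frame(compressed_bytes, decompress=zlib.decompress):
    """Sketch of step 505: decompress a frame received from the remote
    media server before passing it on. The `decompress` callable is
    the substitution point where a GPU-backed codec would plug in."""
    return decompress(compressed_bytes)
```

The same shape applies to the encrypted-transfer variant mentioned above, with a decryption function in place of (or in addition to) `decompress`.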
The operation of the media editing system 100 when requesting media data in accordance with another embodiment of the invention is now described with reference to the flowchart of Figure 6. In this embodiment, authentication occurs between the driver 121 and the remote media server 141 , as explained below.
As with previous embodiments, first the media editing application 101 sends a request for media data from the file video1.mea to the driver 121 (step 601), in the present example this first media data being the frame number 5678 of the media file. However, next the driver 121 and the remote media server 141 perform authentication (step 602), i.e. confirm that the driver 121 is allowed to receive media data for the file video1.mea from the remote media server 141. This can be done using any desired authentication system, for example OAuth 2.0 or OIDC. In practice, the remote media server 141 may be authenticating that the user of the media editing application 101 is allowed to receive the media data, using credentials of the user used by the driver 121 for the authentication.
If the authentication is successful (step 603), then as in previous embodiments the driver 121 requests frame 5678 from the remote media server 141 (step 604), the remote media server 141 obtains the frame from the remote storage 142 (step 605), the remote media server 141 sends the frame to the driver 121 (step 606), and the driver 121 sends the frame to the media editing application 101 (step 607).
However, if the authentication is not successful (step 603 again), the driver 121 will be unable to obtain the requested media data from the remote media server 141. In this case, instead of returning any media data, it sends a failure message to the media editing application 101, to indicate that the requested media data, frame 5678, is not available.
While in the present embodiment the authentication is done when media data is requested, in other embodiments it could alternatively or additionally be done when the file video1.mea is first opened (or attempted to be opened) and its metadata requested. In embodiments where received media data is locally cached, the driver 121 could perform authentication with the remote media server 141 prior to sending the cached media data to the media editing application 101. This allows the remote media server 141 to control the use of media data even when it has been sent previously and locally cached, particularly if the media data is stored in the local cache 122 in encrypted form, and so needs to be decrypted by the driver 121 before it can be used.
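The authenticated request path of Figure 6 can be sketched as below. Both server round trips are injected callables, and the token check is a placeholder for whatever scheme is used (e.g. presenting an OAuth 2.0 access token); the result shape is an illustrative assumption.

```python
def request_frame_authenticated(frame_number, token,
                                authenticate, fetch_remote):
    """Sketch of steps 602-607: authenticate with the remote media
    server first, and only request the frame if that succeeds;
    otherwise return a failure indication for the media editing
    application."""
    if not authenticate(token):  # steps 602/603: authentication fails
        return {"ok": False, "error": "media data not available"}
    # Steps 604-607: fetch and return the frame as usual.
    return {"ok": True, "frame": fetch_remote(frame_number)}
```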
The operation of the media editing system 100 when requesting media data in accordance with another embodiment of the invention is now described with reference to the flowchart of Figure 7. In this embodiment, watermarking of media data is performed, as explained below.
As with previous embodiments, first the media editing application 101 sends a request for media data from the file video1.mea to the driver 121 (step 701), in the present example this first media data being the frame number 5678 of the media file. Next, as in previous embodiments, the driver 121 requests frame 5678 from the remote media server 141 (step 702); the remote media server 141 obtains the frame from the remote storage 142 (step 703); and the remote media server 141 sends the frame to the driver 121 (step 704).
However, unlike in previous embodiments, next the driver 121 watermarks the frame 5678 with identification data identifying the user of the media editing application 101 (step 705). It obtains the identification data from the media editing application 101 , for example using licence details, login details, or other identifying data the user has that allows them to use the media editing application 101. The step of watermarking the frame 5678 is performed subsequently to any decompression and/or decryption of the frame that is required when the frame is received from the remote media server 141 . The driver 121 then sends the watermarked frame to the media editing application 101 (step 706).
In this way, any media data for the file video1.mea obtained by a user of the media editing application 101 is necessarily watermarked.
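As a toy illustration of the watermarking step, the sketch below stamps the user's identifier bytes into the corner of a frame modelled as rows of greyscale pixel values. This is purely illustrative: a production watermark would typically be imperceptible and robust (e.g. spread-spectrum), and the frame representation is an assumption.

```python
def watermark_frame(frame, user_id):
    """Sketch of step 705: embed identification data into a decoded
    frame. `frame` is a list of rows of pixel values; the identifier's
    UTF-8 bytes overwrite the start of the top row. Returns a stamped
    copy, leaving the input frame unmodified."""
    stamped = [row[:] for row in frame]  # copy each row
    for x, byte in enumerate(user_id.encode("utf-8")[: len(stamped[0])]):
        stamped[0][x] = byte  # write identifier bytes into the corner
    return stamped
```

As the text notes, the same stamping could instead be applied server-side before compression or encryption.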
While in the present embodiment the watermarking is done by the driver 121 , in other embodiments it could instead be done by the remote media server 141 , based on identification data sent by the driver 121 with the request for the media data. In this case, the remote media server 141 watermarks the media data prior to sending it to the driver 121 , and prior to compressing and/or encrypting the media data, if appropriate.
Similarly, the identification data could be provided at a different time, for example when the file video1.mea is first opened. The identification data could identify something other than the user of the media editing application 101, for example a serial number relating to the media editing application 101 itself, login details for the device on which the media editing application 101 is being used, a MAC address for the device on which the media editing application 101 and driver 121 are being run, or any other desired identification data.
The operation of the media editing system 100 when requesting media data in accordance with another embodiment of the invention is now described with reference to the flowchart of Figure 8. In this embodiment, the media data is requested at a determined quality, as explained below.
As with previous embodiments, first the media editing application 101 sends a request for media data from the file video1.mea to the driver 121 (step 801), in the present example this first media data being the frame number 6789 of the media file. However, unlike in previous embodiments, next the driver 121 determines a quality at which to request the frame 6789 (step 802). The determined quality may for example be a particular compression rate or bitrate, and may be determined based on current network performance, for example. Where current network performance is low, i.e. the connection between the driver 121 and the remote media server 141 is poor, a low quality may be determined; in contrast, where current network performance is high, i.e. the connection between the driver 121 and the remote media server 141 is good, a high quality may be determined.
Similarly to previous embodiments, the driver 121 requests frame 6789 from the remote media server 141 (step 803), but in the present embodiment the driver 121 requests frame 6789 at the determined quality. The remote media server 141 then obtains the frame from the remote storage 142 at the determined quality (step 804); the remote media server 141 sends the frame (at the determined quality) to the driver 121 (step 805), and the driver 121 then sends the frame (at the determined quality) to the media editing application 101 (step 806).
In this way, the media editing application 101 can be provided with media data of a quality that is appropriate to the network connection to the remote media server 141. For example, when the network connection is poor, low quality data that is smaller in size can be sent, whereas when the network connection is good, high quality data that is larger in size can be sent. In each case, from the point of view of the media editing application 101 it is receiving the “same” media data, with the difference in quality being handled by the driver invisibly to the media editing application.
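The quality-determination step of Figure 8 can be sketched as a simple threshold rule. The throughput thresholds, quality labels, and query-parameter URL form below are all illustrative assumptions; the description leaves the quality metric (compression rate, bitrate, etc.) and the request format open.

```python
def determine_quality(measured_mbps,
                      thresholds=((20.0, "high"), (5.0, "medium"))):
    """Sketch of step 802: choose a quality for the next frame request
    from the currently measured network throughput (in Mbit/s). The
    threshold values and labels are assumptions for illustration."""
    for minimum, label in thresholds:
        if measured_mbps >= minimum:
            return label
    return "low"

def frame_request_url(base_url, frame_number, quality):
    """Hypothetical URL form carrying the determined quality as a
    query parameter; the real API shape is not specified."""
    return f"{base_url}/{frame_number}?quality={quality}"
```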
Embodiments of the invention include at least some of the methods described above performed on a computing device, such as the computing device 1100 shown in Figure 9. The computing device 1100 comprises a data interface 1101 , through which data can be sent or received, for example over a network. The computing device 1100 further comprises a processor 1102 in communication with the data interface 1101 , and memory 1103 in communication with the processor 1102. In this way, the computing device 1100 can receive data, such as media data, via the data interface 1101 , and the processor 1102 can store the received data in the memory 1103, and process it so as to perform the methods described herein. At least some of the methods described herein may be performed by a computing system comprising one or more such computing devices 1100.
Each device, module, component, machine or function as described in relation to any of the examples described herein may comprise a processor and/or processing system or may be comprised in apparatus comprising a processor and/or processing system. One or more aspects of the embodiments described herein comprise processes performed by apparatus. In some examples, the apparatus comprises one or more processing systems or processors configured to carry out these processes. In this regard, embodiments may be implemented at least in part by computer software stored in (non-transitory) memory and executable by the processor, or by hardware, or by a combination of tangibly stored software and hardware (and tangibly stored firmware). Embodiments also extend to computer programs, particularly computer programs on or in a carrier, adapted for putting the above described embodiments into practice. The program may be in the form of non-transitory source code, object code, or in any other non-transitory form suitable for use in the implementation of processes according to embodiments. The carrier may be any entity or device capable of carrying the program, such as a RAM, a ROM, or an optical memory device, etc.
While the present invention has been described and illustrated with reference to particular embodiments, it will be appreciated by those of ordinary skill in the art that the invention lends itself to many different variations not specifically illustrated herein.
Where in the foregoing description, integers or elements are mentioned which have known, obvious or foreseeable equivalents, then such equivalents are herein incorporated as if individually set forth. Reference should be made to the claims for determining the true scope of the present invention, which should be construed so as to encompass any such equivalents. It will also be appreciated by the reader that integers or features of the invention that are described as preferable, advantageous, convenient or the like are optional and do not limit the scope of the independent claims. Moreover, it is to be understood that such optional integers or features, whilst of possible benefit in some embodiments of the invention, may not be desirable, and may therefore be absent, in other embodiments.

Claims

1. A method of providing media data from a media file to a media editing application, wherein the media editing application is arranged to: receive a plurality of drivers, each driver being associated with a media file type; and request media data from a media file of a specified media file type using the driver associated with the specified media file type; and wherein the media editing application comprises a driver associated with a media file type, wherein each media file of the media file type is arranged to indicate a remote media server from which media data for the media file can be obtained; the method comprising the steps of: the media editing application requesting, from the driver, first media data from a media file; the driver requesting, from the remote media server indicated by the media file, the first media data; the driver receiving, from the remote media server, the first media data; and the driver sending, to the media editing application, the first media data.
2. A method as claimed in claim 1, further comprising the steps of: the media editing application requesting, from the driver, second media data from the media file; the driver requesting, from the remote media server, the second media data, wherein the second media data is requested prior to the driver receiving the first media data from the remote media server; the driver receiving, from the remote media server, the second media data; and the driver sending, to the media editing application, the second media data.
3. A method as claimed in claim 1 or 2, wherein the remote media server is indicated by the media file using a Uniform Resource Identifier.
4. A method as claimed in any preceding claim, wherein the driver communicates with the remote media server using a protocol that allows multiple concurrent requests for data to be made.
5. A method as claimed in any preceding claim, wherein the driver communicates with the remote media server using an HTTP protocol.
6. A method as claimed in claim 5, wherein the HTTP protocol is HTTP/2 or HTTP/3.
7. A method as claimed in any preceding claim, further comprising the step of the driver storing the first media data in a local cache.
8. A method as claimed in claim 7, further comprising the steps of: the media editing application requesting, from the driver, cached media data that is stored in the local cache; the driver obtaining, from the local cache, the cached media data; and the driver sending, to the media editing application, the cached media data.
9. A method as claimed in any preceding claim, wherein the first media data is received from the remote media server in compressed form, and wherein the method further comprises, prior to the driver sending the first media data to the media editing application, the step of the driver decompressing the first media data.
10. A method as claimed in claim 9, wherein the driver decompresses the first media data using a graphics processing unit.
11. A method as claimed in any preceding claim, further comprising the step, prior to the driver receiving the first media data, of the driver authenticating with the remote media server that the media editing application is authorised to receive the first media data.
12. A method as claimed in any preceding claim, wherein the first media data is received from the remote media server in encrypted form, and wherein the method further comprises, prior to the driver sending the first media data to the media editing application, the step of the driver decrypting the first media data.
13. A method as claimed in any preceding claim, further comprising the step, prior to the driver sending the first media data to the media editing application, of the driver watermarking the first media data using identification data.
14. A method as claimed in any preceding claim, further comprising the steps of: the media editing application requesting, from the driver, metadata for the media file; the driver requesting, from the remote media server, metadata for the media file; the driver receiving, from the remote media server, the metadata for the media file; and the driver sending, to the media editing application, the metadata for the media file.
15. A method as claimed in any preceding claim, wherein the remote media server receives media data for the media file subsequent to the remote media server sending the first media data to the driver.
16. A method as claimed in any preceding claim, further comprising the step, prior to the driver requesting the first media data from the remote media server, of the driver determining a quality for the first media data; wherein the request from the driver for the first media data from the remote media server includes the determined quality; and wherein the remote media server sends the first media data to the driver at the determined quality.
17. A method as claimed in any preceding claim, wherein the first media data is one of: an image; a frame; a Group of Pictures; an audio frame; and an audio clip.
18. A driver for a media editing application arranged, in response to a request from a media editing application for first media data from a media file, to: request, from a remote media server indicated by the media file, the first media data; receive, from the remote media server, the first media data; and send the first media data to the media editing application.
19. A driver as claimed in claim 18, further arranged, in response to a request from a media editing application for second media data from a media file, to: request, from the remote media server, the second media data, wherein the driver is arranged to request the second media data prior to the driver receiving the first media data from the remote media server.
20. A driver as claimed in claim 18 or 19, wherein the remote media server is indicated by the media file using a Uniform Resource Identifier.
21. A driver as claimed in any of claims 18 to 20, wherein the driver is arranged to communicate with the remote media server using a protocol that allows multiple concurrent requests to be made.
22. A driver as claimed in any of claims 18 to 21, wherein the driver communicates with the remote media server using an HTTP protocol.
23. A driver as claimed in claim 22, wherein the HTTP protocol is HTTP/2 or HTTP/3.
24. A driver as claimed in any of claims 18 to 23, further arranged to store the first media data in a local cache.
25. A driver as claimed in claim 24, further arranged, in response to a request from the media editing application for media data that is stored in the local cache, to: obtain the media data from the local cache; and send the media data to the media editing application.
26. A driver as claimed in any of claims 18 to 25, further arranged to: receive the first media data from the remote media server in compressed form; and prior to sending the first media data to the media editing application, decompress the first media data.
27. A driver as claimed in claim 26, wherein the driver is arranged to decompress the first media data using a graphics processing unit.
28. A driver as claimed in any of claims 18 to 27, further arranged, prior to receiving the first media data, to authenticate with the remote media server that the media editing application is authorised to receive the first media data.
29. A driver as claimed in any of claims 18 to 28, further arranged to: receive the first media data from the remote media server in encrypted form; and prior to sending the first media data to the media editing application, decrypt the first media data.
30. A driver as claimed in any of claims 18 to 29, further arranged to send identification data to the remote media server.
31. A driver as claimed in any of claims 18 to 30, further arranged, in response to a request from a media editing application for metadata from a media file, to: request, from a remote media server indicated by the media file, metadata for the media file; receive, from the remote media server, the metadata for the media file; and send the metadata for the media file to the media editing application.
32. A driver as claimed in any of claims 18 to 31, wherein the first media data is one of: an image; a frame; a Group of Pictures; an audio frame; and an audio clip.
33. A media editing system, comprising: a media editing application arranged to: receive a plurality of drivers, each driver being associated with a media file type; and request media data from a media file of a specified media file type using the driver associated with the specified media file type; a driver as claimed in any of claims 18 to 32; and a media server having stored on it first media data, and arranged to: receive a request from the driver for the first media data; and send the first media data to the driver.
34. A computer program product arranged, when executed on a computing system comprising one or more processors and memory, to cause the computing system to provide a driver as claimed in any of claims 18 to 32.
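The driver behaviour recited in claims 2, 7 and 8 — issuing a second request before the first response has arrived, and serving repeat requests from a local cache — can be sketched as follows. This is a minimal, hypothetical illustration only, not the claimed implementation: the in-memory `REMOTE_MEDIA` dictionary stands in for a remote media server (which, per claims 4 to 6, would be reached over a multiplexing protocol such as HTTP/2 or HTTP/3), and all names here are illustrative rather than drawn from the application.

```python
import concurrent.futures

# Hypothetical stand-in for the remote media server indicated by the
# media file; a real driver would issue network requests to a URI.
REMOTE_MEDIA = {"frame-001": b"\x00\x01", "frame-002": b"\x02\x03"}


class MediaDriver:
    """Sketch of the claimed driver: resolves media-data requests from a
    media editing application against a remote media server, keeping a
    local cache of received data (cf. claims 1, 2, 7 and 8)."""

    def __init__(self, fetch=REMOTE_MEDIA.get):
        self._fetch = fetch   # stand-in for a request to the remote server
        self._cache = {}      # local cache of received media data
        self._pool = concurrent.futures.ThreadPoolExecutor(max_workers=4)

    def request_media(self, media_id):
        """Return a future immediately, so a second piece of media data
        can be requested before the first has been received."""
        if media_id in self._cache:
            # Cached media data is served without contacting the server.
            done = concurrent.futures.Future()
            done.set_result(self._cache[media_id])
            return done
        return self._pool.submit(self._fetch_and_cache, media_id)

    def _fetch_and_cache(self, media_id):
        data = self._fetch(media_id)
        self._cache[media_id] = data  # store received data in the cache
        return data


driver = MediaDriver()
# Two requests outstanding at once: the second is issued before the
# first response arrives, as in claim 2.
first = driver.request_media("frame-001")
second = driver.request_media("frame-002")
first.result(), second.result()  # both pieces of media data received
```

A real driver would additionally decompress, decrypt or watermark the received data before handing it to the editing application (claims 9 to 13); those steps are omitted here for brevity.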
PCT/GB2025/050702 2024-04-09 2025-04-02 Methods and drivers for providing media data from a media file to a media editing application Pending WO2025215340A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB2405055.1 2024-04-09
GB202405055 2024-04-09

Publications (1)

Publication Number Publication Date
WO2025215340A1 true WO2025215340A1 (en) 2025-10-16

Family

ID=95399425

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2025/050702 Pending WO2025215340A1 (en) 2024-04-09 2025-04-02 Methods and drivers for providing media data from a media file to a media editing application

Country Status (1)

Country Link
WO (1) WO2025215340A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050193205A1 (en) * 2004-01-09 2005-09-01 Widevine Technologies, Inc. Method and system for session based watermarking of encrypted content
US20140304603A1 (en) * 2013-04-05 2014-10-09 Avid Technology, Inc. Full fidelity remote video editing
US20200089701A1 (en) * 2018-09-13 2020-03-19 Grass Valley Limited System and method for dynamically accessing media assets


Similar Documents

Publication Publication Date Title
AU2018202004B2 (en) Enhanced streaming media playback
US12526331B2 (en) Adaptive media streaming method and apparatus according to decoding performance
US10567809B2 (en) Selective media playing method and apparatus according to live streaming and recorded streaming
US9584556B2 (en) Client proxy for adaptive bitrate selection in HTTP live streaming
US6728763B1 (en) Adaptive media streaming server for playing live and streaming media content on demand through web client's browser with no additional software or plug-ins
US10554651B2 (en) Merged video streaming, authorization, and metadata requests
US8725947B2 (en) Cache control for adaptive stream player
JP2007529970A (en) Media data stream processing technology
WO2015120766A1 (en) Video optimisation system and method
US12273601B2 (en) Live video streaming architecture with real-time frame and subframe level live watermarking
CN110933517A (en) Code rate switching method, client and computer readable storage medium
CN107438051A (en) Streaming media quick start method, device and system
US11647237B1 (en) Method and apparatus for secure video manifest/playlist generation and playback
WO2011143916A1 (en) Media adaptation method and apparatus
US20140289257A1 (en) Methods and systems for providing file data for media files
US20210021659A1 (en) Delivery apparatus, delivery method, and program
CN108702542A (en) Client operation method for streaming service
WO2025215340A1 (en) Methods and drivers for providing media data from a media file to a media editing application
US20130024543A1 (en) Methods for generating multiple responses to a single request message and devices thereof
US20220311817A1 (en) Media streaming
CN115834925A (en) Video transcoding method, device, equipment and medium
CN115604248B (en) File transmission method and device
JP2020140393A (en) Receiver, control method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 25719082

Country of ref document: EP

Kind code of ref document: A1