
CN113099237A - Video processing method and device - Google Patents

Video processing method and device

Info

Publication number
CN113099237A
CN113099237A
Authority
CN
China
Prior art keywords
frame rate
video
target
played
target frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110361083.7A
Other languages
Chinese (zh)
Other versions
CN113099237B (en)
Inventor
张民
吕德政
崔刚
张彤
张艳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Frame Color Film And Television Technology Co ltd
Original Assignee
Shenzhen Frame Color Film And Television Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Frame Color Film And Television Technology Co ltd
Priority to CN202110361083.7A
Publication of CN113099237A
Application granted
Publication of CN113099237B
Legal status: Active (Current)
Anticipated expiration

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/44: Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402: Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281: Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Systems (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The application provides a video processing method and a video processing device. The method includes: obtaining a target frame rate mark of a video to be played; determining the decoding frame rate to be the target frame rate corresponding to the target frame rate mark; decoding the video to be played at the decoding frame rate; determining the playing parameter of the video to be played to be the target playing parameter corresponding to the target frame rate; and playing the decoded video based on the target playing parameter. The method and device can improve the visual effect of video playing.

Description

Video processing method and device
Technical Field
The present application relates to the field of signal processing technologies, and in particular, to a video processing method and apparatus.
Background
The frame rate is the frequency (rate) at which bitmap images, in units of frames, successively appear on a display. Videos presented at different frame rates produce different visual effects. Taking video shooting as an example, video is currently usually shot at a frame rate of 24 frames per second (fps), and the player accordingly plays the video back at the same frame rate.
As shooting devices continue to improve, shooting can be performed at a variety of frame rates, such as 24fps, 48fps and 120fps, so the Digital Cinema Package (DCP) formed in post-production can contain several frame rates. However, current players such as movie servers only support video playing at a single frame rate, which results in a poor visual effect of video playing.
Disclosure of Invention
The application provides a video processing method and device to improve the visual effect of video playing.
In a first aspect, the present application provides a video processing method, including:
acquiring a target frame rate mark of a video to be played;
determining a decoding frame rate as a target frame rate corresponding to the target frame rate mark;
decoding a video to be played by adopting a decoding frame rate;
determining the playing parameters of the video to be played as target playing parameters corresponding to the target frame rate;
and playing the decoded video based on the target playing parameter.
Optionally, determining the decoding frame rate as the target frame rate corresponding to the target frame rate flag includes: determining a target frame rate corresponding to the target frame rate mark according to the corresponding relation between the frame rate and the frame rate mark; and determining the decoding frame rate as the target frame rate.
Optionally, before determining the target frame rate corresponding to the target frame rate mark according to the corresponding relationship between the frame rate and the frame rate mark, the video processing method may further include: and acquiring the corresponding relation between the frame rate and the frame rate mark from the configuration file.
Optionally, determining that the playing parameter of the video to be played is the target playing parameter corresponding to the target frame rate includes: determining a target playing parameter corresponding to the target frame rate according to the corresponding relation between the frame rate and the playing parameter; and determining the playing parameters of the video to be played as target playing parameters.
Optionally, the obtaining of the target frame rate flag of the video to be played includes: acquiring configuration data of a video to be played, wherein the configuration data comprises a target frame rate mark; and acquiring the target frame rate mark of the video to be played from the configuration data.
Optionally, the video to be played includes video segments with at least two frame rates, and the target frame rates corresponding to different video segments are marked differently. The method for acquiring the target frame rate mark of the video to be played comprises the following steps: and acquiring a target frame rate mark of the corresponding video segment.
In a second aspect, the present application provides a video processing apparatus comprising:
the acquisition module is used for acquiring a target frame rate mark of a video to be played;
the first determining module is used for determining the decoding frame rate as a target frame rate corresponding to the target frame rate mark;
the decoding module is used for decoding the video to be played by adopting a decoding frame rate;
the second determining module is used for determining that the playing parameters of the video to be played are target playing parameters corresponding to the target frame rate;
and the playing module is used for playing the decoded video based on the target playing parameter.
Optionally, the first determining module is specifically configured to: determining a target frame rate corresponding to the target frame rate mark according to the corresponding relation between the frame rate and the frame rate mark; and determining the decoding frame rate as the target frame rate.
Optionally, the first determining module is further configured to: before determining the target frame rate corresponding to the target frame rate mark according to the corresponding relation between the frame rate and the frame rate mark, acquiring the corresponding relation between the frame rate and the frame rate mark from a configuration file.
Optionally, the second determining module is specifically configured to: determining a target playing parameter corresponding to the target frame rate according to the corresponding relation between the frame rate and the playing parameter; and determining the playing parameters of the video to be played as target playing parameters.
Optionally, the obtaining module is specifically configured to: acquiring configuration data of a video to be played, wherein the configuration data comprises a target frame rate mark; and acquiring the target frame rate mark of the video to be played from the configuration data.
Optionally, the video to be played includes video segments with at least two frame rates, and the target frame rates corresponding to different video segments are marked differently. At this time, the obtaining module may be specifically configured to: and acquiring a target frame rate mark of the corresponding video segment.
In a third aspect, the present application provides an electronic device, comprising: a memory and a processor;
the memory is used for storing program instructions;
the processor is configured to invoke program instructions in the memory to perform the video processing method according to the first aspect of the application.
In a fourth aspect, the present application provides a computer-readable storage medium having stored thereon computer program instructions which, when executed, implement a video processing method as described in the first aspect of the present application.
In a fifth aspect, the present application provides a computer program product comprising a computer program which, when executed by a processor, implements a video processing method as described in the first aspect of the present application.
According to the video processing method and device, the target frame rate mark of the video to be played is obtained, the decoding frame rate is determined to be the target frame rate corresponding to the target frame rate mark, the video to be played is decoded by adopting the decoding frame rate, the playing parameter of the video to be played is determined to be the target playing parameter corresponding to the target frame rate, and the decoded video is played based on the target playing parameter. According to the video playing method and device, the corresponding target frame rate can be determined according to the target frame rate mark of the video to be played, and then the corresponding decoding frame rate and playing parameters are determined according to the target frame rate, so that the video to be played is decoded and played respectively by adopting the decoding frame rate and the playing parameters corresponding to the target frame rate, and the visual effect of video playing is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 2 is a flowchart of a video processing method according to an embodiment of the present application;
fig. 3 is a flowchart of a video processing method according to another embodiment of the present application;
fig. 4 is a schematic structural diagram of a video processing apparatus according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to another embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
At present, movies are usually shot and shown at 24fps. Increasing the frame rate on this basis, for example from 24fps to 48fps, is equivalent to inserting an extra frame between every two adjacent frames, so that the viewer receives more information per unit time and the viewing effect improves.
As shooting devices continue to improve, they can shoot at a variety of frame rates. For example, for movies shot at 24fps, 48fps and 120fps, the DCP formed in post-production contains multiple frame rates. However, current playing devices such as movie servers only support video playing at a single frame rate, which results in a poor visual effect of video playing.
To address this problem, the present application provides a video processing method and apparatus in which a corresponding frame rate mark is added to a video, the frame rate of the video to be played is determined according to its frame rate mark, and the video to be played is decoded and played based on that frame rate, thereby improving the playing effect of the video and, in turn, the visual effect of video playing.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application. As shown in fig. 1, in this application scenario, a movie containing multiple frame rates is played by the movie server 110 and projected by the projector 120 onto the screen 130 for imaging. The specific implementation process by which the server 110 plays a movie containing multiple frame rates can be seen in the schemes of the following embodiments.
It should be noted that fig. 1 is only a schematic diagram of an application scenario provided in this embodiment, and this embodiment of the present application does not limit the devices included in fig. 1, and also does not limit the positional relationship between the devices in fig. 1. For example, in the application scenario shown in fig. 1, a data storage device may be further included, and the data storage device may be an external memory with respect to the server 110 or an internal memory integrated in the server 110.
Fig. 2 is a flowchart of a video processing method according to an embodiment of the present application. The method of the embodiment of the application can be applied to electronic equipment, and the electronic equipment can be a server or video playing equipment such as a television, a projector, a tablet computer or a desktop computer.
As shown in fig. 2, the video processing method according to the embodiment of the present application includes the following steps:
s201, obtaining a target frame rate mark of a video to be played.
In practical applications, when a video is generated, a frame rate mark corresponding to the video may be added, where the frame rate mark is used to identify the frame rate of the corresponding video. For convenience of description, in the embodiments of the present application, the frame rate mark of the video to be played is referred to as the target frame rate mark. In a first specific implementation, the video segments of the video to be played that share the same frame rate are identified by one target frame rate mark; in a second specific implementation, each frame in the video to be played has its own corresponding target frame rate mark. By comparison, the second implementation requires a larger amount of target frame rate mark data for the same video to be played than the first implementation.
Illustratively, the target frame rate of the video to be played is 24fps, and accordingly, if the frame rate flag corresponding to 24fps is 0001, the target frame rate flag of the video to be played is 0001.
Or, the target frame rate of the video to be played includes 24fps and 48fps, and accordingly, the frame rate flag corresponding to 24fps is 0001, and the frame rate flag corresponding to 48fps is 0010, and then the target frame rate flags of the video to be played are 0001 and 0010. In this example, assuming that the video to be played contains 100 frames, where the frame rates of the 1 st to 50 th frames are 24fps and the frame rates of the 51 st to 100 th frames are 48fps, corresponding to the first specific implementation, the 1 st to 50 th frames may be marked by one 0001, and the 51 st to 100 th frames may be marked by one 0010; alternatively, corresponding to the second specific implementation, the 1 st to 50 th frames may be respectively marked with 50 0001, and the 51 st to 100 th frames may be respectively marked with 50 0010; alternatively, frames 1 to 50 may be marked with one 0001, and frames 51 to 100 may be marked with 50 0010, respectively; and the specific implementation can be set according to actual requirements, and the embodiment of the application is not limited.
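Purely as an illustrative sketch (the data structures and field names below are assumptions, not the format actually used in a DCP), the two marking schemes in the 100-frame example above could be laid out as follows:

```python
# Illustrative sketch only: possible data layouts for the two marking schemes
# described above (100 frames: frames 1-50 at 24fps, frames 51-100 at 48fps).
# The structures and field names are assumptions, not the patent's actual format.

# First implementation: one mark per video segment with the same frame rate.
segment_marks = [
    {"start_frame": 1,  "end_frame": 50,  "frame_rate_mark": "0001"},  # 24fps
    {"start_frame": 51, "end_frame": 100, "frame_rate_mark": "0010"},  # 48fps
]

# Second implementation: one mark per frame (a larger amount of mark data).
per_frame_marks = ["0001"] * 50 + ["0010"] * 50  # 100 entries, one per frame
```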
Correspondingly, during playing, the target frame rate mark corresponding to each frame may be obtained sequentially according to the playing order of the frames of the video to be played, or the target frame rate marks corresponding to multiple frames may be obtained at once. Illustratively, the target frame rate mark of the video to be played may be stored in the configuration data of the DCP, in which case the target frame rate mark of the video to be played may be obtained from that configuration data.
In some embodiments, the video to be played includes video segments with at least two frame rates, where the target frame rate markers corresponding to different video segments are different, and the obtaining of the target frame rate marker of the video to be played may include: and acquiring a target frame rate mark of the corresponding video segment.
Illustratively, the video to be played contains video segments at three frame rates, which are 24fps, 48fps and 120fps respectively. Optionally, the frame rate flag corresponding to the video segment of 24fps is, for example, 0001, the frame rate flag corresponding to the video segment of 48fps is, for example, 0010, and the frame rate flag corresponding to the video segment of 120fps is, for example, 0011, and the target frame rate flag of the video segment of each frame rate may be obtained. The target frame rate markers corresponding to different video segments included in the video to be played are stored in the configuration data, so that the target frame rate markers corresponding to the video segments can be obtained from the configuration data.
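A minimal sketch of how the target frame rate mark of the corresponding video segment might be looked up from such configuration data is given below; the configuration-data layout and the function name are assumptions made only for illustration:

```python
# Sketch only: looking up the target frame rate mark of the video segment that
# contains a given frame. The configuration-data layout and function name are
# assumptions made for illustration.
def get_segment_frame_rate_mark(config_data: list, frame_index: int) -> str:
    for segment in config_data:
        if segment["start_frame"] <= frame_index <= segment["end_frame"]:
            return segment["frame_rate_mark"]
    raise ValueError(f"no segment covers frame {frame_index}")

# Example with three segments at 24fps, 48fps and 120fps:
config_data = [
    {"start_frame": 1,   "end_frame": 50,  "frame_rate_mark": "0001"},  # 24fps
    {"start_frame": 51,  "end_frame": 100, "frame_rate_mark": "0010"},  # 48fps
    {"start_frame": 101, "end_frame": 150, "frame_rate_mark": "0011"},  # 120fps
]
print(get_segment_frame_rate_mark(config_data, 75))  # -> "0010"
```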
S202, determining the decoding frame rate as the target frame rate corresponding to the target frame rate mark.
After the target frame rate mark of the video to be played is obtained, the decoding frame rate can be determined to be the target frame rate corresponding to the target frame rate mark.
And S203, decoding the video to be played by adopting the decoding frame rate.
In practical applications, video is usually encoded to remove redundancy in the spatial and temporal dimensions, so as to facilitate storage and transmission by reducing the data volume. It should be understood that video encoding refers to processing the original video with a video coding scheme, which includes converting a file in the original video format into a file in another video format through compression. The video to be played therefore has to be decoded before actual playing.
Specifically, a decoder is configured based on the decoding frame rate to decode the video to be played. Illustratively, the image portion of a DCP is a Joint Photographic Experts Group (JPEG) 2000 compressed file, and a Field Programmable Gate Array (FPGA) based decoder may be used to decode the video to be played. It should be noted that the specific decoding manner may refer to the related art and is not described again here.
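As a hedged sketch of this step, the loop below paces a decoder at the determined decoding frame rate; the decoder interface is a hypothetical stand-in, since actual DCP decoding is performed by dedicated hardware such as the FPGA-based JPEG 2000 decoder mentioned above:

```python
import time

# Hypothetical sketch only: pacing a decoder at the determined decoding frame rate.
# The `decoder` object and its decode_next_frame() method are assumed stand-ins for
# a real JPEG 2000 / FPGA decoder interface, not an actual library API.
def decode_at_frame_rate(decoder, decoding_frame_rate: float, frame_count: int) -> list:
    frame_interval = 1.0 / decoding_frame_rate  # seconds per decoded frame
    frames = []
    next_deadline = time.monotonic()
    for _ in range(frame_count):
        frames.append(decoder.decode_next_frame())  # assumed decoder call
        next_deadline += frame_interval
        delay = next_deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)  # keep decoding in step with the decoding frame rate
    return frames
```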
S204, determining the playing parameter of the video to be played as a target playing parameter corresponding to the target frame rate.
The playing parameters corresponding to different frame rates may be the same or different. Specifically, for a given frame rate, the playing parameters that yield the best playing effect at that frame rate can be set as the playing parameters corresponding to that frame rate. In this way, during playing, the electronic device can determine the target playing parameters according to the target frame rate of the video to be played, so that the video to be played achieves a good playing effect and the visual experience of the user is further improved.
Optionally, the playing parameters may include at least one of a frame rate, a resolution, a sound format, a color space, a code rate and a refresh rate, which is not limited in this application. The frame rate is the frequency (rate) at which bitmap images, in units of frames, successively appear on the display, for example 24fps. The resolution refers to the number of pixels displayed on the screen; for example, a movie screen resolution of 2048 × 858 means 2048 horizontal pixels and 858 vertical pixels, and the higher the resolution, the larger the number of pixels and the finer the display. The sound format is the digital audio format, such as Dolby 5.1 channel or Dolby panoramic sound. The color space, also called the color gamut, is a method of encoding color and also refers to the total range of colors a technical system can produce; for example, the P3 color gamut matches as far as possible all the color gamuts that can be shown in a movie scene and better satisfies human visual experience. The code rate refers to the number of data bits transmitted per unit time; the higher the code rate, the higher the precision, for example a code rate of 250 Mbit/s. The refresh rate refers to the number of times the electron beam repeatedly scans the image on the screen: the higher the refresh rate, the more stable the displayed image (picture). The refresh rate can be divided into a vertical refresh rate and a horizontal refresh rate, and the refresh rate commonly referred to is generally the vertical refresh rate, which indicates how many times per second the image on the screen is redrawn, i.e., the number of screen refreshes per second, in Hz (hertz). The higher the refresh rate, the more stable the image, the more natural and clear the display, and the less strain on the eyes; the lower the refresh rate, the more the image flickers and jitters, and the faster the eyes become fatigued.
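Purely for illustration, the playing parameters enumerated above could be grouped into a simple record as sketched below; the field names and example values are assumptions taken from the examples in this paragraph, not normative settings:

```python
from dataclasses import dataclass

# Illustrative sketch only: one possible grouping of the playing parameters
# enumerated above. The field names and example values are assumptions.
@dataclass
class PlayingParameters:
    frame_rate_fps: float      # e.g. 24.0
    resolution: tuple          # e.g. (2048, 858), horizontal x vertical pixels
    sound_format: str          # e.g. "Dolby 5.1"
    color_space: str           # e.g. "P3"
    code_rate_mbps: int        # e.g. 250
    refresh_rate_hz: int       # e.g. 24

example_params_24fps = PlayingParameters(24.0, (2048, 858), "Dolby 5.1", "P3", 250, 24)
```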
And S205, playing the decoded video based on the target playing parameter.
According to the video processing method provided by the embodiment of the present application, the target frame rate mark of the video to be played is obtained, the decoding frame rate is determined to be the target frame rate corresponding to the target frame rate mark, the video to be played is decoded at the decoding frame rate, the playing parameters of the video to be played are determined to be the target playing parameters corresponding to the target frame rate, and the decoded video is played based on the target playing parameters. In this way, the corresponding target frame rate can be determined according to the target frame rate mark of the video to be played, and the corresponding decoding frame rate and playing parameters can then be determined according to the target frame rate, so that the video to be played is decoded and played with the decoding frame rate and playing parameters corresponding to the target frame rate, thereby improving the visual effect of video playing.
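The five steps S201 to S205 can be summarized in the following self-contained sketch; all data structures and mappings in it are illustrative assumptions rather than the actual DCP format or a real player interface:

```python
# Self-contained sketch of steps S201-S205 under assumed data structures.
# The dictionaries and the video_to_play layout are illustrative placeholders,
# not the patent's actual DCP format or a real player API.
FRAME_RATE_BY_MARK = {"0001": 24.0, "0010": 48.0, "0011": 120.0}
REFRESH_RATE_BY_FRAME_RATE = {24.0: 24, 48.0: 48, 120.0: 120}

def process_video(video_to_play: dict) -> None:
    target_mark = video_to_play["frame_rate_mark"]                  # S201: get target frame rate mark
    decoding_frame_rate = FRAME_RATE_BY_MARK[target_mark]           # S202: decoding frame rate = target frame rate
    decoded_frames = video_to_play["frames"]                        # S203: decoding stubbed out in this sketch
    refresh_rate = REFRESH_RATE_BY_FRAME_RATE[decoding_frame_rate]  # S204: target playing parameter
    print(f"Playing {len(decoded_frames)} frames at {decoding_frame_rate}fps, "
          f"refresh rate {refresh_rate}Hz")                         # S205: playback stubbed out in this sketch

process_video({"frame_rate_mark": "0010", "frames": [b"frame"] * 100})
```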
Fig. 3 is a flowchart of a video processing method according to another embodiment of the present application. On the basis of the above embodiments, how to decode and play the video to be played according to the target frame rate mark is further described in the embodiments of the present application. As shown in fig. 3, a video processing method according to an embodiment of the present application may include:
s301, obtaining configuration data of a video to be played, wherein the configuration data comprises a target frame rate mark.
Illustratively, the DCP corresponding to the video to be played includes a metadata file for storing the configuration data. The configuration data includes a target frame rate flag corresponding to each frame, for example, for a first frame of the video to be played in the configuration data, the corresponding target frame rate flag is 0001, the target frame rate flag corresponding to a second frame is 0010, and different target frame rate flags respectively correspond to different frame rates of the video to be played.
S302, obtaining the target frame rate mark of the video to be played from the configuration data.
In this embodiment, S301 and S302 are further refinements of the above step S201.
S303, acquiring the corresponding relation between the frame rate and the frame rate mark from the configuration file.
The configuration file contains the correspondence between frame rates and frame rate marks. Illustratively, as shown in Table 1, the configuration file records that frame rate mark 0001 corresponds to a frame rate of 24fps, frame rate mark 0010 corresponds to a frame rate of 48fps, frame rate mark 0011 corresponds to a frame rate of 120fps, and so on.
TABLE 1

Serial number    Frame rate marking    Frame rate
1                0001                  24fps
2                0010                  48fps
3                0011                  120fps
……               ……                    ……
S304, according to the corresponding relation between the frame rate and the frame rate mark, determining the target frame rate corresponding to the target frame rate mark.
After the target frame rate mark of the video to be played is obtained, the correspondence between frame rates and frame rate marks shown in Table 1 may be queried according to the target frame rate mark, and the target frame rate corresponding to the target frame rate mark is thereby determined.
Illustratively, if the target frame rate mark of the video to be played is 0001, and the configuration file records that the frame rate corresponding to frame rate mark 0001 is 24fps, it may be determined that the target frame rate corresponding to the target frame rate mark 0001 is 24fps, and so on.
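A minimal sketch of this lookup, assuming the Table 1 correspondence is stored as simple "mark frame-rate" lines in a configuration file (an assumed file format, not a defined standard), might look like this:

```python
# Sketch only: reading the frame rate mark <-> frame rate correspondence of
# Table 1 from a configuration file and resolving a target frame rate mark.
# The "<mark> <frame rate>" line format is an assumed example, not a standard.
def load_frame_rate_table(path: str) -> dict:
    table = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            mark, fps = line.split()   # e.g. "0001 24"
            table[mark] = float(fps)
    return table

def target_frame_rate(table: dict, target_mark: str) -> float:
    return table[target_mark]          # e.g. "0001" -> 24.0

# Usage (assuming a file frame_rates.cfg with lines like "0001 24"):
# table = load_frame_rate_table("frame_rates.cfg")
# decoding_frame_rate = target_frame_rate(table, "0001")  # 24.0
```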
S305, determining the decoding frame rate as the target frame rate.
It should be noted that steps S303 to S305 are further detailed in the above step S202.
S306, decoding the video to be played by adopting the decoding frame rate.
For a detailed description of this step, reference may be made to the related description of S203 in the embodiment shown in fig. 2, and details are not repeated here.
S307, determining target playing parameters corresponding to the target frame rate according to the corresponding relation between the frame rate and the playing parameters.
Optionally, the configuration file may further include the correspondence between frame rates and playing parameters. Illustratively, the configuration file includes Table 2 below: a frame rate of 24fps corresponds to a refresh rate of 24Hz, a frame rate of 48fps corresponds to a refresh rate of 48Hz, and so on. It should be noted that Table 2 takes the refresh rate as an example of the playing parameter, but the present application is not limited thereto.
In this case, the electronic device may query a corresponding relationship between the frame rate and the playing parameter in the configuration file according to the target frame rate, and determine the target playing parameter corresponding to the target frame rate.
TABLE 2

Serial number    Frame rate    Refresh rate
1                24fps         24Hz
2                48fps         48Hz
……               ……            ……
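Continuing the same illustrative sketch, the Table 2 correspondence could be queried as follows; the mapping simply mirrors Table 2 and the function name is an assumption:

```python
# Sketch only: resolving the target playing parameter (here the refresh rate,
# as in Table 2) for a given target frame rate.
REFRESH_RATE_BY_FRAME_RATE = {24.0: 24, 48.0: 48}  # mirrors Table 2

def target_playing_parameter(target_frame_rate: float) -> int:
    return REFRESH_RATE_BY_FRAME_RATE[target_frame_rate]  # e.g. 24.0 -> 24Hz
```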
S308, determining the playing parameters of the video to be played as target playing parameters.
Wherein, S307 and S308 are further refinements of the above step S204.
S309, playing the decoded video based on the target playing parameter.
According to the video processing method provided by the embodiment of the present application, the configuration data of the video to be played is acquired, where the configuration data includes the target frame rate mark; the target frame rate mark of the video to be played is acquired from the configuration data; the correspondence between frame rates and frame rate marks is acquired from the configuration file; the target frame rate corresponding to the target frame rate mark is determined according to that correspondence; the decoding frame rate is determined to be the target frame rate; the video to be played is decoded at the decoding frame rate; the target playing parameters corresponding to the target frame rate are determined according to the correspondence between frame rates and playing parameters; the playing parameters of the video to be played are determined to be the target playing parameters; and the decoded video is played based on the target playing parameters. In this way, the corresponding target frame rate can be determined according to the target frame rate mark of the video to be played, and the corresponding decoding frame rate and playing parameters can then be determined according to the target frame rate, so that the video to be played is decoded and played with the decoding frame rate and playing parameters corresponding to the target frame rate, thereby improving the visual effect of video playing.
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Fig. 4 is a schematic structural diagram of a video processing apparatus according to an embodiment of the present application, and as shown in fig. 4, a video processing apparatus 400 according to an embodiment of the present application includes: an acquisition module 401, a first determination module 402, a decoding module 403, a second determination module 404, and a playing module 405. Wherein:
an obtaining module 401, configured to obtain a target frame rate flag of a video to be played;
a first determining module 402, configured to determine the decoding frame rate as a target frame rate corresponding to the target frame rate mark;
a decoding module 403, configured to decode a video to be played at a decoding frame rate;
a second determining module 404, configured to determine that a playing parameter of a video to be played is a target playing parameter corresponding to a target frame rate;
a playing module 405, configured to play the decoded video based on the target playing parameter.
On the basis of any of the above illustrated embodiments, the first determining module 402 may specifically be configured to: determining a target frame rate corresponding to the target frame rate mark according to the corresponding relation between the frame rate and the frame rate mark; and determining the decoding frame rate as the target frame rate.
On the basis of any of the illustrated embodiments, the first determining module 402 may be further configured to: before determining the target frame rate corresponding to the target frame rate mark according to the corresponding relation between the frame rate and the frame rate mark, acquiring the corresponding relation between the frame rate and the frame rate mark from a configuration file.
On the basis of any of the above illustrated embodiments, the second determining module 404 may specifically be configured to: determining a target playing parameter corresponding to the target frame rate according to the corresponding relation between the frame rate and the playing parameter; and determining the playing parameters of the video to be played as target playing parameters.
On the basis of any of the above illustrated embodiments, the obtaining module 401 may specifically be configured to: acquiring configuration data of a video to be played, wherein the configuration data comprises a target frame rate mark; and acquiring the target frame rate mark of the video to be played from the configuration data.
On the basis of any one of the above-described embodiments, the video to be played includes video segments with at least two frame rates, and the target frame rates corresponding to different video segments are marked differently, and the obtaining module 401 may be specifically configured to: and acquiring a target frame rate mark of the corresponding video segment.
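Purely as an illustrative sketch of how the modules described above could be composed (all class and method names below are assumptions, not an actual implementation), consider:

```python
# Illustrative sketch only: one way the five modules described above could be
# composed into a video processing apparatus. All class and method names are
# assumptions made for this sketch, not an actual implementation.
class VideoProcessingApparatus:
    def __init__(self, acquisition, first_determining, decoding, second_determining, playing):
        self.acquisition = acquisition                # obtains the target frame rate mark
        self.first_determining = first_determining    # mark -> target (decoding) frame rate
        self.decoding = decoding                      # decodes at the decoding frame rate
        self.second_determining = second_determining  # frame rate -> target playing parameters
        self.playing = playing                        # plays with the target playing parameters

    def process(self, video_to_play):
        mark = self.acquisition.get_mark(video_to_play)
        frame_rate = self.first_determining.frame_rate_for(mark)
        decoded = self.decoding.decode(video_to_play, frame_rate)
        params = self.second_determining.parameters_for(frame_rate)
        self.playing.play(decoded, params)
```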
The apparatus of this embodiment may be configured to implement the technical solution of any one of the above-mentioned method embodiments, and the implementation principle and the technical effect are similar, which are not described herein again.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application. Illustratively, the electronic device may be provided as a computer such as a server. Referring to fig. 5, an electronic device 500 includes a processing component 501 that further includes one or more processors and memory resources, represented by memory 502, for storing instructions, such as applications, that are executable by the processing component 501. The application programs stored in memory 502 may include one or more modules that each correspond to a set of instructions. Furthermore, the processing component 501 is configured to execute instructions to perform any of the above-described method embodiments.
The electronic device 500 may also include a power component 503 configured to perform power management of the electronic device 500, a wired or wireless network interface 504 configured to connect the electronic device 500 to a network, and an input/output (I/O) interface 505. The electronic device 500 may operate based on an operating system stored in the memory 502, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
Fig. 6 is a schematic structural diagram of an electronic device according to another embodiment of the present application. Exemplarily, the electronic device may be provided as an electronic device including a display screen (screen) such as a television, a projector, or a desktop. Referring to fig. 6, electronic device 600 may include one or more of the following components: processing component 602, memory 604, power component 606, multimedia component 608, audio/video playing component 610, input/output (I/O) interface 612, decoding component 601, sensor component 614, and communication component 616.
The processing component 602 generally controls overall operation of the electronic device 600, such as operations associated with display, data communication. The bandwidth of the processing component 602 is, for example, 1G, and can be used for processing a video with a high frame rate, which is not limited in this application. The processing component 602 may include one or more processors 620 to execute instructions to perform all or a portion of the steps of the video processing method described above. Further, the processing component 602 can include one or more modules that facilitate interaction between the processing component 602 and other components. For example, the processing component 602 can include a multimedia module to facilitate interaction between the multimedia component 608 and the processing component 602. For example, the processing component 602 may include a decode module to facilitate interaction between the decode component 601 and the processing component 602. For example, the processing component 602 may include an audiovisual playback module to facilitate interaction between the audiovisual playback component 610 and the processing component 602.
The decoding component 601 is configured to perform processing such as audio and video type identification and decoding on the received audio and video signals to obtain final audio data and video data, and transmit the audio data and the video data to the audio and video playing component 610 through the processing component 602.
The audio/video playing component 610 is configured to play the received audio data and video data.
The memory 604 is configured to store various types of data to support operations at the electronic device 600. Examples of such data include instructions for any application or method operating on the electronic device 600, pictures, videos, and so forth. The Memory 604 may be implemented by any type of volatile or non-volatile Memory device or combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic Memory, flash Memory, magnetic disk, or optical disk.
Power supply component 606 provides power to the various components of electronic device 600. The power components 606 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the electronic device 600.
The multimedia component 608 includes a screen that provides an output interface between the electronic device 600 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The I/O interface 612 provides an interface between the processing component 602 and peripheral interface modules.
The sensor component 614 includes one or more sensors for providing status assessments of various aspects of the electronic device 600. For example, the sensor component 614 may detect the open/closed state of the electronic device 600 and the relative positioning of components, such as the display and keypad of the electronic device 600; it may also detect a change in position of the electronic device 600 or of a component of the electronic device 600, the presence or absence of user contact with the electronic device 600, the orientation or acceleration/deceleration of the electronic device 600, and a change in temperature of the electronic device 600. The sensor assembly 614 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 614 may also include a photosensor, such as a Complementary Metal Oxide Semiconductor (CMOS) or Charge-Coupled Device (CCD) photosensitive imaging element, for use in imaging applications. In some embodiments, the sensor assembly 614 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 616 is configured to facilitate communications between the electronic device 600 and other devices in a wired or wireless manner. The electronic device 600 may access a Wireless network based on a communication standard, such as Wireless-Fidelity (WiFi), 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 616 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the Communication component 616 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic Device 600 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), FPGAs, controllers, micro-controllers, microprocessors, or other electronic components for performing the above-described video Processing methods.
The present application also provides a computer-readable storage medium, in which computer-executable instructions are stored, and when the processor executes the computer-executable instructions, the scheme of the above video processing method is implemented.
The present application also provides a computer program product comprising a computer program which, when executed by a processor, implements an aspect of the video processing method as above.
The computer readable storage medium may be any type of volatile or non-volatile storage device or combination thereof, such as SRAM, EEPROM, EPROM, PROM, ROM, magnetic memory, flash memory, magnetic or optical disk. Readable storage media can be any available media that can be accessed by a general purpose or special purpose computer.
An exemplary readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. Of course, the readable storage medium may also be an integral part of the processor. The processor and the readable storage medium may reside in an ASIC. Of course, the processor and the readable storage medium may also reside as discrete components in a video processing apparatus.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A video processing method, comprising:
acquiring a target frame rate mark of a video to be played;
determining a decoding frame rate as a target frame rate corresponding to the target frame rate mark;
decoding the video to be played by adopting the decoding frame rate;
determining the playing parameters of the video to be played as target playing parameters corresponding to the target frame rate;
and playing the decoded video based on the target playing parameter.
2. The method of claim 1, wherein the determining the decoding frame rate as the target frame rate corresponding to the target frame rate flag comprises:
determining a target frame rate corresponding to the target frame rate mark according to the corresponding relation between the frame rate and the frame rate mark;
and determining the decoding frame rate as the target frame rate.
3. The video processing method according to claim 2, wherein before determining the target frame rate corresponding to the target frame rate tag according to the correspondence between the frame rates and the frame rate tags, the method further comprises:
and acquiring the corresponding relation between the frame rate and the frame rate mark from the configuration file.
4. The video processing method according to any one of claims 1 to 3, wherein the determining that the playing parameter of the video to be played is a target playing parameter corresponding to the target frame rate includes:
determining a target playing parameter corresponding to the target frame rate according to the corresponding relation between the frame rate and the playing parameter;
and determining the playing parameters of the video to be played as the target playing parameters.
5. The video processing method according to any one of claims 1 to 3, wherein the obtaining the target frame rate flag of the video to be played comprises:
acquiring configuration data of the video to be played, wherein the configuration data comprises the target frame rate mark;
and acquiring the target frame rate mark of the video to be played from the configuration data.
6. The video processing method according to any one of claims 1 to 3, wherein the video to be played includes video segments with at least two frame rates, the target frame rate tags corresponding to different video segments are different, and the obtaining the target frame rate tag of the video to be played includes:
and acquiring a target frame rate mark of the corresponding video segment.
7. A video processing apparatus, comprising:
the acquisition module is used for acquiring a target frame rate mark of a video to be played;
a first determining module, configured to determine a decoding frame rate as a target frame rate corresponding to the target frame rate mark;
the decoding module is used for decoding the video to be played by adopting the decoding frame rate;
the second determining module is used for determining that the playing parameters of the video to be played are target playing parameters corresponding to the target frame rate;
and the playing module is used for playing the decoded video based on the target playing parameter.
8. An electronic device, comprising: a memory and a processor;
the memory is to store program instructions;
the processor is configured to invoke program instructions in the memory to perform the video processing method of any of claims 1 to 6.
9. A computer-readable storage medium having computer program instructions stored therein which, when executed, implement the video processing method of any of claims 1 to 6.
10. A computer program product comprising a computer program, characterized in that the computer program realizes the video processing method according to any one of claims 1 to 6 when executed by a processor.
CN202110361083.7A 2021-04-02 2021-04-02 Video processing method and device Active CN113099237B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110361083.7A CN113099237B (en) 2021-04-02 2021-04-02 Video processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110361083.7A CN113099237B (en) 2021-04-02 2021-04-02 Video processing method and device

Publications (2)

Publication Number Publication Date
CN113099237A true CN113099237A (en) 2021-07-09
CN113099237B CN113099237B (en) 2023-06-27

Family

ID=76673847

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110361083.7A Active CN113099237B (en) 2021-04-02 2021-04-02 Video processing method and device

Country Status (1)

Country Link
CN (1) CN113099237B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101072339A (en) * 2007-06-12 2007-11-14 北京中星微电子有限公司 Method and system for controlling play frame rate synchronization
US20190042856A1 (en) * 2014-03-07 2019-02-07 Dean Drako Surveillance Video Activity Summary System and Access Method of operation (VASSAM)
CN106162182A (en) * 2015-03-25 2016-11-23 杭州海康威视数字技术股份有限公司 The control method for playing back of a kind of Video coding code stream and system
CN107172486A (en) * 2017-05-24 2017-09-15 维沃移动通信有限公司 A kind of video encoding/decoding method and mobile terminal
US20200296453A1 (en) * 2017-12-06 2020-09-17 Hong Kong Liveme Corporation Limited Video playing method and apparatus, and electronic device
CN109688461A (en) * 2019-01-16 2019-04-26 京东方科技集团股份有限公司 Video broadcasting method and device
CN110225405A (en) * 2019-07-12 2019-09-10 青岛一舍科技有限公司 A kind of panoramic video playback method and device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113867668A (en) * 2021-09-22 2021-12-31 Lenovo (Beijing) Co., Ltd. An information processing method and electronic device
CN114938461A (en) * 2022-04-01 2022-08-23 网宿科技股份有限公司 Video processing method, device and equipment and readable storage medium

Also Published As

Publication number Publication date
CN113099237B (en) 2023-06-27

Similar Documents

Publication Publication Date Title
US11758187B2 (en) Methods, devices and stream for encoding and decoding volumetric video
US11025955B2 (en) Methods, devices and stream for encoding and decoding volumetric video
US11463700B2 (en) Video picture processing method and apparatus
US11245939B2 (en) Generating and transmitting metadata for virtual reality
CN110677672B (en) Method and system for encoding video with overlay
US11122245B2 (en) Display apparatus, method for controlling the same and image providing apparatus
KR102617258B1 (en) Image processing method and apparatus
US20180192063A1 (en) Method and System for Virtual Reality (VR) Video Transcode By Extracting Residual From Different Resolutions
CN109478344B (en) Method and apparatus for synthesizing image
CN102918855B (en) For the method and apparatus of the activity space of reasonable employment frame packing form
CN108063976B (en) Video processing method and device
US20160316274A1 (en) Method and system for switching video playback resolution
US20180102082A1 (en) Apparatus, system, and method for video creation, transmission and display to reduce latency and enhance video quality
CN113099237B (en) Video processing method and device
US11494162B2 (en) Display apparatus and audio outputting method
US20180012369A1 (en) Video overlay modification for enhanced readability
US20220150543A1 (en) Method and apparatus for depth encoding and decoding
CN114466228B (en) Method, equipment and storage medium for improving smoothness of screen projection display
US20130258051A1 (en) Apparatus and method for processing 3d video data
US20240013475A1 (en) Transparency range for volumetric video
CN108810574B (en) Video information processing method and terminal
WO2021052040A1 (en) Video image enhancement method, device, apparatus, chip, and storage medium
CN112328145A (en) Image display method, apparatus, device, and computer-readable storage medium
CN114745597A (en) Video processing method and apparatus, electronic device, and computer-readable storage medium
CN113630638A (en) Method and device for processing virtual reality data of television

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant