CN110996088A - Video processing method and related device - Google Patents
- Publication number
- CN110996088A (application number CN201911252493.7A / CN201911252493A)
- Authority
- CN
- China
- Prior art keywords
- data
- party application
- media service
- service module
- metadata
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H04N13/261 — Image signal generators with monoscopic-to-stereoscopic image conversion
- H04N13/275 — Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
- H04N13/296 — Synchronisation or control of image signal generators
- H04N21/4307 — Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
- H04N21/4402 — Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/4405 — Processing of video elementary streams involving video stream decryption
- H04N21/816 — Monomedia components involving special video data, e.g. 3D video
Abstract
The embodiment of the application discloses a video processing method and a related device, wherein the method comprises the following steps: the third-party application acquires target data and sends the target data to the media service module; the media service module receives the target data and synchronizes the first metadata and the second metadata to obtain synchronized video data, then calls a pre-enabled three-dimensional (3D) video processing algorithm module to process the synchronized video data to obtain target video data and sends the target video data to the third-party application, wherein the 3D video processing algorithm module is a function-enhanced algorithm module that the third-party application selects through the media service module and requests the operating system to open to the application; the third-party application receives the target video data. According to the embodiment of the application, the 3D video function is realized through the OMedia framework, and the implementation cost for third-party applications is reduced.
Description
Technical Field
The present application relates to the field of electronic devices, and in particular, to a video processing method and related apparatus.
Background
At present, various third-party applications are increasingly widely used on electronic equipment. Third-party applications can access underlying data through a standard application interface; however, the current standard Android API does not support 3D video, and adding an interface to the standard API would affect system upgrades.
Disclosure of Invention
The embodiment of the application provides a video processing method and a related device, which aim to realize a 3D video function through the OMedia framework and reduce the implementation cost for third-party applications.
In a first aspect, an embodiment of the present application provides a video processing method, which is applied to an electronic device, where the electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application; the method comprises the following steps:
the third party application acquires target data and sends the target data to the media service module, wherein the target data comprises first metadata acquired by a first camera of the electronic equipment and second metadata acquired by a second camera;
the media service module receives the target data and synchronizes the first metadata and the second metadata to obtain synchronized video data; it then calls a pre-enabled three-dimensional (3D) video processing algorithm module to process the synchronized video data to obtain target video data and sends the target video data to the third-party application, wherein the 3D video processing algorithm module is a function-enhanced algorithm module that the third-party application selects through the media service module and requests the operating system to open to the application;
the third party application receives the target video data.
In a second aspect, an embodiment of the present application provides a video processing apparatus, which is applied to an electronic device, where the electronic device includes a media service module and an operating system, and an application layer of the operating system is provided with a third-party application; the apparatus comprises a processing unit and a communication unit, wherein,
the processing unit is used for the third-party application to acquire target data and send the target data to the media service module, wherein the target data comprises first metadata acquired by a first camera of the electronic equipment and second metadata acquired by a second camera of the electronic equipment; for the media service module to receive the target data and synchronize the first metadata and the second metadata to obtain synchronized video data, and to call a pre-enabled three-dimensional (3D) video processing algorithm module to process the synchronized video data to obtain target video data and send the target video data to the third-party application, wherein the 3D video processing algorithm module is a function-enhanced algorithm module that the third-party application selects through the media service module and requests the operating system to open to the application; and for the third-party application to receive the target video data.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for executing steps in any method of the first aspect of the embodiment of the present application.
In a fourth aspect, an embodiment of the present application provides a chip, including a processor configured to call and run a computer program from a memory, so that a device provided with the chip executes part or all of the steps described in any method of the first aspect of the embodiment of the present application.
In a fifth aspect, this application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program enables a computer to perform some or all of the steps described in any one of the methods of the first aspect of this application.
In a sixth aspect, the present application provides a computer program product, wherein the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform some or all of the steps as described in any one of the methods of the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiment of the present application, a media service module and an operating system are arranged in an electronic device, and the application layer of the operating system is provided with a third-party application. The third-party application first acquires target data and sends the target data to the media service module, wherein the target data comprises first metadata collected by a first camera of the electronic equipment and second metadata collected by a second camera. The media service module then receives the target data and synchronizes the first metadata and the second metadata to obtain synchronized video data, calls a pre-enabled three-dimensional (3D) video processing algorithm module to process the synchronized video data to obtain target video data, and sends the target video data to the third-party application, wherein the 3D video processing algorithm module is a function-enhanced algorithm module that the third-party application selects through the media service module and requests the operating system to open to the third-party application. Finally, the third-party application receives the target video data. Thus, in the embodiment of the application, the third-party application of the electronic device acquires the metadata of the two cameras from the bottom layer, the metadata are synchronized, and the synchronized metadata are processed by the three-dimensional (3D) video processing algorithm module to obtain the required 3D video data. That is, the 3D video function is realized through the OMedia framework, which reduces the implementation cost for third-party applications, keeps control secure, and facilitates safe video processing.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings in the following description are only some embodiments of the present application; other drawings can be obtained from them by those skilled in the art without creative efforts.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a video processing method according to an embodiment of the present application;
fig. 3 is a schematic flowchart of another video processing method provided in the embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 5 is a block diagram of functional units of a video processing apparatus according to an embodiment of the present disclosure.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The electronic device according to the embodiments of the present application may be an electronic device with communication capability, and the electronic device may include various handheld devices with wireless communication function, vehicle-mounted devices, wearable devices, computing devices or other processing devices connected to a wireless modem, and various forms of User Equipment (UE), Mobile Stations (MS), terminal devices (terminal device), and so on.
Currently, on the Android platform, a third-party camera application can access underlying application data through a standard Android Application Programming Interface (API). However, if a user wants to use more of the underlying enhanced functions, or an image processed by an algorithm, there is no corresponding standard interface that maps the underlying capabilities to third parties for access. Ensuring security after opening the underlying core functions is also very important, and current schemes grant authorization by means such as whitelists.
On platforms developed today, the authorization method between the two parties remains simple and insecure, even after a Software Development Kit (SDK) is released or close cooperation is established. The system is therefore vulnerable to attacks such as tampering and simulation, which can greatly affect it.
In view of the foregoing problems, embodiments of the present application provide a video processing method and a related apparatus, and the following describes embodiments of the present application in detail with reference to the accompanying drawings.
As shown in fig. 1, an electronic device 100 according to an embodiment of the present application includes a media service module and an operating system, where the operating system may be an Android system. The application layer of the operating system is provided with a third-party application and a media management module (also referred to as a media interface module). The hardware abstraction layer of the operating system is provided with a hardware abstraction module (an Android native module, such as the native camera hardware abstraction module CameraHAL), a media policy module, and an algorithm management module. Further, the operating system's native architecture also includes a framework layer and a driver layer. The framework layer includes the application interfaces of various native applications (such as the native camera application program interface), application services (such as the native camera service), and a framework layer interface (such as the Google HAL3 interface). The hardware abstraction layer includes a hardware abstraction layer interface (such as HAL3.0) and the hardware abstraction modules of various native applications (such as the camera hardware abstraction module). The driver layer includes various drivers (e.g., a screen display driver, an audio driver, etc.) for enabling various hardware of the electronic device, such as the image signal processor (ISP) and the front-end image sensors.
The media service module is independent of the operating system. Third-party applications can communicate with the media service module through the media management module. The media service module can communicate with the media policy module through an Android native information link formed by the application interface, the application service, the framework layer interface, the hardware abstraction layer interface, and the hardware abstraction module. The media policy module communicates with the algorithm management module, which maintains an Android native algorithm library; the algorithm library comprises the enhanced functions supported by the various native applications. For example, the native camera application supports enhanced functions such as binocular shooting, beautification, sharpening, and night vision. In addition, the media service module can also communicate directly with the media policy module or the algorithm management module.
Based on the above framework, the media service module may enable the algorithm module in the algorithm library through the android native information link, the media policy module, and the algorithm management module, or enable the algorithm module in the algorithm library directly through the media policy module and the algorithm management module, or enable the algorithm module in the algorithm library directly through the algorithm management module, thereby implementing an enhanced function of opening native application association for third-party applications.
Based on the above framework, the media service module may invoke the driver of the application to enable some hardware through an android native information link, or through a first information link composed of the media policy module and the hardware abstraction module, or through a second information link composed of the media policy module, the algorithm management module, and the hardware abstraction module, thereby implementing opening native application-related hardware for a third party application.
Referring to fig. 2, fig. 2 is a flowchart illustrating a video processing method according to an embodiment of the present disclosure, where the video processing method can be applied to the electronic device shown in fig. 1. As shown, the video processing method includes the following operations.
S201, the third party application acquires target data and sends the target data to the media service module, wherein the target data comprises first metadata collected by a first camera of the electronic equipment and second metadata collected by a second camera.
The third-party application may be various applications that need to use the underlying application data of the electronic device, such as a camera application and a map application.
The metadata comprises a time point and the image data corresponding to that time point, i.e., metadata A = time point A + image data A.
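The metadata layout just described can be sketched as a small record type pairing a time point with its image data. The field names below are illustrative assumptions, not the patent's own identifiers.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Metadata:
    """Hypothetical sketch: one camera metadata record = time point + image data."""
    time_point: float   # capture timestamp reported by the camera
    image_data: bytes   # raw frame captured at that time point

# One record from the first camera: "metadata A = time point A + image data A"
frame_a = Metadata(time_point=13.00, image_data=b"\x00\x01")
print(frame_a.time_point)  # 13.0
```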
Optionally, the electronic device further includes an underlying driver, and the third-party application acquiring the target data comprises the following steps: the third-party application sends a data request to the underlying driver; the underlying driver receives the data request and reports the target data; and the third-party application receives the target data.
Wherein the third party application is in communication connection with the underlying driver.
S202, the media service module receives the target data and synchronizes the first metadata and the second metadata to obtain synchronized video data; it then calls a pre-enabled three-dimensional (3D) video processing algorithm module to process the synchronized video data to obtain target video data, and sends the target video data to the third-party application, wherein the 3D video processing algorithm module is a function-enhanced algorithm module that the third-party application selects through the media service module and requests the operating system to open to the application.
Optionally, the third-party application sends a capability acquisition request carrying the media platform version information to the media service module; the media service module receives the capability acquisition request, queries the application capability list for that media platform version information, and sends the application capability list to the third-party application; the third-party application receives the application capability list, queries it to obtain the enhanced functions that the current media platform supports for the third-party application, and determines, among these enhanced functions, the enhanced functions selected to be opened.
Optionally, the media service module converts the first configuration information of the enhanced function selected to be opened into second configuration information that can be recognized by the algorithm management module; and the media service module sends the second configuration information to the algorithm management module through the media policy module.
Optionally, the algorithm management module enables the algorithm module of the enhanced function selected to be opened, and includes: and enabling the algorithm module of the selected open enhanced function by the algorithm management module according to the second configuration information.
S203, the third party application receives the target video data.
Optionally, the electronic device further includes a media management module, where the media management module is disposed in the application layer; the third party application sending the target data to the media service module, including: and the third-party application sends the target data to the media service module through the media management module.
It can be seen that, in the embodiment of the present application, a media service module and an operating system are arranged in an electronic device, and the application layer of the operating system is provided with a third-party application. The third-party application first acquires target data and sends the target data to the media service module, wherein the target data comprises first metadata collected by a first camera of the electronic equipment and second metadata collected by a second camera. The media service module then receives the target data and synchronizes the first metadata and the second metadata to obtain synchronized video data, calls a pre-enabled three-dimensional (3D) video processing algorithm module to process the synchronized video data to obtain target video data, and sends the target video data to the third-party application, wherein the 3D video processing algorithm module is a function-enhanced algorithm module that the third-party application selects through the media service module and requests the operating system to open to the third-party application. Finally, the third-party application receives the target video data. Thus, in the embodiment of the application, the third-party application of the electronic device acquires the metadata of the two cameras from the bottom layer, the metadata are synchronized, and the synchronized metadata are processed by the three-dimensional (3D) video processing algorithm module to obtain the required 3D video data. That is, the 3D video function is realized through the OMedia framework, which reduces the implementation cost for third-party applications, keeps control secure, and facilitates safe video processing.
In one possible example, the receiving, by the media service module, the target data, and synchronizing the first metadata and the second metadata to obtain synchronized video data includes: the media service module analyzes the first metadata to obtain a first time point and first image data, and analyzes the second metadata to obtain a second time point and second image data; and comparing the first time point with the second time point, and if the difference value is equal to 0, synchronizing the first image data and the second image data to obtain synchronized video data.
Wherein the first metadata and the second metadata may correspond to data collected from a primary camera and a secondary camera, respectively.
For example, the first metadata acquired from the main camera includes time point 13.00 and the first image data corresponding to 13.00, while the second metadata acquired from the auxiliary camera includes time point 13.01 and the second image data corresponding to 13.01. The difference between 13.00 and 13.01 is 0.01, which is not 0, so the data are not synchronized. Other second metadata are then acquired from the auxiliary camera, including time point 13.00 and the corresponding third image data; the first image data and the third image data are taken as synchronized data for the 3D video processing algorithm, which then yields the target video data.
Therefore, in this example, the media service module synchronizes the obtained metadata to obtain data that conforms to the 3D video processing algorithm, so that an unclear or distorted video is not produced by mistakenly feeding unsynchronized data to the 3D video processing algorithm. This reduces the implementation cost for third-party applications, keeps control secure, and facilitates safe video processing.
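The timestamp comparison in the example above can be sketched as follows. This is a minimal illustration, assuming metadata are represented as (time point, image data) pairs and that the auxiliary camera's candidates are searched for a time difference of exactly 0; the function and variable names are not from the patent.

```python
def synchronize(first_meta, second_candidates, tolerance=0.0):
    """Pair the main-camera frame with an auxiliary-camera frame whose
    time point differs by 0 (within tolerance); return None if no match."""
    t1, img1 = first_meta
    for t2, img2 in second_candidates:
        if abs(t1 - t2) <= tolerance:   # difference equal to 0 => synchronized
            return (img1, img2)
    return None

main = (13.00, "first image data")
aux = [(13.01, "second image data"),    # difference 0.01 -> not synchronized
       (13.00, "third image data")]     # difference 0    -> synchronized
print(synchronize(main, aux))  # ('first image data', 'third image data')
```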
In one possible example, the media service module verifying the authentication code, with the verification passing, includes: the media service module acquires the preconfigured asymmetric private key of the third-party application; the media service module decrypts the authentication code with the asymmetric private key to obtain the APP signature key, the system date, and the agreed field of the third-party application; and the media service module determines that the verification passes according to the APP signature key, the system date, and the agreed field.
The asymmetric private key is one of the keys of a key pair in asymmetric encryption. An asymmetric encryption algorithm needs two keys: a public key and a private key. The public key and the private key form a pair; if data is encrypted with the public key, it can be decrypted only with the corresponding private key. Because two different keys are used for encryption and decryption, this kind of algorithm is called an asymmetric encryption algorithm. The basic process of exchanging confidential information with an asymmetric encryption algorithm is as follows: the first party generates a key pair and publishes the public key; any other party that needs to send confidential information to the first party (say, the second party) encrypts the information with the public key and sends the ciphertext to the first party, and the first party decrypts it with the private key.
In a specific implementation, the APP signature key can be understood as the permission for installing the third-party application. After the third-party application is downloaded to the electronic device, its information is sent to an internal server, and the internal server encrypts the third-party application's APP signature key, the system date, the agreed field, and other information (which may also include information such as the authorization duration) with the asymmetric public key to obtain the authentication code. After the media service module receives the authentication code from the third-party application, it obtains the preconfigured private key of the third-party application corresponding to that public key, decrypts the authentication code with the asymmetric private key to obtain the APP signature key, the system date, the agreed field, and other information, and the system then checks this information; if the checks succeed, the verification passes.
It can be seen that, in this example, the media service module first obtains the preconfigured asymmetric private key, decrypts the authentication code with that private key to obtain the APP signature key, the system date, and the agreed field of the third-party application, and then determines from these that the verification is passed. Because authorization is granted through asymmetric encryption and decryption, access can be controlled effectively, which helps keep the opening of the underlying core functions secure.
In one possible example, the authentication code is an RSA encrypted ciphertext.
The RSA algorithm is one of the asymmetric encryption algorithms; it uses long keys and provides high security.
In this example, the authentication code is an RSA encrypted ciphertext, which is beneficial to improving security.
Referring to fig. 3, fig. 3 is a flowchart illustrating another video processing method according to an embodiment of the present disclosure, where the video processing method can be applied to the electronic device shown in fig. 1.
As shown in the figure, the video processing method includes the following operations:
S301, the third-party application sends a data request to the underlying driver.
S302, the underlying driver receives the data request and reports the target data.
S303, the third-party application receives the target data.
S304, the media service module analyzes the first metadata to obtain a first time point and first image data, and analyzes the second metadata to obtain a second time point and second image data.
S305, the media service module compares the first time point with the second time point, and if the difference is equal to 0, synchronizes the first image data and the second image data to obtain synchronized video data.
S306, the media service module calls a pre-enabled three-dimensional (3D) video processing algorithm module to process the synchronized video data to obtain target video data, and sends the target video data to the third-party application.
S307, the third-party application receives the target video data.
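Steps S304–S305 above can be sketched as follows: parse each camera's metadata into a time point and image data, then keep only the frame pairs whose time points differ by 0 before handing them to the 3D algorithm. The metadata layout (dictionaries carrying a time point `"ts"` and image data `"img"`) is an assumption for illustration.

```python
def parse(metadata):
    """S304: split one metadata item into a time point and image data."""
    return metadata["ts"], metadata["img"]

def synchronize(first_stream, second_stream):
    """S305: pair frames from the two cameras whose time points coincide."""
    second_by_ts = {parse(m)[0]: parse(m)[1] for m in second_stream}
    synced = []
    for m in first_stream:
        ts, img1 = parse(m)
        if ts in second_by_ts:           # difference between time points is 0
            synced.append((ts, img1, second_by_ts[ts]))
    return synced

# Hypothetical frames from the two cameras (timestamps in milliseconds):
first = [{"ts": 0, "img": "L0"}, {"ts": 33, "img": "L1"}, {"ts": 66, "img": "L2"}]
second = [{"ts": 0, "img": "R0"}, {"ts": 34, "img": "R1"}, {"ts": 66, "img": "R2"}]
pairs = synchronize(first, second)       # the mismatched 33/34 ms frames drop out
```

Discarding unmatched frames is what prevents the mismatched data from reaching the 3D algorithm and producing an unclear or distorted video.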
It can be seen that, in the embodiment of the present application, the electronic device is provided with a media service module and an operating system, and a third-party application is arranged at the application layer of the operating system. The third-party application first acquires target data and sends it to the media service module, where the target data includes first metadata collected by a first camera of the electronic device and second metadata collected by a second camera. The media service module then receives the target data and synchronizes the first metadata and the second metadata to obtain synchronized video data; it calls a pre-enabled three-dimensional (3D) video processing algorithm module to process the synchronized video data into target video data and sends the target video data to the third-party application, the 3D video processing algorithm module being an enhanced-function algorithm module that the operating system opens to third-party applications and that the third-party application selects and requests through the media service module. Finally, the third-party application receives the target video data. In other words, the third-party application of the electronic device acquires the metadata of the two cameras from the bottom layer, the metadata is synchronized, and the synchronized data is processed by the 3D video processing algorithm module to obtain the required 3D video data; the 3D video function is thus realized through the media service framework, which reduces the implementation cost for third-party applications, keeps the function under secure control, and facilitates secure video processing.
In addition, the media service module synchronizes the acquired metadata so that the data fed to the 3D video processing algorithm conforms to its requirements, which prevents the unclear or distorted video that would result from running mismatched data through the algorithm; this reduces the implementation cost for third-party applications, keeps the function under secure control, and facilitates secure video processing.
Referring to fig. 4, fig. 4 is a schematic structural diagram of an electronic device 400 according to an embodiment of the present disclosure, and as shown in the figure, the electronic device 400 includes an application processor 410, a memory 420, a communication interface 430, and one or more programs 421, where the one or more programs 421 are stored in the memory 420 and configured to be executed by the application processor 410, and the one or more programs 421 include instructions for executing any step in the foregoing method embodiment.
In one possible example, the program 421 includes instructions for performing the following steps: the third-party application acquires target data and sends the target data to the media service module, where the target data includes first metadata collected by a first camera of the electronic device and second metadata collected by a second camera; the media service module receives the target data and synchronizes the first metadata and the second metadata to obtain synchronized video data; the media service module calls a pre-enabled three-dimensional (3D) video processing algorithm module to process the synchronized video data to obtain target video data and sends the target video data to the third-party application, where the 3D video processing algorithm module is an enhanced-function algorithm module that the operating system opens to third-party applications and that the third-party application selects and requests through the media service module; and the third-party application receives the target video data.
In one possible example, in terms of the media service module receiving the target data and synchronizing the first metadata and the second metadata to obtain synchronized video data, the instructions in the program 421 are specifically configured to perform the following operations: the media service module analyzes the first metadata to obtain a first time point and first image data, and analyzes the second metadata to obtain a second time point and second image data; and compares the first time point with the second time point, and if the difference is equal to 0, synchronizes the first image data and the second image data to obtain the synchronized video data.
In one possible example, the authentication code is an RSA encrypted ciphertext.
In one possible example, the electronic device further comprises an underlying driver; in terms of the third-party application acquiring the target data, the program 421 includes instructions for performing the following operations: the third-party application sends a data request to the underlying driver; the underlying driver receives the data request and reports the target data; and the third-party application receives the target data.
In one possible example, the electronic device further comprises a media management module, the media management module being disposed at the application layer; in terms of the third-party application sending the target data to the media service module, the instructions in the program 421 are specifically configured to perform the following operations: and the third-party application sends the target data to the media service module through the media management module.
In one possible example, in terms of the media service module verifying the authentication code and determining that the verification is passed, the instructions in the program 421 are specifically configured to perform the following operations: the media service module acquires the preconfigured asymmetric private key of the third-party application; the media service module decrypts the authentication code using the asymmetric private key to obtain the APP signature key, the system date, and the agreed field of the third-party application; and the media service module determines that the verification is passed according to the APP signature key, the system date, and the agreed field.
In one possible example, the authentication code processed by the instructions in the program 421 is an RSA encrypted ciphertext.
The above description has introduced the solution of the embodiments of the present application mainly from the perspective of the method-side implementation process. It is understood that, in order to realize the above functions, the electronic device comprises corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art will readily appreciate that the various illustrative units and algorithm steps described in connection with the embodiments provided herein can be implemented as hardware or as a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Fig. 5 is a block diagram of functional units of a video processing apparatus 500 according to an embodiment of the present application. The video processing apparatus 500 is applied to an electronic device that includes a media service module and an operating system, with a third-party application arranged at the application layer of the operating system. The video processing apparatus includes a processing unit 501 and a communication unit 502, where the processing unit 501 is configured to execute any step in the above method embodiments and, when performing data transmission such as sending, optionally invokes the communication unit 502 to complete the corresponding operation. Details are described below.
The processing unit 501 is configured to: acquire, by the third-party application, target data and send the target data to the media service module, where the target data includes first metadata collected by a first camera of the electronic device and second metadata collected by a second camera of the electronic device; receive, by the media service module, the target data and synchronize the first metadata and the second metadata to obtain synchronized video data; call a pre-enabled three-dimensional (3D) video processing algorithm module to process the synchronized video data to obtain target video data and send the target video data to the third-party application, where the 3D video processing algorithm module is an enhanced-function algorithm module that the operating system opens to third-party applications and that the third-party application selects and requests through the media service module; and receive, by the third-party application, the target video data.
In a possible example, in terms of the media service module receiving the target data and synchronizing the first metadata and the second metadata to obtain synchronized video data, the processing unit 501 is specifically configured to: analyze, by the media service module, the first metadata to obtain a first time point and first image data, and analyze the second metadata to obtain a second time point and second image data; and compare the first time point with the second time point, and if the difference is equal to 0, synchronize the first image data and the second image data to obtain the synchronized video data.
In a possible example, the electronic device further includes an underlying driver, and in terms of the third-party application acquiring the target data, the processing unit 501 is specifically configured to: send, by the third-party application, a data request to the underlying driver; receive, by the underlying driver, the data request and report the target data; and receive, by the third-party application, the target data.
In one possible example, the electronic device further comprises a media management module, the media management module being disposed at the application layer; in terms of the third-party application sending the target data to the media service module, the processing unit 501 is specifically configured to send the target data to the media service module through the media management module by the third-party application.
In one possible example, in terms of the media service module verifying the authentication code and determining that the verification is passed, the processing unit 501 is specifically configured to: obtain, by the media service module, the preconfigured asymmetric private key of the third-party application; decrypt, by the media service module, the authentication code using the asymmetric private key to obtain the APP signature key, the system date, and the agreed field of the third-party application; and determine, by the media service module, that the verification is passed according to the APP signature key, the system date, and the agreed field.
In a possible example, the authentication code processed by the processing unit 501 is an RSA encrypted ciphertext.
The video processing apparatus 500 may further include a storage unit 503 for storing program codes and data of the electronic device. The processing unit 501 may be a processor, the communication unit 502 may be a touch display screen or a transceiver, and the storage unit 503 may be a memory.
It can be understood that, since the method embodiment and the apparatus embodiment are different presentation forms of the same technical concept, the content of the method embodiment portion in the present application should be synchronously adapted to the apparatus embodiment portion, and is not described herein again.
Embodiments of the present application further provide a chip, where the chip includes a processor, configured to call and run a computer program from a memory, so that a device in which the chip is installed performs some or all of the steps described in the electronic device in the above method embodiments.
Embodiments of the present application also provide a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, the computer program enabling a computer to execute part or all of the steps of any one of the methods described in the above method embodiments, and the computer includes an electronic device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package, the computer comprising an electronic device.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative; for instance, the division of the units is only a division of logical functions, and other divisions may be used in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing associated hardware; the program may be stored in a computer-readable memory, which may include a flash memory disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.
Claims (10)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911252493.7A CN110996088B (en) | 2019-12-09 | 2019-12-09 | Video processing method and related device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110996088A true CN110996088A (en) | 2020-04-10 |
CN110996088B CN110996088B (en) | 2021-06-29 |
Family
ID=70091467
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911252493.7A Active CN110996088B (en) | 2019-12-09 | 2019-12-09 | Video processing method and related device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110996088B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107948702A (en) * | 2017-11-21 | 2018-04-20 | Guangzhou Kugou Computer Technology Co., Ltd. | Synchronous method, device, terminal and the storage medium of Application Status |
WO2018140053A1 (en) * | 2017-01-30 | 2018-08-02 | Rovi Guides, Inc. | Systems and methods for enabling settings sharing between applications based on relative distance of application icon placement |
CN108833902A (en) * | 2018-07-17 | 2018-11-16 | Chengdu Taimeng Software Co., Ltd. | A method of realizing that ordinary screen shows 3D rendering by crossfire |
CN109640180A (en) * | 2018-12-12 | 2019-04-16 | Shanghai Weizhou Microelectronics Technology Co., Ltd. | Method, apparatus, equipment and the storage medium of video 3D display |
Also Published As
Publication number | Publication date |
---|---|
CN110996088B (en) | 2021-06-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8166300B2 (en) | Extending the DRM realm to external devices | |
US10182255B2 (en) | Method, terminal, and system for communication pairing of a digital television terminal and a mobile terminal | |
EP3772700A1 (en) | Method and device for encrypting model of neural network, and storage medium | |
CN105207774B (en) | The cryptographic key negotiation method and device of verification information | |
JP6612322B2 (en) | Data processing method and data processing apparatus | |
CN105119888B (en) | Plug-in unit installation kit method for uploading, installation method and device | |
WO2021115038A1 (en) | Application data processing method and related apparatus | |
CN105634737B (en) | Data transmission method, terminal and system | |
CN107766701B (en) | Electronic equipment, dynamic library file protection method and device | |
US9524394B2 (en) | Method and apparatus for providing provably secure user input/output | |
CN111935166B (en) | Communication authentication method, system, electronic device, server, and storage medium | |
CN113055169B (en) | Data encryption method and device, electronic equipment and storage medium | |
CN105142139A (en) | Method and device for obtaining verification information | |
CN113282951A (en) | Security verification method, device and equipment for application program | |
CN115037552A (en) | Authentication method, device, equipment and storage medium | |
CN109618313B (en) | Vehicle-mounted Bluetooth device and connection method and system thereof | |
CN113141333A (en) | Communication method, device, server, system and storage medium for network access device | |
CN104331672A (en) | Method and device for performing confidential treatment on pictures upon bracelet | |
CN111062025B (en) | Application data processing method and related device | |
CN110996088B (en) | Video processing method and related device | |
CN108696355B (en) | A method and system for preventing the theft of user avatars | |
CN108924136B (en) | Authorization authentication method, device and storage medium | |
CN113315844A (en) | File encryption transmission method, device, equipment and computer readable storage medium | |
CN112131597A (en) | Method and device for generating encrypted information and intelligent equipment | |
CN112182624B (en) | Encryption method, encryption device, storage medium and electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||