CN113965693A - A video shooting method, device, storage medium and program product - Google Patents
A video shooting method, device, storage medium and program product
- Publication number
- CN113965693A (application CN202110926459.4A)
- Authority
- CN
- China
- Prior art keywords
- parameter
- shooting
- camera application
- running state
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
Abstract
Embodiments of the application provide a video shooting method, device, storage medium and program product. The method includes: in response to a shooting end operation, stopping shooting the video image and determining the running state of the camera application; determining parameter recovery information of the camera application according to the running state of the camera application, where the parameter recovery information is information about the shooting parameters in the camera application that need to be restored to a preset initial value; and restoring a first target shooting parameter in the camera application to a preset initial value according to the parameter recovery information of the camera application, where the first target shooting parameter is at least one of the shooting parameters indicated in the parameter recovery information. The method and device restore part of the shooting parameters to their initial values automatically, so no manual restoration by the user is needed, user operations are reduced, the intelligence of the electronic device is improved, and user experience is thereby improved.
Description
Technical Field
The present application relates to the field of computer technology, and in particular, to a video capture method, apparatus, storage medium, and program product.
Background
With the development of science and technology, mobile terminals generally support photographing and video recording, and front and rear cameras have become standard features. During video shooting, a user can personalize the shot by changing certain shooting parameters. For example, filters add special effects to the video, and the user can achieve a personalized look by selecting filters with different effects.
However, in the prior art, if the user changes shooting parameters during video shooting, for example selects a particular filter or changes the zoom parameter, these parameters are not restored to their default values after shooting ends and the shooting process is closed. Instead, the values set during the last shooting are kept. As a result, every time the user changes the shooting parameters, the parameters must be manually reset to their default values before the next ordinary shooting, which increases user operations and leads to a poor user experience.
Disclosure of Invention
In view of this, the present application provides a video shooting method, device, storage medium, and program product, so as to solve the problem in the prior art that shooting parameters cannot be automatically restored, which increases user operations and degrades user experience.
In a first aspect, an embodiment of the present application provides a video shooting method applied to an electronic device, where the method includes:
stopping shooting the video image in response to a shooting end operation, and determining an operating state of the camera application;
determining parameter recovery information of the camera application according to the running state of the camera application, where the parameter recovery information is information about the shooting parameters in the camera application that need to be restored to a preset initial value;
and restoring a first target shooting parameter in the camera application to a preset initial value according to the parameter recovery information of the camera application, where the first target shooting parameter is at least one of the shooting parameters indicated in the parameter recovery information.
In this way, the electronic device can automatically restore the first target shooting parameter to its initial value after shooting ends, so no manual restoration by the user is needed, user operations are reduced, the intelligence of the electronic device is improved, and user experience is therefore improved.
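As a purely illustrative sketch of this flow (not the patent's implementation), the following Kotlin fragment shows how the three steps could be chained when the shooting end operation is received; CameraAppState, ParameterRecoveryInfo, and ShootingParameterStore are hypothetical names.

```kotlin
// Hypothetical sketch only; all names are illustrative, not from the patent.
enum class CameraAppState { FOREGROUND, BACKGROUND, CLOSED }

// Related information of the shooting parameters to restore for a given running state.
data class ParameterRecoveryInfo(val parametersToRestore: Set<String>)

class ShootingParameterStore(
    private val defaults: Map<String, Any>,       // preset initial values
    private val current: MutableMap<String, Any>  // values currently in effect
) {
    fun restoreToDefault(name: String) {
        defaults[name]?.let { current[name] = it }
    }
}

fun onShootingEnded(
    state: CameraAppState,                                           // step 1: running state
    recoveryInfoByState: Map<CameraAppState, ParameterRecoveryInfo>,
    store: ShootingParameterStore
) {
    // Step 2: look up the parameter recovery information matching this running state.
    val info = recoveryInfoByState[state] ?: return
    // Step 3: restore each first target shooting parameter to its preset initial value.
    info.parametersToRestore.forEach(store::restoreToDefault)
}
```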
In one possible implementation manner of the first aspect, the running state of the camera application includes: a current running state, a background running state or an end running state.
In another possible implementation manner of the first aspect, when the running state of the camera application is a current running state, the first target shooting parameter includes: at least one of a shooting zoom parameter, a shooting rate parameter, and a shooting frame parameter;
when the running state of the camera application is a background running state or an end-of-running state, the first target shooting parameter includes: at least one of a filter parameter, a shooting zoom parameter, a shooting rate parameter, and a shooting frame parameter.
In this way, different running states of the camera application correspond to different first target shooting parameters that need to be restored to preset initial values, so the parameter values that are restored depend on the running state of the camera application. This better meets users' actual needs, improves the intelligence of the electronic device, and improves the user experience.
In another possible implementation manner of the first aspect, when the running state of the camera application is a current running state, a background running state, or an end-of-running state, the first target shooting parameter includes: at least one of filter parameters, shooting zoom parameters, shooting rate parameters and shooting frame parameters.
In this way, different running states of the camera application correspond to different first target shooting parameters that need to be restored to preset initial values, so the parameter values that are restored depend on the running state of the camera application. This better meets users' actual needs, improves the intelligence of the electronic device, and improves the user experience.
In another possible implementation manner of the first aspect, when the running state of the camera application is a current running state, the first target shooting parameter includes: shooting rate parameters;
when the running state of the camera application is a background running state or an end-of-running state, the first target shooting parameter includes: at least one of a shooting rate parameter, a filter parameter, and a large-screen movie (IMAX) parameter.
In this way, different running states of the camera application correspond to different first target shooting parameters that need to be restored to preset initial values, so the parameter values that are restored depend on the running state of the camera application. This better meets users' actual needs, improves the intelligence of the electronic device, and improves the user experience.
In another possible implementation manner of the first aspect, the restoring, according to the parameter restoration information of the camera application, the first target shooting parameter in the camera application to a preset initial value includes:
and when the parameter recovery information carries the recovery value of the first target shooting parameter, recovering the first target shooting parameter in the camera application to the recovery value of the first target shooting parameter carried in the parameter recovery information according to the parameter recovery information of the camera application.
In this way, the recovery value of the first target shooting parameter is carried directly in the parameter recovery information, which reduces memory accesses and improves the efficiency of the electronic device.
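A minimal sketch of this variant, assuming a hypothetical list-of-entries representation in which each entry carries its own recovery value (not a structure defined by the patent):

```kotlin
// Hypothetical: each entry of the recovery info carries the value to restore to.
data class ParameterRecoveryEntry(val name: String, val recoveryValue: Any)

fun restoreFromCarriedValues(
    entries: List<ParameterRecoveryEntry>,
    current: MutableMap<String, Any>
) {
    // The restore value travels with the recovery info, so no extra lookup is needed.
    entries.forEach { current[it.name] = it.recoveryValue }
}
```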
In another possible implementation manner of the first aspect, the preset initial values of the filter parameters include: the initial LUT is used to tone the corresponding parameter values.
In another possible implementation manner of the first aspect, the preset initial values of the filter parameters include: the parameter value corresponding to the original tone.
In another possible implementation manner of the first aspect, the restoring, according to the parameter restoration information of the camera application, the first target shooting parameter in the camera application to a preset initial value includes:
determining, according to the parameter recovery information of the camera application, all the shooting parameters indicated by the parameter recovery information as first target shooting parameters, and restoring the first target shooting parameters to their preset initial values.
In this way, all shooting parameters recorded in the parameter recovery information are directly determined as first target shooting parameters, which is simple and convenient to implement.
In another possible implementation manner of the first aspect, the restoring, according to the parameter restoration information of the camera application, the first target shooting parameter in the camera application to a preset initial value includes:
determining, according to the parameter recovery information of the camera application, the shooting parameters whose parameter values differ from the preset initial values, among the shooting parameters indicated by the parameter recovery information, as the first target shooting parameters, and restoring the first target shooting parameters to the preset initial values.
In this way, only the shooting parameters whose values have been changed are determined as first target shooting parameters, which reduces processor usage, completes the restoration of parameter values more quickly, and improves the efficiency of the electronic device.
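The two selection strategies can be illustrated with a short sketch (hypothetical names; comparing current values against preset initial values is one assumed way of detecting "changed"):

```kotlin
// Way 1: treat every parameter indicated in the recovery info as a first target parameter.
fun selectAll(indicated: Set<String>): Set<String> = indicated

// Way 2: only parameters whose current value differs from the preset initial value.
fun selectChangedOnly(
    indicated: Set<String>,
    current: Map<String, Any>,
    defaults: Map<String, Any>
): Set<String> = indicated.filterTo(mutableSetOf()) { current[it] != defaults[it] }
```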
In another possible implementation manner of the first aspect, before stopping shooting the video image in response to the shooting end operation, the method further includes:
in response to a shooting parameter setting operation, setting a value of a second target shooting parameter of the camera application according to the shooting parameter setting operation; the shooting parameter setting operation carries a setting value of the second target shooting parameter; the second target shooting parameter is a shooting parameter whose parameter value is set by the user.
In this way, the user can set the shooting parameters of the electronic device according to actual needs, which improves the intelligence of the electronic device and meets the user's shooting requirements.
In another possible implementation manner of the first aspect, the method further includes: determining a third target shooting parameter according to the second target shooting parameter and the first target shooting parameter, where the third target shooting parameter is the shooting parameter, among the second target shooting parameters, that does not need to be restored to the preset initial value.
In another possible implementation manner of the first aspect, after stopping capturing the video image in response to the capturing end operation, the method further includes:
and when the second target shooting parameters comprise third target shooting parameters, keeping the parameter values of the third target shooting parameters unchanged.
Therefore, the parameter value of the first target shooting parameter can be restored to the initial value according to the actual requirement of the user, and the parameter value of the third target shooting parameter is kept unchanged, so that the shooting requirement of the user on the electronic equipment is better met, and the user experience is improved.
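Read as a set relationship, the third target shooting parameters are the user-set (second target) parameters minus those that must be restored (first target); a one-line illustrative sketch:

```kotlin
// Hypothetical: parameters the user set, minus those to be restored, keep their values.
fun thirdTargetParameters(secondTarget: Set<String>, firstTarget: Set<String>): Set<String> =
    secondTarget - firstTarget
```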
In another possible implementation manner of the first aspect, when it is determined that the running state of the camera application is a current running state, the method further includes:
determining whether the running state of the camera application is switched to a background running state or a running ending state;
when the running state of the camera application is switched to a background running state or the running state is finished, determining parameter recovery information of the camera application according to the switched running state of the camera application;
and restoring the first target shooting parameter in the camera application to a preset initial value according to the parameter restoring information of the camera application.
In this way, different running states of the camera application correspond to different first target shooting parameters that need to be restored to preset initial values, so the parameter values that are restored depend on the running state of the camera application. When the running state of the camera application switches to another running state, the shooting parameters corresponding to that state can also be restored to their preset initial values. This better meets users' actual needs, improves the intelligence of the electronic device, and improves the user experience.
In another possible implementation manner of the first aspect, when the running state of the camera application is determined to be a background running state, the method further includes:
determining whether the running state of the camera application is switched to a state of ending running;
when the running state of the camera application is switched to a running finishing state, determining parameter recovery information of the camera application according to the switched running state of the camera application;
and restoring the first target shooting parameter in the camera application to a preset initial value according to the parameter restoring information of the camera application.
In this way, different running states of the camera application correspond to different first target shooting parameters that need to be restored to preset initial values, so the parameter values that are restored depend on the running state of the camera application. When the running state of the camera application switches to another running state, the shooting parameters corresponding to that state can also be restored to their preset initial values. This better meets users' actual needs, improves the intelligence of the electronic device, and improves the user experience.
In another possible implementation manner of the first aspect, before capturing a video image in response to the video capturing operation, the method further includes:
in response to a camera mode selection operation, a target camera mode is determined.
In another possible implementation form of the first aspect, the target camera mode comprises a movie mode.
In another possible implementation manner of the first aspect, before stopping shooting the video image in response to the shooting end operation, the method further includes:
determining a target LUT tone according to the collected video image, and, when the preset initial value of the filter parameter differs from the parameter value corresponding to the target LUT tone, updating the parameter value of the filter parameter to the parameter value corresponding to the target LUT tone; the target LUT tone is the LUT tone recommended based on the captured video image;
performing filter rendering processing on the video image according to the filter parameters;
in response to a video photographing operation, a video image is photographed.
In this way, an appropriate LUT tone can be automatically recommended based on the actually captured image, so that images shot in different scenes have different styles or effects, enriching the shooting styles of the electronic device and making the results more diverse and personalized. In addition, selecting the filter requires no manual operation by the user, which saves the user's time and improves shooting efficiency.
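A minimal sketch of this recommendation step, assuming a hypothetical LutRecommender abstraction; the patent does not specify how the recommendation itself is computed.

```kotlin
// Hypothetical names; only illustrates "update the filter parameter when it differs
// from the recommended LUT tone", not an actual recommendation model.
data class LutTone(val name: String, val parameterValue: Int)

fun interface LutRecommender {
    fun recommend(previewFrame: ByteArray): LutTone
}

fun applyRecommendedLut(
    previewFrame: ByteArray,
    recommender: LutRecommender,
    currentFilterValue: Int,
    setFilterValue: (Int) -> Unit
) {
    val target = recommender.recommend(previewFrame)
    if (currentFilterValue != target.parameterValue) {
        setFilterValue(target.parameterValue) // update filter parameter to the target LUT tone
    }
    // Filter rendering and the response to the video shooting operation would follow here.
}
```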
In a second aspect, embodiments of the present application provide an electronic device, comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method of any one of the first aspect.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium includes a stored program, where when the program runs, the apparatus on which the computer-readable storage medium is located is controlled to execute the method in any one of the above first aspects.
In a fourth aspect, the present application provides a computer program product, which contains executable instructions that, when executed on a computer, cause the computer to perform the method of any one of the above first aspects.
By adopting the technical scheme provided by the embodiment of the application, after the video image shooting is finished, the parameter recovery information can be determined according to the running state of the camera application, and the value of the first target shooting parameter in the camera application is recovered to the preset initial value according to the parameter recovery information. The method and the device can automatically restore the first target shooting parameter to the initial value after shooting is completed, do not need manual restoration of a user, can reduce user operation, improve the intelligence of the electronic equipment, and accordingly improve user experience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the embodiments will be briefly described below, and obviously, the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without inventive labor.
Fig. 1 is a schematic diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic view of a video shooting scene according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of a video shooting method according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of another video shooting method according to an embodiment of the present disclosure;
fig. 5 is a schematic view of another video shooting scene provided in an embodiment of the present application;
fig. 6 is a schematic view of another video shooting scene provided in the embodiment of the present application;
fig. 7 is a schematic view of another video shooting scene provided in the embodiment of the present application;
fig. 8 is a schematic view of another video shooting scene provided in an embodiment of the present application;
fig. 9 is a schematic flowchart of another video shooting method according to an embodiment of the present application;
fig. 10 is a schematic view of a scene where filter parameters are restored to preset initial values according to an embodiment of the present disclosure;
fig. 11 is a block diagram of a software structure of an electronic device according to an embodiment of the present application;
fig. 12 is a block diagram of a software structure of another electronic device according to an embodiment of the present disclosure;
fig. 13 is a schematic flowchart of another video shooting method according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
For better understanding of the technical solutions of the present application, the following detailed descriptions of the embodiments of the present application are provided with reference to the accompanying drawings.
It should be understood that the embodiments described are only a few embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given in the present application without inventive step, shall fall within the scope of protection of the present application.
The terminology used in the embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the examples of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the term "and/or" as used herein merely describes an association relationship between associated objects, meaning that three relationships may exist. For example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone. In addition, the character "/" herein generally indicates that the associated objects before and after it are in an "or" relationship.
Referring to fig. 1, a schematic view of an electronic device provided in an embodiment of the present application is shown. In fig. 1, the electronic device is exemplified by a mobile phone 100, and fig. 1 shows a front view and a rear view of the mobile phone 100: two front cameras 111 and 112 are arranged on the front side of the mobile phone 100, and four rear cameras 121, 122, 123 and 124 are arranged on the rear side. By configuring a plurality of cameras, a plurality of shooting modes can be provided for the user, such as a front-camera shooting mode, a rear-camera shooting mode, a front-and-rear dual shooting mode, and the like. The user can select the corresponding shooting mode according to the shooting scene, so as to improve the user experience.
It is to be understood that the illustration of fig. 1 is merely exemplary and should not be taken as limiting the scope of the present application. For example, the number and positions of cameras may differ for different mobile phones. In addition, the electronic device according to the embodiment of the present application may, in addition to a mobile phone, be a tablet, a Personal Computer (PC), a Personal Digital Assistant (PDA), a smart watch, a netbook, a wearable electronic device, an Augmented Reality (AR) device, a Virtual Reality (VR) device, an in-vehicle device, a smart car, a smart speaker, a robot, smart glasses, a smart television, or the like.
It should be noted that, in some possible implementations, the electronic device may also be referred to as a terminal device, a User Equipment (UE), and the like, which is not limited in this embodiment of the present application.
In an actual application scene, a user performs video shooting through the electronic device, and in the video shooting process, in order to perform personalized shooting, the user may change some shooting parameters, such as a filter used in shooting, a shooting rate in shooting, a shooting frame in shooting, and the like. Illustratively, the user may employ the filter 1 in order to highlight the photographed scene during photographing, or may employ the filter 2 in order to highlight the photographed person. At present, in the shooting process, if shooting parameters are changed, the shooting parameters still keep the current values unchanged after shooting is completed. For example, to perform personalized photography, the filter used by the user is LUT2, and photography is performed using 2-fold zoom, as shown in (1) in fig. 2. After the shooting is completed, the shooting parameters will still keep the current values unchanged, as shown in (2) in fig. 2. In this way, when the next shooting is performed, the user needs to restore the shooting parameters changed in the previous shooting to the initial values before performing normal shooting. When the shooting parameters are changed, the shooting parameters need to be restored to the initial values by the user, so that the user operation is increased, and the user experience is poor.
The embodiment of the application provides a video shooting method, after video image shooting is finished, parameter recovery information can be determined according to the running state of a camera application, and the value of a first target shooting parameter in the camera application is recovered to a preset initial value according to the parameter recovery information. The method and the device can automatically restore the first target shooting parameter to the initial value after shooting is completed, do not need manual restoration of a user, can reduce user operation, improve the intelligence of the electronic equipment, and accordingly improve user experience.
Referring to fig. 3, a schematic flow chart of a video shooting method according to an embodiment of the present application is shown. The method can be applied to the electronic device shown in fig. 1, and mainly includes the following steps, as shown in fig. 3.
Step S301, in response to the shooting end operation, stops shooting the video image, and determines the running state of the camera application.
In the embodiment of the application, when a user uses the electronic device to shoot a video, the electronic device can acquire a video image through a camera therein, and when the user needs to end the video shooting, the user can send a shooting end operation to the electronic device. At this time, the electronic apparatus may receive the shooting end operation, and stop the shooting of the video image according to the shooting end operation.
Because the electronic device restores the values of some shooting parameters that the user changed during shooting to their initial values, it needs to detect the running state of the camera application when shooting of the video image stops. That is, when the electronic device stops shooting the video image, it needs to detect whether the camera application is in the current running state, a background running state, or an end-of-running state.
The running state of the camera application refers to the current working state of the camera application. The running states of the camera application include a current running state, a background running state, and an end-of-running state. When the camera application is in the current running state, it is in use and visible to the user: the display interface of the electronic device is the preview interface of the camera application in the current camera mode and can be operated by the user. The current running state includes staying in the preview interface of the current camera mode as well as switching modes, for example switching from the movie mode to the photo mode, the video mode, or another mode. When the camera application is in the background running state, it is still running and has not ended, but the display interface of the electronic device is not the shooting preview interface of the camera application; the camera application is not being operated by the user and runs in the background where the user cannot see it. When the camera application is in the end-of-running state, the camera application has been closed.
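The application does not prescribe how these states are detected. On an Android-style camera application, one plausible (purely illustrative) mapping is onto the activity lifecycle callbacks, as sketched below; the class and state names are assumptions.

```kotlin
// Hypothetical lifecycle-based tracker; not a mechanism stated in this application.
enum class CameraAppState { FOREGROUND, BACKGROUND, CLOSED }

class CameraStateTracker {
    @Volatile
    var state: CameraAppState = CameraAppState.CLOSED
        private set

    fun onResume()  { state = CameraAppState.FOREGROUND } // preview interface visible and operable
    fun onStop()    { state = CameraAppState.BACKGROUND } // still running, not visible to the user
    fun onDestroy() { state = CameraAppState.CLOSED }     // camera application closed
}
```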
Step S302, determining parameter recovery information of the camera application according to the running state of the camera application.
The parameter recovery information is related to shooting parameters that need to be recovered to a preset initial value in the camera application.
In the embodiment of the present application, the parameter recovery information is preset, and the parameter recovery information corresponding to different running states of the camera application may differ. When the electronic device determines that shooting of the video image has stopped, it determines, according to the current running state of the camera application, the parameter recovery information that matches that running state.
It should be noted that, for different operation states of the camera application, the corresponding shooting parameters that need to be restored to the preset initial values are different, so that different parameter restoration information can be set in advance according to the different operation states of the camera application.
Further, when the running state of the camera application is the current running state, the parameter recovery information includes related information of at least one of the shooting zoom parameter, the shooting rate parameter, and the shooting frame parameter. When the running state of the camera application is the background running state or the end-of-running state, the parameter recovery information includes related information of at least one of the filter parameter, the shooting zoom parameter, the shooting rate parameter, and the shooting frame parameter. That is, the shooting zoom parameter, the shooting rate parameter, and the shooting frame parameter may be restored to preset initial values when video image shooting stops and the camera application is in the current running state, when shooting stops and the camera application switches to background running, or when shooting stops and the camera application is closed. The filter parameter, by contrast, is restored to its preset initial value only when video image shooting stops and the camera application switches to background running or is closed. When shooting stops and the camera application is in the current running state, the electronic device has finished shooting the video image but is still running the camera application, and the user can operate it. In this case the filter parameter is not restored to its preset initial value; its value is kept unchanged until a filter matching the current shooting scene is automatically recommended based on the video image of the current scene, or until the user selects another filter for shooting.
Alternatively, when the running state of the camera application is the current running state, the background running state, or the end-of-running state, the parameter recovery information includes related information of at least one of the filter parameter, the shooting zoom parameter, the shooting rate parameter, and the shooting frame parameter. That is, these four parameters may be restored to preset initial values when video image shooting stops and the camera application is in the current running state, when shooting stops and the camera application switches to background running, or when shooting stops and the camera application is closed. In this scenario, as long as video image shooting is completed, at least one of the filter parameter, the shooting zoom parameter, the shooting rate parameter, and the shooting frame parameter needs to be restored to a preset initial value.
Or when the running state of the camera application is the current running state, the parameter recovery information comprises the relevant information of the shooting rate parameter; when the running state of the camera application is a background running state or a running ending state, the parameter recovery information includes related information of at least one of a shooting rate parameter, a filter parameter and an IMAX (Image Maximum) parameter. That is, when the video image is stopped from being shot and the camera application is in the current running state, only the parameter value of the shooting rate parameter needs to be restored to the preset initial value. At least one of the filter parameter and the IMAX parameter remains unchanged at the current value. And when the video image shooting is stopped and the camera application is switched to the background operation, or when the video image shooting is stopped and the camera application is stopped, at least one of the shooting rate parameter, the filter parameter and the IMAX parameter can be restored to a preset initial value.
It should be noted that, the electronic device is preset with parameter recovery information corresponding to different operating states of the camera application. The shooting parameters included in the parameter restoration information that need to be restored to the preset initial values are preset according to actual needs, and may include other shooting parameters listed above, such as AI intelligent recommendation parameters, and the like, which is not limited in this application. When the AI intelligent recommendation is started, the electronic equipment can intelligently recommend proper filter parameters according to the current shooting scene.
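As a purely illustrative encoding of such preset information, the following sketch follows one of the arrangements described above (zoom, rate, and frame parameters restored in every state; the filter parameter only when the application goes to the background or is closed); the parameter names are assumptions, not terms from this application.

```kotlin
// Hypothetical preset recovery information keyed by running state.
enum class CameraAppState { FOREGROUND, BACKGROUND, CLOSED }

val recoveryInfoByState: Map<CameraAppState, Set<String>> = mapOf(
    CameraAppState.FOREGROUND to setOf("zoom", "rate", "frame"),
    CameraAppState.BACKGROUND to setOf("filter", "zoom", "rate", "frame"),
    CameraAppState.CLOSED     to setOf("filter", "zoom", "rate", "frame")
)
```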
And step S303, restoring the first target shooting parameter in the camera application to a preset initial value according to the parameter restoring information of the camera application.
Wherein the first target shooting parameter is at least one of the shooting parameters indicated in the parameter recovery information.
In this embodiment of the application, after determining the parameter recovery information according to the running state of the camera application, the electronic device may determine the first target shooting parameter in the parameter recovery information, and recover a parameter value of the first target shooting parameter to a preset initial value.
There are two ways to determine the first target shooting parameter based on the parameter recovery information. One way, which is simple to implement, is to determine all the shooting parameters indicated in the parameter recovery information as first target shooting parameters; in this case, all the shooting parameters recorded in the parameter recovery information are restored to their preset initial values. The other way, which reduces resource usage, is to determine as first target shooting parameters only those shooting parameters recorded in the parameter recovery information whose current values are not the preset initial values. In this way, only the shooting parameters whose values have been changed are determined as first target shooting parameters, and the shooting parameters whose values have not been changed do not undergo restoration. The specific implementations are as follows:
further, according to the parameter recovery information of the camera application, recovering the first target shooting parameter in the camera application to the preset initial value includes: and according to the parameter recovery information applied by the camera, determining all the shooting parameters indicated by the parameter recovery information as first target shooting parameters, and recovering the first target shooting parameters to preset initial values.
In this way, the shooting parameters to be restored to the preset initial value recorded in the parameter restoration information can all be determined as the first target shooting parameters, and the parameter values of the first target shooting parameters can be restored to the preset initial value. In this manner, the parameter value of the first target shooting parameter may be restored to the preset initial value regardless of whether the current parameter value of the current first target shooting parameter is the preset initial value.
Alternatively, restoring, according to the parameter recovery information of the camera application, the first target shooting parameter in the camera application to a preset initial value includes: determining, according to the parameter recovery information of the camera application, the shooting parameters whose parameter values differ from the preset initial values, among the shooting parameters indicated by the parameter recovery information, as first target shooting parameters, and restoring the first target shooting parameters to the preset initial values.
All the shooting parameters recorded in the parameter recovery information need to be restored to preset initial values, but during video shooting the user may have changed only some of them, so the values of all the shooting parameters indicated in the parameter recovery information are not necessarily changed. To reduce the processor and memory resources occupied, the shooting parameters whose values have been changed, among all the shooting parameters indicated in the parameter recovery information, may be determined as the first target shooting parameters, and only their parameter values are restored to the preset initial values.
It should be noted that the preset initial value of each shooting parameter is preset, and when the user enters the camera application and does not set the value of the shooting parameter, the parameter value of each shooting parameter is set as the corresponding preset initial value by default. The preset initial values corresponding to different shooting parameters are different, and in the actual application process, each preset initial value can be set according to the actual requirement of a user.
Further, according to the parameter recovery information of the camera application, recovering the first target shooting parameter in the camera application to the preset initial value includes: and when the parameter recovery information carries the recovery value of the first target shooting parameter, recovering the first target shooting parameter in the camera application to the recovery value of the first target shooting parameter carried in the parameter recovery information according to the parameter recovery information of the camera application.
Specifically, the shooting parameters that need to be restored to the preset initial values may be preset in the parameter restoration information, and for convenience of implementation, when the shooting parameters that need to be restored to the preset initial values are set in the parameter restoration information, the restored preset initial values may be set at the same time, that is, the preset initial values corresponding to the shooting parameters that need to be restored to the preset initial values are set. At this time, after the electronic device acquires the parameter recovery information, the electronic device may analyze the parameter recovery information to acquire the shooting parameters and the corresponding preset initial values that need to be recovered to the preset initial values when the camera is applied in different operation states. The electronic equipment can determine the first target shooting parameters according to the parameter recovery information, determine the recovery values of the first target shooting parameters in the parameter recovery information, and update the parameter values of the first target shooting parameters by taking the determined recovery values of the first target shooting parameters as preset initial values.
Further, the preset initial values of the filter parameters include: the initial LUT is used to tone the corresponding parameter values. That is, in the embodiment of the present application, a plurality of LUT tone filters are provided in the electronic device, one of the LUT tone filters is selected as an initial LUT tone, and a preset initial value of a filter parameter is set as a parameter value corresponding to the initial LUT tone. For example, filters for a plurality of LUT tones provided in the electronic device are nostalgic, morning light, cyan, blue tones, and the like. A filter of the nostalgic LUT tone may be used as the initial LUT tone and preset initial values of filter parameters may be set as parameter values corresponding to the nostalgic LUT tone. In this way, the preset initial value of the filter parameter applied by the camera is the parameter value corresponding to the initial LUT tone. Thus, when the electronic device is running a camera application, the captured video image is by default superimposed with the initial LUT tone.
Alternatively, the preset initial value of the filter parameter includes the parameter value corresponding to the original tone. That is, in the embodiment of the present application, the preset initial value of the filter parameter is set to the parameter value corresponding to the original tone. In this case, when the electronic device runs the camera application, the captured video image is by default the original image, that is, a video image with no filter effect superimposed. In other words, the filter of the camera application defaults to the off state, and when the filter parameter is restored to its preset initial value, the filter of the camera application is restored to the off state.
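The two choices for the filter parameter's preset initial value can be modelled as below (an illustrative sketch only; the type names are assumptions):

```kotlin
// Hypothetical representation of the filter parameter's preset initial value.
sealed interface FilterDefault
data class InitialLutTone(val lutName: String) : FilterDefault // e.g. a "nostalgic" LUT tone
object OriginalTone : FilterDefault                            // original tone: filter effectively off
```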
Further, if the running states of the camera application differ, the shooting parameters that need to be restored to preset initial values, as included in the parameter recovery information, also differ. Therefore, the first target shooting parameters differ for different running states of the camera application. Specifically:
when the running state of the camera application is the current running state, the first target shooting parameter includes at least one of a shooting zoom parameter, a shooting rate parameter, and a shooting frame parameter; when the running state of the camera application is the background running state or the end-of-running state, the first target shooting parameter includes at least one of a filter parameter, a shooting zoom parameter, a shooting rate parameter, and a shooting frame parameter. That is, when video image shooting ends and the camera application is in the current running state, at least one of the shooting zoom parameter, the shooting rate parameter, and the shooting frame parameter may be determined as the first target shooting parameter and restored to its preset initial value. When video image shooting ends and the camera application switches to background running, that is, the camera application state is the background running state, or when video image shooting ends and the camera application is closed, that is, the camera application state is the end-of-running state, at least one of the filter parameter, the shooting zoom parameter, the shooting rate parameter, and the shooting frame parameter may be determined as the first target shooting parameter and restored to its preset initial value.
Or when the running state of the camera application is a current running state, a background running state or a running ending state, the first target shooting parameter includes: at least one of a filter parameter, a photographing zoom parameter, a photographing rate parameter, and a photographing frame parameter. That is, when the video image shooting is finished and the camera application state is in the current running state, or when the video image shooting is finished and the camera application is switched to the background running state, that is, when the video image shooting is finished and the camera application state is the background running state, or when the video image shooting is finished and the camera application is closed, that is, when the video image shooting is finished and the camera application state is the end running state, at least one of the filter parameter, the shooting zoom parameter, the shooting rate parameter and the shooting frame parameter can be determined as the first target shooting parameter, so that at least one of the filter parameter, the shooting zoom parameter, the shooting rate parameter and the shooting frame parameter can be restored to the preset initial value.
Or, when the running state of the camera application is the current running state, the first target shooting parameter includes: a shooting rate parameter; when the running state of the camera application is a background running state or a running ending state, the first target shooting parameter includes: at least one of a shooting rate parameter, a filter parameter, and an IMAX parameter. That is, when the video image photographing is ended and the camera application state is in a currently running state, the photographing rate parameter may be determined as the first target photographing parameter, so that the photographing rate parameter may be restored to a preset initial value. When the video image shooting is finished, the camera application is switched to the background operation, that is, when the video image shooting is finished, the camera application state is the background operation state, or when the video image shooting is finished, the camera application is closed, that is, when the video image shooting is finished, the camera application state is the end operation state, at least one of the filter parameter and the IMAX parameter can be determined as the first target shooting parameter, so that at least one of the filter parameter and the IMAX parameter can be recovered to the preset initial value.
In summary, after the video image shooting is finished, the parameter recovery information can be determined according to the running state of the camera application, and the value of the first target shooting parameter in the camera application is recovered to the preset initial value according to the parameter recovery information. The method and the device can automatically restore the first target shooting parameter to the initial value after shooting is completed, do not need manual restoration of a user, can reduce user operation, improve the intelligence of electronic equipment, and accordingly improve user experience.
Referring to fig. 4, a schematic flow chart of another video shooting method provided in the embodiment of the present application is shown. The method is different from the embodiment shown in fig. 3 described above in that a step of setting the shooting parameters by the user is added, and as shown in fig. 4, the method mainly includes the following steps.
Step S401, in response to the shooting parameter setting operation, sets a value of a second target shooting parameter of the camera application according to the shooting parameter setting operation.
The shooting parameter setting operation carries a setting value of a second target shooting parameter; the second target shooting parameter is a shooting parameter whose parameter value is set by the user.
In the embodiment of the application, before video shooting, the user can set shooting parameters by sending a shooting parameter setting operation to the electronic device; this operation carries the values of the shooting parameters set by the user, that is, the setting value of the second target shooting parameter. After receiving the shooting parameter setting operation, the electronic device responds to it by setting the value of the second target shooting parameter in the camera application to the setting value carried in the operation.
Step S402, in response to the shooting end operation, stops shooting the video image, and determines the running state of the camera application.
Specifically, refer to step S301, which is not described herein again.
Step S403, determining parameter recovery information of the camera application according to the running state of the camera application.
The parameter recovery information is related to shooting parameters that need to be recovered to a preset initial value in the camera application.
Specifically, refer to step S302, which is not described herein again.
And S404, restoring the first target shooting parameter in the camera application to a preset initial value according to the parameter restoring information of the camera application.
Wherein the first target shooting parameter is at least one of the shooting parameters indicated in the parameter recovery information.
Specifically, refer to step S303, which is not described herein again.
And S405, determining a third target shooting parameter according to the second target shooting parameter and the first target shooting parameter, and keeping the parameter value of the third target shooting parameter unchanged.
The third target shooting parameter is the shooting parameter, among the second target shooting parameters, that does not need to be restored to the preset initial value.
In the embodiment of the application, when the user sets shooting parameters of the camera application, some shooting parameters, once set, keep their values even after the camera application finishes running, until the user resets them next time. The values of other shooting parameters need to be restored to preset initial values. Based on this, in the present application, the shooting parameters whose setting values are carried in the shooting parameter setting operation and that do not need to be restored to preset initial values after video image shooting ends are called third target shooting parameters. Therefore, the second target shooting parameters may include the third target shooting parameters.
Further, the first target photographing parameter and the third target photographing parameter are different photographing parameters.
The third target photographing parameter includes: at least one of a parameter of the HDR10 switch state, a parameter of the AI intelligence recommendation switch state, and a parameter of the flash switch state.
It should be noted that the third target shooting parameter may also be other parameters, such as a shooting zoom parameter, a shooting frame parameter, and the like, which is not limited in this application.
In this embodiment of the application, after video image shooting ends, the parameter value of the third target shooting parameter does not need to be restored to its initial value; it remains unchanged. Therefore, when the second target shooting parameters include a third target shooting parameter, that is, when the user set the value of the third target shooting parameter before the video image was shot, the electronic device keeps the current value of the third target shooting parameter unchanged after shooting ends. In other words, after video image shooting ends, the electronic device does not restore the parameter value of the third target shooting parameter to the preset initial value.
Illustratively, as shown in (1) in fig. 5, in response to the user operating the icon 501 of the "camera" application in the mobile phone home screen interface, the mobile phone displays the interface 502 shown in (2) in fig. 5. The interface 502 is the photographing preview interface of the mobile phone and also includes a "portrait" mode, a "video" mode, a "movie" mode, and a "professional" mode. In response to the user's operation of selecting the "movie" mode 503, the mobile phone displays the interface 504 shown in (1) in fig. 6. The interface 504 is the preview interface before the mobile phone records. As shown in (1) in fig. 6, the interface 504 includes an HDR10 control 505, a LUT control 506, and a zoom control 507. Turning on the HDR10 control 505 means that the electronic device captures 10-bit video images. The user can select a desired LUT by tapping the LUT control 506. The zoom control 507 is used to change the shooting focal length. For example, as shown in (1) in fig. 6, in response to the user operating the LUT control 506, the mobile phone displays the interface shown in (2) in fig. 6. This interface includes a plurality of LUT filters provided in the electronic device, including LUT1, LUT2, and LUT3. Assuming that, in the movie mode, the user has turned on the HDR10 control, selected LUT2, and shoots with 2x zoom, the mobile phone displays the interface 801 shown in fig. 7. In the interface 801, the HDR10 control is on, and the preview image is displayed with LUT2 and 2x zoom. In response to the user selecting a shooting operation, video shooting is performed.
At this time, the shooting parameters set by the user are to start the 4K HDR control, the filter parameters are LUT2, and the shooting zoom parameters are zoom by 2 times, that is, the second target shooting parameters are the 4K HDR control parameters, the filter parameters, and the shooting zoom parameters. In response to the user selecting a shooting end operation, the electronic device determines a state in which the camera application is currently located. If the camera application of the electronic equipment is in the current running state, the electronic equipment determines that the camera application is the parameter recovery information in the current running state. It is assumed that the parameter recovery information includes the shooting zoom parameter. That is, when the shooting is finished and the camera application is in the current running state, that is, the display interface of the electronic device is still the preview interface of the movie mode, the electronic device may determine the first target shooting parameter as the shooting zoom parameter. And determining the 4K HDR control parameter and the filter parameter as a third target shooting parameter. At this time, when the shooting is finished, the electronic device only restores the parameter value of the shooting zoom parameter to a preset initial value, that is, to 1 time zoom. The handset displays an interface 802 as shown in fig. 8 (1). While the 4K HDR control parameter and the filter parameter are still the values selected by the user and are not changed.
If the camera application of the electronic device is in the running ending state, the electronic device determines the parameter recovery information of the camera application in the running ending state. It is assumed that this parameter recovery information includes the filter parameter and the shooting zoom parameter. That is, when the shooting is finished and the camera application is in the running ending state, that is, the electronic device has closed the camera application, the electronic device may determine the first target shooting parameters as the shooting zoom parameter and the filter parameter, and determine the 4K HDR control parameter as the third target shooting parameter. At this time, when shooting is finished, the electronic device restores the parameter value of the shooting zoom parameter to the preset initial value, that is, to 1x zoom, and restores the filter parameter to the default filter LUT1, while the 4K HDR control parameter remains the value selected by the user and is not changed; refer to the interface 803 shown in fig. 8 (2).
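For illustration only, the state-dependent recovery described above can be sketched in a few lines of Java. The class, state, and parameter names below are hypothetical, and the recovery information mirrors the example above; this is a minimal sketch under those assumptions, not the actual implementation of this application.

```java
import java.util.EnumMap;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

// Minimal sketch of restoring shooting parameters according to the
// running state of the camera application. All names are illustrative.
public class ParameterRecovery {

    enum RunningState { CURRENT, BACKGROUND, ENDED }

    // Parameter recovery information: for each running state, the shooting
    // parameters that must be restored to their preset initial values.
    private static final Map<RunningState, Set<String>> RECOVERY_INFO =
            new EnumMap<>(RunningState.class);
    static {
        RECOVERY_INFO.put(RunningState.CURRENT, Set.of("zoom"));
        RECOVERY_INFO.put(RunningState.BACKGROUND, Set.of("zoom", "filter"));
        RECOVERY_INFO.put(RunningState.ENDED, Set.of("zoom", "filter"));
    }

    // Preset initial values of the shooting parameters.
    private static final Map<String, String> INITIAL_VALUES =
            Map.of("zoom", "1x", "filter", "LUT1", "hdr", "off");

    // Current parameter values set by the user during shooting.
    private final Map<String, String> currentValues = new HashMap<>(
            Map.of("zoom", "2x", "filter", "LUT2", "hdr", "on"));

    // Restore the first target shooting parameters (those listed in the
    // recovery information for the given state); all others keep their values.
    void onShootingFinished(RunningState state) {
        for (String param : RECOVERY_INFO.get(state)) {
            currentValues.put(param, INITIAL_VALUES.get(param));
        }
        System.out.println(state + " -> " + currentValues);
    }

    public static void main(String[] args) {
        // In the current running state only the zoom is reset; filter and hdr keep
        // the user-selected values (printed map order may vary).
        new ParameterRecovery().onShootingFinished(RunningState.CURRENT);
    }
}
```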
Further, in step S302, if it is determined that the camera application is in the current running state when the shooting is finished, it indicates that the display interface of the electronic device is still the shooting preview interface in the camera application. Since the user may subsequently close the camera application or switch it to background running, the running state of the camera application may change. Because the parameter recovery information differs for different running states of the camera application, the shooting parameters that need to be restored to the preset initial values also differ. When the running state of the camera application is switched from the current running state to the background running state, the parameter values of the shooting parameters that need to be restored to the preset initial values corresponding to the background running state also need to be restored. Similarly, when the running state of the camera application is switched from the current running state to the running ending state, the parameter values of the shooting parameters that need to be restored to the preset initial values corresponding to the running ending state also need to be restored.
Similarly, when the running state of the camera application is the background running state, the user may close the camera application, that is, kill the process of the camera application, and the state of the camera application is then switched from the background running state to the running ending state. Because the parameter recovery information differs for different running states of the camera application, when the running state of the camera application is switched from the background running state to the running ending state, the parameter values of the shooting parameters that need to be restored to the preset initial values corresponding to the running ending state also need to be restored.
In this case, if the running state of the camera application is the running ending state when the video shooting is stopped in step S402, the following steps need not be executed. If the running state of the camera application is the current running state when the video shooting is stopped in step S402, the following steps S406 to S408 are performed. If the running state of the camera application is the background running state, the following steps S409 to S411 are executed.
It should be noted that if the parameter recovery information corresponding to the running ending state is the same as the parameter recovery information corresponding to the background running state, only the following steps S406 to S408 need to be executed, and steps S409 to S411 need not be executed.
Step S406, when the running state of the camera application is the current running state, determining whether the running state of the camera application is switched to the background running state or to the running ending state.
Specifically, if it is determined that the running state of the camera application is the current running state when the shooting is stopped in step S402, the electronic device restores the shooting parameters that need to be restored to the preset initial values corresponding to the current running state. The user can close the camera application of the electronic device or switch it to background running at any time, and the running state of the camera application changes accordingly. Because the parameter recovery information differs for different running states of the camera application, when the running state changes, the shooting parameters that need to be restored to the preset initial values corresponding to the new running state also need to be restored. Therefore, after determining that the running state of the camera application is the current running state in step S402, the electronic device needs to detect in real time whether the running state of the camera application changes, that is, whether it is switched to the background running state or to the running ending state.
In some embodiments, the running state of the camera application may be indicated by different values of an operation parameter. For example, a value of 01 represents the current running state, a value of 10 represents the background running state, and a value of 11 represents the running ending state. The electronic device may determine whether the running state of the camera application is switched by detecting the value of the operation parameter.
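As a minimal sketch of this embodiment, the example encoding above (01, 10, 11) can be decoded and monitored as follows. The class and method names are illustrative assumptions; the actual values and detection mechanism are implementation-defined.

```java
// Sketch of decoding the running state of the camera application from the
// value of an operation parameter, following the example encoding above
// (01 = current, 10 = background, 11 = running ended). Illustrative only.
public class RunningStateDetector {

    enum RunningState { CURRENT, BACKGROUND, ENDED }

    static RunningState decode(int operationParameter) {
        switch (operationParameter) {
            case 0b01: return RunningState.CURRENT;
            case 0b10: return RunningState.BACKGROUND;
            case 0b11: return RunningState.ENDED;
            default: throw new IllegalArgumentException("unknown value: " + operationParameter);
        }
    }

    private RunningState lastState = RunningState.CURRENT;

    // Returns true when the running state has switched since the last check,
    // which is when parameter recovery may need to be triggered again.
    boolean stateSwitched(int operationParameter) {
        RunningState state = decode(operationParameter);
        boolean switched = state != lastState;
        lastState = state;
        return switched;
    }

    public static void main(String[] args) {
        RunningStateDetector d = new RunningStateDetector();
        System.out.println(d.stateSwitched(0b01)); // false: still in the current running state
        System.out.println(d.stateSwitched(0b11)); // true: switched to the running ending state
    }
}
```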
Step S407, when the running state of the camera application is switched to the background running state or to the running ending state, determining parameter recovery information of the camera application according to the running state of the camera application.
Specifically, when it is determined that the running state of the camera application is switched from the current running state to the background running state or to the running ending state, the running state has changed, and the electronic device needs to restore the parameter values of the relevant shooting parameters. At this time, the electronic device may determine, according to the running state of the camera application, the parameter recovery information corresponding to that running state.
For example, if the running state of the camera application is switched from the current running state to the background running state, the parameter recovery information of the camera application corresponding to the background running state may be determined according to the background running state.
For details, refer to step S302, which is not described herein again.
And step S408, restoring the first target shooting parameter in the camera application to a preset initial value according to the parameter restoring information of the camera application.
For details, refer to step S303, which is not described herein again.
As described in the above example, referring to the interface shown in fig. 8 (1), when video shooting is finished and the camera application is in the current running state, only the shooting zoom parameter is restored to 1x zoom, while the 4K HDR control parameter and the filter parameter remain the values selected by the user and are not changed. If the user then closes the camera application, the electronic device may detect that the running state of the camera application is switched from the current running state to the running ending state. The electronic device may further determine, in the pre-stored parameter recovery information, the parameter recovery information corresponding to the running ending state of the camera application. The parameter recovery information corresponding to the running ending state includes the filter parameter and the shooting zoom parameter. The electronic device may determine the first target shooting parameters as the shooting zoom parameter and the filter parameter, and determine the 4K HDR control parameter as the third target shooting parameter. At this time, the electronic device restores the parameter value of the shooting zoom parameter to the preset initial value, that is, to 1x zoom, and restores the filter parameter to the default filter LUT1, while the 4K HDR control parameter remains the value selected by the user and is not changed; refer to the interface shown in fig. 8 (2).
Step S409, when the running state of the camera application is the background running state, determining whether the running state of the camera application is switched to the running ending state.
Specifically, when the parameter recovery information corresponding to the background running state is different from the parameter recovery information corresponding to the running ending state, the electronic device may detect whether the running state of the camera application is switched to the running ending state in real time when the running state of the camera application is the background running state.
Step S410, when the running state of the camera application is switched to the running finishing state, determining parameter recovery information of the camera application according to the switched running state of the camera application.
For details, refer to step S407, which is not described herein again.
And S411, restoring the first target shooting parameter in the camera application to a preset initial value according to the parameter restoring information of the camera application.
For details, refer to step S303, which is not described herein again.
Referring to fig. 9, a schematic flow chart of another video shooting method provided in the embodiment of the present application is shown. This method differs from the embodiment shown in fig. 4 in that steps in which the electronic device captures a video image are added. As shown in fig. 9, the method mainly includes the following steps.
Step S901, in response to a camera mode selection operation, determines a target camera mode.
In the embodiment of the present application, the camera mode refers to an operation mode of the camera, and may include a photographing mode, a recording mode, a professional mode, a movie mode, and the like. The user may send a camera mode selection operation to the electronic device after selecting a certain mode for capturing a video image. At this time, the electronic device may determine the target camera mode according to the received camera mode selection operation.
Further, the target camera mode is a movie mode. In the movie mode, the captured video is closer to the movie effect.
And step S902, responding to the shooting parameter setting operation, and setting the value of the second target shooting parameter applied by the camera according to the shooting parameter setting operation.
The shooting parameter setting operation carries a setting value of a second target shooting parameter. The second target shooting parameter is a shooting parameter whose parameter value is set by the user indicated by the shooting parameter setting operation.
Specifically, refer to step S401, which is not described herein again.
It should be noted that if the function of intelligently recommending the LUT filter is turned on in the electronic device, the electronic device needs to recommend the LUT filter according to the current shooting scene, and then the following steps S903 to S904 are executed. If the function of intelligently recommending the LUT filter is not started, the electronic equipment does not recommend the LUT filter according to the current shooting scene, and at this time, the steps S903-S904 are not executed any more, and the following steps S905-S909 are directly executed.
And step S903, determining a target LUT tone according to the acquired video image, and updating the parameter value of the filter parameter to the parameter value corresponding to the target LUT tone when the parameter value of the filter parameter is different from the parameter value corresponding to the target LUT tone.
Wherein the target LUT tone is a LUT tone recommended from the captured video image.
In the embodiment of the application, if the function of intelligently recommending the LUT filter is turned on, the electronic device may collect a video image through the camera, perform scene recognition on the collected video image, determine an LUT tone suitable for the current shooting scene according to the recognized scene, and determine that LUT tone as the target LUT tone. When the parameter value of the current filter parameter is different from the parameter value corresponding to the target LUT tone, the LUT filter currently used by the camera is not the target LUT tone, and the parameter value of the filter parameter is updated to the parameter value corresponding to the target LUT tone.
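The recommendation step can be sketched as a simple mapping from a recognized scene to a target LUT tone, updating the filter parameter only when it differs from the current value. The scene labels, LUT names, and class names below are assumptions made for illustration, not the recognition model used by this application.

```java
import java.util.Map;

// Sketch of the intelligent LUT recommendation step: map a recognized scene
// to a target LUT tone and update the filter parameter only if it differs.
public class LutRecommender {

    // Hypothetical scene-to-LUT mapping.
    private static final Map<String, String> SCENE_TO_LUT = Map.of(
            "portrait", "LUT2",
            "night", "LUT3",
            "landscape", "LUT4");

    private static final String DEFAULT_LUT = "LUT1";

    private String filterParameter = DEFAULT_LUT;

    // Called for each scene recognized from the captured preview image.
    void onSceneRecognized(String scene) {
        String targetLut = SCENE_TO_LUT.getOrDefault(scene, DEFAULT_LUT);
        if (!targetLut.equals(filterParameter)) {
            filterParameter = targetLut;   // update to the target LUT tone
            System.out.println("Filter parameter updated to " + targetLut);
        }
    }

    public static void main(String[] args) {
        LutRecommender r = new LutRecommender();
        r.onSceneRecognized("landscape"); // prints: Filter parameter updated to LUT4
        r.onSceneRecognized("landscape"); // no update, already LUT4
    }
}
```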
Further, when the electronic device has just started the video image shooting function, the parameter value of the filter parameter is the preset initial value. If the electronic device then determines the target LUT tone according to the acquired video image and the preset initial value of the filter parameter is not the parameter value corresponding to the target LUT tone, the filter parameter needs to be updated to the parameter value corresponding to the target LUT tone.
Further, when the camera mode is the movie mode, the preset initial value of the filter parameter is the parameter value corresponding to a preset LUT tone. That is, when movie mode shooting is entered, an LUT filter is superimposed on the shot video image so that it more closely approximates a movie effect.
And step S904, performing filter rendering processing on the video image according to the filter parameters.
In this embodiment of the application, after the filter parameter is set to the parameter value corresponding to the target LUT tone, the captured video image may be rendered according to the target LUT tone, that is, the target LUT tone is superimposed on the captured video image.
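As a rough illustration of what "superimposing an LUT tone" means, the sketch below applies a per-channel lookup table to each pixel of a frame on the CPU. Real LUT filters are typically 3D LUTs applied on the GPU or ISP, so this is only a simplified model, not the rendering pipeline of this application.

```java
// Simplified sketch of filter rendering: apply a per-channel lookup table to
// each pixel of an RGB frame. Illustrative only.
public class LutRenderer {

    // lut[c][v] gives the output value for channel c (0=R, 1=G, 2=B) and input value v.
    static int[] applyLut(int[] rgbPixels, int[][] lut) {
        int[] out = new int[rgbPixels.length];
        for (int i = 0; i < rgbPixels.length; i++) {
            int p = rgbPixels[i];
            int r = lut[0][(p >> 16) & 0xFF];
            int g = lut[1][(p >> 8) & 0xFF];
            int b = lut[2][p & 0xFF];
            out[i] = (r << 16) | (g << 8) | b;
        }
        return out;
    }

    public static void main(String[] args) {
        // Identity LUT: output equals input, so the frame is unchanged.
        int[][] identity = new int[3][256];
        for (int c = 0; c < 3; c++)
            for (int v = 0; v < 256; v++) identity[c][v] = v;

        int[] frame = {0x112233, 0xFFEEDD};
        int[] rendered = applyLut(frame, identity);
        System.out.println(Integer.toHexString(rendered[0])); // 112233
    }
}
```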
Step S905, in response to the video shooting operation, shoots a video image.
In the embodiment of the application, after the user completes the corresponding shooting parameter setting in the target camera mode, the video shooting operation can be sent to the electronic equipment, and after the electronic equipment receives the video shooting operation, the electronic equipment can control devices such as a camera to shoot video images.
Step S906, in response to the shooting end operation, stops shooting the video image, and determines the running state of the camera application.
Specifically, refer to step S201, which is not described herein again.
Step S907 determines parameter recovery information of the camera application according to the running state of the camera application.
The parameter recovery information is related to shooting parameters that need to be recovered to a preset initial value in the camera application.
Specifically, refer to step S202, which is not described herein again.
Step S908 is to restore the first target shooting parameter in the camera application to a preset initial value according to the parameter restoration information of the camera application.
Wherein the first target shooting parameter is at least one of the shooting parameters indicated in the parameter recovery information.
Specifically, refer to step S203, which is not described herein again.
And step S909, determining a third target shooting parameter according to the second target shooting parameter and the first target shooting parameter, and keeping the parameter value of the third target shooting parameter unchanged.
Specifically, refer to step S405, which is not described herein again.
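Step S909 amounts to a set difference: the third target shooting parameters are the user-set (second target) parameters that are not among the restored (first target) parameters. A minimal sketch, with illustrative parameter names:

```java
import java.util.HashSet;
import java.util.Set;

// Sketch of step S909: the third target shooting parameters are the parameters
// set by the user (second target) minus those restored to initial values
// (first target). Parameter names are illustrative.
public class ThirdTargetParameters {

    static Set<String> thirdTarget(Set<String> secondTarget, Set<String> firstTarget) {
        Set<String> result = new HashSet<>(secondTarget);
        result.removeAll(firstTarget);   // keep only parameters that are not restored
        return result;
    }

    public static void main(String[] args) {
        Set<String> secondTarget = Set.of("hdr10", "filter", "zoom"); // set by the user
        Set<String> firstTarget = Set.of("filter", "zoom");           // restored to defaults
        System.out.println(thirdTarget(secondTarget, firstTarget));   // [hdr10] keeps its value
    }
}
```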
Step S910, determining whether the running state of the camera application is switched to the background running state or to the running ending state.
Specifically, refer to step S406, which is not described herein again.
Step S911, when the running state of the camera application is the background running state or the running ending state, determining the parameter recovery information of the camera application according to the running state of the camera application.
Specifically, refer to step S407, which is not described herein again.
And step S912, restoring the first target shooting parameter in the camera application to a preset initial value according to the parameter restoring information of the camera application.
Specifically, refer to step S408, which is not described herein again.
Step S913, determining whether the running state of the camera application is switched to the running ending state.
Specifically, refer to step S409, which is not described herein again.
Step S914, when the running state of the camera application is switched to the running end state, determining parameter recovery information of the camera application according to the switched running state of the camera application.
Specifically, refer to step S410, which is not described herein again.
And S915, restoring the first target shooting parameter in the camera application to a preset initial value according to the parameter restoring information of the camera application.
Specifically, refer to step S411, which is not described herein again.
Illustratively, assume that a user desires the electronic device to take a video shot in movie mode. At this time, the electronic device may receive a camera mode selection operation of a movie mode sent by a user, the electronic device may determine the movie mode as a target camera mode, the electronic device starts the movie mode, after the movie mode is started, a filter parameter value of the electronic device is a parameter value corresponding to the LUT1, that is, in the movie mode, the LUT1 is used as an initial filter, and after the movie mode is started, a captured video image may automatically overlap a filter effect of the LUT 1. The user sets some shooting parameters in the movie mode, and supposing that the user turns on the HDR10 switch and turns on the IMAX switch, and the electronic device automatically turns on the intelligent filter recommendation function in the movie mode, that is, in the movie mode, the electronic device can determine a shooting scene according to a shot video image, and further can recommend an LUT filter suitable for the shooting scene. Based on this, the electronic device may receive a shooting parameter setting operation, where the second target shooting parameters carried in the shooting parameter setting operation are the HDR10 switch parameter and the IMAX switch parameter. And the setting value of the HDR10 switch parameter is a parameter value corresponding to turning on the HDR10 switch, and the setting value of the IMAX switch parameter is a parameter value corresponding to turning on the IMAX switch. The electronic device can set the HDR10 switch parameters and IMAX switch parameters of the camera application according to the shooting parameter setting operation. Moreover, the electronic device may collect a video image of a current shooting scene, and assume that, according to the collected video image, it determines that an LUT filter suitable for the current scene is LUT4, and the electronic device determines LUT4 as a target LUT hue, because a parameter value of a filter parameter is a parameter value corresponding to LUT1 in the movie mode and is different from a parameter value corresponding to the target LUT hue, at this time, the electronic device may update the parameter value of the filter parameter to a parameter value corresponding to LUT4, as shown in fig. 10.
The electronic device can shoot the video image after receiving the video shooting operation. The electronic device may perform LUT4 tone rendering processing on the captured video image according to the filter parameter, that is, superimpose the LUT4 tone filter on the captured video image. If the user needs to stop shooting, a shooting end operation is sent to the electronic device; after receiving the shooting end operation, the electronic device stops shooting the video image and determines the current running state of the camera application. Assuming that the current running state of the camera application is the background running state, the electronic device determines the parameter recovery information of the camera application according to that running state. It is assumed that the parameter recovery information of the camera application includes two parameters, the filter parameter and the IMAX switch parameter, and indicates that the filter parameter is to be restored to the parameter value corresponding to LUT1 and that the IMAX switch is to be turned off. At this time, the electronic device may determine the filter parameter and the IMAX switch parameter as the first target shooting parameters according to the parameter recovery information of the camera application, determine the preset initial values corresponding to the filter parameter and the IMAX switch parameter according to the parameter recovery information, and determine the HDR10 switch parameter as the third target shooting parameter. The electronic device may restore the first target shooting parameters to the preset initial values according to the parameter recovery information, that is, restore the filter parameter to the parameter value corresponding to LUT1, set the IMAX switch parameter to the parameter value corresponding to the IMAX switch being off, and keep the HDR10 switch parameter at the parameter value corresponding to the HDR10 switch being on.
In summary, after the video image shooting is finished, the parameter recovery information can be determined according to the running state of the camera application, and the value of the first target shooting parameter in the camera application is recovered to the preset initial value according to the parameter recovery information. The method and the device can automatically restore the first target shooting parameter to the initial value after shooting is completed, do not need manual restoration of a user, can reduce user operation, improve the intelligence of electronic equipment, and accordingly improve user experience.
Referring to fig. 11 and 12, a block diagram of a software structure of an electronic device according to an embodiment of the present application is provided. The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android (Android) system is divided into four layers, an application layer, a framework layer, a hardware abstraction layer, and a hardware layer from top to bottom.
An Application layer (App) may comprise a series of Application packages. For example, the application package may include a camera application. The application layer may be further divided into an application interface (UI) and application logic.
The application interface of the camera application comprises a camera, a gallery and other applications, wherein the camera comprises a photographing mode, a video recording mode, a movie mode and the like.
The application logic of the camera application comprises a recovery logic module, an encoding module, a third target shooting parameter control logic module, and a first target shooting parameter control logic module. The recovery logic module is used for controlling the parameter value of the first target shooting parameter in the first target shooting parameter control logic module to be restored to the preset initial value, and for controlling the parameter value of the third target shooting parameter in the third target shooting parameter control logic module to remain unchanged. The encoding module is used for encoding the shot video image. The third target shooting parameter control logic module is used for recording the parameter value of the third target shooting parameter and keeping it unchanged when video shooting is finished. The third target shooting parameter control logic module may include an HDR10 control logic module, a flash control logic module, a zoom control logic module, an AI control logic module, a frame control logic module, and the like, as shown in fig. 11. Alternatively, the third target shooting parameter control logic module may include an IMAX logic control module, a flash control logic module, an HDR10 control logic module, an AI control logic module, and the like, as shown in fig. 12. The first target shooting parameter control logic module is used for recording the current parameter value and the preset initial value of the first target shooting parameter and restoring the parameter value of the first target shooting parameter to the preset initial value when video shooting is finished. The first target shooting parameter control logic module may include an LUT control logic module, a rate logic control module, an IMAX logic control module, and the like, as shown in fig. 11. Alternatively, the first target shooting parameter control logic module may include an LUT control logic module, a rate logic control module, a zoom logic control module, a frame logic control module, and the like, as shown in fig. 12.
It should be noted that the control logic module included in the first target shooting parameter control logic module and the control logic module included in the third target shooting parameter control logic module are preset and may be preset according to actual requirements, which is not limited in the present application. Fig. 11 and 12 are merely exemplary and not limiting.
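The division of labor between the recovery logic module and the control logic modules can be sketched as follows: the recovery logic module sends a restore instruction to the control logic modules of the first target parameters and a hold instruction to the others. The interface and module names below are illustrative assumptions, not the modules of figs. 11 and 12 themselves.

```java
import java.util.List;
import java.util.Set;

// Sketch of the application-logic split described above: the recovery logic
// module tells each control logic module either to restore its parameter to
// the preset initial value or to hold the current value.
public class RecoveryLogicModule {

    interface ControlLogicModule {
        String parameterName();
        void restoreToInitialValue();
        void holdCurrentValue();
    }

    static final class SimpleModule implements ControlLogicModule {
        private final String name;
        SimpleModule(String name) { this.name = name; }
        public String parameterName() { return name; }
        public void restoreToInitialValue() { System.out.println(name + ": restored"); }
        public void holdCurrentValue() { System.out.println(name + ": held"); }
    }

    private final List<ControlLogicModule> modules;

    RecoveryLogicModule(List<ControlLogicModule> modules) {
        this.modules = modules;
    }

    // firstTargetParameters: parameters listed in the parameter recovery
    // information for the current running state of the camera application.
    void onShootingFinished(Set<String> firstTargetParameters) {
        for (ControlLogicModule m : modules) {
            if (firstTargetParameters.contains(m.parameterName())) {
                m.restoreToInitialValue();   // e.g. the LUT control logic module
            } else {
                m.holdCurrentValue();        // e.g. the HDR10 control logic module
            }
        }
    }

    public static void main(String[] args) {
        RecoveryLogicModule r = new RecoveryLogicModule(
                List.of(new SimpleModule("lut"), new SimpleModule("hdr10")));
        r.onShootingFinished(Set.of("lut")); // prints: lut: restored, hdr10: held
    }
}
```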
The Framework layer (FWK) provides an Application Programming Interface (API) and a programming framework for applications at the application layer, and includes some predefined functions. In fig. 11 and 12, the framework layer includes a camera access interface (Camera2 API). The Camera2 API is a set of interfaces for accessing camera devices provided by Android, and adopts a pipeline design so that the data stream flows from the camera to the Surface. The Camera2 API includes camera management (CameraManager) and camera device (CameraDevice) classes. CameraManager is the management class for camera devices; the camera device information of the device can be queried through an object of this class to obtain a CameraDevice object. CameraDevice provides a series of fixed parameters related to the camera device, such as basic settings and output formats.
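For context, the following Android snippet shows the typical way an application reaches a camera device through the Camera2 API: query the CameraManager for camera ids and characteristics, then open a CameraDevice. It is a generic usage sketch requiring the CAMERA permission, not code from this application.

```java
import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.os.Handler;

// Generic Camera2 API usage sketch: obtain the CameraManager, query the
// available camera ids and their fixed characteristics, then open a device.
public final class Camera2Sketch {

    public static void openFirstCamera(Context context, Handler handler)
            throws CameraAccessException {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        String cameraId = manager.getCameraIdList()[0];

        // Fixed parameters of the camera device can be queried here.
        CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);

        manager.openCamera(cameraId, new CameraDevice.StateCallback() {
            @Override public void onOpened(CameraDevice device) {
                // Create a capture session and send capture requests here.
            }
            @Override public void onDisconnected(CameraDevice device) { device.close(); }
            @Override public void onError(CameraDevice device, int error) { device.close(); }
        }, handler);
    }
}
```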
The Hardware Abstraction Layer (HAL) is an interface layer between the operating system kernel and the hardware circuitry, which aims to abstract the hardware. The method hides the hardware interface details of a specific platform, provides a virtual hardware platform for an operating system, enables the virtual hardware platform to have hardware independence, and can be transplanted on various platforms. In fig. 11 and 12, the HAL includes a Camera hardware abstraction layer (Camera HAL) including a Device (Device)1, a Device (Device)2, a Device (Device)3, and the like. It is understood that the devices 1, 2, and 3 are abstract devices.
The HardWare layer (HardWare, HW) is the HardWare located at the lowest level of the operating system. In fig. 11 and 12, HW includes a camera device (CameraDevice)1, a camera device (CameraDevice)2, a camera device (CameraDevice)3, and the like. Wherein the CameraDevice1, CameraDevice2, and CameraDevice3 can correspond to a plurality of cameras on the electronic device.
Referring to fig. 13, a schematic flow chart of another video shooting method provided in the embodiment of the present application is shown. In the embodiment of the present application, a target camera mode is taken as a movie mode, and an intelligent recommendation filter function is turned on as an example for explanation. The method can be applied to the software structure shown in fig. 11 and 12, as shown in fig. 13, which mainly includes the following steps.
S1301, the camera application of the electronic device enters a movie mode in response to the camera mode selection operation.
The user selects a camera mode for capturing the video image in the electronic device before capturing the video image, and the camera application determines the camera mode of the video image selected by the user as a target camera mode. In the embodiments of the present application, the target camera mode is taken as an example of the movie mode.
And S1302, responding to the shooting parameter setting operation, and sending the value of the second target shooting parameter to a corresponding module by the camera application of the electronic equipment according to the shooting parameter setting operation.
The shooting parameter setting operation carries a setting value of a second target shooting parameter. The second target shooting parameter is a shooting parameter whose parameter value is set by the user.
When the user needs to set shooting parameters in the movie mode, a shooting parameter setting operation can be sent to the electronic device, where the shooting parameter setting operation carries the setting values of the second target shooting parameters. After receiving the shooting parameter setting operation, the camera application can set the corresponding shooting parameters in the movie mode according to the setting values of the second target shooting parameters. In the embodiment of the present application, an example is described in which the user turns on the HDR10 function, that is, the second target shooting parameter is the parameter for turning on the HDR10 switch. Of course, the user may also set other shooting parameters, which is not limited in this application. After receiving the shooting parameter setting operation and determining that the second target shooting parameter carried in it is the parameter for turning on the HDR10 switch, the camera application may send an instruction for turning on the HDR10 switch to the HDR10 control logic module.
S1303, receiving an HDR10 starting instruction by an HDR10 control logic module of the electronic device, and sending an HDR10 image shooting instruction to a hardware abstraction layer.
And S1304, triggering the hardware layer to shoot a 10bit image by the hardware abstraction layer of the electronic equipment according to the instruction for shooting the HDR10 image, and acquiring the 10bit image returned by the hardware layer.
Specifically, after receiving the instruction for shooting the HDR10 image, the hardware abstraction layer of the electronic device learns that a 10-bit image needs to be shot and sends an instruction for shooting a 10-bit image to the hardware layer. The hardware layer starts the corresponding camera to shoot a 10-bit image and returns the 10-bit image to the hardware abstraction layer.
And S1305, the camera application of the electronic equipment sends an AI model starting instruction to the AI control logic module.
Specifically, the electronic device automatically turns on the filter recommendation function in the movie mode, that is, the AI control logic module is automatically triggered to turn on when entering the movie mode. Therefore, the camera application sends an AI model start instruction to the AI control logic module to trigger the AI control logic module to turn on.
And S1306, an AI control logic module of the electronic equipment determines a target LUT tone according to the collected video image.
Wherein the target LUT tone is a LUT tone recommended from the captured video image.
S1307, the AI control logic module of the electronic device detects whether the current parameter value of the filter parameter is the same as the parameter value corresponding to the target LUT tone, and sends the identification information of the target LUT tone to the LUT control logic module when the current parameter value of the filter parameter is different from the parameter value corresponding to the target LUT tone.
Specifically, the AI control logic module may determine a target LUT tone according to the collected video image, detect whether a current parameter value of the current filter parameter is a parameter value corresponding to the target LUT tone, and send the target LUT tone to the LUT control logic module when the parameter value of the current filter parameter is different from the parameter value corresponding to the target LUT tone.
S1308, the LUT control logic module of the electronic device switches the parameter value of the filter parameter to the parameter value corresponding to the target LUT tone, and sends the parameter information of the target LUT tone to the hardware abstraction layer.
And S1309, rendering the image returned by the hardware layer by the hardware abstraction layer of the electronic device according to the parameter information of the target LUT tone to obtain an image to be displayed.
S1310, displaying the image to be displayed through a display interface by a hardware abstraction layer of the electronic device, and sending the image to be displayed to the coding module.
S1311, the camera application of the electronic device receives a video shooting operation.
S1312, the camera application of the electronic equipment sends the video shooting operation to the coding module.
S1313, when the encoding module of the electronic device receives the video shooting operation, encoding the received image to be displayed.
When a user needs to take a video, the user can send a video taking operation to the electronic device, and after receiving the video taking operation, the encoding module can encode an image to be displayed, which is sent by the hardware abstraction layer.
And S1314, the camera application of the electronic equipment receives the shooting ending operation.
S1315, the camera application of the electronic device sends the shooting end operation to the encoding module, stops shooting the video image, and determines the running state of the camera application.
When the shooting of the video image needs to be ended, the user transmits a shooting ending operation to the electronic device. At this time, the camera application may receive a shooting end operation, may send the shooting end operation to the encoding module, and determine the running state of the current camera application when the shooting end operation is received.
And S1316, the coding module receives shooting ending operation, generates a video file according to the coded video image to be displayed, and stores the video file.
S1317, the camera application of the electronic device sends the running state of the camera application to the recovery logic module.
S1318, the recovery logic module of the electronic device determines parameter recovery information of the camera application according to the running state of the camera application.
The parameter recovery information is related to shooting parameters that need to be recovered to a preset initial value in the camera application.
The recovery logic module determines the parameter recovery information of the camera application according to the running state of the camera application. For example, if the running state of the camera application is the running ending state, the recovery logic module finds the parameter recovery information corresponding to the running ending state in the preset parameter recovery information. It is assumed that the parameters that need to be restored to the preset initial values in the parameter recovery information corresponding to the running ending state are the filter parameter and the IMAX switch parameter.
It should be noted that the parameter recovery information corresponding to the operation ending state may further include other shooting parameters, and this application only exemplifies that the parameter recovery information includes a filter parameter and an IMAX switch parameter.
It should be noted that the running state of the camera application may also be the current running state or the background running state, and the parameter recovery information corresponding to different running states is different. In the embodiment of the present application, the description is given only by taking the case in which the running state of the camera application is the running ending state when video shooting is stopped as an example, which does not limit the present application. The flow of restoring the shooting parameters to the preset initial values in other running states is similar and is not repeated in this application.
S1319, the recovery logic module determines the filter parameter as the first target shooting parameter according to the parameter recovery information applied by the camera, and sends a recovery instruction for instructing the filter parameter to recover to a preset initial value to the LUT control logic module.
After the parameter recovery information of the camera application is determined, the recovery logic module can determine the first target shooting parameter according to the parameter recovery information and trigger the first target shooting parameter control logic module to restore the parameter value of the first target shooting parameter to the preset initial value. For example, the recovery logic module may determine the filter parameter as the first target shooting parameter according to the parameter recovery information, and send a recovery instruction to the LUT control logic module to trigger the LUT control logic module to restore the filter parameter from the parameter value corresponding to the target LUT tone to the preset initial value, that is, to the parameter value corresponding to the preset initial LUT tone.
And S1320, the LUT control logic module restores the filter parameters to preset initial values according to the restoration instruction.
And S1321, determining the HDR10 switch parameter as a third target shooting parameter by the recovery logic module, and sending a holding instruction to the HDR10 control logic module.
Specifically, in step S1302, the second target shooting parameter includes the HDR10 switch parameter, while the parameter recovery information does not include the HDR10 switch parameter, so the HDR10 switch parameter is a third target shooting parameter that does not need to be restored to the preset initial value. In this case, the recovery logic module sends a holding instruction to the HDR10 control logic module so that the HDR10 control logic module keeps the value of the HDR10 switch parameter unchanged, that is, the value of the HDR10 switch parameter is still the parameter value corresponding to the HDR10 switch being on. In other words, among the shooting parameters whose values were changed during shooting, the recovery logic module determines the shooting parameter that does not need to be restored to the preset initial value, namely the HDR10 switch parameter, as the third target shooting parameter, and sends a holding instruction to the HDR10 control logic module. The holding instruction instructs the module to keep the parameter value of the third target shooting parameter unchanged. After receiving the holding instruction, the HDR10 control logic module keeps the current parameter value of the HDR10 switch parameter unchanged.
Corresponding to the above method embodiments, the present application also provides an electronic device, which includes a memory for storing computer program instructions and a processor for executing the program instructions, wherein when the computer program instructions are executed by the processor, the electronic device is triggered to execute some or all of the steps in the above method embodiments.
Fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 14, the electronic device 1400 may include a processor 1410, an external memory interface 1420, an internal memory 1421, a Universal Serial Bus (USB) interface 1430, a charging management module 1440, a power management module 1441, a battery 1442, an antenna 1, an antenna 2, a mobile communication module 1450, a wireless communication module 1460, an audio module 1470, a speaker 1470A, a receiver 1470B, a microphone 1470C, an earphone interface 1470D, a sensor module 1480, buttons 1490, a motor 1491, a pointer 1492, a camera 1493, a display 1494, and a Subscriber Identification Module (SIM) card interface 1495, and the like. Wherein the sensor module 1480 may include a pressure sensor 1480A, a gyroscope sensor 1480B, an air pressure sensor 1480C, a magnetic sensor 1480D, an acceleration sensor 1480E, a distance sensor 1480F, a proximity light sensor 1480G, a fingerprint sensor 1480H, a temperature sensor 1480J, a touch sensor 1480K, an ambient light sensor 1480L, a bone conduction sensor 1480M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not limit the electronic device 1400. In other embodiments of the present application, the electronic device 1400 may include more or fewer components than illustrated, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 1410 for storing instructions and data. In some embodiments, the memory in the processor 1410 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 1410. If the processor 1410 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 1410, thereby increasing the efficiency of the system.
In some embodiments, processor 1410 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 1410 may include multiple sets of I2C buses. The processor 1410 may be coupled to the touch sensor 1480K, charger, flash, camera 1493, etc. through different I2C bus interfaces. For example: the processor 1410 may be coupled to the touch sensor 1480K via an I2C interface, such that the processor 1410 and the touch sensor 1480K communicate via an I2C bus interface to enable touch functionality of the electronic device 1400.
The I2S interface may be used for audio communication. In some embodiments, processor 1410 may include multiple sets of I2S buses. Processor 1410 may be coupled to audio module 1470 via an I2S bus, enabling communication between processor 1410 and audio module 1470. In some embodiments, the audio module 1470 can communicate audio signals to the wireless communication module 1460 via the I2S interface, enabling answering calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, audio module 1470 and wireless communication module 1460 may be coupled by a PCM bus interface. In some embodiments, the audio module 1470 may also transmit audio signals to the wireless communication module 1460 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 1410 with the wireless communication module 1460. For example: the processor 1410 communicates with a bluetooth module in the wireless communication module 1460 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 1470 may transmit an audio signal to the wireless communication module 1460 through a UART interface, so as to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 1410 with peripheral devices such as a display 1494, a camera 1493, etc. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 1410 and camera 1493 communicate over a CSI interface to implement the capture functions of electronic device 1400. The processor 1410 and the display screen 1494 communicate via the DSI interface to implement display functions of the electronic device 1400.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect processor 1410 with camera 1493, display 1494, wireless communication module 1460, audio module 1470, sensor module 1480, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 1430 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 1430 may be used to connect a charger to charge the electronic device 1400, and may also be used to transmit data between the electronic device 1400 and peripheral devices. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It is to be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only an exemplary illustration and does not constitute a structural limitation of the electronic device 1400. In other embodiments of the present application, the electronic device 1400 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 1440 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 1440 may receive charging input from a wired charger via the USB interface 1430. In some wireless charging embodiments, the charging management module 1440 may receive wireless charging input through a wireless charging coil of the electronic device 1400. The charging management module 1440 can charge the battery 1442 and supply power to the electronic device through the power management module 1441.
The power management module 1441 is used to connect the battery 1442, the charging management module 1440 and the processor 1410. The power management module 1441 receives input from the battery 1442 and/or the charge management module 1440, and provides power to the processor 1410, the internal memory 1421, the display 1494, the camera 1493, and the wireless communication module 1460. The power management module 1441 may also be used to monitor parameters such as battery capacity, battery cycle number, battery state of health (leakage, impedance), etc. In other embodiments, a power management module 1441 may also be disposed in the processor 1410. In other embodiments, the power management module 1441 and the charging management module 1440 may be disposed in the same device.
The wireless communication function of the electronic device 1400 may be implemented by the antenna 1, the antenna 2, the mobile communication module 1450, the wireless communication module 1460, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 1400 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 1450 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied to the electronic device 1400. The mobile communication module 1450 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 1450 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 1450 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 1450 may be disposed in the processor 1410. In some embodiments, at least some of the functional blocks of the mobile communication module 1450 may be provided in the same device as at least some of the blocks of the processor 1410.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 1470A, the receiver 1470B, etc.) or displays an image or video through the display 1494. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 1410, and may be located in the same device as the mobile communication module 1450 or other functional modules.
The wireless communication module 1460 may provide solutions for wireless communication applied to the electronic device 1400, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 1460 may be one or more devices integrating at least one communication processing module. The wireless communication module 1460 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on electromagnetic wave signals, and transmits the processed signals to the processor 1410. The wireless communication module 1460 may also receive a signal to be transmitted from the processor 1410, frequency modulate it, amplify it, and convert it into electromagnetic waves via the antenna 2 to radiate it out.
In some embodiments, the antenna 1 and the mobile communication module 1450 of the electronic device 1400 are coupled and the antenna 2 and the wireless communication module 1460 are coupled such that the electronic device 1400 can communicate with networks and other devices via wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 1400 implements display functions via the GPU, the display 1494, and the application processor, among other things. The GPU is a microprocessor for image processing, connected to the display 1494 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 1410 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 1494 is used to display images, video, and the like. The display 1494 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 1400 may include 1 or N display screens 1494, where N is a positive integer greater than 1.
The electronic device 1400 may implement a shooting function through the ISP, the camera 1493, the video codec, the GPU, the display 1494, the application processor, and the like.
The ISP is used to process the data fed back by the camera 1493. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 1493.
The camera 1493 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, electronic device 1400 may include 1 or N cameras 1493, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device 1400 performs frequency point selection, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 1400 may support one or more video codecs. As such, the electronic device 1400 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons of the human brain, it processes input information quickly and can also continuously perform self-learning. Applications such as intelligent recognition of the electronic device 1400 can be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 1420 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 1400. The external memory card communicates with the processor 1410 through an external memory interface 1420 to implement data storage functions. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 1421 may be used to store computer-executable program code, which includes instructions. The internal memory 1421 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (e.g., audio data, a phone book, etc.) created during use of the electronic device 1400, and the like. In addition, the internal memory 1421 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like. The processor 1410 performs various functional applications and data processing of the electronic device 1400 by executing instructions stored in the internal memory 1421 and/or instructions stored in a memory provided in the processor.
The electronic device 1400 may implement audio functions, such as music playing and recording, via the audio module 1470, the speaker 1470A, the receiver 1470B, the microphone 1470C, the headset interface 1470D, the application processor, and the like.
The audio module 1470 is used to convert digital audio information into an analog audio signal output and also used to convert an analog audio input into a digital audio signal. The audio module 1470 may also be used to encode and decode audio signals. In some embodiments, the audio module 1470 may be disposed in the processor 1410, or some functional modules of the audio module 1470 may be disposed in the processor 1410.
The speaker 1470A, also referred to as a "loudspeaker", is used to convert an audio electrical signal into a sound signal. The electronic device 1400 can play music or conduct a hands-free call through the speaker 1470A.
The receiver 1470B, also called an "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device 1400 answers a call or plays a voice message, the voice can be heard by placing the receiver 1470B close to the ear.
The microphone 1470C, also referred to as a "mic", is used to convert a sound signal into an electrical signal. When making a call or sending a voice message, the user can input a sound signal into the microphone 1470C by speaking close to it. The electronic device 1400 may be provided with at least one microphone 1470C. In other embodiments, the electronic device 1400 may be provided with two microphones 1470C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 1400 may further include three, four, or more microphones 1470C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headset interface 1470D is used to connect a wired headset. The headset interface 1470D may be the USB interface 1430, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 1480A is configured to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 1480A may be disposed on the display 1494. There are many types of pressure sensors 1480A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates made of an electrically conductive material. When a force acts on the pressure sensor 1480A, the capacitance between the electrodes changes, and the electronic device 1400 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 1494, the electronic device 1400 detects the intensity of the touch operation according to the pressure sensor 1480A. The electronic device 1400 may also calculate the touch position from the detection signal of the pressure sensor 1480A. In some embodiments, touch operations that are applied to the same touch position but have different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
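A minimal sketch of the threshold dispatch just described might look as follows; the threshold value, the enum, and the function name are assumptions introduced only for illustration.

```kotlin
// Illustrative sketch of the pressure-threshold dispatch described above.
// The threshold value and action names are assumptions, not from the embodiment.
enum class MessageAction { VIEW_MESSAGE, NEW_MESSAGE }

fun dispatchMessageIconTouch(pressure: Float, firstPressureThreshold: Float = 2.0f): MessageAction =
    if (pressure < firstPressureThreshold) MessageAction.VIEW_MESSAGE else MessageAction.NEW_MESSAGE
```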
The gyro sensor 1480B may be used to determine the motion posture of the electronic device 1400. In some embodiments, the angular velocity of the electronic device 1400 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 1480B. The gyro sensor 1480B may be used for image stabilization (anti-shake) during shooting. For example, when the shutter is pressed, the gyro sensor 1480B detects the shaking angle of the electronic device 1400, calculates the distance that the lens module needs to compensate for according to the shaking angle, and allows the lens to counteract the shaking of the electronic device 1400 through a reverse movement, thereby achieving anti-shake. The gyro sensor 1480B may also be used in navigation and somatosensory gaming scenarios.
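The angle-to-displacement compensation mentioned above can be approximated with a simple pinhole-camera model; the model and the names below are assumptions made for illustration only.

```kotlin
import kotlin.math.tan

// Illustrative sketch: estimate the lens displacement needed to cancel a
// detected shake angle, assuming a simple pinhole model where the image
// shift is roughly focalLength * tan(shakeAngle).
fun lensCompensationMm(shakeAngleRad: Double, focalLengthMm: Double): Double =
    focalLengthMm * tan(shakeAngleRad)
```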
The air pressure sensor 1480C is used to measure air pressure. In some embodiments, the electronic device 1400 calculates the altitude from the barometric pressure value measured by the air pressure sensor 1480C, to assist positioning and navigation.
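One common way to turn a barometric reading into an altitude estimate is the international barometric formula; the sketch below assumes a standard sea-level pressure of 1013.25 hPa and is only an illustration, not the method used by the embodiment.

```kotlin
import kotlin.math.pow

// Illustrative sketch: altitude from barometric pressure via the
// international barometric formula. The constants are standard-atmosphere
// assumptions, not values taken from the embodiment.
fun altitudeMeters(pressureHpa: Double, seaLevelHpa: Double = 1013.25): Double =
    44330.0 * (1.0 - (pressureHpa / seaLevelHpa).pow(1.0 / 5.255))
```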
The magnetic sensor 1480D includes a Hall sensor. The electronic device 1400 may detect the opening and closing of a flip holster using the magnetic sensor 1480D. In some embodiments, when the electronic device 1400 is a flip phone, the electronic device 1400 may detect the opening and closing of the flip cover according to the magnetic sensor 1480D, and then set features such as automatic unlocking upon flipping open according to the detected opening and closing state of the holster or of the flip cover.
The acceleration sensor 1480E may detect the magnitude of acceleration of the electronic device 1400 in various directions (typically along three axes), and can detect the magnitude and direction of gravity when the electronic device 1400 is stationary. It can also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
The distance sensor 1480F is used to measure distance. The electronic device 1400 may measure distance by infrared light or laser. In some embodiments, in a shooting scenario, the electronic device 1400 may measure distance using the distance sensor 1480F to achieve fast focusing.
The proximity light sensor 1480G may include, for example, a light emitting diode (LED) and a photodetector such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 1400 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the electronic device 1400 can determine that there is an object nearby; when insufficient reflected light is detected, it can determine that there is no object nearby. The electronic device 1400 can use the proximity light sensor 1480G to detect that the user is holding the electronic device 1400 close to the ear for a call, so as to automatically turn off the screen and save power. The proximity light sensor 1480G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 1480L is used to sense ambient light levels. The electronic device 1400 may adaptively adjust the brightness of the display 1494 based on the perceived ambient light brightness. The ambient light sensor 1480L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 1480L may also cooperate with the proximity light sensor 1480G to detect whether the electronic device 1400 is in a pocket to prevent inadvertent touches.
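As a rough illustration of adaptive brightness, the sketch below maps a sensed illuminance to a display brightness level; the logarithmic curve and its constants are assumptions for the example, not the device's actual policy.

```kotlin
import kotlin.math.ln

// Illustrative sketch: map ambient illuminance (lux) to a display brightness
// level in [0, 255] with a logarithmic curve. The curve and the ~40,000 lux
// "direct sunlight" ceiling are assumptions for this example.
fun brightnessForLux(lux: Double): Int {
    if (lux <= 0.0) return 0
    val level = 255.0 * ln(1.0 + lux) / ln(1.0 + 40_000.0)
    return level.toInt().coerceIn(0, 255)
}
```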
The fingerprint sensor 1480H is used to capture a fingerprint. The electronic device 1400 may utilize the collected fingerprint characteristics to implement fingerprint unlocking, access an application lock, fingerprint photographing, fingerprint answering, and the like.
The temperature sensor 1480J is used to detect temperature. In some embodiments, the electronic device 1400 implements a temperature processing strategy using the temperature detected by the temperature sensor 1480J. For example, when the temperature reported by the temperature sensor 1480J exceeds a threshold, the electronic device 1400 reduces the performance of a processor located near the temperature sensor 1480J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 1400 heats the battery 1442 when the temperature is below another threshold, to avoid an abnormal shutdown caused by low temperature. In other embodiments, the electronic device 1400 boosts the output voltage of the battery 1442 when the temperature is below yet another threshold, to avoid an abnormal shutdown caused by low temperature.
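The temperature strategy above amounts to a small threshold ladder; the sketch below illustrates it with placeholder thresholds and action names, all of which are assumptions.

```kotlin
// Illustrative sketch of the temperature processing strategy described above.
// Threshold values and action names are placeholder assumptions.
enum class ThermalAction { THROTTLE_CPU, HEAT_BATTERY, BOOST_BATTERY_VOLTAGE, NONE }

fun thermalAction(
    tempCelsius: Double,
    highThreshold: Double = 45.0,
    lowThreshold: Double = 0.0,
    veryLowThreshold: Double = -10.0
): ThermalAction = when {
    tempCelsius > highThreshold -> ThermalAction.THROTTLE_CPU
    tempCelsius < veryLowThreshold -> ThermalAction.BOOST_BATTERY_VOLTAGE
    tempCelsius < lowThreshold -> ThermalAction.HEAT_BATTERY
    else -> ThermalAction.NONE
}
```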
The touch sensor 1480K is also referred to as a "touch panel". The touch sensor 1480K may be disposed on the display screen 1494, and the touch sensor 1480K and the display screen 1494 form a touchscreen, also referred to as a "touch screen". The touch sensor 1480K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 1494. In other embodiments, the touch sensor 1480K may be disposed on a surface of the electronic device 1400 different from the display 1494.
The bone conduction sensor 1480M may acquire a vibration signal. In some embodiments, the bone conduction sensor 1480M may acquire a vibration signal of the vibrating bone of the human vocal part. The bone conduction sensor 1480M may also be in contact with the human pulse to receive a blood pressure beating signal. In some embodiments, the bone conduction sensor 1480M may also be provided in a headset to form a bone conduction headset. The audio module 1470 may parse out a voice signal based on the vibration signal of the vibrating bone of the vocal part obtained by the bone conduction sensor 1480M, so as to implement a voice function. The application processor may parse heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 1480M, so as to implement a heart rate detection function.
The keys 1490 include a power key, a volume key, and the like. The keys 1490 may be mechanical keys or touch keys. The electronic device 1400 may receive a key input and generate a key signal input related to user settings and function control of the electronic device 1400.
The motor 1491 may generate a vibration prompt. The motor 1491 can be used for both incoming-call vibration prompts and touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects, and touch operations applied to different areas of the display 1494 may also correspond to different vibration feedback effects of the motor 1491. Different application scenarios (such as time reminders, receiving information, alarm clocks, and games) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 1492 may be an indicator light, and may be used to indicate a charging status, a change in power, or a message, a missed call, a notification, etc.
The SIM card interface 1495 is used to connect a SIM card. A SIM card can be attached to or detached from the electronic device 1400 by being inserted into or pulled out of the SIM card interface 1495. The electronic device 1400 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 1495 can support a Nano SIM card, a Micro SIM card, a standard SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 1495 simultaneously, and the types of the cards may be the same or different. The SIM card interface 1495 is also compatible with different types of SIM cards and with external memory cards. The electronic device 1400 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 1400 employs an eSIM, that is, an embedded SIM card. The eSIM card can be embedded in the electronic device 1400 and cannot be separated from the electronic device 1400.
In a specific implementation, the present application further provides a computer storage medium. The computer storage medium may store a program, and when the program runs, a device in which the computer-readable storage medium is located is controlled to perform some or all of the steps in the foregoing embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), or a random access memory (RAM).
In a specific implementation, the present application further provides a computer program product, where the computer program product includes executable instructions, and when the executable instructions are executed on a computer, the computer is caused to perform some or all of the steps in the foregoing method embodiments.
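For illustration only, the core step of the foregoing method embodiments, restoring different sets of shooting parameters to preset initial values depending on the camera application's running state, could be sketched roughly as follows. All class, parameter, and default names here are assumptions made for the example; this is not the actual implementation of the embodiments or the claims.

```kotlin
// Illustrative sketch (assumed names and defaults): restore shooting
// parameters to preset initial values according to the camera application's
// running state when shooting ends, as the foregoing embodiments describe.
enum class RunningState { FOREGROUND, BACKGROUND, ENDED }

data class ShootingParams(
    var filter: String = "original",   // assumed initial LUT tone
    var zoom: Double = 1.0,
    var frameRate: Int = 30,
    var frame: String = "default"
)

val presetInitial = ShootingParams()

// Which parameters the parameter recovery information indicates for each state
// (an assumption mirroring the foreground vs. background/ended distinction).
fun paramsToRestore(state: RunningState): Set<String> = when (state) {
    RunningState.FOREGROUND -> setOf("zoom", "frameRate", "frame")
    RunningState.BACKGROUND, RunningState.ENDED -> setOf("filter", "zoom", "frameRate", "frame")
}

fun restoreOnShootingEnd(current: ShootingParams, state: RunningState) {
    val targets = paramsToRestore(state)
    if ("filter" in targets) current.filter = presetInitial.filter
    if ("zoom" in targets) current.zoom = presetInitial.zoom
    if ("frameRate" in targets) current.frameRate = presetInitial.frameRate
    if ("frame" in targets) current.frame = presetInitial.frame
}
```

In this sketch, only the parameters named in the recovery information for the current state are reset; a user-set parameter that is not named would remain unchanged, which matches the idea of keeping some user settings while restoring others.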
In the embodiments of the present application, "at least one" means one or more, and "a plurality of" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist. For example, "A and/or B" may mean that only A exists, both A and B exist, or only B exists, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following" and similar expressions refer to any combination of these items, including any combination of a single item or plural items. For example, at least one of a, b, and c may represent: a, b, c, a and b, a and c, b and c, or a, b, and c, where a, b, and c may each be singular or plural.
Those of ordinary skill in the art will appreciate that the units and algorithm steps described in connection with the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends on the particular application and design constraints of the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, any function, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present disclosure, and all the changes or substitutions should be covered by the protection scope of the present application. The protection scope of the present application shall be subject to the protection scope of the claims.
Claims (21)
1. A video shooting method, applied to an electronic device, the method comprising:
stopping shooting the video image in response to a shooting end operation, and determining a running state of the camera application;
determining parameter recovery information of the camera application according to the running state of the camera application, wherein the parameter recovery information is related information of shooting parameters that need to be restored to a preset initial value in the camera application; and
restoring a first target shooting parameter in the camera application to the preset initial value according to the parameter recovery information of the camera application, wherein the first target shooting parameter is at least one of the shooting parameters indicated in the parameter recovery information.
2. The method of claim 1, wherein the running state of the camera application comprises: a current running state, a background running state, or an ended running state.
3. The method of claim 2,
when the running state of the camera application is the current running state, the first target shooting parameter comprises: at least one of a shooting zoom parameter, a shooting rate parameter, and a shooting frame parameter;
when the running state of the camera application is the background running state or the ended running state, the first target shooting parameter comprises: at least one of a filter parameter, a shooting zoom parameter, a shooting rate parameter, and a shooting frame parameter.
4. The method of claim 2,
when the running state of the camera application is the current running state, the background running state, or the ended running state, the first target shooting parameter comprises: at least one of a filter parameter, a shooting zoom parameter, a shooting rate parameter, and a shooting frame parameter.
5. The method according to claim 2, wherein when the running state of the camera application is the current running state, the first target shooting parameter comprises: a shooting rate parameter;
when the running state of the camera application is the background running state or the ended running state, the first target shooting parameter comprises: at least one of a shooting rate parameter, a filter parameter, and a giant-screen movie (IMAX) parameter.
6. The method according to claim 1, wherein the restoring the first target shooting parameter in the camera application to the preset initial value according to the parameter recovery information of the camera application comprises:
when the parameter recovery information carries a recovery value of the first target shooting parameter, restoring the first target shooting parameter in the camera application to the recovery value of the first target shooting parameter carried in the parameter recovery information, according to the parameter recovery information of the camera application.
7. The method according to any one of claims 3 to 5, wherein
the preset initial value of the filter parameter comprises: a parameter value corresponding to an initial LUT tone.
8. The method according to any one of claims 3 to 5, wherein
the preset initial value of the filter parameter comprises: a parameter value corresponding to an original tone.
9. The method according to claim 1, wherein the restoring the first target shooting parameter in the camera application to the preset initial value according to the parameter recovery information of the camera application comprises:
according to the parameter recovery information of the camera application, determining all the shooting parameters indicated by the parameter recovery information as first target shooting parameters, and restoring the first target shooting parameters to the preset initial values.
10. The method according to claim 1, wherein the restoring the first target shooting parameter in the camera application to the preset initial value according to the parameter recovery information of the camera application comprises:
according to the parameter recovery information of the camera application, determining, among the shooting parameters indicated by the parameter recovery information, the shooting parameters whose parameter values differ from the preset initial values as first target shooting parameters, and restoring the first target shooting parameters to the preset initial values.
11. The method of claim 1, wherein before the stopping shooting the video image in response to the shooting end operation, the method further comprises:
in response to a shooting parameter setting operation, setting a value of a second target shooting parameter of the camera application according to the shooting parameter setting operation, wherein the shooting parameter setting operation carries a setting value of the second target shooting parameter, and the second target shooting parameter is a shooting parameter whose parameter value is set by a user.
12. The method of claim 11, further comprising:
determining a third target shooting parameter according to the second target shooting parameter and the first target shooting parameter, wherein the third target shooting parameter is a shooting parameter, among the second target shooting parameters, that does not need to be restored to the preset initial value.
13. The method according to claim 12, further comprising, after the stopping shooting the video image in response to the shooting end operation:
when the second target shooting parameter comprises the third target shooting parameter, keeping the parameter value of the third target shooting parameter unchanged.
14. The method of claim 2, wherein when it is determined that the running state of the camera application is the current running state, the method further comprises:
determining whether the running state of the camera application is switched to the background running state or the ended running state;
when the running state of the camera application is switched to the background running state or the ended running state, determining the parameter recovery information of the camera application according to the switched running state of the camera application; and
restoring the first target shooting parameter in the camera application to the preset initial value according to the parameter recovery information of the camera application.
15. The method according to claim 2 or 14, wherein when it is determined that the running state of the camera application is the background running state, the method further comprises:
determining whether the running state of the camera application is switched to the ended running state;
when the running state of the camera application is switched to the ended running state, determining the parameter recovery information of the camera application according to the switched running state of the camera application; and
restoring the first target shooting parameter in the camera application to the preset initial value according to the parameter recovery information of the camera application.
16. The method of claim 1, wherein before shooting a video image in response to a video shooting operation, the method further comprises:
determining a target camera mode in response to a camera mode selection operation.
17. The method of claim 16, wherein the target camera mode comprises a movie mode.
18. The method of claim 17, wherein before the stopping shooting the video image in response to the shooting end operation, the method further comprises:
determining a target LUT tone according to the captured video image, and updating the parameter value of the filter parameter to the parameter value corresponding to the target LUT tone when the current parameter value of the filter parameter is different from the parameter value corresponding to the target LUT tone, wherein the target LUT tone is an LUT tone recommended according to the captured video image;
performing filter rendering processing on the video image according to the filter parameter; and
shooting a video image in response to a video shooting operation.
19. An electronic device comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method of any of claims 1-18.
20. A computer-readable storage medium, comprising a stored program, wherein the program, when executed, controls an apparatus in which the computer-readable storage medium resides to perform the method of any one of claims 1-18.
21. A computer program product containing executable instructions which, when executed on a computer, cause the computer to perform the method of any one of claims 1 to 18.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110926459.4A CN113965693B (en) | 2021-08-12 | 2021-08-12 | Video shooting method, device and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110926459.4A CN113965693B (en) | 2021-08-12 | 2021-08-12 | Video shooting method, device and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113965693A true CN113965693A (en) | 2022-01-21 |
CN113965693B CN113965693B (en) | 2022-12-13 |
Family
ID=79460515
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110926459.4A Active CN113965693B (en) | 2021-08-12 | 2021-08-12 | Video shooting method, device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113965693B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023160224A1 (en) * | 2022-02-28 | 2023-08-31 | 荣耀终端有限公司 | Photographing method and related device |
CN117149294A (en) * | 2023-02-27 | 2023-12-01 | 荣耀终端有限公司 | Camera application configuration method, equipment and storage media |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104469159A (en) * | 2014-12-15 | 2015-03-25 | 乐视致新电子科技(天津)有限公司 | Shooting parameter adjusting method and device |
CN105681675A (en) * | 2016-03-22 | 2016-06-15 | 珠海格力电器股份有限公司 | Mobile terminal and photographing mode setting method and device thereof |
CN105872350A (en) * | 2015-12-08 | 2016-08-17 | 乐视移动智能信息技术(北京)有限公司 | Adjusting method and device for photographing parameter of camera |
CN106454107A (en) * | 2016-10-28 | 2017-02-22 | 努比亚技术有限公司 | Photographing terminal and photographing parameter setting method |
CN107391106A (en) * | 2017-06-09 | 2017-11-24 | 深圳市金立通信设备有限公司 | The initial method and terminal of camera parameter |
US10571925B1 (en) * | 2016-08-29 | 2020-02-25 | Trifo, Inc. | Autonomous platform guidance systems with auxiliary sensors and task planning |
CN111654617A (en) * | 2020-04-30 | 2020-09-11 | 浙江大华技术股份有限公司 | Method and device for controlling running state of movement lens and computer device |
Also Published As
Publication number | Publication date |
---|---|
CN113965693B (en) | 2022-12-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230396886A1 (en) | Multi-channel video recording method and device | |
CN112532857B (en) | Shooting method and equipment for delayed photography | |
CN110035141B (en) | Shooting method and equipment | |
CN113727016A (en) | Shooting method and electronic equipment | |
CN110012154A (en) | A kind of control method and electronic equipment of the electronic equipment with Folding screen | |
CN113810601B (en) | Terminal image processing method, device and terminal equipment | |
CN113473005A (en) | Shooting transition live effect insertion method, device, storage medium and program product | |
CN113727025B (en) | A shooting method, device and storage medium | |
CN113810600A (en) | Image processing method, device and terminal device for terminal | |
CN113596319A (en) | Picture-in-picture based image processing method, apparatus, storage medium, and program product | |
CN110602403A (en) | Method for taking pictures under dark light and electronic equipment | |
CN113467735A (en) | Image adjusting method, electronic device and storage medium | |
CN114489533A (en) | Screen projection method and device, electronic equipment and computer readable storage medium | |
CN113747060A (en) | Image processing method, device, storage medium and computer program product | |
CN112241194B (en) | Folding screen lighting method and device | |
CN111770282A (en) | Image processing method and device, computer readable medium and terminal equipment | |
CN113596321A (en) | Transition dynamic effect generation method, apparatus, storage medium, and program product | |
CN110248037B (en) | Identity document scanning method and device | |
CN113965693B (en) | Video shooting method, device and storage medium | |
CN113572957A (en) | A shooting focusing method and related equipment | |
CN114339429A (en) | Audio and video playing control method, electronic equipment and storage medium | |
CN114257737B (en) | Shooting mode switching method and related equipment | |
CN112272191B (en) | Data transfer method and related device | |
CN114500901A (en) | Double-scene video recording method and device and electronic equipment | |
CN113852755A (en) | Photographing method, photographing apparatus, computer-readable storage medium, and program product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CP03 | Change of name, title or address |
Address after: Unit 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong 518040
Patentee after: Honor Terminal Co., Ltd.
Country or region after: China
Address before: 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong
Patentee before: Honor Device Co., Ltd.
Country or region before: China