
CN117714850B - Time-lapse photography method and related equipment - Google Patents

Time-lapse photography method and related equipment

Info

Publication number: CN117714850B
Application number: CN202311138044.6A
Authority: CN (China)
Prior art keywords: shooting, preview, stream, control, interface
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN117714850A (en)
Inventor: 王相钦
Current Assignee: Honor Device Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd
Events: priority to CN202311138044.6A; publication of CN117714850A; application granted; publication of CN117714850B
Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633 Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract


The present application relates to the field of terminals and provides a time-lapse photography method and related equipment. The method comprises: displaying a preview interface in a time-lapse photography mode; detecting a first operation on a shooting parameter setting control; in response to the first operation, triggering a shooting parameter setting sub-process used to set shooting parameters, the shooting parameters including an in-recording snapshot rate; detecting a second operation on a shooting control; and, in response to the second operation, shooting based on the shooting parameters and generating and storing a target video stream. The present application can obtain a corresponding time-lapse video based on the shooting parameters set by a user, flexibly meeting user needs.

Description

Time-lapse photography method and related equipment
Technical Field
The present application relates to the field of terminals, and in particular to a time-lapse photography method and related equipment.
Background
With the development of shooting functions in electronic devices, the camera functions available on electronic devices keep increasing. For example, an electronic device may offer various photographing functions such as night-scene shooting, portrait shooting, and time-lapse photography. In time-lapse photography, images recorded over minutes, hours, or even days are synthesized into a video, so that the slow passage of time and gradual changes of objects can be reproduced in a short time.
In the prior art, a terminal device generally samples at a fixed frame rate during time-lapse photography and synthesizes the sampled frames into a time-lapse video. However, the shooting effect obtained in this way is fixed and cannot meet the requirements of different scenes and users.
Disclosure of Invention
The application provides a time-lapse photographing method and related equipment, which can flexibly meet different requirements of users, improve the image quality of time-lapse photographing and reduce the power consumption.
In a first aspect, there is provided a time-lapse photography method comprising:
displaying a preview interface in a time-lapse photography mode, wherein the preview interface comprises a shooting parameter setting control, a shooting control, and a preview window, and the preview window displays a preview stream; detecting a first operation on the shooting parameter setting control; in response to the first operation, triggering a shooting parameter setting sub-process, where the sub-process is used to set shooting parameters and the shooting parameters include an in-recording snapshot rate; detecting a second operation on the shooting control; and, in response to the second operation, shooting based on the shooting parameters, and generating and storing a target video stream.
The time-lapse photography method provided by the application can be used for shooting various scenes such as building construction, cityscapes, natural landscapes, astronomical phenomena, urban life, or biological evolution.
In the embodiments of the application, after entering the time-lapse photography mode, a user can set shooting parameters by operating a shooting parameter setting control on the display interface. Before shooting, the user can thus manually adjust, in advance and based on his or her own needs, the specific shooting parameters related to time-lapse photography; when shooting starts, the electronic device shoots based on the adjusted parameters. In this way, time-lapse videos with different effects can be obtained adaptively, meeting various user needs and improving user experience.
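As an illustration only, the claimed interaction flow could be organized along the following lines. All names in this Kotlin sketch (ShootingParams, TimelapseController, and the handler functions) are hypothetical; they come neither from the patent nor from any real camera API.

```kotlin
// Hypothetical sketch of the first-aspect flow: a parameter-setting
// sub-process configures the capture, and the shutter starts it.
data class ShootingParams(
    var snapshotRatePerSecond: Double = 1.0, // in-recording snapshot rate
    var recordingDurationSec: Int = 600,     // recording duration
    var zoomMagnification: Float = 1.0f      // zoom magnification
)

class TimelapseController {
    private val params = ShootingParams()

    // First operation: the user taps the shooting parameter setting control,
    // triggering the shooting parameter setting sub-process.
    fun onParameterControlTapped() = showParameterSettingInterface(params)

    // Second operation: the user taps the shooting control; capture then
    // runs with the parameters configured in the sub-process.
    fun onShutterTapped() = startCapture(params)

    private fun showParameterSettingInterface(p: ShootingParams) { /* UI omitted */ }
    private fun startCapture(p: ShootingParams) { /* capture pipeline omitted */ }
}
```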
With reference to the first aspect, in some implementations of the first aspect, the preview interface further includes a filter control. Before detecting the second operation on the shooting control, the method further includes: detecting a third operation on the filter control, and, in response to the third operation, triggering a filter setting sub-process used to set a target filter effect.
In this implementation, after entering the time-lapse photography mode, a user can set a filter by operating the filter control on the display interface, so that before shooting, the user can manually select in advance a specific filter for the time-lapse shot based on his or her own needs. When shooting starts, the electronic device shoots based on the selected target filter effect, so that time-lapse videos with different effects can be obtained adaptively, meeting various user needs and improving user experience.
With reference to the first aspect, in some implementations of the first aspect, shooting based on the shooting parameters in response to the second operation to generate and store a target video stream includes: receiving a recording request and an in-recording snapshot request, the in-recording snapshot request carrying the in-recording snapshot rate; acquiring an initial video stream based on the recording request and discarding it; acquiring an initial snapshot stream based on the in-recording snapshot request; and processing and encoding the initial snapshot stream to generate and store the target video stream.
In this implementation, the in-recording snapshot rate can be set in advance, so that after shooting starts, the electronic device acquires the corresponding time-lapse video based on the rate set by the user. In this process, only the images captured at that rate need to be processed. Compared with the prior-art approach of first acquiring a large number of images and then generating the time-lapse video by frame extraction after image processing, this approach saves a large amount of power. In addition, since the target video stream is obtained by snapshot-based image processing, the image quality is higher than in the prior art.
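A minimal sketch of the described split between the discarded recording stream and the encoded snapshot stream is shown below. Frame, VideoEncoder, and processFrame are illustrative stubs, not real camera-HAL types; the sketch only shows that per-frame work happens exclusively on the snapshot stream.

```kotlin
// Illustrative stubs (not real types): stand-ins for a camera frame,
// a video encoder, and the image-processing step.
class Frame
class VideoEncoder {
    fun encode(frame: Frame) { /* encoding omitted */ }
    fun finish() { /* store the target video stream */ }
}
fun processFrame(frame: Frame): Frame = frame // image-processing stub

fun buildTargetVideo(
    videoFrames: Sequence<Frame>,    // initial video stream (recording request)
    snapshotFrames: Sequence<Frame>, // initial snapshot stream, already at the set rate
    encoder: VideoEncoder
) {
    // The initial video stream is acquired but discarded: no per-frame processing.
    videoFrames.forEach { /* dropped */ }

    // Only frames captured at the user-set in-recording snapshot rate are
    // processed and encoded into the target video stream.
    snapshotFrames.forEach { frame -> encoder.encode(processFrame(frame)) }
    encoder.finish()
}
```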
With reference to the first aspect, in certain implementations of the first aspect, when the method includes triggering the filter setting sub-process in response to the third operation, processing and encoding the initial snapshot stream to generate and store a target video stream includes processing and encoding the initial snapshot stream in combination with the target filter effect to generate and store the target video stream.
In this implementation, the target filter effect can be set in advance, and after shooting starts, the electronic device acquires the corresponding time-lapse video based on the target filter effect set by the user. In this process, filter processing is applied only to the images captured under the set shooting parameters, so the number of processed images is small, the power consumption is low, and different shooting-effect requirements can be met.
With reference to the first aspect, in some implementations of the first aspect, the initial snapshot stream includes initial snapshot images and anti-shake parameters, and processing and encoding the initial snapshot stream to generate and store the target video stream includes: processing the initial snapshot images in combination with the anti-shake parameters, and format-converting and encoding the processed initial snapshot stream to generate and store the target video stream.
Illustratively, the anti-shake parameters may include gyro sensor (gyro) data, image feature points, and the like.
For example, format conversion may refer to converting an image in the RAW domain into an image in the YUV domain.
In this implementation, combining anti-shake processing eliminates the influence of shake on the images captured by the camera module and improves image sharpness. After format conversion, the amount of image data in the YUV domain is smaller than in the RAW domain, which speeds up subsequent transmission.
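The per-snapshot processing described here could be sketched as follows. RawImage, YuvImage, GyroSample, and the helper functions are illustrative stubs under the assumption that anti-shake correction happens before RAW-to-YUV conversion, as the text above suggests; the patent does not prescribe an implementation.

```kotlin
// Hedged sketch: anti-shake correction using the gyro data carried in the
// snapshot stream, then RAW-to-YUV conversion before encoding.
data class GyroSample(val wx: Float, val wy: Float, val wz: Float)
class RawImage
class YuvImage

fun processSnapshot(raw: RawImage, gyro: List<GyroSample>): YuvImage {
    // Warp the image to cancel the device shake estimated from the gyro data.
    val stabilized = compensateShake(raw, gyro)
    // YUV data is smaller than RAW, which speeds up later transmission.
    return rawToYuv(stabilized)
}

fun compensateShake(raw: RawImage, gyro: List<GyroSample>): RawImage = raw // stub
fun rawToYuv(raw: RawImage): YuvImage = YuvImage()                          // stub
```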
With reference to the first aspect, in certain implementations of the first aspect, before format converting and encoding the processed initial snapshot stream to generate and store the target video stream, the method further includes processing the initial snapshot image in conjunction with the target filter effect.
In this implementation, when the user adjusts the filter effect, the initial snap-shot image may be processed in combination with the set target filter effect, so as to obtain a target video stream carrying the corresponding filter effect.
With reference to the first aspect, in some implementations of the first aspect, triggering the shooting parameter setting sub-process in response to the first operation includes: displaying a first shooting parameter setting interface in response to the first operation, the first shooting parameter setting interface including sub-shooting-parameter controls and corresponding auxiliary controls; displaying, in response to a fourth operation on an auxiliary control, a second shooting parameter setting interface corresponding to the set shooting parameter; and switching back to the preview interface.
In this implementation, the first shooting parameter setting interface can be displayed by operating the shooting parameter setting control; specific parameters are then set, and once the shooting parameters are set, the display returns to the preview interface.
With reference to the first aspect, in some implementations of the first aspect, triggering the filter setting sub-process in response to the third operation includes: displaying, in response to the third operation, a first filter setting interface including a plurality of filter effect options; determining a target filter effect in response to a fifth operation on the filter effect options; displaying a second filter setting interface in combination with the target filter effect; and switching back to the preview interface.
In this implementation, the first filter setting interface can be displayed by operating the filter control; a specific filter is then set, and once the filter is set, the display returns to the preview interface.
With reference to the first aspect, in certain implementation manners of the first aspect, the method further includes receiving a preview request, acquiring an initial preview stream based on the preview request, processing and encoding the initial preview stream, and generating a preview stream.
In this implementation, in the time-lapse photography mode, a preview request may be issued and a preview stream acquired and generated both before shooting starts and after it starts.
With reference to the first aspect, in certain implementation manners of the first aspect, when the method includes triggering the filter setting sub-flow in response to the third operation, processing and encoding the initial preview stream to generate a preview stream includes processing and encoding the initial preview stream in combination with the target filter effect to generate the preview stream.
In this implementation, in the time-lapse photography mode, after a filter is set but before shooting starts, the initial preview images may be processed in combination with the target filter effect, so that the displayed preview stream reflects the set target filter effect.
With reference to the first aspect, in some implementations of the first aspect, the shooting parameters further include a recording duration, a zoom magnification, and a film-forming duration (that is, the duration of the generated time-lapse video).
In a second aspect, an electronic device is provided that includes a processor and a memory, the memory storing a computer program executable on the processor, the processor configured to perform:
displaying a preview interface in a time-lapse photography mode; detecting a first operation on a shooting parameter setting control; in response to the first operation, triggering a shooting parameter setting sub-process used to set shooting parameters, the shooting parameters including an in-recording snapshot rate; detecting a second operation on the shooting control; and, in response to the second operation, shooting based on the shooting parameters, and generating and storing a target video stream.
It should be appreciated that the extensions, definitions, explanations and illustrations of the relevant content in the first aspect described above also apply to the same content in the second aspect.
In a third aspect, there is provided a chip for application to an electronic device, the chip comprising one or more processors for invoking computer instructions to cause the electronic device to perform any of the time-lapse photography methods of the first aspect.
In a fourth aspect, there is provided a computer readable storage medium storing a computer program comprising program instructions which, when executed by a processor, cause an electronic device to perform any one of the time-lapse photography methods of the first aspect.
In a fifth aspect, there is provided a computer program product comprising computer program code which, when run by an electronic device, causes the electronic device to perform any of the time-lapse photography methods of the first aspect.
In the embodiments of the application, after entering the time-lapse photography mode, a user can set shooting parameters and/or a filter by operating the related controls on the display interface. Before shooting, the user can thus manually adjust in advance, based on his or her own needs, the specific shooting parameters and filters related to time-lapse photography; when shooting starts, the electronic device shoots based on the adjusted shooting parameters and/or target filter effect. In this way, time-lapse videos with different effects can be shot adaptively according to different requirements, meeting various user needs and improving user experience.
Drawings
FIG. 1 is a schematic diagram of a hardware system suitable for use in an electronic device of the present application;
FIG. 2 is a schematic diagram of a software system of an electronic device provided by the related art;
FIG. 3 is a schematic diagram of a graphical user interface suitable for use with embodiments of the present application;
FIG. 4 is a schematic diagram of another graphical user interface suitable for use with embodiments of the present application;
FIG. 5 is a schematic diagram of yet another graphical user interface suitable for use with embodiments of the present application;
FIG. 6 is a schematic flow chart of a time-lapse photography method provided by an embodiment of the present application;
FIG. 7 is a schematic flow chart of another time-lapse photography method provided by an embodiment of the present application;
FIG. 8 is a schematic flowchart of a photographing parameter setting sub-process provided by an embodiment of the present application;
FIG. 9 is an interface diagram of the shooting parameter setting interface of FIG. 8;
FIG. 10 is another interface diagram of the shooting parameter setting interface of FIG. 8;
FIG. 11 is a further interface schematic of the shooting parameter setting interface of FIG. 8;
FIG. 12 is a further interface schematic of the shooting parameter setting interface of FIG. 8;
FIG. 13 is a schematic flow chart of a filter setting sub-process provided by an embodiment of the present application;
FIG. 14 is a schematic view of the filter setting interface of FIG. 13;
FIG. 15 is a schematic diagram of a software architecture according to an embodiment of the present application;
FIG. 16 is a schematic diagram illustrating a structure of each module in a hardware abstraction layer according to an embodiment of the present application;
FIG. 17 is an example of an initial preview stream, an initial video stream, and an initial snap shot stream provided by an embodiment of the present application;
FIG. 18 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 19 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Detailed Description
In embodiments of the present application, the following terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present embodiment, unless otherwise specified, the meaning of "plurality" is two or more.
In order to facilitate understanding of the embodiments of the present application, related concepts related to the embodiments of the present application will be briefly described.
The photographing parameters may include a shutter, an exposure time, an aperture value (AV), an exposure value (EV), and sensitivity (ISO). These are described separately below.
The shutter is a device for controlling the time of light entering the camera to determine the exposure time of an image. The longer the shutter remains in the open state, the more light that enters the camera, and the longer the exposure time corresponding to the image. Conversely, the shorter the shutter remains in the open state, the less light enters the camera and the shorter the corresponding exposure time of the image.
The exposure time refers to the time the shutter must stay open to project light onto the photosensitive surface of the camera's photosensitive material. It is determined by the sensitivity of the photosensitive material and the illuminance on the photosensitive surface. The longer the exposure time, the more light enters the camera; the shorter the exposure time, the less light enters. Thus, a long exposure time is needed in dark scenes and a short exposure time in backlit scenes.
The aperture value is the ratio of the focal length of the camera lens to the light-passing diameter of the lens. The smaller the aperture value, the more light enters the camera; the larger the aperture value, the less light enters.
The exposure value is a single value that indicates the light-transmission capability of the camera lens by combining the exposure time and the aperture value. It may be defined as:

EV = \log_2\!\left(\frac{N^2}{t}\right)

where N is the aperture value and t is the exposure time in seconds.
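As a worked check of this definition (the numbers are chosen for illustration and do not appear in the patent):

```latex
\mathrm{EV} = \log_2\frac{N^2}{t},
\qquad N = 8,\; t = \tfrac{1}{125}\,\mathrm{s}:
\quad \mathrm{EV} = \log_2\!\left(64 \times 125\right) = \log_2 8000 \approx 12.97
```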
ISO measures the sensitivity of the film (or image sensor) to light, i.e. the photosensitivity or gain. A film with low sensitivity requires a longer exposure time to reach the same image brightness as a film with high sensitivity; a film with high sensitivity requires a shorter exposure time to reach the same brightness.
For the photographing parameters above (shutter, exposure time, aperture value, exposure value, and ISO), the electronic device may algorithmically implement at least one of auto focus (AF), auto exposure (AE), and auto white balance (AWB) to adjust the photographing parameters automatically.
The automatic focusing means that the electronic device obtains the highest image frequency component by adjusting the position of the focusing lens so as to obtain higher image contrast. The focusing is a continuously accumulated process, and the electronic equipment compares the contrast of the images shot by the lens at different positions, so that the position of the lens when the contrast of the images is maximum is obtained, and the focal length of the focusing is further determined.
Automatic exposure refers to the electronic device automatically setting an exposure value according to the available light source conditions. The electronic device can automatically set the shutter speed and the aperture value according to the exposure value of the current acquired image so as to realize the automatic setting of the exposure value.
The perceived color of an object changes with the color of the light projected onto it, so images collected by the electronic device can have different color temperatures under different light colors. White balance is closely related to ambient light: regardless of the ambient light, the camera of the electronic device should recognize white and restore the other colors using white as the reference. Automatic white balance lets the electronic device adjust the color fidelity of the image according to the light source. Auto focus, auto exposure, and auto white balance are collectively referred to as 3A.
Illustratively, the exposure value may be any value in a range such as -24 to 24, for example -4, -3, -2, -1, 0, 1, 2, 3, or 4.
The exposure image corresponding to EV0 indicates the exposure image captured at the exposure value 0 determined when the electronic device implements exposure through an algorithm. Likewise, the exposure image corresponding to EV-2 indicates the image captured at the determined exposure value -2, the exposure image corresponding to EV1 the image captured at the determined exposure value 1, and so on.
Every increase of 1 in the exposure value changes the exposure by one step, i.e. doubles the exposure (the integral over time t of the illuminance received by a surface element of the object), for example by doubling the exposure time or the aperture area. An increase in exposure value therefore corresponds to a slower shutter speed and a smaller f-number. It follows that EV0 increases the exposure value by 2 relative to EV-2, changing the exposure by two steps, and EV1 increases the exposure value by 1 relative to EV0, changing it by one step.
When the exposure value EV equals 0, it is generally the optimal exposure value under the current illumination condition. Correspondingly, the exposure image acquired by the electronic device under EV0 is the optimal exposure image under the current illumination, which may also be called the reference exposure image.
Fig. 1 shows a hardware system suitable for use in the electronic device of the application.
The electronic device 100 may be a mobile phone, a smart screen, a tablet computer, a wearable electronic device, an in-vehicle electronic device, an Augmented Reality (AR) device, a Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), a projector, etc., and the specific type of the electronic device 100 is not limited in the embodiments of the present application.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The configuration shown in fig. 1 does not constitute a specific limitation on the electronic apparatus 100. In other embodiments of the application, electronic device 100 may include more or fewer components than those shown in FIG. 1, or electronic device 100 may include a combination of some of the components shown in FIG. 1, or electronic device 100 may include sub-components of some of the components shown in FIG. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination of software and hardware.
Illustratively, the processor 110 may include one or more processing units. For example, the processor 110 may include at least one of an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and a neural-network processing unit (NPU). The different processing units may be separate devices or integrated devices. The controller can generate operation control signals according to the instruction operation codes and timing signals to control instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. For example, the processor 110 may include at least one of an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM interface, and a USB interface.
The connection relationships between the modules shown in fig. 1 are merely illustrative, and do not constitute a limitation on the connection relationships between the modules of the electronic device 100. Alternatively, the modules of the electronic device 100 may also use a combination of the various connection manners in the foregoing embodiments.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example, the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The electronic device 100 may implement display functions through a GPU, a display screen 194, and an application processor. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 may be used to display images or video. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini light-emitting diode (Mini LED), a micro light-emitting diode (Micro LED), a micro OLED (Micro OLED), or a quantum dot light-emitting diode (QLED). In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
Illustratively, the electronic device 100 may implement a photographing function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
Illustratively, the ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the camera, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. The ISP can carry out algorithm optimization on noise, brightness and color of the image, and can optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
Illustratively, a camera 193 (which may also be referred to as a lens) is used to capture still images or video. The shooting function can be triggered and started by an application instruction, for example to capture an image of any scene. The camera may include an imaging lens, a filter, an image sensor, and the like. Light emitted or reflected by objects enters the imaging lens, passes through the filter, and finally converges on the image sensor. The imaging lens mainly converges and images the light emitted or reflected by all objects within the shooting angle of view (also called the scene to be shot or the target scene, which can be understood as the scene the user expects to shoot); the filter mainly filters out redundant light waves (for example waves other than visible light, such as infrared); and the image sensor may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The image sensor photoelectrically converts the received optical signal into an electrical signal and transmits the electrical signal to the ISP, which converts it into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts it into an image signal in a standard format such as RGB or YUV.
Illustratively, the digital signal processor is configured to process digital signals, and may process other digital signals in addition to digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Illustratively, video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. Thus, the electronic device 100 may play or record video in a variety of encoding formats, such as Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
Illustratively, the gyroscopic sensor 180B may be used to determine a motion pose of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., x-axis, y-axis, and z-axis) may be determined by gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through the reverse motion, so as to realize anti-shake. The gyro sensor 180B can also be used for scenes such as navigation and motion sensing games.
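For reference, the angular-velocity data described above can be read on Android through the standard SensorManager framework API, as in the Kotlin sketch below. The framework calls used here are real; how the anti-shake module consumes the values is not specified by the patent, so the listener body is only illustrative.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Reads gyroscope angular velocity around the x, y, and z axes.
class GyroReader(context: Context) : SensorEventListener {
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val gyro: Sensor? = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)

    fun start() {
        gyro?.let { sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME) }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent?) {
        val v = event?.values ?: return
        // Angular velocity in rad/s; an anti-shake module could integrate
        // these over time to estimate the shake angle to compensate.
        val (wx, wy, wz) = Triple(v[0], v[1], v[2])
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) {}
}
```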
For example, the acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically, x-axis, y-axis, and z-axis). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. The acceleration sensor 180E may also be used to recognize the gesture of the electronic device 100 as an input parameter for applications such as landscape switching and pedometer.
Illustratively, a distance sensor 180F is used to measure distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, for example, in a shooting scene, the electronic device 100 may range using the distance sensor 180F to achieve fast focus.
Illustratively, ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
Illustratively, the fingerprint sensor 180H is used to capture a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to perform functions such as unlocking, accessing an application lock, taking a photograph, and receiving an incoming call.
Illustratively, the touch sensor 180K is also referred to as a touch device. The touch sensor 180K may be disposed on the display screen 194; together they form a touch screen. The touch sensor 180K detects touch operations acting on or near it and may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may instead be disposed on the surface of the electronic device 100 at a location different from the display 194.
It should be understood that the foregoing is merely illustrative of the hardware system of the electronic device provided by the present application, and is not intended to limit the hardware system of the electronic device of the present application in any way.
Fig. 2 shows a schematic diagram of a software system of an electronic device provided by the related art.
In the prior art, as shown in fig. 2, a camera application is started, and in response to a start-shooting operation for time-lapse photography in the camera application, the camera application issues a preview request and a recording request. After the related image processing, one path of the data collected by the camera module generates a preview stream for display, while the other path is processed and then frame-extracted, sampled, and encoded, generating a preview callback stream as the resulting time-lapse video.
In this method, the frame-extraction rate is fixed: the frame-extraction encoding module must be configured in advance and then extracts and encodes frames at that fixed rate. As a result, the electronic device uses the same fixed frame rate regardless of the shooting content or scene, so the shooting effect is fixed and cannot meet the requirements of different scenes. Moreover, because a preview stream of relatively poor quality is processed, the quality of the resulting video is poor, which affects the user experience. In addition, acquiring the full initial preview stream and extracting frames after processing consumes considerable power, which affects the performance of the electronic device.
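For contrast with the snapshot-driven pipeline sketched earlier, the prior-art approach described above can be caricatured as follows (reusing the illustrative Frame, VideoEncoder, and processFrame stubs from that sketch): every frame pays the processing cost before most of them are discarded at a fixed rate.

```kotlin
// Sketch of the prior-art fixed-rate frame-extraction pipeline.
fun priorArtTimelapse(
    previewFrames: Sequence<Frame>,
    decimation: Int, // fixed: e.g. keep 1 of every 30 processed frames
    encoder: VideoEncoder
) {
    previewFrames
        .map { processFrame(it) }                      // all frames are processed first (costly)
        .filterIndexed { i, _ -> i % decimation == 0 } // then a fixed fraction is kept
        .forEach { encoder.encode(it) }
    encoder.finish()
}
```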
In view of the above, the present application provides a time-lapse photography method and related apparatus. In the embodiments of the application, after entering the time-lapse photography mode, a user can set shooting parameters and/or filters by operating the related controls provided on the display interface, so that before shooting, the user can manually adjust in advance, based on his or her own needs, the specific shooting parameters and filters related to time-lapse photography. When shooting starts, the electronic device shoots based on the adjusted shooting parameters and/or target filter effect, so that time-lapse videos with different effects can be shot adaptively according to different requirements, meeting various user needs and improving user experience.
The time-lapse photography method provided by the application can be used for shooting various scenes such as building construction, cityscapes, natural landscapes, astronomical phenomena, urban life, or biological evolution.
The time-lapse photography method provided by the application can be applied to shooting sunrises, sunsets, flower growth, traffic flow, the starry sky, and the like.
The time-lapse photography method provided by the application can also be applied to a streamer shutter (bulb-shutter snapshot) scene or streamer shutter function. The streamer shutter denotes a shooting mode that uses a longer exposure time to create light-trail effects.
In an embodiment of the application, a time-lapse photographing method and related equipment are provided. According to the time-lapse photographing method provided by the embodiment of the application, the problems of fixed photographing parameters and high frame extraction processing power consumption of the electronic equipment can be solved, so that the electronic equipment can obtain high-quality time-lapse photographing videos in a low-power consumption mode in combination with different requirements.
Alternatively, the method of time-lapse photography in the embodiment of the present application may be applied to a movie mode of a camera application, or the method of time-lapse photography in the embodiment of the present application may be applied to a time-lapse photography mode of a camera application.
FIG. 3 is a schematic diagram of a graphical user interface suitable for use with embodiments of the present application.
The electronic device displays a preview interface 11 after running a camera application, as shown in (a) of fig. 3. The preview interface 11 includes a preview image, mode options, a thumbnail display control, a camera flip control, a zoom control, and a shooting control X1. The mode options include large aperture, night scene, portrait, photo, video, movie, professional, and more; when a mode option moves onto the centerline of the preview interface 11, that mode is the currently selected shooting mode. The thumbnail display control displays a thumbnail of the last shot image, and the camera flip control switches between the front and rear cameras. After the camera application starts, the photo mode is selected by default, and the zoom factor shown by the zoom control is the one corresponding to the photo mode.
When the electronic device detects a sliding operation on the mode options, as shown in (a) of fig. 3, and the currently selected mode after sliding is "movie", the preview interface 12 is displayed, as shown in (b) of fig. 3. The shooting control displayed in the preview interface 12 switches from X1 to X2. Because the mode has switched from photo mode to movie mode, the zoom factor displayed by the zoom control is the one corresponding to movie mode, and the preview interface 12 may omit the camera flip control. In addition, when switching to movie mode, a control Y1 may be displayed in the top status bar of the preview interface 12; the control Y1 indicates whether the time-lapse photography mode is turned on or off. It can be understood that entering the time-lapse photography mode from the preview interface of movie mode effectively makes it a secondary mode within movie mode.
Based on the preview interface 12, the electronic device may detect a click on the control Y1, as shown in (c) of fig. 3, and display the preview interface 13, as shown in (d) of fig. 3. The shooting control displayed in the preview interface 13 switches from X2 to X3. Because the mode has switched from movie mode to time-lapse photography mode, a prompt Y2 may be displayed in the preview interface 13, such as "time-lapse photography is on", to inform the user that the time-lapse photography mode has been turned on by the click; related shooting can then be performed in this mode.
FIG. 4 is a schematic diagram of another graphical user interface suitable for use with embodiments of the present application.
Illustratively, after the electronic device runs the camera application, a preview interface 21 is displayed, as shown in fig. 4 (a). The description of the preview interface 21 may refer to the description of the preview interface 11, which is not described herein.
The electronic device detects a sliding operation on the mode options, as shown in (a) of fig. 4; when the currently selected mode after sliding is "more", the preview interface 22 is displayed, as shown in (b) of fig. 4. The preview interface 22 adds a cover layer over the original preview interface 21 and displays on it a control Y1 indicating time-lapse photography, a control indicating HDR shooting, and the like. Compared with the preview interface 21, the preview interface 22 may omit the thumbnail display control, camera flip control, zoom control, shooting control, and so on; the content displayed on the preview interface 22 can be set as needed, which the application does not limit.
Based on the preview interface 22, the electronic device detects a click on the control Y1, as shown in (c) of fig. 4, and displays the preview interface 23, as shown in (d) of fig. 4, in which a shooting control X3 is displayed. Because the current mode is the time-lapse photography mode, the preview interface 23 may display "time-lapse photography" to remind the user of the current mode, or it may display a prompt Y2 to inform the user that the time-lapse photography mode has been turned on by the click; related shooting can then be performed in this mode.
FIG. 5 is a schematic diagram of yet another graphical user interface suitable for use with embodiments of the present application.
Illustratively, as shown in (a) of fig. 5, in response to a click operation for the camera application, the electronic device executes the camera application and then, by default, displays the preview interface 31, as shown in (c) of fig. 5. Or after the electronic device runs the camera application program, displaying the preview interface 11, and selecting a shooting mode by default as shown in (b) of fig. 5, and when the currently selected mode is changed to "time-lapse shooting" after the electronic device detects a sliding operation for a mode option, displaying the preview interface 31 as shown in (c) of fig. 5.
The preview interface 31 is similar to the controls included in the preview interface 23 shown in fig. 4 (d), and specific reference is made to the description of the preview interface 23, which is not repeated here.
It can be understood that, compared with fig. 3 and fig. 4, the time-lapse photography mode in fig. 5 is equivalent to an option in the primary interface after the camera application program is started, and this arrangement is more convenient for the user to select the time-lapse photography mode to photograph, so that the depth of the trigger path is shortened.
Building on the three trigger entries above, on the one hand, as shown in (d) of fig. 3, (d) of fig. 4, and (c) of fig. 5, when the time-lapse photography mode is turned on, a shooting parameter setting control Y3 may further be displayed in the preview interface 13, the preview interface 23, and the preview interface 31. The shooting parameter setting control Y3 allows the user to adjust the shooting parameters related to time-lapse photography; see the detailed description of the time-lapse photography method provided later.
On the other hand, a filter control Y4 may be displayed in the top status bars of the preview interface 13 and the preview interface 23 when the time-lapse photography mode is turned on. The filter control Y4 allows the user to add different filters to, or remove filters from, the shot content; see the detailed description of the time-lapse photography method provided later, which is not repeated here.
It should be understood that, in the related art, the shooting parameters corresponding to the time-lapse photography mode are preset before the electronic device leaves the factory, so the parameters remain fixed while the user uses the device; the time-lapse photography mode therefore cannot adapt to different shooting requirements or be adjusted. In contrast, the present application lets the user manually adjust shooting parameters and select filters, so different shooting requirements can be met, time-lapse videos with different effects can be obtained, and the user experience is improved.
It should be understood that the foregoing is illustrative of the trigger entry for the time-lapse photography mode, and is not intended to limit the trigger entry for the time-lapse photography mode of the present application in any way.
The time-lapse photography method provided by the embodiment of the present application is described in detail below with reference to fig. 6.
Fig. 6 is a schematic flowchart of a time-lapse photography method according to an embodiment of the present application. The method 200 may be performed by the electronic device 100 shown in fig. 1, and the method 200 includes S210 to S250, and S210 to S250 are described in detail below, respectively.
S210, displaying a preview interface in a time-lapse shooting mode.
The preview interface comprises a shooting parameter setting control, a shooting control and a preview window, wherein the preview window is used for displaying a preview image. The preview image is an image generated by image processing of an initial preview image acquired by the electronic equipment.
As shown in fig. 2, the camera module may collect an initial preview image and send it to the camera hardware abstraction layer, which sends an initial preview stream containing the initial preview image to the image processing module for processing; the generated preview stream is then displayed.
Illustratively, the preview interface is shown as a preview interface 13 in fig. 3 (d), the shooting parameter setting control is shown as a control Y3 in fig. 3 (d), the shooting control is shown as a control X3, the preview window is shown as a partial area for displaying a preview image in fig. 3 (d), and the preview image is moon and sea in fig. 3 (d).
Alternatively, a camera application may be run before S210.
For example, the user may instruct the electronic device to run the camera application by clicking on an icon of the "camera" application.
For example, when the electronic device is in a locked state, the user may instruct the electronic device to run the camera application through a gesture that slides to the right on the display screen of the electronic device. Or the electronic equipment is in a screen locking state, the screen locking interface comprises an icon of the camera application program, and the user instructs the electronic equipment to operate the camera application program by clicking the icon of the camera application program. Or when the electronic equipment runs other applications, the application has the authority of calling the camera application program, and the user can instruct the electronic equipment to run the camera application program by clicking the corresponding control. For example, while the electronic device is running an instant messaging type application, the user may instruct the electronic device to run the camera application, etc., by selecting a control for the camera function.
It should be appreciated that the foregoing is illustrative of operations for running a camera application, that the operations may also be indicated by voice, or other operations, for instructing an electronic device to run a camera application, and that the application is not limited in any way.
It should also be understood that running the camera application may refer to launching the camera application.
S220, a first operation on the shooting parameter setting control is detected.
The first operation is used to indicate entering the shooting parameter setting sub-process and starting to set the shooting parameters, where the shooting parameters include an in-recording snapshot rate.
Alternatively, the first operation may be a click operation of the setting control for the photographing parameter.
Illustratively, as shown in (d) of fig. 3, the shooting parameter setting control is a control Y3, and the first operation is a click operation for the control Y3.
Optionally, the first operation is described above as a click operation by way of example. In embodiments of the application, the first operation may also be an operation that indicates setting the shooting parameters by voice, or any other instruction that starts setting the shooting parameters, which is not limited in any way.
S230, in response to the first operation, a shooting parameter setting sub-process is triggered, where the sub-process is used to set shooting parameters, and the shooting parameters include the in-recording snapshot rate.
It should be understood that the shooting parameter setting sub-process includes setting a capture rate in the video, and the shooting parameter may also include other parameters, which the present application is not limited to.
After triggering the shooting parameter setting sub-flow, the user may set one or more shooting parameters.
In the embodiment of the application, after the photographing parameter setting sub-process is triggered in response to the first operation, a user can perform adaptive setting and adjustment on specific photographing parameters in advance based on own needs, so that delay videos with different effects can be photographed according to own needs.
FIGS. 9 to 12 are exemplary user display interfaces related to the shooting parameter setting sub-process. FIG. 9 is a parameter setting interface for adjusting the recording duration and zoom magnification, FIG. 10 is a parameter setting interface for adjusting the in-recording snapshot rate, FIG. 11 is a parameter setting interface for adjusting the film-forming duration, and FIG. 12 is a parameter setting interface for adjusting the metering mode M, sensitivity ISO, shutter speed S, auto-exposure parameter EV, auto-focus parameter AF, white balance WB, and the like. The user can complete the shooting parameter setting sub-process based on FIGS. 9 to 12.
S240, a second operation for the photographing control is detected.
Wherein the second operation is for instructing start of shooting. The shooting comprises video recording and snap shots in the video recording.
It should be understood that the second operation includes a recording operation and a snapshot operation in the recording.
Illustratively, as shown in (d) of fig. 3, the shooting control is a control X3, and the second operation is a click operation for the control X3.
The foregoing takes a click operation as an example of the second operation. In the embodiment of the present application, the second operation may also be a voice instruction to start shooting, or any other instruction indicating that shooting should begin; the present application places no limitation on this.
S250, shooting is performed based on shooting parameters in response to the second operation, and a target video stream is generated and stored.
The shooting parameters are the shooting parameters set in S230.
In the embodiment of the present application, after the electronic device detects the click operation on the shooting control, the camera application is triggered to issue a video recording request and an in-video snapshot request, where the in-video snapshot request carries the set shooting parameters. In response to these two requests, an initial video stream and an initial snapshot stream are acquired; the initial video stream is then discarded, and the target video stream is obtained by processing the initial snapshot stream.
In the embodiment of the present application, the electronic device can obtain the corresponding time-lapse video based on the in-video snapshot rate set by the user. In this process, only the snapshot images captured at that rate need to be processed. Compared with the prior-art approach of first acquiring a large number of preview images, processing them, and then generating the time-lapse video by frame extraction, the approach provided by the present application can save a large amount of power.
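As a rough illustration of this power-consumption argument (assuming, hypothetically, a 30 fps capture and playback frame rate), the following Kotlin sketch compares how many frames each approach must process; the figures are illustrative, not taken from the embodiment.

```kotlin
// Back-of-the-envelope frame-count comparison under assumed numbers.
fun main() {
    val fps = 30
    val rate = 90                   // in-video snapshot rate set by the user
    val recordingSeconds = 30 * 60  // 30 min of recording

    // Prior art: process every preview frame, then extract 1 of every `rate`.
    val previewFramesProcessed = recordingSeconds * fps

    // This application: capture (and process) only one frame every `rate`
    // frames of real time.
    val snapshotFramesProcessed = previewFramesProcessed / rate

    println("frames processed (prior art):        $previewFramesProcessed")  // 54000
    println("frames processed (this application): $snapshotFramesProcessed") // 600
}
```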
In addition, since the target video stream is obtained by processing snapshot images, its image quality is higher than that of a video stream generated from preview images.
In one possible implementation, the image quality may include image detail information, image texture information, image brightness, or image quality assessment results obtained by any image quality assessment algorithm, which is not limited in any way by the present application.
The implementation of the embodiment of the present application is described in detail below with reference to fig. 7 to 14.
Illustratively, upon detecting a shooting parameter adjustment operation and a shooting operation, the electronic device generates a target video stream. The shooting parameter adjustment operation may be, for example, a click on the shooting parameter setting control followed by an operation on an accessory control, and the shooting operation may be, for example, a click on the shooting control.
Fig. 7 is a schematic flow chart of another time-lapse photography method provided by an embodiment of the present application. The method 300 may be performed by the electronic device shown in fig. 1, and includes S310 to S360, and S310 to S360 are described in detail below, respectively.
S310, running a camera application program.
For example, the user may instruct the electronic device to run the camera application by tapping the icon of the "camera" application. Alternatively, when the electronic device is in a locked state, the user may instruct it to run the camera application with a right-swipe gesture on the display screen. Alternatively, when the electronic device is in the locked state and the lock screen includes an icon of the camera application, the user may instruct the device to run the camera application by tapping that icon. Alternatively, when the electronic device is running another application that has permission to invoke the camera application, the user may instruct the device to run the camera application by tapping the corresponding control. For example, while the electronic device is running an instant messaging application, the user may instruct it to open the camera application by selecting a control for the camera function.
It should be appreciated that the foregoing operations for running the camera application are merely illustrative; the electronic device may also be instructed to run the camera application by voice or by other operations, and the present application places no limitation on this.
It should also be understood that running the camera application may refer to launching the camera application.
S320, entering a time-lapse shooting mode and displaying a preview interface.
Optionally, S320 may be implemented by automatically entering the time-lapse photography mode when the camera application is started. Alternatively, after the camera application is started, S320 may include detecting an operation indicating entry into the time-lapse photography mode and entering the mode in response to that operation, or entering the mode when the scene to be photographed is detected to contain a preset shooting object or a preset shooting gesture. Entering the time-lapse photography mode may also be referred to as starting the time-lapse photography mode.
The preset shooting objects may include streets, running water, forests, buildings (such as high-rise buildings and overpasses), stars, and the like; the preset shooting gestures may include an index finger sliding down, making a fist, and the like. Both can be preset as required, and the present application places no limitation on them.
Optionally, the preview interface may include a shooting parameter setting control, a filter control, a shooting control, and a preview window. The shooting parameter setting control is a control indicating adjustment of a shooting parameter; the filter control is a control indicating adjustment of a filter effect (which may also be referred to as an image mode); the shooting control is a control indicating shooting; and the preview window is used to display a preview image. The preview image may include a first preview image, which is generated by processing an initial preview image collected by the electronic device, the initial preview image being an image collected by the electronic device in real time.
It should be understood that, when collecting the initial preview image, the electronic device may use a recognition method provided by the related art to identify its content; when a preset shooting object or a preset shooting gesture is recognized, the current shooting mode is switched to the time-lapse photography mode and a preview interface including the first preview image is displayed.
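A minimal Kotlin sketch of this automatic switch is given below; the recognizer interface and label names are hypothetical stand-ins for whatever recognition method the related art provides.

```kotlin
// Hypothetical label sets drawn from the examples above.
val presetObjects = setOf("street", "running water", "forest", "building", "stars")
val presetGestures = setOf("index finger slide down", "fist")

// Stand-in for the recognition method provided by the related art.
fun interface FrameRecognizer {
    fun recognize(frame: ByteArray): List<String> // labels found in the initial preview image
}

// Switch to time-lapse photography mode when any preset object or gesture is found.
fun shouldEnterTimeLapse(frame: ByteArray, recognizer: FrameRecognizer): Boolean =
    recognizer.recognize(frame).any { it in presetObjects || it in presetGestures }
```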
Illustratively, as shown in (c) of fig. 3 and (c) of fig. 4, the electronic device may detect an operation of the control Y1 corresponding to the time-lapse photography mode by the user, such as a click operation, and enter the time-lapse photography mode in response to the operation, and display the preview interface 13 as shown in (d) of fig. 3 or the preview interface 23 as shown in (d) of fig. 4.
The preview interface 13 and the preview interface 23 are both preview interfaces, the shooting parameter setting control may refer to a control Y3 in the preview interface 13 and the preview interface 23, the filter control may refer to a control Y4 in the preview interface 13 and the preview interface 23, the shooting control may refer to a control X3 in the preview interface 13 and the preview interface 23, and the preview image may refer to an image displayed in the preview interface 13 or the preview interface 23. The partial display areas of the preview interface 13 and the preview interface 23 for displaying the preview image are preview windows.
Illustratively, as shown in (b) of fig. 5, the electronic device may detect an operation, such as a sliding operation, of the user with respect to the preview interface 11, and in response to the operation, enter the time-lapse photography mode when the time-lapse photography mode is the currently selected mode, and display the preview interface 31 as shown in (c) of fig. 5.
The preview interface 31 is a preview interface, the shooting parameter setting control may refer to a control Y3 in the preview interface 31, the filter control may refer to a control Y4 in the preview interface 31, the shooting control may refer to a control X3 in the preview interface 31, and the first preview image may refer to an image displayed in the preview interface 31. It will be appreciated that a localized area in the preview interface 31 may be used to display the first preview image, as the application is not limited in this regard.
The foregoing is described using, as an example, a preview interface that includes the shooting parameter setting control, the filter control, the shooting control, and the first preview image. The preview interface may instead include only the shooting parameter setting control, the shooting control, and the first preview image, or only the filter control, the shooting control, and the first preview image.
In the embodiment of the present application, when the electronic device displays the preview interface, the user may trigger the first process S330 or the second process S340, where the first process S330 sets shooting parameters in the time-lapse photography mode, and the second process S340 sets the filter in the time-lapse photography mode.
Optionally, the first process S330 may be triggered before the second process S340, that is, the shooting parameters are set first and then the filter; or the second process S340 may be triggered before the first process S330, that is, the filter is set first and then the shooting parameters.
The first process S330 includes S331 and S332, and S331 and S332 are described in detail below.
S331, detecting clicking operation of the shooting parameter setting control.
Optionally, S331 takes a click on the shooting parameter setting control as an example. S331 may also detect any other instruction to start shooting parameter setting, for example, instructing the electronic device by voice or other indication information.
S332, in response to a click operation for the shooting parameter setting control, a shooting parameter setting sub-flow is triggered (S3321 to S3323).
The shooting parameter setting sub-flow indicates a flow of adjusting one or more shooting parameters by the electronic device based on the user's operation of the sub-shooting parameter control in the time-lapse shooting mode.
In the embodiment of the present application, the shooting parameter setting sub-process may include a click operation on any sub-shooting parameter control, and a sliding operation, click operation, input operation, or the like on the accessory control corresponding to that sub-shooting parameter control.
Illustratively, the photographing parameter setting sub-process may include S3321 to S3323, and S3321 to S3323 will be described in detail below, respectively, with reference to fig. 8.
S3321, in response to a click operation on the shooting parameter setting control in the preview interface, a first shooting parameter setting interface is displayed.
The first photographing parameter setting interface may be referred to as a first-level interface in the photographing parameter setting sub-flow.
Optionally, the first shooting parameter setting interface includes four sub shooting parameter controls, which may be respectively referred to as a first sub shooting parameter control, a second sub shooting parameter control, a third sub shooting parameter control, and a fourth sub shooting parameter control. Of course, other sub-shooting parameter controls may also be included in the first shooting parameter setting interface, which is not limited in any way in the embodiment of the present application.
It should be noted that, when a sub-shooting parameter control is in the selected state, the accessory control corresponding to it may be displayed in the first shooting parameter setting interface; when no sub-shooting parameter control is selected, no accessory control is displayed. For example, a white icon indicates that a sub-shooting parameter control is selected, and a gray icon indicates that it is not.
The first sub-shooting parameter control is a control for entering a secondary interface for the recording duration and zoom magnification; the second sub-shooting parameter control is a control for entering an interface for the rate (also called the snapshot rate); the third sub-shooting parameter control is a control for entering an interface for the film-forming duration; and the fourth sub-shooting parameter control is a control for entering an interface for shooting parameters such as the metering mode M, sensitivity ISO, shutter speed S, auto-exposure parameter EV, auto-focus parameter AF, and white balance WB.
The metering mode M may include three types: matrix metering, suitable for wide landscapes; center-weighted metering, suitable for scenes whose main subject is in the middle of the frame; and spot metering, suitable for stage photography.
Illustratively, in response to a click operation on the shooting parameter setting control Y3, the electronic device may display a shooting parameter setting interface 901, as shown in (a) of fig. 9. The shooting parameter setting interface 901 is a first shooting parameter setting interface in which the first sub-shooting parameter control Y51 is selected. With control Y51 selected, the interface 901 may include the accessory control corresponding to Y51, such as control Y52 shown in (a) of fig. 9; control Y52 presents a sliding axis for the recording duration, which lets the user adjust the recording duration in the time-lapse photography mode.
In addition, optionally, with the first sub-shooting parameter control selected, the shooting parameter setting interface 901 may further include another accessory control corresponding to it, such as control Y53 shown in (a) of fig. 9; control Y53 presents multiple zoom magnification options, which let the user adjust the zoom magnification in the time-lapse photography mode. The zoom magnifications may also be displayed as a sliding axis.
Illustratively, in response to a click operation on the shooting parameter setting control Y3, the electronic device may display a shooting parameter setting interface 1001, as shown in (a) of fig. 10, in which the second sub-shooting parameter control Y61 is selected. With control Y61 selected, the interface 1001 further includes the accessory control corresponding to Y61, such as control Y62 shown in (a) of fig. 10; control Y62 presents a sliding axis for the rate, which lets the user adjust the in-video snapshot rate in the time-lapse photography mode.
Illustratively, in response to a click operation on the shooting parameter setting control Y3, the electronic device may display a shooting parameter setting interface 1101, as shown in (a) of fig. 11, in which the third sub-shooting parameter control Y71 is selected. With control Y71 selected, the interface 1101 further includes the accessory control corresponding to Y71, such as control Y72 shown in (a) of fig. 11; control Y72 presents a sliding axis for the film-forming duration, which lets the user adjust the film-forming duration in the time-lapse photography mode.
Illustratively, in response to a click operation on the shooting parameter setting control Y3, the electronic device may display a shooting parameter setting interface 1201, as shown in (a) of fig. 12, in which the fourth sub-shooting parameter control Y81 is selected. With control Y81 selected, the interface 1201 further includes the corresponding accessory control, such as control Y82 shown in (a) of fig. 12; control Y82 lists several professional shooting parameter names with their default modes or values, and lets the user set the mode or value of each professional shooting parameter in the time-lapse photography mode.
The first shooting parameter setting interface may further include a shooting control, a preview window, and a preview image displayed in the preview window; for descriptions of these, reference may be made to S320 above, and details are not repeated here.
Optionally, in S3321 a click on the shooting parameter setting control may directly display the shooting parameter setting interface 901 with the first sub-shooting parameter control selected, the interface 1001 with the second sub-shooting parameter control selected, the interface 1101 with the third sub-shooting parameter control selected, or the interface 1201 with the fourth sub-shooting parameter control selected. S3321 may instead display an interface in which all four sub-shooting parameter controls are unselected (for example, all four icons are gray); from there, in response to the user's click on any sub-shooting parameter control, the shooting parameter setting interface corresponding to that control's selected state is displayed.
It should be understood that, when a shooting parameter setting interface as shown in (a) of fig. 9, (a) of fig. 10, (a) of fig. 11, or (a) of fig. 12 is displayed, the accessory controls shown in the interface can be switched in response to the user clicking any other, unselected sub-shooting parameter control.
S3322, in response to a user operation on an accessory control in the first shooting parameter setting interface, a second shooting parameter setting interface is displayed and the corresponding shooting parameter is changed.
It should be appreciated that, depending on which first shooting parameter setting interface's accessory control is operated, the electronic device may display a different second shooting parameter setting interface. The second shooting parameter setting interface includes the same accessory control as the corresponding first shooting parameter setting interface, and it also includes a preview image.
It should be understood that the user operation includes a slide operation, an input operation, etc., to which the present application is not limited in any way.
Illustratively, as shown in (a) of fig. 9, for the accessory control (control Y52) corresponding to the first sub-shooting parameter control in the shooting parameter setting interface 901, when a sliding operation on control Y52 is detected, the electronic device may display the shooting parameter setting interface 902 shown in (b) of fig. 9. The sliding operation is a user operation on the accessory control, and the shooting parameter setting interface 902 is a second shooting parameter setting interface.
The shooting parameter setting interface 902 includes control Y52, and its top status bar may display the adjusted recording duration along with the other shooting parameters; for example, the default in-video snapshot rate is 15x, the film-forming duration is unlimited, and in response to the sliding operation on control Y52 the adjusted recording duration is 30 min. If further sliding operations on control Y52 are detected, the recording duration value in the top status bar changes accordingly. The shooting parameter setting interface 902 also includes control Y53; when control Y53 is operated by another sliding operation, the zoom magnification changes as well.
Illustratively, as shown in (a) of fig. 10, for the accessory control (control Y62) corresponding to the second sub-shooting parameter control in the shooting parameter setting interface 1001, when a sliding operation on control Y62 is detected, the electronic device may display the shooting parameter setting interface 1002 shown in (b) of fig. 10. The sliding operation is a user operation on the accessory control, and the shooting parameter setting interface 1002 is a second shooting parameter setting interface.
The shooting parameter setting interface 1002 includes control Y62, and its top status bar may display the adjusted rate along with the other shooting parameters. For example, continuing the previous example, the recording duration has been adjusted to 30 minutes (min), the film-forming duration is unlimited, and in response to the sliding operation on control Y62 the adjusted rate is 90x. If further sliding operations on control Y62 are detected, the rate value in the top status bar changes accordingly.
Illustratively, as shown in (a) of fig. 11, for the accessory control (control Y72) corresponding to the third sub-shooting parameter control in the shooting parameter setting interface 1101, when a sliding operation on control Y72 is detected, the electronic device may display the shooting parameter setting interface 1102 shown in (b) of fig. 11. The sliding operation is a user operation on the accessory control, and the shooting parameter setting interface 1102 is a second shooting parameter setting interface.
The shooting parameter setting interface 1102 includes control Y72, and its top status bar may display the adjusted film-forming duration along with the other shooting parameters. For example, continuing the two previous examples, the rate has been adjusted to 90x and the recording duration to 30 min, and in response to the sliding operation on control Y72 the adjusted film-forming duration is 3 min. If further sliding operations on control Y72 are detected, the film-forming duration value in the top status bar changes accordingly.
It should be explained that the recording duration indicates how long the electronic device records video, while the film-forming duration indicates the duration of the generated target video stream.
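The text does not spell out the arithmetic linking these settings, but one plausible reading, sketched below in Kotlin under an assumed 30 fps playback rate, is that the rate fixes the snapshot interval while the film-forming duration caps the output length; treat this as an interpretation, not the patent's definition.

```kotlin
// Illustrative interpretation only; all numbers and relations are assumptions.
fun main() {
    val playbackFps = 30        // assumed output frame rate
    val rate = 90               // 90x speed-up: one output second covers 90 real seconds
    val recordingSec = 30 * 60  // recording duration: 30 min
    val filmCapSec = 3 * 60     // film-forming duration: 3 min, read here as an upper bound

    val snapshotIntervalSec = rate.toDouble() / playbackFps // one snapshot every 3.0 s
    val producedSec = recordingSec / rate                   // 20 s of output from this recording
    val finalSec = minOf(producedSec, filmCapSec)           // the cap does not bind in this example

    println("snapshot interval: $snapshotIntervalSec s")
    println("target video length: $finalSec s")
}
```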
Illustratively, as shown in (a) of fig. 12, for the accessory control (control Y82) corresponding to the fourth sub-shooting parameter control in the shooting parameter setting interface 1201, when a click on a specific shooting parameter name or value in control Y82 is detected, the electronic device may display the shooting parameter setting interface 1202 shown in (b) of fig. 12; based on interface 1202, the user can change the value of any shooting parameter in control Y82. The click and input operations are user operations on the accessory control, and the shooting parameter setting interface 1202 is a second shooting parameter setting interface.
The shooting parameter setting interface 1202 includes control Y82, and the value of each shooting parameter in control Y82 can be changed by an input operation. The top status bar of interface 1202 may also display the other shooting parameters; combining the three previous examples, the rate has been adjusted to 90x, the recording duration to 30 min, and the film-forming duration to 3 min.
S3323, switch back to the preview interface. When an adjustment operation on the shooting parameters has been detected, the shooting parameters corresponding to the returned preview interface are the adjusted shooting parameters.
It should be understood that, when the user triggers the shooting parameter setting sub-process but does not actually adjust any parameter, the returned preview interface has the same shooting parameters as the previously displayed preview interface.
For example, as shown in (b) of fig. 9, when the electronic device detects that the sliding operation has stopped, or after it has stopped for a preset period, it automatically switches to the preview interface shown in (c) of fig. 5; the shooting parameters corresponding to the switched-to preview interface 31 then differ from those before adjustment.
For example, as shown in (b) of fig. 9, when the electronic device detects that the sliding operation has stopped, it may display a shooting parameter setting interface in which all four sub-shooting parameter controls are unselected; at this time, the shooting parameter setting interface 901 may further include a return control, such as an "×" next to "time-lapse photography", indicating a return to the previous interface. Thus, when a click on the return control is detected in the interface with all four sub-shooting parameter controls unselected, the device returns to the preview interface, such as the preview interface 31; likewise, when a click on control X3 is detected in that interface, the device returns to the preview interface.
Fig. 9 is used above to illustrate the process of returning after shooting parameter adjustment; the corresponding processes for fig. 10 and 11 may refer to the above description and are not repeated here.
It should be understood that, through the shooting parameter setting sub-process described in S3321 to S3323, the various shooting parameters involved in a time-lapse photography scene can be manually adjusted and set, so that subsequent time-lapse shooting can be performed according to the user's needs, improving the user experience.
The second process S340 includes S341 and S342, and S341 and S342 are described in detail below.
S341, detecting clicking operation for the filter control.
Optionally, S341 takes a click on the filter control as an example. S341 may also detect any other instruction to start filter setting, for example, instructing the electronic device by voice or other indication information.
S342, in response to a click operation for the filter control, a filter setting sub-flow is triggered (S3421 to S3423).
The filter setting sub-process is the process in which the electronic device invokes a filter effect based on the user's operation on the filter control in the time-lapse photography mode.
In the embodiment of the present application, the filter setting sub-process may include a click operation on any icon indicating a filter effect, a sliding operation across the icons indicating different filter effects, and the like.
Illustratively, the filter setting sub-process may include S3421 to S3423, which are described in detail below with reference to fig. 13.
S3421, in response to a click operation on the filter control in the preview interface, a first filter setting interface is displayed.
Optionally, the first filter setting interface includes the accessory control of the filter control, that is, a number of filter effect options plus an option with no filter effect, which may also be called the original image option. The filter effects may include vivid, blue tone, neutral gray, and the like.
Illustratively, in response to a click operation on the filter control shown in (a) of fig. 14, the electronic device may display the filter setting interface 24 shown in (b) of fig. 14; the filter setting interface 24 is a first filter setting interface. In response to the click operation, the filter control Y4 is in the on state, and the accessory control corresponding to the filter control (control Y9) is also displayed on the filter setting interface 24, namely a number of icons indicating different filter effects plus the icon of the original image option. When the filter function is turned on, the default effect is the original image.
S3422, in response to a user operation on the accessory control in the first filter setting interface, a target filter effect is determined, and a second filter setting interface combined with the target filter effect is displayed.
It should be appreciated that, when the accessory control in the first filter setting interface is operated, the electronic device may display a second filter setting interface, in which the preview image may carry a different filter effect.
The second filter setting interface includes the same accessory control as the first filter setting interface; however, because a target filter effect has been set, the preview image in the second filter setting interface has a different filter effect from the original preview image, while the image content is the same.
It should be understood that the user operation includes a slide operation, a click operation, or the like, to which the present application is not limited in any way.
Illustratively, as shown in (b) of fig. 14, for the filter setting interface 24, the electronic device may detect a sliding operation by the user on the accessory control (control Y9); when it determines that the user has selected "effect 1" as the target filter effect, it displays the filter setting interface 25 shown in (c) of fig. 14. The sliding operation is a user operation on the accessory control, and the filter setting interface 25 is a second filter setting interface. The preview image displayed in the filter setting interface 25 carries the target filter effect corresponding to "effect 1", so its appearance differs from the preview image in the filter setting interface 24. Here, "effect 1" may indicate "vivid", "effect 2" "blue tone", and "effect 3" "neutral gray".
S3423, switch back to the preview interface. When an adjustment operation on the filter has been detected, the filter effect of the preview image in the returned preview interface is the target filter effect.
It should be understood that, when the user triggers the filter setting sub-process but does not actually adjust the filter, the returned preview interface has the same effect as the previously displayed preview interface, whether that is the original image effect or the same filter effect.
Illustratively, when the electronic device detects that the sliding operation has stopped, or after it has stopped for a preset period, it automatically switches to the interface 26 shown in (d) of fig. 14, where interface 26 is a preview interface. Alternatively, as shown in (c) of fig. 14, the filter setting interface 25 may switch to interface 26 in response to a click on the filter control Y4, or in response to a click on an area of the filter setting interface 25 other than control Y9. At this point, the filter effect of interface 26 differs from that of the preview interface before adjustment.
For example, a return control indicating a return to the previous interface may also be provided in the filter setting interface 25 shown in (c) of fig. 14, so that after adjusting the filter effect the user can return from the filter setting interface 25 to the preview interface by clicking it; the filter effect of the preview image in the returned preview interface is then the target filter effect.
It should be understood that, through the filter setting sub-process described in S3421 to S3423, the filter effect involved in a time-lapse photography scene can be adjusted manually, so that subsequent time-lapse shooting can be performed according to the user's needs, improving the user experience.
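For illustration only, the following Kotlin sketch shows what a simple "vivid"-style effect could look like as a per-pixel saturation boost; real filter effects are more likely tuned LUTs or ISP settings, so this is an assumption, not the embodiment's algorithm.

```kotlin
import kotlin.math.roundToInt

// Boost saturation of an ARGB frame by pushing each channel away from gray.
fun vivid(pixels: IntArray, boost: Float = 1.3f): IntArray =
    IntArray(pixels.size) { i ->
        val p = pixels[i]
        val r = (p shr 16) and 0xFF
        val g = (p shr 8) and 0xFF
        val b = p and 0xFF
        val gray = 0.299f * r + 0.587f * g + 0.114f * b // per-pixel luminance
        fun sat(c: Int) = (gray + (c - gray) * boost).roundToInt().coerceIn(0, 255)
        (0xFF shl 24) or (sat(r) shl 16) or (sat(g) shl 8) or sat(b)
    }
```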
S350, a click operation on the shooting control is detected.
Optionally, S350 takes a click on the shooting control as an example. S350 may also detect any other operation that starts shooting, for example, instructing the electronic device to start shooting by voice or other indication information.
S360, shooting is performed based on the set shooting parameters and/or the filter in response to the clicking operation, and a target video stream is generated and stored.
Optionally, the first and second processes may be performed after S350, that is, after shooting is started, shooting parameter setting and/or filter setting may be performed. It will be appreciated that in this manner, the shooting parameters and filter effects will only be combined in the part of the video stream acquired after the setup is completed.
S310 to S360 take as an example the process in which the electronic device detects the user's shooting parameter adjustment and filter adjustment. Alternatively, as one implementation, the electronic device may perform only S310 to S330, S350, and S360, that is, detect only the user's shooting parameter adjustment and shooting. As another implementation, the electronic device may perform only S310, S320, S340, S350, and S360, that is, detect only the user's filter adjustment and shooting.
In the embodiment of the present application, after entering the time-lapse photography mode, the user can set shooting parameters and/or filters by operating the relevant controls on the display interface. Before shooting, the user can thus manually adjust, in advance and according to his or her own needs, the specific shooting parameters and filters involved in time-lapse photography; when shooting then takes place, the electronic device shoots based on the adjusted shooting parameters and/or the target filter effect. Time-lapse videos with different effects can therefore be captured adaptively for different needs, meeting the user's various requirements and improving the user experience.
Because the shooting parameters and the filter are changeable and simple to set, the in-video snapshot rate, for example, can be preset, and the electronic device can then obtain the corresponding time-lapse video based on the rate set by the user in the time-lapse photography mode. In this process, only the images captured at that rate need to be processed. Compared with the prior-art approach of first acquiring a large number of images and then generating the time-lapse video from frames extracted after image processing, the approach provided by the present application can save a large amount of power. In addition, since the target video stream is obtained by processing snapshot images, its image quality is higher than in the prior art.
The implementation process related to the above step S360 is described in detail below with reference to fig. 15.
Fig. 15 is a software architecture diagram according to an embodiment of the present application. As shown in fig. 15, the software architecture provided by the embodiment of the present application may include an application layer, an application framework layer, a hardware abstraction layer, and a hardware layer.
By way of example, the application layer may include a series of application packages. As shown in fig. 15, the application package may include a camera application and a gallery application.
The application layer sits at the top of the architecture and is responsible for direct interaction with the user; once it receives the user's photographing or recording request, it sends the request through an interface to the application framework layer and awaits the processing result fed back by the framework layer. For example, as shown in fig. 15, in the embodiment of the present application the result includes a preview stream and a target video stream, which the application layer then presents to the user.
The application framework layer is located between the application layer and the hardware abstraction layer, and the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for the application of the application layer. The application framework layer includes a number of predefined functions.
The application framework layer is a framework of applications, and a developer can develop some applications based on the application framework layer under the development principle of following the framework of the applications. In the embodiment of the application, the application framework layer comprises an access interface and the like corresponding to the camera application.
The hardware abstraction layer is used for abstracting hardware and providing a virtual hardware use platform for an operating system.
For example, as shown in FIG. 15, the hardware abstraction layer may include a camera hardware abstraction layer and an image processing module. The functions of each module will be described in detail in the following embodiments, and are not described in detail here.
Illustratively, as shown in fig. 15, the hardware layer mainly includes a display screen, a camera module (e.g., including a camera 193), and the like.
Next, taking as an example a click on the shooting control after it is detected that the user has set shooting parameters through the first process shown in fig. 7 and set the filter through the second process, the interaction between the modules in the electronic device is described in detail with reference to the software architecture diagram shown in fig. 15.
S411, after detecting the set shooting parameters and the set filters, the camera application receives a click operation of the shooting control by the user.
Illustratively, as shown in (d) of fig. 14, the user has manually set the shooting parameters and filter in the time-lapse photography mode, such as a rate of 90x and a recording duration of 30 min, with the filter being the vivid filter indicated by "effect 1"; after setting them, the user clicks control X3 to instruct the electronic device to shoot with the shooting parameters and target filter effect just set.
S412, in response to the click operation, the camera application generates and sends a video recording request to the hardware abstraction layer.
S412 may specifically include: in response to the click operation, the camera application generates and sends the video recording request to the application framework layer, and the application framework layer forwards it to the hardware abstraction layer after receiving it.
It should be noted that the shooting parameters involved in the video recording request are based on the camera's original settings, which the present application does not change. For example, if the default recording rate in the time-lapse photography mode is 240x, the shooting parameter carried by the issued recording request is 240x.
S413, in response to the click operation, the camera application generates and sends an in-video snapshot request to the hardware abstraction layer.
It should be understood that the in-video snapshot request issued by the camera application carries the set shooting parameters and the set filter.
It should be further understood that the time-lapse photography mode is essentially a special case of the video recording mode; in the prior art, therefore, only a recording request is issued in response to the user's operation on the shooting control, and the acquired and processed video stream serves as the time-lapse video. In the present application, if the user sets the rate to 90x, for example, the in-video snapshot request carries an in-video snapshot rate of 90x.
The video recording request and the in-video snapshot request may be collectively referred to as the shooting request.
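The following Kotlin sketch models the two requests of S412 and S413; the request types and the HAL-facing interface are hypothetical names introduced for illustration, not the patent's actual API.

```kotlin
// Hypothetical request types mirroring S412/S413.
sealed interface ShootingRequest
data class RecordingRequest(val rate: Int) : ShootingRequest // camera's original settings
data class InVideoSnapshotRequest(
    val snapshotRate: Int, // user-set in-video snapshot rate
    val filter: String?    // user-set target filter effect, if any
) : ShootingRequest

interface HalFacade { fun submit(request: ShootingRequest) }

fun onShootingControlClicked(hal: HalFacade) {
    hal.submit(RecordingRequest(rate = 240)) // S412: default rate, left unchanged
    hal.submit(InVideoSnapshotRequest(snapshotRate = 90, filter = "vivid")) // S413: user's settings
}
```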
S414, the hardware abstraction layer delivers the video recording request and the in-video snapshot request to the hardware layer, and receives the two data streams (a first data stream and a second data stream) returned by the hardware layer.
S414 may specifically include: the hardware abstraction layer issues the video recording request and the in-video snapshot request to the camera module in the hardware layer through the camera hardware abstraction layer it contains, and the camera module returns the result to the camera hardware abstraction layer after collecting the image data.
It should be understood that the two data streams correspond to recording and photographing respectively: the camera module in effect simultaneously acquires image data for the recording request and image data for the snapshot request, or equivalently, transmits both the recording data stream and the photographing data stream through the video recording channel.
For example, the first data stream sent by the camera module to the camera hardware abstraction layer may be used to generate the video stream, which is the video obtained by shooting with the electronic device's default shooting parameters in the time-lapse photography mode.
For example, the second data stream sent by the camera module to the camera hardware abstraction layer may be used to generate the photographing stream, which consists of the multi-frame images captured with the shooting parameters set by the user in the time-lapse photography mode.
S415, after generating the initial video stream based on the first data stream, the hardware abstraction layer discards the initial video stream.
Specifically, the camera hardware abstraction layer in the hardware abstraction layer generates the initial video stream based on the first data stream and then discards it; the discarded stream need not be deleted, though it may be.
Illustratively, the initial video stream may include a plurality of frames of initial video images located in the RAW domain.
It should be understood that, since the present application in effect takes photographs using only the video recording channel, the initial video stream acquired for the recording request is not the data of the video to be finally obtained and requires no processing, so it can be discarded once generated.
S416, after the hardware abstraction layer generates the initial snapshot stream based on the second data stream, the initial snapshot stream is processed.
Optionally, the hardware abstraction layer includes an image processing module in addition to the camera hardware abstraction layer.
Specifically, a camera hardware abstraction layer in the hardware abstraction layer generates an initial snapshot stream based on the second data stream, and transmits the initial snapshot stream to the image processing module for processing.
For example, the initial snapshot stream may include multiple frames of initial snapshot images located in the RAW domain.
It should be understood that the present application in effect takes photographs through the video recording channel, and the snapshot stream is collected according to the shooting parameters set by the user, so the collection matches the user's needs without wasting resources.
In addition, photographing through the video recording channel does not affect the original functions of the related chips in the electronic device, because the in-video snapshot function is bound to the recording function.
For example, fig. 16 is a schematic structural diagram of the modules in the hardware abstraction layer provided by an embodiment of the present application. As shown in (a) of fig. 16, the chip involved may be a Qualcomm chip with the CamX-CHI architecture, which includes a use case (Usecase) corresponding to the time-lapse photography mode in the camera application; the Usecase includes a feature (Feature2), and Feature2 includes a corresponding pipeline (Pipeline). When the camera hardware abstraction layer receives a shooting request, Feature2 can send the result returned by the hardware layer to the Pipeline for processing, so as to obtain the video stream and the photographing stream and realize the recording function and the in-video photographing function.
Alternatively, S415 and S416 may be performed simultaneously. For example, when the shooting control is clicked, the acquisition of the initial video stream and the initial snapshot stream is triggered in response to the issued recording request and in-video snapshot request; the initial video stream is discarded while the initial snapshot stream is processed.
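A Kotlin sketch of this asymmetric treatment of the two streams is shown below; the stream and function names are illustrative, and the processing body is deliberately left as a stub.

```kotlin
class RawFrame // placeholder for one RAW-domain image

// S415: the initial video stream is generated and then simply dropped — no
// processing, which is where the power saving comes from.
fun discardInitialVideoStream(stream: List<RawFrame>) { /* intentionally a no-op */ }

// S416: the initial snapshot stream is forwarded to the image processing module.
fun processInitialSnapshotStream(stream: List<RawFrame>) {
    stream.forEach { frame -> snapshotImageProcessing(frame) }
}

fun snapshotImageProcessing(frame: RawFrame) { /* filter → anti-shake → RAW-to-YUV, sketched below */ }

fun onStreamsReturned(firstDataStream: List<RawFrame>, secondDataStream: List<RawFrame>) {
    discardInitialVideoStream(firstDataStream)     // recording stream: dropped
    processInitialSnapshotStream(secondDataStream) // snapshot stream: processed
}
```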
Optionally, the image processing module includes a snapshot image processing module.
Specifically, the snapshot image processing module processes the initial snapshot stream after receiving it, so as to improve image quality.
Optionally, the snapshot image processing module may include a first filter sub-module, together with at least one of an anti-shake processing sub-module and a first format conversion sub-module.
For example, as shown in (b) of fig. 16, if the target filter effect is set to "vivid", the first filter sub-module processes the stream in combination with the set filter effect during the processing of the present application, so that the images in the processed initial snapshot stream carry the "vivid" filter effect. That is, the first filter sub-module is configured to process the input image in combination with the filter effect set by the user and generate an image carrying that effect.
If the user has not set a filter effect, the first filter sub-module does not need to process the input image, and this step can be skipped.
As shown in (b) of fig. 16, the initial snapshot stream may also carry other data, such as gyroscope sensor (gyro) data and image feature points. Based on these, the anti-shake processing sub-module can perform anti-shake processing on the initial snapshot images in the initial snapshot stream, or on the images output by the first filter sub-module, so as to eliminate the effect of shake during the camera module's capture and improve image sharpness.
The gyroscope sensor data can be acquired from the gyroscope sensor, the image feature points can be extracted with a network model provided by the related art, and the anti-shake algorithm can be an intra-frame anti-shake algorithm provided by the related art; the present application is not limited to these.
Illustratively, as shown in (b) of fig. 16, the first format conversion sub-module is configured to perform format conversion on the initial snapshot stream, the images output by the first filter sub-module, or the images output by the anti-shake processing sub-module, converting images in the RAW domain into images in the YUV domain; the processed initial snapshot stream may therefore include multiple frames of images in the YUV domain.
The amount of image data in the YUV domain is smaller than in the RAW domain, which speeds up subsequent transmission. Of course, the first format conversion sub-module may also perform other format conversions; the present application places no limitation on this.
Illustratively, the snapshot image processing module may further include a buffering sub-module for buffering data during processing, such as all or part of the images in the processed initial snapshot stream.
It should be understood that, if the rate was set during shooting parameter setting as described above, the number of image frames in the processed initial snapshot stream is the same as in the initial snapshot stream, that is, it corresponds to the rate in the user-set shooting parameters. In the processing of the present application, the snapshot image processing module therefore needs no frame extraction.
It should be understood that fig. 16 is only an example for a snap image processing module, and the snap image processing module may also include other functional sub-modules, which is not limited in any way by the present application.
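The sub-module chain can be pictured with the following Kotlin sketch; the three processing functions are stubs standing in for the related-art algorithms (filter rendering, intra-frame anti-shake, RAW-to-YUV conversion), and all names are assumptions.

```kotlin
class RawImage  // RAW-domain frame from the initial snapshot stream
class YuvImage  // YUV-domain frame in the processed stream
data class GyroSample(val x: Float, val y: Float, val z: Float)

// Stub: render the set filter effect (e.g. "vivid") onto the frame; skipped when none is set.
fun applyFilter(img: RawImage, effect: String?): RawImage = img

// Stub: warp the frame using gyro data and image feature points.
fun antiShake(img: RawImage, gyro: List<GyroSample>): RawImage = img

// Stub: YUV carries less data than RAW, so later transfer is faster.
fun rawToYuv(img: RawImage): YuvImage = YuvImage()

// Order of the chain as described for the snapshot image processing module.
fun processSnapshot(img: RawImage, effect: String?, gyro: List<GyroSample>): YuvImage =
    rawToYuv(antiShake(applyFilter(img, effect), gyro))
```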
S417, the hardware abstraction layer returns the processed initial snapshot stream to the gallery application program.
S418, the gallery application program encodes the processed initial snapshot stream, and then generates a target video stream for storage.
The target video stream may also be called the time-lapse video, that is, the finished video obtained in response to the user's click on the shooting control in the time-lapse photography mode.
In S411, after the user clicks the shooting control, the shooting control in the display interface may switch to an end control, and before S417 the user may click the end control to end shooting. In one approach, in response to a click on the end control, the camera module stops acquiring the data for the initial video stream and the initial snapshot stream; the hardware abstraction layer returns to the gallery application the processed initial snapshot stream generated between the click on the shooting control and the click on the end control; and the gallery application encodes the processed initial snapshot stream with the first encoding module, generating and storing the target video stream.
In another approach, the hardware abstraction layer returns the processed images in the initial snapshot stream to the gallery application in real time, the first encoding module in the gallery application encodes them, and the gallery application stores the generated target video stream when it responds to the operation on the end control.
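The two hand-off modes can be sketched as follows in Kotlin; the encoder interface and file name are hypothetical stand-ins for the gallery application's first encoding module and its output.

```kotlin
class YuvFrame // one processed (YUV-domain) snapshot frame

interface FirstEncoder { // hypothetical stand-in for the first encoding module
    fun encode(frame: YuvFrame)
    fun finishAndStore(fileName: String)
}

// Mode 1: the HAL returns everything produced between the shooting control
// and the end control in one batch, which is then encoded and stored.
fun batchMode(frames: List<YuvFrame>, encoder: FirstEncoder) {
    frames.forEach(encoder::encode)
    encoder.finishAndStore("time_lapse.mp4") // hypothetical output name
}

// Mode 2: frames stream to the gallery application in real time as they are
// processed; the file is finalized when the end control is tapped.
class RealTimeMode(private val encoder: FirstEncoder) {
    fun onFrameReady(frame: YuvFrame) = encoder.encode(frame)
    fun onEndControlClicked() = encoder.finishAndStore("time_lapse.mp4")
}
```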
Optionally, the present application further includes:
S419, the camera application program sends a preview request to the hardware abstraction layer.
It should be understood that the camera application needs to issue a preview request when it is opened; when it switches from another mode to the time-lapse photography mode without starting to shoot; and, after the time-lapse photography mode is started, throughout the period from the start of shooting (in response to a click on the shooting control) until shooting ends.
S420, the hardware abstraction layer sends the preview request to the hardware layer and receives the data stream (a third data stream) returned by the hardware layer.
S420 may specifically include: the hardware abstraction layer issues the preview request to the camera module in the hardware layer through the camera hardware abstraction layer it contains, and the camera module returns the result to the camera hardware abstraction layer after collecting the image data.
It should be appreciated that the data stream corresponds to the preview image, and the data stream may be acquired based on default preview capture parameters of the electronic device.
S421, after generating an initial preview stream based on the third data stream, the hardware abstraction layer processes the initial preview stream.
Illustratively, the initial preview stream may include a plurality of frames of initial preview images located in the RAW domain.
Optionally, the hardware abstraction layer includes an image processing module, and the image processing module includes a preview image processing module.
Specifically, the preview image processing module processes the initial preview stream after receiving the initial preview stream to improve the image quality.
Optionally, the preview image processing module may include a second filter sub-module.
For example, as shown in (c) of fig. 16, if the filter effect has been set to "vivid" as described above, the second filter sub-module can process the input image in combination with the filter effect set by the user, generating an image carrying that effect.
It should be noted that, if the user has not turned on a filter effect, or before a filter effect is set, the preview image processing module does not need to process the input image, and this step is skipped.
Optionally, the preview image processing module may further include a second format conversion sub-module.
Illustratively, the second format conversion sub-module may be configured to perform format conversion on the initial preview stream or the images output by the second filter sub-module, converting images in the RAW domain into images in the YUV domain.
The image data volume in the YUV domain is smaller than that in the RAW domain, so that the subsequent transmission rate can be improved. Of course, the second format conversion sub-module may perform other format conversion, which is not limited in any way by the present application.
S422, the hardware abstraction layer returns the processed initial preview stream to the camera application program.
S423, the camera application program codes the processed initial preview stream and generates a preview stream for display.
The camera application may use the second encoding module to encode the processed initial preview stream and generate a preview stream for display.
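A minimal sketch of what such an encoding module could look like if backed by Android's MediaCodec H.264 encoder is shown below; the codec choice and all parameter values are illustrative assumptions, not the embodiment's actual configuration.

```kotlin
import android.media.MediaCodec
import android.media.MediaCodecInfo
import android.media.MediaFormat

// Illustrative sketch: configure a hardware H.264 encoder for the
// processed (YUV) preview stream. Values are assumptions only.
fun createPreviewEncoder(width: Int, height: Int): MediaCodec {
    val format = MediaFormat.createVideoFormat(
        MediaFormat.MIMETYPE_VIDEO_AVC, width, height
    ).apply {
        setInteger(
            MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible
        )
        setInteger(MediaFormat.KEY_BIT_RATE, 10_000_000)
        setInteger(MediaFormat.KEY_FRAME_RATE, 30)
        setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1)
    }
    return MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC).apply {
        configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
    }
}
```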
Fig. 17 is an example of an initial preview stream, an initial video stream, and an initial snapshot stream provided by an embodiment of the present application. The initial preview stream and the initial video stream may be acquired according to default shooting parameters, while the initial snapshot stream may be acquired according to the shooting parameters set by the user, which may differ from the defaults.
In the embodiment of the application, a professional video recording channel allows photos to be captured simultaneously during video recording. In this implementation, after the shooting parameters and the filter are set in the time-lapse photography mode, the initial video stream and the initial snapshot stream are both acquired in this manner; the initial video stream is then discarded without any processing, which saves a large amount of power consumption.
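This power-saving behavior can be sketched as a simple frame router; the class and callback names below are hypothetical stand-ins for the multiplexed recording channel, introduced only to illustrate the idea.

```kotlin
// Hypothetical frame router for the multiplexed recording channel:
// snapshot frames continue into the processing pipeline, while video
// frames are discarded immediately, so no further work is spent on them.
class RecordingChannelRouter(private val snapshotPipeline: (ByteArray) -> Unit) {
    fun onVideoFrame(frame: ByteArray) {
        // Intentionally dropped without processing: this is where the
        // power saving described above comes from.
    }

    fun onSnapshotFrame(frame: ByteArray) {
        snapshotPipeline(frame) // anti-shake, filter, format conversion, encode
    }
}
```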
In addition, the shooting parameters and the filter are simple to set, so the threshold for users is low; snapshots are captured with the shooting parameters set by the user, and only the images acquired with those parameters need to be processed.
In addition, the images processed by this method are captured through the professional video recording channel and are processed in the same way as photographed images, with steps such as anti-shake incorporated. Therefore, compared with the prior-art approach of simply processing preview images to obtain a time-lapse video, the time-lapse video obtained by this method is of higher quality.
The time-lapse photography method provided by the embodiments of the present application is described in detail above with reference to figs. 1 to 17; the device embodiments of the present application are described in detail below with reference to figs. 18 and 19. It should be understood that the apparatus in the embodiments of the present application may perform the methods of the foregoing embodiments; that is, for the specific working procedures of the following products, reference may be made to the corresponding procedures in the foregoing method embodiments.
Fig. 18 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device 500 includes a display module 510 and a processing module 520.
The display module 510 is configured to display a preview interface in the time-lapse photography mode.
The processing module 520 is configured to: detect a first operation on a shooting parameter setting control; in response to the first operation, trigger a shooting parameter setting sub-process, where the sub-process is used to set shooting parameters including a snapshot rate during video recording; detect a second operation on the shooting control; and, in response to the second operation, shoot based on the shooting parameters, and generate and store a target video stream.
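As a rough illustration, the two functional modules of the electronic device 500 might be expressed as Kotlin interfaces along the following lines; all names are hypothetical and introduced only to mirror the description above.

```kotlin
// Illustrative decomposition of electronic device 500 into its two modules.
interface DisplayModule {
    fun showPreviewInterface() // preview window, shooting controls, etc.
}

interface ProcessingModule {
    fun onShootingParameterControlTapped()       // first operation
    fun setShootingParameters(snapshotRate: Int) // sub-process result
    fun onShootingControlTapped()                // second operation
    fun captureAndStoreTargetVideo()             // generate + store stream
}
```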
The electronic device 500 is embodied here in the form of functional modules. The term "module" may be implemented in software and/or hardware, which is not specifically limited.
For example, a "module" may be a software program, a hardware circuit, or a combination of both that implements the functionality described above. The hardware circuitry may include Application Specific Integrated Circuits (ASICs), electronic circuits, processors (e.g., shared, proprietary, or group processors, etc.) and memory for executing one or more software or firmware programs, merged logic circuits, and/or other suitable components that support the described functions.
Thus, the elements of the examples described in the embodiments of the present application can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
Fig. 19 shows a schematic structural diagram of an electronic device provided by the present application. The dashed lines in fig. 19 indicate that the corresponding unit or module is optional. The electronic device 600 may be used to implement the time-lapse photography method described in the foregoing method embodiments.
The electronic device 600 includes one or more processors 601, and the one or more processors 601 can support the electronic device 600 in implementing the time-lapse photography method of the method embodiments. The processor 601 may be a general-purpose processor or a special-purpose processor. For example, the processor 601 may be a central processing unit (CPU), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or another programmable logic device such as a discrete gate, transistor logic, or a discrete hardware component.
Alternatively, the processor 601 may be used to control the electronic device 600, execute software programs, and process data of the software programs. The electronic device 600 may also include a communication unit 605 to enable input (reception) and output (transmission) of signals.
For example, the electronic device 600 may be a chip, the communication unit 605 may be an input and/or output circuit of the chip, or the communication unit 605 may be a communication interface of the chip, which may be an integral part of a terminal device or other electronic device.
For another example, the electronic device 600 may be a terminal device, and the communication unit 605 may be a transceiver of the terminal device. The electronic device 600 may further include one or more memories 602 on which a program 604 is stored; the program 604 may be run by the processor 601 to generate instructions 603, so that the processor 601 performs the time-lapse photography method described in the above method embodiments according to the instructions 603.
Optionally, the memory 602 may also have data stored therein.
Optionally, the processor 601 may also read data stored in the memory 602; the data may be stored at the same memory address as the program 604 or at a different memory address.
Alternatively, the processor 601 and the memory 602 may be provided separately or may be integrated together, for example, on a System On Chip (SOC) of the terminal device.
Optionally, the present application also provides a computer program product which when executed by the processor 601 implements the time-lapse photography method of any of the method embodiments of the present application.
For example, the computer program product may be stored in the memory 602, such as the program 604, and the program 604 is finally converted into an executable object file capable of being executed by the processor 601 through preprocessing, compiling, assembling, and linking.
Optionally, the present application also provides a computer readable storage medium having stored thereon a computer program which when executed by a computer implements the time lapse photography method of any of the method embodiments of the present application. The computer program may be a high-level language program or an executable object program.
For example, the computer-readable storage medium is the memory 602. The memory 602 may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the above-described embodiments of the electronic device are merely illustrative, e.g., the division of modules is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
It should be understood that, in the various embodiments of the present application, the size of the sequence numbers of the processes does not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
In addition, the term "and/or" herein is merely an association relation describing the association object, and means that three kinds of relations may exist, for example, a and/or B, and that three kinds of cases where a exists alone, while a and B exist alone, exist alone. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product; the software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods of the embodiments of the present application. The storage medium includes a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media capable of storing program code.
The foregoing is merely a specific implementation of the present application, but the protection scope of the present application is not limited thereto. Any variation or substitution that a person skilled in the art can readily conceive of within the technical scope disclosed by the present application shall be covered within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. A time lapse photography method, the method comprising:
Displaying a preview interface in a time-lapse photography mode, wherein the preview interface comprises a shooting parameter setting control, a shooting control and a preview window, and the preview window displays a preview stream;
detecting a first operation on the shooting parameter setting control;
triggering a shooting parameter setting sub-process in response to the first operation, wherein the shooting parameter setting sub-process is used for setting shooting parameters, and the shooting parameters comprise a snapshot rate in video;
detecting a second operation on the shooting control;
receiving a shooting request in response to the second operation, wherein the shooting request comprises a video recording request and a snapshot request in video recording, and the snapshot request in video recording comprises a snapshot rate in video recording;
based on the video recording request, acquiring an initial video stream by using a video recording channel, and discarding the initial video stream without deleting it, wherein the initial video stream comprises multiple frames of initial video images in the RAW domain;
meanwhile, multiplexing the video recording channel to acquire an initial snapshot stream based on the snapshot request in the video recording, wherein the initial snapshot stream comprises multiple frames of initial snapshot images in the RAW domain, and the initial snapshot images in the initial snapshot stream are identical to some of the initial video images in the initial video stream;
and processing and encoding the initial snapshot stream to generate and store a target video stream.
2. The method of time lapse photography of claim 1, wherein the preview interface further comprises a filter control;
before detecting the second operation for the shooting control, the method further comprises:
Detecting a third operation for the filter control;
and in response to the third operation, triggering a filter setting sub-process, wherein the filter setting sub-process is used for setting a target filter effect.
3. The time-lapse photography method of claim 2, wherein when the method comprises triggering a filter setting sub-process in response to the third operation, the processing and encoding of the initial snapshot stream, generating and storing the target video stream, comprises:
And processing and encoding the initial snapshot stream in combination with the target filter effect to generate and store the target video stream.
4. A time-lapse photography method according to claim 3, wherein the initial snapshot stream further comprises anti-shake parameters;
the processing and encoding the initial snapshot stream, generating and storing the target video stream, including:
processing the initial snapshot images in combination with the anti-shake parameters;
and performing format conversion and coding on the processed initial snapshot stream, and generating and storing the target video stream.
5. The time-lapse photography method of claim 4, wherein before the format conversion and encoding of the processed initial snapshot stream to generate and store the target video stream, the method further comprises:
and processing the initial snapshot images in combination with the target filter effect.
6. The time-lapse photography method according to any one of claims 1 to 5, wherein triggering a shooting parameter setting sub-process in response to the first operation comprises:
in response to the first operation, displaying a first shooting parameter setting interface, wherein the first shooting parameter setting interface comprises a sub shooting parameter control and a corresponding auxiliary control;
in response to a fourth operation on the auxiliary control, displaying a second shooting parameter setting interface, wherein the second shooting parameter setting interface corresponds to the set shooting parameters;
And switching to the preview interface.
7. A time-lapse photography method according to claim 2 or claim 3, wherein in response to the third operation, triggering the filter setting sub-process comprises:
In response to the third operation, displaying a first filter setting interface, the first filter setting interface including a plurality of filter effect options;
Determining the target filter effect in response to a fifth operation for the plurality of filter effect options, and displaying a second filter setting interface in conjunction with the target filter effect;
And switching to the preview interface.
8. The time lapse photography method of claim 7, further comprising:
Receiving a preview request;
acquiring an initial preview stream based on the preview request;
And processing and encoding the initial preview stream to generate the preview stream.
9. The time-lapse photography method of claim 8, wherein when the method comprises triggering a filter setting sub-process in response to the third operation, processing and encoding the initial preview stream to generate the preview stream comprises:
And processing and encoding the initial preview stream in combination with the target filter effect to generate the preview stream.
10. The method according to any one of claims 1 to 5, 8, and 9, wherein the shooting parameters further include a recording duration, a zoom magnification, and a duration of the generated video.
11. An electronic device comprising a processor and a memory;
The memory is used for storing a computer program capable of running on the processor;
the processor for performing the time lapse photography method of any one of claims 1 to 10.
12. A chip for application to an electronic device, the chip comprising one or more processors for invoking computer instructions to cause the electronic device to perform the time lapse photography method of any of claims 1 to 10.
13. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program comprising program instructions which, when executed by a processor, cause an electronic device to perform the time lapse photography method of any one of claims 1 to 10.
CN202311138044.6A 2023-08-31 2023-08-31 Time-lapse photography method and related equipment Active CN117714850B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311138044.6A CN117714850B (en) 2023-08-31 2023-08-31 Time-lapse photography method and related equipment

Publications (2)

Publication Number Publication Date
CN117714850A CN117714850A (en) 2024-03-15
CN117714850B true CN117714850B (en) 2024-12-13

Family

ID=90143058

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311138044.6A Active CN117714850B (en) 2023-08-31 2023-08-31 Time-lapse photography method and related equipment

Country Status (1)

Country Link
CN (1) CN117714850B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018076309A1 (en) * 2016-10-29 2018-05-03 华为技术有限公司 Photographing method and terminal
CN112532857A (en) * 2019-09-18 2021-03-19 华为技术有限公司 Shooting method and equipment for delayed photography
CN113709377A (en) * 2021-09-07 2021-11-26 深圳市道通智能航空技术股份有限公司 Method, device, equipment and medium for controlling aircraft to shoot rotation delay video

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110086985B (en) * 2019-03-25 2021-03-30 华为技术有限公司 Recording method for delayed photography and electronic equipment

Also Published As

Publication number Publication date
CN117714850A (en) 2024-03-15

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: Unit 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong 518040

Patentee after: Honor Terminal Co.,Ltd.

Country or region after: China

Address before: 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong

Patentee before: Honor Device Co.,Ltd.

Country or region before: China