
CN110505408B - Terminal shooting method and device, mobile terminal and readable storage medium - Google Patents


Info

Publication number
CN110505408B
CN110505408B (application CN201910873337.6A)
Authority
CN
China
Prior art keywords
shooting
predicted
terminal
motion
shooting target
Prior art date
Legal status (an assumption, not a legal conclusion)
Active
Application number
CN201910873337.6A
Other languages
Chinese (zh)
Other versions
CN110505408A (en)
Inventor
彭叶斌
周凡贻
Current Assignee (the listed assignees may be inaccurate)
Shenzhen Microphone Holdings Co Ltd
Original Assignee
Shenzhen Microphone Holdings Co Ltd
Priority date (an assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Shenzhen Microphone Holdings Co Ltd filed Critical Shenzhen Microphone Holdings Co Ltd
Priority to CN201910873337.6A priority Critical patent/CN110505408B/en
Publication of CN110505408A publication Critical patent/CN110505408A/en
Priority to PCT/CN2019/122776 priority patent/WO2021047070A1/en
Application granted granted Critical
Publication of CN110505408B publication Critical patent/CN110505408B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/61 Control of cameras or camera modules based on recognised objects
    • H04N 23/632 Graphical user interfaces [GUI] for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N 23/67 Focus control based on electronic image sensor signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a terminal shooting method, a terminal shooting device, a mobile terminal, and a computer-readable storage medium. Displacement information of a shooting target in the terminal camera preview frame is acquired and tracked; a predicted focusing area is then determined according to the displacement information, and focusing is performed in the predicted focusing area; finally, whether the shooting target enters the predicted focusing area is judged, and if it does, the shooting of the shooting target is completed. When shooting non-stationary living bodies (people or living animals and plants) or moving objects, the invention requires no manual focusing by the user, avoids unclear pictures caused by inaccurate focusing, and improves the shooting effect; at the same time, the shooting process is simplified, the user's shooting operation steps are reduced, and the user experience is improved.

Description

Terminal shooting method and device, mobile terminal and readable storage medium
Technical Field
The invention relates to the technical field of terminal intelligent shooting, in particular to a terminal shooting method and device, a mobile terminal and a readable storage medium.
Background
With the widespread use of the shooting function of mobile terminals, users' requirements on shooting quality keep rising, and how to capture clear photos simply and conveniently is one of the problems existing mobile terminals need to solve. In the prior art, a picture is taken by the user manually focusing on the shooting target and then tapping a confirmation control. When a non-stationary living body (a person or a living animal or plant) or a moving object needs to be shot, such as a walking person, a lively pet, a plant swaying in the wind, an advancing train, or a swinging pendulum clock, the manual focusing process takes a certain amount of time, during which the position of the shooting target changes. The focusing is therefore inaccurate, the picture is unclear, the shooting effect is poor, and the user experience suffers.
Disclosure of Invention
The main purpose of the invention is to provide a terminal shooting method, a terminal shooting device, a mobile terminal, and a computer storage medium, aiming to solve the prior-art technical problem that the camera of a mobile terminal focuses inaccurately when shooting a non-stationary target, so that the resulting picture is unclear.
In order to achieve the above object, an embodiment of the present invention provides a terminal shooting method, where the terminal shooting method includes the following steps:
acquiring and tracking displacement information of a shooting target of a terminal camera preview frame;
determining a predicted focusing area according to the displacement information, and focusing in the predicted focusing area;
judging whether the shooting target enters the predicted focusing area;
and if the shooting target enters the predicted focusing area, finishing the shooting of the shooting target.
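The four steps above can be sketched as a short sequence (an illustrative sketch only: the linear extrapolation, the fixed area size, and names such as `predict_focus_area` are assumptions, not taken from the invention):

```python
from dataclasses import dataclass

@dataclass
class FocusArea:
    """Axis-aligned predicted focusing area in normalized preview coordinates."""
    cx: float
    cy: float
    half_w: float
    half_h: float

    def contains(self, x, y):
        return abs(x - self.cx) <= self.half_w and abs(y - self.cy) <= self.half_h

def predict_focus_area(history, dt=1.0):
    """Predict where the target will be next and place the focusing area there.
    Linear extrapolation stands in for the patent's motion-rule prediction."""
    (x0, y0), (x1, y1) = history[-2], history[-1]
    vx, vy = x1 - x0, y1 - y0                # displacement per tracked frame
    return FocusArea(x1 + vx * dt, y1 + vy * dt, 0.1, 0.1)

# Step 1: tracked positions of the target in the preview frame
history = [(0.2, 0.5), (0.3, 0.5)]
# Step 2: determine the predicted focusing area (focusing would happen here)
area = predict_focus_area(history)
# Steps 3 and 4: shooting fires once the target enters the area
target_entered = area.contains(0.4, 0.5)
```

With the target moving right at a constant rate, the area is placed one step ahead of the last tracked position, so the entry test succeeds without any manual focusing.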
Optionally, the displacement information includes a historical motion trajectory and an object attribute within a preset unit duration before the current time, and the step of determining the predicted focusing area according to the displacement information includes:
acquiring a predicted motion track of the shooting target in the next preset unit time according to the historical motion track;
and determining a predicted focusing area according to the position points on the predicted motion trajectory.
Optionally, the step of obtaining a predicted motion trajectory of the shooting target in the next preset unit time according to the historical motion trajectory includes:
determining the object attribute of the shooting target, and determining a motion rule description table and a predicted motion speed of the shooting target according to the object attribute, wherein the motion rule description table comprises a motion reference point and a predicted motion direction mapped by the motion reference point;
and determining the predicted motion track of the shooting target in the next preset unit time according to the motion rule description table, the predicted motion speed and the historical motion track.
Optionally, the step of determining the predicted motion trajectory according to a motion law description table, the predicted motion speed, and the historical motion trajectory includes:
determining a current motion reference point of the shooting target according to the historical motion track;
searching the motion rule description table for the predicted motion direction mapped by the current motion reference point;
and determining the predicted motion track according to the predicted motion speed and the predicted motion direction.
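The three sub-steps above can be sketched as follows (the table labels, the angle-based classification of the reference point, and all function names are illustrative assumptions, not the patent's actual table):

```python
import math

# Hypothetical motion rule description table: each motion reference point
# maps to a predicted motion direction (radians); the labels are invented.
MOTION_RULE_TABLE = {
    "heading_right": 0.0,
    "heading_up": math.pi / 2,
    "heading_left": math.pi,
}

def current_reference_point(history):
    """Classify the latest displacement of the historical trajectory
    into one of the table's motion reference points."""
    (x0, y0), (x1, y1) = history[-2], history[-1]
    angle = math.atan2(y1 - y0, x1 - x0)
    if abs(angle) <= math.pi / 4:
        return "heading_right"
    if abs(angle - math.pi / 2) <= math.pi / 4:
        return "heading_up"
    return "heading_left"

def predicted_trajectory(history, speed, steps=5, dt=0.1):
    """Position points along the looked-up direction at the predicted speed."""
    direction = MOTION_RULE_TABLE[current_reference_point(history)]
    x, y = history[-1]
    dx = speed * dt * math.cos(direction)
    dy = speed * dt * math.sin(direction)
    return [(x + k * dx, y + k * dy) for k in range(1, steps + 1)]

# A target tracked moving right yields a rightward predicted trajectory.
traj = predicted_trajectory([(0.0, 0.0), (0.1, 0.0)], speed=1.0)
```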
Optionally, the step of determining a predicted focusing area according to a position point on the predicted motion trajectory includes:
taking a preset number of position points on the predicted motion trajectory, and determining the reference position center of the graph formed by these position points;
and determining the predicted focusing area by taking the reference position center as a geometric center.
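The reference position center can be computed as a simple centroid of the sampled position points (a hedged sketch; the patent does not specify the exact center computation):

```python
def reference_position_center(points):
    """Centroid of the graph formed by a preset number of position
    points sampled from the predicted motion trajectory."""
    n = len(points)
    return (sum(x for x, _ in points) / n,
            sum(y for _, y in points) / n)

# Three sampled position points forming a small triangle
center = reference_position_center([(0.2, 0.4), (0.4, 0.4), (0.3, 0.6)])
```

The predicted focusing area would then be laid out with this centroid as its geometric center.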
Optionally, the step of determining the predicted focusing area with the reference position center as a geometric center includes:
taking the reference position center as the geometric center of the whole area formed by the shooting targets;
and determining the display size of the whole area according to the screen ratio of each shooting target relative to a terminal screen so as to generate a predicted focusing area.
Optionally, the step of determining whether the shooting target enters the predicted focusing area includes:
and judging whether the geometric center of the whole area formed by each shooting target coincides with the geometric center of the predicted focusing area; if so, judging that each shooting target has entered the predicted focusing area.
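The sizing and entry test above might look like the following (the square-root scaling, the 20% margin, and the coincidence tolerance are assumptions for illustration, not values from the patent):

```python
def focus_area_size(screen_ratio, screen_w, screen_h, margin=1.2):
    """Display size of the whole area, scaled with the fraction of the
    terminal screen that the shooting target occupies. The linear scaling
    and the 20% margin are assumed, not taken from the patent."""
    side = (screen_ratio ** 0.5) * margin
    return side * screen_w, side * screen_h

def target_entered(target_center, area_center, tol=0.01):
    """Entry test: the geometric center of the target's whole area
    coincides, within a small tolerance, with that of the focusing area."""
    return (abs(target_center[0] - area_center[0]) <= tol and
            abs(target_center[1] - area_center[1]) <= tol)

# A target covering 25% of a normalized 1x1 screen
w, h = focus_area_size(0.25, screen_w=1.0, screen_h=1.0)
entered = target_entered((0.500, 0.500), (0.505, 0.498))
```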
Optionally, before the step of acquiring and tracking the displacement information of the terminal camera preview frame shooting target, the method includes:
detecting whether a shooting target of the terminal camera preview frame is a non-static living body or a non-static object;
and if so, executing the step of acquiring and tracking the displacement information of the shooting target of the terminal camera preview frame.
Optionally, if the photographic target enters the predicted focusing area, the step of completing the photographing of the photographic target includes:
if the shooting target enters the predicted focusing area, the user may manually confirm the shot, or a shooting module may be started to shoot automatically.
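This branch can be expressed as a small decision function (the name and signature are illustrative, not part of the claimed device):

```python
def should_shoot(target_in_area, auto_shoot_enabled, user_confirmed):
    """Once the target is in the predicted focusing area, the shot fires
    either automatically or upon the user's manual confirmation."""
    if not target_in_area:
        return False
    return auto_shoot_enabled or user_confirmed
```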
The present invention also provides a terminal photographing device, including:
the information acquisition module is used for acquiring and tracking displacement information of a shooting target of the terminal camera preview frame;
the area determining module is used for determining a predicted focusing area according to the displacement information and focusing in the predicted focusing area;
the judging module is used for judging whether the shooting target enters the predicted focusing area;
and the shooting module is used for completing the shooting of the shooting target if the shooting target enters the predicted focusing area.
Optionally, the area determination module of the terminal camera includes a prediction unit and a determination unit:
the prediction unit is used for acquiring a predicted motion track of the shooting target in the next preset unit duration according to the historical motion track;
and the determining unit is used for determining a predicted focusing area according to the position point on the predicted motion track.
The present invention also provides a mobile terminal, comprising: a memory, a processor, and a terminal shooting program stored in the memory and executable on the processor, wherein the terminal shooting program, when executed by the processor, implements the steps of the terminal shooting method described above.
The present invention also provides a computer storage medium having a terminal photographing program stored thereon, which when executed by a processor implements the steps of the terminal photographing method as described above.
In this embodiment, when shooting a non-stationary living body (a person or a living animal or plant) or a non-stationary object, displacement information of the shooting target in the terminal camera preview frame is first acquired and tracked; a predicted focusing area is then determined according to the displacement information, and focusing is performed in the predicted focusing area; finally, whether the shooting target enters the predicted focusing area is judged, and if it does, the shooting of the shooting target is completed. By focusing in the predicted focusing area in advance and shooting once the shooting target enters it, the method requires no manual focusing by the user, avoids inaccurate focusing and unclear pictures caused by the shooting target changing position, and improves the shooting effect; at the same time, the shooting process is simplified, the user's shooting operation steps are reduced, and the user experience is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below; those skilled in the art can obviously obtain other drawings from these drawings without inventive effort.
Fig. 1 is a schematic diagram of a hardware structure of an alternative mobile terminal according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a terminal shooting method according to an embodiment of the present invention;
FIG. 3 is a detailed flowchart of step S20 according to an embodiment of the terminal shooting method of the present invention;
FIG. 4 is a detailed flowchart of step S21 according to an embodiment of the terminal shooting method of the present invention;
FIG. 5 is a flowchart illustrating a detailed process of step S212 according to an embodiment of the terminal shooting method of the present invention;
FIG. 6 is a detailed flowchart of step S22 according to an embodiment of the terminal shooting method of the present invention;
fig. 7 is a detailed flowchart of step S222 according to an embodiment of the terminal shooting method of the present invention;
FIG. 8 is a functional block diagram of the terminal shooting device according to the present invention;
FIG. 9 is a schematic diagram of a detailed functional module of the area determination module of the terminal camera according to the present invention;
fig. 10 is a schematic view of an application scenario of an embodiment of a terminal shooting method according to the present invention;
fig. 11 is a schematic view of another application scenario of the terminal shooting method embodiment of the present invention;
fig. 12 is a schematic view of another application scenario of the terminal shooting method embodiment of the present invention;
fig. 13 is a schematic view of another application scenario of the terminal shooting method according to the embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for facilitating the explanation of the present invention, and have no specific meaning in themselves. Thus, "module", "component", and "unit" may be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the mobile terminal described in the present invention may include a mobile terminal such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), and the like.
In the following description, taking a mobile terminal as an example, please refer to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, where the mobile terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, a/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
the radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call, and specifically, receive downlink information of a base station and then process the downlink information to the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000(Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division duplex Long Term Evolution), and TDD-LTE (Time Division duplex Long Term Evolution).
WiFi is a short-range wireless transmission technology; through the WiFi module 102 the mobile terminal can help the user receive and send e-mail, browse web pages, access streaming media, and so on, providing wireless broadband Internet access. Although fig. 1 shows the WiFi module 102, it is not an essential part of the mobile terminal and may be omitted as needed without changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The a/V input unit 104 is used to receive audio or video signals. The a/V input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or another storage medium), or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sound (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sound into audio data. In the case of a phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or a backlight when the mobile terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 1071 (e.g., an operation performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. In particular, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, and are not limited to these specific examples.
Further, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store a software program and various data, and the memory 109 may be a computer storage medium, and the memory 109 stores the terminal photographing program of the present invention. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. Such as processor 110, executes the terminal capture program in memory 109 to implement the steps of the various embodiments of the terminal capture method of the present invention.
Processor 110 may include one or more processing units; alternatively, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein. The mobile terminal 100 can be connected with other terminal devices through bluetooth, so as to realize communication and information interaction.
Based on the hardware structure of the mobile terminal, the invention provides various embodiments of the method.
The invention provides a terminal shooting method, which is applied to a mobile terminal, and in one embodiment of the terminal shooting method, referring to fig. 2, the terminal shooting method comprises the following steps:
step S10, acquiring and tracking displacement information of a shooting target of the terminal camera preview frame;
the execution subject of the invention can be a terminal shooting application; the terminal camera preview frame refers to a display interface of the mobile terminal camera after the user opens the mobile terminal camera application; when a user opens the camera of the mobile terminal, the user enables the shooting target to be displayed in the preview frame according to the shooting composition of the user by adjusting the position of the mobile terminal, at the moment, the shooting target can move in the preview frame, so that the speed and the direction of the shooting target are changed, namely, displacement is generated, and information describing the displacement is displacement information.
Step S20, determining a prediction focusing area according to the displacement information, and focusing in the prediction focusing area;
the predicted focus area refers to an area on a track that is located on the basis of the formed displacement of the moving photographic target and is likely to be formed next; focusing in the predicted focusing area means that the terminal shooting application focuses in the predicted focusing area in advance before the shooting target enters the predicted focusing area.
Step S30, judging whether the shooting target enters the predicted focusing area;
because the shooting target can be displaced in the shooting process and is not absolutely static, the position relation between the shooting target and the predicted focusing area can be changed in real time in the moving process of the shooting target.
Step S40, if the shooting target enters the predicted focusing area, completing the shooting of the shooting target.
When the shooting target enters the predicted focusing area, shooting can be performed without manual focusing of a user.
In this embodiment, when shooting a non-stationary living body (a person or a living animal or plant) or a non-stationary object, displacement information of the shooting target in the terminal camera preview frame is first acquired and tracked; a predicted focusing area is then determined according to the displacement information, and focusing is performed in the predicted focusing area; finally, whether the shooting target enters the predicted focusing area is judged, and if it does, the shooting of the shooting target is completed. The method focuses in the predicted focusing area in advance and shoots when the shooting target enters it, requiring no manual focusing by the user, avoiding inaccurate focusing and unclear pictures caused by the shooting target changing position, and improving the shooting effect; at the same time, the shooting process is simplified, the user's shooting operation steps are reduced, and the user experience is improved.
Further, on the basis of an embodiment of the terminal shooting method of the present invention, where the displacement information includes a historical motion trajectory within a preset unit time length before the current time, referring to fig. 3, step S20 includes:
step S21, acquiring the predicted motion track of the shooting target in the next preset unit duration according to the historical motion track;
the historical motion track belongs to one of the displacement information and refers to a track formed by displacement of a shooting target in a terminal camera preview frame; the predicted movement locus of the next preset unit duration refers to a locus formed by displacement of the photographic target which is likely to occur within a preset unit time after the current time.
In step S22, a predicted focusing area is determined based on the position point on the predicted movement trajectory.
The position point on the predicted motion trajectory refers to a point forming the predicted motion trajectory of the photographic subject.
In this embodiment, the information acquisition module acquires the historical motion track of the shooting target, the predicted motion track is derived from the historical motion track, and the predicted focusing area is determined from the position points on the predicted motion track. The terminal's shooting application can therefore focus in the predicted focusing area in advance, and shooting can be performed as soon as the shooting target enters that area, avoiding unclear pictures caused by inaccurate focusing.
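The patent does not prescribe a particular prediction algorithm for step S21. As a minimal sketch, assuming the target moves at roughly constant velocity over the sampling window (the later embodiments refine this with a motion rule description table), the predicted track could be extrapolated from the historical track as follows; the function name and sampling model are illustrative only:

```python
def predict_trajectory(history, steps):
    """history: list of (x, y) preview-frame positions sampled at equal
    intervals within the preset unit duration; returns `steps` predicted
    future positions under a constant-velocity assumption."""
    if len(history) < 2:
        return [history[-1]] * steps  # no motion observed: hold position
    # Average displacement per sample approximates the current velocity.
    (x0, y0), (x1, y1) = history[0], history[-1]
    n = len(history) - 1
    vx, vy = (x1 - x0) / n, (y1 - y0) / n
    return [(x1 + vx * (i + 1), y1 + vy * (i + 1)) for i in range(steps)]
```

For a target drifting rightward one pixel per sample, `predict_trajectory([(0, 0), (1, 0), (2, 0)], 3)` continues the line to `(3.0, 0.0)`, `(4.0, 0.0)`, `(5.0, 0.0)`.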
Further, on the basis of an embodiment of the terminal shooting method of the present invention, the displacement information further includes an object attribute of the shooting target, and referring to fig. 4, step S21 includes:
step S211, determining the object attribute of the shooting target, and determining a motion rule description table and a predicted motion speed of the shooting target according to the object attribute, wherein the motion rule description table comprises a motion reference point and a predicted motion direction mapped by the motion reference point;
the object attribute of the shooting target belongs to another displacement information, which means that the shooting target can be one or more of human, animal, plant and object; the predicted motion direction mapped by the motion reference point and the motion reference point refers to an estimated motion direction corresponding to a shooting target at a certain motion reference point according to a general motion rule of object attributes, such as a swinging big clock, which is called a pendulum for short, and when the selected motion reference point is the lowest point of a pendulum motion track, the predicted motion direction of the big clock at the motion reference point is the tangential direction of the pendulum motion track; the motion rule description table refers to a mapping relation table of motion reference points and predicted motion directions for describing different object attributes, and the motion reference points and the predicted motion directions mapped by the motion reference points can be inquired by the motion rule description table; the predicted movement speed refers to the movement speed of a shooting target which is predicted according to the general movement rule of the object attribute; because different types of object attributes, motion rules and motion speeds are different, the object attributes of the shooting target need to be determined first, and then the motion rules and the predicted motion speeds of the shooting target need to be determined according to the object attributes.
And step S212, determining the predicted motion track of the shooting target in the next preset unit time according to the motion rule description table, the predicted motion speed and the historical motion track.
From the motion rule description table, the predicted motion direction mapped to the motion reference point of the shooting target can be queried; once the predicted motion direction and the predicted motion speed are determined, the predicted motion track of the shooting target can be determined by combining them with the historical motion track.
In the embodiment, firstly, the object attribute of the shooting target is determined, then the motion reference point, the predicted motion direction and the predicted motion speed of the shooting target are determined according to the object attribute, and finally, the predicted motion track of the shooting target in the next preset unit time length is determined by combining the historical motion track.
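Steps S211–S212 can be sketched as a table lookup followed by a kinematic step. The attribute names, reference-point labels, directions, and speeds below are hypothetical placeholders, since the patent leaves the table's contents to the implementation:

```python
import math

# Hypothetical motion rule description table: per object attribute, a mapping
# from motion reference points to predicted motion directions (radians).
MOTION_RULES = {
    "pendulum": {
        "lowest_point": 0.0,           # tangential direction, moving right
        "left_highest": -math.pi / 2,  # about to swing back down
    },
}
# Hypothetical predicted motion speed per object attribute (units per sample).
PREDICTED_SPEED = {"pendulum": 2.0}

def predict_step(attribute, reference_point, point):
    """Advance one step from `point` using the rule table for `attribute`."""
    direction = MOTION_RULES[attribute][reference_point]
    speed = PREDICTED_SPEED[attribute]
    x, y = point
    return (x + speed * math.cos(direction), y + speed * math.sin(direction))
```

Repeating `predict_step` from the target's current reference point, seeded with the historical track, yields the predicted motion track for the next preset unit duration.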
Further, on the basis of an embodiment of the terminal shooting method of the present invention, referring to fig. 5, step S212 includes:
step A10, determining the current motion reference point of the shooting target according to the historical motion track;
the current motion reference point refers to a position point at the current time on the historical motion trajectory.
Step A20, searching the motion rule description table for the predicted motion direction mapped by the current motion reference point;
Each motion reference point in the motion rule description table has a corresponding predicted motion direction; the motion direction of the shooting target can be predicted only after its current motion reference point is determined.
And step A30, determining a predicted motion track according to the predicted motion speed and the predicted motion direction.
When the predicted movement speed and the predicted movement direction are determined, the movement locus of the shooting target from the current movement reference point can be predicted to obtain the predicted movement locus.
In this embodiment, referring to fig. 10, assume the shooting target is a swinging pendulum. The historical motion track (the solid arc) runs from the highest point the pendulum reaches on the left side to the point where it hangs perpendicular to the ground, i.e. the lowest point. When the lowest point of the track is selected as the motion reference point, the predicted motion direction corresponding to it is the tangential direction of the pendulum's track; the predicted motion speed decreases gradually from its maximum at the lowest point; and the predicted motion track (the dotted arc) runs from the lowest point up to the highest point on the right side.
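For the idealized pendulum of fig. 10, the predicted arc is simply the mirror image of the historical arc about the vertical line through the pivot. A sketch under that assumption (pivot at x = 0, lossless swing; the function name is illustrative):

```python
def mirror_pendulum_track(history):
    """history: (x, y) points from the left highest point down to the lowest
    point (assumed to be the last entry). Returns the predicted points from
    just after the lowest point up to the right highest point."""
    # Drop the lowest point itself, reverse the order of travel, and mirror x.
    return [(-x, y) for (x, y) in reversed(history[:-1])]
```

For example, a historical arc `[(-2, 2), (-1, 0.5), (0, 0)]` yields the predicted arc `[(1, 0.5), (2, 2)]`, rising toward the right highest point.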
Further, on the basis of an embodiment of the terminal shooting method of the present invention, referring to fig. 6, step S22 includes:
step S221, position points on a preset number of predicted motion tracks are taken, and the reference position center of a graph formed by the position points on the predicted motion tracks is determined;
Referring to fig. 11, if three position points on the predicted motion track are taken, the center of the triangle they form is the reference position center.
Step S222, determining the predicted focusing area with the reference position center as the geometric center.
With the reference position center as its geometric center, the predicted focusing area is formed; the user may then adjust it, for example by enlarging, shrinking, or rotating it.
In this embodiment, the reference position center of the figure formed by the preset number of position points on the predicted motion track is taken as the geometric center of the predicted focusing area, so that the determined area lies on the path to which the shooting target is expected to move.
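Steps S221–S222 amount to taking a centroid. With the three sampled points of fig. 11, the reference position center is the center of the triangle they form; the sketch below (illustrative function name) handles any preset number of points:

```python
def reference_center(points):
    """Centroid of the sampled predicted-trajectory points; used as the
    geometric center of the predicted focusing area."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)
```

For the triangle `[(0, 0), (3, 0), (0, 3)]` the reference position center is `(1.0, 1.0)`.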
Further, on the basis of an embodiment of the terminal shooting method of the present invention, referring to fig. 7, step S222 includes:
step A40, using the reference position center as the geometric center of the whole area formed by each shooting target;
after determining the reference position center, the reference position center may be taken as a geometric center of an overall area formed by each of the photographic targets; namely, the reference center position of the whole frame selection area covering all the shooting targets is determined.
Step A50, according to the screen ratio of each shooting target relative to the terminal screen, determining the display size of the whole area to generate the predicted focusing area.
The display size of the whole area is determined according to the screen ratio of each shooting target relative to the terminal screen: the larger the proportion of the terminal screen that the shooting targets occupy, the larger the display size of the whole area.
In this embodiment, after the reference position center is set as the geometric center of the whole area and the display size of the whole area is determined from the screen ratio, the whole area is taken as the predicted focusing area. The area is therefore moderate in size: it contains all the shooting targets without being excessively large, and its position lies on the predicted motion track, which makes subsequent accurate focusing on the shooting targets easier.
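Steps A40–A50 can be sketched as centering a rectangle on the reference position center and scaling it with the targets' screen ratio. The proportional scaling rule below is an assumption; the patent only states that a larger screen ratio yields a larger display size:

```python
def predicted_focus_area(center, screen_w, screen_h, screen_ratio):
    """Returns (left, top, width, height) of the predicted focusing area,
    scaled by the fraction of the terminal screen the targets occupy."""
    w = screen_w * screen_ratio
    h = screen_h * screen_ratio
    cx, cy = center
    return (cx - w / 2, cy - h / 2, w, h)
```

On a 200×100 preview with targets occupying half the screen and a reference position center of `(100, 100)`, this yields the area `(50.0, 75.0, 100.0, 50.0)`.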
Further, on the basis of an embodiment of the terminal shooting method of the present invention, step S30 includes:
and judging whether the geometric center of the whole area formed by each shooting target is overlapped with the geometric center of the prediction focusing area, and if so, judging that each shooting target enters the prediction focusing area.
When there is one shooting target, referring to fig. 12, the geometric center is the center of gravity of that target. When there is more than one, referring to fig. 13, the geometric center is the center of gravity of the whole area formed by the targets; since the speed and direction of each target are not necessarily consistent while moving, this center of gravity changes in real time. When the geometric center of the shooting target coincides with the geometric center of the predicted focusing area, the two occupy the same position in the terminal camera preview frame, from which it is concluded that the shooting target has entered the predicted focusing area.
In this embodiment, whether the shooting target has entered the predicted focusing area is judged, and shooting is performed only when it has, which avoids inaccurate focusing and improves the shooting effect.
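The entry judgment above can be sketched as a coincidence test between the two geometric centers. A small pixel tolerance is added here as a practical assumption, since exact coincidence rarely occurs on a sampled preview:

```python
def has_entered(target_points, area_center, tol=2.0):
    """target_points: (x, y) centers of the shooting targets; their centroid
    is compared against the geometric center of the predicted focusing area."""
    n = len(target_points)
    cx = sum(x for x, _ in target_points) / n
    cy = sum(y for _, y in target_points) / n
    ax, ay = area_center
    return abs(cx - ax) <= tol and abs(cy - ay) <= tol
```

With more than one target the centroid changes in real time as the targets move, so the test is re-evaluated on every preview frame.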
Further, on the basis of an embodiment of the terminal shooting method of the present invention, the steps before step S10 include:
step A60, detecting whether the shooting target of the terminal camera preview frame is a non-static living body or a non-static object;
the photographing target refers to a non-stationary living body (a person or a living animal or plant) or an object that the user wants to photograph, such as a walking person, a lively pet, a plant floating in the wind, an advancing train, a swinging bell, and the like.
And step A70, if yes, executing the step of acquiring and tracking the displacement information of the shooting target of the terminal camera preview frame.
When the shooting target is a non-static living body or a non-static object, the displacement information of the shooting target is acquired.
In this embodiment, it is determined whether the shooting target is a non-stationary living body or a non-stationary object, and if so, the step of obtaining the displacement information of the shooting target is performed, so that the accuracy of information obtaining is improved.
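Step A60's stationarity check is left open by the patent; one common realization is frame differencing between consecutive preview frames. A sketch, with an illustrative threshold and plain-list frames standing in for real camera buffers:

```python
def is_non_stationary(prev_frame, curr_frame, threshold=10):
    """Frames are equal-sized 2D lists of grayscale values; returns True when
    the mean absolute pixel difference exceeds the threshold."""
    diffs = [abs(a - b)
             for row_p, row_c in zip(prev_frame, curr_frame)
             for a, b in zip(row_p, row_c)]
    return sum(diffs) / len(diffs) > threshold
```

Only when this returns True does the pipeline proceed to acquire and track the displacement information of the shooting target.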
Further, on the basis of an embodiment of the terminal shooting method of the present invention, step S40 includes:
if the shooting target enters the predicted focusing area, the user can manually confirm shooting or start the shooting module to automatically shoot.
When the shooting target enters the predicted focusing area, focusing on the shooting target is complete. The user can then manually tap the confirmation box to complete the shot of the shooting target, or the application can start the shooting module to complete the shot automatically; the automatic shooting effect is better than the manual shooting effect.
In this embodiment, there are two modes of manual shooting and automatic shooting for completing shooting of a shooting target: the manual mode is convenient for the user to select the photographing time independently, and the automatic photographing effect is better than the manual photographing effect.
In addition, referring to fig. 8, an embodiment of the present invention further provides a terminal shooting device, where the terminal shooting device includes:
the information acquisition module is used for acquiring and tracking displacement information of a shooting target of the terminal camera preview frame;
the area determining module is used for determining a predicted focusing area according to the displacement information and focusing in the predicted focusing area;
the judging module is used for judging whether the shooting target enters a prediction focusing area or not;
and the shooting module is used for finishing shooting of the shooting target if the shooting target enters the predicted focusing area.
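A hypothetical sketch of how the four modules above could be wired into one capture pass; the class and parameter names are illustrative, not the patent's actual implementation:

```python
class TerminalShootingDevice:
    """Composes the four modules; each module is injected as a callable."""
    def __init__(self, acquire, determine_area, entered, shoot):
        self.acquire = acquire                # information acquisition module
        self.determine_area = determine_area  # area determining module
        self.entered = entered                # judging module
        self.shoot = shoot                    # shooting module

    def run_once(self, preview_frame):
        displacement = self.acquire(preview_frame)
        area = self.determine_area(displacement)  # also performs focusing
        if self.entered(displacement, area):
            return self.shoot(area)
        return None  # target has not yet entered the predicted focusing area
```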
Optionally, referring to fig. 8, an embodiment of the present invention further provides a terminal shooting device, where the terminal shooting device further includes:
the detection module is used for detecting whether the shooting target of the terminal camera preview frame is a non-static living body or a non-static object, and if so, the information acquisition module is triggered to execute the step of acquiring and tracking the displacement information of the shooting target of the terminal camera preview frame.
Alternatively, referring to fig. 9, the region determining module includes a prediction unit and a determining unit:
the prediction unit is used for acquiring a predicted motion track of the shooting target in the next preset unit duration according to the historical motion track;
and the determining unit is used for determining a predicted focusing area according to the position point on the predicted motion track.
Optionally, the prediction unit is further configured to:
determining object attributes of the shooting target, and determining a motion rule description table and a predicted motion speed of the shooting target according to the object attributes, wherein the motion rule description table comprises a motion reference point and a predicted motion direction mapped by the motion reference point;
and determining the predicted motion track of the shooting target in the next preset unit time according to the motion rule description table, the predicted motion speed and the historical motion track.
Optionally, the prediction unit is further configured to:
determining a current motion reference point of the shooting target according to the historical motion track;
searching the motion rule description table for the predicted motion direction mapped by the current motion reference point;
and determining a predicted motion track according to the predicted motion speed and the predicted motion direction.
Optionally, the determining unit is further configured to:
taking a preset number of position points on the predicted motion trail, and determining a reference position center of a graph formed by the position points on the predicted motion trail;
and determining a predicted focusing area by taking the reference position center as a geometric center.
Optionally, the determining unit is further configured to:
taking the reference position center as the geometric center of the whole area formed by each shooting target;
and determining the display size of the whole area according to the screen ratio of each shooting target relative to the terminal screen, so as to generate the predicted focusing area.
Optionally, the determining module is further configured to:
and judging whether the geometric center of the whole area formed by each shooting target is overlapped with the geometric center of the prediction focusing area, and if so, judging that each shooting target enters the prediction focusing area.
The steps implemented by each functional module of the terminal shooting device can refer to each embodiment of the terminal shooting method of the present invention, and are not described herein again.
In addition, the present invention also provides a mobile terminal, comprising: a memory 109, a processor 110, and a terminal shooting program stored on the memory 109 and operable on the processor 110; when executed by the processor 110, the terminal shooting program implements the steps of the embodiments of the terminal shooting method described above.
In addition, the present invention also provides a computer-readable storage medium, which stores one or more programs, and the one or more programs can be further executed by one or more processors for implementing the steps of the embodiments of the terminal shooting method.
The specific implementation of the apparatus, the mobile terminal and the readable storage medium (i.e., the computer readable storage medium) of the present invention is basically the same as the embodiments of the terminal shooting method, and will not be described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a mobile terminal (such as a mobile phone) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A terminal shooting method is characterized by comprising the following steps:
acquiring and tracking displacement information of a shooting target of a terminal camera preview frame, wherein the displacement information comprises a historical motion track within a preset unit time length before the current moment;
determining a predicted focusing area according to the displacement information, and focusing in the predicted focusing area;
judging whether the shooting target enters the prediction focusing area or not;
if the shooting target enters the predicted focusing area, shooting of the shooting target is completed;
the determining a predicted focusing area according to the displacement information includes:
acquiring a predicted motion track of the shooting target in the next preset unit time according to the historical motion track;
taking a preset number of position points on the predicted motion trail, and determining a reference position center of a graph formed by the position points on the predicted motion trail;
and determining the predicted focusing area by taking the reference position center as a geometric center.
2. The terminal photographing method of claim 1, wherein the displacement information further includes an object attribute of the photographing target, and the step of obtaining the predicted movement track of the photographing target for the next preset unit duration according to the historical movement track comprises:
determining the object attribute of the shooting target, and determining a motion rule description table and a predicted motion speed of the shooting target according to the object attribute, wherein the motion rule description table comprises a motion reference point and a predicted motion direction mapped by the motion reference point;
and determining the predicted motion track of the shooting target in the next preset unit time according to the motion rule description table, the predicted motion speed and the historical motion track.
3. The terminal photographing method of claim 2, wherein the step of determining the predicted motion track of the shooting target according to the motion rule description table, the predicted motion speed, and the historical motion track comprises:
determining a current motion reference point of the shooting target according to the historical motion track;
searching the motion rule description table for the predicted motion direction mapped by the current motion reference point;
and determining the predicted motion track according to the predicted motion speed and the predicted motion direction.
4. The terminal photographing method of claim 1, wherein the step of determining the predicted in-focus area with the reference position center as a geometric center comprises:
taking the reference position center as the geometric center of the whole area formed by the shooting targets;
and determining the display size of the whole area according to the screen ratio of each shooting target relative to a terminal screen so as to generate a predicted focusing area.
5. The terminal photographing method of claim 1, wherein the step of determining whether the photographing target enters the predicted focus area comprises:
and judging whether the geometric center of the whole area formed by each shooting target is overlapped with the geometric center of the prediction focusing area, and if so, judging that each shooting target enters the prediction focusing area.
6. The terminal photographing method of claim 1, wherein the step of acquiring and tracking displacement information of the shooting target of the terminal camera preview frame is preceded by:
detecting whether a shooting target of the terminal camera preview frame is a non-static living body or a non-static object;
and if so, executing the step of acquiring and tracking the displacement information of the shooting target of the terminal camera preview frame.
7. The terminal photographing method of claim 1, wherein the step of completing photographing of the photographing target if the photographing target enters the predicted focus area comprises:
if the shooting target enters the predicted focusing area, the user can manually confirm shooting or start a shooting module to automatically shoot.
8. A terminal camera, characterized in that the terminal camera comprises:
the information acquisition module is used for acquiring and tracking displacement information of a shooting target of the terminal camera preview frame, wherein the displacement information comprises a historical motion track in a preset unit time length before the current time;
the area determining module is used for determining a predicted focusing area according to the displacement information and focusing in the predicted focusing area;
the judging module is used for judging whether the shooting target enters the prediction focusing area or not;
the shooting module is used for finishing shooting of the shooting target if the shooting target enters the prediction focusing area;
the region determination module includes a prediction unit and a determination unit:
the prediction unit is used for acquiring a predicted motion track of the shooting target in the next preset unit duration according to the historical motion track;
the determining unit is used for determining the reference position center of a graph formed by all position points on the predicted motion trail by taking a preset number of position points on the predicted motion trail; and determining the predicted focusing area by taking the reference position center as a geometric center.
9. A mobile terminal, characterized in that the mobile terminal comprises: a memory, a processor and a terminal shooting program stored on the memory and executable on the processor, the terminal shooting program when executed by the processor implementing the steps of the terminal shooting method according to any one of claims 1 to 7.
10. A storage medium, characterized in that the storage medium has stored thereon a terminal photographing program, which when executed by a processor implements the steps of the terminal photographing method according to any one of claims 1 to 7.
CN201910873337.6A 2019-09-12 2019-09-12 Terminal shooting method and device, mobile terminal and readable storage medium Active CN110505408B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910873337.6A CN110505408B (en) 2019-09-12 2019-09-12 Terminal shooting method and device, mobile terminal and readable storage medium
PCT/CN2019/122776 WO2021047070A1 (en) 2019-09-12 2019-12-03 Terminal photographing method and apparatus, mobile terminal, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910873337.6A CN110505408B (en) 2019-09-12 2019-09-12 Terminal shooting method and device, mobile terminal and readable storage medium

Publications (2)

Publication Number Publication Date
CN110505408A CN110505408A (en) 2019-11-26
CN110505408B true CN110505408B (en) 2021-07-27

Family

ID=68591921

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910873337.6A Active CN110505408B (en) 2019-09-12 2019-09-12 Terminal shooting method and device, mobile terminal and readable storage medium

Country Status (2)

Country Link
CN (1) CN110505408B (en)
WO (1) WO2021047070A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110505408B (en) * 2019-09-12 2021-07-27 深圳传音控股股份有限公司 Terminal shooting method and device, mobile terminal and readable storage medium
CN110933303B (en) * 2019-11-27 2021-05-18 维沃移动通信(杭州)有限公司 Photographing method and electronic device
CN112312005A (en) * 2020-02-12 2021-02-02 北京字节跳动网络技术有限公司 Image acquisition method and device
WO2021258321A1 (en) * 2020-06-24 2021-12-30 华为技术有限公司 Image acquisition method and apparatus
CN114979455A (en) * 2021-02-25 2022-08-30 北京小米移动软件有限公司 Shooting method, device and storage medium
CN113784362B (en) * 2021-07-29 2024-08-06 三维通信股份有限公司 Air base station deployment method, device, electronic device and storage medium
CN113724338B (en) * 2021-08-31 2024-05-03 上海西井科技股份有限公司 Method, system, equipment and storage medium for shooting mobile object based on table
CN113780214B (en) * 2021-09-16 2024-04-19 上海西井科技股份有限公司 Method, system, equipment and storage medium for image recognition based on crowd
CN115052108A (en) * 2022-07-21 2022-09-13 维沃移动通信有限公司 Shooting method and device and electronic equipment
CN115546111B (en) * 2022-09-13 2023-12-05 武汉海微科技有限公司 Curved surface screen detection method, device, equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101387812A (en) * 2007-09-13 2009-03-18 鸿富锦精密工业(深圳)有限公司 Camera auto-focusing system and method
CN103369227A (en) * 2012-03-26 2013-10-23 联想(北京)有限公司 Photographing method of moving object and electronic equipment
CN104125433A (en) * 2014-07-30 2014-10-29 西安冉科信息技术有限公司 Moving object video surveillance method based on multi-PTZ (pan-tilt-zoom)-camera linkage structure
CN106060373A (en) * 2015-04-03 2016-10-26 佳能株式会社 Focus detection apparatus and control method thereof
CN106454135A (en) * 2016-11-29 2017-02-22 维沃移动通信有限公司 Photographing reminding method and mobile terminal

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101247286B (en) * 2008-03-21 2011-01-05 中兴通讯股份有限公司 Method and system for service quality detection of video distribution system
CN102056010A (en) * 2009-11-02 2011-05-11 鸿富锦精密工业(深圳)有限公司 Automatic test system and method for camera function of notebook computer
CN103929596B (en) * 2014-04-30 2016-09-14 努比亚技术有限公司 Guide the method and device of shooting composition
CN105827928A (en) * 2015-01-05 2016-08-03 中兴通讯股份有限公司 Focusing area selection method and focusing area selection device
CN105554367B (en) * 2015-09-30 2019-01-15 宇龙计算机通信科技(深圳)有限公司 A kind of moving camera shooting method and mobile terminal
JP6825203B2 (en) * 2015-12-01 2021-02-03 株式会社ニコン Imaging controller and camera
US9854156B1 (en) * 2016-06-12 2017-12-26 Apple Inc. User interface for camera effects
JP6742173B2 (en) * 2016-06-30 2020-08-19 キヤノン株式会社 Focus adjusting device and method, and imaging device
CN106357973A (en) * 2016-08-26 2017-01-25 深圳市金立通信设备有限公司 Focusing method and terminal thereof
WO2018062368A1 (en) * 2016-09-30 2018-04-05 株式会社ニコン Image pickup device and image pickup system
CN106961552B (en) * 2017-03-27 2019-10-29 联想(北京)有限公司 A kind of focusing control method and electronic equipment
CN107124556B (en) * 2017-05-31 2021-03-02 Oppo广东移动通信有限公司 Focusing method, focusing device, computer readable storage medium and mobile terminal
CN110505408B (en) * 2019-09-12 2021-07-27 深圳传音控股股份有限公司 Terminal shooting method and device, mobile terminal and readable storage medium


Also Published As

Publication number Publication date
WO2021047070A1 (en) 2021-03-18
CN110505408A (en) 2019-11-26

Similar Documents

Publication Publication Date Title
CN110505408B (en) Terminal shooting method and device, mobile terminal and readable storage medium
CN111182205B (en) Photographing method, electronic device, and medium
CN110913139B (en) Photographing method and electronic equipment
WO2019129020A1 (en) Automatic focusing method of camera, storage device and mobile terminal
CN110300267B (en) Photographing method and terminal equipment
CN110602389B (en) Display method and electronic equipment
CN110881105B (en) Shooting method and electronic equipment
CN109068055B (en) Composition method, terminal and storage medium
CN107124556B (en) Focusing method, focusing device, computer readable storage medium and mobile terminal
CN108234876B (en) Tracking focusing method, terminal and computer readable storage medium
CN110266957B (en) Image shooting method and mobile terminal
US12022190B2 (en) Photographing method and electronic device
CN110769154B (en) Shooting method and electronic equipment
CN110099218B (en) Interactive control method and device in shooting process and computer readable storage medium
CN112437172A (en) Photographing method, terminal and computer readable storage medium
CN108307105B (en) Shooting method, terminal and computer readable storage medium
CN110177209B (en) Video parameter regulation and control method, device and computer readable storage medium
CN110187769B (en) Preview image viewing method, equipment and computer readable storage medium
CN108243489B (en) Photographing control method and mobile terminal
CN110958387B (en) Content updating method and electronic equipment
CN110069136B (en) Wearing state identification method and equipment and computer readable storage medium
CN110177208B (en) Video recording association control method, equipment and computer readable storage medium
CN107613194B (en) Focusing method, mobile terminal and computer readable storage medium
CN107257430B (en) A kind of camera control method, terminal and computer readable storage medium
CN111147745B (en) Shooting method, shooting device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant