CN107682630A - Dual camera anti-shake method, mobile terminal and computer-readable storage medium - Google Patents
Dual camera anti-shake method, mobile terminal and computer-readable storage medium
- Publication number
- CN107682630A (application CN201710912711.XA)
- Authority
- CN
- China
- Prior art keywords
- camera
- data
- compensation data
- anti-shake
- lens
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Computer Networks & Wireless Communication (AREA)
- Studio Devices (AREA)
Abstract
The invention discloses a dual camera anti-shake method, a mobile terminal and a computer-readable storage medium. When two cameras located on the same side of a mobile terminal are in a shooting state, the dual camera anti-shake method obtains shake data of a first camera; determines first compensation data for the anti-shake lens of the first camera according to the shake data of the first camera; determines second compensation data for the anti-shake lens of a second camera according to the first compensation data; and controls the anti-shake lens of the first camera to compensate with the first compensation data and the anti-shake lens of the second camera to compensate with the second compensation data. Because the second compensation data for the anti-shake lens of the second camera is derived directly from the first compensation data for the anti-shake lens of the first camera, only the shake data of one camera needs to be detected, which reduces the amount of computation and increases the anti-shake speed.
Description
Technical field
The present invention relates to the field of photographing technology, and more particularly to a dual camera anti-shake method, a mobile terminal and a computer-readable storage medium.
Background art
With the continuous development of electronic technology, mobile terminals (such as smart phones and tablet computers) have become increasingly powerful and play an indispensable role in people's work and life. Most mobile terminals now have a camera function, so users can take pictures anytime and anywhere and record the details of daily life.
As requirements on image quality keep rising, dual cameras have become a main development trend of mobile terminals. During shooting, however, hand shake inevitably blurs the picture. In the prior art, the shake of each of the two cameras is typically detected separately, the corrective compensation amount of each camera is then calculated from its own data, and the two cameras are further compensated separately. The amount of computation is large and the anti-shake speed is slow. It can be seen that the existing dual camera anti-shake methods are computationally expensive and slow.
Summary of the invention
In view of this, the present invention proposes a dual camera anti-shake method, a mobile terminal and a computer-readable storage medium, so as to solve the above technical problem.
First, to achieve the above object, the present invention proposes a dual camera anti-shake method applied to a mobile terminal, the mobile terminal comprising two cameras located on the same side of the mobile terminal, the method comprising:
when the two cameras are in a shooting state, obtaining shake data of a first camera;
determining first compensation data for the anti-shake lens of the first camera according to the shake data of the first camera;
determining second compensation data for the anti-shake lens of a second camera according to the first compensation data; and
controlling the anti-shake lens of the first camera to compensate with the first compensation data, and controlling the anti-shake lens of the second camera to compensate with the second compensation data.
Optionally, obtaining the shake data of the first camera includes:
obtaining an acceleration direction and an acceleration magnitude detected by a gyroscope arranged at the first camera; and
determining a shake direction of the first camera according to the acceleration direction, and calculating a shake displacement of the first camera according to the acceleration magnitude and a preset formula.
Optionally, determining the first compensation data for the anti-shake lens of the first camera according to the shake data of the first camera includes:
determining correction data of the first camera according to the shake data of the first camera; and
determining the first compensation data for the anti-shake lens of the first camera according to the correction data of the first camera.
Optionally, the mobile terminal stores a first mapping table between multiple sets of shake data of the first camera and multiple sets of correction data, and a second mapping table between multiple sets of correction data of the first camera and multiple sets of compensation data;
determining the first compensation data for the anti-shake lens of the first camera according to the shake data of the first camera includes:
according to the shake data of the first camera, obtaining from the first mapping table the correction data corresponding to the shake data as the correction data of the first camera; and
according to the correction data of the first camera, obtaining from the second mapping table the compensation data corresponding to the correction data as the first compensation data for the anti-shake lens of the first camera.
Optionally, the mobile terminal stores a third mapping table between multiple sets of compensation data for the anti-shake lens of the first camera and multiple sets of compensation data for the anti-shake lens of the second camera;
determining the second compensation data for the anti-shake lens of the second camera according to the first compensation data for the anti-shake lens of the first camera includes:
according to the first compensation data for the anti-shake lens of the first camera, obtaining from the third mapping table the compensation data for the anti-shake lens of the second camera corresponding to the first compensation data as the second compensation data for the anti-shake lens of the second camera.
Optionally, determining the second compensation data for the anti-shake lens of the second camera according to the first compensation data includes:
obtaining configuration information of the first camera and configuration information of the second camera; and
calculating the second compensation data for the anti-shake lens of the second camera according to the first compensation data, the configuration information of the first camera and the configuration information of the second camera.
Optionally, calculating the second compensation data for the anti-shake lens of the second camera according to the first compensation data, the configuration information of the first camera and the configuration information of the second camera includes:
if the first camera and the second camera are identical modules of the same structure, determining the first compensation data as the second compensation data for the anti-shake lens of the second camera;
if the first camera and the second camera are different modules with the same field of view, determining the first compensation data as the second compensation data for the anti-shake lens of the second camera; and
if one of the first camera and the second camera is a wide-angle camera and the other is a telephoto camera, dividing the first compensation data by the field of view of the first camera and multiplying the result by the field of view of the second camera to obtain the second compensation data for the anti-shake lens of the second camera.
Optionally, the compensation data include a compensation direction and a compensation angle;
controlling the anti-shake lens of the first camera to compensate with the first compensation data and controlling the anti-shake lens of the second camera to compensate with the second compensation data includes:
controlling the anti-shake lens of the first camera to deflect by a first compensation angle in a first compensation direction, and controlling the anti-shake lens of the second camera to deflect by a second compensation angle in a second compensation direction.
Further, to achieve the above object, the present invention also provides a mobile terminal. The mobile terminal includes a memory, at least one processor, and at least one program stored on the memory and executable by the at least one processor. When the at least one program is executed by the at least one processor, the steps of any of the methods described above are implemented.
Further, to achieve the above object, the present invention also provides a computer-readable storage medium. The computer-readable storage medium stores at least one program executable by a computer. When the at least one program is executed by the computer, the computer performs the steps of any of the methods described above.
Compared with the prior art, in the dual camera anti-shake method proposed by the present invention, when the two cameras located on the same side of the mobile terminal are in a shooting state, the shake data of the first camera is obtained; the first compensation data for the anti-shake lens of the first camera is determined according to the shake data of the first camera; the second compensation data for the anti-shake lens of the second camera is determined according to the first compensation data; and the anti-shake lens of the first camera is controlled to compensate with the first compensation data while the anti-shake lens of the second camera is controlled to compensate with the second compensation data. In this way, the dual camera anti-shake method can determine the second compensation data for the anti-shake lens of the second camera directly from the first compensation data for the anti-shake lens of the first camera, so that only the shake data of one camera needs to be detected, which reduces the amount of computation and increases the anti-shake speed.
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal for implementing embodiments of the present invention;
Fig. 2 is an architecture diagram of a communication network system according to an embodiment of the present invention;
Fig. 3 is a schematic flow chart of a dual camera anti-shake method according to an embodiment of the present invention;
Fig. 4 is a schematic flow chart of another dual camera anti-shake method according to an embodiment of the present invention;
Fig. 5 is a schematic flow chart of yet another dual camera anti-shake method according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of the fields of view of cameras according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of the viewing areas of the cameras shown in Fig. 6.
The realization, functional characteristics and advantages of the object of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Detailed description of the embodiments
It should be understood that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit it.
In the following description, suffixes such as "module", "part" or "unit" used to denote elements are used only to facilitate the description of the present invention and have no specific meaning in themselves. Therefore, "module", "part" and "unit" may be used interchangeably.
A terminal may be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as mobile phones, tablet computers, notebook computers, palmtop computers, personal digital assistants (Personal Digital Assistant, PDA), portable media players (Portable Media Player, PMP), navigation devices, wearable devices, smart bracelets and pedometers, as well as fixed terminals such as digital TVs and desktop computers.
The following description takes a mobile terminal as an example. Those skilled in the art will understand that, apart from elements specifically intended for mobile purposes, the construction according to the embodiments of the present invention can also be applied to terminals of the fixed type.
Referring to Fig. 1, which is a schematic diagram of the hardware structure of a mobile terminal for implementing embodiments of the present invention, the mobile terminal 100 may include: an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111 and other components. Those skilled in the art will understand that the mobile terminal structure shown in Fig. 1 does not constitute a limitation on the mobile terminal; the mobile terminal may include more or fewer components than shown, combine certain components, or arrange the components differently.
The components of the mobile terminal are described below with reference to Fig. 1:
The radio frequency unit 101 may be used to receive and send signals during messaging or a call. Specifically, after receiving downlink information from a base station, it passes the information to the processor 110 for processing; in addition, it sends uplink data to the base station. Generally, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System of Mobile communication), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution) and TDD-LTE (Time Division Duplexing-Long Term Evolution).
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user send and receive e-mail, browse web pages, access streaming media and so on, providing the user with wireless broadband Internet access. Although Fig. 1 shows the WiFi module 102, it is understood that it is not an essential part of the mobile terminal and can be omitted as needed without changing the essence of the invention.
The audio output unit 103 can convert audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a speech recognition mode, a broadcast reception mode or the like. Moreover, the audio output unit 103 can also provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call signal reception sound or a message reception sound). The audio output unit 103 may include a loudspeaker, a buzzer and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a graphics processing unit (Graphics Processing Unit, GPU) 1041 and a microphone 1042. The graphics processing unit 1041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or another storage medium), or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 can receive sound (audio data) in a phone call mode, a recording mode, a speech recognition mode or another operating mode, and can process such sound into audio data. In the phone call mode, the processed audio (voice) data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101. The microphone 1042 can implement various types of noise cancellation (or suppression) algorithms to eliminate (or suppress) noise or interference generated while receiving and sending audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as an optical sensor, a motion sensor and other sensors. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1061 according to the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved close to the ear. As one kind of motion sensor, an accelerometer can detect the magnitude of acceleration in all directions (generally three axes) and can detect the magnitude and direction of gravity when static; it can be used for applications that recognize the posture of the phone (such as landscape/portrait switching, related games and magnetometer attitude calibration) and for vibration-related recognition functions (such as a pedometer or tap detection). Other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer and an infrared sensor can also be configured on the phone, which will not be described in detail here.
The display unit 106 is used to display information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED) display or the like.
The user input unit 107 can be used to receive input numeric or character information and to generate key signal input related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, collects touch operations by the user on or near it (for example, operations by the user on or near the touch panel 1071 using a finger, a stylus or any other suitable object or accessory) and drives the corresponding connecting device according to a preset program. The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal generated by the touch operation and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types such as resistive, capacitive, infrared and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may also include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and a power key), a trackball, a mouse and a joystick, which are not limited here.
Further, the touch panel 1071 may cover the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in Fig. 1 the touch panel 1071 and the display panel 1061 are implemented as two separate components to realize the input and output functions of the mobile terminal, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to realize the input and output functions of the mobile terminal, which is not limited here.
The interface unit 108 serves as an interface through which at least one external device can be connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port and the like. The interface unit 108 can be used to receive input (for example, data information or power) from an external device and transfer the received input to at least one element within the mobile terminal 100, or it can be used to transfer data between the mobile terminal 100 and an external device.
The memory 109 can be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area. The program storage area can store an operating system, application programs required by at least one function (such as a sound playing function and an image playing function) and the like; the data storage area can store data created according to the use of the phone (such as audio data and a phone book) and the like. In addition, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device or another solid-state storage device.
The processor 110 is the control center of the mobile terminal. It connects all parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby monitoring the mobile terminal as a whole. The processor 110 may include at least one processing unit; preferably, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs and the like, and the modem processor mainly handles wireless communication. It is understood that the modem processor may also not be integrated into the processor 110.
The mobile terminal 100 may also include a power supply 111 (such as a battery) that supplies power to the components. Preferably, the power supply 111 can be logically connected to the processor 110 through a power management system, so that functions such as charging management, discharging management and power consumption management are realized through the power management system.
Although not shown in Fig. 1, the mobile terminal 100 may also include a Bluetooth module and the like, which will not be described here.
To facilitate understanding of the embodiments of the present invention, the communication network system on which the mobile terminal of the present invention is based is described below.
Referring to Fig. 2, Fig. 2 is an architecture diagram of a communication network system according to an embodiment of the present invention. The communication network system is an LTE system of the universal mobile communication technology. The LTE system includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203 and an operator's IP service 204, which are connected in communication in sequence.
Specifically, the UE 201 may be the above-mentioned terminal 100, which is not described again here.
The E-UTRAN 202 includes an eNodeB 2021, other eNodeBs 2022 and so on. The eNodeB 2021 can be connected to the other eNodeBs 2022 through a backhaul (for example, an X2 interface), the eNodeB 2021 is connected to the EPC 203, and the eNodeB 2021 can provide the UE 201 with access to the EPC 203.
The EPC 203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036 and so on. The MME 2031 is a control node that handles signaling between the UE 201 and the EPC 203 and provides bearer and connection management. The HSS 2032 provides registers such as a home location register (not shown) and stores user-specific information about service characteristics, data rates and the like. All user data can be transmitted through the SGW 2034; the PGW 2035 can provide IP address allocation and other functions for the UE 201; and the PCRF 2036 is the policy and charging control decision point for service data flows and IP bearer resources, selecting and providing available policy and charging control decisions for the policy and charging enforcement function unit (not shown).
The IP service 204 may include the Internet, an intranet, an IMS (IP Multimedia Subsystem) or other IP services.
Although the above description takes the LTE system as an example, those skilled in the art should understand that the present invention is not only applicable to the LTE system, but is also applicable to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA and future new network systems, which are not limited here.
Based on the above hardware structure of the mobile terminal 100 and the communication network system, the embodiments of the method of the present invention are proposed.
It should be noted that, in the embodiments of the present invention, the mobile terminal 100 further includes two cameras arranged on the same side of the mobile terminal 100. Each of the two cameras includes an optical image stabilization (OIS) lens assembly, and the OIS lens assembly includes an anti-shake lens.
Referring to Fig. 3, Fig. 3 is a flow chart of the steps of a dual camera anti-shake method according to an embodiment of the present invention. The method is applied in a mobile terminal. As shown in Fig. 3, the method includes:
Step 301: when the two cameras are in a shooting state, obtain the shake data of the first camera.
In this step, when the two cameras are in a shooting state, the method obtains the shake data of the first camera. The method can calculate the shake data of the first camera by detecting the acceleration direction and the acceleration magnitude of the first camera, where the shake data includes a shake direction and a shake displacement.
Specifically, the method can obtain the acceleration direction and the acceleration magnitude detected by a gyroscope arranged at the first camera, and then determine the shake direction of the first camera according to the acceleration direction, for example by taking the acceleration direction as the shake direction of the first camera. The shake displacement of the first camera is calculated according to the acceleration magnitude and a preset formula, for example the formula S = 1/2 * a * t², where a is the acceleration magnitude and t is the time interval between two adjacent preview frames, for example 1/15 second. Detecting the acceleration direction and magnitude with the gyroscope belongs to the prior art and is not described here.
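As an illustrative sketch only (not part of the patent text), step 301 could be implemented roughly as follows; the sensor-sample format and the 1/15 s frame interval are assumptions taken from the example above.

```python
import math

FRAME_INTERVAL_S = 1.0 / 15.0  # assumed interval t between two adjacent preview frames

def compute_shake_data(accel_xyz, dt=FRAME_INTERVAL_S):
    """Convert one acceleration sample (ax, ay, az) from the sensor at the first
    camera into shake data: a unit shake direction and a shake displacement
    following the preset formula S = 1/2 * a * t^2."""
    ax, ay, az = accel_xyz
    a = math.sqrt(ax * ax + ay * ay + az * az)   # acceleration magnitude
    if a == 0.0:
        return (0.0, 0.0, 0.0), 0.0              # no detectable shake
    direction = (ax / a, ay / a, az / a)         # shake direction = acceleration direction
    displacement = 0.5 * a * dt * dt             # S = 1/2 * a * t^2
    return direction, displacement
```

For instance, an acceleration magnitude of 0.9 m/s² over a 1/15 s frame interval would give a shake displacement of about 0.5 × 0.9 × (1/15)² ≈ 0.002 m.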
Step 302: determine the first compensation data for the anti-shake lens of the first camera according to the shake data of the first camera.
In this step, the method determines the first compensation data for the anti-shake lens of the first camera according to the shake data of the first camera.
In an embodiment of the present invention, the mobile terminal may store a mapping table between multiple sets of shake data of the first camera and multiple sets of compensation data. The mapping table is set according to the parameters of the first camera and of the anti-shake lens of the first camera. The method can directly obtain, from the mapping table, the compensation data corresponding to the shake data of the first camera as the first compensation data for the anti-shake lens of the first camera.
In another embodiment of the present invention, the mobile terminal may not store such a mapping table between multiple sets of shake data and multiple sets of compensation data of the first camera. Instead, the mobile terminal can first determine the correction data of the first camera according to the shake data of the first camera, and then determine the first compensation data for the anti-shake lens of the first camera according to the correction data of the first camera. The method can determine the correction data of the first camera according to a preset rule, for example by setting the correction direction to the direction opposite to the shake direction and setting the correction displacement equal to the shake displacement. The method can also determine the correction data of the first camera according to a mapping table, pre-stored in the mobile terminal, between multiple sets of shake data of the first camera and multiple sets of correction data. In this embodiment, the mobile terminal may further store a mapping table between multiple sets of correction data of the first camera and multiple sets of compensation data, and the method can obtain, by looking up this table, the compensation data corresponding to the correction data of the first camera as the first compensation data for the anti-shake lens of the first camera.
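A minimal sketch of the table-lookup route just described, assuming the shake and correction data are quantized to discrete (direction, displacement) keys; the table contents below are placeholders, since the real tables would be calibrated from the parameters of the first camera and its anti-shake lens.

```python
# Hypothetical calibrated tables (placeholder entries).
SHAKE_TO_CORRECTION = {            # shake data -> correction data
    ("left", 0.002): ("right", 0.002),
}
CORRECTION_TO_COMPENSATION = {     # correction data -> first compensation data
    ("right", 0.002): ("right", 2.0),   # (compensation direction, compensation angle in degrees)
}

def first_compensation_from_shake(shake_key):
    """Chain the two lookups: shake data -> correction data -> first compensation data."""
    correction = SHAKE_TO_CORRECTION[shake_key]
    return CORRECTION_TO_COMPENSATION[correction]
```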
Step 303: determine the second compensation data for the anti-shake lens of the second camera according to the first compensation data.
In this step, the method determines the second compensation data for the anti-shake lens of the second camera according to the first compensation data. The method can calculate the second compensation data for the anti-shake lens of the second camera according to the configuration information of the first camera and the second camera together with the first compensation data. The method can also obtain the second compensation data corresponding to the first compensation data by looking up a table. Specifically, the mobile terminal can store a mapping table between multiple sets of compensation data for the anti-shake lens of the first camera and multiple sets of compensation data for the anti-shake lens of the second camera, and the method can obtain, according to the first compensation data for the anti-shake lens of the first camera, the compensation data for the anti-shake lens of the second camera corresponding to the first compensation data as the second compensation data.
Step 304: control the anti-shake lens of the first camera to compensate with the first compensation data, and control the anti-shake lens of the second camera to compensate with the second compensation data.
In this step, the method controls the anti-shake lens of the first camera to compensate with the first compensation data and controls the anti-shake lens of the second camera to compensate with the second compensation data. In an embodiment of the present invention, the compensation data include a compensation direction and a compensation angle. The first compensation data include a first compensation direction and a first compensation angle, and controlling the anti-shake lens of the first camera to compensate with the first compensation data can specifically be: controlling the anti-shake lens of the first camera to deflect by the first compensation angle in the first compensation direction. The second compensation data include a second compensation direction and a second compensation angle, and controlling the anti-shake lens of the second camera to compensate with the second compensation data can specifically be: controlling the anti-shake lens of the second camera to deflect by the second compensation angle in the second compensation direction.
In this embodiment, when the two cameras located on the same side of the mobile terminal are in a shooting state, the dual camera anti-shake method obtains the shake data of the first camera; determines the first compensation data for the anti-shake lens of the first camera according to the shake data of the first camera; determines the second compensation data for the anti-shake lens of the second camera according to the first compensation data; and controls the anti-shake lens of the first camera to compensate with the first compensation data and the anti-shake lens of the second camera to compensate with the second compensation data. In this way, the dual camera anti-shake method can determine the second compensation data for the anti-shake lens of the second camera directly from the first compensation data for the anti-shake lens of the first camera, so that only the shake data of one camera needs to be detected, which reduces the amount of computation and increases the anti-shake speed.
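Putting steps 301 to 304 together, one anti-shake iteration might be organized as in the following sketch; the lens-actuator interface (a deflect(direction, angle) call) and the helper callables are assumptions for illustration, not part of the patent.

```python
def dual_camera_anti_shake_step(read_shake_data, first_comp_from_shake,
                                second_comp_from_first, lens1, lens2):
    """One iteration while both cameras are in the shooting state.

    read_shake_data        -- returns the shake data of the first camera (step 301)
    first_comp_from_shake  -- maps shake data to the first compensation data (step 302)
    second_comp_from_first -- maps the first compensation data to the second
                              compensation data, by table lookup or calculation (step 303)
    lens1, lens2           -- anti-shake lens actuators assumed to expose deflect(direction, angle)
    """
    shake = read_shake_data()                             # step 301
    dir1, angle1 = first_comp_from_shake(shake)           # step 302
    dir2, angle2 = second_comp_from_first(dir1, angle1)   # step 303
    lens1.deflect(dir1, angle1)                           # step 304: first camera's anti-shake lens
    lens2.deflect(dir2, angle2)                           #           second camera's anti-shake lens
```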
Optionally, obtaining the shake data of the first camera includes:
obtaining an acceleration direction and an acceleration magnitude detected by a gyroscope arranged at the first camera; and
determining a shake direction of the first camera according to the acceleration direction, and calculating a shake displacement of the first camera according to the acceleration magnitude and a preset formula.
Optionally, the compensation data include a compensation direction and a compensation angle;
controlling the anti-shake lens of the first camera to compensate with the first compensation data and controlling the anti-shake lens of the second camera to compensate with the second compensation data includes:
controlling the anti-shake lens of the first camera to deflect by a first compensation angle in a first compensation direction, and controlling the anti-shake lens of the second camera to deflect by a second compensation angle in a second compensation direction.
Referring to Fig. 4, Fig. 4 is a schematic flow chart of another dual camera anti-shake method according to an embodiment of the present invention. As shown in Fig. 4, the method includes:
Step 401: when the two cameras are in a shooting state, obtain the shake data of the first camera.
Step 401 is identical to step 301 in the embodiment shown in Fig. 3 and is not described again here.
Step 402: determine the correction data of the first camera according to the shake data of the first camera.
In this step, the method determines the correction data of the first camera according to the shake data of the first camera. The method can determine the correction data of the first camera according to the shake data of the first camera and a preset rule, for example by setting the correction direction opposite to the shake direction and setting the correction displacement equal to the shake displacement. The method can also obtain the correction data corresponding to the shake data of the first camera by looking up a table. Specifically, the mobile terminal can store a first mapping table between multiple sets of shake data of the first camera and multiple sets of correction data, and the method can obtain, from the first mapping table and according to the shake data of the first camera, the correction data corresponding to the shake data of the first camera.
Step 403: determine the first compensation data for the anti-shake lens of the first camera according to the correction data of the first camera.
In this step, the method determines the first compensation data for the anti-shake lens of the first camera according to the correction data of the first camera. In an embodiment of the present invention, the method can obtain the compensation data corresponding to the correction data of the first camera by looking up a table. Specifically, the mobile terminal can store a second mapping table between multiple sets of correction data of the first camera and multiple sets of compensation data, and the method can obtain, from the second mapping table and according to the correction data of the first camera, the compensation data corresponding to the correction data of the first camera and use the obtained compensation data as the first compensation data for the anti-shake lens of the first camera.
Step 404: determine the second compensation data for the anti-shake lens of the second camera according to the first compensation data.
Step 405: control the anti-shake lens of the first camera to compensate with the first compensation data, and control the anti-shake lens of the second camera to compensate with the second compensation data.
Steps 404 and 405 are identical to steps 303 and 304 in the embodiment shown in Fig. 3 and are not described again here.
Optionally, the mobile terminal stores a first mapping table between multiple sets of shake data of the first camera and multiple sets of correction data, and a second mapping table between multiple sets of correction data of the first camera and multiple sets of compensation data;
determining the first compensation data for the anti-shake lens of the first camera according to the shake data of the first camera includes:
according to the shake data of the first camera, obtaining from the first mapping table the correction data corresponding to the shake data as the correction data of the first camera; and
according to the correction data of the first camera, obtaining from the second mapping table the compensation data corresponding to the correction data as the first compensation data for the anti-shake lens of the first camera.
Optionally, the mobile terminal stores a third mapping table between multiple sets of compensation data for the anti-shake lens of the first camera and multiple sets of compensation data for the anti-shake lens of the second camera;
determining the second compensation data for the anti-shake lens of the second camera according to the first compensation data for the anti-shake lens of the first camera includes:
according to the first compensation data for the anti-shake lens of the first camera, obtaining from the third mapping table the compensation data for the anti-shake lens of the second camera corresponding to the first compensation data as the second compensation data for the anti-shake lens of the second camera.
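For the third mapping table, the lookup is a single step; the sketch below uses a placeholder entry (identical modules, so direction and angle carry over unchanged).

```python
# Hypothetical third mapping table: first compensation data -> second compensation data.
FIRST_TO_SECOND_COMPENSATION = {
    ("right", 2.0): ("right", 2.0),   # e.g. identical modules of the same structure
}

def second_compensation_from_first(comp1):
    """Look up the second camera's compensation data from the first camera's."""
    return FIRST_TO_SECOND_COMPENSATION[comp1]
```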
Referring to Fig. 5, Fig. 5 is a schematic flow chart of yet another dual camera anti-shake method according to an embodiment of the present invention. As shown in Fig. 5, the method includes:
Step 501: when the two cameras are in a shooting state, obtain the shake data of the first camera.
Step 502: determine the first compensation data for the anti-shake lens of the first camera according to the shake data of the first camera.
Steps 501 and 502 are identical to steps 301 and 302 in the embodiment shown in Fig. 3 and are not described again here.
Step 503: obtain the configuration information of the first camera and the configuration information of the second camera.
In this step, the method obtains the configuration information of the first camera and the configuration information of the second camera. In some embodiments of the present invention, the method can also judge the types of the first camera and the second camera according to their configuration information, for example judge whether the first camera and the second camera are identical modules of the same structure, or judge whether the fields of view of the first camera and the second camera are the same.
Step 504: calculate the second compensation data for the anti-shake lens of the second camera according to the first compensation data, the configuration information of the first camera and the configuration information of the second camera.
In this step, the method calculates the second compensation data for the anti-shake lens of the second camera according to the first compensation data, the configuration information of the first camera and the configuration information of the second camera. For example, the method determines the relationship between the first camera and the second camera according to the configuration information of the first camera and the configuration information of the second camera, then determines the corresponding calculation rule according to that relationship, and calculates the second compensation data for the anti-shake lens of the second camera.
The relationship between the first camera and the second camera may be: the first camera and the second camera are identical modules of the same structure; or the first camera and the second camera are different modules with the same field of view; or one of the first camera and the second camera is a wide-angle camera and the other is a telephoto camera.
For example, when the first camera and the second camera are identical modules of the same structure, the method determines the first compensation data as the second compensation data for the anti-shake lens of the second camera. When the first camera and the second camera are different modules with the same field of view, the framing and shooting areas of the first camera and the second camera are consistent and coincide, and the method can likewise determine the first compensation data as the second compensation data for the anti-shake lens of the second camera.
When one of the first camera and the second camera is a wide-angle camera and the other is a telephoto camera, take the case where the first camera is the wide-angle camera and the second camera is the telephoto camera as an example. Referring to Fig. 6, Fig. 6 is a schematic diagram of the fields of view of the cameras according to an embodiment of the present invention. As shown in Fig. 6, the field of view of the first camera 601 is α and the field of view of the second camera 602 is β. Because the object distance (i.e., the distance between the camera and the subject) is far greater than the spacing between the first camera and the second camera, the relationship between the viewing area 701 of the first camera 601 and the viewing area 702 of the second camera 602 can be regarded as shown in Fig. 7, that is, the viewing area of the second camera 602 lies in the middle of the viewing area 701 of the first camera 601. In this case, the first compensation direction of the anti-shake lens of the first camera is the same as the second compensation direction of the second camera, and the ratio of the first compensation angle of the anti-shake lens of the first camera to the second compensation angle of the anti-shake lens of the second camera is equal to the ratio of the field of view of the first camera to the field of view of the second camera. Therefore, the second compensation data for the anti-shake lens of the second camera can be obtained by dividing the first compensation data by the field of view of the first camera and multiplying the result by the field of view of the second camera. Assuming the field of view of the first camera is 85 degrees and the field of view of the second camera is 45 degrees, when the anti-shake lens of the first camera deflects 2 degrees to the left, the method can determine that the anti-shake lens of the second camera needs to deflect 2/85*45 degrees to the left.
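The wide-angle/telephoto case amounts to scaling the compensation angle by the ratio of the two fields of view while keeping the compensation direction; a short illustrative sketch using the numbers from the example above:

```python
def second_angle_from_first(first_angle_deg, fov_first_deg, fov_second_deg):
    """Second compensation angle = first angle / FOV of first camera * FOV of second camera;
    the compensation direction stays the same."""
    return first_angle_deg / fov_first_deg * fov_second_deg

# Example from the text: the first anti-shake lens deflects 2 degrees to the left,
# wide-angle FOV 85 degrees, telephoto FOV 45 degrees.
angle2 = second_angle_from_first(2.0, 85.0, 45.0)   # 2 / 85 * 45 ≈ 1.06 degrees to the left
```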
Step 505: control the anti-shake lens of the first camera to compensate with the first compensation data, and control the anti-shake lens of the second camera to compensate with the second compensation data.
Step 505 is identical to step 304 in the embodiment shown in Fig. 3 and is not described again here.
Optionally, calculating the second compensation data for the anti-shake lens of the second camera according to the first compensation data, the configuration information of the first camera and the configuration information of the second camera includes:
if the first camera and the second camera are identical modules of the same structure, determining the first compensation data as the second compensation data for the anti-shake lens of the second camera;
if the first camera and the second camera are different modules with the same field of view, determining the first compensation data as the second compensation data for the anti-shake lens of the second camera; and
if one of the first camera and the second camera is a wide-angle camera and the other is a telephoto camera, dividing the first compensation data by the field of view of the first camera and multiplying the result by the field of view of the second camera to obtain the second compensation data for the anti-shake lens of the second camera.
Those of ordinary skill in the art will understand that all or some of the steps of the above-described embodiments can be completed by at least one program instructing the relevant hardware. The at least one program can be stored in the memory 109 of the mobile terminal 100 shown in Fig. 1 and executed by the processor 110. When the at least one program is executed by the processor 110, the following steps are implemented:
when the two cameras located on the same side of the mobile terminal are in a shooting state, obtaining the shake data of the first camera;
determining the first compensation data for the anti-shake lens of the first camera according to the shake data of the first camera;
determining the second compensation data for the anti-shake lens of the second camera according to the first compensation data; and
controlling the anti-shake lens of the first camera to compensate with the first compensation data, and controlling the anti-shake lens of the second camera to compensate with the second compensation data.
Optionally, obtaining the shake data of the first camera includes:
obtaining an acceleration direction and an acceleration magnitude detected by a gyroscope arranged at the first camera; and
determining a shake direction of the first camera according to the acceleration direction, and calculating a shake displacement of the first camera according to the acceleration magnitude and a preset formula.
Optionally, determining the first compensation data for the anti-shake lens of the first camera according to the shake data of the first camera includes:
determining correction data of the first camera according to the shake data of the first camera; and
determining the first compensation data for the anti-shake lens of the first camera according to the correction data of the first camera.
Optionally, the mobile terminal stores a first mapping table between multiple sets of shake data of the first camera and multiple sets of correction data, and a second mapping table between multiple sets of correction data of the first camera and multiple sets of compensation data;
determining the first compensation data for the anti-shake lens of the first camera according to the shake data of the first camera includes:
according to the shake data of the first camera, obtaining from the first mapping table the correction data corresponding to the shake data as the correction data of the first camera; and
according to the correction data of the first camera, obtaining from the second mapping table the compensation data corresponding to the correction data as the first compensation data for the anti-shake lens of the first camera.
Optionally, the mobile terminal stores a third mapping table between multiple sets of compensation data for the anti-shake lens of the first camera and multiple sets of compensation data for the anti-shake lens of the second camera;
determining the second compensation data for the anti-shake lens of the second camera according to the first compensation data for the anti-shake lens of the first camera includes:
according to the first compensation data for the anti-shake lens of the first camera, obtaining from the third mapping table the compensation data for the anti-shake lens of the second camera corresponding to the first compensation data as the second compensation data for the anti-shake lens of the second camera.
Optionally, determining the second compensation data for the anti-shake lens of the second camera according to the first compensation data includes:
obtaining configuration information of the first camera and configuration information of the second camera; and
calculating the second compensation data for the anti-shake lens of the second camera according to the first compensation data, the configuration information of the first camera and the configuration information of the second camera.
Optionally, calculating the second compensation data for the anti-shake lens of the second camera according to the first compensation data, the configuration information of the first camera and the configuration information of the second camera includes:
if the first camera and the second camera are identical modules of the same structure, determining the first compensation data as the second compensation data for the anti-shake lens of the second camera;
if the first camera and the second camera are different modules with the same field of view, determining the first compensation data as the second compensation data for the anti-shake lens of the second camera; and
if one of the first camera and the second camera is a wide-angle camera and the other is a telephoto camera, dividing the first compensation data by the field of view of the first camera and multiplying the result by the field of view of the second camera to obtain the second compensation data for the anti-shake lens of the second camera.
Optionally, the compensation data include a compensation direction and a compensation angle;
controlling the anti-shake lens of the first camera to compensate with the first compensation data and controlling the anti-shake lens of the second camera to compensate with the second compensation data includes:
controlling the anti-shake lens of the first camera to deflect by a first compensation angle in a first compensation direction, and controlling the anti-shake lens of the second camera to deflect by a second compensation angle in a second compensation direction.
Those of ordinary skill in the art will understand that all or some of the steps of the above-described embodiments can be completed by at least one program instructing the relevant hardware. The at least one program can be stored in a computer-readable storage medium. When the at least one program is executed, the following steps are implemented:
when the two cameras located on the same side of the mobile terminal are in a shooting state, obtaining the shake data of the first camera;
determining the first compensation data for the anti-shake lens of the first camera according to the shake data of the first camera;
determining the second compensation data for the anti-shake lens of the second camera according to the first compensation data; and
controlling the anti-shake lens of the first camera to compensate with the first compensation data, and controlling the anti-shake lens of the second camera to compensate with the second compensation data.
Alternatively, the shake data for obtaining the first camera, including:
Obtain acceleration direction and the acceleration magnitude for the gyroscope detection being arranged at first camera;
Determine the jitter direction of first camera according to the acceleration direction, and according to the acceleration magnitude with
And preset formula calculates the shake displacement of first camera.
Alternatively, the shake data according to first camera determine the stabilization eyeglass of first camera
First offset data, including:
The correction data of first camera are determined according to the shake data of first camera;
The first compensation number of the stabilization eyeglass of first camera is determined according to the correction data of first camera
According to;
Alternatively, the mobile terminal stores a first mapping table between multiple groups of shake data of the first camera and multiple groups of correction data, and a second mapping table between the multiple groups of correction data of the first camera and multiple groups of offset data;
Determining the first offset data of the stabilization lens of the first camera according to the shake data of the first camera then includes:
According to the shake data of the first camera, obtaining from the first mapping table the correction data corresponding to the shake data as the correction data of the first camera;
According to the correction data of the first camera, obtaining from the second mapping table the offset data corresponding to the correction data as the first offset data of the stabilization lens of the first camera. A lookup sketch for these two tables follows.
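A lookup sketch for the first and second mapping tables. The table contents below are invented placeholder values; real tables would hold per-device calibration data.

```python
# Placeholder first mapping table: shake data -> correction data.
FIRST_MAPPING_TABLE = {
    ("up", 0.10): ("down", 0.10),
    ("left", 0.20): ("right", 0.20),
}
# Placeholder second mapping table: correction data -> offset data of the stabilization lens.
SECOND_MAPPING_TABLE = {
    ("down", 0.10): ("down", 0.5),    # direction and offset angle in degrees (assumed units)
    ("right", 0.20): ("right", 1.0),
}

def first_offset_from_tables(shake_data):
    """Two-stage lookup: shake data -> correction data -> first offset data."""
    correction_data = FIRST_MAPPING_TABLE[shake_data]
    return SECOND_MAPPING_TABLE[correction_data]
```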
Alternatively, the mobile terminal stores a third mapping table between multiple groups of offset data of the stabilization lens of the first camera and multiple groups of offset data of the stabilization lens of the second camera;
Determining the second offset data of the stabilization lens of the second camera according to the first offset data of the stabilization lens of the first camera then includes:
According to the first offset data of the stabilization lens of the first camera, obtaining from the third mapping table the offset data of the stabilization lens of the second camera corresponding to the first offset data as the second offset data of the stabilization lens of the second camera. A sketch of this cross-camera lookup follows.
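A sketch of the cross-camera lookup in the third mapping table; as above, the entries are invented placeholders rather than values from the disclosure.

```python
# Placeholder third mapping table: first camera's offset data -> second camera's offset data.
THIRD_MAPPING_TABLE = {
    ("down", 0.5): ("down", 0.4),
    ("right", 1.0): ("right", 0.8),
}

def second_offset_from_table(first_offset):
    """Single lookup: the second offset data corresponding to the first offset data."""
    return THIRD_MAPPING_TABLE[first_offset]
```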
Alternatively, determining the second offset data of the stabilization lens of the second camera according to the first offset data includes:
Obtaining the configuration information of the first camera and the configuration information of the second camera;
Calculating the second offset data of the stabilization lens of the second camera according to the first offset data, the configuration information of the first camera and the configuration information of the second camera.
Alternatively, calculating the second offset data of the stabilization lens of the second camera according to the first offset data, the configuration information of the first camera and the configuration information of the second camera includes:
If the first camera and the second camera are identical modules of the same structure, determining the first offset data to be the second offset data of the stabilization lens of the second camera;
If the first camera and the second camera are different modules with the same angle of view, determining the first offset data to be the second offset data of the stabilization lens of the second camera;
If one of the first camera and the second camera is a wide-angle camera and the other is a telephoto camera, dividing the first offset data by the angle of view of the first camera and multiplying by the angle of view of the second camera to obtain the second offset data of the stabilization lens of the second camera.
Alternatively, the offset data includes a compensation direction and an offset angle;
Controlling the stabilization lens of the first camera to compensate with the first offset data, and controlling the stabilization lens of the second camera to compensate with the second offset data, includes:
Controlling the stabilization lens of the first camera to deflect by the first offset angle in the first compensation direction, and controlling the stabilization lens of the second camera to deflect by the second offset angle in the second compensation direction.
It should be noted that, herein, the terms "comprising", "including" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or further includes elements inherent to such a process, method, article or device. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article or device that includes the element.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
Through the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus the necessary general-purpose hardware platform, and of course also by hardware alone, but in many cases the former is the better implementation. Based on such an understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk or an optical disc) and includes several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present invention.
The above are only preferred embodiments of the present invention and are not intended to limit the scope of the invention. Any equivalent structural or process transformation made using the contents of the specification and drawings of the present invention, whether applied directly or indirectly in other related technical fields, is likewise included within the protection scope of the present invention.
Claims (10)
1. A dual camera anti-fluttering method applied to a mobile terminal, the mobile terminal including two cameras located on the same face of the mobile terminal, characterized in that the method includes:
when the two cameras are in a shooting state, obtaining shake data of a first camera;
determining first offset data of a stabilization lens of the first camera according to the shake data of the first camera;
determining second offset data of a stabilization lens of a second camera according to the first offset data;
controlling the stabilization lens of the first camera to compensate with the first offset data, and controlling the stabilization lens of the second camera to compensate with the second offset data.
2. The dual camera anti-fluttering method as claimed in claim 1, characterized in that obtaining the shake data of the first camera includes:
obtaining an acceleration direction and an acceleration magnitude detected by a gyroscope arranged at the first camera;
determining a jitter direction of the first camera according to the acceleration direction, and calculating a shake displacement of the first camera according to the acceleration magnitude and a preset formula.
3. The dual camera anti-fluttering method as claimed in claim 1, characterized in that determining the first offset data of the stabilization lens of the first camera according to the shake data of the first camera includes:
determining correction data of the first camera according to the shake data of the first camera;
determining the first offset data of the stabilization lens of the first camera according to the correction data of the first camera.
4. The dual camera anti-fluttering method as claimed in claim 3, characterized in that the mobile terminal stores a first mapping table between multiple groups of shake data of the first camera and multiple groups of correction data, and a second mapping table between the multiple groups of correction data of the first camera and multiple groups of offset data;
determining the first offset data of the stabilization lens of the first camera according to the shake data of the first camera includes:
according to the shake data of the first camera, obtaining from the first mapping table the correction data corresponding to the shake data as the correction data of the first camera;
according to the correction data of the first camera, obtaining from the second mapping table the offset data corresponding to the correction data as the first offset data of the stabilization lens of the first camera.
5. The dual camera anti-fluttering method as claimed in claim 1, characterized in that the mobile terminal stores a third mapping table between multiple groups of offset data of the stabilization lens of the first camera and multiple groups of offset data of the stabilization lens of the second camera;
determining the second offset data of the stabilization lens of the second camera according to the first offset data of the stabilization lens of the first camera includes:
according to the first offset data of the stabilization lens of the first camera, obtaining from the third mapping table the offset data of the stabilization lens of the second camera corresponding to the first offset data as the second offset data of the stabilization lens of the second camera.
6. The dual camera anti-fluttering method as claimed in claim 1, characterized in that determining the second offset data of the stabilization lens of the second camera according to the first offset data includes:
obtaining configuration information of the first camera and configuration information of the second camera;
calculating the second offset data of the stabilization lens of the second camera according to the first offset data, the configuration information of the first camera and the configuration information of the second camera.
7. The dual camera anti-fluttering method as claimed in claim 6, characterized in that calculating the second offset data of the stabilization lens of the second camera according to the first offset data, the configuration information of the first camera and the configuration information of the second camera includes:
if the first camera and the second camera are identical modules of the same structure, determining the first offset data to be the second offset data of the stabilization lens of the second camera;
if the first camera and the second camera are different modules with the same angle of view, determining the first offset data to be the second offset data of the stabilization lens of the second camera;
if one of the first camera and the second camera is a wide-angle camera and the other is a telephoto camera, dividing the first offset data by the angle of view of the first camera and multiplying by the angle of view of the second camera to obtain the second offset data of the stabilization lens of the second camera.
8. The dual camera anti-fluttering method as claimed in any one of claims 1 to 6, characterized in that the offset data includes a compensation direction and an offset angle;
controlling the stabilization lens of the first camera to compensate with the first offset data and controlling the stabilization lens of the second camera to compensate with the second offset data includes:
controlling the stabilization lens of the first camera to deflect by the first offset angle in the first compensation direction, and controlling the stabilization lens of the second camera to deflect by the second offset angle in the second compensation direction.
9. A mobile terminal, characterized in that the mobile terminal includes a memory, at least one processor, and at least one program stored on the memory and executable by the at least one processor, wherein the steps of the method according to any one of claims 1 to 8 are realized when the at least one program is executed by the at least one processor.
10. A computer-readable recording medium storing at least one computer-executable program, characterized in that, when executed by a computer, the at least one program causes the computer to perform the steps of the method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710912711.XA CN107682630A (en) | 2017-09-30 | 2017-09-30 | Dual camera anti-fluttering method, mobile terminal and computer-readable recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107682630A (en) | 2018-02-09 |
Family
ID=61138693
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710912711.XA Pending CN107682630A (en) | 2017-09-30 | 2017-09-30 | Dual camera anti-fluttering method, mobile terminal and computer-readable recording medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107682630A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105629427A (en) * | 2016-04-08 | 2016-06-01 | 东莞佩斯讯光电技术有限公司 | Stereo Digital Camera Based on Dual Controllable Lens Tilting Voice Coil Motors |
CN106060367A (en) * | 2016-07-29 | 2016-10-26 | 广东欧珀移动通信有限公司 | Dual-camera photographing control method, device and shooting device |
CN106686307A (en) * | 2016-12-28 | 2017-05-17 | 努比亚技术有限公司 | Shooting method and mobile terminal |
CN107071263A (en) * | 2016-12-30 | 2017-08-18 | 努比亚技术有限公司 | A kind of image processing method and terminal |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110213479A (en) * | 2019-04-30 | 2019-09-06 | 北京迈格威科技有限公司 | A kind of video capture anti-fluttering method and device |
CN110278372A (en) * | 2019-06-26 | 2019-09-24 | Oppo广东移动通信有限公司 | Anti-shake method and apparatus, electronic device, computer-readable storage medium |
CN113747097A (en) * | 2020-05-14 | 2021-12-03 | 北京小米移动软件有限公司 | Video processing method and device and storage medium |
CN111866377A (en) * | 2020-06-22 | 2020-10-30 | 上海摩象网络科技有限公司 | Stability augmentation control method and device and camera system |
CN113873157A (en) * | 2021-09-28 | 2021-12-31 | 维沃移动通信有限公司 | Shooting method, apparatus, electronic device and readable storage medium |
CN113873157B (en) * | 2021-09-28 | 2024-04-16 | 维沃移动通信有限公司 | Shooting method, device, electronic device and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20180209 |