CN105389113B - Gesture-based application control method, apparatus and terminal - Google Patents
- Publication number
- CN105389113B (application CN201510740186.9A / CN201510740186A)
- Authority
- CN
- China
- Prior art keywords
- gesture
- information
- module
- grouping
- operation information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The disclosure relates to a gesture-based application control method, apparatus and terminal. The method includes: detecting gesture operation information input by a user in a preset area, where the preset area displays an application icon or an operation interface of an application; judging whether the gesture operation information is pre-stored standard gesture operation information; and, when the gesture operation information is judged to be the pre-stored standard gesture operation information, executing pre-stored gesture function information corresponding to the standard gesture operation information and the application. With the embodiments of the disclosure, when the gesture operation information detected in the preset area is pre-stored standard gesture operation information, the terminal can execute the gesture function information corresponding to that standard gesture operation information and the application. Through these embodiments, a user can manage the different gestures set by different applications, which simplifies what the user has to remember about gestures and makes operation convenient.
Description
Technical field
This disclosure relates to the field of terminal technology, and in particular to a gesture-based application control method, apparatus and terminal.
Background technology
At present, smart devices are developing rapidly and offer more and more functions, and the appearance of gesture operations has made them much more convenient to use. For example, sliding to the right on the screen can return to the previous interface, and sliding to the left on a list item can delete that list item.
In the related art, because of the convenience of gesture operations, many applications provide different gesture operation options so that users can operate the application through gestures. For example, in the Moments (circle of friends) feature of WeChat, a user can refresh the content posted by WeChat friends with a pull-down operation.
Summary of the invention
In order to solve the problems in the related art, the present disclosure provides a gesture-based application control method, apparatus and terminal.
According to a first aspect of the embodiments of the present disclosure, a gesture-based application control method is provided, including:
detecting gesture operation information input by a user in a preset area, where the preset area displays an application icon or an operation interface of an application;
judging whether the gesture operation information is pre-stored standard gesture operation information; and
when the gesture operation information is judged to be the pre-stored standard gesture operation information, executing pre-stored gesture function information corresponding to the standard gesture operation information and the application.
Optionally, the method further includes:
obtaining preset gesture information of a current application;
extracting gesture function information from the gesture information; and
storing the gesture information into a gesture grouping corresponding to the gesture function information, where the gesture groupings are set according to different gesture function information, and each gesture grouping is provided with corresponding standard gesture operation information.
Optionally, storing the gesture information into the pre-established gesture grouping corresponding to the gesture function information includes:
searching, based on the gesture function information, whether a gesture grouping corresponding to the gesture function information exists;
when no gesture grouping corresponding to the gesture function information exists, establishing a new gesture grouping based on the gesture function information;
storing the gesture information into the new gesture grouping; and
storing the standard gesture operation information set by the user for the new gesture grouping correspondingly into the new gesture grouping.
Optionally, the method further includes:
when a gesture grouping corresponding to the gesture function information exists, storing the gesture information into the gesture grouping corresponding to the gesture function information.
Optionally, judging whether the gesture operation information is pre-stored standard gesture operation information includes:
comparing the detected gesture operation information with each piece of pre-stored standard gesture operation information respectively, to obtain a similarity corresponding to each piece of standard gesture operation information; and
when at least one similarity is greater than a set threshold, judging that the gesture operation information is the pre-stored standard gesture operation information.
Optionally, executing the pre-stored gesture function information corresponding to the standard gesture operation information and the application includes:
determining the gesture grouping of the detected gesture operation information based on the standard gesture operation information;
reading the name of the application according to the name under which the application is registered in a system catalog file;
based on the name of the application, reading the gesture interface code in the gesture grouping that corresponds to the application and is used for realizing the gesture function information; and
running the read gesture interface code to execute the corresponding gesture function.
Optionally, the method further includes:
when each obtained similarity is less than the set threshold, obtaining the preset gesture information of the application from stored application information;
extracting, from the preset gesture information, the gesture interface code used for realizing the gesture function information; and
running the extracted gesture interface code to execute the corresponding gesture function.
Optionally, the method further includes:
detecting a preset operation in the current application;
displaying a gesture grouping module based on the preset operation; and
when an operation action on the gesture grouping module is detected, performing the step of obtaining the preset gesture information of the current application.
Optionally, the method further includes:
judging whether an operation action on the gesture grouping module of the current application is detected; and
when the operation action on the gesture grouping module is detected, performing the step of obtaining the preset gesture information of the current application.
According to a second aspect of the embodiments of the present disclosure, a gesture-based application control apparatus is provided, including:
a detection module configured to detect gesture operation information input by a user in a preset area, where the preset area displays an application icon or an operation interface of an application;
a judgment module configured to judge whether the gesture operation information detected by the detection module is pre-stored standard gesture operation information; and
an execution module configured to, when the judgment module judges that the gesture operation information is the pre-stored standard gesture operation information, execute pre-stored gesture function information corresponding to the standard gesture operation information and the application.
Optionally, the apparatus further includes:
an acquisition module configured to obtain preset gesture information of a current application;
an extraction module configured to extract gesture function information from the gesture information obtained by the acquisition module; and
a storage module configured to store the gesture information obtained by the acquisition module into a gesture grouping corresponding to the gesture function information extracted by the extraction module, where the gesture groupings are set according to different gesture function information, and each gesture grouping is provided with corresponding standard gesture operation information.
Optionally, the storage module includes:
a searching sub-module configured to search, based on the gesture function information, whether a gesture grouping corresponding to the gesture function information exists;
an establishing sub-module configured to, when the search result of the searching sub-module is that no gesture grouping corresponding to the gesture function information exists, establish a new gesture grouping based on the gesture function information;
a first storage sub-module configured to store the gesture information into the new gesture grouping established by the establishing sub-module; and
a second storage sub-module configured to store the standard gesture operation information set by the user for the new gesture grouping correspondingly into the new gesture grouping established by the establishing sub-module.
Optionally, the storage module further includes:
a third storage sub-module configured to, when the search result of the searching sub-module is that a gesture grouping corresponding to the gesture function information exists, store the gesture information into the gesture grouping corresponding to the gesture function information.
Optionally, the judgment module includes:
a comparison sub-module configured to compare the detected gesture operation information with each piece of pre-stored standard gesture operation information respectively, to obtain a similarity corresponding to each piece of standard gesture operation information; and
a judging sub-module configured to, when at least one similarity obtained by the comparison sub-module is greater than a set threshold, judge that the gesture operation information is the pre-stored standard gesture operation information.
Optionally, the execution module includes:
a grouping determination sub-module configured to determine the gesture grouping of the detected gesture operation information based on the standard gesture operation information;
a name reading sub-module configured to read the name of the application according to the name under which the application is registered in a system catalog file;
a code reading sub-module configured to, based on the name of the application read by the name reading sub-module, read the gesture interface code in the gesture grouping that corresponds to the application and is used for realizing the gesture function information; and
a first running sub-module configured to run the gesture interface code read by the code reading sub-module to execute the corresponding gesture function.
Optionally, the judgment module further includes:
a gesture information acquisition sub-module configured to, when every similarity obtained by the comparison sub-module is less than the set threshold, obtain the preset gesture information of the application from stored application information;
a code extraction sub-module configured to extract, from the preset gesture information obtained by the gesture information acquisition sub-module, the gesture interface code used for realizing the gesture function information; and
a second running sub-module configured to run the gesture interface code extracted by the code extraction sub-module to execute the corresponding gesture function.
Optionally, the apparatus further includes:
an operation detection module configured to detect a preset operation in the current application; and
a display module configured to display a gesture grouping module based on the preset operation detected by the operation detection module;
the acquisition module is then configured to obtain the preset gesture information of the current application when an operation action on the gesture grouping module displayed by the display module is detected.
Optionally, the apparatus further includes:
an operation judgment module configured to judge whether an operation action on the gesture grouping module of the current application is detected;
the acquisition module is then configured to obtain the preset gesture information of the current application when the operation action on the gesture grouping module is detected.
According to a third aspect of the embodiments of the present disclosure, a terminal is provided, including: a processor; and a memory for storing instructions executable by the processor; where the processor is configured to:
detect gesture operation information input by a user in a preset area, where the preset area displays an application icon or an operation interface of an application;
judge whether the gesture operation information is pre-stored standard gesture operation information; and
when the gesture operation information is judged to be the pre-stored standard gesture operation information, execute pre-stored gesture function information corresponding to the standard gesture operation information and the application.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects.
In the present disclosure, when the gesture operation information detected in the preset area is pre-stored standard gesture operation information, the terminal can execute the gesture function information corresponding to that standard gesture operation information and the application. Through these embodiments, a user can manage the different gestures set by different applications, which simplifies what the user has to remember about gestures and makes operation convenient.
In the present disclosure, the terminal can set up different gesture groupings according to different gesture function information and store each application's preset gesture information in the corresponding grouping, which makes it convenient for the user to manage different gestures by function.
In the present disclosure, when no gesture grouping corresponding to the current gesture function information exists, the terminal can establish a new grouping based on that gesture function information and set standard gesture information for the new grouping, so that gesture information with the same function is stored in the new grouping and the user can operate with the standard gesture information that has been set.
In the present disclosure, when a gesture grouping corresponding to the gesture function information already exists, the terminal can store the gesture information directly into the corresponding grouping. In other words, by setting groupings with corresponding standard gesture information, the user only needs to perform the same gesture for the same function, no matter whether the gestures set by the applications themselves are identical. This makes operation convenient and avoids the problem in the related art where different applications define different gestures for the same function, so that the user has to remember many gestures and may perform a wrong operation.
In the present disclosure, the terminal can compare the detected gesture operation information with the pre-stored standard gesture information and then compare the obtained similarity with a preset threshold to determine whether it is standard gesture information; this approach is easy to implement.
In the present disclosure, the terminal can determine, based on the gesture grouping and the application, the code used for realizing the gesture function, and running that code executes the corresponding gesture function.
In the present disclosure, when the detected gesture operation information is not standard gesture operation information, the terminal obtains the application's preset gesture information from the application information, and reads and runs the corresponding code. That is, the terminal first judges whether the detected gesture is a standard gesture; if so, it executes the corresponding function based on the pre-stored corresponding code, and if not, it executes the corresponding function based on the information preset by the application.
In the present disclosure, the terminal can display a gesture grouping module when it detects the preset operation, guiding the user into the gesture grouping setup flow for configuration.
In the present disclosure, the terminal can also obtain the application's preset gesture information directly when it detects an operation action on the gesture grouping module, guiding the user into the gesture grouping setup flow for configuration, so that a variety of setup approaches are available for the user to choose from.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
Fig. 1 is a flowchart of a gesture-based application control method according to an exemplary embodiment of the present disclosure.
Fig. 2 is a flowchart of another gesture-based application control method according to an exemplary embodiment of the present disclosure.
Fig. 3 is a schematic diagram of an application scenario of gesture-based application control according to an exemplary embodiment of the present disclosure.
Fig. 4 is a block diagram of a gesture-based application control apparatus according to an exemplary embodiment of the present disclosure.
Fig. 5 is a block diagram of another gesture-based application control apparatus according to an exemplary embodiment of the present disclosure.
Fig. 6 is a block diagram of another gesture-based application control apparatus according to an exemplary embodiment of the present disclosure.
Fig. 7 is a block diagram of another gesture-based application control apparatus according to an exemplary embodiment of the present disclosure.
Fig. 8 is a block diagram of another gesture-based application control apparatus according to an exemplary embodiment of the present disclosure.
Fig. 9 is a block diagram of another gesture-based application control apparatus according to an exemplary embodiment of the present disclosure.
Fig. 10 is a block diagram of another gesture-based application control apparatus according to an exemplary embodiment of the present disclosure.
Fig. 11 is a block diagram of another gesture-based application control apparatus according to an exemplary embodiment of the present disclosure.
Fig. 12 is a block diagram of another gesture-based application control apparatus according to an exemplary embodiment of the present disclosure.
Fig. 13 is a schematic structural diagram of an apparatus for gesture-based application control according to an exemplary embodiment of the present disclosure.
Detailed description of the embodiments
Exemplary embodiments will be described in detail here, examples of which are illustrated in the accompanying drawings. When the following description refers to the drawings, unless otherwise indicated, the same numerals in different drawings represent the same or similar elements. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as recited in the appended claims.
The terms used in the present disclosure are for the purpose of describing particular embodiments only and are not intended to limit the present disclosure. The singular forms "a", "an", "said" and "the" used in the present disclosure and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third and so on may be used in the present disclosure to describe various information, such information should not be limited by these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of the present disclosure, first information may also be referred to as second information, and similarly, second information may also be referred to as first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "while" or "in response to determining".
As shown in Fig. 1, Fig. 1 is a flowchart of a gesture-based application control method according to an exemplary embodiment. The method can be used in a terminal and includes the following steps.
Step 101: gesture operation information input by a user is detected in a first preset area, where the first preset area displays an application icon or an operation interface of an application.
The terminal in the present disclosure may be any intelligent terminal with Internet access, for example a mobile phone, a tablet computer, a PDA (Personal Digital Assistant) or the like. The terminal may access a router through a wireless local area network and access a server on the public network through the router.
Here, the first preset area is a region of the application that supports gesture operations by the user. It may be an interface of the application, for example the Moments interface in WeChat, on which the user's pull operation can be detected; the preset area may also be an application icon on the desktop, for example the icon area of an application on an iPhone, in which a long-press operation of the user can be detected.
Step 102: it is judged whether the gesture operation information is pre-stored standard gesture operation information.
In the present disclosure, the standard gesture operation information is gesture operation information set by the user to make operation convenient. Since many applications provide different gesture operation options, the gestures that different applications set for the same function are not necessarily unified, and the same gesture may correspond to different functions in different applications, which makes them hard for the user to remember and may cause confusion. Therefore, in the present disclosure the gesture information preset by applications is grouped, and each group is provided with standard gesture operation information, so that it is easy for the user to remember and operate.
Step 103: when the gesture operation information is judged to be pre-stored standard gesture operation information, the pre-stored gesture function information corresponding to the standard gesture operation information and the application is executed.
In the embodiments of the present disclosure, when the user performs a gesture operation in the first preset area supported by the application, the gesture recognition module of the terminal system identifies the gesture operation before the application does, compares and matches the gesture operation information with the preset standard gesture operation information, and, in the case of a successful match, executes the pre-stored function corresponding to the standard gesture. In this way the various gestures provided by each application can be managed in a unified manner, which makes it easy for the user to remember the various gesture operations, and the approach is simple to operate and easy to implement.
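A minimal sketch of this intercept-then-dispatch flow is given below; the interfaces GestureStore and AppGestureHandler and their method names are assumptions introduced for illustration, not part of the disclosure.

```java
// Hedged sketch of step 103: the system-level recognizer sees the gesture before the
// application and either runs the stored function or hands the gesture back to the app.
public final class GestureDispatcher {

    /** Looks up pre-stored standard gesture operation information and its function. */
    public interface GestureStore {
        boolean isStandardGesture(String gestureOperationInfo);
        Runnable functionFor(String gestureOperationInfo, String appName);
    }

    /** Fallback: the application handles the gesture with its own preset behaviour. */
    public interface AppGestureHandler {
        void handle(String gestureOperationInfo);
    }

    private final GestureStore store;
    private final AppGestureHandler appHandler;

    public GestureDispatcher(GestureStore store, AppGestureHandler appHandler) {
        this.store = store;
        this.appHandler = appHandler;
    }

    /** Called by the terminal's gesture recognition module before the application reacts. */
    public void onGesture(String gestureOperationInfo, String currentAppName) {
        if (store.isStandardGesture(gestureOperationInfo)) {
            Runnable function = store.functionFor(gestureOperationInfo, currentAppName);
            if (function != null) {
                function.run();   // execute the pre-stored function for this group and application
            }
        } else {
            appHandler.handle(gestureOperationInfo);   // not standard: let the application handle it
        }
    }
}
```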
As shown in Fig. 2, Fig. 2 is a flowchart of another gesture-based application control method according to an exemplary embodiment. The method can be used in a terminal and includes the following steps.
Step 201: the preset gesture information of the current application is obtained.
This step can be realized in the following ways.
In a first way, a preset operation, for example a pull operation, is detected in a second preset area of the current application. The second preset area may be a region of the application that supports gesture operations, and its position differs from that of the first preset area. A gesture grouping module, which may be a virtual button, is then displayed based on the detected preset operation. When an operation action on the gesture grouping module is detected, for example a tap or a long press, the preset gesture information of the current application is obtained. The preset gesture information of an application consists of the gestures set in advance by the application, which specify that a certain gesture action can execute a certain function; taking WeChat as an example again, the pull-down operation on the Moments interface corresponds to the application's default refresh function.
In a second way, it is judged whether an operation action on the gesture grouping module of the current application is detected; this gesture grouping module may be the same virtual button as in the first way, or a different virtual button. When the operation action on the gesture grouping module is detected, the step of obtaining the preset gesture information of the current application is performed.
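The two entry paths above can be sketched, purely for illustration, as the small flow below; the provider interface and all names are assumptions of the sketch.

```java
// Illustrative sketch of the two ways of reaching the preset gesture information:
// a preset operation reveals the gesture grouping control, and operating a grouping
// control retrieves the application's preset gesture information.
public final class GroupingEntryFlow {

    public interface GestureInfoProvider {
        /** Assumed to wrap the lookup of stored application information. */
        String presetGestureInfoFor(String appName);
    }

    private final GestureInfoProvider provider;
    private boolean groupingControlVisible = false;

    public GroupingEntryFlow(GestureInfoProvider provider) {
        this.provider = provider;
    }

    /** First way: a preset operation in the second preset area reveals the grouping control. */
    public void onPresetOperation() {
        groupingControlVisible = true;
    }

    /** True once the grouping control has been revealed by the first way. */
    public boolean isGroupingControlVisible() {
        return groupingControlVisible;
    }

    /** Either way: an operation action (tap, long press) on a gesture grouping control. */
    public String onGroupingControlOperated(String currentAppName) {
        return provider.presetGestureInfoFor(currentAppName);
    }
}
```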
In the embodiments of the present disclosure, when the user feels that the gesture operations preset by the application do not match his or her usage habits, or wants to use other gesture operations for other reasons, the user can call up the gesture grouping module, group the gestures of the application, and set the corresponding standard gesture operation information.
In this step of the disclosure, obtaining the preset gesture information of the current application may be done by reading the application name of the current application according to the name under which the application is registered in the system file directory, where the application name uniquely identifies the application; then, based on the application name that has been read, the preset gesture information of the current application is obtained from the stored application information. The preset gesture information may include gesture operation information, gesture function information, and the gesture interface code used for realizing the gesture function information.
The gesture interface code is the code information used for realizing the corresponding gesture function; in programming terms it refers to a "class name + method name", and different class names and method names can realize different application functions. Each preset gesture operation in an application correspondingly stores a gesture interface code used for realizing a certain function, for example returning to the previous interface or deleting.
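Since the gesture interface code is described as a "class name + method name", it could, for illustration, be executed through Java reflection roughly as follows; the "#" separator, the example string in the comment and the assumption of a public no-argument constructor and method are choices of this sketch, not of the disclosure.

```java
import java.lang.reflect.Method;

// Minimal reflection sketch for running a gesture interface code stored as
// "class name + method name". Example (hypothetical): "com.example.app.MomentsPage#refresh".
public final class GestureInterfaceCodeRunner {

    public static void runGestureCode(String gestureInterfaceCode) throws Exception {
        String[] parts = gestureInterfaceCode.split("#", 2);
        if (parts.length != 2) {
            throw new IllegalArgumentException("expected <class name>#<method name>");
        }
        Class<?> clazz = Class.forName(parts[0]);                     // the class name part
        Object target = clazz.getDeclaredConstructor().newInstance(); // assumes a public no-arg constructor
        Method method = clazz.getMethod(parts[1]);                    // assumes a public no-arg method
        method.invoke(target);                                        // realize the gesture function
    }
}
```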
Step 202: the gesture function information is extracted from the gesture information.
Step 203: the gesture information is stored into a gesture grouping corresponding to the gesture function information.
Here, the gesture groupings are set according to different gesture function information, and each gesture grouping is provided with corresponding standard gesture operation information.
In this step of the disclosure, a search can be made, based on the gesture function information extracted in step 202, for whether a gesture grouping corresponding to that gesture function information exists. If no gesture grouping is found, that is, no gesture grouping corresponding to the gesture function information exists, a corresponding new gesture grouping is established based on the gesture function information, and the gesture information is stored in the new gesture grouping; the user can then set standard gesture operation information for the new gesture grouping, and the terminal stores the standard gesture operation information set by the user for the new gesture grouping correspondingly in the new gesture grouping.
If a gesture grouping is found, that is, a gesture grouping corresponding to the gesture function information exists, the gesture information can be stored directly into the established gesture grouping corresponding to the gesture function information.
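A plain-Java sketch of this search / establish / store logic, under assumed names, might look as follows; the data layout is simplified and is not the disclosure's own structure.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of step 203: gesture groupings are keyed by gesture function information; a new
// grouping is established when none exists, and the application's gesture interface code
// is stored inside the grouping either way.
public final class GestureGroupStore {

    public static final class GestureGroup {
        public String standardGestureOperation;                        // set by the user
        public final Map<String, String> codeByApp = new HashMap<>();  // app name -> gesture interface code
    }

    private final Map<String, GestureGroup> groupsByFunction = new HashMap<>();

    /** Store one application's preset gesture information under its function grouping. */
    public void store(String gestureFunctionInfo, String appName,
                      String gestureInterfaceCode, String userStandardGestureIfNew) {
        GestureGroup group = groupsByFunction.get(gestureFunctionInfo);
        if (group == null) {
            // No grouping for this function yet: establish one and record the user's standard gesture.
            group = new GestureGroup();
            group.standardGestureOperation = userStandardGestureIfNew;
            groupsByFunction.put(gestureFunctionInfo, group);
        }
        // New or existing grouping: the application's gesture interface code is stored in it.
        group.codeByApp.put(appName, gestureInterfaceCode);
    }

    public GestureGroup groupFor(String gestureFunctionInfo) {
        return groupsByFunction.get(gestureFunctionInfo);
    }
}
```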
In the embodiments of the present disclosure, a gesture grouping list containing multiple gesture groupings set according to different gesture function information can be established through the above steps. Each gesture grouping stores the following information: the gesture group name, the standard gesture operation information, the application name, the gesture operation information preset by the application, and the gesture interface code. The gesture grouping list is shown in Table 1.
Table 1. Gesture grouping list
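Since the body of Table 1 is not reproduced here, its columns can be expressed, purely as an illustration, as one record per row; the field names are assumptions of this sketch.

```java
// One row of the gesture grouping list of Table 1, expressed as a plain record for illustration.
public final class GestureGroupEntry {
    public final String groupName;                  // gesture group name, e.g. a "return" group
    public final String standardGestureOperation;   // the user's standard gesture, e.g. "slide right"
    public final String appName;                    // the application's registered name
    public final String appPresetGestureOperation;  // the gesture the application itself defines
    public final String gestureInterfaceCode;       // "class name + method name" to run

    public GestureGroupEntry(String groupName, String standardGestureOperation, String appName,
                             String appPresetGestureOperation, String gestureInterfaceCode) {
        this.groupName = groupName;
        this.standardGestureOperation = standardGestureOperation;
        this.appName = appName;
        this.appPresetGestureOperation = appPresetGestureOperation;
        this.gestureInterfaceCode = gestureInterfaceCode;
    }
}
```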
As can be seen from the above description and Table 1, the embodiments of the present disclosure manage the gesture operations provided by different applications in unified groupings and set standard gesture information for each group, so the user only needs to remember the standard gesture operation that executes a certain function to achieve unified control over all applications that need to realize that function.
Step 204: gesture operation information input by the user is detected in the first preset area.
Here, the first preset area is a region of the application that supports the user's gesture operations and may display an application icon or an operation interface of the application.
Step 205: it is judged whether the gesture operation information is pre-stored standard gesture operation information.
In this step of the disclosure, the terminal can compare the detected gesture operation information with each piece of standard gesture operation information stored in advance in the gesture grouping list, obtaining a similarity corresponding to each piece of standard gesture operation information. When at least one similarity is greater than a set threshold, it is judged that the detected gesture operation information is standard gesture operation information stored in advance in the gesture grouping list.
If all the obtained similarities are less than the set threshold, the gesture operation information is not standard gesture operation information.
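The comparison of step 205 can be sketched as below; the similarity measure itself is not specified by the text, so it is left as an injected interface, and all names are assumptions of the sketch.

```java
import java.util.Collection;

// Sketch of step 205: compare the detected gesture operation information against each
// stored standard gesture operation and apply the set threshold.
public final class StandardGestureMatcher {

    /** Placeholder for whatever similarity measure the implementation uses (0.0 .. 1.0). */
    public interface Similarity {
        double between(String detected, String standard);
    }

    private final Similarity similarity;
    private final double threshold;

    public StandardGestureMatcher(Similarity similarity, double threshold) {
        this.similarity = similarity;
        this.threshold = threshold;
    }

    /** Returns the best matching standard gesture, or null if every similarity is below the threshold. */
    public String match(String detectedGesture, Collection<String> standardGestures) {
        String best = null;
        double bestScore = 0;
        for (String standard : standardGestures) {
            double score = similarity.between(detectedGesture, standard);
            if (score > threshold && score > bestScore) {   // above the threshold: a standard gesture
                bestScore = score;
                best = standard;
            }
        }
        return best;   // null: fall back to the application's own preset gesture handling
    }
}
```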
Step 206: when the gesture operation information is judged to be pre-stored standard gesture operation information, the pre-stored gesture function information corresponding to the standard gesture operation information and the application is executed.
When the gesture operation information is judged to be pre-stored standard gesture operation information, the terminal can determine the gesture grouping of the detected gesture operation information based on the standard gesture operation information; read the name of the application according to the name under which the application is registered in the system catalog file; then, based on the name of the application, read the gesture interface code in the gesture grouping that corresponds to the application and is used for realizing the gesture function information; and run the read gesture interface code to execute the corresponding gesture function.
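For illustration, step 206 could be tied together roughly as follows; the Group record, the CodeRunner callback and the way the application name is supplied (the system catalog file lookup is platform-specific) are all assumptions of this sketch.

```java
import java.util.Collection;
import java.util.Map;

// Sketch of step 206: the matched standard gesture selects a grouping, the current
// application's registered name selects the gesture interface code inside that grouping,
// and the code is run (for example with the reflection helper sketched earlier).
public final class StandardGestureExecutor {

    /** One grouping: its standard gesture operation plus per-application interface code. */
    public static final class Group {
        public final String standardGestureOperation;
        public final Map<String, String> codeByApp;   // application name -> gesture interface code

        public Group(String standardGestureOperation, Map<String, String> codeByApp) {
            this.standardGestureOperation = standardGestureOperation;
            this.codeByApp = codeByApp;
        }
    }

    /** How a gesture interface code is actually run is left to the caller. */
    public interface CodeRunner {
        void run(String gestureInterfaceCode) throws Exception;
    }

    public static void execute(String matchedStandardGesture, String currentAppName,
                               Collection<Group> groups, CodeRunner runner) throws Exception {
        for (Group group : groups) {
            if (!group.standardGestureOperation.equals(matchedStandardGesture)) {
                continue;                       // not the grouping for this standard gesture
            }
            String code = group.codeByApp.get(currentAppName);
            if (code != null) {
                runner.run(code);               // execute the corresponding gesture function
            }
            return;
        }
    }
}
```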
In other words, the gesture recognition module of the terminal system intercepts the user's gesture operation information before the application does, and when it judges that the gesture operation information is standard gesture operation information, it executes the pre-stored corresponding function rather than leaving the gesture operation information to be recognized by the application.
When the gesture operation information is judged not to be pre-stored standard gesture operation information, that is, when every obtained similarity is less than the set threshold, the preset gesture information of the application is obtained from the stored application information; the gesture interface code used for realizing the gesture function information is extracted from the preset gesture information; and the extracted gesture interface code is run to execute the corresponding gesture function.
In other words, when the gesture recognition module of the terminal system judges that the detected gesture operation information is not standard gesture operation information, the gesture operation information is recognized by the application program, and the function corresponding to that gesture operation information, as preset by the application, is executed.
As shown in Fig. 3, Fig. 3 is a schematic diagram of an application scenario of gesture-based application control according to an exemplary embodiment of the present disclosure. The scenario shown in Fig. 3 includes a smartphone serving as the terminal.
In the operation interface of an application on the smartphone, gesture operation information of a "slide right" input by the user is detected. The smartphone judges whether this gesture operation information is pre-stored standard gesture operation information. When it judges that the "slide right" is pre-stored standard gesture operation information, it determines, based on the standard gesture operation information, the gesture grouping of the detected "slide right", namely the "return" gesture grouping. The terminal then reads the name of the application, WeChat, according to the name under which the application is registered in the system catalog file. Based on the application name WeChat, the terminal reads, in the "return" gesture grouping, the gesture interface code corresponding to WeChat and used for realizing the return, runs the read gesture interface code, and executes the return function.
In the application scenario shown in Fig. 3, the detailed process of realizing gesture-based application control can be found in the foregoing description of Figs. 1-2 and is not repeated here.
Corresponding to the foregoing embodiments of the gesture-based application control method, the present disclosure also provides embodiments of a gesture-based application control apparatus and of a terminal to which it is applied.
As shown in Fig. 4, Fig. 4 is a block diagram of a gesture-based application control apparatus according to an exemplary embodiment of the present disclosure. The apparatus may include a detection module 410, a judgment module 420 and an execution module 430.
The detection module 410 is configured to detect gesture operation information input by a user in a preset area, where the preset area displays an application icon or an operation interface of an application.
The judgment module 420 is configured to judge whether the gesture operation information detected by the detection module 410 is pre-stored standard gesture operation information.
The execution module 430 is configured to, when the judgment module 420 judges that the gesture operation information is pre-stored standard gesture operation information, execute pre-stored gesture function information corresponding to the standard gesture operation information and the application.
In the above embodiment, when the gesture operation information detected in the preset area is pre-stored standard gesture operation information, the terminal can execute the gesture function information corresponding to that standard gesture operation information and the application. Through this embodiment, the user can manage the different gestures set by different applications, which simplifies what the user has to remember about gestures and makes operation convenient.
As shown in Fig. 5, Fig. 5 is a block diagram of another gesture-based application control apparatus according to an exemplary embodiment of the present disclosure. On the basis of the embodiment shown in Fig. 4, the apparatus further includes an acquisition module 440, an extraction module 450 and a storage module 460.
The acquisition module 440 is configured to obtain preset gesture information of a current application.
The extraction module 450 is configured to extract gesture function information from the gesture information obtained by the acquisition module 440.
The storage module 460 is configured to store the gesture information obtained by the acquisition module 440 into a gesture grouping corresponding to the gesture function information extracted by the extraction module 450, where the gesture groupings are set according to different gesture function information and each gesture grouping is provided with corresponding standard gesture operation information.
In the above embodiment, the terminal can set up different gesture groupings according to different gesture function information and store each application's preset gesture information in the corresponding grouping, which makes it convenient for the user to manage different gestures by function.
As shown in Fig. 6, Fig. 6 is a block diagram of another gesture-based application control apparatus according to an exemplary embodiment of the present disclosure. On the basis of the embodiment shown in Fig. 5, the storage module 460 may include a searching sub-module 461, an establishing sub-module 462, a first storage sub-module 463 and a second storage sub-module 464.
The searching sub-module 461 is configured to search, based on the gesture function information, whether a gesture grouping corresponding to the gesture function information exists.
The establishing sub-module 462 is configured to, when the search result of the searching sub-module 461 is that no gesture grouping corresponding to the gesture function information exists, establish a new gesture grouping based on the gesture function information.
The first storage sub-module 463 is configured to store the gesture information into the new gesture grouping established by the establishing sub-module 462.
The second storage sub-module 464 is configured to store the standard gesture operation information set by the user for the new gesture grouping correspondingly into the new gesture grouping established by the establishing sub-module 462.
In the above embodiment, when no gesture grouping corresponding to the current gesture function information exists, a new grouping can be established based on that gesture function information and standard gesture information can be set for the new grouping, so that gesture information with the same function is stored in the new grouping and the user can operate with the standard gesture information that has been set.
As shown in Fig. 7, Fig. 7 is a block diagram of another gesture-based application control apparatus according to an exemplary embodiment of the present disclosure. On the basis of the embodiment shown in Fig. 6, the storage module 460 may further include a third storage sub-module 465.
The third storage sub-module 465 is configured to, when the search result of the searching sub-module 461 is that a gesture grouping corresponding to the gesture function information exists, store the gesture information into the gesture grouping corresponding to the gesture function information.
In the above embodiment, when a gesture grouping corresponding to the gesture function information already exists, the gesture information can be stored directly into the corresponding grouping. In other words, by setting groupings with corresponding standard gesture information, the user only needs to perform the same gesture for the same function, no matter whether the gestures set by the applications are identical. This makes operation convenient and avoids the problem in the related art where different applications define different gestures for the same function, so that the user has to remember many gestures and may perform a wrong operation.
As shown in Fig. 8, Fig. 8 is a block diagram of another gesture-based application control apparatus according to an exemplary embodiment of the present disclosure. On the basis of the embodiment shown in Fig. 4, the judgment module 420 may include a comparison sub-module 421 and a judging sub-module 422.
The comparison sub-module 421 is configured to compare the detected gesture operation information with each piece of pre-stored standard gesture operation information respectively, to obtain a similarity corresponding to each piece of standard gesture operation information.
The judging sub-module 422 is configured to, when at least one similarity obtained by the comparison sub-module 421 is greater than a set threshold, judge that the gesture operation information is pre-stored standard gesture operation information.
In the above embodiment, the terminal can compare the detected gesture operation information with the pre-stored standard gesture information and then compare the obtained similarity with a preset threshold to determine whether it is standard gesture information; this approach is easy to implement.
As shown in Fig. 9, Fig. 9 is a block diagram of another gesture-based application control apparatus according to an exemplary embodiment of the present disclosure. On the basis of the embodiment shown in Fig. 4, the execution module 430 may include a grouping determination sub-module 431, a name reading sub-module 432, a code reading sub-module 433 and a first running sub-module 434.
The grouping determination sub-module 431 is configured to determine the gesture grouping of the detected gesture operation information based on the standard gesture operation information.
The name reading sub-module 432 is configured to read the name of the application according to the name under which the application is registered in the system catalog file.
The code reading sub-module 433 is configured to, based on the name of the application read by the name reading sub-module 432, read the gesture interface code in the gesture grouping that corresponds to the application and is used for realizing the gesture function information.
The first running sub-module 434 is configured to run the gesture interface code read by the code reading sub-module 433 to execute the corresponding gesture function.
In the above embodiment, the terminal can determine, based on the gesture grouping and the application, the code used for realizing the gesture function, and running that code executes the corresponding gesture function.
As shown in Fig. 10, Fig. 10 is a block diagram of another gesture-based application control apparatus according to an exemplary embodiment of the present disclosure. On the basis of the embodiment shown in Fig. 8, the judgment module 420 may further include a gesture information acquisition sub-module 423, a code extraction sub-module 424 and a second running sub-module 425.
The gesture information acquisition sub-module 423 is configured to, when every similarity obtained by the comparison sub-module 421 is less than the set threshold, obtain the preset gesture information of the application from the stored application information.
The code extraction sub-module 424 is configured to extract, from the preset gesture information obtained by the gesture information acquisition sub-module 423, the gesture interface code used for realizing the gesture function information.
The second running sub-module 425 is configured to run the gesture interface code extracted by the code extraction sub-module 424 to execute the corresponding gesture function.
In the above embodiment, when the detected gesture operation information is not standard gesture operation information, the terminal obtains the application's preset gesture information from the application information, and reads and runs the corresponding code. That is, the terminal first judges whether the detected gesture is a standard gesture; if so, it executes the corresponding function based on the pre-stored corresponding code, and if not, it executes the corresponding function based on the information preset by the application.
As shown in Fig. 11, Fig. 11 is a block diagram of another gesture-based application control apparatus according to an exemplary embodiment of the present disclosure. On the basis of the embodiment shown in Fig. 5, the apparatus may further include an operation detection module 480 and a display module 490.
The operation detection module 480 is configured to detect a preset operation in the current application.
The display module 490 is configured to display a gesture grouping module based on the preset operation detected by the operation detection module 480.
The acquisition module 440 is then configured to obtain the preset gesture information of the current application when an operation action on the gesture grouping module displayed by the display module 490 is detected.
In the above embodiment, the terminal can display a gesture grouping module when it detects the preset operation, guiding the user into the gesture grouping setup flow for configuration.
As shown in Fig. 12, Fig. 12 is a block diagram of another gesture-based application control apparatus according to an exemplary embodiment of the present disclosure. On the basis of the embodiment shown in Fig. 5, the apparatus may further include an operation judgment module 4100.
The operation judgment module 4100 is configured to judge whether an operation action on the gesture grouping module of the current application is detected.
The acquisition module 440 is then configured to obtain the preset gesture information of the current application when the operation action on the gesture grouping module is detected.
In the above embodiment, the terminal can also obtain the application's preset gesture information directly when it detects an operation action on the gesture grouping module, guiding the user into the gesture grouping setup flow for configuration, so that a variety of setup approaches are available for the user to choose from.
The embodiments of the gesture-based application control apparatus shown in Figs. 4 to 12 above can be applied in a terminal.
For the implementation process of the functions and effects of each unit in the above apparatuses, reference is made to the implementation process of the corresponding steps in the above methods, which is not repeated here.
Since the apparatus embodiments basically correspond to the method embodiments, reference can be made to the description of the method embodiments for the relevant parts. The apparatus embodiments described above are merely exemplary; the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present disclosure, which those of ordinary skill in the art can understand and implement without creative effort.
Correspondingly, the present disclosure also provides a terminal, which includes a processor and a memory for storing instructions executable by the processor, where the processor is configured to:
detect gesture operation information input by a user in a preset area, where the preset area displays an application icon or an operation interface of an application;
judge whether the gesture operation information is pre-stored standard gesture operation information; and
when the gesture operation information is judged to be pre-stored standard gesture operation information, execute pre-stored gesture function information corresponding to the standard gesture operation information and the application.
As shown in Fig. 13, Fig. 13 is a schematic structural diagram of an apparatus 1300 for gesture-based application control (terminal side) according to an exemplary embodiment of the present disclosure. For example, the apparatus 1300 may be a mobile phone with routing capability, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant and so on.
Referring to Fig. 13, the apparatus 1300 may include one or more of the following components: a processing component 1302, a memory 1304, a power component 1306, a multimedia component 1308, an audio component 1310, an input/output (I/O) interface 1312, a sensor component 1314 and a communication component 1316.
The processing component 1302 typically controls the overall operations of the apparatus 1300, such as operations associated with display, telephone calls, data communication, camera operation and recording. The processing component 1302 may include one or more processors 1320 to execute instructions so as to perform all or part of the steps of the above methods. In addition, the processing component 1302 may include one or more modules to facilitate interaction between the processing component 1302 and other components; for example, it may include a multimedia module to facilitate interaction between the multimedia component 1308 and the processing component 1302.
The memory 1304 is configured to store various types of data to support the operation of the apparatus 1300. Examples of such data include instructions for any application or method operated on the apparatus 1300, contact data, phonebook data, messages, pictures, video, and so on. The memory 1304 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk or an optical disk.
The power component 1306 provides power to the various components of the apparatus 1300. The power component 1306 may include a power management system, one or more power supplies, and other components associated with the generation, management and distribution of power for the apparatus 1300.
The multimedia component 1308 includes a screen providing an output interface between the apparatus 1300 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 1308 includes a front camera and/or a rear camera. When the apparatus 1300 is in an operation mode, such as a photographing mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focus and optical zoom capability.
The audio component 1310 is configured to output and/or input audio signals. For example, the audio component 1310 includes a microphone (MIC), which is configured to receive external audio signals when the apparatus 1300 is in an operation mode such as a call mode, a recording mode or a speech recognition mode. The received audio signals may be further stored in the memory 1304 or sent via the communication component 1316. In some embodiments, the audio component 1310 further includes a speaker for outputting audio signals.
The I/O interface 1312 provides an interface between the processing component 1302 and peripheral interface modules, which may be a keyboard, a click wheel, buttons and the like. These buttons may include, but are not limited to, a home button, a volume button, a start button and a lock button.
The sensor component 1314 includes one or more sensors for providing status assessments of various aspects of the apparatus 1300. For example, the sensor component 1314 can detect the open/closed state of the apparatus 1300 and the relative positioning of components (for example the display and the keypad of the apparatus 1300), and can also detect a change in position of the apparatus 1300 or of a component of the apparatus 1300, the presence or absence of contact between the user and the apparatus 1300, the orientation or acceleration/deceleration of the apparatus 1300, and a change in temperature of the apparatus 1300. The sensor component 1314 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 1314 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 1314 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, a microwave sensor or a temperature sensor.
The communication component 1316 is configured to facilitate wired or wireless communication between the device 1300 and other devices. The device 1300 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one exemplary embodiment, the communication component 1316 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 1316 further includes a near-field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 1300 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above method.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 1304 including instructions, executable by the processor 1320 of the device 1300 to perform the above method. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure that follow its general principles and include such departures from the present disclosure as come within common knowledge or customary technical means in the art. The specification and examples are to be considered exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
The above are merely preferred embodiments of the disclosure and are not intended to limit the disclosure. Any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the disclosure shall be included within the protection scope of the disclosure.
Claims (17)
1. An application control method based on gestures, characterized by comprising:
detecting gesture operation information input by a user in a preset area, the preset area displaying an application icon or an operation interface of an application;
judging whether the gesture operation information is pre-stored standard gesture operation information;
when the gesture operation information is judged to be the pre-stored standard gesture operation information, executing pre-stored gesture function information corresponding to the standard gesture operation information and the application;
the method further comprising:
obtaining preset gesture information of a current application;
extracting gesture function information from the gesture information;
storing the gesture information into a gesture group corresponding to the gesture function information, wherein gesture groups are set based on differences in gesture function information, and each gesture group is provided with corresponding standard gesture operation information.
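A minimal sketch of the detect/judge/execute flow recited above, written in Java for illustration only; GestureRepository, the Runnable-valued gesture functions, and all other names are assumptions rather than elements of the claim.

```java
/** Illustrative flow: detect a gesture, judge it against stored standards, execute its function. */
public class GestureControlFlow {

    /** Assumed storage abstraction for standard gestures and per-application gesture functions. */
    interface GestureRepository {
        /** Returns the stored gesture function for a standard gesture and application, or null. */
        Runnable findFunction(float[] standardGesture, String appName);
        /** All pre-stored standard gesture operation information. */
        Iterable<float[]> standardGestures();
    }

    private final GestureRepository repository;

    public GestureControlFlow(GestureRepository repository) {
        this.repository = repository;
    }

    /** Called when gesture operation information is detected in the preset area. */
    public void onGestureDetected(float[] gestureOperation, String currentApp) {
        for (float[] standard : repository.standardGestures()) {
            if (isStandardGesture(gestureOperation, standard)) {          // judging step
                Runnable function = repository.findFunction(standard, currentApp);
                if (function != null) {
                    function.run();                                        // executing step
                }
                return;
            }
        }
    }

    private boolean isStandardGesture(float[] detected, float[] standard) {
        // Placeholder for a similarity comparison against a set threshold.
        return java.util.Arrays.equals(detected, standard);
    }
}
```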
2. The method according to claim 1, characterized in that storing the gesture information into the pre-established gesture group corresponding to the gesture function information comprises:
searching, based on the gesture function information, whether a gesture group corresponding to the gesture function information exists;
when no gesture group corresponding to the gesture function information exists, establishing a new gesture group based on the gesture function information;
storing the gesture information into the new gesture group;
correspondingly storing the standard gesture operation information set by the user for the new gesture group into the new gesture group.
3. The method according to claim 2, characterized in that the method further comprises:
when a gesture group corresponding to the gesture function information exists, storing the gesture information into the gesture group corresponding to the gesture function information.
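A minimal sketch of the group lookup and storage behaviour of claims 2 and 3, under the assumption that gesture groups are keyed by gesture function; every class and field name is illustrative.

```java
import java.util.HashMap;
import java.util.Map;

/** Illustrative store: creates a gesture group on demand and files gesture information into it. */
class GestureGroupStore {

    static class GestureGroup {
        float[] standardGesture;                                   // standard gesture of the group
        Map<String, String> gestureInfoByApp = new HashMap<>();    // app name -> gesture information
    }

    private final Map<String, GestureGroup> groupsByFunction = new HashMap<>();

    void store(String gestureFunction, String appName, String gestureInfo,
               float[] userDefinedStandardGesture) {
        GestureGroup group = groupsByFunction.get(gestureFunction);   // search for an existing group
        if (group == null) {
            // No matching group: establish a new one for this gesture function (claim 2).
            group = new GestureGroup();
            group.standardGesture = userDefinedStandardGesture;       // set by the user for the new group
            groupsByFunction.put(gestureFunction, group);
        }
        // A matching group exists, or was just created: store the gesture information in it (claim 3).
        group.gestureInfoByApp.put(appName, gestureInfo);
    }
}
```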
4. The method according to claim 1, characterized in that judging whether the gesture operation information is pre-stored standard gesture operation information comprises:
comparing the detected gesture operation information with each piece of pre-stored standard gesture operation information respectively, to obtain a similarity corresponding to each piece of standard gesture operation information;
when at least one similarity is greater than a set threshold, judging that the gesture operation information is the pre-stored standard gesture operation information.
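A sketch of the similarity judgement in claim 4, assuming gestures are resampled to fixed-length, normalized traces; the similarity measure and the 0.85 threshold are illustrative choices, not values given by the claim.

```java
/** Illustrative matcher: compares a detected gesture against each stored standard gesture. */
public final class GestureMatcher {

    private static final double SIMILARITY_THRESHOLD = 0.85;   // "set threshold" (assumed value)

    /** Returns true if at least one similarity exceeds the set threshold. */
    public static boolean isStandardGesture(float[] detected, float[][] standards) {
        for (float[] standard : standards) {
            if (similarity(detected, standard) > SIMILARITY_THRESHOLD) {
                return true;
            }
        }
        return false;
    }

    /** Similarity in [0, 1]: 1 minus the mean point-wise distance of equally sampled traces. */
    static double similarity(float[] a, float[] b) {
        int n = Math.min(a.length, b.length);
        double sum = 0.0;
        for (int i = 0; i < n; i++) {
            sum += Math.abs(a[i] - b[i]);
        }
        double meanDistance = (n == 0) ? 1.0 : sum / n;
        return Math.max(0.0, 1.0 - meanDistance);   // assumes coordinates normalized to [0, 1]
    }
}
```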
5. The method according to claim 1, characterized in that executing the pre-stored gesture function information corresponding to the standard gesture operation information and the application comprises:
determining the gesture group of the detected gesture operation information based on the standard gesture operation information;
reading the name of the application from a system catalog file according to the name registered by the application;
based on the name of the application, reading the gesture interface code in the gesture group that corresponds to the application and is used to realize the gesture function information;
running the read gesture interface code to execute the corresponding gesture function.
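A sketch of the execution path in claim 5, modelling the system catalog file as a name map and the gesture interface code as a stored Runnable; both are simplifying assumptions made only for illustration.

```java
import java.util.Map;

/** Illustrative executor: resolves the application name, then runs the group's interface code. */
class GestureExecutor {

    private final Map<String, String> systemCatalog;                  // registered name -> application name
    private final Map<String, Map<String, Runnable>> interfaceCode;   // group id -> (application name -> code)

    GestureExecutor(Map<String, String> systemCatalog,
                    Map<String, Map<String, Runnable>> interfaceCode) {
        this.systemCatalog = systemCatalog;
        this.interfaceCode = interfaceCode;
    }

    void execute(String groupId, String registeredName) {
        String appName = systemCatalog.get(registeredName);           // read the application name
        if (appName == null) {
            return;
        }
        Map<String, Runnable> groupCode = interfaceCode.get(groupId); // gesture group determined earlier
        Runnable code = (groupCode == null) ? null : groupCode.get(appName);
        if (code != null) {
            code.run();                                               // run the gesture interface code
        }
    }
}
```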
6. The method according to claim 4, characterized in that the method further comprises:
when each obtained similarity is less than the set threshold, obtaining the preset gesture information of the application from stored application information;
extracting, from the preset gesture information, the gesture interface code used to realize the gesture function information;
running the extracted gesture interface code to execute the corresponding gesture function.
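A sketch of the fallback in claim 6, taken when no stored standard gesture matches and the application's own preset gesture information is used instead; ApplicationInfo and its fields are assumed for illustration.

```java
import java.util.HashMap;
import java.util.Map;

/** Illustrative fallback: runs the application's own preset gesture interface code. */
class GestureFallback {

    static class ApplicationInfo {
        Map<String, Runnable> presetGestureCode = new HashMap<>();   // gesture function -> interface code
    }

    private final Map<String, ApplicationInfo> storedApplicationInfo;   // application name -> stored info

    GestureFallback(Map<String, ApplicationInfo> storedApplicationInfo) {
        this.storedApplicationInfo = storedApplicationInfo;
    }

    void handleUnmatchedGesture(String appName, String gestureFunction) {
        ApplicationInfo info = storedApplicationInfo.get(appName);      // obtain preset gesture information
        if (info == null) {
            return;
        }
        Runnable code = info.presetGestureCode.get(gestureFunction);    // extract the interface code
        if (code != null) {
            code.run();                                                 // execute the gesture function
        }
    }
}
```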
7. The method according to claim 1, characterized in that the method further comprises:
detecting a preset operation in the current application;
displaying a gesture grouping module based on the preset operation;
when an operation action on the gesture grouping module is detected, executing the step of obtaining the preset gesture information of the current application.
8. The method according to claim 1, characterized in that the method further comprises:
judging whether an operation action on the gesture grouping module of the current application is detected;
when the operation action on the gesture grouping module is detected, executing the step of obtaining the preset gesture information of the current application.
9. An application control apparatus based on gestures, characterized by comprising:
a detection module configured to detect gesture operation information input by a user in a preset area, the preset area displaying an application icon or an operation interface of an application;
a judgment module configured to judge whether the gesture operation information detected by the detection module is pre-stored standard gesture operation information;
an execution module configured to, when the judgment module judges that the gesture operation information is the pre-stored standard gesture operation information, execute pre-stored gesture function information corresponding to the standard gesture operation information and the application;
the apparatus further comprising:
an acquisition module configured to obtain preset gesture information of a current application;
an extraction module configured to extract gesture function information from the gesture information obtained by the acquisition module;
a storage module configured to store the gesture information obtained by the acquisition module into the gesture group corresponding to the gesture function information extracted by the extraction module, wherein gesture groups are set based on differences in gesture function information, and each gesture group is provided with corresponding standard gesture operation information.
10. The apparatus according to claim 9, characterized in that the storage module comprises:
a lookup submodule configured to search, based on the gesture function information, whether a gesture group corresponding to the gesture function information exists;
an establishing submodule configured to, when the lookup result of the lookup submodule is that no gesture group corresponding to the gesture function information exists, establish a new gesture group based on the gesture function information;
a first storage submodule configured to store the gesture information into the new gesture group established by the establishing submodule;
a second storage submodule configured to correspondingly store the standard gesture operation information set by the user for the new gesture group into the new gesture group established by the establishing submodule.
11. The apparatus according to claim 10, characterized in that the storage module further comprises:
a third storage submodule configured to, when the lookup result of the lookup submodule is that a gesture group corresponding to the gesture function information exists, store the gesture information into the gesture group corresponding to the gesture function information.
12. The apparatus according to claim 9, characterized in that the judgment module comprises:
a comparison submodule configured to compare the detected gesture operation information with each piece of pre-stored standard gesture operation information respectively, to obtain a similarity corresponding to each piece of standard gesture operation information;
a judging submodule configured to, when at least one similarity obtained by the comparison submodule is greater than a set threshold, judge that the gesture operation information is the pre-stored standard gesture operation information.
13. The apparatus according to claim 9, characterized in that the execution module comprises:
a group determination submodule configured to determine the gesture group of the detected gesture operation information based on the standard gesture operation information;
a name reading submodule configured to read the name of the application from a system catalog file according to the name registered by the application;
a code reading submodule configured to, based on the name of the application read by the name reading submodule, read the gesture interface code in the gesture group that corresponds to the application and is used to realize the gesture function information;
a first running submodule configured to run the gesture interface code read by the code reading submodule to execute the corresponding gesture function.
14. The apparatus according to claim 12, characterized in that the judgment module further comprises:
a gesture information acquisition submodule configured to, when each similarity obtained by the comparison submodule is less than the set threshold, obtain the preset gesture information of the application from stored application information;
a code extraction submodule configured to extract, from the preset gesture information obtained by the gesture information acquisition submodule, the gesture interface code used to realize the gesture function information;
a second running submodule configured to run the gesture interface code extracted by the code extraction submodule to execute the corresponding gesture function.
15. The apparatus according to claim 9, characterized in that the apparatus further comprises:
an operation detection module configured to detect a preset operation in the current application;
a display module configured to display a gesture grouping module based on the preset operation detected by the operation detection module;
wherein the acquisition module is configured to obtain the preset gesture information of the current application when an operation action on the gesture grouping module displayed by the display module is detected.
16. The apparatus according to claim 9, characterized in that the apparatus further comprises:
an operation judging module configured to judge whether an operation action on the gesture grouping module of the current application is detected;
wherein the acquisition module is configured to obtain the preset gesture information of the current application when the operation action on the gesture grouping module is detected.
17. A terminal, characterized by comprising: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to:
detect gesture operation information input by a user in a preset area, the preset area displaying an application icon or an operation interface of an application;
judge whether the gesture operation information is pre-stored standard gesture operation information;
when the gesture operation information is judged to be the pre-stored standard gesture operation information, execute pre-stored gesture function information corresponding to the standard gesture operation information and the application;
obtain preset gesture information of a current application;
extract gesture function information from the gesture information;
store the gesture information into a gesture group corresponding to the gesture function information, wherein gesture groups are set based on differences in gesture function information, and each gesture group is provided with corresponding standard gesture operation information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510740186.9A CN105389113B (en) | 2015-11-03 | 2015-11-03 | Application control method, apparatus based on gesture and terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510740186.9A CN105389113B (en) | 2015-11-03 | 2015-11-03 | Application control method, apparatus based on gesture and terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105389113A CN105389113A (en) | 2016-03-09 |
CN105389113B true CN105389113B (en) | 2018-09-04 |
Family
ID=55421435
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510740186.9A Active CN105389113B (en) | 2015-11-03 | 2015-11-03 | Application control method, apparatus based on gesture and terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105389113B (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106095277B (en) * | 2016-06-22 | 2020-12-15 | 惠州Tcl移动通信有限公司 | Method and system for realizing double-click event strategy selection processing by mobile terminal |
CN106325704A (en) * | 2016-08-06 | 2017-01-11 | 广东欧珀移动通信有限公司 | A method and device for unifying sliding functions in video playback applications |
CN107918490B (en) * | 2017-11-22 | 2022-05-31 | 联想(北京)有限公司 | Control method and electronic equipment |
CN109885444B (en) * | 2019-01-16 | 2022-03-15 | 深圳壹账通智能科技有限公司 | Testing method and device based on gesture recognition, storage medium and terminal equipment |
CN110069132A (en) * | 2019-03-28 | 2019-07-30 | 努比亚技术有限公司 | Application control method, intelligent wearable device and computer readable storage medium |
CN110825295B (en) * | 2019-11-05 | 2021-07-13 | 维沃移动通信有限公司 | Application program control method and electronic equipment |
CN111290579B (en) * | 2020-02-10 | 2022-05-20 | Oppo广东移动通信有限公司 | Control method and device of virtual content, electronic equipment and computer readable medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102622116A (en) * | 2011-01-30 | 2012-08-01 | 联咏科技股份有限公司 | Single-finger gesture judgment method, touch control sensing control chip and touch control system |
CN102810023A (en) * | 2011-06-03 | 2012-12-05 | 联想(北京)有限公司 | Gesture recognition method and terminal equipment |
CN104536563A (en) * | 2014-12-12 | 2015-04-22 | 林云帆 | Electronic equipment control method and system |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102182297B1 (en) * | 2014-01-13 | 2020-11-24 | 삼성전자 주식회사 | Method Of Finger Scan And Mobile Terminal Supporting The Same |
- 2015-11-03 CN CN201510740186.9A patent/CN105389113B/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102622116A (en) * | 2011-01-30 | 2012-08-01 | 联咏科技股份有限公司 | Single-finger gesture judgment method, touch control sensing control chip and touch control system |
CN102810023A (en) * | 2011-06-03 | 2012-12-05 | 联想(北京)有限公司 | Gesture recognition method and terminal equipment |
CN104536563A (en) * | 2014-12-12 | 2015-04-22 | 林云帆 | Electronic equipment control method and system |
Also Published As
Publication number | Publication date |
---|---|
CN105389113A (en) | 2016-03-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105389113B (en) | Application control method, apparatus based on gesture and terminal | |
CN105204742B (en) | Control method, device and the terminal of electronic equipment | |
CN104820675B (en) | Photograph album display methods and device | |
CN104951500B (en) | Method for displaying head portrait, device and terminal | |
CN104486451B (en) | Application program recommends method and device | |
CN105204356B (en) | Display methods, device and the terminal of application | |
CN105183835B (en) | The method and device of information flag in social software | |
CN105099841B (en) | Sending method, device, terminal and the router of message | |
CN105278841B (en) | Control method and device for terminal device | |
CN105120084A (en) | Image-based communication method and apparatus | |
CN108496317A (en) | The lookup method and device of the public resource set of remaining critical system information | |
CN107769881A (en) | Information synchronization method, apparatus and system, storage medium | |
CN105549960B (en) | Control the method and device of camera | |
CN106791092A (en) | The searching method and device of contact person | |
CN105487805A (en) | Object operating method and device | |
CN105912258A (en) | Method and device for operation processing | |
CN108733807A (en) | Search the method and device of photo | |
CN106802808A (en) | Suspension button control method and device | |
CN105847124B (en) | Method, apparatus, server and terminal for being thumbed up to social network information | |
CN109922098A (en) | A kind of content share method, device and the device for content share | |
CN105468281B (en) | The method and apparatus for showing set interface | |
CN107132769A (en) | Smart machine control method and device | |
CN106453032A (en) | Information pushing method, device and system | |
CN105187671A (en) | Recording method and device | |
CN105430469B (en) | Playback method, device, terminal and the server of audio, video data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |