
CN107885450B - Method for realizing mouse operations and mobile terminal - Google Patents

Method for realizing mouse operations and mobile terminal (Download PDF)

Info

Publication number
CN107885450B
CN107885450B (application CN201711098276.8A)
Authority
CN
China
Prior art keywords
target
touch area
effective touch
mouse
mobile terminal
Prior art date
Legal status
Active
Application number
CN201711098276.8A
Other languages
Chinese (zh)
Other versions
CN107885450A (en)
Inventor
李决定
陈实
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201711098276.8A priority Critical patent/CN107885450B/en
Publication of CN107885450A publication Critical patent/CN107885450A/en
Application granted granted Critical
Publication of CN107885450B publication Critical patent/CN107885450B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention discloses a method for realizing mouse operations and a mobile terminal. The method comprises: identifying a target touch operation performed by a user's finger in a target effective touch area among at least one effective touch area, where the at least one effective touch area is located on a target object outside the mobile terminal and no two of the effective touch areas overlap; determining a target mouse operation among the mouse operations corresponding to the target effective touch area, the target mouse operation corresponding to the target touch operation; and executing the target mouse operation. With the method of the embodiments of the invention, the mouse operation a user intends to perform can be determined from the user's touch operation on an object outside the mobile terminal and executed automatically, without the user having to perform the mouse operation directly on the mobile terminal. This avoids the inconvenience caused by the small physical size of the mobile terminal's display screen and improves the user's working efficiency and user experience.

Description

Method for realizing mouse operations and mobile terminal
Technical field
The present invention relates to the field of terminals, and in particular to a method for realizing mouse operations and a mobile terminal.
Background art
With the popularity of the mobile Internet and smartphones, it has become possible to work remotely by controlling a computer from a mobile phone. A user can connect to his or her office computer through remote desktop software on the phone and handle simple office tasks, for example browsing files on the computer or approving urgent workflows.
However, the physical size of a mobile phone display is very small. When the remote desktop of a computer is shown on it, icons and text are all very small, performing mouse operations on the phone is very inconvenient, and the user experience suffers.
Summary of the invention
Embodiments of the present invention provide a method for realizing mouse operations, to solve the problem that performing mouse operations on a mobile terminal is inconvenient and degrades the user experience.
To solve the above technical problem, the present invention is implemented as follows:
In a first aspect, a method for realizing mouse operations is provided, the method comprising:
identifying a target touch operation performed by a user's finger in a target effective touch area among at least one effective touch area, wherein the at least one effective touch area is located on a target object outside the mobile terminal, and no effective touch areas with an overlapping region exist among the at least one effective touch area;
determining a target mouse operation among the mouse operations corresponding to the target effective touch area, the target mouse operation corresponding to the target touch operation; and
executing the target mouse operation.
In a second aspect, a mobile terminal is provided, the mobile terminal comprising:
a recognition unit, configured to identify a target touch operation performed by a user's finger in a target effective touch area among at least one effective touch area, wherein the at least one effective touch area is located on a target object outside the mobile terminal, and no effective touch areas with an overlapping region exist among the at least one effective touch area;
a first determination unit, configured to determine a target mouse operation among the mouse operations corresponding to the target effective touch area, the target mouse operation corresponding to the target touch operation; and
a mouse operation execution unit, configured to execute the target mouse operation.
In a third aspect, a mobile terminal is provided, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the method for realizing mouse operations according to the first aspect.
In a fourth aspect, a computer-readable medium is provided, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method for realizing mouse operations according to the first aspect.
In the embodiments of the present invention, a target touch operation performed by a user's finger in a target effective touch area is identified, a target mouse operation corresponding to the target touch operation is determined among the mouse operations corresponding to the target effective touch area, and the target mouse operation is then executed. In this way, the mouse operation the user intends to perform can be determined from the user's touch operation on an object outside the mobile terminal and executed automatically, without the user performing the mouse operation directly on the mobile terminal. This avoids the inconvenience caused by the small physical size of the mobile terminal's display screen and improves the user's working efficiency and user experience.
Brief description of the drawings
The drawings described herein are provided for a further understanding of the present invention and constitute a part of the present invention. The illustrative embodiments of the present invention and their descriptions are used to explain the present invention and do not constitute an improper limitation of the present invention. In the drawings:
Fig. 1 is a schematic flowchart of a method for realizing mouse operations according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of an effective touch area according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of an effective touch area according to another embodiment of the present invention.
Fig. 4 is a schematic diagram of effective touch areas according to a further embodiment of the present invention.
Fig. 5 is a schematic diagram of the correspondence between effective touch areas and mouse operations according to an embodiment of the present invention.
Fig. 6 is a schematic diagram of the correspondence between effective touch areas and mouse operations according to another embodiment of the present invention.
Fig. 7 is a schematic diagram of the correspondence between effective touch areas and mouse operations according to a further embodiment of the present invention.
Fig. 8 is a schematic diagram of a finger movement operation and the corresponding mouse pointer movement according to an embodiment of the present invention.
Fig. 9 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention.
Fig. 10 is a schematic structural diagram of a mobile terminal according to another embodiment of the present invention.
Detailed description of embodiments
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Fig. 1 is a schematic flowchart of a method for realizing mouse operations according to an embodiment of the present invention. As shown in Fig. 1, the method comprises:
S110: identify a target touch operation performed by a user's finger in a target effective touch area among at least one effective touch area, where the at least one effective touch area is located on a target object outside the mobile terminal and no two of the effective touch areas overlap.
Optionally, in some embodiments, the target touch operation of the user's finger in the target effective touch area is identified through at least one camera provided on the mobile terminal, for example at least one front camera. The at least one front camera may be at least one high-definition front camera. For example, when the mobile terminal has only one front camera, the at least one high-definition front camera is a single high-definition camera; when the mobile terminal has two front cameras, it is a high-definition dual camera; and when the mobile terminal has more than two front cameras, it is a high-definition multi-camera.
Optionally, in other embodiments, the at least one high-definition front camera is equipped with a fill light. This enables the mobile terminal to identify the at least one effective touch area and the target touch operation of the user's finger in the target effective touch area even in a dark environment.
Optionally, in some embodiments, before S110 the mobile terminal needs to identify the at least one effective touch area. The mobile terminal may determine the at least one effective touch area according to the positions of a plurality of markers arranged on the target object.
It should be noted that the embodiments of the present invention do not limit the shape or color of the markers arranged on the target object. A marker may have any shape, such as a circle or a star, and may be regular or irregular. The plurality of markers may be objects with similar features, for example small balls twisted from white toilet paper. Any object whose features can be identified by the high-definition front camera can serve as a marker in the embodiments of the present invention.
Optionally, as an example, as shown in Fig. 2, take the case where the mobile terminal is a mobile phone and the target object is a desktop. The phone is placed on the desk, and four circular markers are placed on the desk. The front camera of the phone can clearly capture the four markers and can determine, from their positions, a quadrilateral region formed by the four markers; this quadrilateral region is one effective touch area. Alternatively, as shown in Fig. 3, the markers are star-shaped, and a quadrilateral region formed by the four star-shaped markers can likewise be determined from their positions; this quadrilateral region is also an effective touch area.
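By way of illustration only, the sketch below shows one way such a marker-defined area could be represented and queried once the four marker positions have been detected. It is a minimal sketch under stated assumptions: the `EffectiveTouchArea` class, the coordinate values, the action names, and the point-in-quadrilateral test are inventions of this sketch, not part of the disclosed embodiment.

```python
from typing import List, Tuple

Point = Tuple[float, float]


class EffectiveTouchArea:
    """A quadrilateral effective touch area defined by four marker positions.

    The corners are the marker centers detected in the camera image, given in
    clockwise or counter-clockwise order.
    """

    def __init__(self, corners: List[Point], mouse_action: str):
        assert len(corners) == 4
        self.corners = corners
        self.mouse_action = mouse_action  # e.g. "move_pointer", "left_click"

    def contains(self, p: Point) -> bool:
        """Return True if point p (a fingertip position in image coordinates)
        lies inside the quadrilateral. For a convex quadrilateral the point is
        inside when it lies on the same side of all four edges."""
        signs = []
        for i in range(4):
            ax, ay = self.corners[i]
            bx, by = self.corners[(i + 1) % 4]
            # z-component of the cross product (B - A) x (P - A)
            cross = (bx - ax) * (p[1] - ay) - (by - ay) * (p[0] - ax)
            signs.append(cross >= 0)
        return all(signs) or not any(signs)


# Example: four circular markers detected on the desk (image coordinates).
area = EffectiveTouchArea([(120, 80), (520, 90), (510, 400), (130, 390)],
                          mouse_action="move_pointer")
print(area.contains((300, 250)))  # True: fingertip inside the touch area
print(area.contains((50, 50)))    # False: fingertip outside
```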
Optionally, as another example, the target object may be a piece of paper, such as a sheet of A4 paper. The user may draw four dots on the A4 paper as four markers; the quadrilateral region formed by these four dots constitutes an effective touch area. The user can print out such a layout and carry it along, which is convenient to use.
Optionally, in other embodiments, the mobile terminal may determine the at least one effective touch area according to a plurality of color-filled regions arranged on the target object. No two effective touch areas among the at least one effective touch area correspond to color-filled regions with the same fill color, and the fill color of the color-filled region corresponding to each effective touch area is different from the color of the target object.
Optionally, as an example, the target object is a blank sheet of paper on which a plurality of color-filled regions are arranged, and at least one of these color-filled regions can be selected as the at least one effective touch area. For example, if the sheet carries a blue-filled region, a red-filled region, a green-filled region, and a black-filled region, the blue, red, and green regions may be selected as the at least one effective touch area. It should be noted that the number of effective touch areas can be determined according to actual needs, and the color-filled regions may be replaced with pattern-filled regions.
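As an illustration of how a fill color could be mapped to an effective touch area, the sketch below matches a color sampled under the fingertip against a small set of reference colors. The reference RGB values, the region names, and the nearest-color rule are assumptions of this sketch and are not specified by the embodiment.

```python
from typing import Dict, Tuple

RGB = Tuple[int, int, int]

# Hypothetical reference fill colors for the selected color-filled regions.
REFERENCE_COLORS: Dict[str, RGB] = {
    "blue_region": (0, 0, 255),
    "red_region": (255, 0, 0),
    "green_region": (0, 255, 0),
}


def classify_fill_color(sampled: RGB) -> str:
    """Return the name of the effective touch area whose reference fill color
    is closest (in squared RGB distance) to the color sampled under the finger."""
    def dist2(a: RGB, b: RGB) -> int:
        return sum((x - y) ** 2 for x, y in zip(a, b))

    return min(REFERENCE_COLORS, key=lambda name: dist2(REFERENCE_COLORS[name], sampled))


# A pixel sampled from the camera frame at the fingertip position.
print(classify_fill_color((20, 30, 200)))  # "blue_region"
```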
Optionally, in some embodiments, the shape of each effective touch area among the at least one effective touch area is a convex quadrilateral. Because a convex quadrilateral is closer to the shape of a computer or mobile terminal screen, it makes the simulation of mouse operations more convenient.
For example, as shown in Fig. 4, the four markers in each of the four small figures labeled 1, 2, 3, and 4 can form a convex quadrilateral, so the quadrilateral regions in those four figures can be regarded as effective touch areas. The four markers in the two small figures labeled 5 and 6 cannot form a convex quadrilateral, so they cannot form effective touch areas.
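One minimal way to check this convexity requirement is to verify that the cross products of consecutive edges all have the same sign. The sketch below assumes the four marker positions are given in the order in which they are connected; it is an illustration, not the disclosed detection procedure.

```python
from typing import List, Tuple

Point = Tuple[float, float]


def is_convex_quadrilateral(corners: List[Point]) -> bool:
    """Return True if the four ordered corner points form a convex quadrilateral.

    The quadrilateral is convex when the z-components of the cross products of
    every pair of consecutive edges share the same sign (no reflex corner)."""
    if len(corners) != 4:
        return False
    signs = []
    for i in range(4):
        ax, ay = corners[i]
        bx, by = corners[(i + 1) % 4]
        cx, cy = corners[(i + 2) % 4]
        cross = (bx - ax) * (cy - by) - (by - ay) * (cx - bx)
        signs.append(cross > 0)
    return all(signs) or not any(signs)


print(is_convex_quadrilateral([(0, 0), (4, 0), (4, 3), (0, 3)]))  # True: rectangle
print(is_convex_quadrilateral([(0, 0), (4, 0), (1, 1), (0, 3)]))  # False: reflex corner
```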
Optionally, in some embodiments, rules for judging touch operations may be preset. For example, an operation in which the user's finger contacts a position in the effective touch area for less than a first preset duration and is then lifted may be defined as a finger single-click operation. An operation in which the finger contacts the position for longer than the first preset duration may be defined as a finger movement operation; alternatively, an operation in which the finger keeps continuous contact after touching the effective touch area and moves within the effective touch area is defined as a finger movement operation. An operation in which the finger contacts a position in the effective touch area for less than the first preset duration, is lifted, and then contacts the same position again within a second preset duration counted from the moment the finger is lifted, is defined as a finger double-click operation. The first preset duration may be, for example, 2 s, and the second preset duration may also be, for example, 2 s.
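These judgment rules could be expressed as a small classifier over contact events, as in the sketch below. It is one possible reading of the rules: the 2 s thresholds come from the example above, while the `TouchEvent` record and the movement threshold are assumptions of this sketch.

```python
from dataclasses import dataclass

FIRST_PRESET_DURATION = 2.0   # seconds, example value from the embodiment
SECOND_PRESET_DURATION = 2.0  # seconds, example value from the embodiment
MOVE_THRESHOLD = 5.0          # pixels; assumed, not specified by the embodiment


@dataclass
class TouchEvent:
    """One finger contact, from touch-down to lift, observed by the camera."""
    duration: float             # how long the finger stayed in contact (s)
    travel: float               # distance the fingertip moved while in contact (px)
    gap_since_last_lift: float  # time since the previous contact ended (s)
    same_position_as_last: bool


def classify(event: TouchEvent) -> str:
    """Map a contact event to a finger operation according to the preset rules."""
    if event.duration >= FIRST_PRESET_DURATION or event.travel > MOVE_THRESHOLD:
        return "finger_move"
    if event.same_position_as_last and event.gap_since_last_lift <= SECOND_PRESET_DURATION:
        return "finger_double_click"
    return "finger_single_click"


print(classify(TouchEvent(0.3, 1.0, 10.0, False)))   # finger_single_click
print(classify(TouchEvent(0.3, 1.0, 0.8, True)))     # finger_double_click
print(classify(TouchEvent(3.0, 40.0, 10.0, False)))  # finger_move
```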
S120: determine a target mouse operation among the mouse operations corresponding to the target effective touch area, where the target mouse operation corresponds to the target touch operation.
Optionally, in some embodiments, the at least one effective touch area comprises at least two effective touch areas, and any two of the at least two effective touch areas correspond to different mouse operations; in other words, any two effective touch areas have different mouse functions.
For example, as shown in Fig. 5 and Fig. 6, four effective touch areas are formed by a plurality of circular markers, and the mouse operations corresponding to the four effective touch areas are, respectively: moving the mouse pointer, clicking the left mouse button, clicking the right mouse button, and scrolling the mouse wheel. Fig. 5 and Fig. 6 differ in the layout of the four effective touch areas; it will be appreciated that the layout of the effective touch areas can be set according to the user's habits and preferences.
Specifically, in Fig. 5 and Fig. 6, in the effective touch area whose corresponding mouse operation is moving the mouse pointer, a touch operation of the user can only correspond to a mouse-pointer-movement operation. In the effective touch area whose corresponding mouse operation is clicking the left mouse button, the left-button operation may include a single left click, a double left click, or multiple left clicks; the user's touch operation may correspond to any one of these, and in this area the user can only perform touch operations corresponding to left-button clicks. In the effective touch area whose corresponding mouse operation is clicking the right mouse button, the user's touch operation can only correspond to a right-button click. In the effective touch area whose corresponding mouse operation is scrolling the mouse wheel, the user's touch operation can only correspond to a wheel-scrolling operation, and the mobile terminal simulates the wheel-scrolling operation to scroll the page up and down. For example, if the phone is currently showing a page on the remotely connected computer that can be scrolled up and down (for example, in a browser), the mobile terminal can simulate a mouse-wheel operation to scroll that page on the remote computer.
Alternatively, in the case where the effective touch areas are determined according to the color-filled regions on the target object, color-filled regions with different fill colors correspond to different mouse operations. For example, as shown in Fig. 7, region 1 is a red-filled region and corresponds to moving the mouse pointer; region 2 is a green-filled region and corresponds to clicking the left mouse button; region 3 is a purple-filled region and corresponds to clicking the right mouse button; and region 4 is a blue-filled region and corresponds to scrolling the mouse wheel. In this case, the user can print out the layout of the effective touch areas on a color printer and carry it along, which is easy to use.
Optionally, in some embodiments, after the mobile terminal identifies the at least one effective touch area, it determines the mouse operation corresponding to each effective touch area and displays prompt information on the screen to tell the user which mouse operation each effective touch area corresponds to, so that the user can perform the appropriate touch operation in each effective touch area. Alternatively, the mobile terminal may set the mouse operation corresponding to each effective touch area according to setting information provided by the user, for example by receiving setting information input by the user and configuring the correspondence accordingly. Alternatively, after identifying the at least one effective touch area, the mobile terminal determines the mouse operation corresponding to each effective touch area and displays prompt information on the screen, then receives setting information input by the user and modifies the mouse operation corresponding to each effective touch area according to that setting information.
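One way to hold such a configurable correspondence is a plain dictionary that starts from defaults and is overwritten by the user's setting information, as sketched below. The default layout, the area names, and the action names are assumptions of this sketch, not a specification of the embodiment.

```python
# Default correspondence between effective touch areas and mouse operations
# (layout roughly as in Fig. 5; names are illustrative only).
DEFAULT_MAPPING = {
    "area_1": "move_pointer",
    "area_2": "left_click",
    "area_3": "right_click",
    "area_4": "scroll_wheel",
}


def apply_user_settings(defaults: dict, user_settings: dict) -> dict:
    """Return the effective mapping: defaults overridden by user-provided entries."""
    mapping = dict(defaults)
    mapping.update(user_settings)
    return mapping


# The user swaps the left-click and right-click areas via setting information.
mapping = apply_user_settings(DEFAULT_MAPPING, {"area_2": "right_click",
                                                "area_3": "left_click"})
print(mapping["area_2"])  # right_click
```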
S130: execute the target mouse operation.
Specifically, in some embodiments, the target mouse operation is executed according to a preset sensitivity. For example, if the target touch operation is a finger movement operation and the target mouse operation is a mouse-pointer-movement operation, the finger movement direction and finger movement distance corresponding to the finger movement operation are determined; then the mouse pointer movement distance is determined according to the finger movement distance and the preset sensitivity, where the preset sensitivity characterizes the correspondence between the finger movement distance and the mouse pointer movement distance; and the mouse pointer is then controlled to move the mouse pointer movement distance in the finger movement direction.
For example, as shown in Fig. 8, when the finger moves from the lower-left corner of the effective touch area toward the upper-right corner (in the direction of the arrow), the mouse pointer on the phone follows at the same angle (in the direction of the arrow) and moves a distance determined by the preset sensitivity. The preset sensitivity can be set by the user; for example, it can be set so that when the finger moves 3 cm within the effective touch area, the mouse pointer moves 900 pixels.
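The sensitivity rule can be read as a simple scale factor between physical finger travel and on-screen pointer travel. The sketch below uses the 3 cm to 900 px example from above, i.e. 300 px/cm; the vector representation and the function name are assumptions of this sketch.

```python
import math
from typing import Tuple

Vector = Tuple[float, float]

# Example from the embodiment: 3 cm of finger travel moves the pointer 900 px,
# i.e. a sensitivity of 300 pixels per centimetre.
SENSITIVITY_PX_PER_CM = 900.0 / 3.0


def pointer_displacement(finger_delta_cm: Vector,
                         sensitivity: float = SENSITIVITY_PX_PER_CM) -> Vector:
    """Scale the finger movement vector (in cm, measured on the target object)
    into a pointer movement vector (in px), preserving the movement direction."""
    dx, dy = finger_delta_cm
    distance_cm = math.hypot(dx, dy)
    if distance_cm == 0:
        return (0.0, 0.0)
    distance_px = distance_cm * sensitivity
    return (dx / distance_cm * distance_px, dy / distance_cm * distance_px)


# Finger moves 3 cm toward the upper right (equal x and y components).
print(pointer_displacement((3 / math.sqrt(2), 3 / math.sqrt(2))))
# ~ (636.4, 636.4): 900 px of total pointer travel in the same direction.
```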
With the method for realizing mouse operations shown in Fig. 1, a target touch operation performed by the user's finger in the target effective touch area is identified, a target mouse operation corresponding to the target touch operation is determined among the mouse operations corresponding to the target effective touch area, and the target mouse operation is then executed. The mouse operation the user intends to perform can thus be determined from the user's touch operation on an object outside the mobile terminal and executed automatically, without the user performing the mouse operation directly on the mobile terminal. This avoids the inconvenience caused by the small physical size of the mobile terminal's display screen and improves the user's working efficiency and user experience.
The method for realizing mouse operations according to the embodiments of the present invention has been described in detail above with reference to Fig. 1 to Fig. 8. A mobile terminal according to an embodiment of the present invention will be described in detail below with reference to Fig. 9. As shown in Fig. 9, the mobile terminal 90 comprises:
a recognition unit 91, configured to identify a target touch operation performed by a user's finger in a target effective touch area among at least one effective touch area, wherein the at least one effective touch area is located on a target object outside the mobile terminal, and no effective touch areas with an overlapping region exist among the at least one effective touch area;
a first determination unit 92, configured to determine a target mouse operation among the mouse operations corresponding to the target effective touch area, the target mouse operation corresponding to the target touch operation; and
a mouse operation execution unit 93, configured to execute the target mouse operation.
Optionally, as an embodiment, the at least one effective touch area comprises at least two effective touch areas, and any two effective touch areas among the at least two effective touch areas correspond to different mouse operations.
Optionally, as an embodiment, the mobile terminal 90 further comprises:
a receiving unit, configured to receive setting information input by the user; and
a setting unit, configured to set the mouse operation corresponding to each effective touch area according to the setting information.
Optionally, as an embodiment, the mobile terminal 90 further comprises:
a second determination unit, configured to determine the at least one effective touch area according to positions of a plurality of markers arranged on the target object.
Optionally, as an embodiment, the mobile terminal 90 further comprises:
a third determination unit, configured to determine the at least one effective touch area according to a plurality of color-filled regions arranged on the target object, wherein no two effective touch areas among the at least one effective touch area correspond to color-filled regions with the same fill color, and the fill color of the color-filled region corresponding to each of the at least one effective touch area is different from the color of the target object.
Optionally, as an embodiment, the shape of the at least one effective touch area is a convex quadrilateral.
Optionally, as an embodiment, the recognition unit 91 comprises:
a camera control subunit, configured to identify the target touch operation of the user's finger in the target effective touch area through at least one camera arranged on the mobile terminal.
Optionally, as an embodiment, the at least one camera has a fill light.
Optionally, as an embodiment, the target touch operation is a finger movement operation and the target mouse operation is a mouse-pointer-movement operation, and the mobile terminal 90 further comprises:
a fourth determination unit, configured to determine a finger movement direction and a finger movement distance corresponding to the finger movement operation;
the fourth determination unit being further configured to determine a mouse pointer movement distance according to the finger movement distance and a preset sensitivity, wherein the preset sensitivity is used to characterize the correspondence between the finger movement distance and the mouse pointer movement distance;
wherein the mouse operation execution unit 93 is specifically configured to:
control the mouse pointer to move the mouse pointer movement distance in the finger movement direction.
The mobile terminal provided in the embodiments of the present invention can implement each process of the method embodiment for realizing mouse operations in Fig. 1; to avoid repetition, details are not described here again.
In the embodiments of the present invention, the mobile terminal identifies a target touch operation performed by the user's finger in the target effective touch area, determines a target mouse operation corresponding to the target touch operation among the mouse operations corresponding to the target effective touch area, and then executes the target mouse operation. The mouse operation the user intends to perform can thus be determined from the user's touch operation on an object outside the mobile terminal and executed automatically, without the user performing the mouse operation directly on the mobile terminal. This avoids the inconvenience caused by the small physical size of the mobile terminal's display screen and improves the user's working efficiency and user experience.
Fig. 10 is a schematic diagram of the hardware structure of a mobile terminal implementing embodiments of the present invention. As shown in Fig. 10, the mobile terminal 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, a power supply 1011, and other components. Those skilled in the art will understand that the mobile terminal structure shown in Fig. 10 does not constitute a limitation on the mobile terminal; the mobile terminal may include more or fewer components than shown, or combine certain components, or have a different arrangement of components. In the embodiments of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a laptop computer, a palmtop computer, an in-vehicle terminal, a wearable device, a pedometer, and the like.
The processor 1010 identifies a target touch operation performed by the user's finger in a target effective touch area among at least one effective touch area, where the at least one effective touch area is located on a target object outside the mobile terminal and no two of the effective touch areas overlap; determines a target mouse operation among the mouse operations corresponding to the target effective touch area, the target mouse operation corresponding to the target touch operation; and executes the target mouse operation.
The mobile terminal according to the embodiments of the present invention identifies a target touch operation performed by the user's finger in the target effective touch area, determines a target mouse operation corresponding to the target touch operation among the operations corresponding to the target effective touch area, and then executes the target mouse operation. The mouse operation the user intends to perform can thus be determined from the user's touch operation on an object outside the mobile terminal and executed automatically, without the user performing the mouse operation directly on the mobile terminal. This avoids the inconvenience caused by the small physical size of the mobile terminal's display screen and improves the user's working efficiency and user experience.
It should be understood that, in the embodiments of the present invention, the radio frequency unit 1001 may be used to receive and send signals during the receiving and sending of information or during a call. Specifically, downlink data from a base station is received and passed to the processor 1010 for processing, and uplink data is sent to the base station. Generally, the radio frequency unit 1001 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 1001 may also communicate with networks and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband Internet access through the network module 1002, for example helping the user to send and receive e-mail, browse web pages, and access streaming media.
The audio output unit 1003 may convert audio data received by the radio frequency unit 1001 or the network module 1002, or stored in the memory 1009, into an audio signal and output it as sound. Moreover, the audio output unit 1003 may also provide audio output related to a specific function performed by the mobile terminal 1000 (for example, a call-signal reception sound or a message-reception sound). The audio output unit 1003 includes a loudspeaker, a buzzer, a receiver, and the like.
The input unit 1004 is used to receive audio or video signals. The input unit 1004 may include a graphics processing unit (GPU) 10041 and a microphone 10042. The graphics processing unit 10041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode. The processed image frames may be displayed on the display unit 1006. The image frames processed by the graphics processing unit 10041 may be stored in the memory 1009 (or another storage medium) or sent via the radio frequency unit 1001 or the network module 1002. The microphone 10042 can receive sound and process it into audio data. In telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 1001 and output.
The mobile terminal 1000 further includes at least one sensor 1005, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 10061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 10061 and/or the backlight when the mobile terminal 1000 is moved to the ear. As a kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes) and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the mobile terminal (such as portrait/landscape switching, related games, and magnetometer posture calibration) and for vibration-recognition-related functions (such as a pedometer or tapping). The sensor 1005 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, where the infrared sensor can measure the distance between an object and the mobile terminal by emitting and receiving infrared light; details are not described here.
The display unit 1006 is used to display information input by the user or information provided to the user. The display unit 1006 may include a display panel 10061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 1007 may be used to receive input numeric or character information and to generate key signal input related to user settings and function control of the mobile terminal. Specifically, the user input unit 1007 includes a touch panel 10071 and other input devices 10072. The touch panel 10071, also referred to as a touch screen, collects touch operations by the user on or near it (for example, operations by the user with a finger, a stylus, or any other suitable object or accessory on or near the touch panel 10071). The touch panel 10071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 1010, and receives and executes commands sent by the processor 1010. In addition, the touch panel 10071 may be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 10071, the user input unit 1007 may also include other input devices 10072. Specifically, the other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and a switch key), a trackball, a mouse, and a joystick; details are not described here.
Further, the touch panel 10071 may cover the display panel 10061. When the touch panel 10071 detects a touch operation on or near it, it transmits the operation to the processor 1010 to determine the type of the touch event, and the processor 1010 then provides a corresponding visual output on the display panel 10061 according to the type of the touch event. Although in Fig. 10 the touch panel 10071 and the display panel 10061 are shown as two independent components implementing the input and output functions of the mobile terminal, in some embodiments the touch panel 10071 and the display panel 10061 may be integrated to implement the input and output functions of the mobile terminal; this is not specifically limited here.
The interface unit 1008 is an interface through which an external device is connected to the mobile terminal 1000. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 1008 may be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements within the mobile terminal 1000, or may be used to transmit data between the mobile terminal 1000 and an external device.
The memory 1009 may be used to store software programs and various data. The memory 1009 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application required by at least one function (such as a sound playback function or an image playback function), and the like; and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 1009 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another solid-state storage device.
The processor 1010 is the control center of the mobile terminal. It connects all parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 1009 and calling data stored in the memory 1009, thereby monitoring the mobile terminal as a whole. The processor 1010 may include one or more processing units; preferably, the processor 1010 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, applications, and the like, and the modem processor mainly handles wireless communication. It will be understood that the modem processor may also not be integrated into the processor 1010.
The mobile terminal 1000 may further include a power supply 1011 (such as a battery) that supplies power to the various components. Preferably, the power supply 1011 may be logically connected to the processor 1010 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
In addition, the mobile terminal 1000 includes some functional modules that are not shown, which are not described here.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 1010, a memory 1009, and a computer program stored on the memory 1009 and executable on the processor 1010. When executed by the processor 1010, the computer program implements each process of the method embodiment shown in Fig. 1 and can achieve the same technical effect; to avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the method shown in Fig. 1 and can achieve the same technical effect; to avoid repetition, details are not described here again. The computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Those skilled in the art will understand that embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, and the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPUs), an input/output interface, a network interface, and a memory.
The memory may include non-permanent memory, random access memory (RAM), and/or non-volatile memory in a computer-readable medium, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may store information by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
It should also be noted that the terms "include", "comprise", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a list of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes the element.
Those skilled in the art will understand that embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, and optical storage) containing computer-usable program code.
The above are only embodiments of the present invention and are not intended to limit the present invention. Various modifications and variations of the present invention are possible for those skilled in the art. Any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the present invention shall fall within the scope of the claims of the present invention.

Claims (15)

1. A method for realizing mouse operations, characterized by comprising:
identifying a target touch operation performed by a user's finger in a target effective touch area among at least one effective touch area, wherein the at least one effective touch area is located on a target object outside a mobile terminal, and no effective touch areas with an overlapping region exist among the at least one effective touch area;
determining a target mouse operation among the mouse operations corresponding to the target effective touch area, the target mouse operation corresponding to the target touch operation; and
executing the target mouse operation.
2. The method according to claim 1, characterized in that the at least one effective touch area comprises at least two effective touch areas, and any two effective touch areas among the at least two effective touch areas correspond to different mouse operations.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
determining the at least one effective touch area according to positions of a plurality of markers arranged on the target object.
4. The method according to claim 1 or 2, characterized in that the method further comprises:
determining the at least one effective touch area according to a plurality of color-filled regions arranged on the target object, wherein no two effective touch areas among the at least one effective touch area correspond to color-filled regions with the same fill color, and the fill color of the color-filled region corresponding to each of the at least one effective touch area is different from the color of the target object.
5. The method according to claim 1 or 2, characterized in that the identifying a target touch operation performed by a user's finger in a target effective touch area among at least one effective touch area comprises:
identifying the target touch operation of the user's finger in the target effective touch area through at least one camera arranged on the mobile terminal.
6. The method according to claim 1 or 2, characterized in that the target touch operation is a finger movement operation and the target mouse operation is a mouse-pointer-movement operation, and the method further comprises:
determining a finger movement direction and a finger movement distance corresponding to the finger movement operation; and
determining a mouse pointer movement distance according to the finger movement distance and a preset sensitivity, wherein the preset sensitivity is used to characterize the correspondence between the finger movement distance and the mouse pointer movement distance;
wherein the executing the target mouse operation comprises:
controlling the mouse pointer to move the mouse pointer movement distance in the finger movement direction.
7. The method according to claim 1 or 2, characterized in that the mouse operations comprise at least one of the following: moving the mouse pointer, clicking the left mouse button, clicking the right mouse button, and scrolling the mouse wheel.
8. A mobile terminal, characterized by comprising:
a recognition unit, configured to identify a target touch operation performed by a user's finger in a target effective touch area among at least one effective touch area, wherein the at least one effective touch area is located on a target object outside the mobile terminal, and no effective touch areas with an overlapping region exist among the at least one effective touch area;
a first determination unit, configured to determine a target mouse operation among the mouse operations corresponding to the target effective touch area, the target mouse operation corresponding to the target touch operation; and
a mouse operation execution unit, configured to execute the target mouse operation.
9. The mobile terminal according to claim 8, characterized in that the at least one effective touch area comprises at least two effective touch areas, and any two effective touch areas among the at least two effective touch areas correspond to different mouse operations.
10. The mobile terminal according to claim 8 or 9, characterized in that the mobile terminal further comprises:
a second determination unit, configured to determine the at least one effective touch area according to positions of a plurality of markers arranged on the target object.
11. The mobile terminal according to claim 8 or 9, characterized in that the mobile terminal further comprises:
a third determination unit, configured to determine the at least one effective touch area according to a plurality of color-filled regions arranged on the target object, wherein no two effective touch areas among the at least one effective touch area correspond to color-filled regions with the same fill color, and the fill color of the color-filled region corresponding to each of the at least one effective touch area is different from the color of the target object.
12. The mobile terminal according to claim 8 or 9, characterized in that the recognition unit comprises:
a camera control subunit, configured to identify the target touch operation of the user's finger in the target effective touch area through at least one camera arranged on the mobile terminal.
13. The mobile terminal according to claim 8 or 9, characterized in that the target touch operation is a finger movement operation and the target mouse operation is a mouse-pointer-movement operation, and the mobile terminal further comprises:
a fourth determination unit, configured to determine a finger movement direction and a finger movement distance corresponding to the finger movement operation;
the fourth determination unit being further configured to determine a mouse pointer movement distance according to the finger movement distance and a preset sensitivity, wherein the preset sensitivity is used to characterize the correspondence between the finger movement distance and the mouse pointer movement distance;
wherein the mouse operation execution unit is specifically configured to:
control the mouse pointer to move the mouse pointer movement distance in the finger movement direction.
14. The mobile terminal according to claim 8 or 9, characterized in that the mouse operations comprise at least one of the following: moving the mouse pointer, clicking the left mouse button, clicking the right mouse button, and scrolling the mouse wheel.
15. A mobile terminal, characterized by comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the method for realizing mouse operations according to any one of claims 1 to 7.
CN201711098276.8A 2017-11-09 2017-11-09 Method for realizing mouse operations and mobile terminal Active CN107885450B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711098276.8A 2017-11-09 2017-11-09 CN107885450B (en) Method for realizing mouse operations and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711098276.8A 2017-11-09 2017-11-09 CN107885450B (en) Method for realizing mouse operations and mobile terminal

Publications (2)

Publication Number Publication Date
CN107885450A CN107885450A (en) 2018-04-06
CN107885450B true CN107885450B (en) 2019-10-15

Family

ID=61779778

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711098276.8A Active CN107885450B (en) 2017-11-09 2017-11-09 Method for realizing mouse operations and mobile terminal

Country Status (1)

Country Link
CN (1) CN107885450B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109327756B (en) * 2018-09-13 2020-09-22 歌尔科技有限公司 Charging box of wireless earphone and control method and device thereof
CN109902473B (en) * 2019-02-27 2021-09-07 Oppo广东移动通信有限公司 Pattern generating method, pattern generating device and mobile terminal
CN110908580B (en) * 2019-11-11 2021-11-02 广州视源电子科技股份有限公司 Method and device for controlling application

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102446032A (en) * 2010-09-30 2012-05-09 中国移动通信有限公司 Information input method and terminal based on camera
CN102591497A (en) * 2012-03-16 2012-07-18 上海达龙信息科技有限公司 Mouse simulation system and method on touch screen
CN102830819A (en) * 2012-08-21 2012-12-19 曾斌 Method and equipment for simulating mouse input
CN103530546A (en) * 2013-10-25 2014-01-22 东北大学 Identity authentication method based on mouse behaviors of user
CN104793744A (en) * 2015-04-16 2015-07-22 天脉聚源(北京)传媒科技有限公司 Gesture operation method and device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7761814B2 (en) * 2004-09-13 2010-07-20 Microsoft Corporation Flick gesture
US20100214218A1 (en) * 2009-02-20 2010-08-26 Nokia Corporation Virtual mouse
CN102141847A (en) * 2011-03-16 2011-08-03 梁庆生 Method for simulating mouse input
WO2014073345A1 (en) * 2012-11-09 2014-05-15 Sony Corporation Information processing device, information processing method and computer-readable recording medium
CN104808810B (en) * 2015-05-08 2017-12-01 三星电子(中国)研发中心 Carry out the method and mobile terminal of mouse input
CN105549883A (en) * 2015-12-09 2016-05-04 小米科技有限责任公司 Operation control method and device

Also Published As

Publication number Publication date
CN107885450A (en) 2018-04-06

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant