CN105138259B - Operation execution method and device - Google Patents
Operation execution method and device
- Publication number
- CN105138259B CN105138259B CN201510440950.0A CN201510440950A CN105138259B CN 105138259 B CN105138259 B CN 105138259B CN 201510440950 A CN201510440950 A CN 201510440950A CN 105138259 B CN105138259 B CN 105138259B
- Authority
- CN
- China
- Prior art keywords
- input
- keyboard
- application
- instruction
- user interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Abstract
The present disclosure relates to an operation execution method and apparatus, belonging to the field of terminal technologies. The method includes: displaying a user interface of a first application; when a keyboard call-out indication is detected, displaying an input keyboard overlaid on the user interface; acquiring input content entered by the user through the input keyboard according to target content in the user interface; acquiring an operation instruction corresponding to the input content; and executing, through a second application, an operation corresponding to the operation instruction. The disclosure solves the problem in the related art that, when target content contained in the user interface of a first application needs to be used in a second application, the user has to switch between the first application and the second application, making the operation cumbersome and inefficient; it simplifies the operation and improves operating efficiency.
Description
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to an operation execution method and apparatus.
Background
When using a terminal such as a mobile phone, a user often encounters the following operational requirement: target content contained in the user interface of a first application needs to be used in a second application.
For example, a short message interface displayed by the terminal contains a telephone number, and the user needs to dial that number through a telephone application. In this case, the user first has to note down the telephone number from memory or with pen and paper, switch to the telephone application, enter the noted number on the dial pad provided by the telephone application, and then perform the dialing operation.
However, the above method requires switching between the first application and the second application, which results in a cumbersome operation and low efficiency.
Disclosure of Invention
In order to overcome the problems in the related art, embodiments of the present disclosure provide an operation execution method and apparatus. The technical scheme is as follows:
according to a first aspect of embodiments of the present disclosure, there is provided an operation execution method, the method including:
displaying a user interface of a first application;
when a keyboard call-out indication is detected, displaying an input keyboard overlaid on the user interface;
acquiring input content entered by the user through the input keyboard according to target content in the user interface;
acquiring an operation instruction corresponding to the input content;
and executing the operation corresponding to the operation instruction through the second application.
Optionally, the transparency of the input keyboard is 0, the area of the input keyboard is smaller than the area of the screen, and the user interface and/or the input keyboard is draggable.
Optionally, the method further comprises:
acquiring a first dragging instruction corresponding to the user interface, and adjusting the display position of the content in the user interface in the screen according to the first dragging instruction;
and/or,
and acquiring a second dragging indication corresponding to the input keyboard, and adjusting the display position of the input keyboard in the screen according to the second dragging indication.
Optionally, the transparency of the input keyboard is greater than 0 and less than 1, and the area of the input keyboard is less than or equal to the area of the screen.
Optionally, the executing, by the second application, an operation corresponding to the operation instruction includes:
when the operation instruction is a dialing instruction, dialing a telephone number corresponding to the input content through the second application; or,
when the operation instruction is a storage instruction, storing the input content through the second application; or,
when the operation instruction is a sending instruction, sending the input content to a target user through the second application; or,
and when the operation instruction is a search instruction, searching information corresponding to the input content through the second application.
Optionally, the input keyboard includes at least one operation control, and the at least one operation control includes at least one of a dialing control, a storage control, a sending control, and a search control;
the acquiring of the operation instruction corresponding to the input content includes:
when a trigger signal corresponding to a target operation control is detected, acquiring an operation instruction corresponding to the target operation control;
wherein the target operational control is one of the at least one operational control.
Optionally, the keyboard call-out indication is triggered by shaking the device; or,
the keyboard call-out indication is triggered by a touch operation; or,
the keyboard call-out indication is triggered by a voice signal; or,
the keyboard call-out indication is triggered by operating a physical key.
According to a second aspect of the embodiments of the present disclosure, there is provided an operation performing apparatus including:
an interface display module configured to display a user interface of a first application;
a keyboard display module configured to display an input keyboard overlaid on the user interface when a keyboard call-out indication is detected;
the content acquisition module is configured to acquire input content input by a user through the input keyboard according to target content in the user interface;
an instruction acquisition module configured to acquire an operation instruction corresponding to the input content;
and the operation execution module is configured to execute the operation corresponding to the operation instruction through a second application.
Optionally, the transparency of the input keyboard is 0, the area of the input keyboard is smaller than the area of the screen, and the user interface and/or the input keyboard is draggable.
Optionally, the apparatus further comprises:
the first position adjusting module is configured to acquire a first dragging instruction corresponding to the user interface and adjust the display position of the content in the user interface in the screen according to the first dragging instruction;
and/or,
a second position adjusting module configured to acquire a second dragging indication corresponding to the input keyboard, and adjust a display position of the input keyboard in the screen according to the second dragging indication.
Optionally, the transparency of the input keyboard is greater than 0 and less than 1, and the area of the input keyboard is less than or equal to the area of the screen.
Optionally, the operation execution module is configured to dial, by the second application, a phone number corresponding to the input content when the operation instruction is a dialing instruction; or,
the operation execution module is configured to store the input content through the second application when the operation instruction is a storage instruction; or,
the operation execution module is configured to send the input content to a target user through the second application when the operation instruction is a sending instruction; or,
the operation execution module is configured to search for information corresponding to the input content through the second application when the operation instruction is a search instruction.
Optionally, the input keyboard includes at least one operation control, and the at least one operation control includes at least one of a dialing control, a storage control, a sending control, and a search control;
the indication acquisition module is configured to acquire an operation indication corresponding to a target operation control when a trigger signal corresponding to the target operation control is detected;
wherein the target operational control is one of the at least one operational control.
Optionally, the keyboard call-out indication is triggered by shaking the device; or,
the keyboard call-out indication is triggered by a touch operation; or,
the keyboard call-out indication is triggered by a voice signal; or,
the keyboard call-out indication is triggered by operating a physical key.
According to a third aspect of the embodiments of the present disclosure, there is provided an operation performing apparatus including:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to:
displaying a user interface of a first application;
when a keyboard call-out indication is detected, displaying an input keyboard overlaid on the user interface;
acquiring input content entered by the user through the input keyboard according to target content in the user interface;
acquiring an operation instruction corresponding to the input content;
and executing the operation corresponding to the operation instruction through the second application.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
the method comprises the steps that an input keyboard is additionally and additionally arranged on a user interface of a first application in a stacking mode, so that a user can input content by using the input keyboard according to target content in the user interface, and after an operation instruction corresponding to the input content is obtained, operation corresponding to the operation instruction is executed through a second application; the problems that in the prior art, when the operation requirement that the target content contained in the user interface of the first application is used in the second application is met, switching needs to be carried out between the first application and the second application, so that the operation is complicated and the efficiency is low are solved; because the user interface and the input keyboard of the first application are displayed on the screen together, a user can directly perform input operation on target content in the user interface without memorizing or recording through a paper pen, the requirements on the user are reduced, and the user experience is improved; in addition, after the user inputs the content in the input keyboard, the operation instruction corresponding to the input content can be directly triggered, and the operation corresponding to the operation instruction is executed through the second application.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow chart illustrating a method of operation execution in accordance with an exemplary embodiment;
FIGS. 2-6 are schematic diagrams of interfaces shown according to an exemplary embodiment;
FIG. 7 is a block diagram illustrating an operation performing apparatus in accordance with an exemplary embodiment;
fig. 8 is a block diagram illustrating an operation performing apparatus according to another exemplary embodiment;
FIG. 9 is a block diagram illustrating an apparatus in accordance with an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
FIG. 1 is a flow chart illustrating a method of operation execution according to an exemplary embodiment. The operation execution method can be applied to a terminal such as a mobile phone, a tablet computer, a multimedia player or an electronic book reader. The operation execution method can comprise the following steps:
in step 101, a user interface of a first application is displayed.
The first application may be any type of application, such as an instant messaging application, a social contact application, a web browsing application, a multimedia application, a web shopping application, a game application, a map navigation application, and the like, which is not limited in this embodiment.
The user interface of the first application may include text, characters, graphics, pictures, controls, and the like.
In step 102, upon detecting a keyboard call-out indication, an input keyboard is displayed overlaid on the user interface.
The input keyboard is a virtual keyboard displayed on the screen of the terminal, such as a dial pad or an input method keyboard.
The keyboard call-out indication is triggered by the user and instructs the terminal to display the input keyboard on the screen. It can be triggered in, but is not limited to, the following ways:
1. The keyboard call-out indication is triggered by shaking the device (a detection sketch in code follows this list). For example, the terminal may detect through a built-in sensor whether the user shakes the device, and when such shaking is detected, display the input keyboard overlaid on the currently displayed user interface.
2. The keyboard call-out indication is triggered by a touch operation. For example, the terminal may be an electronic device with a touch screen, and the user may trigger the keyboard call-out indication through a touch operation such as a tap or a swipe.
3. The keyboard call-out indication is triggered by a voice signal. For example, the terminal may collect a voice signal input by the user through a microphone, recognize the collected signal using speech recognition, and, when the recognition result matches a preset result, display the input keyboard overlaid on the currently displayed user interface.
4. The keyboard call-out indication is triggered by operating a physical key. For example, the user may trigger the keyboard call-out indication by pressing or toggling a physical key of the terminal.
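The following is a minimal sketch of trigger mode 1 for an Android-style terminal, using the standard accelerometer API; the class name, callback, and threshold value are illustrative assumptions rather than part of the patent:

    import android.content.Context
    import android.hardware.Sensor
    import android.hardware.SensorEvent
    import android.hardware.SensorEventListener
    import android.hardware.SensorManager
    import kotlin.math.sqrt

    // Hypothetical shake detector: invokes onKeyboardCallOut() when the measured
    // acceleration exceeds a threshold, i.e. the user shakes the device (trigger mode 1).
    class ShakeCallOutDetector(
        context: Context,
        private val onKeyboardCallOut: () -> Unit,
        private val thresholdG: Float = 2.5f            // illustrative sensitivity, in multiples of g
    ) : SensorEventListener {

        private val sensorManager =
            context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

        fun start() {
            sensorManager.registerListener(
                this,
                sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_UI
            )
        }

        fun stop() = sensorManager.unregisterListener(this)

        override fun onSensorChanged(event: SensorEvent) {
            val (x, y, z) = event.values
            val gForce = sqrt(x * x + y * y + z * z) / SensorManager.GRAVITY_EARTH
            if (gForce > thresholdG) onKeyboardCallOut()  // treat as a keyboard call-out indication
        }

        override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
    }

A similar listener could be registered for the other trigger modes, for example key events for mode 4 or a speech-recognition callback for mode 3.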
The shortcut operation for triggering the keyboard call-out indication may be a system default or defined by the user, which is not limited in this embodiment.
Optionally, the terminal may store in advance correspondences between different types of keyboard call-out indications and different types of input keyboards. When a keyboard call-out indication is detected, the input keyboard corresponding to the detected indication is determined from these correspondences and displayed overlaid on the currently displayed user interface.
For example, the user triggers a first keyboard call-out indication through a "shake" operation, and the first indication corresponds to a dial pad; when the terminal detects the first indication, it displays the dial pad overlaid on the currently displayed user interface. The user triggers a second keyboard call-out indication by simultaneously pressing two physical keys, the volume key and the power key; the second indication corresponds to an input method keyboard, and when the terminal detects the second indication, it displays the input method keyboard overlaid on the currently displayed user interface.
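One way to model the stored correspondence is a simple lookup table, sketched below with hypothetical enum names that mirror the example above (shake to dial pad, volume-plus-power to input method keyboard):

    // Hypothetical correspondence table between keyboard call-out indication types
    // and input keyboard types, mirroring the example above.
    enum class CallOutIndication { SHAKE, TOUCH_GESTURE, VOICE, VOLUME_PLUS_POWER }
    enum class KeyboardType { DIAL_PAD, INPUT_METHOD }

    val keyboardForIndication: Map<CallOutIndication, KeyboardType> = mapOf(
        CallOutIndication.SHAKE to KeyboardType.DIAL_PAD,                  // first call-out indication
        CallOutIndication.VOLUME_PLUS_POWER to KeyboardType.INPUT_METHOD   // second call-out indication
    )

    // Returns the keyboard to overlay, or null if no keyboard is registered for this indication.
    fun keyboardToShow(indication: CallOutIndication): KeyboardType? =
        keyboardForIndication[indication]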
In this embodiment, when the user needs to use target content from the user interface of the first application in the second application, there is no need to memorize the target content or note it down with pen and paper, nor to enter it only after switching to the second application. The user can quickly call out the input keyboard through a shortcut operation; the input keyboard is displayed overlaid on the user interface of the first application, and the user can perform the input operation directly against the target content in the user interface.
For example, referring to FIG. 2, the terminal displays the user interface 20 of the first application, and the user interface 20 contains Zhang's telephone number "130 XXXX 1111". Assuming the user needs to dial this number, there is no need to note it down from memory or with a pen and then switch to the telephone application for input and dialing. The user can quickly call out the dial pad 21 through a shortcut operation such as shaking the device. As shown in FIG. 2, the dial pad 21 is displayed overlaid on the user interface 20 of the first application. The telephone number "130 XXXX 1111" contained in the user interface 20 and the dial pad 21 are thus displayed on the screen together, and the user can perform the input operation with ease.
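How the overlay itself might be realised on an Android-style terminal is sketched below; the windowing API shown (a system overlay window) is only one possible implementation and is not prescribed by the patent, and the layout resource is hypothetical:

    import android.content.Context
    import android.graphics.PixelFormat
    import android.view.LayoutInflater
    import android.view.View
    import android.view.WindowManager

    // Hypothetical helper that overlays a keyboard view (e.g. a dial pad) on top of whatever
    // user interface is currently shown. On modern Android this window type requires the
    // SYSTEM_ALERT_WINDOW ("display over other apps") permission.
    class KeyboardOverlay(private val context: Context) {

        private val windowManager =
            context.getSystemService(Context.WINDOW_SERVICE) as WindowManager
        private var keyboardView: View? = null

        fun show(layoutRes: Int) {                       // e.g. R.layout.dial_pad (hypothetical resource)
            if (keyboardView != null) return             // already shown
            val view = LayoutInflater.from(context).inflate(layoutRes, null)
            val params = WindowManager.LayoutParams(
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
                WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL,  // touches outside the keyboard reach the UI below
                PixelFormat.TRANSLUCENT
            )
            windowManager.addView(view, params)
            keyboardView = view
        }

        fun hide() {
            keyboardView?.let { windowManager.removeView(it) }
            keyboardView = null
        }
    }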
In step 103, the input content entered by the user through the input keyboard according to the target content in the user interface is acquired.
After calling out the input keyboard through the shortcut operation, the user can enter, through the input keyboard, the target content that needs to be used in the second application. Correspondingly, the terminal acquires the input content entered by the user through the input keyboard according to the target content in the user interface. The input content refers to what the user actually enters through the input keyboard based on the target content in the user interface of the first application; it may or may not be identical to the target content.
For example, with combined reference to FIG. 2, the user enters the telephone number "130 XXXX 1111" to be dialed via the dial pad 21. The terminal displays the telephone number entered by the user in the input field 22.
In step 104, an operation instruction corresponding to the input content is acquired.
After finishing the input, the user triggers an operation instruction corresponding to the input content. Accordingly, the terminal acquires the operation instruction corresponding to the input content. The operation instruction includes, but is not limited to, any one of a dialing instruction, a storage instruction, a sending instruction, and a search instruction. The user can trigger different operation instructions according to actual needs.
For example, with combined reference to FIG. 2, clicking on the dial control 23 by the user triggers a dial indication corresponding to the telephone number "130 XXXX 1111" entered as described above.
In step 105, an operation corresponding to the operation instruction is executed by the second application.
After acquiring the operation instruction corresponding to the input content, the terminal executes the operation corresponding to the operation instruction through the second application. Wherein the second application is another application different from the first application.
Optionally, this step includes the following possible implementations (a dispatch sketch in code follows the list):
1. and when the operation instruction is a dialing instruction, dialing the telephone number corresponding to the input content through the second application. For example, when the operation instruction is a dialing instruction, the terminal dials a telephone number corresponding to the input content through the telephone application. Referring in conjunction to fig. 2, the terminal dials the telephone number "130 XXXX 1111" via the telephony application upon detecting a dial indication triggered by the user by clicking on the dial control 23.
2. And when the operation instruction is a storage instruction, storing the input content through the second application. For example, when the operation instruction is a storage instruction, the terminal stores the input phone number into an existing contact entry or a new contact entry through a phone application. For another example, when the operation instruction is a storage instruction, the terminal stores the input content through the notepad application.
3. And when the operation instruction is a sending instruction, sending the input content to the target user through the second application. For example, when the operation instruction is a transmission instruction, the terminal transmits the input content to the target user selected by the user through the instant messaging application.
4. When the operation instruction is a search instruction, information corresponding to the input content is searched for by the second application. For example, when the operation instruction is a search instruction, the terminal searches by a web page search engine using the input content as a keyword, and searches for a web page corresponding to the input content. For another example, when the operation instruction is a search instruction, the terminal searches for a multimedia resource corresponding to the input content through the multimedia application.
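The four branches can be dispatched through standard intents on an Android-style terminal, as in the sketch below; the enum and function names are illustrative, and the patent does not require that the second application be invoked via intents:

    import android.app.SearchManager
    import android.content.Context
    import android.content.Intent
    import android.net.Uri
    import android.provider.ContactsContract

    // Hypothetical operation instruction types matching the four branches above.
    enum class OperationInstruction { DIAL, STORE, SEND, SEARCH }

    // Dispatches the acquired operation instruction to a second application via standard intents.
    fun execute(context: Context, instruction: OperationInstruction, inputContent: String) {
        val intent = when (instruction) {
            // Dialing instruction: hand the number to the telephone application.
            OperationInstruction.DIAL ->
                Intent(Intent.ACTION_DIAL, Uri.parse("tel:$inputContent"))
            // Storage instruction: insert the input content as a contact phone number.
            OperationInstruction.STORE ->
                Intent(ContactsContract.Intents.Insert.ACTION).apply {
                    type = ContactsContract.RawContacts.CONTENT_TYPE
                    putExtra(ContactsContract.Intents.Insert.PHONE, inputContent)
                }
            // Sending instruction: let a messaging application send the input content.
            OperationInstruction.SEND ->
                Intent(Intent.ACTION_SEND).apply {
                    type = "text/plain"
                    putExtra(Intent.EXTRA_TEXT, inputContent)
                }
            // Search instruction: search the input content with a web search engine.
            OperationInstruction.SEARCH ->
                Intent(Intent.ACTION_WEB_SEARCH).putExtra(SearchManager.QUERY, inputContent)
        }
        context.startActivity(intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK))
    }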
The input keyboard may include at least one operation control, and different operation controls trigger different operation instructions. For example, the input keyboard includes at least one of a dialing control, a storage control, a sending control, and a search control: the dialing control triggers a dialing instruction, the storage control triggers a storage instruction, the sending control triggers a sending instruction, and the search control triggers a search instruction. When the terminal detects a trigger signal corresponding to a target operation control, it acquires the operation instruction corresponding to that control, the target operation control being one of the at least one operation control. For example, referring to FIG. 2, when the terminal detects the trigger signal corresponding to the dialing control 23, it acquires a dialing instruction and dials the phone number entered by the user through the telephone application; when it detects the trigger signal corresponding to the storage control 24, it acquires a storage instruction and stores the phone number entered by the user through the telephone application.
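Wiring the operation controls to those instructions could then look like the sketch below, which reuses the OperationInstruction enum from the previous sketch; the control parameters are hypothetical views inside the input keyboard:

    import android.view.View

    // Hypothetical wiring of the operation controls inside the input keyboard: each control,
    // when triggered, yields the operation instruction that is then dispatched as shown above.
    fun bindOperationControls(
        dialControl: View,
        storeControl: View,
        sendControl: View,
        searchControl: View,
        onInstruction: (OperationInstruction) -> Unit
    ) {
        dialControl.setOnClickListener { onInstruction(OperationInstruction.DIAL) }
        storeControl.setOnClickListener { onInstruction(OperationInstruction.STORE) }
        sendControl.setOnClickListener { onInstruction(OperationInstruction.SEND) }
        searchControl.setOnClickListener { onInstruction(OperationInstruction.SEARCH) }
    }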
In addition, the correspondence between each operation control and the second application that it triggers and invokes may be preset by the system or the user, or selected by the user in real time.
When preset by the system or the user, each operation control corresponds to one second application. When the terminal detects a trigger signal corresponding to the target operation control, it directly invokes the second application corresponding to that control to respond to the operation instruction. Different operation controls may correspond to the same second application; for example, referring to FIG. 2, the dialing control 23 and the storage control 24 both correspond to the telephone application. Alternatively, different operation controls may correspond to different second applications. For example, referring to FIG. 3, commodity information is displayed in the commodity display interface 30 of a web shopping application. When the user needs to send the commodity information to a friend in an instant messaging application, the user can quickly call out the input method keyboard 31 through a shortcut operation; after finishing entering the commodity information shown in the commodity display interface 30, the user clicks the sending control 32 to invoke the instant messaging application and send the entered information to the friend. Alternatively, when the user needs to save the entered commodity information into a notepad, clicking the storage control 33 invokes the notepad application to store it.
When selected by the user in real time, each operation control may correspond to several second applications. When the terminal detects a trigger signal corresponding to the target operation control, it displays identification information of each second application corresponding to that control, and the user selects the second application to be invoked according to the identification information. The terminal then invokes the second application selected by the user and executes the operation corresponding to the operation instruction. For example, referring to FIG. 4, commodity information is displayed in the commodity display interface 40 of a web shopping application. The user calls out the input method keyboard 41 through a shortcut operation and, after finishing entering the commodity information shown in the commodity display interface 40, clicks the sending control 42. When the terminal detects the trigger signal corresponding to the sending control 42, it displays an application selection box 43 containing icons of several second applications, such as a short message application, an instant messaging application, a mail application, and a microblog application. The user may select the second application to be invoked from the application selection box 43; for example, after the user clicks the icon of the microblog application, the terminal invokes the microblog application to post the entered commodity information to the microblog platform.
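The real-time selection described above behaves much like a system application chooser on an Android-style terminal; the sketch below uses that chooser as a stand-in for the application selection box 43, which is an assumption rather than the patent's required implementation:

    import android.content.Context
    import android.content.Intent

    // Hypothetical real-time selection: the sending control corresponds to several second
    // applications (short message, instant messaging, mail, microblog, ...), and the system
    // chooser plays the role of the application selection box 43 in FIG. 4.
    fun sendViaUserChosenApp(context: Context, inputContent: String) {
        val send = Intent(Intent.ACTION_SEND).apply {
            type = "text/plain"
            putExtra(Intent.EXTRA_TEXT, inputContent)
        }
        context.startActivity(
            Intent.createChooser(send, "Send via").addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
        )
    }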
In summary, in the operation execution method provided by this embodiment, an input keyboard is displayed overlaid on the user interface of the first application, so that the user can enter content with the input keyboard according to the target content in the user interface, and after the operation instruction corresponding to the input content is acquired, the operation corresponding to that instruction is executed through the second application. This solves the problem in the related art that, when target content contained in the user interface of the first application needs to be used in the second application, the user has to switch between the first application and the second application, making the operation cumbersome and inefficient. Because the user interface of the first application and the input keyboard are displayed on the screen together, the user can perform the input operation directly against the target content in the user interface without memorizing it or noting it down with pen and paper, which lowers the burden on the user and improves the user experience. In addition, after entering content on the input keyboard, the user can directly trigger the operation instruction corresponding to the input content, and the operation corresponding to that instruction is executed through the second application.
It should be added that, in order for the user interface of the first application and the input keyboard to be displayed on the screen together, that is, for both to remain visible so that the user can perform the input operation directly against the target content in the user interface, the input keyboard overlaid on the user interface of the first application may take either of the following two display forms.
In a first possible display form, the transparency of the input keyboard is greater than 0 and less than 1, and the area of the input keyboard is less than or equal to the area of the screen.
The input keyboard is overlaid on the user interface of the first application in a semi-transparent form, so that the content in the user interface of the first application is not blocked by the input keyboard and the user can conveniently view it while entering input.
In addition, when the input keyboard is displayed semi-transparently, its area may be smaller than or equal to the area of the screen, since even a full-screen input keyboard does not obscure the user interface of the underlying first application.
For example, referring to FIG. 5, the dial pad 50 is displayed semi-transparently over the full screen. The user can clearly see the content in the user interface of the underlying first application through the dial pad 50, and when entering the target content to be used there is no need to memorize it or note it down with pen and paper, which makes for a better user experience.
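On an Android-style terminal, the first display form amounts to drawing the keyboard view with partial alpha, as in the sketch below; note that Android's alpha convention is the inverse of the patent's transparency value, and the 0.4f figure is illustrative:

    import android.view.View

    // First display form: the patent's transparency lies strictly between 0 and 1, so the
    // user interface underneath stays visible even if the keyboard fills the screen.
    // Android's View.alpha uses the opposite convention (1f = fully opaque), so a patent
    // transparency of 0.6 corresponds to alpha = 0.4f.
    fun applySemiTransparentForm(keyboardView: View) {
        keyboardView.alpha = 0.4f
    }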
In a second possible display form, the input keyboard has a transparency of 0, the area of the input keyboard is smaller than the area of the screen, and the user interface and/or the input keyboard is draggable.
When the input keyboard is overlaid on the user interface of the first application in an opaque form, the area of the input keyboard needs to be smaller than the area of the screen to ensure that the content in the user interface of the first application is not blocked. In addition, the user interface and/or the input keyboard of the first application is set to be draggable, so that the user can move the target content out from under the input keyboard, making it convenient to view and enter.
Correspondingly, if the user interface is draggable, the terminal acquires a first dragging instruction corresponding to the user interface, and the display position of the content in the user interface in the screen is adjusted according to the first dragging instruction. For example, referring to the left side illustration of FIG. 6 in combination, the user may slide the user interface 60 of the first application up and down. If the input keyboard is draggable, the terminal acquires a second dragging instruction corresponding to the input keyboard, and the display position of the input keyboard in the screen is adjusted according to the second dragging instruction. For example, referring to the right-hand illustration of FIG. 6 in conjunction, the user may drag the input keyboard 61 up and down.
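A sketch of handling the second dragging indication follows, assuming the keyboard is an overlay window as in the earlier sketch; the variable names are illustrative, and the offsets are interpreted relative to the window's gravity:

    import android.view.MotionEvent
    import android.view.View
    import android.view.WindowManager

    // Second display form: an opaque keyboard smaller than the screen, repositioned by dragging.
    // The touch listener turns finger movement into updated window coordinates, which is one way
    // to adjust the display position of the input keyboard according to the second dragging indication.
    fun makeKeyboardDraggable(
        keyboardView: View,
        windowManager: WindowManager,
        params: WindowManager.LayoutParams   // the layout params the keyboard was added with
    ) {
        var lastX = 0f
        var lastY = 0f
        keyboardView.setOnTouchListener { view, event ->
            when (event.action) {
                MotionEvent.ACTION_DOWN -> {
                    lastX = event.rawX
                    lastY = event.rawY
                    true
                }
                MotionEvent.ACTION_MOVE -> {
                    params.x += (event.rawX - lastX).toInt()  // offsets relative to params.gravity
                    params.y += (event.rawY - lastY).toInt()
                    lastX = event.rawX
                    lastY = event.rawY
                    windowManager.updateViewLayout(view, params)  // move the keyboard on screen
                    true
                }
                else -> false
            }
        }
    }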
In summary, both display forms ensure that the content in the user interface of the first application is not blocked by the input keyboard, so that the user can view and enter the target content directly, without memorizing it or noting it down with pen and paper; the operation is simple, the burden on the user is reduced, and the user experience is improved.
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods. For details not disclosed in the embodiments of the apparatus of the present disclosure, refer to the embodiments of the method of the present disclosure.
Fig. 7 is a block diagram illustrating an operation performing apparatus according to an exemplary embodiment. The operation execution device can be applied to a terminal such as a mobile phone, a tablet computer, a multimedia player or an electronic book reader. The operation performing means may include: an interface display module 710, a keyboard display module 720, a content acquisition module 730, an indication acquisition module 740, and an operation execution module 750.
An interface display module 710 configured to display a user interface of a first application.
A keyboard display module 720 configured to, upon detecting a keyboard call-out indication, display an input keyboard overlaid on the user interface displayed by the interface display module 710.
A content obtaining module 730 configured to obtain the target content in the user interface displayed by the interface display module 710 and the input content input through the input keyboard displayed by the keyboard display module 720.
An instruction acquisition module 740 configured to acquire an operation instruction corresponding to the input content acquired by the content acquisition module 730.
An operation execution module 750 configured to execute, by the second application, an operation corresponding to the operation instruction acquired by the instruction acquisition module 740.
In summary, in the operation execution device provided by this embodiment, an input keyboard is displayed overlaid on the user interface of the first application, so that the user can enter content with the input keyboard according to the target content in the user interface, and after the operation instruction corresponding to the input content is acquired, the operation corresponding to that instruction is executed through the second application. This solves the problem in the related art that, when target content contained in the user interface of the first application needs to be used in the second application, the user has to switch between the first application and the second application, making the operation cumbersome and inefficient. Because the user interface of the first application and the input keyboard are displayed on the screen together, the user can perform the input operation directly against the target content in the user interface without memorizing it or noting it down with pen and paper, which lowers the burden on the user and improves the user experience. In addition, after entering content on the input keyboard, the user can directly trigger the operation instruction corresponding to the input content, and the operation corresponding to that instruction is executed through the second application.
Fig. 8 is a block diagram illustrating an operation performing apparatus according to another exemplary embodiment. The operation execution device can be applied to a terminal such as a mobile phone, a tablet computer, a multimedia player or an electronic book reader. The operation performing means may include: an interface display module 710, a keyboard display module 720, a content acquisition module 730, an indication acquisition module 740, and an operation execution module 750.
An interface display module 710 configured to display a user interface of a first application.
A keyboard display module 720 configured to, upon detecting a keyboard call-out indication, display an input keyboard overlaid on the user interface displayed by the interface display module 710.
A content obtaining module 730 configured to obtain the target content in the user interface displayed by the interface display module 710 and the input content input through the input keyboard displayed by the keyboard display module 720.
An instruction acquisition module 740 configured to acquire an operation instruction corresponding to the input content acquired by the content acquisition module 730.
An operation execution module 750 configured to execute, by the second application, an operation corresponding to the operation instruction acquired by the instruction acquisition module 740.
Optionally, the input keyboard has a transparency of 0, the area of the input keyboard is smaller than the area of the screen, and the user interface and/or the input keyboard is draggable.
Optionally, the operation execution device further includes: the first position adjustment module 722 and/or the second position adjustment module 724.
The first position adjustment module 722 is configured to obtain a first dragging indication corresponding to the user interface displayed by the interface display module 710, and adjust a display position of content in the user interface in the screen according to the first dragging indication.
The second position adjustment module 724 is configured to acquire a second dragging indication corresponding to the input keyboard displayed by the keyboard display module 720, and adjust the display position of the input keyboard in the screen according to the second dragging indication.
Optionally, the transparency of the input keyboard is greater than 0 and less than 1, and the area of the input keyboard is less than or equal to the area of the screen.
Optionally, the operation executing module 750 is configured to dial, by the second application, the phone number corresponding to the input content acquired by the content acquiring module 730 when the operation instruction acquired by the instruction acquiring module 740 is a dialing instruction; or,
an operation execution module 750 configured to store the input content acquired by the content acquisition module 730 through the second application when the operation instruction acquired by the instruction acquisition module 740 is a storage instruction; or,
an operation execution module 750 configured to transmit the input content acquired by the content acquisition module 730 to the target user through the second application when the operation instruction acquired by the instruction acquisition module 740 is a transmission instruction; or,
an operation execution module 750 configured to search for information corresponding to the input content acquired by the content acquisition module 730 through the second application when the operation instruction acquired by the instruction acquisition module 740 is a search instruction.
Optionally, the input keyboard includes at least one operation control, and the at least one operation control includes at least one of a dialing control, a storage control, a sending control, and a search control;
the indication acquiring module 740 is configured to, when a trigger signal corresponding to the target operation control is detected, acquire an operation indication corresponding to the target operation control;
and the target operation control is one of the at least one operation control.
Optionally, the keyboard call-out indication is triggered by shaking the device; or,
the keyboard call-out indication is triggered by a touch operation; or,
the keyboard call-out indication is triggered by a voice signal; or,
the keyboard call-out indication is triggered by operating a physical key.
In summary, in the operation execution device provided by this embodiment, an input keyboard is displayed overlaid on the user interface of the first application, so that the user can enter content with the input keyboard according to the target content in the user interface, and after the operation instruction corresponding to the input content is acquired, the operation corresponding to that instruction is executed through the second application. This solves the problem in the related art that, when target content contained in the user interface of the first application needs to be used in the second application, the user has to switch between the first application and the second application, making the operation cumbersome and inefficient. Because the user interface of the first application and the input keyboard are displayed on the screen together, the user can perform the input operation directly against the target content in the user interface without memorizing it or noting it down with pen and paper, which lowers the burden on the user and improves the user experience. In addition, after entering content on the input keyboard, the user can directly trigger the operation instruction corresponding to the input content, and the operation corresponding to that instruction is executed through the second application.
In addition, the operation execution device provided by this embodiment offers two different display forms for the input keyboard, both of which ensure that the content in the user interface of the first application is not blocked by the input keyboard. When entering the target content to be used, the user can view it and enter it at the same time, without memorizing it or noting it down with pen and paper; the operation is simple, the burden on the user is reduced, and the user experience is improved.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
An exemplary embodiment of the present disclosure also provides an operation execution apparatus, which can implement the operation execution method provided by the present disclosure. The operation execution device includes: a processor, and a memory for storing executable instructions for the processor. Wherein the processor is configured to:
displaying a user interface of a first application;
when a keyboard call-out indication is detected, displaying an input keyboard overlaid on the user interface;
acquiring input content entered by the user through the input keyboard according to target content in the user interface;
acquiring an operation instruction corresponding to the input content;
and executing the operation corresponding to the operation instruction through the second application.
Optionally, the transparency of the input keyboard is 0, the area of the input keyboard is smaller than the area of the screen, and the user interface and/or the input keyboard is draggable.
Optionally, the processor is further configured to:
acquiring a first dragging instruction corresponding to the user interface, and adjusting the display position of the content in the user interface in the screen according to the first dragging instruction; and/or,
and acquiring a second dragging indication corresponding to the input keyboard, and adjusting the display position of the input keyboard in the screen according to the second dragging indication.
Optionally, the transparency of the input keyboard is greater than 0 and less than 1, and the area of the input keyboard is less than or equal to the area of the screen.
Optionally, a processor configured to:
when the operation instruction is a dialing instruction, dialing a telephone number corresponding to the input content through the second application; or,
when the operation instruction is a storage instruction, storing the input content through the second application; or,
when the operation instruction is a sending instruction, sending the input content to a target user through the second application; or,
and when the operation instruction is a search instruction, searching information corresponding to the input content through the second application.
Optionally, the input keyboard includes at least one operation control, and the at least one operation control includes at least one of a dialing control, a storage control, a sending control, and a search control;
a processor configured to: when a trigger signal corresponding to a target operation control is detected, acquiring an operation instruction corresponding to the target operation control;
wherein the target operational control is one of the at least one operational control.
Optionally, the keyboard call-out indication is triggered by shaking the device; or,
the keyboard call-out indication is triggered by a touch operation; or,
the keyboard call-out indication is triggered by a voice signal; or,
the keyboard call-out indication is triggered by operating a physical key.
Fig. 9 is a block diagram illustrating an apparatus 900 according to an example embodiment. For example, the apparatus 900 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 9, apparatus 900 may include one or more of the following components: processing component 902, memory 904, power component 906, multimedia component 908, audio component 910, input/output (I/O) interface 912, sensor component 914, and communication component 916.
The processing component 902 generally controls overall operation of the device 900, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. Processing component 902 may include one or more processors 920 to execute instructions to perform all or a portion of the steps of the methods described above. Further, processing component 902 can include one or more modules that facilitate interaction between processing component 902 and other components. For example, the processing component 902 can include a multimedia module to facilitate interaction between the multimedia component 908 and the processing component 902.
The memory 904 is configured to store various types of data to support operation at the apparatus 900. Examples of such data include instructions for any application or method operating on device 900, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 904 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 906 provides power to the various components of the device 900. The power components 906 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 900.
The multimedia component 908 comprises a screen providing an output interface between the device 900 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 908 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 900 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 910 is configured to output and/or input audio signals. For example, audio component 910 includes a Microphone (MIC) configured to receive external audio signals when apparatus 900 is in an operating mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 904 or transmitted via the communication component 916. In some embodiments, audio component 910 also includes a speaker for outputting audio signals.
I/O interface 912 provides an interface between processing component 902 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 914 includes one or more sensors for providing status assessment of various aspects of the apparatus 900. For example, sensor assembly 914 may detect an open/closed state of device 900, the relative positioning of components, such as a display and keypad of device 900, the change in position of device 900 or a component of device 900, the presence or absence of user contact with device 900, the orientation or acceleration/deceleration of device 900, and the change in temperature of device 900. The sensor assembly 914 may include a proximity sensor configured to detect the presence of a nearby object in the absence of any physical contact. The sensor assembly 914 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 914 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 916 is configured to facilitate communications between the apparatus 900 and other devices in a wired or wireless manner. The apparatus 900 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 916 receives a broadcast signal or broadcast associated information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 916 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 900 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 904 comprising instructions, executable by the processor 920 of the apparatus 900 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer readable storage medium having instructions therein that, when executed by a processor of the apparatus 900, enable the apparatus 900 to perform the operation execution method described above.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
Claims (15)
1. An operation execution method, characterized in that the method comprises:
displaying a user interface of a first application;
when a keyboard call-out indication is detected, displaying an input keyboard overlaid on the user interface, wherein different types of keyboard call-out indications correspond to different types of input keyboards, the input keyboard comprises at least one operation control, and each operation control corresponds to at least one second application;
acquiring input content entered by the user through the input keyboard according to target content in the user interface;
acquiring an operation instruction corresponding to the input content;
and executing the operation corresponding to the operation instruction through the second application.
2. The method of claim 1, wherein the input keyboard has a transparency of 0, wherein the input keyboard has an area smaller than an area of a screen, and wherein the user interface and/or the input keyboard is draggable.
3. The method of claim 2, further comprising:
acquiring a first dragging instruction corresponding to the user interface, and adjusting the display position of the content in the user interface in the screen according to the first dragging instruction;
and/or,
and acquiring a second dragging indication corresponding to the input keyboard, and adjusting the display position of the input keyboard in the screen according to the second dragging indication.
4. The method of claim 1, wherein the input keyboard has a transparency greater than 0 and less than 1, and the input keyboard has an area less than or equal to an area of the screen.
5. The method according to any one of claims 1 to 4, wherein the performing, by the second application, the operation corresponding to the operation instruction comprises:
when the operation instruction is a dialing instruction, dialing a telephone number corresponding to the input content through the second application; or,
when the operation instruction is a storage instruction, storing the input content through the second application; or,
when the operation instruction is a sending instruction, sending the input content to a target user through the second application; or,
and when the operation instruction is a search instruction, searching information corresponding to the input content through the second application.
6. The method of claim 5, wherein the at least one operation control comprises at least one of a dial control, a store control, a send control, and a search control;
the acquiring of the operation instruction corresponding to the input content includes:
when a trigger signal corresponding to a target operation control is detected, acquiring an operation instruction corresponding to the target operation control;
wherein the target operation control is one of the at least one operation control.
7. The method according to any one of claims 1 to 4,
the keyboard call-out instruction is triggered by shaking the device; or,
the keyboard call-out instruction is triggered by a touch operation; or,
the keyboard call-out instruction is triggered by a voice signal; or,
the keyboard call-out instruction is triggered by operating a physical key.
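To make the shake-triggered call-out in claim 7 concrete, here is a small accelerometer-based sketch using the standard Android sensor API. The 2.5 g threshold and the onKeyboardCallOut callback are illustrative assumptions only; touch, voice, and physical-key triggers would each use their own detection path.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.sqrt

// Listens to the accelerometer and treats a strong shake as a keyboard
// call-out instruction.
class ShakeCallOutDetector(
    context: Context,
    private val onKeyboardCallOut: () -> Unit
) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    fun start() {
        val accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)
        sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_UI)
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        val (x, y, z) = event.values
        val gForce = sqrt(x * x + y * y + z * z) / SensorManager.GRAVITY_EARTH
        if (gForce > 2.5f) onKeyboardCallOut()   // shake detected: call out the keyboard
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
}
```

A lower threshold makes the keyboard easier to call out but also easier to trigger accidentally while walking.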
8. An operation execution apparatus, characterized in that the apparatus comprises:
an interface display module configured to display a user interface of a first application;
a keyboard display module configured to overlay and display an input keyboard on the user interface when a keyboard call-out instruction is detected; wherein different types of keyboard call-out instructions correspond to different types of input keyboards; the input keyboard comprises at least one operation control; and each operation control corresponds to at least one second application;
a content acquisition module configured to acquire input content entered by a user through the input keyboard according to target content in the user interface;
an instruction acquisition module configured to acquire an operation instruction corresponding to the input content;
and an operation execution module configured to execute, through the second application, an operation corresponding to the operation instruction.
9. The apparatus of claim 8, wherein the input keyboard has a transparency of 0, the input keyboard has an area smaller than the area of the screen, and the user interface and/or the input keyboard is draggable.
10. The apparatus of claim 9, further comprising:
a first position adjustment module configured to acquire a first dragging instruction corresponding to the user interface and adjust a display position of the content in the user interface on the screen according to the first dragging instruction;
and/or,
a second position adjustment module configured to acquire a second dragging instruction corresponding to the input keyboard and adjust a display position of the input keyboard on the screen according to the second dragging instruction.
11. The apparatus of claim 8, wherein the input keyboard has a transparency greater than 0 and less than 1, and the input keyboard has an area less than or equal to the area of the screen.
12. The apparatus according to any one of claims 8 to 11,
the operation execution module is configured to dial a telephone number corresponding to the input content through the second application when the operation instruction is a dialing instruction; or,
the operation execution module is configured to store the input content through the second application when the operation instruction is a storage instruction; or,
the operation execution module is configured to send the input content to a target user through the second application when the operation instruction is a sending instruction; or,
the operation execution module is configured to search for information corresponding to the input content through the second application when the operation instruction is a search instruction.
13. The apparatus of claim 12, wherein the at least one operation control comprises at least one of a dial control, a store control, a send control, and a search control;
the instruction acquisition module is configured to acquire an operation instruction corresponding to a target operation control when a trigger signal corresponding to the target operation control is detected;
wherein the target operation control is one of the at least one operation control.
14. The apparatus according to any one of claims 8 to 11,
the keyboard call-out instruction is triggered by shaking the device; or,
the keyboard call-out instruction is triggered by a touch operation; or,
the keyboard call-out instruction is triggered by a voice signal; or,
the keyboard call-out instruction is triggered by operating a physical key.
15. An operation execution apparatus, characterized in that the apparatus comprises:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to:
displaying a user interface of a first application;
when a keyboard call-out instruction is detected, displaying an input keyboard overlaid on the user interface; wherein different types of keyboard call-out instructions correspond to different types of input keyboards; the input keyboard comprises at least one operation control; and each operation control corresponds to at least one second application;
acquiring input content entered by the user through the input keyboard according to target content in the user interface;
acquiring an operation instruction corresponding to the input content;
and executing the operation corresponding to the operation instruction through the second application.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510440950.0A CN105138259B (en) | 2015-07-24 | 2015-07-24 | Operation executes method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105138259A CN105138259A (en) | 2015-12-09 |
CN105138259B true CN105138259B (en) | 2018-07-27 |
Family
ID=54723621
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510440950.0A Active CN105138259B (en) | 2015-07-24 | 2015-07-24 | Operation executes method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105138259B (en) |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3958557B1 (en) | 2015-04-23 | 2024-09-25 | Apple Inc. | Digital viewfinder user interface for multiple cameras |
JP6500830B2 (en) * | 2016-04-27 | 2019-04-17 | 京セラドキュメントソリューションズ株式会社 | Handwritten character input device, image forming device, and handwritten character input method |
US10009536B2 (en) | 2016-06-12 | 2018-06-26 | Apple Inc. | Applying a simulated optical effect based on data received from multiple camera sensors |
CN106569796A (en) * | 2016-09-30 | 2017-04-19 | 努比亚技术有限公司 | Display method and terminal |
CN106557231A (en) * | 2016-11-14 | 2017-04-05 | 北京小米移动软件有限公司 | Page display method and device |
DK180859B1 (en) | 2017-06-04 | 2022-05-23 | Apple Inc | USER INTERFACE CAMERA EFFECTS |
US11112964B2 (en) | 2018-02-09 | 2021-09-07 | Apple Inc. | Media capture lock affordance for graphical user interface |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US10375313B1 (en) | 2018-05-07 | 2019-08-06 | Apple Inc. | Creative camera |
DK201870623A1 (en) | 2018-09-11 | 2020-04-15 | Apple Inc. | User interfaces for simulated depth effects |
US11128792B2 (en) | 2018-09-28 | 2021-09-21 | Apple Inc. | Capturing and displaying images with multiple focal planes |
US11321857B2 (en) | 2018-09-28 | 2022-05-03 | Apple Inc. | Displaying and editing images with depth information |
US11706521B2 (en) | 2019-05-06 | 2023-07-18 | Apple Inc. | User interfaces for capturing and managing visual media |
US10645294B1 (en) | 2019-05-06 | 2020-05-05 | Apple Inc. | User interfaces for capturing and managing visual media |
US11770601B2 (en) | 2019-05-06 | 2023-09-26 | Apple Inc. | User interfaces for capturing and managing visual media |
CN110225187B (en) * | 2019-05-16 | 2020-11-03 | 珠海格力电器股份有限公司 | Method and equipment for making call |
US10860178B1 (en) * | 2019-09-05 | 2020-12-08 | Shabu Ans Kandamkulathy | Task management through soft keyboard applications |
US11054973B1 (en) | 2020-06-01 | 2021-07-06 | Apple Inc. | User interfaces for managing media |
US11212449B1 (en) | 2020-09-25 | 2021-12-28 | Apple Inc. | User interfaces for media capture and management |
US11778339B2 (en) | 2021-04-30 | 2023-10-03 | Apple Inc. | User interfaces for altering visual media |
US11539876B2 (en) | 2021-04-30 | 2022-12-27 | Apple Inc. | User interfaces for altering visual media |
US12112024B2 (en) | 2021-06-01 | 2024-10-08 | Apple Inc. | User interfaces for managing media styles |
CN115586861A (en) * | 2022-09-26 | 2023-01-10 | 维沃移动通信有限公司 | Input control method, device, electronic device and storage medium |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100083036A (en) * | 2009-01-12 | 2010-07-21 | 삼성전자주식회사 | Message service support method and portable device using the same |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103197880A (en) * | 2012-01-05 | 2013-07-10 | 三星电子株式会社 | Method and apparatus for displaying keypad in terminal having touch screen |
CN103309616A (en) * | 2013-06-26 | 2013-09-18 | 华为终端有限公司 | Soft keyboard display method and terminal |
CN104216973A (en) * | 2014-08-27 | 2014-12-17 | 小米科技有限责任公司 | Data search method and data search device |
Similar Documents
Publication | Title |
---|---|
CN105138259B (en) | Operation executes method and device | |
CN105955607B (en) | Content sharing method and device | |
CN107908351B (en) | Application interface display method and device and storage medium | |
US11086482B2 (en) | Method and device for displaying history pages in application program and computer-readable medium | |
EP3098701B1 (en) | Method and apparatus for managing terminal application | |
EP3316105A1 (en) | Instant message processing method and device | |
CN105487805B (en) | Object operation method and device | |
CN109600303B (en) | Content sharing method and device and storage medium | |
US20170085697A1 (en) | Method and device for extending call function | |
CN105956486B (en) | Long-range control method and device | |
CN109918001B (en) | Interface display method, device and storage medium | |
US10078422B2 (en) | Method and device for updating a list | |
CN104679599A (en) | Application program duplicating method and device | |
CN106775202B (en) | Information transmission method and device | |
CN106095236A (en) | The method and device of arranging desktop icons | |
CN104461236A (en) | Method and device for displaying application icons | |
CN111381737B (en) | Dock display method and device and storage medium | |
CN109358788B (en) | Interface display method and device and terminal | |
CN104049864A (en) | Object control method and device | |
CN108011990B (en) | Contact management method and device | |
CN108881634A (en) | Terminal control method, device and computer readable storage medium | |
CN115935099A (en) | Information display method and device, electronic equipment and storage medium | |
CN112115947A (en) | Text processing method and device, electronic equipment and storage medium | |
CN111427449A (en) | Interface display method, device and storage medium | |
CN106325712B (en) | Terminal display control method and device and terminal |
Legal Events
Code | Title |
---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |