
CN113821129B - Display window control method and electronic equipment - Google Patents


Info

Publication number
CN113821129B
Authority
CN
China
Prior art keywords
preset
user
mode
electronic device
display window
Prior art date
Legal status
Active
Application number
CN202010570732.XA
Other languages
Chinese (zh)
Other versions
CN113821129A (en)
Inventor
吴思举
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202010570732.XA
Publication of CN113821129A
Application granted
Publication of CN113821129B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0486: Drag-and-drop
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Inputting data by handwriting, e.g. gesture or text
    • G06F 3/04886: Partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/048: Indexing scheme relating to G06F 3/048
    • G06F 2203/04803: Split screen, i.e. subdividing the display area or the window area into separate subareas

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the application provides a display window control method and an electronic device, relating to the field of terminal technology. When the electronic device is detected to enter a first mode, the action corresponding to each preset operation in the first mode is reloaded; when an operation input by the user through the touch screen is detected, the acquired operation is matched against the preset operations, and the action corresponding to the matched preset operation is executed. The size, position, and layout of each display window in the first mode can thus be controlled without selecting a drag component, which reduces operation difficulty and makes window manipulation in a multi-window office scenario simpler.

Description

Display window control method and electronic equipment
Technical Field
The present application relates to the field of terminals, and in particular, to a display window control method and an electronic device.
Background
Mobile terminal devices such as smartphones and tablet computers have become widespread, and users operate them through touch screens. With the development of large-screen touch terminal devices that support numerous applications, users increasingly need to handle multiple tasks cooperatively on one terminal device. Besides support for such multitasking, users also expect, in pursuit of a better experience, that the display areas of multiple applications be presented simultaneously on the same display interface.
In split-screen display, the user can adjust the position and size of the display windows of two (or more) applications only through the drag components attached to those windows. Because these drag components are small, the user must aim carefully to select them, which is very inconvenient.
Disclosure of Invention
The application provides a display window control method and an electronic device, solving the prior-art problem that a user must carefully aim at the drag component of a display window in order to operate that window.
In order to achieve the above purpose, the application adopts the following technical scheme:
In a first aspect, a display window control method is provided in which, when an electronic device enters a first mode, the action corresponding to each preset operation in the first mode is reloaded, and a display window of the electronic device is controlled according to the action corresponding to the preset operation that matches an operation input by a user.
The first mode may be a mode in which the actions corresponding to the original system-level and application-level operations are reloaded so that multiple display windows can be adjusted and controlled, mainly in the split-screen case. The preset operation may be an ordinary click operation, a gesture touch operation, a stylus touch operation, or the like. The action corresponding to each preset operation in the first mode can be set according to the user's habits; that is, the user may preset these actions. For example, a single-finger press-and-slide operation may be set to adjust the position of a display window, and a double-click operation may be set to adjust the scale of a display window.
The display windows in the first mode are controlled by reloading the action corresponding to each preset operation in the first mode, matching the acquired operation against the preset operations when an operation input by the user through the touch screen is detected, and then executing the action corresponding to the matched preset operation. The size, position, and layout of each display window in the first mode can thus be controlled without selecting a drag component, which reduces operation difficulty and makes window manipulation in a multi-window office scenario simpler.
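The reload-and-dispatch flow described above can be sketched as follows. This is a minimal illustration, not code from the patent: the class, operation names, and action names are all hypothetical placeholders, and a real device would apply the actions to actual windows rather than return strings.

```python
class WindowController:
    """Illustrative sketch: reload preset-operation handlers on entering
    the first mode, and dispatch input operations against them."""

    # Hypothetical first-mode table: preset operation -> action.
    # In the patent's example, a single-finger slide adjusts the window
    # position and a double tap adjusts the window scale.
    FIRST_MODE_ACTIONS = {
        "single_finger_slide": "adjust_window_position",
        "double_tap":          "adjust_window_scale",
    }

    def __init__(self):
        self.mode = "second"   # second mode = normal mode
        self.handlers = {}     # currently active operation -> action table

    def enter_first_mode(self):
        # "Reload" the action corresponding to each preset operation,
        # overriding the normal system/application-level handling.
        self.mode = "first"
        self.handlers = dict(self.FIRST_MODE_ACTIONS)

    def exit_first_mode(self):
        # Restore the original behaviour (second mode).
        self.mode = "second"
        self.handlers = {}

    def dispatch(self, operation):
        """Match an input operation against the presets; return the matched
        action, or None for an unmatched (invalid) operation."""
        return self.handlers.get(operation)


ctrl = WindowController()
ctrl.enter_first_mode()
print(ctrl.dispatch("single_finger_slide"))  # adjust_window_position
print(ctrl.dispatch("pinch"))                # None -> prompt invalid operation
```

An unmatched operation returns `None`, which corresponds to the invalid-operation prompt discussed later in the first aspect.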
In a possible implementation manner of the first aspect, before reloading the action corresponding to each preset operation in the first mode when the electronic device enters the first mode, the method further includes: if a preset wake-up operation input by the user is detected, entering the first mode in response to the preset wake-up operation.
The preset wake-up operation may be set according to the user's habits. For example, it may be pressing a specific key on the keyboard (the key may be preset; the press may be short or long; and the keyboard may be physical or virtual, which is not limited here). It may also be a touch gesture input on the touch screen with a specific form, such as two fingers sliding inward from the left and right edges of the screen, or a pressing operation on a floating ball on the screen. For an electronic device connected to a stylus, the preset wake-up operation may be sliding the stylus inward from one edge of the screen, double-clicking the barrel of the stylus, or pressing a specific position on the stylus such as a button or the cap. It should be noted that the preset wake-up operations above are merely examples and not limitations.
The first mode can be entered quickly through the preset wake-up operation, improving the convenience of controlling the display window.
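One of the wake-up gestures above, two fingers sliding inward from the left and right screen edges, can be checked roughly as below. This is an assumed sketch: the screen width, edge margin, and travel thresholds are invented values, and touch tracks are simplified to (start_x, end_x) pairs in pixels.

```python
SCREEN_WIDTH = 1080   # assumed screen width in pixels
EDGE_MARGIN = 40      # a touch starting within this band counts as "at the edge"
MIN_TRAVEL = 100      # minimum inward travel (px) to count as a slide

def is_wake_gesture(tracks):
    """True if exactly two fingers slide inward from opposite screen edges."""
    if len(tracks) != 2:
        return False
    from_left = from_right = False
    for start_x, end_x in tracks:
        if start_x <= EDGE_MARGIN and end_x - start_x >= MIN_TRAVEL:
            from_left = True          # inward slide from the left edge
        elif start_x >= SCREEN_WIDTH - EDGE_MARGIN and start_x - end_x >= MIN_TRAVEL:
            from_right = True         # inward slide from the right edge
    return from_left and from_right


print(is_wake_gesture([(10, 300), (1070, 800)]))  # True
print(is_wake_gesture([(500, 700), (600, 400)]))  # False: neither starts at an edge
```

A production recognizer would of course work on full motion-event streams with timing, but the edge-origin and travel-distance checks capture the shape of the gesture match.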
In a possible implementation manner of the first aspect, after reloading the action corresponding to each preset operation in the first mode when the electronic device enters the first mode, the method further includes: if a preset exit operation input by the user is detected, exiting the first mode and returning to a second mode in response to the preset exit operation.
After completing the adjustment of the display windows, the user can exit the first mode through a preset exit operation and return to the second mode. The preset exit operation may also be set according to the user's habits: for example, it may be releasing a specific pressed key, or a touch gesture input on the touch screen with a specific form, such as two fingers sliding outward from the middle of the screen toward its two sides. For an electronic device connected to a stylus, the preset exit operation may be sliding the stylus from the edge of a display window near the screen boundary to the edge of the screen, double-clicking the barrel of the stylus after the first mode has been entered, or releasing a specific pressed position (e.g. the cap or a button on the stylus) after the first mode has been entered. It will be appreciated that the preset exit operations above are merely examples and not limitations. The second mode refers to the normal mode, i.e. the mode in which the electronic device restores the actions corresponding to the original system-level and application-level operations to control the display window.
In addition, the preset exit operation may be derived from the preset wake-up operation. For example, if the preset wake-up operation is pressing a specific key on the keyboard, the preset exit operation may be releasing that key; if the preset wake-up operation is a two-finger pinch gesture, the preset exit operation may be a two-finger spread gesture (or a second two-finger pinch); if the preset wake-up operation is pressing the floating ball, the preset exit operation may be lifting the finger from the floating ball. For an electronic device connected to a stylus, if the preset wake-up operation is pressing a specific button on the stylus, the preset exit operation is releasing that button; if the preset wake-up operation is double-clicking the barrel, the preset exit operation may be double-clicking the barrel again.
The first mode can be exited quickly through the preset exit operation, improving the convenience of controlling the display window.
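The wake-to-exit pairing described above amounts to a simple lookup table. The sketch below is illustrative only; the operation names are invented placeholders, not identifiers from the patent.

```python
# Hypothetical pairing: each wake-up operation implies a natural exit operation.
EXIT_FOR_WAKE = {
    "press_key":           "release_key",
    "two_finger_pinch":    "two_finger_spread",
    "press_hover_ball":    "release_hover_ball",
    "press_stylus_button": "release_stylus_button",
    "double_tap_stylus":   "double_tap_stylus",  # same gesture toggles the mode
}

def matching_exit(wake_op):
    """Return the exit operation paired with a wake-up operation, if any."""
    return EXIT_FOR_WAKE.get(wake_op)


print(matching_exit("press_key"))         # release_key
print(matching_exit("two_finger_pinch"))  # two_finger_spread
```

Deriving the exit operation from the chosen wake-up operation keeps the two gestures symmetrical, which is the usability point the paragraph above is making.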
In a possible implementation manner of the first aspect, determining the preset operation matched with the operation input by the user includes: receiving the operation input by the user; and matching the operation input by the user against each preset operation according to its operation information, thereby determining the matched preset operation.
When the electronic device receives an operation input by the user, it can detect operation information such as the click or touch position, the slide track, and the click or press duration, identify the operation according to this information, and match the identified operation against the preset operations to determine the action to which the input should respond. After the first mode is entered, each preset operation has a corresponding action in the first mode; when an operation on the touch screen is detected, the electronic device matches the received operation against the preset operations and, once a preset operation is matched, responds to it by controlling the display windows accordingly, which effectively simplifies window manipulation in a multi-window office scenario.
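The two stages described above, classifying raw operation information and then looking up the classified operation among the presets, can be sketched as follows. The thresholds, operation names, and preset table are assumptions for illustration, not values from the patent.

```python
def classify(track_length, duration_ms, tap_count):
    """Map raw operation information (slide track length in px, press
    duration in ms, tap count) to an operation name."""
    if tap_count == 2:
        return "double_tap"
    if track_length >= 50:       # assumed slide threshold
        return "slide"
    if duration_ms >= 500:       # assumed long-press threshold
        return "long_press"
    return "tap"

# Hypothetical first-mode preset table: operation -> window-control action.
PRESETS = {"double_tap": "adjust_scale", "slide": "adjust_position"}

def match_preset(track_length, duration_ms, tap_count):
    """Return the action of the matched preset operation, or None if the
    input matches no preset (an invalid operation)."""
    return PRESETS.get(classify(track_length, duration_ms, tap_count))


print(match_preset(120, 80, 1))  # adjust_position (a slide)
print(match_preset(0, 600, 1))   # None: long_press has no preset action here
```

The `None` result is the case where the method feeds back an invalid-operation prompt, as described below.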
In a possible implementation manner of the first aspect, entering the first mode in response to a detected preset wake-up operation includes: when an operation input by the user is detected, matching it against the preset wake-up operation; and if the operation input by the user is consistent with the preset wake-up operation, determining that the user has input the preset wake-up operation and entering the first mode in response to it.
The electronic device can detect whether the user inputs an operation through an input device such as a keyboard or the touch screen, and enters the first mode only when the operation input by the user is consistent with the preset wake-up operation. This improves the accuracy of entering the first mode; moreover, the preset wake-up operation can be defined by the user, which improves operability and user experience.
In a possible implementation manner of the first aspect, exiting the first mode and returning to the second mode in response to a detected preset exit operation includes: when an operation input by the user is detected, matching it against the preset exit operation; and if the operation input by the user is consistent with the preset exit operation, determining that the user has input the preset exit operation, exiting the first mode, and returning to the second mode in response to it.
The electronic device can detect whether the user inputs an operation through an input device such as a keyboard or the touch screen, and exits the first mode only when the operation input by the user is consistent with the preset exit operation. This improves the accuracy of exiting the first mode; moreover, the preset exit operation can be defined by the user, which improves operability and user experience.
In a possible implementation manner of the first aspect, the display window control method further includes: feeding back an invalid-operation prompt to the user when the operation input by the user does not match any preset operation.
When the electronic device cannot find a preset operation matching the operation input by the user, it can feed back an invalid-operation prompt to inform the user that the input was invalid, so that the user is prompted in time and can conveniently input again.
In a second aspect, an electronic device is provided, which may include a reload unit, a determination unit, and a control unit.
The reloading unit is configured to reload the action corresponding to each preset operation in the first mode when the electronic device enters the first mode;
the determining unit is configured to determine the preset operation matching the operation input by the user;
and the control unit is configured to control the display window according to the action corresponding to the matched preset operation.
In a possible implementation manner of the second aspect, the electronic device may further include a wake-up unit, an exit unit, and a feedback unit.
The wake-up unit is configured to enter the first mode in response to a preset wake-up operation if the preset wake-up operation input by the user is detected;
the exit unit is configured to exit the first mode and return to the second mode in response to a preset exit operation if the preset exit operation input by the user is detected;
and the feedback unit is configured to feed back an invalid-operation prompt to the user when the operation input by the user does not match any preset operation.
In a possible implementation manner of the second aspect, the determining unit is mainly configured to receive the operation input by the user, match it against each preset operation according to its operation information, and determine the matched preset operation.
In a possible implementation manner of the second aspect, the wake-up unit is mainly configured to match the operation input by the user against the preset wake-up operation when an input operation is detected, and, if they are consistent, determine that the user has input the preset wake-up operation and enter the first mode in response to it.
In a possible implementation manner of the second aspect, the exit unit is mainly configured to match the operation input by the user against the preset exit operation when an input operation is detected, and, if they are consistent, determine that the user has input the preset exit operation, exit the first mode, and return to the second mode in response to it.
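The unit structure of the second aspect can be sketched as cooperating objects. This is an assumed decomposition for illustration: the class names, the single preset in the table, and the string actions are all hypothetical, and a real device would drive actual window geometry.

```python
class ReloadUnit:
    """Reloads the first-mode preset-operation table onto the device."""
    def reload(self, device):
        # Hypothetical single-entry table for brevity.
        device.handlers = {"single_finger_slide": "adjust_position"}

class DetermineUnit:
    """Determines the preset operation matching the user's input."""
    def determine(self, device, operation):
        return operation if operation in device.handlers else None

class ControlUnit:
    """Controls the display window per the matched preset's action."""
    def control(self, device, preset):
        return device.handlers[preset]

class Device:
    def __init__(self):
        self.handlers = {}
        self.reload_unit = ReloadUnit()
        self.determine_unit = DetermineUnit()
        self.control_unit = ControlUnit()

    def on_enter_first_mode(self):
        self.reload_unit.reload(self)

    def on_operation(self, operation):
        preset = self.determine_unit.determine(self, operation)
        if preset is None:
            return "invalid operation"   # the feedback unit's prompt
        return self.control_unit.control(self, preset)


d = Device()
d.on_enter_first_mode()
print(d.on_operation("single_finger_slide"))  # adjust_position
print(d.on_operation("pinch"))                # invalid operation
```

Splitting reload, determine, and control into separate units mirrors the claim structure: each unit owns one step of the first-mode pipeline.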
In a third aspect, an embodiment of the present application provides an electronic device, including: a memory and a processor, the memory for storing a computer program; the processor is configured to execute the method according to the first aspect when the computer program is called.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer program product which, when run on an electronic device, causes the electronic device to perform the method of the first aspect.
In a sixth aspect, an embodiment of the present application provides a chip system, where the chip system includes a memory and a processor, and the processor executes a computer program stored in the memory to implement the method in the first aspect, where the chip system may be a single chip or a chip module formed by multiple chips.
The advantages of the second to sixth aspects may be found in the relevant description of the first aspect, and are not described here.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a window control by a drag assembly in accordance with the prior art;
FIG. 3 is a schematic diagram of a set of interfaces provided in an embodiment of the present application;
FIG. 4 is a schematic diagram of an interface according to an embodiment of the present application;
FIG. 5 is a schematic illustration of another interface provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of another interface provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of another interface provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of another interface provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of another interface provided by an embodiment of the present application;
FIG. 10 is a schematic diagram of another set of interfaces provided in accordance with an embodiment of the present application;
FIG. 11 is a schematic diagram of another interface provided by an embodiment of the present application;
FIG. 12 is a schematic diagram of another set of interfaces provided in accordance with an embodiment of the present application;
FIG. 13 is a schematic view of another interface provided by an embodiment of the present application;
FIG. 14 is a schematic view of another interface provided by an embodiment of the present application;
FIG. 15 is a schematic view of another interface provided by an embodiment of the present application;
FIG. 16 is a schematic illustration of another interface provided by an embodiment of the present application;
FIG. 17 is a schematic diagram of another set of interfaces provided in accordance with an embodiment of the present application;
FIG. 18 is a schematic diagram of another set of interfaces provided in accordance with an embodiment of the present application;
Fig. 19 is a flowchart of a method for controlling a display window according to an embodiment of the present application;
fig. 20 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular operating system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known operating systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in the present description and the appended claims, the term "if" may be interpreted as "when", "upon", "in response to determining", or "in response to detecting", depending on the context. Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, the terms "first," "second," "third," and the like in the description of the present specification and in the appended claims, are used for distinguishing between descriptions and not necessarily for indicating or implying a relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the described embodiments are only some, but not all, embodiments of the application. All other embodiments obtained by those skilled in the art based on the embodiments of the application without making any inventive effort shall fall within the scope of the application. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may represent the following three cases: only A exists, both A and B exist, and only B exists. In addition, in the description of the embodiments of the present application, "plurality" means two or more.
The display window control method provided by the application can be applied to an electronic device including a touch-sensitive screen. When detecting that it has entered a first mode, the electronic device reloads the action corresponding to each preset operation in the first mode; then, when detecting an operation input by a user through the touch screen, it matches the acquired operation against the preset operations and executes the action corresponding to the matched preset operation. In this way, the size, position, and layout of each display window in the first mode can be controlled without selecting a drag component, which reduces operation difficulty and makes it simpler to operate windows in a multi-window office scenario.
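The remapping-and-matching flow described above can be sketched as follows. This is a minimal illustration, not the actual implementation: the gesture names, the action names, and both mapping tables are hypothetical assumptions introduced only to show the pattern of reloading preset-operation bindings on mode entry and dispatching matched operations.

```python
# Hypothetical sketch of first-mode operation remapping. All gesture and
# action names below are illustrative assumptions, not the patented mapping.

DEFAULT_ACTIONS = {
    "swipe_left": "switch_app",
    "double_tap": "zoom",
}

FIRST_MODE_ACTIONS = {
    "swipe_left": "move_focused_window_left",
    "double_tap": "maximize_focused_window",
    "pinch": "shrink_focused_window",
}


class WindowController:
    def __init__(self):
        # Outside the first mode, preset operations keep their default actions.
        self.actions = dict(DEFAULT_ACTIONS)

    def enter_first_mode(self):
        # Reload the action bound to each preset operation for the first mode.
        self.actions.update(FIRST_MODE_ACTIONS)

    def on_touch_operation(self, operation):
        # Match the detected operation against the preset operations and
        # return the action to execute; unmatched operations yield None.
        return self.actions.get(operation)


ctl = WindowController()
ctl.enter_first_mode()
print(ctl.on_touch_operation("double_tap"))  # maximize_focused_window
```

A real implementation would match raw touch data (trajectory, coordinates) against gesture templates rather than string keys, but the mode-dependent rebinding shown here is the core idea.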
The electronic device may be a mobile phone, a tablet computer, a wearable device, a vehicle-mounted device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or the like; the embodiments of the present application do not limit the specific type of the electronic device.
By way of example, fig. 1 shows a schematic diagram of an electronic device 100. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors. In this embodiment of the present application, the processor 110 may be configured to invoke the instructions in the memory to reload the action corresponding to each preset operation in the first mode after the electronic device 100 enters the first mode, and then execute the action corresponding to the preset operation matched by the operation input by the user.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it may call them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, such that the processor 110 communicates with the touch sensor 180K through the I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as the display 194 and the camera 193. The MIPI interfaces include a camera serial interface (CSI), a display serial interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the photographing functions of the electronic device 100, and the processor 110 and the display 194 communicate via a DSI interface to implement the display functions of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. It may also be used to connect a headset to play audio through the headset, or to connect other electronic devices such as AR devices.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters.
In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate.
In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency-modulate and amplify it, and convert it into electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, such that the electronic device 100 may communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini LED, a micro LED, a micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, so that the electrical signal is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals, and can process other digital signals in addition to digital image signals. For example, when the electronic device 100 selects a frequency, the digital signal processor is used to perform Fourier transform and the like on the frequency energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: dynamic picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor that rapidly processes input information by drawing on the structure of biological neural networks, for example, the transfer mode between human-brain neurons, and can also continuously perform self-learning. Applications such as intelligent cognition of the electronic device 100, for example image recognition, face recognition, speech recognition, and text understanding, may be implemented through the NPU.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on.
In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
The microphone 170C, also referred to as a "mike" or a "mic", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can speak near the microphone 170C, inputting a sound signal to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording functions, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A is of various types, such as a resistive pressure sensor, an inductive pressure sensor, and a capacitive pressure sensor. A capacitive pressure sensor may include at least two parallel plates having a conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When an operation is applied to the display 194, the electronic device 100 detects the intensity of the operation according to the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A.
In some embodiments, operations that act on the same touch location but with different operation intensities may correspond to different operation instructions. For example, when an operation whose intensity is less than a first pressure threshold acts on the SMS application icon, an instruction for viewing an SMS message is executed; when an operation whose intensity is greater than or equal to the first pressure threshold acts on the SMS application icon, an instruction for creating a new SMS message is executed.
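The intensity-to-instruction mapping in this example can be expressed as a simple threshold check. The threshold value and instruction names below are illustrative assumptions, not values from the patent:

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # illustrative normalized intensity


def sms_icon_instruction(intensity: float) -> str:
    """Map operation intensity on the SMS icon to an instruction,
    mirroring the example above (the threshold value is an assumption)."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"
    # At or above the threshold, a new message is created instead.
    return "new_sms"


print(sms_icon_instruction(0.2))  # view_sms
print(sms_icon_instruction(0.8))  # new_sms
```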
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., x, y, and z axes) may be determined by gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through the reverse motion, so as to realize anti-shake. The gyro sensor 180B may also be used for navigating, somatosensory game scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip cover using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Features such as automatic unlocking upon flip opening can then be set according to the detected open or closed state of the case or of the flip.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes), and may detect the magnitude and direction of gravity when the electronic device 100 is stationary. It may also be used to recognize the attitude of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object in the vicinity of the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object in the vicinity of the electronic device 100. The electronic device 100 can detect that the user holds the electronic device 100 close to the ear by using the proximity light sensor 180G, so as to automatically extinguish the screen for the purpose of saving power. The proximity light sensor 180G may also be used in holster mode, pocket mode to automatically unlock and lock the screen.
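The near/far decision and the screen-off behavior described above reduce to two small predicates. The reflection threshold and function names here are hypothetical, chosen only to illustrate the logic:

```python
REFLECTION_THRESHOLD = 100  # illustrative sensor reading, not a real value


def object_nearby(reflected_light: int) -> bool:
    # Sufficient reflected light -> an object is near the device;
    # insufficient reflected light -> no object nearby.
    return reflected_light >= REFLECTION_THRESHOLD


def should_turn_off_screen(reflected_light: int, in_call: bool) -> bool:
    # Turn the screen off when the device is held to the ear during a call,
    # to save power as described above.
    return in_call and object_nearby(reflected_light)


print(should_turn_off_screen(180, in_call=True))  # True
print(should_turn_off_screen(20, in_call=True))   # False
```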
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to prevent a low temperature from causing the electronic device 100 to shut down abnormally. In still other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
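The three-branch temperature processing strategy above can be sketched as a single policy function. All three threshold values and the action names are assumptions for illustration; the patent does not specify them:

```python
THROTTLE_TEMP_C = 45       # illustrative upper threshold
HEAT_BATTERY_TEMP_C = 0    # illustrative "another threshold"
BOOST_VOLTAGE_TEMP_C = -10  # illustrative "further threshold"


def thermal_policy(temp_c: float) -> str:
    """Pick a thermal action from the reported temperature, mirroring the
    three cases described above (threshold values are assumptions)."""
    if temp_c > THROTTLE_TEMP_C:
        return "reduce_processor_performance"
    # Check the lowest threshold first so the coldest case wins.
    if temp_c < BOOST_VOLTAGE_TEMP_C:
        return "boost_battery_output_voltage"
    if temp_c < HEAT_BATTERY_TEMP_C:
        return "heat_battery"
    return "no_action"


print(thermal_policy(50))   # reduce_processor_performance
print(thermal_policy(-5))   # heat_battery
print(thermal_policy(-20))  # boost_battery_output_voltage
```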
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect an operation acting on or near it. The touch sensor may communicate the detected operation to the application processor to determine the touch event type. Visual output related to the operation may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display 194.
In an embodiment of the present application, the electronic device 100 is provided with a touch screen composed of the touch sensor 180K and the display screen 194, where the touch screen may be referred to as a touch-sensitive display system or a display having a touch-sensitive surface. The display having a touch-sensitive surface includes a touch-sensitive surface and a display screen; the touch screen may display a screen interface, and may also receive touch actions.
The touch screen provides an input interface and an output interface between the device and the user. The touch screen may collect the user's operations on or near it, such as operations performed by the user on or near the touch screen using any suitable object, such as a finger, a joint, or a stylus. The touch screen may detect a user input operation and transmit the user input operation to the processor 110. The operation information of the operation input by the user may include a touch trajectory, grid capacitance values of the touch-sensitive surface, and coordinates of the touch point. The touch screen is also capable of receiving and executing commands from the processor 110. For example, the touch screen displays visual output. Visual output may include graphics, text, icons, video, and any combination thereof (collectively, "graphics"). In some embodiments, some or all of the visual output may correspond to a display interface object.
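As a rough, non-authoritative illustration, the operation information listed above (touch trajectory, grid capacitance values, touch point coordinates) might be packaged into a simple record before being passed to the processor; the class and field names below are hypothetical and for illustration only:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TouchOperation:
    """Hypothetical record of the operation information reported by the touch screen."""
    trajectory: List[Tuple[float, float]] = field(default_factory=list)  # sampled touch-point coordinates over time
    capacitance_grid: List[List[float]] = field(default_factory=list)    # grid capacitance values of the touch-sensitive surface
    touch_point: Tuple[float, float] = (0.0, 0.0)                        # coordinates of the current touch point

# Example: a short swipe sampled at two points.
op = TouchOperation(
    trajectory=[(10.0, 20.0), (12.0, 24.0)],
    capacitance_grid=[[0.1, 0.2], [0.3, 0.4]],
    touch_point=(12.0, 24.0),
)
```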
The touch screen may use LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, but other display technologies may be used in other embodiments. The touch screen may utilize any of a variety of touch sensing technologies now known or later developed, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch screen to detect contact and any movement or interruption thereof. The various touch sensing technologies include, but are not limited to, capacitive, resistive, infrared, and surface acoustic wave technologies. In an exemplary embodiment, a projected mutual capacitance sensing technique is used.
The user may make contact with the touch screen using any suitable object or appendage, such as a stylus, finger, or joint. In some embodiments, the display interface is designed to work with multi-finger gestures. In other embodiments, the display interface is designed to operate with a stylus. In still other embodiments, the display interface is designed to work with touch trajectories based on pressure touches. In some embodiments, the electronic device translates the user's coarse input into a precise pointer/cursor position or command to perform the action desired by the user. In the embodiment of the present application, the electronic device can control the plurality of display windows according to the operation input by the user.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor 180M may also be placed against the human pulse to receive blood pressure beat signals.
In some embodiments, the bone conduction sensor 180M may also be provided in a headset, combined into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, to realize a voice function. The application processor may parse heart rate information based on the blood pressure beat signals acquired by the bone conduction sensor 180M, to realize a heart rate detection function.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration alert. The motor 191 may be used for incoming-call vibration alerts as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects. Touch operations acting on different areas of the display screen 194 may also correspond to different vibration feedback effects. Different application scenarios (such as time reminders, received messages, alarm clocks, and games) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, and may be used to indicate a charging state or a change in battery level, or to indicate messages, missed calls, notifications, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card may be inserted into the SIM card interface 195 or removed from it to achieve contact with or separation from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano-SIM cards, Micro-SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface 195 simultaneously, and the types of these cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device 100 interacts with the network through the SIM card to realize functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The following mainly takes the electronic device 100 as a tablet computer as an example to describe the display window management method provided by the embodiment of the application.
Currently, electronic devices such as tablet computers allow users to manage multiple display windows through gestures. As shown in fig. 2, when the user has opened one application, the user may drag out the shortcut window 201 of applications from the bottom of the display screen, drag another application out of the shortcut window 201, place it on the other side of the screen, and release it, at which point the two applications enter split-screen display. The drag component 202 at the top of the two display windows can be used to adjust the left-right relationship between the display windows, and the drag component 202 in the middle of the display windows can be used to adjust the ratio of the two display windows, or to exit a display window.
However, since the drag component corresponding to a display window has a small size, the user needs to aim carefully to select it, which is inconvenient to operate; moreover, an application program that has not previously been placed in the shortcut window 201 cannot be dragged onto the display screen to achieve split-screen display.
In the embodiment of the present application, the preset operation may be an ordinary click operation, a gesture touch operation, a stylus touch operation, or the like. In addition, the preset operation may also be a contactless operation, for example a gesture operation captured through infrared sensing, a camera, or the like. The preset operations may include, but are not limited to, a two-finger pinch operation, a two-finger spread operation, a single-finger drag operation, a three-finger drag operation, a four-finger pinch operation, a stylus sliding operation, a double-click operation, and so on. When the electronic device detects the operation for entering the first mode, it reloads the preset operations, that is, it reassigns to each preset operation its corresponding action in the first mode. In this way, corresponding operations can be performed on the multiple split-screen windows through the preset operations, so that the user can flexibly control the size, position, and layout of each display window in the first mode, which reduces the operation difficulty and improves the ease of operating windows in a multi-window office scenario.
Reloading the operations commonly used in the second mode means redefining, in the first mode, the actions corresponding to the original system-level and application-level operations; when an operation is redefined with a corresponding action in the first mode, its corresponding action in the second mode is disabled. That is, when the user inputs an operation through the touch screen in the first mode and the input operation matches a preset operation, the electronic device responds only with the action corresponding to that operation in the first mode.
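The reloading described above can be sketched as a per-mode action table, in which each preset operation is bound to one action per mode and only the current mode's binding is dispatched. The operation and action names below are hypothetical placeholders, not terms from the application:

```python
from typing import Optional

# Hypothetical bindings: each preset operation has one action per mode.
ACTION_TABLE = {
    "two_finger_pinch": {
        "second_mode": "wake_desktop_management",  # original system-level action
        "first_mode":  "shrink_current_window",    # reloaded window-control action
    },
    "double_click": {
        "second_mode": "open_app_at_position",
        "first_mode":  "adjust_window_proportion",
    },
}

def dispatch(operation: str, mode: str) -> Optional[str]:
    """Respond only with the action bound to `operation` in the current mode;
    the other mode's binding is effectively disabled."""
    bindings = ACTION_TABLE.get(operation)
    return bindings.get(mode) if bindings else None
```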
In the embodiment of the present application, the first mode refers to a display window control mode, and the second mode refers to a normal mode. The display window control mode is a mode in which a plurality of display windows can be adjusted and controlled by reloading operations corresponding to the operations of the original system level and the application level, and is mainly directed to control of a plurality of display windows in the case of split screen.
In other implementations, the first mode may refer to a normal mode, and the second mode may refer to a display window control mode, which is not limited herein.
For example, in the second mode, the action corresponding to the two-finger pinch operation is to wake up the desktop management function, while in the first mode the reloaded action corresponding to the two-finger pinch operation is to reduce the size of the current display window according to a preset proportion. When the electronic device detects entry into the first mode, it reloads the action corresponding to the two-finger pinch operation; when it then detects that the operation input by the user is a two-finger pinch operation, it reduces the size of the current display window according to the preset proportion in response. The preset proportion may be set according to the actual application scenario: for example, it may be set according to the pinch distance of the two-finger pinch operation, it may be a preset fixed reduction ratio (for example, 1/2, 1/3, or 2/3 of the original display window), or it may be set in other ways, which is not limited herein.
For another example, in the second mode the action corresponding to the double-click operation is to open the application program at the double-clicked position, while in the first mode the reloaded action corresponding to the double-click operation is to adjust the display proportions of two or more display windows according to a preset proportion. When the electronic device detects entry into the first mode, it reloads the action corresponding to the double-click operation; when it then detects that the operation input by the user is a double-click operation, it adjusts the display proportions of the two or more display windows according to the preset proportion. For example, the display proportion of the double-clicked display window may be enlarged while the display proportions of the other display windows are reduced, or the display proportion of the double-clicked display window may be reduced while the display proportions of the other display windows are enlarged. Similarly, the preset proportion may be set according to the actual application scenario: it may be a preset fixed proportion, a proportion adaptively adjusted according to the number of current display windows, or a proportion set in other ways, which is not limited herein.
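The proportion adjustment described above amounts to redistributing the screen between the focused window and the other window while keeping the total constant. A minimal sketch for the two-window case, with a hypothetical preset share of 2/3, is:

```python
def adjust_split(total_px: int, focus_share: float):
    """Give the double-clicked window `focus_share` of `total_px` pixels and
    the remaining pixels to the other window (two-window case only)."""
    focus = round(total_px * focus_share)
    return focus, total_px - focus

# From an equal split, enlarge the double-clicked window to a preset 2/3 share.
equal = adjust_split(1200, 0.5)       # equal proportions
enlarged = adjust_split(1200, 2 / 3)  # double-clicked window enlarged
```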
In the embodiment of the present application, the electronic device can detect in real time, through the touch screen or the keyboard, whether the user inputs a preset wake-up operation; when it detects that the user has input the preset wake-up operation, the electronic device enters the first mode in response to the preset wake-up operation.
In the embodiment of the present application, after the electronic device enters the first mode, the actions corresponding to the original system-level and application-level preset operations can be reloaded, so that the user can use the original system-level and application-level operations to adjust and control the multiple display windows.
In the embodiment of the present application, entering the first mode may mean that the electronic device has already entered the split-screen display mode and, upon detecting that the user inputs a preset wake-up operation on the display screen, enters the first mode. Entering the first mode may also mean dividing the display screen of the electronic device into at least two display windows and reloading the actions corresponding to the original system-level and application-level operations; that is, when the electronic device detects that the user inputs the preset wake-up operation, it performs a split-screen operation on the display screen, divides the display screen into at least two display windows, and then reloads the actions corresponding to the original system-level and application-level operations. It should be noted that, when the electronic device enters the first mode, it may load only the actions corresponding to the preset operations, and the preset operations may be set according to the user's needs, which is not limited herein.
In the embodiment of the present application, dividing the display screen of the electronic device into at least two display windows and reloading the actions corresponding to the original system-level and application-level operations may include:
1) When the display interface of the display screen shows the operation interface of a first application program, reducing the size of the operation interface of the first application program and displaying the reduced operation interface of the first application program in a first display window; generating at least one display window in the display area outside the first display window, displaying in the generated display window the identifiers of one or more application programs related to the first application program (a main menu interface of the electronic device may also be displayed), and reloading the original system-level and application-level preset touch gestures;
2) When the display interface of the display screen shows the operation interface of the first application program, reducing the size of the operation interface of the first application program and displaying the reduced operation interface of the first application program in a first display window; generating at least one display window in the display area outside the first display window, displaying in the generated display window the operation interface of an application program related to the first application program, and reloading the original system-level and application-level preset touch gestures;
3) When the display interface of the display screen shows the operation interface of the first application program, reducing the size of the operation interface of the first application program and displaying the reduced operation interface of the first application program in a first display window; generating at least one display window in the display area outside the first display window, displaying a main menu interface in the generated display window, and reloading the original system-level and application-level preset touch gestures;
4) When the display interface of the display screen shows the operation interface of the first application program, reducing the size of the operation interface of the first application program and displaying the reduced operation interface of the first application program in a first display window; generating at least one display window in the display area outside the first display window, displaying identifiers of history programs in the generated display window, and reloading the original system-level and application-level preset touch gestures;
5) When the display interface of the display screen shows the operation interface of the first application program, reducing the size of the operation interface of the first application program and displaying the reduced operation interface of the first application program in a first display window; generating at least one display window in the display area outside the first display window, displaying thumbnails of the operation interfaces of history programs in the generated display window, and reloading the original system-level and application-level preset touch gestures.
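The screen division common to variants 1)–5) above can be sketched as computing two window rectangles; the geometry and the 50/50 default share are assumptions for illustration, and the content of the generated window (identifiers, menu, or thumbnails) is outside the sketch:

```python
def divide_screen(screen_w: int, screen_h: int, first_share: float = 0.5):
    """Shrink the running app into a first window on the left and generate a
    second window in the remaining display area; rectangles are (x, y, w, h)."""
    first_w = int(screen_w * first_share)
    first_window = (0, 0, first_w, screen_h)                    # reduced operation interface
    second_window = (first_w, 0, screen_w - first_w, screen_h)  # generated display window
    return first_window, second_window
```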
In the embodiment of the present application, the preset wake-up operation may be predefined by a user, and when the electronic device detects that the user inputs the preset wake-up operation in the second mode, the electronic device enters the first mode, and reloads actions corresponding to the preset operations in the first mode.
In the embodiment of the present application, the preset wake-up operation may be set according to the user's usage habits. For example, the preset wake-up operation may be pressing a specific key on a keyboard (the key may be preset; the press may be a short press or a long press; and the keyboard may be a physical keyboard or a virtual keyboard, which is not limited herein). For another example, the preset wake-up operation may be a touch gesture input on the touch screen with a specific gesture, such as a gesture of two fingers sliding inward from the two side edges of the screen. In addition, the preset wake-up operation may be a press operation of pressing and holding a hover ball on the screen with a finger. For an electronic device connected to a stylus, the preset wake-up operation may be a sliding operation of the stylus sliding inward from one side edge of the screen; the preset wake-up operation may also be a double-click operation of double-clicking the barrel of the stylus; it may also be a press operation of pressing a button on the stylus or a specific pressing position such as the cap of the stylus. It should be noted that the preset wake-up operations may include, but are not limited to, those set forth above, which are examples rather than limitations.
Illustratively, as shown in fig. 3 (a), the user causes the electronic device to enter a first mode by pressing a particular button 201 on the keyboard; as shown in (b) of fig. 3, the user causes the electronic device to enter a first mode by a specific gesture in which two fingers slide from the edges of both sides of the screen from outside to inside; as shown in fig. 3 (c), the user brings the electronic device into a first mode by holding down the hover ball 202 on the screen.
For example, as shown in fig. 4, the user presses the edge of one side of the screen with a stylus and slides toward the middle of the screen, so that the display screen is divided into a first display window 301 and a second display window 302; the operation interface of the original application program APP1 is displayed in the first display window, and the desktop is displayed in the second display window. If it is detected that the user clicks an application program APP2 on the desktop, the operation interface of the application program APP2 is displayed in the second display window. In this way, the user can also select an application program that is not in the shortcut window for split-screen display.
In the embodiment of the present application, the electronic device can also detect, through the touch screen or the keyboard, whether the user inputs a preset exit operation, and when it detects that the user has input the preset exit operation, it exits the first mode in response.
In the embodiment of the present application, after the user completes the adjustment of the display windows, the user can exit the first mode through a preset exit operation and return to the second mode. The preset exit operation may also be set according to the user's usage habits: for example, the preset exit operation may be releasing a pressed specific key; for another example, it may be a touch gesture input on the touch screen with a specific gesture, such as a gesture of two fingers sliding outward from the middle of the screen toward its two sides. For an electronic device connected to a stylus, the preset exit operation may be a sliding operation of the stylus sliding from the edge of a display window near the screen boundary to the edge of the screen; the preset exit operation may also be a double-click operation of double-clicking the barrel of the stylus after the first mode has been entered; it may also be an operation of releasing a specific pressing position (e.g., the cap or a button on the stylus) that was pressed to enter the first mode. It will be appreciated that the preset exit operations may include, but are not limited to, those set forth above, which are examples rather than limitations.
For example, as shown in fig. 5, in the first mode, the user can make the electronic device exit the first mode and return to the second mode by using a two-finger spread gesture, in which two fingers slide from inside to outside, in the first display window 401.
Illustratively, as shown in fig. 6, after the user presses, with the stylus, the edge of the display window of the application APP1 near the screen boundary and slides it toward the edge of the screen, the electronic device exits the first mode and returns to the second mode.
In the embodiment of the present application, the preset exit operation may be set to correspond to the preset wake-up operation. For example, if the preset wake-up operation is pressing a specific key on the keyboard, the preset exit operation may be releasing the pressed key; if the preset wake-up operation is a specific two-finger pinch gesture, the preset exit operation may be a specific two-finger spread gesture (or a second two-finger pinch gesture); if the preset wake-up operation is pressing the hover ball, the preset exit operation may be the finger leaving the hover ball. For an electronic device connected to a stylus, if the preset wake-up operation is pressing a specific button on the stylus, the preset exit operation is releasing that button; if the preset wake-up operation is double-clicking the barrel, the preset exit operation may be double-clicking the barrel again.
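The pairing of wake-up and exit operations can be sketched as a small state machine; the operation names below are hypothetical labels for the examples given above:

```python
# Hypothetical wake-up -> exit pairs drawn from the examples above.
EXIT_FOR_WAKE = {
    "press_specific_key":      "release_specific_key",
    "two_finger_pinch":        "two_finger_spread",
    "press_hover_ball":        "release_hover_ball",
    "press_stylus_button":     "release_stylus_button",
    "double_click_pen_barrel": "double_click_pen_barrel",  # same gesture toggles back
}

class ModeController:
    """Enter the first mode on the wake-up operation, exit on its paired operation."""
    def __init__(self, wake_op: str):
        self.wake_op = wake_op
        self.exit_op = EXIT_FOR_WAKE[wake_op]
        self.in_first_mode = False

    def on_input(self, op: str) -> None:
        if not self.in_first_mode and op == self.wake_op:
            self.in_first_mode = True   # reload preset actions here
        elif self.in_first_mode and op == self.exit_op:
            self.in_first_mode = False  # restore second-mode actions here
```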
In the embodiment of the present application, after the electronic device reloads the action corresponding to each preset operation, it continues to detect the operations input by the user through the touch screen, matches each received operation against the preset operations, and then executes the action of the matched preset operation, thereby realizing the operation on the multiple display windows.
In the embodiment of the present application, after the first mode is entered, each preset operation has a corresponding action in the first mode. When it is detected that the user inputs an operation on the touch screen, the electronic device matches the received operation against each preset operation, and after a matching preset operation is found, the electronic device performs the corresponding operation on the display windows in response to the matched preset operation.
In the embodiment of the present application, after the electronic device enters the first mode, it reloads the actions corresponding to the preset operations in the first mode, and these actions can be set according to the user's habits. Specifically, the user may preset the action corresponding to each preset operation according to their own habits. For example, the action corresponding to a single-finger press-and-slide operation may be set to adjust the position of a display window, and the action corresponding to a double-click operation may be set to adjust the proportion of a display window.
In the embodiment of the present application, the user can set multiple preset operations and define in advance the action corresponding to each preset operation in the first mode; once the electronic device enters the first mode, it assigns to each preset operation its corresponding action in the first mode.
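Matching a user's input against the preset operations and running the user-defined first-mode action can be sketched as a small registry; the gesture names and actions below are illustrative assumptions:

```python
class FirstModeGestures:
    """Registry of preset operations and their user-defined first-mode actions."""
    def __init__(self):
        self._bindings = {}

    def bind(self, operation, action):
        self._bindings[operation] = action  # user presets an action per habit

    def handle(self, operation):
        """Match the input against the preset operations; run the action if found."""
        action = self._bindings.get(operation)
        return action() if action else None

gestures = FirstModeGestures()
gestures.bind("single_finger_press_slide", lambda: "adjust_window_position")
gestures.bind("double_click", lambda: "adjust_window_proportion")
```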
The following will illustrate how an electronic device responds after detecting that a user has entered an operation on a touch screen, in conjunction with the accompanying drawings, and is described in detail below:
As shown in fig. 7, the application APP1 and the application APP2 are displayed in split screen, with the first display window 601 displaying the operation interface of the application APP1 and the second display window 602 displaying the operation interface of the application APP2. After the electronic device enters the first mode (taking a specific key of the keyboard being pressed as an example), the user presses the application APP1 with a single finger and slides toward the application APP2. After detecting the single-finger press-and-slide operation, the electronic device matches the received operation against each preset operation to determine the corresponding action. After the matching is completed, it is found that the action corresponding to the single-finger press-and-slide operation is to adjust the current position of the display window, so the electronic device swaps the positions of the application APP1 and the application APP2, such that the first display window 601 displays the operation interface of the application APP2 and the second display window 602 displays the operation interface of the application APP1.
As shown in fig. 8, the application APP1 and the application APP2 are displayed in split screen, with the operation interface of the application APP1 displayed in the first display window 701 and the operation interface of the application APP2 displayed in the second display window 702, and with the first display window 701 and the second display window 702 each occupying an equal proportion of the display screen. After the electronic device enters the first mode (taking a specific key of the keyboard being pressed as an example), the user double-clicks the application APP1. After detecting the double-click, the electronic device matches the received double-click operation against the preset operations to determine the corresponding action; after the matching is completed, it is found that the action corresponding to the double-click is to adjust the display proportion of the first display window 701. In the embodiment of the present application, the double-click operation may correspond to an action of enlarging the current display window according to a preset proportion. As shown in fig. 8 (a), when the first display window 701 and the second display window 702 occupy equal proportions of the display screen, the user double-clicks the application APP1, and the electronic device enlarges the current display window after detecting the double-click operation. As shown in fig. 8 (b), when the user double-clicks the application APP1 again while the current display window is enlarged, the electronic device restores the first display window 701 and the second display window 702 to equal proportions of the display screen. It will be appreciated that the double-click operation may also correspond to an action of reducing the current display window according to a preset proportion, which may be set as needed; this is merely an example and not a limitation.
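The fig. 8 behavior, in which a double-click enlarges the tapped window and a second double-click restores the equal split, can be sketched as a toggle; the 2/3 enlarged share is an assumed preset, not a value from the application:

```python
def double_click_toggle(current, total_px, enlarged_share=2 / 3):
    """Toggle the tapped window between an enlarged preset share and an equal split."""
    equal = (total_px // 2, total_px - total_px // 2)
    if current == equal:
        big = round(total_px * enlarged_share)
        return big, total_px - big  # enlarge the double-clicked window
    return equal                    # restore the equal split
```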
As shown in fig. 9, the application App1 and the application App2 are displayed in split screen, with the running interface of the application App1 displayed in the first display window 801 and the running interface of the application App2 displayed in the second display window 802. After the electronic device enters the first mode (taking a specific key of the keyboard being pressed as an example), the user may use a four-finger pinch gesture on the application App2 to make the electronic device enter the half-screen desktop (i.e., the main menu interface of the electronic device is displayed in the second display window 802). At this point, if the user clicks another application icon in the main menu interface, that application (for example, the application App3) is selected, and its running interface is displayed in the second display window.
As shown in fig. 10 (a), the application App1 and the application App2 are displayed in split screen, with the running interface of the application App1 displayed in the first display window 901 and the running interface of the application App2 displayed in the second display window 902. After the electronic device enters the first mode (taking a specific key of the keyboard being pressed as an example), the user may slide three fingers from the screen dividing line toward the screen edge in the application App2 to switch the application App2 to the previous application App3, after which the running interface of the application App3 is displayed in the second display window 902. Similarly, as shown in fig. 10 (b), the user may slide three fingers from the screen edge toward the screen dividing line in the application App3 to switch the application App3 to the next application App2, after which the second display window 902 displays the running interface of the application App2.
As shown in fig. 11, the application APP1 and the application APP2 are displayed in split screen, with the running interface of the application APP1 displayed in the first display window 1001 and the running interface of the application APP2 displayed in the second display window 1002. After the electronic device enters the first mode (taking a specific key of the keyboard being pressed as an example), the user may drag one application (for example, the application APP1) to the edge of the screen with a single finger until the first display window 1001 disappears; that application is then closed, and the other application (i.e., the application APP2) is adjusted to full-screen display.
As shown in fig. 12 (a), the application APP1 and the application APP2 are displayed in split screen, with the running interface of the application APP1 displayed in the first display window 1101 and the running interface of the application APP2 displayed in the second display window 1102. After the electronic device enters the first mode, the user may use a two-finger pinch gesture on the window of one application (for example, the application APP2) to shrink it into floating-window mode, while the other application (for example, the application APP1) is adjusted to full-screen mode. As shown in fig. 12 (b), when one application (e.g., the application APP1) is in full-screen mode and the other application (the application APP2) is in floating-window mode, the user can use a two-finger spread gesture on the application in floating-window mode (the application APP2) to adjust the two applications back to split-screen mode.
For an electronic device connected to a stylus, the user may input operations on the touch screen of the electronic device through the stylus. The electronic device then matches the input operation against the preset operations to identify the action to be performed in response, thereby realizing control and adjustment of multiple display windows.
For example, as shown in fig. 13, when application APP1 and application APP2 are displayed in split screen, the running interface of application APP1 is displayed in the first display window 1201 and the running interface of application APP2 is displayed in the second display window 1202. After the electronic device enters the first mode, the user may press the stylus down in the middle area of application APP1 and drag toward the side of application APP2 to swap the positions of the two applications, so that the running interface of application APP2 is displayed in the first display window 1201 and the running interface of application APP1 is displayed in the second display window 1202.
As shown in fig. 14, when application APP1 and application APP2 are displayed in split screen, the running interface of application APP1 is displayed in the first display window 1301 and the running interface of application APP2 is displayed in the second display window 1302. After the electronic device enters the first mode, the user may press the stylus down in the middle area of application APP1, drag it to the screen edge, and release; that application is then closed, and the other application (application APP2) is adjusted to full-screen display.
As shown in fig. 15, when application APP1 and application APP2 are displayed in split screen, the running interface of application APP1 is displayed in the first display window 1401 and the running interface of application APP2 is displayed in the second display window 1402, with the first display window 1401 and the second display window 1402 each occupying an equal proportion of the display screen. After the electronic device enters the first mode, the user double-taps application APP1 with the stylus. Upon detecting the double-tap, the electronic device matches it against the preset operations to determine the corresponding action; once matching is complete, the action is known to be adjusting the display proportion of the first display window 1401. In the embodiment of the present application, the double-tap operation may correspond to enlarging the current display window by a preset ratio. As shown in fig. 15 (a), when the two windows occupy equal proportions of the display screen, the user double-taps application APP1 with the stylus and the electronic device enlarges the current display window after detecting the double-tap. As shown in fig. 15 (b), when the user double-taps application APP1 again while the current display window is enlarged, the electronic device restores the first display window 1401 and the second display window 1402 to equal proportions of the display screen. It will be appreciated that the double-tap operation may instead correspond to shrinking the current display window by a preset ratio; the preset ratio may be set as desired, and the above is merely exemplary and not limiting.
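The double-tap ratio toggle described above can be sketched as follows. The class and constant names, the assumed equal split of 1:1, and the assumed enlargement to two thirds of the screen are illustrative only; the embodiment leaves the preset ratio configurable.

```python
# Illustrative sketch of the double-tap ratio toggle (names and ratios are assumptions).
EQUAL_RATIO = 0.5        # both windows take half of the screen (assumed)
ENLARGED_RATIO = 2 / 3   # preset enlargement ratio (assumed; configurable in the embodiment)

class SplitScreen:
    def __init__(self):
        self.first_ratio = EQUAL_RATIO  # share of the screen held by the first window

    def on_double_tap(self, window):
        """Toggle the tapped window between the preset enlarged ratio and the equal split."""
        if self.first_ratio == EQUAL_RATIO:
            # First double-tap: enlarge the tapped window by the preset ratio.
            self.first_ratio = ENLARGED_RATIO if window == "first" else 1 - ENLARGED_RATIO
        else:
            # Second double-tap: restore the equal split.
            self.first_ratio = EQUAL_RATIO
        return self.first_ratio
```

The same structure would carry a shrink-by-preset-ratio variant; only the value assigned on the first tap changes.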
As shown in fig. 16, when application APP1 and application APP2 are displayed in split screen, the running interface of application APP1 is displayed in the first display window 1501 and the running interface of application APP2 is displayed in the second display window 1502. After the electronic device enters the first mode, sliding the stylus up from the bottom of application APP2 brings the second display window 1502 to the half-screen desktop (i.e., the main menu interface of the electronic device is displayed in the second display window 1502). If the user then taps another application icon in the main menu interface, that application (e.g., application APP3) is selected and its running interface is displayed in the second display window.
As shown in fig. 17 (a), when application APP1 and application APP2 are displayed in split screen, the running interface of application APP1 is displayed in the first display window 1601 and the running interface of application APP2 is displayed in the second display window 1602. After the electronic device enters the first mode, sliding the stylus laterally at the bottom of application APP2, from the screen dividing line toward the screen edge, makes the second display window 1602 display the running interface of another application APP3 running in the background. As shown in fig. 17 (b), sliding the stylus laterally at the bottom of the newly opened application APP3, from the screen edge toward the screen dividing line, restores the second display window 1602 to the running interface of the original application APP2.
As shown in fig. 18 (a), when application APP1 and application APP2 are displayed in split screen, the running interface of application APP1 is displayed in the first display window 1701 and the running interface of application APP2 is displayed in the second display window 1702. After the electronic device enters the first mode, the user may slide the stylus down from the top of the screen on the window of one application (for example, application APP2), so that the second display window 1702 of application APP2 shrinks into floating-window mode while the other application (for example, application APP1) is adjusted to full-screen mode. As shown in fig. 18 (b), when one application (e.g., application APP1) is in full-screen mode and the other (e.g., application APP2) is in floating-window mode, sliding the stylus from the top of the floating window to the top of the display screen adjusts the two applications back to split-screen mode.
In the embodiment of the present application, when the electronic device receives an operation input by the user, it may detect operation information such as the touch position, sliding track, and tap or press duration, identify the operation from this information, and then match the identified operation against the preset operations to determine the action to be performed in response. It should be noted that when the electronic device cannot find a preset operation matching the operation input by the user, it may either ignore the operation or feed back an invalid-operation prompt informing the user that the input was an invalid operation.
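The matching step just described, classifying the raw touch data, looking the result up among the preset operations, and falling back to an invalid-operation prompt, can be sketched as follows. The gesture names, the preset table contents, and the classification rules are assumptions for illustration, not values taken from the embodiment.

```python
# Hypothetical sketch: classify an input by its operation information, then
# match it against the preset operations registered for the first mode.
PRESET_ACTIONS = {                      # preset operation -> action (assumed names)
    "three_finger_slide": "switch_app",
    "double_tap": "toggle_ratio",
    "pinch_in": "to_floating_window",
}

def classify(touch_points, duration_ms, tap_count):
    """Derive an operation type from touch position/track/duration information."""
    if tap_count == 2:
        return "double_tap"
    if len(touch_points) == 3:
        return "three_finger_slide"
    if len(touch_points) == 2:
        return "pinch_in"
    return "unknown"

def handle_input(touch_points, duration_ms, tap_count):
    op = classify(touch_points, duration_ms, tap_count)
    action = PRESET_ACTIONS.get(op)
    if action is None:
        # No matching preset operation: ignore it or prompt the user.
        return "invalid_operation_prompt"
    return action
```

A real implementation would classify from full motion-event streams rather than these simplified arguments; the lookup-then-fallback shape is the point.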
In the embodiment of the present application, by reloading the actions corresponding to various common touch gestures in the first mode, multiple display windows can be controlled and adjusted with touch gestures, and the size, position, and layout of each display window can be controlled in the first mode without selecting a drag component. This reduces operation difficulty and makes window operation in multi-window office scenarios simpler.
In combination with the above embodiments and the corresponding drawings, another embodiment of the present application provides a display window control method, which may be implemented in an electronic device having a hardware structure shown in fig. 1. As shown in fig. 19, the method may include:
s1801: the electronic device detects whether a user inputs a preset wake-up operation, if so, S1802 is executed; otherwise, continuing to detect.
In this step, the electronic device may be a portable electronic device having a touch screen, through which it detects whether the user inputs the preset wake-up operation. Specifically, the electronic device may receive an operation acting on the touch screen, compare it with the preset wake-up operation, and determine that the preset wake-up operation has been received when the two are consistent.
In this step, the electronic device may also be one having an input device such as a keyboard, through which it detects whether the user inputs the preset wake-up operation. Specifically, the electronic device may receive a pressing operation on the keyboard, determine whether the pressed key is the specific key corresponding to the preset wake-up operation, and, if so, determine that the preset wake-up operation has been received.
In this step, if the electronic device receives no operation on the touch screen or keyboard, or receives one that does not match the preset wake-up operation, it may determine that the preset wake-up operation has not been received.
S1802: a first mode is entered.
In this step, when the electronic device detects that the user has input the preset wake-up operation, it enters the first mode in response. Specifically, entering the first mode may consist of reloading the action that each preset operation responds with in that mode. Reloading a preset operation means redefining, in the first mode, the action that the operation originally triggered at the system level or application level; while an action is redefined in the first mode, its second-mode counterpart is disabled. That is, when the user inputs an operation through the touch screen in the first mode and the input matches a preset operation, the electronic device responds only with the action corresponding to the first mode.
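Reloading can be pictured as layering a first-mode gesture table over the default (second-mode) table, so that while the first mode is active the first-mode action shadows the default one. The binding names in the tables below are illustrative assumptions, not bindings defined by the embodiment.

```python
# Sketch of "reloading": first-mode bindings shadow the default second-mode
# bindings while the first mode is active (all binding names are assumed).
DEFAULT_BINDINGS = {"swipe_up": "go_home", "double_tap": "zoom_page"}
FIRST_MODE_BINDINGS = {"swipe_up": "half_screen_desktop", "double_tap": "toggle_ratio"}

class GestureDispatcher:
    def __init__(self):
        self.first_mode = False

    def enter_first_mode(self):
        # Reload: the first-mode actions take effect, shadowing the defaults.
        self.first_mode = True

    def exit_first_mode(self):
        # Release: the default system-level/application-level actions apply again.
        self.first_mode = False

    def resolve(self, gesture):
        table = FIRST_MODE_BINDINGS if self.first_mode else DEFAULT_BINDINGS
        return table.get(gesture)
```

This table-swap design is one reading of "reloading"; the key property is that the second-mode action is unreachable while its first-mode redefinition is in force.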
In this step, entering the first mode may further include dividing the display screen of the electronic device into at least two display windows and reloading the actions corresponding to the original system-level and application-level operations (the preset operations).
S1803: the electronic device detects whether the user inputs an operation, and if so, S1804 is executed; otherwise, continuing to detect.
In this step, after the electronic device enters the first mode, the electronic device may also detect whether the user has input an operation through the touch screen. User input (either a finger or a stylus, etc.) is detected in real time by a touch screen having a touch sensitive surface.
S1804: match the input operation against the preset operations and determine the preset operation that matches it.
In this step, once an input operation is detected, it must be further determined whether it is one of the preset operations reloaded in the first mode. The electronic device may determine the specific type of operation from operation information such as the touch position, sliding track, press duration, and number of presses, and then match the received operation against the preset operations to determine the one that matches the user's input.
In this step, if no preset operation matches the operation input by the user, the input is an invalid operation; the electronic device may then perform no action, or return an invalid-operation prompt informing the user that the input was invalid.
S1805: control the display window according to the action corresponding to the matched preset operation.
In this step, since each preset operation is given a new corresponding action upon entering the first mode, once matching is complete the multiple display windows can be operated simply by executing the action corresponding to the matched preset operation.
S1806: the electronic device detects whether the user inputs a preset exit operation; if so, S1807 is executed; otherwise, detection continues.
Likewise, the electronic device may be a portable electronic device having a touch screen, through which it detects whether the user inputs the preset exit operation. Specifically, the electronic device may receive an operation acting on the touch screen, compare it with the preset exit operation, and determine that the preset exit operation has been received when the two are consistent.
Similarly, the electronic device may be one having an input device such as a keyboard, through which it detects whether the user inputs the preset exit operation. Specifically, the electronic device may detect that a key on the keyboard has been released, determine whether the released key is the specific key corresponding to the preset exit operation, and, if so, determine that the preset exit operation has been received.
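Taken together with the keyboard path of S1801, this amounts to holding a specific key to gate the first mode: key-down enters it, key-up exits it. A minimal sketch follows; the choice of key is an assumption, since the embodiment does not name one.

```python
# Sketch: a held key gates the first mode (the key name is assumed).
MODE_KEY = "F1"   # hypothetical specific key bound to the wake and exit operations

class KeyModeGate:
    def __init__(self):
        self.first_mode = False

    def on_key_down(self, key):
        # Preset wake-up operation: the specific key is pressed.
        if key == MODE_KEY:
            self.first_mode = True

    def on_key_up(self, key):
        # Preset exit operation: the same key is released.
        if key == MODE_KEY:
            self.first_mode = False
```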
In this step, if the electronic device receives no operation on the touch screen or keyboard, or receives one that does not match the preset exit operation, it may determine that the preset exit operation has not been received.
S1807: the first mode is exited.
In this step, when the electronic device receives the preset exit operation, it responds by exiting the first mode and returning to the second mode. The electronic device may then release the actions that each preset operation carried in the first mode, so that each preset operation again corresponds to its system-level or application-level action in the second mode.
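Steps S1801 to S1807 amount to a small state machine: wait for the wake operation, match and execute operations while in the first mode, and release the overrides on the exit operation. A minimal sketch, with event names assumed for illustration:

```python
# Minimal sketch of the S1801-S1807 flow as a state machine (event names assumed).
def run(events, preset_actions):
    """Consume a stream of input events; return the actions executed in first mode."""
    in_first_mode = False
    executed = []
    for ev in events:
        if not in_first_mode:
            if ev == "wake":            # S1801/S1802: preset wake-up operation detected
                in_first_mode = True
        elif ev == "exit":              # S1806/S1807: preset exit operation detected
            in_first_mode = False
        else:                           # S1803-S1805: match the input and execute
            executed.append(preset_actions.get(ev, "invalid_operation_prompt"))
    return executed
```

Note that operations arriving before the wake event are simply left to their second-mode handling (here, ignored), matching the "otherwise, continue detecting" branches of S1801 and S1806.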
According to the display window control method provided by this embodiment of the present application, when the electronic device is detected to enter the first mode, the action corresponding to each preset operation in the first mode can be reloaded; then, when an operation input by the user through the touch screen is detected, the received operation is matched against the preset operations and the action of the matched preset operation is executed. The size, position, and layout of each display window can thus be controlled in the first mode without selecting a drag component, reducing operation difficulty and making window operation in multi-window office scenarios simpler.
It will be appreciated that, in order to achieve the above functions, the electronic device includes corresponding hardware and/or software modules for performing each function. In combination with the example algorithm steps described in connection with the embodiments disclosed herein, the present application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions for each particular application in conjunction with the embodiments, but such implementation decisions should not be considered beyond the scope of the present application.
In the embodiments of the present application, the electronic device may be divided into functional modules according to the above method examples; for example, each functional module may be divided to correspond to one function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware. It should be noted that the division of modules in the embodiments of the present application is schematic and is merely a division by logical function; other division manners may be used in actual implementation.
In the case of dividing the respective functional modules with the respective functions, fig. 20 shows a schematic diagram of one possible composition of the electronic device 1900 involved in the above-described embodiment, and as shown in fig. 20, the electronic device 1900 may include: a first detection unit 1901, a wake-up unit 1902, a second detection unit 1903, a matching unit 1904, a control unit 1905, a third detection unit 1906, an exit unit 1907, and the like.
Wherein the first detection unit 1901 may be used to support the electronic device 1900 to perform step 1801 described above, and/or other processes for the techniques described herein.
The wake-up unit 1902 may be used to support the electronic device 1900 to perform step 1802, described above, etc., and/or other processes for the techniques described herein.
The second detection unit 1903 may be used to support the electronic device 1900 to perform steps 1803 and the like described above, and/or other processes for the techniques described herein.
The matching unit 1904 may be used to support the electronic device 1900 to perform steps 1804 and the like described above, and/or other processes for the techniques described herein.
The control unit 1905 may be used to support the electronic device 1900 in performing steps 1805 and the like described above, and/or other processes for the techniques described herein.
The third detection unit 1906 may be used to support the electronic device 1900 to perform steps 1806 and the like described above, and/or other processes for the techniques described herein.
The exit unit 1907 may be used to support the electronic device 1900 to perform steps 1807 and the like described above, and/or other processes for the techniques described herein.
It should be noted that for all relevant details of the steps in the above method embodiment, reference may be made to the functional description of the corresponding functional module; details are not repeated here.
The electronic device provided by this embodiment of the present application is configured to execute the above display window control method, and can therefore achieve the same effects as the implementation methods above.
In case an integrated unit is employed, the electronic device may comprise a processing module and a memory module. The processing module may be configured to control and manage an action of the electronic device, for example, may be configured to support the electronic device to execute the steps performed by the first detection unit 1901, the wake-up unit 1902, the second detection unit 1903, the matching unit 1904, the control unit 1905, the third detection unit 1906, and the exit unit 1907.
The memory module may be used to support the electronic device in storing program code, data, etc.
In addition, the electronic device may also include a communication module that may be used to support communication of the electronic device with other devices.
The processing module may be a processor or a controller, which may implement or execute the various exemplary logical blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination that performs computing functions, for example a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor. The memory module may be a memory. The communication module may be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, or other device that interacts with other electronic equipment.
In one embodiment, when the processing module is a processor and the storage module is a memory, the electronic device according to the embodiment of the present application may be an electronic device having the structure shown in fig. 1. Specifically, the internal memory 121 shown in fig. 1 may store computer program instructions that, when executed by the processor 110, enable the electronic device to perform the steps of the display window control method described above.
The embodiment of the application also provides a computer storage medium, in which computer instructions are stored, which when run on an electronic device, cause the electronic device to execute the related method steps to implement the display window control method in the above embodiment.
The embodiment of the present application also provides a computer program product, which when run on a computer, causes the computer to perform the above-mentioned related steps to implement the display window control method in the above-mentioned embodiment.
In addition, embodiments of the present application also provide an apparatus, which may be embodied as a chip, component or module, which may include a processor and a memory coupled to each other; the memory is used for storing computer-executable instructions, and when the device is operated, the processor can execute the computer-executable instructions stored in the memory, so that the chip executes the display window control method in each method embodiment.
The electronic device, the computer storage medium, the computer program product, or the chip provided by the embodiments of the present application are used to execute the corresponding methods provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding methods provided above, and will not be described herein.
It will be apparent to those skilled in the art from the above description that, for convenience and brevity, only the division into the functional modules described above is illustrated by example. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated unit, if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The foregoing storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Finally, it should be noted that: the foregoing is merely illustrative of specific embodiments of the present application, and the scope of the present application is not limited thereto, but any changes or substitutions within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (11)

1. A display window control method, characterized by comprising:
when the electronic device enters a first mode, reloading the actions corresponding to each preset operation in the first mode, wherein the first mode is a mode in which a plurality of display windows are adjusted and controlled by reloading the actions corresponding to original system-level and application-level operations;
determining a preset operation matched with the operation input by the user;
and controlling any display window among the plurality of display windows according to the action corresponding to the preset operation matching the operation input by the user.
2. The display window control method according to claim 1, characterized by further comprising, before the reloading of the actions corresponding to each preset operation in the first mode when the electronic device enters the first mode:
if it is detected that the user inputs a preset wake-up operation, responding to the preset wake-up operation and entering the first mode.
3. The display window control method according to claim 1 or 2, characterized by further comprising, after the reloading of the actions corresponding to each preset operation in the first mode when the electronic device enters the first mode:
if it is detected that the user inputs a preset exit operation, responding to the preset exit operation and exiting the first mode.
4. The display window control method according to claim 1, wherein the determining a preset operation matching the operation input by the user comprises:
receiving the operation input by the user;
and matching the operation input by the user against each preset operation according to the operation information of the input operation, to determine the preset operation matching the operation input by the user.
5. The display window control method according to claim 2, wherein, if it is detected that the user inputs a preset wake-up operation, the responding to the preset wake-up operation and entering the first mode comprises:
when a user input operation is detected, matching the operation input by the user with the preset wake-up operation;
if the operation input by the user is consistent with the preset wake-up operation, determining that the user has input the preset wake-up operation, responding to the preset wake-up operation, and entering the first mode.
6. The display window control method according to claim 3, wherein, if it is detected that the user inputs a preset exit operation, the responding to the preset exit operation and exiting the first mode comprises:
when a user input operation is detected, matching the operation input by the user with the preset exit operation;
if the operation input by the user is consistent with the preset exit operation, determining that the user has input the preset exit operation, and responding to the preset exit operation by exiting the first mode.
7. The display window control method according to claim 1, characterized by further comprising:
and when the operation input by the user does not match any preset operation, feeding back an invalid-operation prompt to the user.
8. An electronic device, comprising:
a reloading unit, configured to reload, when the electronic device enters a first mode, the actions corresponding to each preset operation in the first mode, wherein the first mode is a mode in which a plurality of display windows are adjusted and controlled by reloading the actions corresponding to original system-level and application-level operations;
a determining unit configured to determine a preset operation that matches an operation input by a user;
and a control unit, configured to control any display window among the plurality of display windows according to the action corresponding to the preset operation matching the operation input by the user.
9. An electronic device, the electronic device comprising: at least one processor; at least one memory; wherein the at least one memory stores computer program instructions therein that, when executed by the at least one processor, cause the electronic device to perform the display window control method of any of claims 1-7.
10. A chip system comprising a processor coupled to a memory, the processor executing a computer program stored in the memory to implement the display window control method of any of claims 1 to 7.
11. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the display window control method according to any one of claims 1 to 7.
CN202010570732.XA 2020-06-19 2020-06-19 Display window control method and electronic equipment Active CN113821129B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010570732.XA CN113821129B (en) 2020-06-19 2020-06-19 Display window control method and electronic equipment

Publications (2)

Publication Number Publication Date
CN113821129A CN113821129A (en) 2021-12-21
CN113821129B true CN113821129B (en) 2024-08-02

Family

ID=78912187

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010570732.XA Active CN113821129B (en) 2020-06-19 2020-06-19 Display window control method and electronic equipment

Country Status (1)

Country Link
CN (1) CN113821129B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116578219B (en) * 2023-04-28 2024-06-14 北京洞悉网络有限公司 Form page self-adaptive display method and device suitable for left and right double screens, computer equipment and computer readable storage medium

Citations (2)

Publication number Priority date Publication date Assignee Title
CN102890610A (en) * 2011-07-18 2013-01-23 中兴通讯股份有限公司 Document processing method of terminal with touch screen and terminal with touch screen
CN104777979A (en) * 2015-03-31 2015-07-15 努比亚技术有限公司 Terminal as well as touch operation method and device for same

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
CN104166517A (en) * 2014-07-31 2014-11-26 中兴通讯股份有限公司 Method and device for operating touch screen device
US10048856B2 (en) * 2014-12-30 2018-08-14 Microsoft Technology Licensing, Llc Configuring a user interface based on an experience mode transition
EP3232308B1 (en) * 2015-01-04 2020-09-30 Huawei Technologies Co. Ltd. Notification information processing method, device, and terminal
CN108509137A (en) * 2017-02-28 2018-09-07 中兴通讯股份有限公司 Redefine the method and device of the manipulation display area of screen


Similar Documents

Publication Publication Date Title
US11785329B2 (en) Camera switching method for terminal, and terminal
EP4221164B1 (en) Display method for electronic device with flexible display and electronic device
CN113645351B (en) Application interface interaction method, electronic device and computer-readable storage medium
US11994918B2 (en) Electronic device control method and electronic device
WO2020134869A1 (en) Electronic device operating method and electronic device
WO2020224449A1 (en) Split-screen display operation method and electronic device
CN110032307A (en) A kind of moving method and electronic equipment of application icon
WO2021036770A1 (en) Split-screen processing method and terminal device
CN112740152B (en) Handwriting pen detection method, handwriting pen detection system and related device
WO2021057699A1 (en) Method for controlling electronic device with flexible screen, and electronic device
CN115129410B (en) Desktop wallpaper configuration method and device, electronic equipment and readable storage medium
CN113805487A (en) Control instruction generation method and device, terminal equipment and readable storage medium
WO2021238370A1 (en) Display control method, electronic device, and computer-readable storage medium
WO2020221062A1 (en) Navigation operation method and electronic device
CN114201738A (en) Unlocking method and electronic equipment
CN113821129B (en) Display window control method and electronic equipment
CN119311357A (en) Translation result display method, device and electronic device
US12204741B2 (en) Screenshot method and related device
CN118444832B (en) Touch operation method and electronic device
CN119271063A (en) Operation response method, electronic device, chip and storage medium
WO2024046179A1 (en) Interaction event processing method and apparatus
WO2023207715A1 (en) Screen-on control method, electronic device, and computer-readable storage medium
WO2023142822A1 (en) Information interaction method, watch, and computer readable storage medium
CN116560768A (en) Interface display method and electronic equipment
CN116560544A (en) Interaction method and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant