CN113703632A - Interface display method and device and electronic equipment - Google Patents
- Publication number
- CN113703632A (application number CN202111013110.8A)
- Authority
- CN
- China
- Prior art keywords
- interface
- display
- area
- objects
- screen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application discloses an interface display method and device and electronic equipment, belongs to the technical field of electronic equipment, and aims to solve the technical problem that, when a plurality of interfaces are displayed in a split-screen mode at present, the user cannot see the interface display contents clearly. The method comprises the following steps: receiving an interface display instruction; and displaying an interface of at least two objects on the screen in response to the interface display instruction, wherein the display areas of the interfaces of the at least two objects on the screen are the same, and the interfaces of the at least two objects are at least partially visible. The interface display method and device and the electronic equipment are used for displaying interfaces.
Description
Technical Field
The application belongs to the technical field of electronic equipment, and particularly relates to an interface display method and device, and electronic equipment.
Background
When a user wants to use different applications (APP for short), the user needs to switch between the different applications, which is a cumbersome operation.
At present, split-screen display of a plurality of applications is generally adopted to avoid the process of switching between applications. Split-screen display of multiple applications displays the display interfaces of the applications in different display areas of the screen; limited by the screen size, each display interface is displayed in a reduced mode, so the pictures and fonts of each display interface are small and it is inconvenient for the user to see the interface display contents clearly.
Disclosure of Invention
The embodiment of the application aims to provide an interface display method, an interface display device and electronic equipment, so as to solve the technical problem that, when a plurality of interfaces are displayed in a split-screen mode at present, the user cannot see the interface display contents clearly.
In a first aspect, an embodiment of the present application provides an interface display method, including:
receiving an interface display instruction;
and displaying an interface of at least two objects on the screen in response to the interface display instruction, wherein the display areas of the interfaces of the at least two objects on the screen are the same, and the interfaces of the at least two objects are at least partially visible.
In a second aspect, an embodiment of the present application provides an interface display apparatus, including:
the receiving module is used for receiving an interface display instruction;
and the display module is used for responding to the interface display instruction and displaying the interfaces of at least two objects on the screen, wherein the display areas of the interfaces of the at least two objects on the screen are the same, and the interfaces of the at least two objects are at least partially visible.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiment of the application, an interface display instruction is received, and, in response to the interface display instruction, an interface of at least two objects is displayed on the screen, wherein the display areas of the interfaces of the at least two objects on the screen are the same and the interfaces of the at least two objects are at least partially visible. The interfaces of the objects can thus be displayed simultaneously in the same display area of the screen, and each interface can have the same size as when it is displayed alone; this avoids reducing the interfaces of the objects and displaying them in different parts of the display area, so that the user can see the interface display contents clearly.
Drawings
FIG. 1 is a schematic flow chart of an interface display method according to an embodiment of the present invention;
FIGS. 2-1 and 2-2 are schematic diagrams of an interface display method provided by an embodiment of the invention in a practical application scenario;
FIG. 3 is a second schematic diagram of an interface display method provided by an embodiment of the invention in an actual application scenario;
FIG. 4 is a third schematic diagram of an interface display method provided by an embodiment of the present invention in an actual application scenario;
FIG. 5 is a flowchart illustrating an interface display method according to an embodiment of the present invention;
FIG. 6 is a fourth schematic diagram of an interface display method provided by an embodiment of the present invention in an actual application scenario;
FIG. 7 is a flowchart illustrating an interface display method according to an embodiment of the present invention;
FIG. 8 is a fifth schematic view of an interface display method provided by an embodiment of the present invention in an actual application scenario;
FIG. 9 is a sixth schematic view of an interface display method provided by an embodiment of the present invention in an actual application scenario;
FIG. 10 is a seventh schematic diagram of an interface display method provided by an embodiment of the present invention in an actual application scenario;
FIG. 11 is a flowchart illustrating an interface display method according to an embodiment of the present invention;
FIG. 12 is a flowchart illustrating an interface display method according to an embodiment of the present invention;
FIG. 13 is a schematic structural diagram of an interface display apparatus according to an embodiment of the present invention;
FIG. 14 is a schematic diagram of a hardware structure of an electronic device implementing various embodiments of the present invention.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms first, second and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances, so that the embodiments of the application are capable of operating in sequences other than those illustrated or described herein. In addition, "and/or" in the specification and claims means at least one of the connected objects, and the character "/" generally indicates that the related objects before and after it are in an "or" relationship.
The interface display method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
As shown in fig. 1, an embodiment of the present application provides an interface display method, which may be performed by an electronic device, in other words, may be performed by software or hardware installed on the electronic device, the method including the steps of:
step 101: and receiving an interface display instruction.
The interface display instruction may be an instruction to simultaneously display interfaces of a plurality of objects. When the user inputs the interface display instruction, it can be determined that the interfaces of a plurality of objects are to be displayed simultaneously on the screen.
The object may be an application, a document, a function key in an application, or the like. If the object is an application, the interfaces of the multiple objects may be interfaces of multiple applications. If the object is a document, the interfaces of the multiple objects may be interfaces of multiple documents. Or, if the object is a function key in an application, the interfaces of the multiple objects may be interfaces corresponding to multiple function keys. Taking an instant messaging application as an example, when the instant messaging application is opened, the display interface comprises three function keys, namely "message", "address list" and "setting", and each function key has a corresponding interface; in the prior art, the interfaces corresponding to the function keys can only be displayed one after another by clicking the different function keys, whereas in the embodiment of the application, the interfaces corresponding to a plurality of function keys can be displayed simultaneously by inputting the interface display instruction.
The interface display instruction can be input in various ways, and the input way can also be set according to the type of the object, and the embodiment of the application does not limit the input way.
For example, in the case where the object is an application, in one embodiment, as shown in fig. 2-1, the input mode is: in the case of displaying the interface of APP1 on the screen, the user inputs a first preset operation, and a list of applications recently used by the user, including APP2 and APP3, is displayed on the interface; after a selection operation of the user on APP2 and/or APP3 is received, a prompt box pops up displaying "whether to enter the same-screen mode", and, as shown in fig. 2-2, based on the user's input confirming entry into the same-screen mode, the interface of APP1 and the interface of the application selected by the user are displayed simultaneously. The first preset operation may be a long-press operation, a click operation, a sliding operation, and the like; the user selects APP2 and/or APP3 as the same-screen object through selection operations on APP2 and APP3; when the user selects both APP2 and APP3, the interfaces of the three applications may be displayed simultaneously.
Alternatively, when the object is an application, in another embodiment, as shown in fig. 3, the input mode is: the electronic equipment displays the icons of APP1 and APP2 on the desktop, the user drags the icon of APP2 so that it overlaps the icon of APP1, and after the overlapping area and/or the overlapping time reaches a preset condition, the interfaces of APP1 and APP2 are directly displayed simultaneously. That is, in this input mode, the interface display instruction is input through a drag operation.
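As a purely illustrative sketch of the overlap condition just described, the following Kotlin fragment checks whether a dragged icon has overlapped a target icon over a sufficient area and for a sufficient time; the threshold names and values are assumptions, not part of the disclosed embodiment.

```kotlin
import android.graphics.Rect

// Assumed preset conditions; the embodiment leaves the concrete thresholds open.
private const val MIN_OVERLAP_RATIO = 0.5f        // fraction of the target icon that must be covered
private const val MIN_OVERLAP_DURATION_MS = 800L  // how long the overlap must last

data class DragState(val draggedIconBounds: Rect, val overlapStartMillis: Long?)

fun shouldEnterSameScreenMode(drag: DragState, targetIconBounds: Rect, nowMillis: Long): Boolean {
    val overlap = Rect()
    // setIntersect() returns false when the two rectangles do not overlap at all.
    if (!overlap.setIntersect(drag.draggedIconBounds, targetIconBounds)) return false
    val overlapArea = overlap.width() * overlap.height()
    val targetArea = targetIconBounds.width() * targetIconBounds.height()
    val areaConditionMet = targetArea > 0 && overlapArea.toFloat() / targetArea >= MIN_OVERLAP_RATIO
    val overlapStart = drag.overlapStartMillis ?: return false
    val timeConditionMet = nowMillis - overlapStart >= MIN_OVERLAP_DURATION_MS
    // Here both conditions are required; the embodiment also allows either one alone ("and/or").
    return areaConditionMet && timeConditionMet
}
```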
For another example, when the object is a function key in an application, in one embodiment, as shown in fig. 4, the input mode is: a display interface of a certain application is displayed on the screen and comprises a function key A, a function key B and a function key C, the currently displayed interface being the interface corresponding to the function key A; a second preset operation of the user on the function key B and/or the function key C is received, and the interface corresponding to the function key A and the interface of the function key selected by the user are displayed simultaneously. The second preset operation may be a long-press operation and the like; for example, after the user presses the function key B for a long time, the currently displayed interface and the interface corresponding to the function key B are displayed simultaneously.
Step 102: and displaying an interface of at least two objects on the screen in response to the interface display instruction, wherein the display areas of the interfaces of the at least two objects on the screen are the same, and the interfaces of the at least two objects are at least partially visible.
For the object and the related definitions of the interface of the object, refer to step 101, which is not described herein again.
The display areas of the interfaces of the at least two objects on the screen being the same specifically means that the interfaces of the objects are all displayed in the same display area, and the interface size of each object is the same as the size of the display area. The interfaces of the at least two objects being at least partially visible means that at least part of the interface content of each of the at least two interfaces is visible in that same display area. For example, if the interfaces of the at least two objects include a first interface and a second interface, the user can see at least part of the interface content of the first interface and at least part of the interface content of the second interface in the same display area. If the first interface is only partially visible, the interface size of the first interface is still the same as the size of the shared display area, but the user can only see part of the interface content of the first interface in that display area; similarly, if the second interface is only partially visible, the interface size of the second interface is the same as the size of the shared display area, but the user can only see part of its interface content there.
In this embodiment, the display area for displaying the interfaces of the at least two objects may be the entire display area of the screen, in which case the interface of each object may be displayed in full screen. In practical applications, the display area for displaying the interfaces of the at least two objects may also be a partial display area of the screen. For example, the electronic device may be an electronic device with a curved screen; a curved screen is generally divided into a main display area and auxiliary display areas on the left and right sides, the auxiliary display areas generally being function areas, so for such an electronic device the interface of each object may be displayed in the main display area.
In this way, an interface display instruction is received, and interfaces of at least two objects are displayed on the screen in response to the interface display instruction, wherein the display areas of the interfaces of the at least two objects on the screen are the same and the interfaces of the at least two objects are at least partially visible. The interfaces of the objects can be displayed simultaneously in the same display area of the screen, so each interface can have the same size as when it is displayed alone; for example, when the interfaces of the objects are displayed in full screen, each interface has the same size as when it is displayed alone in full screen. This avoids reducing the interfaces of the objects and displaying them in different parts of the display area, so the user can see the interface display contents clearly.
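A minimal Kotlin sketch of this core idea follows: both interfaces are attached, full size, to the same display area and layered so that each remains at least partially visible. The view-based layering and the alpha value are illustrative assumptions only; the embodiment does not prescribe a particular platform or rendering approach.

```kotlin
import android.view.View
import android.view.ViewGroup
import android.widget.FrameLayout

fun showInSameDisplayArea(displayArea: FrameLayout, lowerInterface: View, upperInterface: View) {
    fun fullSize() = FrameLayout.LayoutParams(
        ViewGroup.LayoutParams.MATCH_PARENT,
        ViewGroup.LayoutParams.MATCH_PARENT
    )
    displayArea.removeAllViews()
    // Both interfaces occupy the same, full display area instead of two reduced split-screen panes.
    displayArea.addView(lowerInterface, fullSize())
    displayArea.addView(upperInterface, fullSize())
    // Assumed transparency so that the lower interface stays at least partially visible.
    upperInterface.alpha = 0.6f
}
```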
For convenience of explaining the interface display method provided by the embodiment of the present application, hereinafter, the solution provided by the embodiment of the present application is described by taking the number of the objects as two as an example, then the at least two objects include a first object and a second object, and the interface of the at least two objects includes a first interface of the first object and a second interface of the second object. It should be understood that the at least two objects including the first object and the second object are only an example and do not represent a limitation to the embodiments of the present application, and the at least two objects may further include a third object, a fourth object, and so on, and when the at least two objects include more objects, reference may also be made to the interface display method provided in the following embodiments.
To facilitate the user's viewing of the first interface and the second interface, in one embodiment, as shown in fig. 5, step 102, in response to the interface display instruction, displays an interface of at least two objects on the screen, including:
step 1021: and determining the relative position relationship of the first interface and the second interface in response to the interface display instruction.
The relative position relationship between the first interface and the second interface can be the relative position relationship between the upper layer and the lower layer; specifically, the relative positional relationship of the first interface and the second interface includes: the first interface is displayed over the second interface, or the second interface is displayed over the first interface.
There may be various implementation manners for determining the relative position relationship between the first interface and the second interface, and the determination may be performed according to the opening sequence of the first interface and the second interface, for example, if the first interface is opened first and the second interface is opened later, it may be determined that the first interface is displayed on the upper layer of the second interface, or it may be determined that the second interface is displayed on the upper layer of the first interface. The relative position relationship between the first interface and the second interface may be determined according to a selection of a user, for example, after receiving an interface display instruction, a selection box pops up, and the user selects an interface displayed on an upper layer or an interface displayed on a lower layer in the selection box.
In a preferred embodiment, determining the relative position relationship between the first interface and the second interface in response to the interface display instruction specifically includes: in response to the interface display instruction, acquiring the display content of the first interface of the first object and the display content of the second interface of the second object; determining the priority of the first interface and the priority of the second interface according to the display content of the first interface and the display content of the second interface; and determining the relative position relationship between the first interface and the second interface according to the priorities of the first interface and the second interface.
The priority of the first interface and the priority of the second interface are determined according to the display content of the first interface and the display content of the second interface, and the priority of the first interface and the priority of the second interface may be determined according to the complexity of the pictures in the first interface and the second interface, the dynamic and static states of the pictures in the first interface and the second interface, and the like, wherein the complexity may be specifically determined according to the number of elements in the interface. For example, if the screen complexity of the first interface is greater than the screen complexity of the second interface, it may be determined that the priority of the first interface is higher than the priority of the second interface; for another example, if the display content of the first interface is static content and the display content of the second interface is dynamic content, it may be determined that the priority of the second interface is higher than the priority of the first interface.
In order to prevent the user from missing important content in the two interfaces, in one embodiment, determining the priorities of the first interface and the second interface according to the display contents of the first interface and the second interface includes: determining the priority of the first interface and the priority of the second interface according to the importance of the display content of the first interface and the display content of the second interface, wherein, of the first interface and the second interface, the interface whose display content is more important has a higher priority than the interface whose display content is less important.
In practical applications, the importance of the display contents of the first interface and the second interface is determined according to a preset algorithm or a rule established by the user in advance. For example, the electronic device may provide a setting interface for the interface display method provided in the embodiment of the present application, in which the user may input an importance ranking of display content, for example, video playing content is more important than instant messaging content, and instant messaging content is more important than advertisement content. In particular, when the advertisement content is advertisement content inserted during video playing, the importance of the display contents of the two interfaces may change as the display contents of the first interface and the second interface change, and thus the priorities of the first interface and the second interface may also change. In other words, the priorities of the first interface and the second interface may change according to a change in the comparison result of the importance of the display content of the first interface and the display content of the second interface. For example, in the above example, based on the rule input by the user, the first interface is a video playing content interface and the second interface is an instant messaging interface; when the video playing content interface is playing the feature film, the first interface may be displayed preferentially (e.g., the first interface is placed on the second interface), and when the video playing content interface switches from the feature film to advertisement content, the second interface may be displayed preferentially (e.g., the second interface is placed on the first interface). In this way, the time during which the advertisement is playing can be utilized while the user is prevented from missing important content.
The relative position relationship between the first interface and the second interface is determined according to the priorities of the first interface and the second interface: the interface with the higher priority is displayed on the upper layer, and the interface with the lower priority is displayed on the lower layer. For example, when there are three interfaces, i.e., a first interface, a second interface and a third interface, whose priorities are ranked as first interface > second interface > third interface, the relative position relationship of the three interfaces may be that the second interface is below the first interface and the third interface is below the second interface.
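The following Kotlin sketch illustrates one way the priority-based relative position relationship could be computed; the content categories and their importance scores are assumptions corresponding to the user-configured ranking mentioned above, not a fixed part of the method.

```kotlin
// Assumed content categories; the embodiment only requires some notion of display-content importance.
enum class ContentKind { VIDEO_FEATURE, INSTANT_MESSAGE, ADVERTISEMENT, OTHER }

data class InterfaceInfo(val name: String, val kind: ContentKind)

// Example importance ranking: video feature > instant messaging > advertisement.
fun importanceOf(kind: ContentKind): Int = when (kind) {
    ContentKind.VIDEO_FEATURE -> 3
    ContentKind.INSTANT_MESSAGE -> 2
    ContentKind.ADVERTISEMENT -> 1
    ContentKind.OTHER -> 0
}

// Returns the interfaces ordered from the lowest layer to the upper layer:
// the interface with the most important content ends up on top.
fun layerOrder(interfaces: List<InterfaceInfo>): List<InterfaceInfo> =
    interfaces.sortedBy { importanceOf(it.kind) }
```

Re-evaluating layerOrder whenever the display content changes (for example, when a feature film switches to an advertisement) would yield the dynamic re-ordering described above.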
Step 1022: and displaying a first interface of the first object and a second interface of the second object in the same display area on the screen according to the relative position relation.
If the relative position relationship is determined according to the priorities of the first interface and the second interface, in the case that the priorities of the first interface and the second interface are different, the interface with higher priority in the first interface and the second interface can be displayed on the interface with lower priority in the same display area on the screen.
For example, in the above embodiment, after the priorities of the first interface and the second interface are determined according to the importance of the display content of the first interface and the display content of the second interface, the interface with higher importance of display content in the first interface and the second interface may be displayed on the interface with lower importance of display content.
It can be understood that, by the above scheme, the priorities of the first interface and the second interface are determined, and the relative position relationship between the first interface and the second interface is determined according to the priorities, so that the interface with the higher priority can be displayed on the interface with the lower priority, thereby avoiding confusion of display contents of the first interface and the second interface caused by the fact that the first interface and the second interface are displayed on the same layer, and further facilitating the user to watch the first interface and the second interface.
In one embodiment, the step 1022 of displaying the first interface of the first object and the second interface of the second object in the same display area on the screen according to the relative position relationship further includes: and under the condition that the priorities of the first interface and the second interface meet preset conditions, simultaneously displaying the first interface and the second interface in a preset transparency in the same display area on the screen according to the relative position relation.
The preset condition may be that the priorities of the first interface and the second interface are the same; it should be understood that the priorities being the same is only one example of the preset condition, and the preset condition may also be another condition in practical applications. When the priorities of the first interface and the second interface are the same, either interface can be selected to be displayed on the upper layer, with the other interface displayed on the lower layer. When the first interface and the second interface are displayed in the same display area on the screen, the first interface and the second interface can both be displayed with a preset transparency, so that the first interface and the second interface can be displayed clearly and the user can conveniently view both.
Displaying the first interface and the second interface at the same time with a preset transparency, specifically, displaying the first interface and the second interface with the same transparency; or the first interface and the second interface are displayed with different transparencies, for example, the first interface is displayed with a first preset transparency, the second interface is displayed with a second preset transparency, and the first preset transparency is different from the second preset transparency. The same transparency value, the first preset transparency value and the second preset transparency value may be set and adjusted according to actual conditions, and are not described herein again.
In the foregoing embodiment, in the case where the priority of the first interface and the priority of the second interface are different, when the interface with the higher priority is displayed on the interface with the lower priority, the interface displayed on the upper layer may completely cover and block the interface on the lower layer. Considering that, although the interface on the upper layer has the higher priority, the user may still need to view the interface on the lower layer at the same time, in one embodiment the specific display manner of displaying the higher-priority interface on the lower-priority interface includes: adjusting the transparency of the higher-priority interface of the first interface and the second interface, so that the transparency of the higher-priority interface is greater than that of the lower-priority interface.
Wherein the greater the transparency, the more transparent the interface. When the interface with higher priority is displayed on the interface with lower priority, the transparency of the upper layer interface and/or the lower layer interface can be adjusted to meet the requirement that a user can watch the two interfaces simultaneously. For example, when a higher priority interface is displayed on top of a lower priority interface, the transparency of the upper interface may be increased such that the transparency of the upper interface is greater than the transparency of the lower interface, such that a user may see the lower interface through the upper interface. The transparency of the lower interface can be reduced to enhance the display effect (such as enhancing the definition) of the lower interface while increasing the transparency of the upper interface. The specific value of transparency adjustment of the upper interface and the lower interface may be set according to actual needs, and this is not specifically limited in this application embodiment.
In practical application, under the condition that the upper interface is ensured to be displayed clearly, the brightness of the upper interface and/or the brightness of the lower interface can be further adjusted so as to enhance the display of the lower interface. For example, the brightness of the upper interface may be reduced and/or the brightness of the lower interface may be enhanced to achieve an enhanced display of the lower interface.
It can be understood that, by the above scheme, the transparency of the interface with higher priority in the first interface and the second interface is adjusted, so that the transparency of the interface with higher priority is greater than that of the interface with lower priority, and thus, a user can see the upper interface and the lower interface at the same time, and the user experience is improved.
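As a hedged illustration of the transparency adjustment described above, the following Kotlin fragment makes the upper, higher-priority interface more transparent than the lower one; the specific values are assumptions and would be tuned in practice.

```kotlin
import android.view.View

fun adjustLayeredTransparency(upperInterface: View, lowerInterface: View) {
    // On Android views, greater transparency corresponds to a smaller alpha value.
    upperInterface.alpha = 0.45f  // higher-priority interface on top, made more transparent
    lowerInterface.alpha = 0.95f  // lower interface kept nearly opaque so its display is enhanced
}
```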
Taking the case where the priority of the first interface is higher than that of the second interface as an example, the first interface may be displayed on the upper layer of the second interface. Considering that the first interface may include unimportant content in addition to important content, and the second interface may include important content in addition to unimportant content, if the first interface is displayed entirely on the upper layer of the second interface, the user may miss important content in the second interface without being aware of it.
In view of this, in an embodiment, the display area includes at least two display sub-areas, and the determining a relative position relationship between the first interface and the second interface in response to the interface display instruction in step 1021 specifically includes: in response to the interface display instruction, respectively determining the relative position relation of the first interface and the second interface in each display sub-area of the at least two display sub-areas. In step 1022, according to the relative position relationship, displaying a first interface of the first object and a second interface of the second object in the same display area on the screen, specifically including: and displaying the content corresponding to the first interface of the first object and the content corresponding to the second interface of the second object on each display sub-area of the display area on the screen according to the relative position relation of the first interface and the second interface in each display sub-area.
The display area is divided into a plurality of display sub-areas. The division may be based on the display area itself, for example according to the size and shape of the display area. In a preferred embodiment, the display area is divided according to the display contents of the first interface and the second interface. In that case, before determining the relative position relationship between the first interface and the second interface in each display sub-area, the method further includes: acquiring the display content of the first interface of the first object and the display content of the second interface of the second object; and determining the at least two display sub-areas of the display area according to the display content of the first interface and the display content of the second interface. The rule for dividing the display area may refer to the above-mentioned manners of determining the priorities of the two interfaces, and the area may be divided according to the dynamic and static states, the complexity, the importance of the display content, and the like, of the pictures of the first interface and the second interface. For example, if the left area of the first interface is a dynamic picture and the right area is a static picture, while the left area of the second interface is a static picture and the right area is a dynamic picture, the display area can be divided into left and right display sub-areas. If the complexity of the upper area of the first interface is higher and that of the lower area is lower, while the complexity of the upper area of the second interface is lower and that of the lower area is higher, the display area can be divided into upper and lower display sub-areas. Or, for example, if the display content of the middle area of the first interface is important and the surrounding area is less important, while the display content of the surrounding area of the second interface is important and the middle area is less important, the display area can be divided into two display sub-areas, namely a middle area and a surrounding area.
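The sketch below illustrates, under simplifying assumptions, a content-based division into left and right display sub-areas of the kind described in the first example above; the content descriptors and the split rule are assumptions made for illustration only.

```kotlin
import android.graphics.Rect

enum class Motion { DYNAMIC, STATIC }

// Assumed coarse description of an interface: whether its left and right halves are dynamic or static.
data class HalfDescriptors(val left: Motion, val right: Motion)

fun divideDisplayArea(display: Rect, first: HalfDescriptors, second: HalfDescriptors): List<Rect> {
    val midX = display.centerX()
    // If the two interfaces place their dynamic content on opposite sides, split into left/right sub-areas.
    return if (first.left != second.left && first.right != second.right) {
        listOf(
            Rect(display.left, display.top, midX, display.bottom),
            Rect(midX, display.top, display.right, display.bottom)
        )
    } else {
        listOf(Rect(display))  // otherwise keep a single display sub-area
    }
}
```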
Taking the example that the display area includes two display sub-areas, the two display sub-areas are respectively a first display sub-area and a second display sub-area, and because the sizes of the first interface and the second interface are the same as the size of the display area, the first interface is also equivalently divided into two interface sub-areas, respectively a first interface sub-area and a second interface sub-area, and the second interface is also equivalently divided into two interface sub-areas, respectively a third interface sub-area and a fourth interface sub-area; the first interface sub-area and the third interface sub-area correspond to the first display sub-area, and the second interface sub-area and the fourth interface sub-area correspond to the second display sub-area.
Then, taking the first display sub-region and the second display sub-region as an example, for each display sub-region of the at least two display sub-regions, determining a relative positional relationship of the first interface and the second interface in each display sub-region respectively may be:
determining the relative position relation of the first interface sub-area and the third interface sub-area in the first display sub-area; the first interface sub-region and the third interface sub-region can then be displayed in the first display sub-region according to the relative positional relationship. Determining the relative position relation of the second interface sub-area and the fourth interface sub-area in the second display sub-area; and the second interface sub-area and the fourth interface sub-area can be displayed on the second display sub-area according to the relative position relation.
Further, the relative position relationship between the first interface sub-area and the third interface sub-area in the first display sub-area may be determined by the priority of the first interface sub-area and the third interface sub-area, and then the interface sub-area with higher priority in the first interface sub-area and the third interface sub-area may be displayed on the interface sub-area with lower priority in the first display sub-area. Similarly, the relative position relationship between the second interface sub-area and the fourth interface sub-area in the second display sub-area may be determined by the priority of the second interface sub-area and the fourth interface sub-area, and then the interface sub-area with higher priority in the second interface sub-area and the fourth interface sub-area may be displayed on the interface sub-area with lower priority in the second display sub-area. For determining the priority of the first interface sub-region and the priority of the third interface sub-region, and determining the priority of the second interface sub-region and the priority of the fourth interface sub-region, reference may be made to the foregoing embodiments, which are not described herein again.
It can be understood that, by the above-mentioned scheme, the same layer may display contents of different interfaces, for example, in the above-mentioned embodiment, in each display sub-area, the interface sub-area with higher priority is displayed in the upper layer, and the interface sub-area with lower priority is displayed in the lower layer, so that the upper layer may display contents with higher priority in different interfaces, and the lower layer may display contents with lower priority in different interfaces.
As shown in fig. 6, the screen is divided into three display sub-areas, namely a display sub-area 1, a display sub-area 2 and a display sub-area 3, two objects are APP1 and APP2, the priority of the content corresponding to the display sub-area 1 in the APP1 interface in the display sub-area 1 is higher than that of the content corresponding to the display sub-area 1 in the APP2 interface, the content corresponding to the display sub-area 1 in the APP1 interface in the display sub-area 1 is displayed in the upper layer, and the content corresponding to the display sub-area 1 in the APP2 interface is displayed in the lower layer (the application corresponding to the interface content displayed in the lower layer is indicated by parentheses in the drawing); the priority of the content corresponding to the display subarea 2 in the APP2 interface in the display subarea 2 is higher than that of the content corresponding to the display subarea 2 in the APP1 interface, the content corresponding to the display subarea 2 in the APP2 interface is displayed on the upper layer in the display subarea 2, and the content corresponding to the display subarea 2 in the APP1 interface is displayed on the lower layer; the priority of the content corresponding to the display subarea 3 in the APP1 interface in the display subarea 3 is higher than that of the content corresponding to the display subarea 3 in the APP2 interface, the content corresponding to the display subarea 3 in the APP1 interface in the display subarea 3 is displayed on the upper layer, and the content corresponding to the display subarea 3 in the APP2 interface is displayed on the lower layer; the upper layer displays the interface contents of APP1 and APP2, and the lower layer also displays the interface contents of APP1 and APP 2.
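A small Kotlin sketch of this per-sub-area layering, as in fig. 6, follows; the per-region priority function is an assumption standing in for whichever of the priority criteria described above is used.

```kotlin
data class RegionPlan(val subAreaId: Int, val upperLayerApp: String, val lowerLayerApp: String)

fun planLayers(
    subAreaIds: List<Int>,                                   // e.g. listOf(1, 2, 3) as in fig. 6
    priorityInRegion: (subAreaId: Int, app: String) -> Int,  // assumed per-region content scoring
    appA: String,
    appB: String
): List<RegionPlan> = subAreaIds.map { id ->
    // In each display sub-area, the content of the higher-priority interface goes on the upper layer.
    // A tie simply favours appA here; the embodiment resolves ties by another criterion, such as opening order.
    if (priorityInRegion(id, appA) >= priorityInRegion(id, appB))
        RegionPlan(id, upperLayerApp = appA, lowerLayerApp = appB)
    else
        RegionPlan(id, upperLayerApp = appB, lowerLayerApp = appA)
}
```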
It can be understood that, through the above scheme, important contents in the first interface and the second interface can be both displayed on an upper layer, and unimportant contents in the first interface and the second interface can be both displayed on a lower layer, so that a user can more directly acquire the important contents in the first interface and the second interface, thereby avoiding missing the important contents in the two interfaces.
It should be noted that different priority determination modes may be adopted when determining the priority of the two interfaces in each display sub-area. For example, in the display sub-area 1 and the display sub-area 2, the priority of the interface display can be determined according to the importance of the two interface contents, while in the display sub-area 3, the importance of the two interface contents is the same, and the priority of the interface display cannot be determined, at this time, the priority of the interface display can be determined in other ways, such as determining the priority according to the opening order of APP1 and APP 2.
After the relative position relationship between the content display of the two interfaces in each display sub-area is determined, the transparency, brightness, and the like of the interfaces in each display sub-area may be further adjusted, which may be referred to in the foregoing embodiments specifically, and is not described herein again.
To facilitate the user's operation of the first interface and the second interface, as shown in fig. 7, in one embodiment, after displaying an interface of at least two objects on the screen in response to the interface display instruction at step 102, the method further includes:
step 103: and responding to an interface operation instruction, and displaying interface options of the at least two objects.
The interface operation instruction can be long-time pressing operation, clicking operation, sliding operation and the like. When the user inputs the interface operation instruction, interface options of a first object and a second object may be displayed, where the first object and the second object are both applications, which are a first application APP1 and a second application APP2, respectively, as shown in fig. 8.
Step 104: receiving selection operation of the interface options, and determining an operation object corresponding to the selection operation; and performing target operation on the determined operation object.
The operation object may include a first object and/or a second object, and taking the objects as applications as an example, the operation object may include a first application and/or a second application.
As shown in fig. 8, according to the selection operation of the user on the interface option, the application that the user wants to operate can be determined. The user may select the first application and/or the second application in the interface options. After the operation object is determined, the user can operate the operation object through subsequent operations. Taking the subsequent operation of the user as an example of increasing the volume, if the user selects the first application as the operation object, when the user performs the operation of increasing the volume, the first application will respond, the volume is increased, the second application will not respond, and the volume remains unchanged. If the user selects the first application and the second application as the operation objects, when the user performs an operation of increasing the volume, both the first application and the second application respond, and the volume is increased.
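The following Kotlin sketch shows, under stated assumptions, how a subsequent target operation (the volume increase in the example above) could be routed only to the selected operation object(s); the interface and class names are illustrative, not part of the disclosure.

```kotlin
interface OperableObject {
    val name: String
    fun increaseVolume()
}

class OperationRouter(private val allObjects: List<OperableObject>) {
    private val selected = mutableSetOf<String>()

    // Called when the user makes a selection in the displayed interface options,
    // e.g. setOf("APP1") or setOf("APP1", "APP2").
    fun onSelectionChanged(selectedNames: Set<String>) {
        selected.clear()
        selected.addAll(selectedNames)
    }

    // A subsequent volume-up operation is applied only to the chosen operation object(s);
    // unselected objects do not respond and their volume remains unchanged.
    fun onVolumeUp() {
        allObjects.filter { it.name in selected }.forEach { it.increaseVolume() }
    }
}
```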
Further, in an embodiment, in the case where the display area includes at least two display sub-areas, after determining the operation object corresponding to the selection operation, the method further includes: determining an operation area corresponding to the operation object, wherein the operation area corresponding to the operation object includes: the display area, or one display sub-area of the at least two display sub-areas. Performing the target operation on the determined operation object then includes: performing the target operation on the operation object in the determined operation area.
In practical applications, when the display area is divided into a plurality of display sub-areas to display the two interfaces according to the above embodiment, the interface operation instruction may be a long-press operation, a click operation, a sliding operation, or the like performed by the user on one of the display sub-areas. After the operation object is determined according to the user's selection operation, the area targeted by the subsequent target operation can be determined to be the operation area corresponding to the determined operation object. If the operation area is the interface area of the operation object's interface that corresponds to a certain display sub-area, the area targeted by the subsequent target operation is that interface area, i.e., that display sub-area.
For example, if the determined operation area corresponding to the operation object is interface area A, which corresponds to a certain display sub-area of the operation object's interface, the subsequent target operation is a transparency adjustment operation, and the user selects the first application as the operation object, then after the user performs the transparency adjustment operation, the transparency of interface area A in the first application's interface is adjusted while the transparency of the other areas of the interface remains unchanged; in this way, the transparency of the interface area corresponding to that display sub-area in the first application's interface can be adjusted independently. In practical applications, the transparency adjustment operation may specifically be a long-press operation, and the transparency can be adjusted continuously by controlling the long-press time: the longer the press, the lower the transparency, the more the interface area corresponding to the display sub-area in the first application's interface is enhanced, and the clearer it becomes.
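A hedged sketch of the long-press transparency control just described: the longer the press, the lower the transparency of the chosen interface area, so it is displayed more clearly. The mapping from press time to alpha is an assumption.

```kotlin
import android.view.View

fun applyLongPressTransparency(interfaceAreaView: View, pressDurationMs: Long) {
    val maxDurationMs = 2000L  // assumed press time at which the area becomes fully enhanced
    val fraction = pressDurationMs.coerceIn(0L, maxDurationMs).toFloat() / maxDurationMs
    // Longer press -> lower transparency -> larger alpha -> clearer, enhanced display of this area only.
    interfaceAreaView.alpha = 0.3f + 0.7f * fraction
}
```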
Or for example, the determined operation area corresponding to the operation object is all interfaces of the operation object, the subsequent target operation is a transparency adjustment operation, and the user selects the first application as the operation object, so that the transparency of the entire interface of the first application is adjusted after the user performs the transparency adjustment operation.
In practical application, after the operation object is determined, a selection box can be popped up for a user to select to determine the operation area corresponding to the operation object, that is, to determine the area targeted by the subsequent target operation. Referring to fig. 6, as shown in fig. 9, interface options of APP1 and APP2 are displayed in response to an interface operation instruction (left diagram in fig. 9), where the interface operation instruction may be triggered by a user pressing a certain display sub-region for a long time; after the user selects an operation object in the interface options (for example, APP1), popping up an area selection box (right diagram in fig. 9), where the area selection box may display "whether the operation object is globally valid", if the user selection is yes, the subsequent target operation may be performed on all interfaces of the operation object, and if the user selection is no, the subsequent target operation may be performed on the interface area corresponding to the display sub-area of the interface of the operation object.
By the scheme, the user can accurately operate the object and the area which are required to be operated, so that the user experience is further improved.
In one embodiment, when the two objects include the target application and the operation guide, the operation object may default to the target application and the operation guide, and may no longer be determined based on the selection operation of the user.
When the two objects include the target application and the operation guide, the guidance interface of the operation guide may be displayed on the target interface of the target application, and specifically, the guidance interface of the operation guide may be displayed on the target interface of the target application with a preset transparency. The preset transparency can be set according to actual needs.
The operation guide may be recorded for a target function of the target application, and the user may implement the target function of the target application under the operation guide of the operation guide. For example, the operation guide is recorded according to the "add-friend" function of the target application, and the user can add a friend in the target application according to the operation guide of the operation guide. In practical application, different operation guides can be recorded according to different functions of a target application, and when a certain function of the target application is to be realized, an interface of the target application and a guide interface of the operation guide corresponding to the function can be displayed on the same screen.
The display positions of elements such as characters and icons in the guide interface of the operation guide can be designed according to the display positions of elements such as characters and icons in the interface of the target application, so that when the guide interface of the operation guide is displayed on the target interface of the target application, the elements such as characters and icons in the guide interface avoid shielding important content in the interface of the target application as far as possible.
Further, after displaying the guide interface of the operation guide and the target interface of the target application on the screen, the method further includes: playing the target operation guidance in the operation guide, and, in response to receiving an operation matching the target operation guidance, automatically playing the next operation guidance of the operation guide. If no operation matching the target operation guidance is received, the target operation guidance is played repeatedly. As shown in fig. 10, the guide interface of the operation guide is displayed on the upper layer and the target interface of the target application is displayed on the lower layer (indicated by brackets), and the target operation guidance "click here" is played on the screen; after the user clicks, both the target application and the operation guide respond to the click, the target application executes the operation corresponding to the click, and the operation guide plays the next operation guidance. If the user does not click, the operation guide plays the target operation guidance "click here" in a loop.
In one embodiment, the method further comprises: and controlling the operation guide to exit the display in response to receiving the operation matched with the last operation guide of the operation guide.
And when the user executes the operation matched with the last operation guide of the operation guide, the target function of the target application can be realized, the operation guide can be controlled to exit at the moment, and only the target interface of the target application is displayed on the screen, so that the user can conveniently operate the target application subsequently.
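A minimal Kotlin sketch of this guidance playback logic follows; the step model and callbacks are assumptions made so the control flow (repeat until matched, advance automatically, exit after the last step) can be shown compactly.

```kotlin
data class GuideStep(val prompt: String, val matches: (userAction: String) -> Boolean)

class OperationGuidePlayer(
    private val steps: List<GuideStep>,
    private val play: (prompt: String) -> Unit,   // plays or repeats one piece of operation guidance
    private val exitGuide: () -> Unit             // removes the guide, leaving only the target interface
) {
    private var index = 0

    fun start() {
        if (steps.isNotEmpty()) play(steps[index].prompt)
    }

    fun onUserAction(action: String) {
        val step = steps.getOrNull(index) ?: return
        if (step.matches(action)) {
            index++
            if (index < steps.size) play(steps[index].prompt)  // automatically play the next guidance
            else exitGuide()                                   // last guidance matched: exit the display
        } else {
            play(step.prompt)                                  // repeat the current guidance in a loop
        }
    }
}
```

For the fig. 10 example, the first step's matches predicate would accept the click at the indicated position; once the last step is matched, exitGuide() corresponds to controlling the operation guide to exit the display.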
It can be understood that when an operation guide is used in the prior art, the user generally watches the operation guide first and then operates the target application from memory. With the above scheme, the target application and the operation guide are displayed in the same display area of the screen, the target operation guidance of the operation guide is played, and, in response to receiving an operation matching the target operation guidance, the next operation guidance is played automatically, so that the user can perform the corresponding operation on the target application in real time according to each played operation guidance, and the target function of the target application can be realized quickly and accurately.
Based on the interface display method provided in the foregoing embodiments of the present application, a more specific interface display method is further provided in an embodiment of the present application. It should be understood that this method is only a specific example and does not limit the interface display method provided in the embodiments of the present application. In this method, the objects include a first application and a second application; the interfaces of the objects include a first interface of the first application and a second interface of the second application; and the display area of both interfaces is the same as the area of the screen, that is, both interfaces are displayed in full screen. As shown in fig. 11, the method includes the following steps:
Step 201: receiving an interface display instruction.
Step 202: in response to the interface display instruction, acquiring the display content of the first interface of the first application and the display content of the second interface of the second application, and dividing the screen into at least two display sub-areas according to the display content of the first interface and the display content of the second interface.
Step 203: for each display sub-area of the at least two display sub-areas, determining the priorities of the first interface and the second interface in that display sub-area.
Step 204: in each display sub-area, displaying the content corresponding to the interface with the higher priority of the first interface and the second interface over the content corresponding to the interface with the lower priority.
Step 205: in response to an interface operation instruction, displaying interface options of the first application and the second application.
Step 206: receiving a selection operation on the interface options, determining the operation object corresponding to the selection operation, and performing a target operation on the determined operation object, where the operation object includes the first application and/or the second application.
The specific implementation of steps 201 to 206 may refer to the above embodiments, and will not be described herein again.
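As a rough illustration of steps 202 to 205, the following plain-Kotlin sketch divides the screen into horizontal sub-areas at the edges of both interfaces' content regions and picks, per sub-area, which interface is drawn on top. The Rect and InterfaceLayer types, the hasContent() heuristic, and the band-based division strategy are assumptions for illustration only, not the patent's method.

```kotlin
// Minimal sketch under stated assumptions: sub-areas are horizontal bands, and the
// interface that has content in a band is drawn on top there.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int)

data class InterfaceLayer(val name: String, val contentRegions: List<Rect>) {
    // An interface "has content" in a sub-area if any of its content regions overlaps it.
    fun hasContent(area: Rect): Boolean = contentRegions.any { r ->
        r.left < area.right && area.left < r.right && r.top < area.bottom && area.top < r.bottom
    }
}

// Step 202 (second half): cut the screen into horizontal bands at the vertical edges
// of the content regions of both interfaces.
fun divideIntoSubAreas(screen: Rect, first: InterfaceLayer, second: InterfaceLayer): List<Rect> {
    val cuts = (first.contentRegions + second.contentRegions)
        .flatMap { listOf(it.top, it.bottom) }
        .filter { it > screen.top && it < screen.bottom }
        .distinct()
        .sorted()
    val edges = listOf(screen.top) + cuts + listOf(screen.bottom)
    return edges.zipWithNext { top, bottom -> Rect(screen.left, top, screen.right, bottom) }
}

// Steps 203-204: for each sub-area, name the interface to be drawn on top.
fun topLayerPerSubArea(subAreas: List<Rect>, first: InterfaceLayer, second: InterfaceLayer): Map<Rect, String> =
    subAreas.associateWith { area ->
        when {
            second.hasContent(area) && !first.hasContent(area) -> second.name
            else -> first.name // default/tie-break assumption: the first interface stays on top
        }
    }

fun main() {
    val screen = Rect(0, 0, 1080, 2400)
    val first = InterfaceLayer("first", listOf(Rect(0, 0, 1080, 1200)))
    val second = InterfaceLayer("second", listOf(Rect(0, 1200, 1080, 2400)))
    val subAreas = divideIntoSubAreas(screen, first, second)
    println(topLayerPerSubArea(subAreas, first, second)) // first on top in the upper band, second below
}
```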
It can be understood that, in the embodiment of the present application, by receiving an interface display instruction and, in response to it, displaying both the first interface of the first application and the second interface of the second application in full screen, the interfaces of multiple applications can be shown at the same size as when a single application's interface is displayed, instead of being reduced and shown in different display areas of the screen, so that the user can see the displayed interface content clearly.
Based on the interface display method provided by the above embodiments of the present application, another more specific interface display method is also provided in an embodiment of the present application. In this method, the objects include a target application and an operation guide for the target application; the interfaces of the objects include a target interface of the target application and a guide interface of the operation guide; and the display area of both interfaces is the same as the area of the screen, that is, both interfaces are displayed in full screen. As shown in fig. 12, the method includes the following steps:
Step 301: receiving an interface display instruction.
The interface display instruction may be dragging the operation guide onto the target application on the desktop so that the operation guide and the target application overlap.
Step 302: in response to the interface display instruction, displaying the target interface and the guide interface in full screen on the screen.
The guide interface is displayed on the upper layer of the target interface with a preset transparency. The preset transparency can be set and adjusted according to actual needs.
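As a hedged illustration of what displaying the upper layer with a preset transparency could amount to, the sketch below applies standard per-channel alpha blending of a guide pixel over a target pixel. The Rgb type and the 30% opacity are assumptions for illustration; the patent does not fix a particular blending formula or transparency value.

```kotlin
// One plausible realization of the "preset transparency" overlay: alpha blending.
data class Rgb(val r: Int, val g: Int, val b: Int)

fun blend(top: Rgb, bottom: Rgb, alpha: Float): Rgb {
    fun mix(t: Int, b: Int) = (alpha * t + (1 - alpha) * b).toInt().coerceIn(0, 255)
    return Rgb(mix(top.r, bottom.r), mix(top.g, bottom.g), mix(top.b, bottom.b))
}

fun main() {
    // A white guide pixel drawn over a dark target pixel at 30% opacity:
    // the target content underneath remains clearly visible.
    println(blend(Rgb(255, 255, 255), Rgb(20, 40, 60), alpha = 0.3f))
}
```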
Step 303: playing the target operation guide of the operation guide.
Step 304: automatically playing the next operation guide of the operation guide in response to receiving an operation matching the target operation guide.
Step 305: controlling the operation guide to exit the display in response to receiving an operation matching the last operation guide of the operation guide.
The specific implementation of steps 301 to 305 may also refer to the above embodiments, and will not be described herein again.
With the interface display method provided by the embodiment of the present application, the target application and the operation guide are displayed in full screen, and the next operation guide is automatically played in response to receiving an operation matching the target operation guide, so that the user can operate the target application in real time according to each operation guide as it is played, and the target function of the target application can be realized quickly and accurately.
It should be noted that the execution subject of the interface display method provided in the embodiments of the present application may be an interface display device, or a control module in the interface display device for executing the interface display method. In the embodiments of the present application, the interface display method is described by taking the interface display device executing the method as an example.
An embodiment of the present application further provides an interface display device 40. As shown in fig. 13, the interface display device includes:
the receiving module 401 is configured to receive an interface display instruction.
The interface display instruction may be an instruction to simultaneously display interfaces of a plurality of objects. When the user inputs the interface display instruction, it can be determined that the interfaces of a plurality of objects are to be displayed simultaneously on the screen. The object may be an application, a document, a function key in an application, or the like.
A display module 402, configured to display an interface of at least two objects on a screen in response to the interface display instruction, where the display areas of the interfaces of the at least two objects on the screen are the same, and the interfaces of the at least two objects are at least partially visible.
Specifically, the interfaces of the at least two objects are all displayed in the same display area, and the interface size of each object is the same as the size of the display area. That the interfaces of the at least two objects are at least partially visible means that at least part of the interface content of each of the at least two objects is visible in the same display area. The display area in which the interfaces of the at least two objects are displayed may be the entire display area of the screen or a partial display area of the screen.
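A minimal structural sketch of this arrangement is given below, assuming hypothetical ReceivingModule and DisplayModule classes: the receiving module accepts the instruction and hands the named objects to the display module, which lays each interface out at the full size of the shared display area, stacked in z-order. The types, names and the println-based rendering are illustrative assumptions only, not the patent's modules.

```kotlin
// Illustrative wiring of a receiving module and a display module.
data class DisplayArea(val width: Int, val height: Int)

class ReceivingModule(private val onInstruction: (List<String>) -> Unit) {
    // Receives an interface display instruction naming the objects to display.
    fun receive(objectIds: List<String>) = onInstruction(objectIds)
}

class DisplayModule(private val area: DisplayArea) {
    // Each object's interface is laid out at the full size of the shared display area,
    // stacked in z-order so that all of them remain at least partially visible.
    fun display(objectIds: List<String>) {
        objectIds.forEachIndexed { z, id ->
            println("render $id at ${area.width}x${area.height}, z-order=$z")
        }
    }
}

fun main() {
    val display = DisplayModule(DisplayArea(1080, 2400))
    val device = ReceivingModule(display::display)
    device.receive(listOf("first object", "second object"))
}
```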
The interface display device provided by the embodiment of the present application receives an interface display instruction and, in response to it, displays the interfaces of at least two objects on the screen, where the display areas of the interfaces of the at least two objects on the screen are the same and the interfaces of the at least two objects are at least partially visible. The interfaces of the objects can thus be displayed simultaneously in the same display area of the screen and can have the same size as when each interface is displayed alone; for example, when the interfaces are displayed in full screen, each has the same size as when it is displayed alone in full screen. This avoids reducing the interfaces and showing them in different parts of the display area, thereby improving the user experience.
In one embodiment, the at least two objects include a first object and a second object, and the interfaces of the at least two objects include a first interface of the first object and a second interface of the second object. The display module 402 further includes a relative position relationship determining unit and a display unit. The relative position relationship determining unit is configured to determine, in response to the interface display instruction, a relative position relationship between the first interface and the second interface, where the relative position relationship includes: the first interface is displayed on the second interface, or the second interface is displayed on the first interface. The display unit is configured to display the first interface of the first object and the second interface of the second object in the same display area on the screen according to the relative position relationship.
In one embodiment, the relative positional relationship determination unit includes an acquisition unit and a determination subunit. The acquisition unit is used for responding to the interface display instruction and acquiring the display content of a first interface of a first object and the display content of a second interface of a second object. The determining subunit is configured to determine priorities of the first interface and the second interface according to the display content of the first interface and the display content of the second interface, and determine a relative position relationship between the first interface and the second interface according to the priorities of the first interface and the second interface. The display unit is specifically configured to display, in the same display area on the screen, an interface with a higher priority of the first interface and the second interface on an interface with a lower priority, when the priorities of the first interface and the second interface are different. In a specific embodiment, the display unit is further configured to adjust transparency of an interface with a higher priority in the first interface and the second interface, so that the transparency of the interface with the higher priority is greater than that of the interface with the lower priority.
In an embodiment, the display unit is further specifically configured to, when the priorities of the first interface and the second interface satisfy a preset condition, simultaneously display the first interface and the second interface in a preset transparency in the same display area on the screen according to the relative position relationship.
In one embodiment, the display area comprises at least two display sub-areas; the relative position relationship determining unit is specifically configured to determine, in response to the interface display instruction, for each display sub-region of the at least two display sub-regions, a relative position relationship of the first interface and the second interface in each display sub-region, respectively. The display unit is specifically configured to display, on the screen, content corresponding to a first interface of the first object and content corresponding to a second interface of the second object in each display sub-area of the display area according to the relative positional relationship.
Further, in an embodiment, the interface display apparatus 40 further includes a display area dividing module, where the display area dividing module is configured to, before the determining of the relative position relationship between the first interface and the second interface in each display sub-area, acquire display content of the first interface of the first object and display content of the second interface of the second object; and determining the at least two display sub-areas of the display area according to the display content of the first interface and the display content of the second interface.
In one embodiment, the interface display device 40 further includes an interface option display module, an operation object determination module, and an operation execution module; the interface option display module is used for responding to an interface operation instruction and displaying the interface options of the at least two objects; the operation object determining module is used for receiving selection operation of the interface options and determining an operation object corresponding to the selection operation; and the operation execution module is used for performing target operation on the determined operation object.
In one embodiment, the interface display device 40 further includes an operation region determining module, where the operation region determining module is configured to determine, after determining an operation object corresponding to the selection operation, an operation region corresponding to the operation object if the display region includes at least two display sub-regions; wherein, the operation area corresponding to the operation object comprises: the display area or one of the at least two display sub-areas. The operation execution module is specifically configured to perform the target operation on the operation object in the determined operation area.
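The option-selection flow handled by these modules can be pictured with the small sketch below: the user selects one or both objects from the displayed interface options, an operation area (the whole display area or one sub-area) is determined, and the target operation is applied only to the selected objects within that area. The OperationArea and Selection types and the placeholder "scroll" operation are assumptions for illustration, not the patent's interfaces.

```kotlin
// Illustrative sketch of routing a target operation to the selected objects.
enum class OperationArea { WHOLE_DISPLAY_AREA, SUB_AREA }

data class Selection(val objects: Set<String>, val area: OperationArea)

fun performTargetOperation(selection: Selection, operation: String) {
    for (obj in selection.objects) {
        println("apply '$operation' to $obj within ${selection.area}")
    }
}

fun main() {
    // The user selected only the second object and limited the operation to one sub-area.
    performTargetOperation(Selection(setOf("second object"), OperationArea.SUB_AREA), "scroll")
}
```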
In one embodiment, the at least two objects include a target application and an operation guide for the target application; the interfaces of the at least two objects include a target interface of the target application and a guide interface of the operation guide; and the displaying of the interfaces of the at least two objects on the screen includes displaying the target interface of the target application and the guide interface of the operation guide in the same display area on the screen. The interface display device 40 further includes a playing module configured to, after the interfaces of the at least two objects are displayed on the screen, play the target operation guide of the operation guide, and automatically play the next operation guide of the target operation guide in response to receiving an operation matching the target operation guide.
Further, the interface display device 40 may further include an exit module configured to control the operation guide to exit from display in response to receiving an operation matching the last operation guide of the operation guide.
The interface display device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The interface display device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The interface display device provided in the embodiments of the present application can implement each process of the method embodiments in fig. 1 to 12, and details are not repeated here to avoid repetition.
The interface display device provided by the embodiment of the present application receives an interface display instruction and, in response to it, displays the interfaces of at least two objects on the screen, where the display areas of the interfaces of the at least two objects on the screen are the same and the interfaces of the at least two objects are at least partially visible. The interfaces of the objects can thus be displayed simultaneously in the same display area of the screen and can have the same size as when each interface is displayed alone; for example, when the interfaces are displayed in full screen, each has the same size as when it is displayed alone in full screen. This avoids reducing the interfaces and showing them in different parts of the display area, thereby improving the user experience.
Optionally, as shown in fig. 14, an embodiment of the present application further provides an electronic device 500, which includes a processor 510, a memory 509, and a program or an instruction stored in the memory 509 and executable on the processor 510. When executed by the processor 510, the program or instruction implements each process of the above interface display method embodiments and achieves the same technical effect; details are not repeated here to avoid repetition.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 14 is a schematic hardware structure diagram of an electronic device implementing an embodiment of the present application.
The electronic device 500 includes, but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and the like.
Those skilled in the art will appreciate that the electronic device 500 may further include a power supply (e.g., a battery) for supplying power to the various components, and the power supply may be logically connected to the processor 510 via a power management system, so that functions such as managing charging, discharging, and power consumption are implemented via the power management system. The electronic device structure shown in fig. 14 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, combine some components, or have a different arrangement of components, which will not be repeated here. In the embodiments of the present application, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 510 is configured to receive an interface display instruction, and to display interfaces of at least two objects on the screen in response to the interface display instruction, where the display areas of the interfaces of the at least two objects on the screen are the same, and the interfaces of the at least two objects are at least partially visible.
The electronic device provided by the embodiment of the present application receives an interface display instruction and, in response to it, displays the interfaces of at least two objects on the screen, where the display areas of the interfaces of the at least two objects on the screen are the same and the interfaces of the at least two objects are at least partially visible. The interfaces of the objects can thus be displayed simultaneously in the same display area of the screen and can have the same size as when each interface is displayed alone; for example, when the interfaces are displayed in full screen, each has the same size as when it is displayed alone in full screen. This avoids reducing the interfaces and showing them in different parts of the display area, thereby improving the user experience.
It should be understood that, in the embodiments of the present application, the radio frequency unit 501 may be used to receive and send signals during message transceiving or a call; specifically, it receives downlink data from a base station and forwards the data to the processor 510 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 501 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 501 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 502, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 503 may convert audio data received by the radio frequency unit 501 or the network module 502 or stored in the memory 509 into an audio signal and output as sound. Also, the audio output unit 503 may also provide audio output related to a specific function performed by the electronic apparatus 500 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 503 includes a speaker, a buzzer, a receiver, and the like.
The input unit 504 is used to receive an audio or video signal. The input unit 504 may include a graphics processing unit (GPU) 5041 and a microphone 5042; the graphics processor 5041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 506. The image frames processed by the graphics processor 5041 may be stored in the memory 509 (or another storage medium) or transmitted via the radio frequency unit 501 or the network module 502. The microphone 5042 may receive sound and can process such sound into audio data. In the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 501 and output.
The electronic device 500 also includes at least one sensor 505, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 5061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 5061 and/or a backlight when the electronic device 500 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 505 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 506 is used to display information input by the user or information provided to the user. The Display unit 506 may include a Display panel 5061, and the Display panel 5061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 507 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 507 includes a touch panel 5071 and other input devices 5072. The touch panel 5071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near the touch panel 5071 using a finger, a stylus, or any suitable object or attachment). The touch panel 5071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 510, and receives and executes commands sent by the processor 510. In addition, the touch panel 5071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 5071, the user input unit 507 may also include other input devices 5072. Specifically, the other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here. Further, the touch panel 5071 may be overlaid on the display panel 5061; when the touch panel 5071 detects a touch operation on or near it, the touch operation is transmitted to the processor 510 to determine the type of the touch event, and then the processor 510 provides a corresponding visual output on the display panel 5061 according to the type of the touch event. Although in fig. 14 the touch panel 5071 and the display panel 5061 are two independent components implementing the input and output functions of the electronic device, in some embodiments the touch panel 5071 and the display panel 5061 may be integrated to implement the input and output functions of the electronic device, which is not limited here.
The interface unit 508 is an interface for connecting an external device to the electronic apparatus 500. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 508 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the electronic apparatus 500 or may be used to transmit data between the electronic apparatus 500 and external devices.
The memory 509 may be used to store software programs as well as various data. The memory 509 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phonebook), and the like. Furthermore, the memory 509 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 510 is a control center of the electronic device, connects various parts of the whole electronic device by using various interfaces and lines, performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 509 and calling data stored in the memory 509, thereby performing overall monitoring of the electronic device. Processor 510 may include one or more processing units; preferably, the processor 510 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 510.
The electronic device 500 may further include a power supply 511 (e.g., a battery) for supplying power to various components, and preferably, the power supply 511 may be logically connected to the processor 510 via a power management system, so as to implement functions of managing charging, discharging, and power consumption via the power management system.
In addition, the electronic device 500 includes some functional modules that are not shown, and are not described in detail herein.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the interface display method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the interface display method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (13)
1. An interface display method, characterized in that the method comprises:
receiving an interface display instruction;
and displaying an interface of at least two objects on a screen in response to the interface display instruction, wherein the display areas of the interfaces of the at least two objects on the screen are the same, and the interfaces of the at least two objects are at least partially visible.
2. The interface display method according to claim 1, wherein the interface of the at least two objects includes: a first interface of a first object and a second interface of a second object;
the interface for displaying at least two objects on the screen in response to the interface display instruction comprises:
in response to the interface display instruction, determining a relative positional relationship between the first interface and the second interface, wherein the relative positional relationship between the first interface and the second interface includes: the first interface is displayed on the second interface, or the second interface is displayed on the first interface;
and displaying a first interface of the first object and a second interface of the second object in the same display area on the screen according to the relative position relation.
3. The interface display method according to claim 2, wherein the determining the relative positional relationship of the first interface and the second interface in response to the interface display instruction includes:
in response to the interface display instruction, acquiring display content of a first interface of the first object and display content of a second interface of the second object;
determining the priority of the first interface and the priority of the second interface according to the display content of the first interface and the display content of the second interface;
determining the relative position relationship between the first interface and the second interface according to the priority of the first interface and the second interface;
the displaying the first interface of the first object and the second interface of the second object in the same display area on the screen according to the relative position relationship comprises: under the condition that the priorities of the first interface and the second interface are different, displaying the interface with the higher priority of the first interface and the second interface on the interface with the lower priority in the same display area on the screen.
4. The interface display method according to claim 3, wherein the displaying a first interface of the first object and a second interface of the second object in the same display area on the screen in accordance with the relative positional relationship further comprises: under the condition that the priorities of the first interface and the second interface meet preset conditions, simultaneously displaying the first interface and the second interface with a preset transparency in the same display area on the screen according to the relative position relationship.
5. The interface display method according to claim 3, wherein an area of the display area is the same as an area of the screen, the interface display method further comprising:
adjusting the transparency of the interface with the higher priority of the first interface and the second interface, so that the transparency of the interface with the higher priority is greater than that of the interface with the lower priority.
6. The interface display method of claim 2, wherein the display area comprises at least two display sub-areas;
the determining the relative positional relationship of the first interface and the second interface in response to the interface display instruction comprises: in response to the interface display instruction, respectively determining the relative position relation of the first interface and the second interface in each display sub-area of the at least two display sub-areas;
the displaying the first interface of the first object and the second interface of the second object in the same display area on the screen according to the relative position relationship comprises: displaying the content corresponding to the first interface of the first object and the content corresponding to the second interface of the second object in each display sub-area of the display area on the screen according to the relative position relationship between the first interface and the second interface in each display sub-area.
7. The interface display method according to claim 6, wherein before the determining the relative positional relationship of the first interface and the second interface in each display sub-region, respectively, the interface display method further comprises:
acquiring display content of a first interface of the first object and display content of a second interface of the second object;
and determining the at least two display sub-areas of the display area according to the display content of the first interface and the display content of the second interface.
8. The interface display method according to claim 1, further comprising:
in response to an interface operation instruction, displaying interface options of the at least two objects;
receiving selection operation of the interface options, and determining an operation object corresponding to the selection operation;
and performing target operation on the determined operation object.
9. The interface display method according to claim 8, wherein the display area includes at least two display sub-areas, and after the determining of the operation object corresponding to the selection operation, the interface display method further includes: determining an operation area corresponding to the operation object; wherein, the operation area corresponding to the operation object comprises: one display sub-region of the display region or the at least two display sub-regions;
the performing target operation on the determined operation object comprises: performing the target operation on the operation object in the determined operation area.
10. The interface display method according to claim 1, wherein the at least two objects include a target application and an operation guide for the target application; the interface of the at least two objects comprises: a target interface of the target application and a guide interface of the operation guide;
the interface displaying at least two objects on a screen includes: displaying a target interface of the target application and a guide interface of the operation guide in the same display area on a screen;
after the interface of at least two objects is displayed on the screen, the interface display method further includes:
playing the target operation guide of the operation guide;
and automatically playing the next operation guide of the operation guide in response to receiving the operation matched with the target operation guide.
11. The interface display method according to claim 10, further comprising:
controlling the operation guide to exit the display in response to receiving the operation matched with the last operation guide of the operation guide.
12. An interface display apparatus, the apparatus comprising:
the receiving module is used for receiving an interface display instruction;
and the display module is used for responding to the interface display instruction and displaying the interfaces of at least two objects on the screen, wherein the display areas of the interfaces of the at least two objects on the screen are the same, and the interfaces of the at least two objects are at least partially visible.
13. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the interface display method of any one of claims 1-11.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111013110.8A CN113703632A (en) | 2021-08-31 | 2021-08-31 | Interface display method and device and electronic equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111013110.8A CN113703632A (en) | 2021-08-31 | 2021-08-31 | Interface display method and device and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113703632A true CN113703632A (en) | 2021-11-26 |
Family
ID=78658057
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202111013110.8A Pending CN113703632A (en) | Interface display method and device and electronic equipment | 2021-08-31 | 2021-08-31
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113703632A (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111367483A (en) * | 2020-03-04 | 2020-07-03 | 维沃移动通信有限公司 | Interaction control method and electronic equipment |
CN111766945A (en) * | 2020-06-05 | 2020-10-13 | 维沃移动通信有限公司 | Interface display method and device |
CN112486386A (en) * | 2020-11-30 | 2021-03-12 | 维沃移动通信有限公司 | Screen projection method, screen projection device, electronic equipment and readable storage medium |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | PB01 | Publication | 
 | SE01 | Entry into force of request for substantive examination | 