Disclosure of Invention
According to the image rendering method and the electronic device provided by the embodiments of the present application, a high frame rate for the focus part (such as a focus window or a focus control) is preferentially ensured, improving the user's perceived fluency in high-load scenarios.
In order to achieve the above purpose, the following technical solutions are adopted in the embodiments of the present application.
In a first aspect, the present application provides an image rendering method. The execution subject of the method may be an electronic device, or may be a component located in the electronic device (for example, a chip system or a processor); the description below takes the electronic device as the execution subject as an example. The method may include the electronic device displaying a first interface, where the first interface includes a first component and a second component and is an interface of a first application. In response to receiving an operation on the first interface, the electronic device determines, according to first information of the first component and first information of the second component, whether the first component and the second component are focus nodes in a first refresh period. When the first component is a focus node and the second component is a non-focus node, the electronic device renders the first component through a first process and renders the second component through a second process in the first refresh period, where the rendering priority of the first process is higher than that of the second process.
Wherein a first component may be understood as a first window or first control and a second component may be understood as a second window or second control.
The focus node may be understood as a node to be processed preferentially, such as a node operated by the current user, that is, a node the user is focusing on, or a node acted on by an input event.
Thus, after receiving the user operation on the interface, the electronic device determines whether each component on the interface is a focus node. If the first component is the focus node, the first process of the electronic device renders the first component in real time. If the second component is a non-focus node, the second process of the electronic device renders the second component. In this way, the electronic device only needs to ensure real-time rendering of the focus node the user is attending to, preferentially guaranteeing a high frame rate for the focus node (such as a focus window or focus control). This keeps image display of the focus node smooth, avoids stuttering and white-screen phenomena, and improves the user's perceived fluency in high-load scenarios.
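The flow of the first aspect can be sketched as follows. This is a minimal illustrative sketch, not an implementation of the claimed method: the `Component` class, `is_focus_node` rule, and queue names are all assumptions, and focus determination is reduced to a single input-event flag for brevity.

```python
# Hypothetical sketch: classify each component on the interface as focus or
# non-focus, then hand it to a high- or low-priority rendering queue.
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    focused_by_input: bool = False  # simplified "first information": acted on by an input event

def is_focus_node(component: Component) -> bool:
    # Simplified rule: the component the current input event acts on is the focus node.
    return component.focused_by_input

def dispatch_render(components):
    """Split components between a high-priority and a low-priority render queue."""
    high, low = [], []
    for c in components:
        (high if is_focus_node(c) else low).append(c.name)
    return high, low

first = Component("first_window", focused_by_input=True)
second = Component("second_window")
print(dispatch_render([first, second]))  # (['first_window'], ['second_window'])
```

In the patent's terms, the `high` queue corresponds to the first (higher-priority) process and the `low` queue to the second process.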
In some designs, the method further includes: when the second process has not completed rendering of the second component in the first refresh period, the electronic device multiplexes an existing texture image of the second component from a second refresh period, the second refresh period being the period preceding the first refresh period. Thus, when the second process has not rendered the second component, the existing texture image of the second component can be reused and the second component does not need to be rendered in real time, ensuring real-time rendering of the focus node the user is attending to and preferentially guaranteeing a high frame rate for the focus node (such as a focus window or focus control).
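The texture-multiplexing idea can be sketched as a small cache: if the low-priority process misses the deadline for a non-focus component, the frame reuses the texture produced in the previous refresh period. The cache class and method names below are assumptions for illustration only.

```python
# Illustrative sketch of reusing the previous period's texture when rendering
# of a non-focus component has not completed in the current refresh period.
class TextureCache:
    def __init__(self):
        self._textures = {}  # component name -> last completed texture

    def store(self, name, texture):
        self._textures[name] = texture

    def texture_for_frame(self, name, fresh_texture):
        """Return the fresh texture if rendering finished, else reuse the cached one."""
        if fresh_texture is not None:       # rendering completed this period
            self._textures[name] = fresh_texture
            return fresh_texture
        return self._textures.get(name)     # multiplex the previous period's texture

cache = TextureCache()
cache.store("second_window", "texture_frame_1")
# The second process missed the deadline in the next period (no fresh texture):
print(cache.texture_for_frame("second_window", None))  # texture_frame_1
```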
In some designs, the method further includes the electronic device displaying a second interface, the second interface including a third component, the second interface being an interface of a second application, the second application being different from the first application. In response to an operation on the second interface, the electronic device determines whether the first component, the second component and the third component are focus nodes in a first refresh period according to the first information of the first component, the first information of the second component and the first information of the third component. And when the third component is a focus node and the first component and the second component are both non-focus nodes, the electronic device renders the third component through the first process and renders the first component and the second component through the second process in the first refresh period.
When a plurality of non-focus nodes exist, the electronic device determines the rendering priority of each non-focus node, and then renders the corresponding components according to those priorities, realizing orderly rendering of the components and alleviating the shortage of CPU and GPU resources in high-load scenarios.
In one design, the rendering of the first component and the second component by the second process is specifically: the electronic device determines the rendering priorities of the first component and the second component according to second information of the first component and second information of the second component, together with the weight of each item of second information. When the rendering priority of the first component is higher than that of the second component, the electronic device renders the first component and then the second component in sequence through the second process.
In one design, the second information includes at least one of a visible window where no buffer is generated, an invisible window where no buffer is generated, a window where a buffer is generated and where there is a visible dirty region, a window where a buffer is generated and where there is no visible dirty region, and a static window where a buffer is generated.
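The five window states above can be read as a priority ladder for non-focus rendering. The sketch below ranks windows by such a ladder; note that the numeric weights are invented for illustration (the design only says each category of second information carries a weight), and the category keys are hypothetical names.

```python
# Invented weights for the "second information" categories; higher = render sooner.
PRIORITY_WEIGHTS = {
    "visible_no_buffer": 5,        # visible window, no buffer generated yet
    "invisible_no_buffer": 4,      # invisible window, no buffer generated
    "buffer_visible_dirty": 3,     # buffer exists, visible dirty region
    "buffer_no_visible_dirty": 2,  # buffer exists, no visible dirty region
    "static_with_buffer": 1,       # static window with a buffer
}

def order_non_focus(windows):
    """Sort (name, category) pairs from highest to lowest rendering priority."""
    return sorted(windows, key=lambda w: PRIORITY_WEIGHTS[w[1]], reverse=True)

windows = [("first_window", "static_with_buffer"),
           ("second_window", "visible_no_buffer")]
print([name for name, _ in order_non_focus(windows)])
# ['second_window', 'first_window']
```

Under these assumed weights, a visible window without a buffer is rendered before a static window whose buffer can simply be reused.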
In one design, the first information includes one or more of transparency, the focus window of a window management service, stacking order, whether the component is occluded, and whether the component is animated.
In one design, the first information includes transparency, and the determining, according to the first information of the first component and the first information of the second component, whether the first component and the second component are focus nodes in the first refresh period is specifically:
when the transparency of the first component is greater than that of the second component, determining that the first component is a non-focus node and the second component is a focus node.
In one design, the first information includes a focus window of a window management service, and the determining, according to the first information of the first component and the first information of the second component, whether the first component and the second component are focus nodes in a first refresh period is specifically:
when the first component is detected to be operated by a user and the second component is not operated, the first component is determined to be a focus node, and the second component is determined to be a non-focus node.
In one design, the first information includes a focus window of a window management service, and the determining, according to the first information of the first component and the first information of the second component, whether the first component and the second component are focus nodes in a first refresh period is specifically:
when it is detected that the user's gaze falls on the first component and not on the second component, determining that the first component is a focus node and the second component is a non-focus node.
In one design, the first information includes a stacking order, and the determining, according to the first information of the first component and the first information of the second component, whether the first component and the second component are focal nodes in a first refresh period specifically includes:
when the stacking order of the first component is higher than that of the second component, determining that the first component is a focus node and the second component is a non-focus node.
In one design, the first information includes whether the component is animated, and the determining, according to the first information of the first component and the first information of the second component, whether the first component and the second component are focus nodes in the first refresh period includes:
when the image of the first component is a multi-frame image and the image of the second component is a single-frame image, determining that the first component is a focus node and the second component is a non-focus node.
In one design, the first information includes transparency, the focus window of the window management service, stacking order, whether the component is occluded, and whether the component is animated, and the determining, according to the first information of the first component and the first information of the second component, whether the first component and the second component are focus nodes in the first refresh period is specifically:
determining whether the first component and the second component are focus nodes according to the transparency and its weight, the focus window of the window management service and its weight, the stacking order and its weight, whether each component is occluded and its weight, and whether each component is animated and its weight.
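The weighted combination in this design can be sketched as a scoring function. Everything numeric here is an assumption: the weight values, the normalization of each item to [0, 1], and the rule that the higher-scoring component is the focus node are all invented for illustration.

```python
# Hypothetical weights for the five items of "first information".
WEIGHTS = {"wms_focus": 0.4, "stacking": 0.2, "transparency": 0.2,
           "occluded": 0.1, "animated": 0.1}

def focus_score(info: dict) -> float:
    """Weighted score from first information; each item is normalized to [0, 1]."""
    score = 0.0
    score += WEIGHTS["wms_focus"] * (1.0 if info["wms_focus"] else 0.0)
    score += WEIGHTS["stacking"] * info["stacking_order"]            # 1.0 = topmost
    score += WEIGHTS["transparency"] * (1.0 - info["transparency"])  # opaque scores higher
    score += WEIGHTS["occluded"] * (0.0 if info["occluded"] else 1.0)
    score += WEIGHTS["animated"] * (1.0 if info["animated"] else 0.0)
    return score

first = {"wms_focus": True, "stacking_order": 1.0, "transparency": 0.0,
         "occluded": False, "animated": True}
second = {"wms_focus": False, "stacking_order": 0.5, "transparency": 0.6,
          "occluded": True, "animated": False}
print("focus node:", "first" if focus_score(first) > focus_score(second) else "second")
```

The component with the higher score would be treated as the focus node for the refresh period; ties and thresholds are left unspecified here, as in the design itself.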
In one design, the component includes a window or control.
In one design, the focus node is a node to be processed preferentially, such as a node operated by the current user, that is, a node the user is focusing on, or a node acted on by an input event.
In a second aspect, the present application provides an electronic device, including a display module, a determining module, and a rendering module. The display module is configured to display a first interface, where the first interface includes a first component and a second component and is an interface of a first application. The determining module is configured to, in response to receiving an operation on the first interface, determine whether the first component and the second component are focus nodes in a first refresh period according to first information of the first component and first information of the second component, where a focus node is a node the user is focusing on. The rendering module is configured to, when the first component is a focus node and the second component is a non-focus node, render the first component through a first process and render the second component through a second process in the first refresh period, where the rendering priority of the first process is higher than that of the second process.
Wherein a first component may be understood as a first window or first control and a second component may be understood as a second window or second control.
The focus node is a node to be processed preferentially, such as a node operated by the current user, that is, a node the user is focusing on, or a node acted on by an input event.
Thus, after receiving the user operation on the interface, the electronic device determines whether each component on the interface is a focus node. If the first component is the focus node, the first process of the electronic device renders the first component in real time. If the second component is a non-focus node, the second process of the electronic device renders the second component. In this way, the electronic device only needs to ensure real-time rendering of the focus node the user is attending to, preferentially guaranteeing a high frame rate for the focus node (such as a focus window or focus control). This keeps image display of the focus node smooth, avoids stuttering and white-screen phenomena, and improves the user's perceived fluency in high-load scenarios.
In one design, the rendering module is configured to multiplex an existing texture image of the second component from a second refresh period when the second process has not completed rendering of the second component in the first refresh period, where the second refresh period is the period preceding the first refresh period. Thus, when the second process has not rendered the second component, the existing texture image of the second component can be reused and the second component does not need to be rendered in real time, ensuring real-time rendering of the focus node the user is attending to and preferentially guaranteeing a high frame rate for the focus node (such as a focus window or focus control).
In one design, the display module is configured to display a second interface, where the second interface includes a third component, and the second interface is an interface of a second application, where the second application is different from the first application. The determining module is used for responding to the operation of the second interface and determining whether the first component, the second component and the third component are focus nodes in a first refresh period according to the first information of the first component, the first information of the second component and the first information of the third component. And the rendering module is used for rendering the third component through the first process and rendering the first component and the second component through the second process in the first refresh period when the third component is a focus node and the first component and the second component are both non-focus nodes.
When a plurality of non-focus nodes exist, the electronic device determines the rendering priority of each non-focus node, and then renders the corresponding components according to those priorities, realizing orderly rendering of the components and alleviating the shortage of CPU and GPU resources in high-load scenarios.
In one design, the rendering module is configured to determine the rendering priorities of the first component and the second component according to second information of the first component and second information of the second component, together with the weight of each item of second information, and to render the first component and then the second component in sequence through the second process when the rendering priority of the first component is higher than that of the second component.
In one design, the second information includes at least one of a visible window where no buffer is generated, an invisible window where no buffer is generated, a window where a buffer is generated and where there is a visible dirty region, a window where a buffer is generated and where there is no visible dirty region, and a static window where a buffer is generated.
In one design, the first information includes one or more of transparency, the focus window of a window management service, stacking order, whether the component is occluded, and whether the component is animated.
In one design, the first information includes transparency, and the determining module is configured to:
when the transparency of the first component is greater than that of the second component, determine that the first component is a non-focus node and the second component is a focus node.
In one design, the first information includes a focus window of a window management service, and the determining module is configured to:
when the first component is detected to be operated by a user and the second component is not operated, the first component is determined to be a focus node, and the second component is determined to be a non-focus node.
In one design, the first information includes a focus window of a window management service, and the determining module is configured to:
when it is detected that the user's gaze falls on the first component and not on the second component, determine that the first component is a focus node and the second component is a non-focus node.
In one design, the first information includes a stacking order, and the determining module is configured to:
when the stacking order of the first component is higher than that of the second component, determine that the first component is a focus node and the second component is a non-focus node.
In one design, the first information includes whether the component is animated, and the determining module is configured to:
when the image of the first component is a multi-frame image and the image of the second component is a single-frame image, determine that the first component is a focus node and the second component is a non-focus node.
In one design, the first information includes transparency, the focus window of the window management service, stacking order, whether the component is occluded, and whether the component is animated, and the determining module is configured to:
determine whether the first component and the second component are focus nodes according to the transparency and its weight, the focus window of the window management service and its weight, the stacking order and its weight, whether each component is occluded and its weight, and whether each component is animated and its weight.
In one design, the component includes a window or control.
In one design, the focus node is a node to be processed preferentially, such as a node operated by the current user, that is, a node the user is focusing on, or a node acted on by an input event.
In a third aspect, the present application provides an electronic device including one or more processors and a memory in which code is stored; when the code is executed by the one or more processors, the electronic device is caused to perform the method of the first aspect.
In a fourth aspect, the present application provides a computer readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of the first aspect.
For the specific embodiments and corresponding technical effects of the second to fourth aspects, reference may be made to the specific embodiments and technical effects of the first aspect.
Detailed Description
A window refers to a basic unit that an application program sets up in a graphical user interface for using data, and is the visual interface of the application program. The application and its data are integrated within the window. In the window, the user can operate the application program to manage, generate, and edit the data.
Windows are containers of controls, and each window has its own controls (or views) and the like. A control is a graphical user interface element, such as a window or a text box, that displays information whose arrangement can be changed by the user. As a basic visual building block, the control is included in the application program and controls all the data processed by the application program and the interaction operations on that data.
The process of drawing an interface is generally as follows. The electronic device receives an operation on an interface displayed on its display screen. Illustratively, the electronic device receives a user's click operation on an icon (e.g., the icon of a settings application) displayed on the display screen. In response to the operation, the electronic device receives first refresh information at a first refresh time. The electronic device determines information of the first frame image, such as the information of each control at the first refresh time. The electronic device renders each control according to the information of the first frame image to complete rendering of the whole window, draws the first frame image, and sends it to the display screen for display. Since the refresh period of the screen is fixed, at the second refresh time the electronic device receives second refresh information and determines information of the second frame image, such as the information of each control at the second refresh time. The electronic device renders each control according to the information of the second frame image to complete rendering of the whole window, draws the second frame image, and sends it to the display screen for display. By analogy, the electronic device obtains a plurality of frame images. These frame images are displayed sequentially on the screen, enabling the screen to display an interface with an animation effect at a constant frame rate.
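The per-refresh-period loop described above can be sketched as follows. This is a toy model only: the function and names are hypothetical, and "rendering" is reduced to tagging each control with its frame number.

```python
# Illustrative refresh loop: at each fixed refresh time the device gathers the
# refresh information, renders every control for that time, and sends the
# resulting frame image to the display.
def draw_frames(refresh_infos):
    frames = []
    for period, controls in enumerate(refresh_infos, start=1):
        # "Render" each control for this refresh time (stub: tag with the period).
        rendered = [f"{ctrl}@frame{period}" for ctrl in controls]
        frames.append(rendered)  # frame image sent to the display screen
    return frames

# Two refresh periods, each refreshing the same two controls.
print(draw_frames([["icon", "title"], ["icon", "title"]]))
# [['icon@frame1', 'title@frame1'], ['icon@frame2', 'title@frame2']]
```

The point of the sketch is that every control is re-rendered in every period; the method in this application relaxes exactly this for non-focus components.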
Therefore, every control and every window needs to be rendered in the drawing process. In a high-load scenario, more controls and windows need to be rendered, and frame loss, white screens, stuttering, and the like may occur, affecting the user experience.
In order to solve the above technical problem, the present application provides an image rendering method applied to an electronic device. The method may include: the electronic device displays a first interface that includes a first component and a second component, each of which may be a window or a control. In response to receiving an operation on the first interface, the electronic device determines whether the first component and the second component are focus nodes in a first refresh period, a focus node being a node the user is focusing on. When the first component is a focus node and the second component is a non-focus node, the first component is rendered through a first process and the second component through a second process in the first refresh period, where the rendering priority of the first process is higher than that of the second process. That is, after receiving the user operation on the interface, the electronic device determines whether each component on the interface is a focus node. If the first component is the focus node, the first process of the electronic device renders the first component in real time. If the second component is a non-focus node, the second process of the electronic device renders the second component, and an existing texture image of the second component can be multiplexed when the second process has not rendered it. In this way, the electronic device only needs to ensure real-time rendering of the focus node the user is attending to, preferentially guaranteeing a high frame rate for the focus node (such as a focus window or focus control). This keeps image display of the focus node smooth, avoids stuttering and white-screen phenomena, and improves the user's perceived fluency in high-load scenarios.
Fig. 1 is a block diagram of an electronic device according to an embodiment of the present application.
As shown in fig. 1, the electronic device 100 may include a processor 110, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, a memory 120, an antenna 1, a wireless communication module 160, a display 170, and a sensor module 150, etc. Wherein the sensor module 150 may include a pressure sensor 150A, a touch sensor 150B, etc.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device. In other embodiments of the present application, the electronic device may include more or fewer components than illustrated, some components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the display 170, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device may be implemented by the antenna 1, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antenna 1 is used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example, the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device, including wireless local area network (WLAN) (e.g., a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), etc. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 1, performs frequency modulation and filtering on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on it, and convert it into electromagnetic waves for radiation through the antenna 1. In some embodiments, the wireless communication module 160 receives application information sent by a server.
The electronic device implements display functions through the GPU, the display 170, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display 170 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display 170 is used to display images, videos, and the like. The display 170 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini-LED, a micro-LED, a micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device may include 1 or N displays 170, where N is a positive integer greater than 1.
The memory 120 may be used to store computer-executable program code, which includes instructions. The memory 120 may include a program storage area and a data storage area. The program storage area may store the operating system, an application program required for at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data created during use of the electronic device (such as audio data and a phone book), and the like. In addition, the memory 120 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). The processor 110 performs various functional applications and data processing of the electronic device by executing instructions stored in the memory 120 and/or instructions stored in a memory provided in the processor.
The pressure sensor 150A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 150A may be disposed on the display 170. There are many types of pressure sensors 150A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates with conductive material. When a force is applied to the pressure sensor 150A, the capacitance between the electrodes changes, and the electronic device determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display 170, the electronic device detects the intensity of the touch operation through the pressure sensor 150A. The electronic device may also calculate the touch location from the detection signal of the pressure sensor 150A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
The touch sensor 150B is also referred to as a "touch device". The touch sensor 150B may be disposed on the display 170, and the touch sensor 150B and the display 170 form a touch screen, which is also referred to as a "touch screen". The touch sensor 150B is used to detect a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 170. In other embodiments, the touch sensor 150B may also be disposed on a surface of the electronic device at a different location than the display 170.
Of course, the electronic device may also include other functional units, which are not limited by the embodiment of the present application.
In addition, for terms and the like related to the actions in the embodiments of the present application, reference may be made between embodiments without limitation. The names of the messages or the names of the parameters in the messages in the embodiments of the present application are only examples, and other names may be adopted in specific implementations without limitation.
The electronic device may be a mobile phone, a tablet computer, a laptop computer, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a personal digital assistant (PDA), a wearable electronic device, or the like, and the specific form of the electronic device is not particularly limited in the embodiments of the present application.
In some embodiments, the software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservices architecture, or a cloud architecture. By way of example, taking a layered architecture as an example of a software system of the electronic device 100, in conjunction with the image rendering scheme described in fig. 2, fig. 2 shows a schematic diagram of a combination of software and hardware of the electronic device 100 in the image rendering scheme.
The layered architecture divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces.
As shown in fig. 2, the electronic device 100 includes at least an application layer (or application layer), an application architecture layer, a system service layer, and a hardware layer.
The application layer includes a series of application packages, such as settings, desktop, system applications, third-party applications, game applications, video applications, and the like. The third-party applications may include, but are not limited to, applications downloaded from an application marketplace or the like, such as social applications and shopping applications (not shown). The windows/controls of each application program can be rendered separately, or rendered uniformly by a rendering service, and the rendered windows are submitted to the compositor of the system service layer for composition. In some embodiments, an application in the application layer may submit rendering instructions to the rendering service in the system service layer to complete the rendering of the application interface, etc.
The application architecture layer may include Rosen render backend, user program framework, ability framework, window.
The system service layer includes a rendering service, a window management service (WMS), a data management service (DMS), a device management service (PMS), an application management service (AMS), an intranet management system (IMS), and a compositor. The WMS is used to manage the windows of applications, such as desktop windows, SystemUI windows, negative-one-screen windows, and the like. Illustratively, the WMS may manage window types, window levels, and window locations, calculate window layouts, perform window form (e.g., full screen, floating window, split screen, maximize, minimize, etc.) switching management, and the like.
The rendering service comprises an animation module and a unified rendering module. Wherein the animation module may be configured to modify properties (e.g., position, size, etc.) of the control in each frame of picture (or image) included in the animation to generate an animation effect. Optionally, the properties of the control are different in different frames.
Alternatively, the animation module may be located at an application layer, and executed through an application User Interface (UI) thread of an application.
The unified rendering module may execute the rendering service based on a unified rendering mechanism, which may also be referred to as executing a unified rendering service. The unified rendering mechanism means that the unified rendering module can receive the rendering nodes of all applications and then render the rendering nodes of all applications together. Specifically, the unified rendering module can traverse all rendering nodes based on the control attributes modified by the animation module to complete the rendering operation, and can subsequently send the rendering result to the display screen for display.
In embodiments of the present application, a rendering node may also be referred to as a layer, or other name.
The hardware layer includes a compositor, which can be used to composite the layers corresponding to each window, such as the desktop layer, the SystemUI layer, the negative-one-screen layer, and the like. The desktop window corresponds to the desktop layer, the SystemUI window corresponds to the SystemUI layer, the negative-one-screen window corresponds to the negative-one-screen layer, and so on. Illustratively, the compositor may include a hardware compositor, a graphics processing unit (GPU), and the like.
It should be understood that the software-hardware architecture of the electronic device 100 shown in fig. 2 is only exemplary. In practical applications, it may include more or fewer modules, and the layers to which the modules belong may also differ. For example, a framework module or the like may also exist between the unifying application and the rendering service. Other ways of dividing the layers are also possible, and the present application is not limited in this respect.
The technical solutions related to the following embodiments may be implemented in devices having structures as shown in fig. 1 and fig. 2.
Fig. 3-1 and fig. 3-2 are schematic flow diagrams of an image rendering method according to an embodiment of the present application, where, as shown in fig. 3-2, the method is applied to an electronic device, and the method may include:
S301, the electronic device displays a first interface, where the first interface includes at least one component, such as a first component and a second component, and the first interface is an interface of a first application.
It will be appreciated that the first and second components may belong to the same application.
In one example, the above-described components (e.g., first component and second component) may be understood as windows or controls. Accordingly, a first component may be understood as a first window or first control and a second component may be understood as a second window or second control. Similarly, a subsequent third component may also be understood as a third window or third control.
In one example, windows are containers of controls, and each window has respective controls (or views), components (widgets), and the like therein. A control is a graphical user interface element, such as a window or a text box, whose displayed information and arrangement can be changed by the user. Controls are included in an application program as basic visual building blocks and carry all the data processed by the application program and the interaction operations on that data.
In one example, the window is the most important part of the user interface. It is a rectangular area on the screen of the electronic device corresponding to an application program, and is the visual interface of the application program. When a user launches an application, causing it to start running, the application creates and displays a window. In some embodiments, the user may also perform various operations on the window; for example, when the user manipulates an object in the window, the application may respond accordingly. For another example, the user may terminate the application by closing its window, or select a corresponding application by selecting its window.
S302, in response to receiving the operation on the first interface, the electronic device determines whether the first component and the second component are focus nodes in a first refresh period, wherein the focus nodes are nodes focused by a user.
In one particular implementation, as shown in connection with FIG. 3-1, in step ①, in response to receiving an operation on an interface of an application, a focus node determination module of the electronic device determines whether the first component and the second component are focus nodes in the first refresh period.
In an example, the operation may be understood as any operation on the first interface, for example, a click operation, a browse operation, a slide operation, an edit operation, a cursor touch operation, and the like.
In one particular implementation, S302 may be implemented specifically as follows: the electronic device determines, based on the first information of the first component and the first information of the second component, whether the first component and the second component are focus nodes in the first refresh period.
The first information is used to distinguish focus nodes from non-focus nodes. The judgment of focus nodes and non-focus nodes is based on the user's degree of attention: the part the user pays attention to is the focus node, and the part the user does not pay attention to is the non-focus node. The first information therefore tends to describe how a component is presented, and may include one or more of: transparency, whether the component is the focus window of the window management service, stacking order (z-order), whether the component is occluded, and whether the component is animated.
In one particular implementation, the first information includes transparency, and S302 may be implemented specifically as follows: when the transparency of the first component is greater than the transparency of the second component, the electronic device determines that the first component is a non-focus node and the second component is a focus node. By way of example, assuming that the first information of the first component includes a transparency of 50% and the first information of the second component includes a transparency of 10%, since 50% > 10%, the electronic device determines that the first component is a non-focus node and the second component is a focus node.
In a specific implementation, the first information includes the focus window of the window management service; that is, the window management service detects a window operation of the user, for example, the component over which the mouse hovers, a component whose display position is located in the middle area of the display screen, or a component on which an input event of the user acts. S302 may be implemented specifically as follows: when it is detected that the user is operating the first component and the second component is not operated, the electronic device determines that the first component is a focus node and the second component is a non-focus node. Illustratively, the first component is a first window and the second component is a second window; when the user is editing in the first window, it means the user is paying more attention to the first window. At this time, the electronic device detects the operation of the user and determines that the first window is the focus window and the second window is a non-focus window.
In one particular implementation, the first information includes the focus window of the window management service, e.g., the window management service detects the line of sight of the user. S302 may be implemented specifically as follows: when it is detected that the user's line of sight falls on the first component and not on the second component, the electronic device determines that the first component is a focus node and the second component is a non-focus node.
In a specific implementation manner, the first information includes the stacking order, and S302 may be implemented specifically as follows: when the stacking order of the first component is the top layer and the stacking order of the second component is the bottom layer, that is, when the level of the first component is closer to the user than the level of the second component, the electronic device determines that the first component is a focus node and the second component is a non-focus node.
In one specific implementation, the first information includes whether the component is animated, and S302 may be implemented specifically as follows: when the image of the first component is a multi-frame image (i.e., an animation or video display) and the image of the second component is a single-frame image (i.e., a picture display), the electronic device determines that the first component is a focus node and the second component is a non-focus node.
In one particular implementation, the first information includes whether the first component is occluded, and S302 may be implemented in particular by the electronic device determining that the first component is a focus node and the second component is a non-focus node when it is detected that the first component is not occluded and the second component is occluded.
Of course, the above implementation manners may be combined with each other. Specifically, the first information may include transparency and whether the component is occluded; or the first information may include the stacking order and the focus window of the window management service; or the first information may include transparency, whether the component is animated, and whether it is occluded. The embodiment of the present application is not particularly limited in this respect. When the first information includes information of a plurality of dimensions, the electronic device may determine whether each component is a focus node according to each dimension of information and its weight.
Illustratively, assume that the first information includes the transparency, the focus window of the window management service, the stacking order, whether the component is occluded, and whether it is animated. The weights may be set as follows: large transparency has a weight of 0 and small transparency a weight of 1; the focus window of the window management service has a weight of 1 and a non-focus window a weight of 0; a stacking order closer to the user has a weight of 1, otherwise 0; an occluded window has a weight of 0 and a non-occluded window a weight of 1; an animated window has a weight of 1 and a non-animated window a weight of 0.
For example, assume that the first information of the first component includes that the user operates the first component, its transparency is 50% (large), its stacking order is the top layer, and it is not occluded, and that the first information of the second component includes that its transparency is small, its stacking order is the bottom layer, it is occluded, and it is not animated. Then the score of the first component is 0+1+1+1=3 and the score of the second component is 1+0+0+0=1; since 3>1, the electronic device may determine that the first component is a focus node and the second component is a non-focus node.
For another example, assume that the first information of the first component includes that the first component is located in the middle of the display area, is not occluded, and is animated, and that the first information of the second component includes that the second component is located in the lower part of the display area, is not occluded, and is animated. Then the score of the first component is 1+1+1=3 and the score of the second component is 0+1+1=2; since 3>2, the electronic device may determine that the first component is a focus node and the second component is a non-focus node.
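The weighted scoring described above can be sketched in code. The following is a minimal illustrative sketch only, not the actual implementation: the feature names, the dictionary representation of the first information, and the function names are all assumptions introduced here; each dimension of the first information contributes its weight when true, and the component with the highest total score is treated as the focus node.

```python
# Hypothetical feature weights, mirroring the illustrative weights in the text.
FEATURE_WEIGHTS = {
    "low_transparency": 1,   # small transparency -> weight 1, large -> weight 0
    "wms_focus_window": 1,   # focus window of the window management service
    "top_z_order":      1,   # stacking order closer to the user
    "not_occluded":     1,   # component is not blocked by another component
    "animated":         1,   # component displays a multi-frame animation
}

def focus_score(first_info: dict) -> int:
    """Sum the weights of the first-information dimensions that hold for this component."""
    return sum(w for name, w in FEATURE_WEIGHTS.items() if first_info.get(name))

def pick_focus(components: dict) -> str:
    """Return the name of the component with the highest score (the focus node)."""
    return max(components, key=lambda name: focus_score(components[name]))

# The worked example above: the first component is operated by the user (WMS focus),
# on the top layer, and not occluded; the second component only has small transparency.
first = {"wms_focus_window": True, "top_z_order": True, "not_occluded": True}
second = {"low_transparency": True}
```

Applied to the example, `focus_score(first)` yields 3 and `focus_score(second)` yields 1, so the first component is selected as the focus node, matching the 3>1 comparison above.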
Before introducing S303, two rendering architectures to which the technical solution provided by the embodiments of the present application can be applied are described herein, where the two rendering architectures may include a separate rendering architecture and a unified rendering architecture. That is, the graphics architecture of the operating system of the electronic device provided by the embodiment of the present application may adopt a separate rendering architecture or a unified rendering architecture, and the specific contents are as follows:
First, the electronic device employs a separate rendering architecture.
Fig. 4 shows a logical schematic diagram of separate rendering. As shown in fig. 4, each application window (simply referred to as a window) may be rendered through a control tree and a rendering tree. The rendering tree is a data structure for generating the application interface; that is, the rendering tree records all the information for generating one frame of the application. For example, the control tree of a window may include a plurality of control modules (or controls), such as a root control, control 1, control 2, control 3, and so on. Each control includes control information of the corresponding control; for example, control 1 includes control information (such as a display position) of control 1 in the window, and the control information may be used to control the display of control 1. Each control in the control tree corresponds one-to-one to a rendering node in the rendering tree. The rendering tree may include a plurality of rendering nodes, such as rendering node 0, rendering node 1, rendering node 2, rendering node 3, and so on. Correspondingly, the root control may correspond to rendering node 0, control 1 to rendering node 1, control 2 to rendering node 2, and control 3 to rendering node 3. It is then determined, as described above, whether rendering node 0, rendering node 1, rendering node 2, and rendering node 3 are focus nodes. Assuming that rendering node 2 is determined to be a focus node and rendering node 0, rendering node 1, and rendering node 3 are non-focus nodes, rendering node 2 is allocated to the first process for rendering, that is, the first process renders control 2 according to the control information of control 2. Rendering node 0, rendering node 1, and rendering node 3 are assigned to the second process for rendering.
That is, the second process renders the root control according to the control information of the root control, renders control 1 according to the control information of control 1, and renders control 3 according to the control information of control 3. Rendering results of all the processes are then submitted to the compositor, where each application window corresponds to one layer in the compositor, and the compositor can composite all the application windows, that is, composite the layers corresponding to all the application windows. In embodiments of the present application, the rendering tree may also be referred to as a layer tree or by other names.
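The allocation of rendering nodes between the two processes can be sketched as follows. This is an illustrative sketch under assumed names (the node identifiers and the `is_focus` predicate are hypothetical), showing only the partitioning step: focus nodes go to the high-priority first process and all other nodes to the second process.

```python
def partition_render_nodes(render_tree, is_focus):
    """Split rendering nodes into (first_process_nodes, second_process_nodes).

    `render_tree` is an iterable of rendering nodes; `is_focus` is the focus-node
    decision described in S302, supplied here as a predicate.
    """
    focus_nodes, other_nodes = [], []
    for node in render_tree:
        (focus_nodes if is_focus(node) else other_nodes).append(node)
    return focus_nodes, other_nodes

# Fig. 4 example: rendering node 2 is the focus node, the rest are non-focus nodes.
tree = ["node0", "node1", "node2", "node3"]
first_proc, second_proc = partition_render_nodes(tree, lambda n: n == "node2")
```

In the Fig. 4 example, `first_proc` holds only rendering node 2 and `second_proc` holds rendering nodes 0, 1, and 3, matching the allocation described above.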
In the separate rendering architecture, the hierarchical relationship of each window, the size of each window, and the like are managed by the window management service (WMS). That is, the hierarchy, size, and the like of the layers corresponding to the windows are managed by the WMS.
Second, the electronic device adopts a unified rendering architecture.
Fig. 5 shows a logic schematic diagram of a unified rendering mechanism according to an embodiment of the present application. As shown in FIG. 5, in contrast to the separate rendering architecture described above, the unified rendering architecture has a unifying application that can abstract windows as controls for management, with each window being a control in the unifying application. Illustratively, as shown in FIG. 5, the unifying application may manage the layout and hierarchy of various controls, such as control 1, control 2, container control 3, window control 4, and status bar control 5, through a control tree. Control 1 and control 2 may be already-defined controls, such as icons. Both window control 4 and status bar control 5 may be child nodes of container control 3. The unifying application can generate a corresponding rendering tree according to the control tree; for example, the rendering nodes in the generated rendering tree may include a root rendering node, rendering nodes 1 to 5, and the like, and the unifying application submits the rendering tree to the rendering service. It can be appreciated that control 1 corresponds to rendering node 1, control 2 corresponds to rendering node 2, container control 3 corresponds to rendering node 3, window control 4 corresponds to rendering node 4, status bar control 5 corresponds to rendering node 5, and so on. By traversing each rendering node, the rendering service determines whether each rendering node is a focus node as described above, distributes focus nodes to the main thread (i.e., the first process) for rendering, and distributes non-focus nodes to a secondary thread (i.e., the second process) for rendering, so that the rendering of the whole interface can be completed. Subsequently, the rendering result can be sent directly to the display screen for display.
In some embodiments, a unifying application may be understood as a virtualized application (or service) having the functionality of various applications and services such as the WMS, the desktop, SystemUI, and the like. For example, a unifying application may have WMS-like functionality for managing the location, size, hierarchy, and the like of controls (including but not limited to window controls), as well as management of gestures such as split screen and full screen. Similarly to the desktop, it may provide functions such as managing icons and managing gestures such as multitasking.
The two rendering architectures shown in fig. 4 and fig. 5 are introduced herein to illustrate that the technical solution provided in the embodiment of the present application may be applied to at least the two rendering architectures. It can be understood that the technical solution according to the embodiment of the present application may also be applied to other rendering architectures, which are not listed in the embodiment of the present application.
And S303, when the first component is a focus node and the second component is a non-focus node, rendering the first component through a first process and rendering the second component through a second process in a first refresh period, wherein the rendering priority of the first process is higher than that of the second process.
The electronic device has parallel rendering processes: the process with the higher rendering priority renders the focus node, and the process with the lower rendering priority renders the non-focus nodes. In this way, the node the user is paying attention to (i.e., the focus node) is rendered in real time to guarantee a high frame rate, and the nodes the user is not paying attention to (i.e., the non-focus nodes) are rendered on the premise that real-time rendering of the focus node is ensured, thereby improving the user's experience of smooth graphics performance for the node being focused on.
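The priority relationship between the two processes can be modelled as a per-refresh-period dispatcher that always drains the focus queue before touching the non-focus queue. This is an illustrative sketch only, with assumed names (`render_one_frame`, the queues, and the `budget` parameter are hypothetical); it captures the scheduling order, not the actual inter-process mechanism.

```python
from collections import deque

def render_one_frame(focus_queue: deque, background_queue: deque, budget: int):
    """Render up to `budget` nodes in this refresh period, focus nodes always first.

    Models "the rendering priority of the first process is higher than that of
    the second process": a non-focus node is only rendered when no focus node
    is waiting.
    """
    rendered = []
    while budget > 0 and (focus_queue or background_queue):
        queue = focus_queue if focus_queue else background_queue
        rendered.append(queue.popleft())
        budget -= 1
    return rendered

# Fig. 6 example: window A is the focus node, windows C and D are non-focus nodes,
# and the period has budget for two nodes.
frame = render_one_frame(deque(["windowA"]), deque(["windowC", "windowD"]), budget=2)
```

Here `frame` is `["windowA", "windowC"]`: window A is rendered first because it is the focus node, and the remaining budget then goes to a non-focus node.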
For example, as shown in fig. 6, assuming that the window a is a focus node and the window C is a non-focus node, when the electronic device receives the first refresh signal, the first process of the electronic device renders the window a, the second process of the electronic device renders the window C, and the rendered image is displayed on the screen.
In a specific implementation, as shown in connection with fig. 3-1, the focus node determination module of the electronic device determines that the first component is a focus node and the second component is a non-focus node in step ②. And then, the focus node determining module of the electronic equipment sends the determining result to the rendering service of the electronic equipment, and the rendering service distributes the rendering task corresponding to the first component to the first process for rendering and distributes the rendering task corresponding to the second component to the second process for rendering according to the determining result.
In the embodiment of the present application, after the electronic device receives the user's operation on the interface, the electronic device determines whether each component on the interface is a focus node. If the first component is the focus node, the first process of the electronic device renders the first component in real time. If the second component is a non-focus node, the second process of the electronic device renders the second component. Therefore, the electronic device only needs to ensure real-time rendering of the focus node the user is paying attention to, and preferentially ensures a high frame rate for the focus part (such as the focus window/focus control), so that smooth image display of the focus node is ensured, stutter and white-screen phenomena are avoided, and the user's smooth performance experience in high-load scenarios is improved.
S304, when the second process has not completed the rendering of the second component in the first refresh period, multiplexing the existing texture image of the second component from the second refresh period, where the second refresh period is the period preceding the first refresh period.
That is, if the second process does not complete the rendering of the second component in the first refresh period, the texture image cached for the second component is multiplexed. In this way, the node the user is paying attention to (i.e., the focus node) is rendered in real time to guarantee a high frame rate, while for a node the user is not paying attention to (i.e., a non-focus node), the existing texture image of the non-focus node is multiplexed whenever its rendering is not completed, thereby improving the user's experience of smooth graphics performance for the node being focused on.
Along the above example, as shown in fig. 6, when the electronic device receives the first refresh signal, the first process of the electronic device renders the window a, and the second process of the electronic device renders the window C, but when the second process does not complete the rendering of the window C in the first refresh period, the electronic device displays the image rendered by the first process and the texture image (such as the image rendered in the second refresh period) existing in the window C on the screen. After the second process finishes rendering the window C, the texture image of the window C is cached.
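The texture multiplexing in S304 can be sketched as a small cache keyed by component. This is a hedged illustration under assumed names (`NonFocusRenderer`, `texture_for_frame`, and the string textures are all hypothetical), not the actual buffer management: when rendering finishes in time the fresh texture is cached and used; otherwise the texture cached in the previous refresh period is reused.

```python
class NonFocusRenderer:
    def __init__(self):
        self.texture_cache = {}  # component -> last completed texture

    def texture_for_frame(self, component, render_done, new_texture):
        """Return the texture to composite for this component in this refresh period."""
        if render_done:
            # Rendering finished within the period: cache and use the fresh texture.
            self.texture_cache[component] = new_texture
            return new_texture
        # Rendering not finished: multiplex the existing cached texture.
        return self.texture_cache.get(component)

r = NonFocusRenderer()
# Second refresh period: window C finishes rendering, texture is cached.
t1 = r.texture_for_frame("windowC", render_done=True, new_texture="frame1")
# First refresh period: window C is not finished, so the cached texture is reused.
t2 = r.texture_for_frame("windowC", render_done=False, new_texture=None)
```

Both `t1` and `t2` are the texture rendered in the earlier period, matching the Fig. 6 behavior where the screen shows window A's fresh image together with window C's existing texture.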
After the rendering is completed, as shown in fig. 3-1, in step ③, the rendering service of the electronic device sends the rendered image to the compositor, which composites the rendering results. Then, in step ④, the compositor sends the composited image to the display screen for display.
In some embodiments, the embodiments of the present application are not limited to the first component and the second component, but may further include a third component, which may or may not belong to the same application as the first component and the second component. For example, taking as an example the case where the third component does not belong to the same application as the first component and the second component, fig. 7 is a schematic flow chart of an image rendering method provided by an embodiment of the present application. As shown in fig. 7, in addition to S301, the method provided by the embodiment of the present application further includes:
S701, the electronic equipment displays a second interface, wherein the second interface comprises a third component, the second interface is an interface of a second application, and the second application is different from the first application.
Specifically, the specific description of S701 may refer to S301 described above, and will not be repeated here.
S702, in response to the operation on the second interface, determining whether the first component, the second component and the third component are focal nodes in the first refresh period according to the first information of the first component, the first information of the second component and the first information of the third component.
Specifically, the specific description of S702 may refer to S302 described above, and will not be repeated here.
S703, when the third component is a focus node and the first component and the second component are both non-focus nodes, rendering the third component through the first process and rendering the first component and the second component through the second process in the first refresh period.
Specifically, the specific description of S703 may refer to S303, which is not described herein.
In a specific implementation manner, the electronic device renders the first component and the second component through the second process, which may specifically be:
S7031, the electronic device determines the rendering priorities of the first component and the second component according to the second information of the first component, the second information of the second component, and the weights of the second information.
The second information is different from the first information and is used to prioritize the rendering of non-focus nodes; that is, the rendering order of the non-focus nodes can be judged from the second information. The second information therefore tends to reflect rendering urgency, and may include at least one of: a visible window for which no buffer has been generated, an invisible window for which no buffer has been generated, a window for which a buffer has been generated and a visible dirty region exists, a window for which a buffer has been generated and only an invisible dirty region exists, and a static window for which a buffer has been generated.
Wherein the expression of the electronic device determining the priorities of the first component and the second component may be:
p(x) = ∑_i ω_i·f_i,
where ω_i is the weight corresponding to the i-th feature and f_i is the i-th feature.
Illustratively, it is assumed that the second information includes a visible window in which no buffer is generated, an invisible window in which no buffer is generated, a window in which a buffer is generated and a visible dirty region is present, a window in which a buffer is generated and only an invisible dirty region is present, and a static window in which a buffer is generated. The weights of the information decrease in sequence, for example, the weight of the visible window which does not generate the buffer is 5, the weight of the invisible window which does not generate the buffer is 4, the weight of the window which generates the buffer and has the visible dirty area is 3, the weight of the window which generates the buffer and has only the invisible dirty area is 2, and the weight of the static window which generates the buffer is 1.
In example 1, if the second information of the first component is a visible window for which no buffer has been generated, and the second information of the second component is an invisible window for which no buffer has been generated, then the priority of the first component is 5 and the priority of the second component is 4.
In example 2, if the second information of the first component is a window for which a buffer has been generated and only an invisible dirty region exists, and the second information of the second component is a static window for which a buffer has been generated, then the priority of the first component is 2 and the priority of the second component is 1.
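The expression p(x) = ∑_i ω_i·f_i with the illustrative weights above can be sketched in code. The state names and function names below are assumptions introduced for readability, not the actual implementation; each window state that applies to a non-focus window contributes its weight to that window's rendering priority.

```python
# Hypothetical weights for the five second-information window states listed above.
STATE_WEIGHTS = {
    "visible_no_buffer":      5,  # visible window, no buffer generated yet
    "invisible_no_buffer":    4,  # invisible window, no buffer generated yet
    "buffer_visible_dirty":   3,  # buffer generated, visible dirty region exists
    "buffer_invisible_dirty": 2,  # buffer generated, only invisible dirty region
    "buffer_static":          1,  # buffer generated, window is static
}

def render_priority(features: list) -> int:
    """p(x) = sum of the weights of the features that apply to window x."""
    return sum(STATE_WEIGHTS[f] for f in features)

# Example 1 above: the first component is a visible window without a buffer,
# the second component is an invisible window without a buffer.
p_first = render_priority(["visible_no_buffer"])
p_second = render_priority(["invisible_no_buffer"])
```

For example 1, `p_first` is 5 and `p_second` is 4, so the second process renders the first component before the second component, as described in S7032.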
S7032, when the rendering priority of the first component is higher than that of the second component, the first component and the second component are sequentially rendered by the second process.
Following example 1 above, since 5 > 4, the second process of the electronic device preferentially renders the first component and then renders the second component.
Along with example 2,2>1 above, the second process of the electronic device preferentially renders the first component, followed by rendering the second component.
In the embodiment of the present application, when a plurality of non-focus nodes exist, the electronic device determines the rendering priority of each non-focus node and then renders the part corresponding to each non-focus node according to that priority. This realizes orderly rendering of the parts and relieves the shortage of CPU and GPU resources in a high-load scene.
In practical applications, the method provided by the embodiment of the present application is described in detail below in combination with the following application scenarios.
Scene one: rendering of different controls in the same window of the same application.
Fig. 8 is an interface schematic diagram of an electronic device according to the present application. As shown in fig. 8, the electronic device 100 displays an interface 801, on which a common control 1, a video control 2, and a video control 3 are displayed. Taking the unified rendering architecture as an example, in combination with the unified rendering architecture shown in fig. 4, the interface 801 corresponds to the root control, namely rendering node 0; the common control 1 corresponds to control 1, namely rendering node 1; the video control 2 corresponds to control 2, namely rendering node 2; and the video control 3 corresponds to control 3, namely rendering node 3. In the first refresh period, when the user slides the interface 801 so that the video control 2 is located in the middle area of the display screen, the electronic device 100 may detect that the common control 1 is located in the upper area of the display screen and its image is a single-frame image, the video control 2 is located in the middle area of the display screen and its image is a multi-frame image, and the video control 3 is located in the lower area of the display screen and its image is a multi-frame image. Assume that the weight of the middle area is 1 and that of other areas is 0, and that the weight of an animation (multi-frame image) is 1 and that of a non-animation (single-frame image) is 0. The electronic device may then determine that the score of the common control 1 is 0+0=0, the score of the video control 2 is 1+1=2, and the score of the video control 3 is 0+1=1. Since 2>1>0, the user is paying attention to the video control 2, and the electronic device 100 therefore determines that the video control 2 is the focus node and that the common control 1 and the video control 3 are non-focus nodes.
Further, since the common control 1 is a static window in which a buffer is generated and the video control 3 is a visible window in which no buffer is generated, the electronic device 100 may determine that the rendering priority of the video control 3 is higher than that of the common control 1. The electronic device 100 then assigns the video control 2 to the first process, and the common control 1 and the video control 3 to the second process. The first process of the electronic device 100 renders the video control 2 in real time and displays the rendered image on the screen. The second process of the electronic device 100 renders the video control 3 first and then renders the common control 1; if the second process does not complete the rendering of the common control 1 and the video control 3 within the first refresh period, the electronic device 100 displays the existing texture images of the common control 1 and the video control 3 on the screen. In this way, the electronic device ensures real-time rendering of the video control 2, so that the video control 2 has a smooth animation and the user experience is satisfied. In addition, the rendering of the common control 1 and the video control 3 does not interfere with the video control 2, and if their rendering is not completed, the existing texture images can be used, which further ensures the smooth animation of the video control 2 and effectively prevents the video control 2 from stuttering.
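The focus-node decision of scene one can be sketched as follows. The region and animation weights are the illustrative values assumed in the example above (middle area = 1, animation = 1), not values prescribed by the method, and the control names are taken from fig. 8 for illustration only.

```python
# Sketch of the focus-node scoring in scene one. Region and animation
# weights are the illustrative values assumed in the example above.
def score(region: str, multi_frame: bool) -> int:
    """Score a control: middle of the screen and animated content attract focus."""
    region_w = 1 if region == "middle" else 0  # middle area weight 1, others 0
    anim_w = 1 if multi_frame else 0           # multi-frame image counts as animation
    return region_w + anim_w

# Each control: (screen region, whether its image is a multi-frame image)
controls = {
    "common_control_1": ("upper", False),
    "video_control_2": ("middle", True),
    "video_control_3": ("lower", True),
}
scores = {name: score(*args) for name, args in controls.items()}
focus = max(scores, key=scores.get)                 # highest score becomes focus node
non_focus = [name for name in scores if name != focus]
print(focus, scores)
```

With these weights, the scores 0, 2, and 1 reproduce the example: the video control 2 becomes the focus node and the other two controls are non-focus nodes.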
Scene two: rendering of different windows of different applications.
Fig. 9 is an interface schematic diagram of an electronic device according to the present application. As shown in fig. 9, the electronic device 100 displays an interface 901, on which a window 1, a window 2, and a window 3 are displayed. Taking the split rendering architecture as an example, in combination with the split rendering architecture shown in fig. 5, the interface 901 corresponds to the root control, namely the root rendering node; the window 1 corresponds to control 1, namely rendering node 1; the window 2 corresponds to container control 3, namely rendering node 3; and the window 3 corresponds to control 2, namely rendering node 2. When the mouse hovers over the window 3, the electronic device 100 detects that the stacking order of the window 1 is the bottom layer, the stacking order of the window 2 is the middle layer, the stacking order of the window 3 is the top layer, and the window 3 is the window over which the mouse hovers. Assume that the window whose stacking order is closest to the user has a weight of 1, and other windows have a weight of 0. The electronic device may then determine that the score of the window 1 is 0, the score of the window 2 is 0, and the score of the window 3 is 1. Since 1>0, the user is paying attention to the window 3, and the electronic device 100 therefore determines that the window 3 is the focus node and that the window 1 and the window 2 are non-focus nodes. Further, since the window 1 is a static window in which a buffer is generated and the window 2 is a visible window in which a buffer is generated, the electronic device 100 may determine that the rendering priority of the window 2 is higher than that of the window 1. The electronic device 100 then assigns the window 3 to the first process for rendering, and the window 1 and the window 2 to the second process for rendering. The first process of the electronic device 100 renders the window 3 in real time and displays the rendered image on the screen.
The second process of the electronic device 100 renders the window 2 first and then renders the window 1; if the second process does not complete the rendering of the window 1 and the window 2 within the first refresh period, the electronic device 100 displays the existing texture images of the window 1 and the window 2 on the screen. In this way, the electronic device ensures real-time rendering of the window 3, so that the window 3, as the focus node, is rendered and displayed in real time in a high-load scene and the user experience is satisfied. In addition, the window 1 and the window 2 can use the existing texture images, which further ensures the smoothness of the window 3 and effectively prevents the window 3 from stuttering.
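The two-process scheduling with texture fallback described above can be sketched as follows. The time budget and per-window rendering costs are invented illustrative numbers, and `compose_frame` is a hypothetical helper name; the sketch only shows the policy: the focus node is always rendered fresh, non-focus nodes are rendered in priority order within the refresh-period budget, and anything unfinished falls back to its existing cached texture.

```python
# Sketch of two-process frame composition with fallback to cached textures.
# Budget and cost values are illustrative assumptions, not real timings.
def compose_frame(focus, non_focus_sorted, budget_ms, cost_ms, cache):
    """Render the focus node unconditionally; render non-focus nodes in
    priority order until the refresh-period budget runs out, falling back
    to the last cached texture for anything left unfinished."""
    frame = {focus: "fresh"}            # first process: real-time render
    remaining = budget_ms
    for node in non_focus_sorted:       # second process: priority order
        if cost_ms[node] <= remaining:  # fits in this refresh period
            remaining -= cost_ms[node]
            cache[node] = "fresh"
            frame[node] = "fresh"
        else:                           # missed the period: reuse old texture
            frame[node] = cache.get(node, "none")
    return frame

cache = {"window_1": "cached", "window_2": "cached"}
frame = compose_frame("window_3", ["window_2", "window_1"],
                      budget_ms=8, cost_ms={"window_2": 6, "window_1": 6},
                      cache=cache)
print(frame)  # {'window_3': 'fresh', 'window_2': 'fresh', 'window_1': 'cached'}
```

In this run the window 2 (higher priority) is rendered within the budget, while the window 1 misses the refresh period and its existing texture is displayed instead, so the focus window 3 is never delayed.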
It can be seen that the technical solution provided by the embodiment of the present application may be applied to the differentiated rendering of different controls in the same window as shown in fig. 8, or to the differentiated rendering of different windows as shown in fig. 9, and of course may also be applied to other scenes; the embodiment of the present application is not particularly limited in this respect.
In the absence of specific statements and logical conflicts, terms and/or descriptions of different embodiments of the present application are consistent and may be cited by one another, and the technical features of different embodiments may be combined to form new embodiments according to their inherent logical relationships.
The foregoing description of the solution provided by the embodiments of the present application has been presented mainly in terms of the method. It will be appreciated that the electronic device, in order to achieve the above-described functions, includes corresponding hardware structures and/or software modules that perform the respective functions. The various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application, but such implementation is not to be considered as beyond the scope of the embodiments of the present application.
The embodiment of the application can divide the functional modules of the electronic device according to the method example, for example, each functional module can be divided corresponding to each function, or two or more functions can be integrated in one processing unit. The integrated units may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the units is schematic, which is merely a logic function division, and other division manners may be implemented in actual practice.
As shown in fig. 10, a schematic structural diagram of an electronic device according to an embodiment of the present application is provided, and the electronic device 1000 may be used to implement the methods described in the above method embodiments. By way of example, the electronic device 1000 may include a display unit 1001 and a processing unit 1002.
The display unit 1001 is configured to support the electronic device 1000 in performing step S301 in fig. 3-2, or steps S301 and S701 in fig. 7; and/or the display unit 1001 is further configured to support the electronic device 1000 in performing other steps performed by the electronic device in the embodiments of the present application.
The processing unit 1002 is configured to support the electronic device 1000 in performing steps S302 to S304 in fig. 3-2, or steps S702 to S703 in fig. 7; and/or the processing unit 1002 is further configured to support the electronic device 1000 in performing other steps performed by the electronic device in the embodiments of the present application.
Optionally, the electronic device 1000 shown in fig. 10 may further include a communication unit 1003, where the communication unit 1003 is configured to support the electronic device 1000 in performing the steps of communication between the electronic device and other devices in the embodiments of the present application.
Optionally, the electronic device 1000 shown in fig. 10 may further include a storage unit (not shown in fig. 10) storing programs or instructions. When the programs or instructions are executed by the processing unit 1002, the electronic device 1000 shown in fig. 10 is enabled to perform the method shown in fig. 3-2, etc.
The technical effects of the electronic device 1000 shown in fig. 10 may refer to the technical effects of the method shown in fig. 3-2, etc., and will not be described herein. The processing unit 1002 involved in the electronic device 1000 shown in fig. 10 may be implemented by a processor or processor-related circuit components, which may be a processor or a processing module. The communication unit may be implemented by a transceiver or transceiver-related circuit component, and may be a transceiver or transceiver module. The display unit 1001 may be implemented by a display related component.
It should be understood that the steps in the above-described method embodiments may be accomplished by integrated logic circuitry in hardware in a processor or instructions in the form of software. The steps of the method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in the processor for execution.
It should be noted that, all relevant contents of each step related to the above method embodiment may be cited to the functional description of the corresponding functional module, which is not described herein.
Embodiments of the present application also provide a computer-readable storage medium comprising instructions which, when run on a computer, cause the computer to perform any of the methods described above.
Embodiments of the present application also provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform any of the methods described above.
The embodiment of the application also provides a chip, which comprises a processor and an interface circuit, wherein the interface circuit is coupled with the processor, the processor is used for running a computer program or instructions to realize the method, and the interface circuit is used for communicating with other modules outside the chip.
All or part of any feature or any step of embodiments of the application may be freely combined. The combined technical scheme is also within the scope of the application.
In the description of the present application, "/" means "or" unless otherwise indicated; for example, A/B may mean A or B. The term "and/or" herein merely describes an association relation between associated objects and indicates that three relations may exist; for example, A and/or B may mean that A exists alone, both A and B exist, or B exists alone. Furthermore, "at least one" means one or more, and "a plurality" means two or more. The terms "first", "second", and the like do not limit the number or order of execution, and objects described as "first" and "second" are not necessarily different.
In the description of the present application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described as "exemplary" or "such as" should not be construed as being preferred or more advantageous than other embodiments or designs. Rather, the use of such words is intended to present related concepts in a concrete fashion.
It will be apparent to those skilled in the art from this description that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the method described in the embodiments of the present application. The storage medium includes a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program code.
The foregoing is merely illustrative of specific embodiments of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.