Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" may mean that A exists alone, that both A and B exist, or that B exists alone. In addition, in the description of the embodiments of the present application, "plurality" means two or more.
The terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined by "first" or "second" may explicitly or implicitly include one or more such features. In the description of the present embodiment, unless otherwise specified, "plurality" means two or more.
Applications whose display elements contain pictures can provide rich, intuitive display information for users on terminal equipment, and in the current age of rapid development of intelligent terminal equipment, such applications keep emerging to meet user requirements in various fields. However, when a user opens an application that loads and displays pictures, the pictures may flash in out of order, making the display effect chaotic and giving the user a poor subjective visual experience.
As shown in fig. 1, the user opens the application mall 102 from the main interface 101 of the mobile phone 100, and the interface quickly changes to a placeholder interface 103, in which all pictures are temporarily replaced with a placeholder bitmap (a solid-color picture, shown as white): for example, the promotional banner picture at position 104, the icon picture at position 105, the game icon picture at position 106, and the icon picture at position 107 are all displayed as placeholder bitmaps.
As soon as background loading completes, pictures are displayed in the interface according to the principle of "whoever loads first is displayed first", and the interface then changes to interface 108: the icon picture of the popular application at position 109 finishes loading first and is displayed, while the applications at the remaining positions still show the placeholder bitmap. At this time, the icon picture at position 109 appears to flash in suddenly, which looks visually abrupt.
Subsequently, the icon picture at position 111 is also loaded and displayed, as shown in interface 110. Position 111 is to the left of position 109; if the top-to-bottom, left-to-right order of human vision were followed, the icon picture at position 111 should be displayed before the icon picture at position 109. In practice, however, the icon picture at position 109 is loaded and displayed first, and the icon picture at position 111 is displayed only after it finishes loading. The change from interface 108 to interface 110 thus gives the user the poor visual experience of an out-of-order display.
Interface 110 transitions to interface 112 as more and more pictures are loaded and displayed. It should be noted that the icon pictures at positions 105, 113, 114, 115, and 116 are still displayed according to the principle of "whoever loads first is displayed first"; one possible display order is: the icon picture at position 115 is loaded and displayed first, and then the icon pictures at positions 114, 113, 105, and 116 are loaded and displayed in turn. The visual effect presented to the user during this process is still one of disordered flashing.
Over time, all pictures are loaded and displayed, and interface 112 transitions to interface 117. Likewise, the pictures at positions 104, 118, 119, 106, and 107 are displayed on the principle of "whoever loads first is displayed first"; one possible display order is: the icon picture at position 107 is loaded and displayed first, then the icon pictures at positions 118, 119, and 106, and finally the promotional banner picture at position 104 are loaded and displayed in turn. Although the transition from interface 101 to interface 117 takes only a short time, the overall visual effect for the user is disordered and messy, producing a poor "flickering" and "jumping" visual experience.
In addition, when a user slides within an application to refresh a picture (or otherwise view a picture), the picture may also flash out of order. Referring to fig. 2, when the user opens the gallery application of the mobile phone 100, after all the pictures of the gallery top page are loaded, the display result is shown as an interface 201.
The user slides up in interface 201 to view earlier pictures, and interface 201 transitions to interface 202: some fast-loading pictures finish loading and are displayed first, following the principle of "whoever loads first is displayed first", such as the picture displayed at position 203, while some slower-loading pictures have not yet finished loading, so the placeholder bitmap is displayed where they should appear, such as at position 204. At this point, the picture displayed at position 203 may flash in suddenly and look visually abrupt to the user.
Interface 202 transitions to interface 205 as pictures are loaded and displayed one by one over time; for example, the picture at position 204 is loaded and displayed after the picture at position 203. It should be noted that although position 204 is above position 203 in interface 205, the picture at position 204 is displayed after the picture at position 203, rather than in top-to-bottom, left-to-right order, which leaves the user with a disordered visual impression.
When the user stays at interface 205 for a certain time and all pictures in interface 205 finish loading, interface 205 finally changes to interface 206. It should be noted that the pictures in interface 206 are not displayed in top-to-bottom, left-to-right order from position 207 to position 211, but still according to the principle of "whoever loads first is displayed first": for example, the picture at position 211 loads first and is displayed first (the pictures at positions 203 and 204 were already displayed in interface 205), then the picture at position 208 is loaded and displayed, then the picture at position 207, then the picture at position 210, and finally the picture at position 209.
Although the transition from interface 201 to interface 206 takes only a short time, the pictures "flash" in at random while being refreshed, giving the user a chaotic impression that does not match the general top-to-bottom, left-to-right browsing habit. Especially when the user keeps sliding to refresh pictures, this out-of-order flashing becomes more obvious and greatly degrades the user's visual experience.
In order to solve the problem in the prior art that pictures flash in out of order while being loaded and displayed when opened or refreshed, the embodiments of the present application uniformly manage the order in which pictures appear according to their positions, and provide a method for loading and displaying pictures in an orderly manner.
The picture loading display method provided by the embodiments of the present application can be applied to electronic devices such as mobile phones, tablet computers, notebook computers, ultra-mobile personal computers (UMPC), handheld computers, netbooks, personal digital assistants (PDAs), wearable electronic devices, vehicle-mounted devices, virtual reality devices, and the like; the embodiments of the present application place no limitation on the device type.
As shown in fig. 3, a mobile phone is taken as an example of a terminal according to an embodiment of the present application, and the mobile phone 100 includes:
Processor 310, memory (including external memory interface 320 and internal memory 321), universal serial bus (universal serial bus, USB) interface 330, charge management module 340, power management module 341, battery 342, antenna 1, antenna 2, mobile communication module 350, wireless communication module 360, audio module 370, speaker 370A, receiver 370B, microphone 370C, headphone interface 370D, sensor module 380, keys 390, motor 391, indicator 392, camera 393, display 394, and subscriber identity module (subscriber identification module, SIM) card interface 395, among others.
It will be appreciated that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the mobile phone. In other embodiments of the application, the handset may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 310 may include one or more processing units. For example: processor 310 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a flight controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural-network processing unit (neural-network processing unit, NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors. In an embodiment of the present application, the processor 310 may process instructions for loading pictures through the graphics processing unit or the like.
In some embodiments, the handset 100 may also include one or more processors 310. The controller can generate operation control signals according to the instruction operation codes and timing signals to control instruction fetching and instruction execution. In other embodiments, a memory may also be provided in the processor 310 for storing instructions and data. Illustratively, the memory in the processor 310 may be a cache. This memory may hold instructions or data that the processor 310 has just used or uses cyclically. If the processor 310 needs to reuse the instructions or data, it can call them directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 310, thereby improving the efficiency of the handset 100 in processing data or executing instructions.
In some embodiments, processor 310 may include one or more interfaces. The interfaces may include an inter-integrated circuit (inter-integrated circuit, I2C) interface, an inter-integrated circuit sound (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver/transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (general-purpose input/output, GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The external memory interface 320 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capabilities of the handset 100. The external memory card communicates with the processor 310 through an external memory interface 320 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 321 may be used to store computer executable program code comprising instructions. The internal memory 321 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data (e.g., audio data, phonebook, etc.) created during use of the handset 100, etc. In addition, the internal memory 321 may include a high-speed random access memory, and may also include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 310 performs various functional applications and data processing of the mobile phone 100 by executing instructions stored in the internal memory 321 and/or instructions stored in a memory provided in the processor.
The charge management module 340 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 340 may receive a charging input of a wired charger through the USB interface 330. In some wireless charging embodiments, the charge management module 340 may receive wireless charging input through a wireless charging coil of the cell phone 100. The battery 342 is charged by the charging management module 340, and the mobile phone 100 can be powered by the power management module 341.
The power management module 341 is configured to connect the battery 342, the charge management module 340 and the processor 310. The power management module 341 receives input from the battery 342 and/or the charge management module 340 to power the processor 310, the internal memory 321, the display screen 394, the camera assembly 393, the wireless communication module 360, and the like. The power management module 341 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance), and other parameters. In other embodiments, the power management module 341 may also be disposed in the processor 310. In other embodiments, the power management module 341 and the charging management module 340 may also be disposed in the same device.
The wireless communication function of the mobile phone 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 350, the wireless communication module 360, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communications bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 350 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied to the handset 100. The mobile communication module 350 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), or the like. The mobile communication module 350 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 350 may amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate the electromagnetic waves. In some embodiments, at least some of the functional modules of the mobile communication module 350 may be disposed in the processor 310. In some embodiments, at least some of the functional modules of the mobile communication module 350 may be provided in the same device as at least some of the modules of the processor 310.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to speaker 370A, receiver 370B, etc.) or displays images or video through display screen 394. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 350 or other functional module, independent of the processor 310.
The wireless communication module 360 may provide solutions for wireless communication applied to the mobile phone 100, including near field communication (near field communication, NFC) technology, wireless local area networks (wireless local area networks, WLAN) (e.g., a Wi-Fi network), Bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), infrared (IR) technology, etc. The wireless communication module 360 may be one or more devices integrating at least one communication processing module. The wireless communication module 360 receives electromagnetic waves via the antenna 2, modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 310. The wireless communication module 360 may also receive a signal to be transmitted from the processor 310, frequency-modulate and amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 and the mobile communication module 350 of the handset 100 are coupled, and the antenna 2 and the wireless communication module 360 are coupled, so that the handset 100 can communicate with a network and other devices through wireless communication technology. The wireless communication techniques may include global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include the global positioning system (global positioning system, GPS), the global navigation satellite system (global navigation satellite system, GLONASS), the BeiDou navigation satellite system (BeiDou navigation satellite system, BDS), and/or satellite based augmentation systems (satellite based augmentation systems, SBAS). The mobile phone 100 can establish communication connections with other devices through the mobile communication module 350 and perform data transmission with them through the established connections.
The mobile phone 100 implements display functions through a GPU, a display screen 394, an application processor, and the like. The GPU is a microprocessor for image processing, connected to the display screen 394 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 310 may include one or more GPUs that execute program instructions to generate or change display information. In the embodiment of the application, the GPU can be used for converting and driving the display information required by the computer system and providing a line scanning signal for the display to control the display to correctly display contents such as various pictures.
The display screen 394 is used for displaying images, videos, and the like. The display screen 394 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light-emitting diode, QLED), or the like. In some embodiments, the handset 100 may include 1 or N displays 394, N being a positive integer greater than 1. In the embodiment of the present application, the display screen 394 is mainly used for displaying various pictures, such as pictures of application icons, pictures taken in a gallery, and animations.
The mobile phone 100 may implement photographing functions through an ISP, a photographing component 393, a video codec, a GPU, a display screen 394, an application processor, and the like.
The handset 100 may implement audio functions through an audio module 370, speaker 370A, receiver 370B, microphone 370C, an application processor, and the like. Such as music playing, recording, etc. Regarding the specific operation and function of the audio module 370, speaker 370A, receiver 370B and microphone 370C, reference may be made to the description in the conventional art.
The sensor module 380 may include pressure sensors, gyroscope sensors, barometric pressure sensors, magnetic sensors, acceleration sensors, distance sensors, proximity sensors, fingerprint sensors, temperature sensors, touch sensors, ambient light sensors, bone conduction sensors, and the like.
The keys 390 include a power key, volume keys, and the like. The keys 390 may be mechanical keys or touch keys. The handset 100 may receive key inputs and generate key signal inputs related to user settings and function control of the handset 100.
The motor 391 may generate a vibration alert. The motor 391 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 391 may also correspond to different vibration feedback effects by touch operations applied to different areas of the display screen 394. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 392 may be an indicator light, which may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The camera 393 is used for shooting pictures, and the camera may be a wide-angle camera, a main camera, a macro camera, a tele camera, a time of flight (TOF) camera, etc.
The SIM card interface 395 is for interfacing with a SIM card. The SIM card may be inserted into the SIM card interface 395 or removed from the SIM card interface 395 to enable contact with and separation from the handset 100. The handset 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 395 may support Nano SIM cards, Micro SIM cards, and the like. The same SIM card interface 395 can be used to insert multiple cards simultaneously; the types of the multiple cards may be the same or different. The SIM card interface 395 may also be compatible with different types of SIM cards, and may further be compatible with external memory cards. The mobile phone 100 interacts with the network through the SIM card to realize functions such as calls and data communication. In some embodiments, the handset 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the handset 100 and cannot be separated from it.
It should be noted that, the hardware modules included in the mobile phone 100 shown in fig. 3 are only described as an example, and the specific structure of the mobile phone 100 is not limited. For example, the handset 100 may also include other functional modules.
The software system architecture of the terminal device is described below. The software system of the terminal device provided by the embodiment of the application may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, a cloud architecture, or the like. For example, the software system may include, but is not limited to, operating systems such as Symbian, Android, iOS, BlackBerry, and Harmony (HarmonyOS); the application places no limitation on the operating system.
Referring to fig. 4, fig. 4 is a block diagram of the software structure of a terminal device during picture loading and display in an embodiment of the present application, taking the Harmony (HarmonyOS) operating system with a layered architecture as an example. The layered architecture divides the software into several layers, each with a clear role and division of labor, and the layers communicate with each other through software interfaces. As shown in fig. 4, the software structure of the terminal device may be divided into four layers from bottom to top: the kernel layer, the system basic capability layer, the application framework layer (framework layer for short), and the application layer.
The Harmony system adopts a multi-kernel design, optionally including a Linux kernel, a Harmony microkernel, and LiteOS; with this design, devices with different capabilities can select an appropriate system kernel. The kernel layer also includes a kernel abstraction layer (Kernel Abstraction Layer, KAL) that provides basic kernel capabilities to the other Harmony layers, such as process management, thread management, memory management, file system management, network management, peripheral management, and the like.
The system basic capability layer is the core capability set of the Harmony system, and supports the Harmony system in providing services to application services through the framework layer in multi-device deployment scenarios. This layer optionally includes a system basic capability subsystem set, a basic software service subsystem set, the Harmony Driver Framework (HDF) and hardware abstraction layer (HAL), a hardware service subsystem set, proprietary hardware service subsystems, and an enhanced software service subsystem set.
System basic capability subsystem set: provides basic capabilities for the operation, scheduling, migration, and other operations of distributed applications across the multiple devices of the Harmony system, and consists of the distributed soft bus, distributed data management and file management, distributed task scheduling, the Ark runtime, distributed security, privacy protection, and the like. The Ark runtime provides a C/C++/JavaScript multi-language runtime and underlying system class library, and also provides the runtime for Java programs (i.e., applications or parts of the framework layer developed in the Java language) statically compiled with the Ark compiler.
The basic software service subsystem set provides public, general software services for the Harmony system, and consists of software services such as MSDP&DV (Multimodal Sensor Data Platform & Device Virtualization), graphic images, distributed media, multimodal input, and event notification. In the embodiment of the present application, the picture arrangement framework can be uniformly managed according to the display positions of the view controls (ImageView) corresponding to the pictures, so that the pictures are displayed gradually and in order, avoiding out-of-order flashing.
The enhanced software service subsystem set provides differentiated, capability-enhancing software services for different devices in the Harmony system, and consists of subsystems such as vehicle-machine business software services, tablet business software services, smart-screen business software services, and Internet of things (IoT) business software services.
The Harmony Driver Framework (HDF) and hardware abstraction layer (HAL) are the foundation of the open hardware ecosystem of the Harmony system; they provide hardware capability abstractions upward and provide development frameworks and running environments for various peripheral drivers downward.
The hardware service subsystem set provides public, adaptive hardware services for the Harmony system, and consists of hardware service subsystems such as sensors, location, power supply, USB, and biometric identification.
The proprietary hardware service subsystems provide differentiated hardware services for different devices in the Harmony system, optionally including subsystems such as tablet proprietary hardware services, vehicle-machine proprietary hardware services, wearable proprietary hardware services, and IoT proprietary hardware services.
The framework layer provides a Java/C/C++/JavaScript multi-language user program framework and meta-capability framework for the application programs of the Harmony system, as well as multi-language framework APIs that open up various software and hardware services.
The application layer comprises system applications and third-party applications, and may include applications such as camera, gallery, calendar, calls, maps, navigation, WLAN, Bluetooth, music, video, and messaging.
It should be noted that fig. 4 only takes the layered Harmony system as an example to introduce the software structure block diagram of a terminal device during picture loading and display. The present application does not limit the specific architecture of the software system of the terminal device; for descriptions of software systems with other architectures, reference may be made to conventional techniques.
The embodiment of the application provides a picture loading display method, which comprises: first, acquiring N pictures in a first interface of a first application; then creating view controls corresponding to the N pictures in the first interface; then loading at least one of the N pictures within a first time period; and after the first time period ends, displaying the at least one picture in the first interface in a first order according to the positions of the view controls corresponding to the at least one picture. If the N pictures are not all loaded within the first time period, at least one remaining picture of the N pictures (other than the at least one picture loaded in the first time period) is loaded in a second time period; after the second time period ends, the at least one remaining picture is displayed in the first interface in the first order according to the positions of its corresponding view controls. There may be further time periods until, after some time period has elapsed, all N pictures in the first interface have been displayed.
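The batched, position-ordered scheme described above can be sketched as a simulation. This is an illustration only, not the actual framework implementation: the function name, the dictionary-based data structures, and the (row, col) position encoding are all assumptions made for the sketch.

```python
# Hypothetical simulation of the picture loading display method:
# pictures are loaded in time periods, and at the end of each period the
# pictures loaded in that period are displayed ordered by view-control
# position (row first, then column), i.e. the "first order" of the text.

def display_in_batches(pictures, load_period_of):
    """pictures: dict name -> (row, col) view-control position.
    load_period_of: dict name -> index of the time period in which the
    picture finishes loading.
    Returns the overall display order: within each time period, the
    loaded pictures appear top-to-bottom, left-to-right."""
    order = []
    for period in sorted(set(load_period_of.values())):
        # collect the batch of pictures that finished loading this period
        batch = [n for n in pictures if load_period_of[n] == period]
        # sort by (row, col): top-to-bottom, then left-to-right
        batch.sort(key=lambda n: pictures[n])
        order.extend(batch)
    return order
```

For example, if picture "b" at (0, 1) and "c" at (1, 0) both finish in period 0 while "a" at (0, 0) finishes in period 1, the display order is b, c, a: within a batch the order is positional, but a slow picture still appears in a later batch.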
Please refer to fig. 5, which is a schematic diagram of the interface changes when a user opens the application mall APP under the picture loading display method provided by the embodiment of the present application. The change from interface 101 to interface 103 in fig. 5 is the same as in fig. 1: after the user opens the application mall 102 from the main interface 101 of the mobile phone 100, the interface quickly changes to a placeholder interface 103, in which all pictures are first replaced with a placeholder bitmap (shown as a solid white background): for example, the promotional banner picture at position 104, the icon picture at position 105, the game icon picture at position 106, and the icon picture at position 107 are all displayed as placeholder bitmaps.
It should be noted that, before a picture is displayed, a view control corresponding to the picture needs to be created; after the view control is created, an occupancy bitmap is generally displayed at the position of the view control. The occupancy bitmap may be a solid-color background picture, a non-solid-color background picture with pattern content, or even a transparent blank picture. In effect, the view control for each picture is quickly created at its corresponding position in the interface 103; the pictures are simply not yet loaded, so the occupancy bitmap is displayed in their place.
The interface 103 is merely an example showing the display effect of the interface, which may alternatively be presented in the form of other graphics or text; this is not limited in the embodiment of the present application. In addition, interface 103 may include more or fewer presentation elements; for example, in one possible embodiment the large promotional picture at position 104 may be omitted, and the embodiment of the application is not limited in this regard either.
Assuming that all pictures in interface 103 are loaded within the 350 milliseconds of one time period, all pictures may be displayed in a certain order. One possible display order is top to bottom and left to right, i.e., interface 103 transitions directly to interface 510 as shown: the large promotional picture at position 104 and the icon pictures at positions 105, 502, 503, 504, 505, 506, 507, 508, 509, 106 and 107 are displayed in order from top to bottom and from left to right.
It should be noted that, in the scenario shown in fig. 5, the time period is set to 350 milliseconds; in some embodiments, the time period may also be set to other values, such as 300 milliseconds, that do not make the user obviously perceive the pictures as lagging.
It should be noted that the order of loading the pictures at the 12 positions may be arbitrary, but the pictures are displayed in the order from top to bottom and from left to right.
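The top-to-bottom, left-to-right display order can be expressed as a simple sort over the view-control coordinates (a hedged sketch; the function name and coordinate convention are assumptions):

```python
def first_order(view_positions):
    """Return picture ids sorted top to bottom, then left to right.

    view_positions: dict mapping picture id -> (x, y) of its view control,
    with y increasing downward, as in typical screen coordinates.
    Pictures may finish loading in any order; this key decides display order.
    """
    return sorted(view_positions, key=lambda p: (view_positions[p][1],
                                                 view_positions[p][0]))
```

A right-to-left variant, as mentioned below, would simply negate the x component of the key.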
In one possible implementation, the pictures at the 12 positions may be displayed in another order, such as in a top-to-bottom right-to-left order.
In one possible implementation, the pictures at the 12 positions are displayed in a certain sequence, and meanwhile, uniform transparency animation can be adopted for displaying.
The uniform transparency animation display means that, at the same instant, the alpha channel of all pictures is set to the same value for display. More specifically, a picture is generally represented in the RGB color space (obtained by varying and superimposing the three color channels red (R), green (G) and blue (B); this color space covers almost all colors perceptible to human vision), and a picture in an application generally adds an alpha channel on top of the RGB color space. The alpha channel is typically used as an opacity parameter: if a pixel's alpha value is 0%, the pixel is completely transparent (i.e., invisible), while a value of 100% means a completely opaque pixel (i.e., identical to the plain RGB pixel). Values between 0% and 100% let the pixel show through the background, as if through glass (translucency). The alpha value may be expressed as a percentage, as an integer, or, like the RGB parameters, as a real number from 0 to 1. In the embodiment of the application, the alpha channel of all pictures can be uniformly set to the same value at the same instant; for example, at a certain instant the alpha value of all pictures may be set to 0.5, so that all pictures have the same transparency at that instant. The value of the alpha channel is not limited in this embodiment of the present application.
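A uniform transparency (fade-in) animation of this kind can be sketched as follows, with the alpha value normalized to the range 0 to 1 (the function name and step count are illustrative assumptions):

```python
def uniform_alpha_frames(picture_ids, steps=5):
    """Compute one shared alpha value per animation step and apply it to
    every picture at the same instant: 0.0 is fully transparent and
    1.0 is fully opaque (identical to the plain RGB pixels)."""
    frames = []
    for i in range(steps + 1):
        alpha = i / steps                 # the same value for all pictures
        frames.append({pic: alpha for pic in picture_ids})
    return frames
```

At every step all pictures share the same alpha value, so the transparency remains uniform across the interface throughout the animation.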
Assuming that only a portion of the pictures in the interface 103 are loaded within the 350 milliseconds of one time period, the loaded pictures are displayed in a certain order. One possible scenario is illustrated by interface 103 transitioning to interface 501: only the large promotional picture at position 104 and the icon pictures at positions 105, 502, 503 and 504 have been loaded, and they are displayed in order from top to bottom and from left to right (see the vertical arrow line and the horizontal arrow line in interface 501), while the occupancy bitmap is still displayed at the remaining positions.
It should be noted that the order of loading the pictures at the 5 positions may be arbitrary, but the pictures are displayed in the order from top to bottom and from left to right.
In one possible implementation, the pictures at the 5 positions may be displayed in another order, such as in a top-to-bottom right-to-left order.
In one possible implementation, the pictures at the 5 positions are displayed in a certain sequence, and meanwhile, uniform transparency animation can be adopted for displaying.
When a further time period of 350 milliseconds has elapsed, i.e., when the time reaches 700 milliseconds, if the remaining pictures have also been loaded, the interface 501 changes to that shown by interface 510, and the remaining pictures are likewise displayed in a certain order. One possible scenario is to display them from top to bottom and from left to right; more specifically, the icon picture at position 505 is displayed first, then the icon picture at position 506, next the icon picture at position 507, and then the icon pictures at positions 508, 509, 106 and 107 are displayed in sequence.
It should be noted that the order of loading the pictures at the 7 positions may be arbitrary, but the pictures are displayed in the order from top to bottom and from left to right.
In one possible implementation, the pictures at the 7 positions may be displayed in another order, such as in a top-to-bottom right-to-left order.
In one possible implementation, the pictures at the 7 positions may be displayed in a certain order, and simultaneously, a uniform transparency animation may be used for displaying.
Assuming that when a further time period of 350 milliseconds has elapsed, i.e., when the time reaches 700 milliseconds, some of the remaining pictures are still not loaded (not shown in fig. 5), the loaded pictures may be displayed in a certain order, one possible case being from top to bottom and from left to right; the unloaded pictures continue to load in the background until loading completes within some later time period, and they are then displayed from top to bottom and from left to right at the end of that time period.
When a user refreshes pictures in an application, the method provided by the embodiment of the application can likewise display the pictures in a certain order. Referring to fig. 6, taking a mobile phone as an example of a terminal device, fig. 6 is a schematic diagram of the interface change when a user refreshes the gallery APP in the picture loading display method according to an embodiment of the present application.
When the user slides up in the interface 201 of the gallery APP of the mobile phone 100 to view earlier pictures, assuming that the time period is 350 milliseconds and all of the earlier pictures are loaded within 350 milliseconds, the interface 201 changes to interface 606: the pictures at positions 607, 602, 603, 608, 604, 605 and 609 are displayed in a certain order, one possible case being from top to bottom and from left to right (as indicated by the horizontal arrow line and the diagonal arrow line in interface 606).
It should be noted that the order of loading the pictures at the 7 positions may be arbitrary, but the pictures are displayed in the order from top to bottom and from left to right.
It should be noted that, in the scenario shown in fig. 6, the time period is set to 350 milliseconds; in some embodiments, the time period may also be set to other values, such as 300 milliseconds, that do not make the user obviously perceive the pictures as lagging.
In one possible implementation, the pictures at the 7 positions may be displayed in another order, such as in a top-to-bottom right-to-left order.
In one possible implementation, the pictures at the 7 positions may be displayed in a certain order, and simultaneously, a uniform transparency animation may be used for displaying.
Assuming only a portion of the pictures are loaded within 350 milliseconds, interface 201 may first transition to interface 601. In interface 601, the pictures at positions 602, 603, 604 and 605 have all been loaded and are then displayed in a certain order; one possible display order is from top to bottom and from left to right, i.e., the picture at position 602 is displayed first, then the picture at position 603, next the picture at position 604, and finally the picture at position 605. The pictures at the remaining positions of the interface 601 have not yet been loaded, so the occupancy bitmap is still displayed in their place.
It should be noted that the order of loading the pictures at the 4 positions may be arbitrary, but the pictures are displayed in the order from top to bottom and from left to right.
In one possible implementation, the pictures at the 4 positions may be displayed in another order, such as in a top-to-bottom right-to-left order.
In one possible implementation, the pictures at the 4 positions are displayed in a certain sequence, and meanwhile, uniform transparency animation can be adopted for displaying.
When the next 350 milliseconds have elapsed, i.e., 700 milliseconds have elapsed, the pictures at the remaining locations are also loaded, and interface 601 transitions to that shown by interface 606. In contrast to interface 601, the picture at position 607, the picture at position 608, and the picture at position 609 in interface 606 are displayed in a certain order, one possible display order being from top to bottom, left to right. More specifically, the picture at position 607 is displayed first, the picture at position 608 is displayed next, and the picture at position 609 is displayed last.
It should be noted that the order of loading the pictures at the 3 positions may be arbitrary, but the pictures are displayed in the order from top to bottom and from left to right.
In one possible implementation, the pictures at the 3 positions may be displayed in another order, such as in a top-to-bottom right-to-left order.
In one possible implementation, the pictures at the 3 positions may be displayed in a certain order, and simultaneously, a uniform transparency animation may be used for displaying.
Assuming that when a further time period of 350 ms has elapsed, i.e. when 700 ms has arrived, if the pictures at the remaining positions remain partially unloaded (not shown in fig. 6), the loaded pictures may be displayed in a certain order, one possible display order being from top to bottom and left to right; the unloaded pictures still continue to be loaded in the background until the loading is completed within a certain time period, and the pictures are displayed in the sequence from top to bottom and from left to right after the time period is finished.
It should be noted that the interface 606 is merely an exemplary scene of refreshing pictures in the gallery APP; the interface 606 may alternatively be presented in the form of other graphics or text, which is not limited in the embodiment of the present application. In addition, interface 606 may include more or fewer presentation elements, and the embodiment of the application is not limited in this regard either.
The technical scheme of the application is specifically described below with reference to specific embodiments. In the embodiment of the application, N pictures in a first interface of a first application are first obtained; view controls corresponding to the N pictures are then created in the first interface; at least one of the N pictures is loaded in a first time period; and after the first time period ends, the at least one picture is displayed in the first interface in a first order according to the position of the view control corresponding to the at least one picture. If the N pictures are not all loaded in the first time period, at least one remaining picture other than the at least one picture loaded in the first time period is loaded in a second time period, and after the second time period ends, the at least one remaining picture is displayed in the first interface in the first order according to the position of the view control corresponding to the at least one remaining picture. There may also be more time periods, until after a certain time period has elapsed all N pictures in the first interface have been displayed. The embodiment of the application can solve the problems of out-of-order flashing and a disordered visual effect during picture display.
Referring to fig. 7, fig. 7 shows a flowchart of a picture loading display method according to an embodiment of the present application. The picture loading and displaying method comprises the following steps:
S701, acquiring N pictures in a first interface of a first application.
Wherein N is a positive integer.
Alternatively, the first application may obtain the N pictures in the first interface of the first application from a server. For example, as shown in fig. 5, since the content in the application mall APP changes frequently, when a user opens the application mall APP at different times the pictures in the first interface are updated in real time, and the updated pictures need to be acquired from the server in real time. Of course, when the content in the first interface of the application mall APP is unchanged, the pictures can also be obtained from a local cache.
The first application may also obtain N pictures from the memory of the terminal device, such as the gallery APP shown in fig. 6, where the pictures taken or stored by the user are generally stored in the memory of the current terminal device, and when the user refreshes to view the pictures, the pictures may be directly obtained from the memory. Of course, if the user has previously selected to save the picture by uploading to the cloud server, the picture may also be obtained from the cloud server while refreshing the view picture.
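The two acquisition paths described above (local storage or cache when available, server otherwise) amount to a cache-first lookup; a minimal sketch, in which all names are illustrative assumptions:

```python
def acquire_pictures(picture_ids, local_store, fetch_from_server):
    """Resolve each picture from local storage (or cache) when available,
    falling back to the server for ids that are not present locally."""
    acquired = {}
    for pid in picture_ids:
        if pid in local_store:
            acquired[pid] = local_store[pid]     # memory / local cache hit
        else:
            data = fetch_from_server(pid)        # e.g. cloud server request
            local_store[pid] = data              # keep for the next refresh
            acquired[pid] = data
    return acquired
```

Only the pictures missing from local storage trigger a server request, which matches the behavior described for both the application mall APP and the gallery APP.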
S702, creating view controls corresponding to the N pictures in the first interface.
It should be noted that the first application may be any application that includes a picture, such as an application mall APP shown in interface 101 in fig. 5, and a gallery APP shown in interface 201 in fig. 6.
The view control is a control used for displaying a picture and can be used to set basic display attributes of the picture, such as scaling and aspect ratio. Before a picture is displayed, the view control corresponding to the picture needs to be created; after the picture is loaded, the picture can be displayed at the position of its corresponding view control.
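As a minimal stand-in for such a view control (the class and attribute names are assumptions for illustration, not a real UI-framework API):

```python
class PictureViewControl:
    """Records where a picture will be displayed plus basic display
    attributes, and shows the occupancy bitmap until the picture loads."""
    def __init__(self, position, scale=1.0, aspect_ratio=None):
        self.position = position          # (x, y) of the control in the interface
        self.scale = scale                # zoom factor applied to the picture
        self.aspect_ratio = aspect_ratio  # length-width ratio; None = native
        self.picture = None               # not loaded yet

    def content(self):
        # Until the picture is loaded, the occupancy bitmap is displayed.
        return self.picture if self.picture is not None else "occupancy-bitmap"

    def on_loaded(self, picture):
        self.picture = picture
```

The control exists, with a position, before any picture data arrives, which is why the interface can show placeholders immediately (see step S703 below in the original numbering).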
After the execution of step S702, an optional step S703 may be executed:
S703, displaying the occupancy bitmap at the position of the view control corresponding to the N pictures.
The occupancy bitmap is a background picture that is temporarily displayed in place of a picture.
Referring to interface 103 in fig. 5, all the pictures in interface 103 are being loaded in the background, and the occupancy bitmap is displayed at the positions of the view controls corresponding to all of the pictures. Illustratively, the occupancy bitmap in interface 103 is a pure-white background picture; the occupancy bitmap may also be a solid-color background picture of another color, a non-solid-color background picture with pattern content, or even a transparent blank picture.
S704, loading at least one picture in the N pictures in a first time period.
It should be appreciated that N pictures are acquired in step S701, and at least one of the acquired N pictures needs to be loaded into the first application in step S704 for subsequent display.
In the embodiment of the application, pictures are loaded for display according to time periods; the time period can be set to a value that does not make the user obviously perceive the pictures as lagging, and the value of the time period is not limited here.
It should be noted that, in the embodiment of the present application, the loading condition of the N pictures in the first interface of the first application may be monitored by a picture arrangement framework according to the time period. Illustratively, taking the Hongmeng (HarmonyOS) framework shown in fig. 4 as an example, the picture arrangement framework is located in the graphical image software service subsystem and can be used to monitor, per time period, the loading condition of the N pictures in the first interface of the first application.
More specifically, in one possible implementation manner, the picture arrangement framework may actively send an inquiry message to the first application according to a time period, where the inquiry message is used to inquire about loading conditions of N pictures in the first interface of the first application in the current time period; and the first application returns the loading condition of the N pictures in the first interface in the current time period to the picture arrangement frame.
In another possible implementation manner, each time a picture is loaded in the first interface of the first application in the current time period, the first application can actively send a message of the picture loading completion to the picture arrangement frame, and the picture arrangement frame records the loading condition of each picture in the current time period.
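The second implementation, in which the application pushes a load-completion message and the framework records it per time period, can be sketched as follows (class and method names are illustrative assumptions, not the actual framework API):

```python
class PictureArrangementFramework:
    """Records which pictures finished loading in each time period, based
    on load-completion messages pushed by the first application."""
    def __init__(self):
        self.period = 0
        self.loaded = {0: []}

    def on_picture_loaded(self, picture_id):
        # Called by the application each time a picture finishes loading.
        self.loaded[self.period].append(picture_id)

    def end_period(self):
        """Close the current time period and report what loaded during it,
        so those pictures can then be displayed in the first order."""
        finished = self.loaded[self.period]
        self.period += 1
        self.loaded[self.period] = []
        return finished
```

The first implementation differs only in direction: the framework would poll the application with a query message each period instead of receiving pushed messages.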
S705, after the first time period ends, displaying the at least one picture in the first interface in a first order according to the position of the view control corresponding to the at least one picture.
Specifically, the view controls corresponding to the N pictures in the first interface created in step S702 are placed into an image container, and then after the first time period is over, the image container displays the at least one picture in the first interface according to the first order according to the position of the view control corresponding to the at least one picture.
In one possible implementation, displaying the at least one picture in a first order in a first interface includes: and displaying the at least one picture in the first interface from top to bottom and from left to right.
It will be appreciated that the sequential display from top to bottom and left to right is arranged according to the visual viewing habits of most users, and that the sequential display may also include the remaining sequential display, such as the sequential display from top to bottom and right to left.
Additionally, optionally, in a possible implementation manner, while the at least one picture is displayed in the first interface in the first order, the method further includes: and displaying the at least one picture in the first interface by adopting a unified transparency animation, wherein the unified transparency animation comprises uniformly setting the value of an alpha channel of the at least one picture to be the same value at the same moment.
For example, assume the value range of the alpha channel of all the pictures in the first interface is normalized to 0 to 1; then the value of the alpha channel of all the pictures in the first interface is 0 at time t1, 0.2 at time t2, and 0.5 at time t3.
It should be understood that if all the N pictures are loaded in the first time period, that is, the at least one picture in step S704 refers to the N pictures, the N pictures are displayed in the first interface according to the first order in step S705 according to the positions of the view controls corresponding to the N pictures after the first time period is ended.
Specifically, the interface change regarding the execution of steps S704 to S705 may be changed directly to interface 510 with reference to interface 103 in fig. 5. Assuming that the time period is 350 ms and only 350 ms passes, that is, the first time period passes, all 12 pictures in the interface 103 are loaded, that is, the pictures from the uppermost position 104 to the lowermost position 107 in the interface 103 are loaded, the pictures from the uppermost position 104 to the lowermost position 107 in the interface 103 are displayed in the order from top to bottom and from left to right. As shown in interface 510, the black arrow lines in interface 510 indicate that all pictures in interface 510 are displayed in top-to-bottom left-to-right order.
Alternatively, the pictures at the uppermost position 104 to the lowermost position 107 in the interface 103 may be displayed in the order from top to bottom from left to right while also being animated with uniform transparency.
The order of loading the pictures from the uppermost position 104 to the lowermost position 107 in the interface 103 may be arbitrary. Illustratively, the icon picture at position 504 may be loaded first, then the icon picture at position 105, and so on, until the large promotional picture at position 104 is loaded last.
Taking fig. 6 as an example, when the user slides to view the picture in the application, the interface changes caused by the execution of steps S704 to S705 can be directly changed to the interface 606 with reference to the interface 201 in fig. 6. Assuming that the time period is 350 ms and the time has elapsed for only 350 ms, i.e., the first time period has elapsed, all of the pictures in interface 201 that need to be loaded have been loaded, i.e., the pictures at positions 607, 602, 603, 608, 604, 605 and 609 have been loaded, the pictures at the top left position 607 to the bottom right position 609 are displayed in top-to-bottom left-to-right order. As shown in interface 606, the black arrow lines in interface 606 indicate that the pictures in interface 606 are displayed in top-to-bottom left-to-right order.
It should be noted that the order in which the pictures at positions 607, 602, 603, 608, 604, 605 and 609 are loaded may be arbitrary.
It should be understood that if the N pictures are not all loaded in the first time period, that is, the number of the at least one picture in step S704 is less than N, after the first time period is finished in step S705, the at least one picture is displayed in the first interface according to the position of the view control corresponding to the at least one picture in the first order.
For example, referring to fig. 5, when the first application is opened, the interface change associated with the execution of steps S704 to S705 may be changed to interface 501 with reference to interface 103 in fig. 5. Assuming that the time period is 350 ms, and after the first time period is finished, that is, after 350 ms has elapsed, only part of the pictures in the interface 103 are loaded from time 0 to time 350 ms (within the first time period), that is, the pictures in the positions 104, 105, 502, 503 and 504 in the interface 103 are loaded, and the pictures in the rest positions are not loaded, the pictures in the positions 104, 105, 502, 503 and 504 are displayed in the order from top to bottom from left to right. The black arrow lines in the interface 501 indicate that the pictures in the interface 501 are displayed in order from top to bottom, left to right.
Alternatively, the pictures at positions 104, 105, 502, 503 and 504 may be animated with uniform transparency while being displayed in top-to-bottom left-to-right order.
It should be noted that the order of loading the pictures at the positions 104, 105, 502, 503 and 504 may be arbitrary. Illustratively, the icon picture at position 504 may be loaded first, then the icon picture at position 105, and so on, until the large promotional picture at position 104 is loaded last.
Taking fig. 6 as an example, when the user refreshes the picture in the application, the interface changes regarding the execution of steps S704 to S705 may be changed to the interface 601 with reference to the interface 201 in fig. 6. Assuming that the time period is 350 ms, and after the first time period is finished, that is, after 350 ms has elapsed, only a part of the pictures in the interface 201 are loaded, that is, the pictures in the positions 602, 603, 604 and 605 in the interface 201 are loaded, and the pictures in the rest positions are not loaded, the pictures in the positions 602, 603, 604 and 605 are displayed in order from top to bottom from left to right. The black arrow lines in the interface 601 indicate that the pictures in the interface 601 are displayed in the order from top to bottom from left to right.
Alternatively, the pictures at positions 602, 603, 604, and 605 may be displayed in top-to-bottom left-to-right order while also being animated with uniform transparency.
It should be noted that the order of loading the pictures at the positions 602, 603, 604 and 605 may be arbitrary, and the picture arrangement frame will display the pictures loaded in the time 0 to 350 ms (in the first time period) in the order from top to bottom and from left to right no matter what order the pictures are loaded.
Alternatively, if N pictures are not all loaded in the first time period, steps S706 and S707 may be performed.
S706, loading, in a second time period, at least one remaining picture of the N pictures other than the at least one picture.
S707, after the second time period ends, displaying the at least one remaining picture in the first interface in the first order according to the position of the view control corresponding to the at least one remaining picture.
Specifically, the view controls corresponding to the N pictures in the first interface created in step S702 are put into an image container; after the second time period ends, the image container displays the at least one remaining picture in the first interface in the first order according to the position of the view control corresponding to the at least one remaining picture.
Illustratively, the interface change brought about by performing steps S706 and S707 may be the change from interface 501 to interface 510 in fig. 5. Interface 501 is produced as follows: assuming that the time period is 350 milliseconds, only the pictures at positions 104, 105, 502, 503 and 504 in interface 103 are loaded within time 0 to 350 milliseconds (the first time period), and interface 501 is obtained after steps S704 and S705 are performed. If the pictures at positions 505, 506, 507, 508, 509, 106 and 107 in interface 501 are all loaded within 350 to 700 milliseconds (the second time period), then after this further time period has elapsed, i.e., at 700 milliseconds, the pictures at positions 505, 506, 507, 508, 509, 106 and 107 are displayed in order from top to bottom and from left to right. The black arrow lines in interface 510 indicate that the pictures in interface 510 are displayed in order from top to bottom and from left to right.
Alternatively, the pictures at positions 505, 506, 507, 508, 509, 106, and 107 may be displayed in top-down left-to-right order while also using uniform transparency animation.
It should be noted that the order in which the pictures at locations 505, 506, 507, 508, 509, 106, and 107 are loaded may be arbitrary.
Taking fig. 6 as an example, the interface change caused by performing steps S706 and S707 may be the change from interface 601 to interface 606 in fig. 6. Interface 601 is produced as follows: assuming a time period of 350 milliseconds, only the pictures at positions 602, 603, 604 and 605 in interface 601 are loaded within time 0 to 350 milliseconds (the first time period), and interface 601 is displayed after steps S704 and S705 are performed. If the pictures at positions 607, 608 and 609 in interface 601 are all loaded within 350 to 700 milliseconds (the second time period), then after this further time period has elapsed, the pictures at positions 607, 608 and 609 are displayed in order from top to bottom and from left to right. It should be noted that the order in which the pictures at positions 607, 608 and 609 are loaded may be arbitrary.
In the embodiment of the present application, there may be a third time period, a fourth time period, … and an Mth time period. Step a: loading, in the Mth time period, K pictures of the N pictures other than the pictures loaded in the previous M-1 time periods, where M is a positive integer and K is a positive integer. Step b: after the Mth time period ends, displaying the K pictures in the first interface in the first order according to the positions of the view controls corresponding to the K pictures. Steps a and b are repeated until the N pictures in the first interface have all been displayed after an Lth time period ends, where L is a positive integer.
Illustratively, still taking fig. 5 as an example, assume the following: the time period is still 350 milliseconds, and only the pictures at positions 104, 105, 502, 503 and 504 in interface 103 are loaded within time 0 to 350 milliseconds (the first time period), so interface 501 is obtained after steps S704 and S705 are performed. If the pictures at positions 505, 506, 507, 508, 509, 106 and 107 in interface 501 are only partially loaded within 350 to 700 milliseconds (the second time period), one possible case is that only the pictures at positions 505, 507 and 509 are loaded while the pictures at positions 506, 508, 106 and 107 are not yet loaded (not shown in fig. 5); then after 700 milliseconds (after the end of the second time period), the loaded pictures at positions 505, 507 and 509 are displayed in order from top to bottom and from left to right, and the pictures at positions 506, 508, 106 and 107 are still replaced with the occupancy bitmap.
If a further time period, i.e., a third time period, elapses and all of the pictures at positions 506, 508, 106 and 107 are loaded within 700 to 1050 milliseconds (the third time period), then after 1050 milliseconds (after the third time period ends) the loaded pictures at positions 506, 508, 106 and 107 are displayed in order from top to bottom and from left to right, and the process ends. If instead the pictures at positions 506 and 508 are loaded within 700 to 1050 milliseconds (the third time period) but the pictures at positions 106 and 107 are not, the pictures at positions 506 and 508 are displayed in order from top to bottom and from left to right after 1050 milliseconds, while the pictures at positions 106 and 107 continue to load until they are displayed after an Lth time period, and the process ends.
Taking fig. 6 as an example, assume the following: the time period is 350 milliseconds, and only the pictures at positions 602, 603, 604 and 605 in interface 601 are loaded within time 0 to 350 milliseconds (the first time period), so interface 601 is obtained after steps S704 and S705 are performed. If only a portion of the pictures at positions 607, 608 and 609 in interface 601 are loaded within 350 to 700 milliseconds (the second time period), one possible case is that only the pictures at positions 607 and 608 are loaded (not shown in fig. 6); the loaded pictures at positions 607 and 608 are then displayed in order from top to bottom and from left to right after 700 milliseconds (after the second time period). If a further time period, i.e., a third time period, elapses and the picture at position 609 finishes loading within 700 to 1050 milliseconds (the third time period), the picture at position 609 is displayed after 1050 milliseconds (after the third time period ends), and the process ends; if the picture at position 609 is not loaded within 700 to 1050 milliseconds (the third time period), loading continues until the picture at position 609 is displayed after an Lth time period, and the process ends.
It will be appreciated that the terminal described above, and similar devices, may comprise hardware structures and/or software modules that perform the respective functions in order to achieve the functions described above. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or as combinations of hardware and computer software. Whether a function is implemented as hardware or as computer-software-driven hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The embodiment of the present application may divide the terminal device and the like into functional modules according to the above method example; for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated modules may be implemented in hardware or as software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is illustrative and is merely a division by logical function; other division manners may be used in actual implementation.
In the case of an integrated unit, fig. 8 shows a schematic diagram of one possible configuration of the terminal device involved in the above-described embodiment. The terminal device 800 includes: a storage module 801, a processing module 802, and a transceiver module 803. The processing module 802 is configured to control and manage the actions of the terminal device 800; for example, the processing module 802 is configured to support the terminal device 800 in performing the processes S701, S702, S704, and S706 in fig. 7, and/or other processes for the techniques described herein. The transceiver module 803 is configured to support information communication of the terminal device 800. The storage module 801 is configured to store the program code and data of the terminal device 800.
The processing module 802 may be a processor or a controller, such as a central processing unit (Central Processing Unit, CPU), a general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application-Specific Integrated Circuit, ASIC), a field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or perform the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination that performs computing functions, e.g., a combination comprising one or more microprocessors, or a combination of a DSP and a microprocessor. The transceiver module 803 may be a transceiver, a transceiver circuit, a communication interface, or the like, and the storage module 801 may be a memory.
The processing module 802 may be implemented by the processor 902 shown in fig. 9, and the transceiver module 803 may be implemented by the transceiver 903 shown in fig. 9. In particular, the functions of the processing module 802 may be realized by the processor 902 executing a computer program stored in the memory 901. Alternatively, the memory may be a storage unit in a chip, such as a register or a cache, or it may be a storage unit located outside the chip in the computer device, such as the memory 901 shown in fig. 9.
The steps of a method or algorithm described in connection with the present disclosure may be embodied in hardware, or may be embodied in software instructions executed by a processor. The software instructions may be composed of corresponding software modules, which may be stored in random access memory (Random Access Memory, RAM), flash memory, read-only memory (Read Only Memory, ROM), erasable programmable read-only memory (Erasable Programmable ROM, EPROM), electrically erasable programmable read-only memory (Electrically Erasable Programmable ROM, EEPROM), registers, a hard disk, a removable disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC.
Those skilled in the art will appreciate that, in one or more of the examples described above, the functions described in the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, these functions may be stored on, or transmitted over, a computer-readable medium as one or more instructions or code. Computer-readable media include both computer storage media and communication media, the latter including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
The foregoing embodiments are merely intended to illustrate the technical solutions of the present application in further detail, and are not to be construed as limiting the scope of the application; any modification, equivalent replacement, improvement, or the like made on the basis of the technical solutions of the present application shall fall within the protection scope of the present application.