
CN115237297B - Method for displaying schedule and related device - Google Patents

Method for displaying schedule and related device

Info

Publication number: CN115237297B
Application number: CN202211146732.2A
Authority: CN (China)
Prior art keywords: schedule, displaying, user, card, mark frame
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN115237297A
Inventor: 鲍新彤
Current Assignee: Honor Device Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Honor Device Co Ltd
Legal events: application filed by Honor Device Co Ltd; priority to CN202211146732.2A; publication of CN115237297A; application granted; publication of CN115237297B; status active; anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on GUIs based on specific properties of the displayed interaction object, using icons
    • G06F 3/0484 Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G06F 3/0486 Drag-and-drop
    • G06F 3/0487 Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a method for displaying schedules and a related device, which help a user quickly locate a schedule the user wishes to view in a calendar application and improve user experience. The method comprises: displaying a day view of a target date in a calendar application, the day view including at least one schedule arranged in chronological order; in response to an operation of a user dragging a first schedule in the at least one schedule, displaying a mark frame in the day view, wherein the mark frame is used for highlighting a schedule selected by the user from the at least one schedule; and in response to an operation of the user releasing the first schedule in the mark frame, displaying the first schedule in the mark frame.

Description

Method for displaying schedule and related device
Technical Field
The present application relates to the field of terminals, and in particular, to a method for displaying a schedule and a related device.
Background
Currently, the pace of people's lives keeps increasing, and a variety of events may need to be handled every day. To keep track of daily itineraries, the user can record important events, such as booked train tickets, airline tickets, and meeting times and addresses, by adding schedules to a calendar application on the terminal device.
At present, a calendar application sorts a plurality of schedules according to their start times. When many schedules are displayed in the day view of a certain day in the calendar application, the user spends considerable time searching for a particular schedule, cannot quickly locate the schedule the user wishes to view, and the user experience is poor.
Disclosure of Invention
The application provides a method for displaying schedules and a related device, which help a user quickly locate a schedule the user wishes to view in a calendar application and improve user experience.
In a first aspect, a method for displaying a schedule is provided, the method comprising: displaying a day view of a target date in a calendar application, the day view including at least one schedule arranged in chronological order; in response to an operation of a user dragging a first schedule in the at least one schedule, displaying a mark frame in the day view, wherein the mark frame is used for displaying a schedule selected by the user from the at least one schedule; and in response to an operation of the user releasing the first schedule in the mark frame, displaying the first schedule in the mark frame.
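For illustration only, the interaction flow of the first aspect can be modeled with the following Kotlin sketch; it is not the patent's implementation, and names such as DayViewController and markFrame are hypothetical.

```kotlin
// Hypothetical sketch of the first-aspect interaction flow; not the actual implementation.
data class Schedule(val id: Long, val title: String, val startMinutes: Int, val endMinutes: Int)

class DayViewController(private val schedules: MutableList<Schedule>) {
    val markFrame = mutableListOf<Schedule>()   // schedules pinned by the user
    var markFrameVisible = false                // shown once a drag starts

    // Called when the user starts dragging a schedule in the day view.
    fun onDragStart(schedule: Schedule) {
        if (schedule in schedules) markFrameVisible = true  // display the mark frame in the day view
    }

    // Called when the user releases the dragged schedule; droppedInFrame is true
    // if the release position falls inside the mark frame's bounds.
    fun onDragRelease(schedule: Schedule, droppedInFrame: Boolean) {
        if (droppedInFrame && schedule !in markFrame) {
            markFrame.add(schedule)             // display the schedule in the mark frame
        }
    }
}

fun main() {
    val meeting = Schedule(1, "Conference one", 8 * 60, 9 * 60)
    val controller = DayViewController(mutableListOf(meeting))
    controller.onDragStart(meeting)
    controller.onDragRelease(meeting, droppedInFrame = true)
    println(controller.markFrame.map { it.title })  // [Conference one]
}
```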
The method for displaying the schedule is described by taking a system calendar in the terminal device as an example. It is understood that the method for displaying schedules of the present application is also applicable to schedule setting functions provided in other applications or services.
In a practical application, a user can tap the icon of the calendar application on the desktop to enter the calendar interface, and select a target date on the calendar interface to display the day view of that date. The day view may display at least one schedule of the user on the target date, sorted by the start time of each schedule. Typically, the user can create a schedule manually in the calendar application, or the calendar application can synchronize schedule information from other applications. For example, the calendar application synchronizes a schedule set in a mailbox application, or synchronizes to-do items entered by the user in a memo.
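As a minimal sketch under assumptions (the field names and the Source values are illustrative, not the patent's definitions), a schedule record synchronized from different sources might carry its source application alongside the start time used for chronological sorting in the day view:

```kotlin
// Illustrative data model only; field names are assumptions, not the patent's definitions.
enum class Source { MANUAL, MAILBOX, MEMO, SMS, THIRD_PARTY }

data class Schedule(
    val title: String,
    val startMinutes: Int,   // start time as minutes from midnight, used for sorting
    val endMinutes: Int,
    val source: Source       // application that provided the schedule information
)

fun sortForDayView(schedules: List<Schedule>): List<Schedule> =
    schedules.sortedBy { it.startMinutes }   // the day view orders schedules by start time

fun main() {
    val day = listOf(
        Schedule("Train trip", 9 * 60 + 30, 12 * 60, Source.SMS),
        Schedule("Conference one", 8 * 60, 9 * 60, Source.MAILBOX)
    )
    println(sortForDayView(day).map { it.title })  // [Conference one, Train trip]
}
```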
Based on the technical solution of the application, the terminal device can provide a mark frame for the user in the day view, and the user can add the schedules that the user cares about to the mark frame for display. In this way, the user can view the schedules of interest directly in the mark frame without browsing and searching through the at least one schedule, so the schedule the user wishes to view can be quickly located and user experience is improved.
With reference to the first aspect, in certain implementations of the first aspect, displaying the first schedule in the mark frame includes: displaying the first schedule in the form of a schedule card in the mark frame.
In this application, the terminal device can present schedules in multiple forms, and the card-style design makes schedule information more orderly, which helps improve the speed and comfort with which the user views a schedule.
With reference to the first aspect, in some implementation manners of the first aspect, the schedule card is provided with a background color, the background color of the schedule card of the first schedule is determined according to an icon color of a source application of the first schedule, and the source application is an application that provides schedule information for generating the first schedule to the calendar application.
In the application, the schedule card is provided with a background color, and the background color of the card is clearly distinguished from the background color of the page, so that the user can easily perceive the schedule card visually and view it clearly.
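The patent does not specify how the icon color of the source application is turned into a card background color; the following Kotlin sketch shows one possible approach, averaging the icon's pixels and blending the result toward white so the card stands out from the page background. The function name and the blending ratio are assumptions.

```kotlin
// One possible way (an assumption, not the patent's method) to derive a card background
// color from a source application's icon: average the icon pixels, then lighten the result.
fun cardBackgroundFromIcon(iconPixels: IntArray): Int {
    var r = 0L; var g = 0L; var b = 0L
    for (p in iconPixels) {
        r += (p shr 16) and 0xFF
        g += (p shr 8) and 0xFF
        b += p and 0xFF
    }
    val n = iconPixels.size.coerceAtLeast(1)
    // Blend the average channel value with white (70% white) for a light card background.
    fun lighten(c: Long) = ((c / n) * 3 + 255 * 7) / 10
    return (0xFF shl 24) or (lighten(r).toInt() shl 16) or (lighten(g).toInt() shl 8) or lighten(b).toInt()
}

fun main() {
    val icon = intArrayOf(0xFF2196F3.toInt(), 0xFF1976D2.toInt())  // two blue pixels
    println(Integer.toHexString(cardBackgroundFromIcon(icon)))      // a light blue ARGB value
}
```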
With reference to the first aspect, in certain implementations of the first aspect, the mark frame displays a plurality of schedule cards, and the plurality of schedule cards includes the schedule card of the first schedule. The method further comprises: in response to an operation of the user pressing any schedule card in the plurality of schedule cards, displaying a first control on each schedule card in the plurality of schedule cards, wherein the first control is used for adjusting the display positions of at least some of the plurality of schedule cards; in response to an operation of the user pressing the first control on the schedule card of the first schedule and dragging the schedule card of the first schedule, controlling the schedule card of the first schedule to move to a target display position, wherein the target display position is a display position of a schedule card other than the schedule card of the first schedule; and in response to an operation of the user releasing the schedule card of the first schedule at the target display position, displaying the schedule card of the first schedule at the target display position.
In this application, the terminal device can adjust the ordering of the plurality of schedule cards in the mark frame in response to user operations, so that the user has richer operability in the mark frame and can customize the display positions of the plurality of schedule cards in the mark frame.
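A hedged sketch of the card re-ordering behavior described above is shown below; the class and method names are assumptions, and a real implementation would drive this from touch events rather than direct calls.

```kotlin
// Assumed-name sketch of re-ordering schedule cards inside the mark frame: long-pressing
// any card reveals a drag handle (the "first control"), and dropping a card on another
// card's slot moves it to that display position.
class MarkFrame(private val cards: MutableList<String>) {
    var handlesVisible = false
        private set

    fun onLongPress() { handlesVisible = true }   // show the first control on every card

    // Move the dragged card to the target display position (another card's slot).
    fun onDropAt(card: String, targetIndex: Int) {
        val from = cards.indexOf(card)
        if (!handlesVisible || from < 0 || targetIndex !in cards.indices || from == targetIndex) return
        cards.removeAt(from)
        cards.add(targetIndex, card)
    }

    fun order(): List<String> = cards.toList()
}

fun main() {
    val frame = MarkFrame(mutableListOf("Conference one", "Train trip", "Conference two"))
    frame.onLongPress()
    frame.onDropAt("Conference two", 0)
    println(frame.order())  // [Conference two, Conference one, Train trip]
}
```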
With reference to the first aspect, in certain implementations of the first aspect, displaying the first schedule in the mark frame includes: displaying a time axis in the mark frame, wherein the first schedule is displayed on the time axis.
In this application, the terminal device can add a time axis in the mark frame and display the schedules selected by the user on this newly added time axis in order of their start times, so that the user can quickly locate the schedule to be viewed in the mark frame and the time order of the schedules is clear at a glance.
With reference to the first aspect, in certain implementations of the first aspect, after the mark frame displays the first schedule, the method further includes: in response to a sliding operation of the user on the first schedule in the mark frame, displaying a processing state of the first schedule, wherein the processing state comprises one or more of the following: not started, ongoing, or finished.
In this application, the terminal device displays the processing state of the first schedule in response to a sliding operation. This manner is simple and easy to implement, so the user can quickly learn the current processing state of the schedule.
With reference to the first aspect, in certain implementations of the first aspect, when the processing state of the first schedule is not started, displaying the processing state of the first schedule includes: displaying the duration from the current time to the start time of the first schedule.
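For illustration, the processing state shown after the sliding operation, and the remaining duration displayed when the schedule has not started, could be computed as in the following sketch (names and the minute-based representation are assumptions):

```kotlin
// Illustrative only: computing the processing state that the patent describes showing after
// a sliding operation on a schedule card, plus the remaining time when the schedule has not started.
enum class ProcessingState { NOT_STARTED, ONGOING, FINISHED }

data class StateInfo(val state: ProcessingState, val minutesUntilStart: Int? = null)

fun processingState(nowMinutes: Int, startMinutes: Int, endMinutes: Int): StateInfo = when {
    nowMinutes < startMinutes -> StateInfo(ProcessingState.NOT_STARTED, startMinutes - nowMinutes)
    nowMinutes < endMinutes   -> StateInfo(ProcessingState.ONGOING)
    else                      -> StateInfo(ProcessingState.FINISHED)
}

fun main() {
    // Now 7:30, schedule runs 8:00 to 9:00: not started, 30 minutes until the start time.
    println(processingState(7 * 60 + 30, 8 * 60, 9 * 60))
}
```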
With reference to the first aspect, in certain implementations of the first aspect, the at least one schedule further includes a second schedule, and conflict marks are displayed in the first schedule and the second schedule, where the conflict marks are used to indicate that the start-stop period of the first schedule overlaps with the start-stop period of the second schedule. After the mark frame displays the first schedule, the method further comprises: canceling the conflict marks of the first schedule and the second schedule.
In the present application, after the terminal device displays the first schedule in the mark frame, the time conflict between the first schedule and the other schedules of the day no longer needs to be considered, so the terminal device can cancel the conflict marks of the first schedule and the second schedule and thereby resolve the conflict among the plurality of schedules.
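A minimal sketch of the logic implied here is given below: two schedules conflict when their start-stop periods overlap, and a schedule placed in the mark frame is excluded from conflict marking. The function names and the use of titles as identifiers are assumptions.

```kotlin
// Sketch under assumptions: schedules conflict when their start-stop periods overlap, and a
// schedule that the user has placed in the mark frame does not participate in conflict marking.
data class Schedule(val title: String, val startMinutes: Int, val endMinutes: Int)

fun overlaps(a: Schedule, b: Schedule): Boolean =
    a.startMinutes < b.endMinutes && b.startMinutes < a.endMinutes

fun conflictMarked(daySchedules: List<Schedule>, pinned: Set<String>): Set<String> {
    val marked = mutableSetOf<String>()
    val candidates = daySchedules.filter { it.title !in pinned }  // pinned schedules are excluded
    for (i in candidates.indices) for (j in i + 1 until candidates.size) {
        if (overlaps(candidates[i], candidates[j])) {
            marked += candidates[i].title
            marked += candidates[j].title
        }
    }
    return marked
}

fun main() {
    val d = Schedule("Conference one", 8 * 60, 9 * 60)
    val e = Schedule("Conference two", 8 * 60 + 30, 9 * 60 + 30)
    println(conflictMarked(listOf(d, e), pinned = emptySet()))      // both marked as conflicting
    println(conflictMarked(listOf(d, e), pinned = setOf(d.title)))  // no conflict marks remain
}
```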
With reference to the first aspect, in certain implementations of the first aspect, the method further includes: controlling the first schedule to move out of the mark frame in response to an operation of the user dragging the first schedule out of the mark frame; and canceling the display of the first schedule in the mark frame in response to an operation of the user releasing the first schedule outside the mark frame.
In the present application, the terminal device may move the first schedule out of the mark frame in response to an operation by the user, without displaying the first schedule in the mark frame. Therefore, the operability of the user in the mark frame is richer, and richer interaction experience is brought to the user.
With reference to the first aspect, in certain implementations of the first aspect, the day view includes a schedule view, the schedule view is used for displaying the at least one schedule, and the first schedule is displayed in the schedule view while being displayed in the mark frame.
In this application, the first schedule can remain displayed in the schedule view, so the terminal device does not need to re-sort the at least one schedule in the schedule view, which reduces the power consumption of the terminal device.
With reference to the first aspect, in certain implementations of the first aspect, after the mark frame displays the first schedule, the method further includes: displaying the first schedule in a folded form in the schedule view; and, after canceling the display of the first schedule in the mark frame, the method further includes: unfolding and displaying the folded part of the first schedule in the schedule view.
In this application, after the mark frame displays the first schedule, the terminal device displays the first schedule in folded form in the schedule view, which saves interface space for displaying more of the other schedules. After the terminal device cancels the display of the first schedule in the mark frame, the terminal device may unfold the folded part of the first schedule in the schedule view and display more content about the first schedule.
With reference to the first aspect, in certain implementations of the first aspect, the marker box is located above the schedule view.
In this application, the mark frame is located above the schedule view, where it is more prominent, so the schedule selected by the user is displayed in a conspicuous position and the reminding effect is more noticeable.
In a second aspect, there is provided an apparatus for displaying a schedule, comprising: for performing the method of any one of the possible implementations of the first aspect described above. In particular, the apparatus comprises means for performing the method of any one of the possible implementations of the first aspect described above.
In a third aspect, there is provided another apparatus for displaying a schedule, including a processor coupled to a memory, the memory operable to store a computer program, and the processor operable to invoke and execute the computer program in the memory to implement the method of any one of the possible implementations of the first aspect.
In one implementation, the device for displaying schedules is a terminal device. When the means for displaying the schedule is a terminal device, the communication interface may be a transceiver, or an input/output interface.
In another implementation, the schedule display device is a chip configured in the terminal device. When the schedule display device is a chip configured in the terminal equipment, the communication interface may be an input/output interface.
In a fourth aspect, a processor is provided, comprising: input circuit, output circuit and processing circuit. The processing circuit is configured to receive a signal via the input circuit and transmit a signal via the output circuit, so that the processor performs the method of any one of the possible implementations of the first aspect.
In a specific implementation process, the processor may be a chip, the input circuit may be an input pin, the output circuit may be an output pin, and the processing circuit may be a transistor, a gate circuit, a flip-flop, various logic circuits, and the like. The input signal received by the input circuit may be received and input by, for example and without limitation, a receiver, the signal output by the output circuit may be output to and transmitted by a transmitter, for example and without limitation, and the input circuit and the output circuit may be the same circuit that functions as the input circuit and the output circuit, respectively, at different times. The present application is not limited to the specific implementation of the processor and various circuits.
In a fifth aspect, a processing apparatus is provided that includes a processor and a memory. The processor is configured to read instructions stored in the memory, and may receive signals via the receiver and transmit signals via the transmitter to perform the method of any one of the possible implementations of the first aspect.
Optionally, there are one or more processors and one or more memories.
Alternatively, the memory may be integrated with the processor, or provided separately from the processor.
In a specific implementation process, the memory may be a non-transitory (non-transitory) memory, such as a Read Only Memory (ROM), which may be integrated on the same chip as the processor, or may be separately disposed on different chips, and the type of the memory and the arrangement manner of the memory and the processor are not limited in this application.
It will be appreciated that the associated data interaction process, for example sending the indication information, may be a process of the processor outputting the indication information, and receiving the capability information may be a process of the processor receiving the input capability information. In particular, the data output by the processor may be output to a transmitter, and the input data received by the processor may come from a receiver. The transmitter and the receiver may be collectively referred to as a transceiver.
The processing device in the fifth aspect may be a chip, the processor may be implemented by hardware or software, and when implemented by hardware, the processor may be a logic circuit, an integrated circuit, or the like; when implemented in software, the processor may be a general-purpose processor implemented by reading software code stored in a memory, which may be integrated with the processor, located external to the processor, or stand-alone.
In a sixth aspect, there is provided a computer program product comprising: computer program code which, when executed, causes a computer to perform the method of any of the possible implementations of the first aspect described above.
In a seventh aspect, a computer-readable storage medium is provided, which stores a computer program that, when executed, causes a computer to perform the method of any one of the possible implementations of the first aspect.
Drawings
Fig. 1 is a schematic structural diagram of a terminal device to which an embodiment of the present application is applicable;
fig. 2 is a block diagram of a software structure of a terminal device to which the embodiment of the present application is applicable;
fig. 3 to fig. 8 are schematic views of a schedule display interface provided in an embodiment of the present application;
FIG. 9 is a schematic flow chart diagram illustrating a method for displaying a schedule according to an embodiment of the present application;
fig. 10 is a schematic block diagram of an apparatus for displaying a schedule according to an embodiment of the present application.
Detailed Description
The technical solution in the present application will be described below with reference to the accompanying drawings.
In order to facilitate clear description of technical solutions of the embodiments of the present application, some terms referred to in the embodiments of the present application are introduced below.
In the embodiments of the present application, the words "first", "second", and the like are used to distinguish the same items or similar items having substantially the same functions and actions. For example, the first schedule and the second schedule are for distinguishing different schedules, and the order of the schedules is not limited. Those skilled in the art will appreciate that the terms "first," "second," etc. do not denote any order or quantity, nor do the terms "first," "second," etc. denote any order or importance.
It should be noted that in the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate examples, illustrations or explanations. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
Further, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, and c, may represent: a, or b, or c, or a and b, or a and c, or b and c, or a, b and c, wherein a, b and c can be single or multiple.
Fig. 1 is a schematic structural diagram of a terminal device to which the embodiment of the present application is applied. As shown in fig. 1, the terminal device 100 may include: the mobile terminal includes a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. It is to be understood that the illustrated structure of the present embodiment does not constitute a specific limitation to the terminal device 100. In other embodiments of the present application, terminal device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, a Display Processing Unit (DPU), and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. In some embodiments, terminal device 100 may also include one or more processors 110. The processor may be, among other things, a neural center and a command center of the terminal device 100. The processor can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution. A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. This avoids repeated accesses, reduces the latency of the processor 110 and thus increases the efficiency of the terminal device 100.
In some embodiments, processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a USB interface, etc. The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal device 100, and may also be used to transmit data between the terminal device 100 and a peripheral device. It can also be used to connect earphones and play audio through the earphones.
It should be understood that the interface connection relationship between the modules illustrated in the embodiment of the present application is an illustrative description, and does not limit the structure of the terminal device 100. In other embodiments of the present application, the terminal device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in terminal device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier, etc. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the terminal device 100, including Wireless Local Area Networks (WLAN), bluetooth, global Navigation Satellite System (GNSS), frequency Modulation (FM), NFC, infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the terminal device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the terminal device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include GSM, GPRS, CDMA, WCDMA, TD-SCDMA, LTE, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The terminal device 100 can implement a display function by the GPU, the display screen 194, the application processor, and the like. The application processor may include an NPU and/or a DPU. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute instructions to generate or change display information. The NPU is a neural-network (NN) computing processor, which processes input information quickly by referring to a biological neural network structure, for example, by referring to a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the terminal device 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like. The DPU is also referred to as a display sub-system (DSS) and is configured to adjust the color of the display screen 194, which may be adjusted via a Look Up Table (LUT) for three-dimensional (3 d) colors. The DPU may also perform scaling, noise reduction, contrast enhancement, backlight brightness management, high dynamic range imaging (HDR) processing, display parameter Gamma adjustment, and the like on the picture.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro OLED, or a quantum dot light-emitting diode (QLED). In some embodiments, the terminal device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The terminal device 100 may implement a photographing function through the ISP, one or more cameras 193, a video codec, a GPU, one or more display screens 194, and an application processor, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, data files such as music, photos, videos, and the like are saved in the external memory card.
Internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may cause the terminal device 100 to execute various functional applications, data processing, and the like by executing the above-described instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. Wherein, the storage program area can store an operating system; the storage area may also store one or more application programs (e.g., gallery, contacts, etc.), etc. The storage data area may store data (such as photos, contacts, etc.) created during use of the terminal device 100, and the like. In addition, the internal memory 121 may include a high speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a Universal Flash Storage (UFS), and the like. In some embodiments, the processor 110 may cause the terminal device 100 to execute various functional applications and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor 110.
The terminal device 100 may implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. The audio module 170 is configured to convert digital audio information into an analog audio signal for output, and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110. The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The terminal device 100 can play music or conduct a hands-free call through the speaker 170A. The receiver 170B, also called an "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the terminal device 100 answers a call or voice information, the voice can be heard by bringing the receiver 170B close to the ear. The microphone 170C, also referred to as a "mike" or "mic", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking near it. The terminal device 100 may be provided with at least one microphone 170C. In other embodiments, the terminal device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the terminal device 100 may include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions. The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The software system of the terminal device 100 may adopt a hierarchical architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. In the embodiment of the present application, an Android (Android) system with a layered architecture is taken as an example to exemplarily illustrate a software structure of the terminal device 100.
Fig. 2 is a block diagram of a software structure of a terminal device to which the embodiment of the present application is applied. The layered architecture divides the software system of the terminal device 100 into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system may be divided into an application layer (APP), an application framework layer (application framework), an Android runtime (Android runtime) and system library, a Hardware Abstraction Layer (HAL), and a kernel layer (kernel). In some embodiments, the terminal device 100 also includes hardware (e.g., a camera).
The application layer may include a series of application packages, and the application layer runs the application by calling an Application Programming Interface (API) provided by the application framework layer. As shown in fig. 2, the application packages may include camera, calendar, map, phone, music, WLAN, bluetooth, video, social, gallery, navigation, short message, etc. applications.
The application framework layer provides an API and programming framework for the applications of the application layer. The application framework layer includes a number of predefined functions. As shown in FIG. 2, the application framework layers may include a window manager, a content provider, an explorer, a notification manager, a view system, a phone manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
Content providers are used to store and retrieve data and make it accessible to applications. The data may include video images, audio, calls made and received, browsing history and bookmarks, phone books, etc. The view system includes visual controls such as controls to display text, controls to display pictures, and the like.
The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide the communication function of the terminal device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources, such as localized strings, icons, pictures, layout files, video files, etc., to the application.
The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a short dwell, and does not require user interaction. Such as a notification manager used to inform download completion, message alerts, etc. The notification manager may also be a notification that appears in the form of a chart or scroll bar text at the top status bar of the system, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, a text message is prompted in the status bar, a prompt tone is sounded, the terminal device 100 vibrates, an indicator lamp blinks, and the like.
The android runtime includes a core library and a virtual machine. The android runtime is responsible for scheduling and managing the android system. The core library comprises two parts: one part is the functions that the java language needs to call, which are used by the java API framework, and the other part is the core library of android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection. The system library may include a plurality of functional modules, for example a surface manager, media libraries, a three-dimensional graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL).
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications. The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, composition, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The hardware abstraction layer is an abstraction interface driven by the device kernel, and provides an application program interface for accessing the bottom device to a java API framework with a higher level. The hardware abstraction layer may include a plurality of library modules, e.g., display modules, audio modules, bluetooth modules, camera modules, etc., each of which may implement an interface for a particular type of hardware component. When the framework API requires access to the device hardware, the Android system will load the library module for that hardware component.
The kernel layer is a layer between hardware and software. The kernel layer is used for driving hardware so that the hardware works. The kernel layer at least includes a display driver, an audio driver, a bluetooth driver, a camera driver, and the like, which is not limited in the embodiment of the present application.
The terminal device of the embodiment of the present application may be a handheld device, a vehicle-mounted device, or the like having a wireless connection function, and may also be referred to as a terminal (terminal), a user equipment (UE), a mobile station (MS), a mobile terminal (MT), or the like. Currently, some examples of terminal devices are: a mobile phone (mobile phone), a tablet computer (Pad), a smart television, a notebook computer, a handheld computer, a mobile internet device (MID), a virtual reality (VR) device, an augmented reality (AR) device, a wireless terminal in industrial control (industrial control), a wireless terminal in unmanned driving (self driving), a wireless terminal in remote surgery (remote medical supply), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), a cellular phone, a cordless phone, a session initiation protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA), a handheld device with a wireless communication function, a computing device or other processing device connected to a wireless modem, a vehicle-mounted device, a wearable device, a terminal device in a 5G network, a terminal device in a future evolved public land mobile network (PLMN), and the like.
By way of example and not limitation, in the embodiments of the present application, the terminal device may also be a wearable device. A wearable device, also called a wearable smart device, is a general term for devices that can be worn daily, such as glasses, gloves, watches, clothing, and shoes, designed and developed by applying wearable technology. A wearable device is a portable device that is worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not only a hardware device, but also realizes powerful functions through software support, data interaction, and cloud interaction. In a broad sense, wearable smart devices include devices that are full-featured and large-sized and can implement all or some functions without relying on a smart phone, such as smart watches or smart glasses, as well as devices that focus on only a certain type of application function and need to be used in cooperation with other devices such as smart phones, for example various smart bracelets for vital-sign monitoring, smart jewelry, and the like.
It should be understood that in the embodiment of the present application, the terminal device may be an apparatus for implementing a function of the terminal device, or may be an apparatus capable of supporting the terminal device to implement the function, such as a chip system, and the apparatus may be installed in the terminal. In the embodiment of the present application, the chip system may be composed of a chip, and may also include a chip and other discrete devices.
The terminal device in the embodiment of the present application may also be referred to as: user Equipment (UE), mobile Station (MS), mobile Terminal (MT), access terminal, subscriber unit, subscriber station, mobile station, remote terminal, mobile device, user terminal, wireless communication device, user agent, or user device, etc.
At present, a calendar application sorts a plurality of schedules according to their start times. When the user needs to handle multiple events on a certain day, the calendar application displays many schedules, so the user spends considerable time looking for a particular schedule, cannot quickly locate the schedule the user wishes to view, and the user experience is poor.
In view of this, an embodiment of the present application provides a method for displaying schedules and a related device: the terminal device may add and display a mark frame in the day view according to a dragging operation of the user on a schedule in the day view, and display the schedule selected by the user in the mark frame. In this way, the user can see the selected schedule in the mark frame at a glance, which saves the time the user would spend looking through the many schedules in the schedule view, enables quick location of the schedule the user wishes to view, and improves user experience.
The method for displaying the schedule of the embodiment of the present application is exemplarily described below in connection with an interface of a calendar application.
Fig. 3 is a schematic view of a schedule display interface according to an embodiment of the present disclosure.
Referring to the a interface in fig. 3, the terminal device displays a day view of the target date September 10, 2022. The day view includes a schedule view for displaying at least one schedule and a date view for displaying the date.
In this example, the at least one schedule displayed in the schedule view includes schedule A, schedule B, schedule C, schedule D, schedule E, schedule F, and schedule G. The terminal device arranges the at least one schedule in order on a time axis in the calendar application according to the start time of each schedule. Schedule A, schedule B, and schedule C are all-day schedules and are placed near the front of the time axis, and the remaining schedules D, E, F, and G are arranged on the time axis in order of start time after the all-day schedules. It should be understood that the current interface displays only the seven schedules from schedule A to schedule G; limited by the size of the terminal device screen, more schedules may be hidden, and the user can slide up in the schedule view to display them.
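The ordering described here, all-day schedules first and the remaining schedules by start time, could be expressed with a comparator such as the following illustrative Kotlin sketch (field names assumed):

```kotlin
// Illustrative comparator only (field names assumed): all-day schedules are placed ahead
// of timed schedules, and timed schedules are ordered by their start time.
data class Schedule(val title: String, val allDay: Boolean, val startMinutes: Int)

val dayViewOrder = compareByDescending<Schedule> { it.allDay }.thenBy { it.startMinutes }

fun main() {
    val schedules = listOf(
        Schedule("Schedule F (train trip)", allDay = false, startMinutes = 9 * 60 + 30),
        Schedule("Schedule A", allDay = true, startMinutes = 0),
        Schedule("Schedule D (conference one)", allDay = false, startMinutes = 8 * 60)
    )
    println(schedules.sortedWith(dayViewOrder).map { it.title })
    // [Schedule A, Schedule D (conference one), Schedule F (train trip)]
}
```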
Schedule information such as the subject, the start and stop times (including a start time and an end time), and the location can be displayed in each schedule. For example, schedule D has the topic "conference one", a start time of 8 a.m., an end time of 9 a.m., and a location of the xx office building in xx district of xx city. Schedule E has the topic "conference two", a start time of 9 a.m., an end time of 9 a.m., and a location of the xx office building in xx city. Schedule F has the topic "train trip", train number C1411, a start time of 9:30 a.m., an end time of 12 a.m., and a location of xx railway xx south station.
The terminal device may find schedules with time conflicts when the calendar application orders the at least one schedule chronologically. In the above example, the start-stop period of schedule D overlaps with that of schedule E, so the terminal device marks schedule D and schedule E in the calendar application; see interface a in fig. 3, where conflict marks, for example the word "conflict", are displayed in schedule D and schedule E. It should be noted that all-day schedules and to-do schedules are not marked for conflict.
Currently, a user can set a priority of a schedule in a calendar application. Illustratively, the priorities may include "important", "ignore", and "none". For example, the user may set the priority of schedule D as "important," so that an interface element indicating "important" is displayed in schedule D, where the interface element may be an exclamation point, a star mark, a triangle, and the like, which is not limited in this embodiment of the application.
The terminal device may create the schedule in several ways.
In one possible implementation, interface a in fig. 3 further includes a control 01. The user can tap control 01 to create a new schedule, enter the schedule content, and set a schedule reminder; after the input is completed, the schedule is displayed in the schedule view.
In another possible implementation, the calendar application may synchronize events that the user is to do from other system applications to create a schedule. The system application may include a memo, a short message, a contact person, and the like.
Taking the memo as an example of the system application, the user enters a to-do item in the memo, such as "meeting at xxx on the xxth". After the user enters the content, the memo can recognize the time point in the input, highlight it, and keep it in a selectable state; the user can tap the highlighted time point to jump to the new-schedule interface of the calendar application and add a new schedule.
Again taking the memo as the system application, the user sets a to-do item in the memo and sets a time reminder for it. When the system detects that a time reminder has been set for the to-do item, the system notifies the calendar application to synchronize the to-do item from the memo and create a new schedule.
Taking the short message application as the system application, after the user purchases a train ticket in ticket-booking software, the user receives a short message indicating that the booking succeeded. The short message contains the user's current travel information, including the departure place, the destination, the departure time, the arrival time, the train number, the station, and the like. The system can recognize the travel information in the short message, notify the calendar application to synchronize the travel information from the short message, and create a new schedule.
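The patent does not give the recognition logic; purely as an illustration, a greatly simplified extraction of a train number and a departure time from a booking short message might look like the following sketch, where the message format and the regular expressions are invented assumptions:

```kotlin
// Greatly simplified, assumption-only sketch of recognizing travel information in a booking
// short message so that the calendar application can create a schedule from it. Real
// recognition would be far more robust; the patterns here are invented for illustration.
data class TripInfo(val trainNumber: String, val departure: String)

fun extractTrip(sms: String): TripInfo? {
    val train = Regex("""\b([CDGKTZ]\d{1,4})\b""").find(sms)?.groupValues?.get(1) ?: return null
    val time = Regex("""\b(\d{1,2}:\d{2})\b""").find(sms)?.groupValues?.get(1) ?: return null
    return TripInfo(train, time)
}

fun main() {
    val sms = "Booking successful: train C1411 departs at 09:30 from xx South Station."
    println(extractTrip(sms))  // TripInfo(trainNumber=C1411, departure=09:30)
}
```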
Taking system application as an example of a contact person, a user can set a birthday of a friend in the contact person, and a calendar application can create a new schedule from birthday information of the friend synchronized in the contact person.
In yet another possible implementation, the calendar application may synchronize backlog from cooperating third party applications to create a schedule. The calendar application and the third-party application perform access authorization in a form of logging in a common account. Upon authorization, the calendar application may synchronize schedules or to-do items that have been created in the third party application.
Taking a video application as an example of a third-party application, the user reserves a program in the video application. After the user taps the reservation, the video application asks whether the user allows it to access the calendar application; after the user agrees, the video application can synchronize the reserved item into the calendar application, so that a new schedule is added in the calendar application.
In the embodiment of the present application, if the number of schedules on the target date is relatively large and the user only desires to focus on one or some of the schedules, the user may, referring to the interface A in fig. 3, long-press the schedule D and drag it without releasing. When the terminal device detects the operation of the user dragging the schedule D, the interface B in fig. 3 may be displayed.
Referring to the interface B in fig. 3, a mark frame is additionally displayed above the schedule view. The mark frame is used for displaying the schedules that have a higher priority and are expected to be highlighted, for the user to select.
Referring to the interface C in fig. 3, the user can drag the schedule D into the range of the mark frame. When the terminal device detects that the user releases the schedule D in the mark frame, referring to the interface D in fig. 3, the terminal device displays the schedule D in the mark frame in the form of a schedule card, that is, displays the schedule card D as shown in the figure.
In the embodiment of the present application, the mark frame is located above the schedule view. It can be understood that the mark frame in this case is a top-set (pinned) frame, and the user can pin the schedule desired to be highlighted in it.
In one possible implementation, the schedule D continues to be displayed in the schedule view while the user drags the schedule D out of the schedule view and after the schedule D is displayed in the mark frame. However, after the schedule D is displayed in the mark frame, the schedule D is displayed in a folded manner in the schedule view, where only the subject and the start time of the schedule D are shown.
In another possible implementation, after the user drags the schedule D to the mark frame for display, the schedule D is no longer displayed in the schedule view.
In the mark frame, the schedule card of the schedule D displays information such as the subject, the start and stop times, and the address. The user can click the schedule D in the mark frame to jump to the detail page and view more detailed information about the schedule D.
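A schedule card as described above can be thought of as carrying the subject, the start and stop times, and the address, together with the background color and content icon discussed later. The following Kotlin sketch is only an illustrative model; the field names and the folded-summary format are assumptions and are not part of this embodiment.

    import java.time.LocalDateTime

    // Hypothetical model of a schedule card displayed in the mark frame.
    data class ScheduleCard(
        val scheduleId: String,
        val subject: String,
        val start: LocalDateTime,
        val end: LocalDateTime,
        val address: String? = null,
        val backgroundColor: Int = 0xFFFFFFFF.toInt(), // ARGB color mark of the source application
        val iconName: String? = null                    // icon identifying the schedule content
    )

    // Folded display in the schedule view keeps only the subject and the start time.
    fun foldedSummary(card: ScheduleCard): String = "${card.subject} @ ${card.start}"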
Referring to the interface C in fig. 3, after the schedule D is displayed in the mark frame, the conflict mark is no longer displayed in the schedule D and the schedule E, that is, the conflict mark is cancelled.
In the time-axis arrangement shown in the interface A in fig. 3, even if the schedule D is marked as "important" by the user, the schedule D is still marked as being in "conflict" when the start-stop period of the schedule E, for which no priority is set, overlaps with the start-stop period of the schedule D, for which "important" is set.
In the embodiment of the application, once the user drags the schedule D into the mark frame for display, the terminal device, after detecting that the schedule D is displayed in the mark frame, determines that the priority of the schedule D is higher than that of the schedule E, or assumes by default that the user pays more attention to the schedule D and excludes it from the conflict ordering. The time conflict between the schedule D and the schedule E therefore no longer needs to be considered, which resolves the problem of time conflicts among a plurality of schedules.
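The conflict rule described in the two preceding paragraphs can be sketched as follows: two schedules conflict when their start-stop periods overlap, and a schedule that has been dragged into the mark frame is excluded from the conflict ordering. The Kotlin sketch below is an assumption about how such a check might look; the model and field names are illustrative only.

    import java.time.LocalDateTime

    data class Schedule(
        val id: String,
        val start: LocalDateTime,
        val end: LocalDateTime,
        val pinnedInMarkFrame: Boolean = false // true once dragged into the mark frame
    )

    // Two start-stop periods overlap when each starts before the other ends.
    fun overlaps(a: Schedule, b: Schedule): Boolean = a.start < b.end && b.start < a.end

    // A conflict mark is shown only if the schedule overlaps another one and
    // neither of the two is displayed in the mark frame.
    fun hasConflictMark(target: Schedule, all: List<Schedule>): Boolean =
        !target.pinnedInMarkFrame && all.any { other ->
            other.id != target.id && !other.pinnedInMarkFrame && overlaps(target, other)
        }

With the schedule D pinned in the mark frame, hasConflictMark returns false for both the schedule D and the schedule E, matching the cancellation of the conflict mark described above.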
In the embodiment of the application, the user can display the schedules considered important in the mark frame. When a large number of schedules are displayed on the time axis, the user can thus quickly locate the schedules expected to be viewed. The operation is simple and convenient, saves time, and improves the user experience.
It should be noted that the mark frame shown in the embodiment of the present application is located above the schedule view, so that a schedule displayed in the mark frame is in effect pinned to the top, the reminding effect is more obvious, and the user can quickly locate the schedule desired to be viewed. Fig. 3 to 8 all illustrate the mark frame located above the schedule view. In addition, the mark frame can also be located below, to the left of, or to the right of the schedule view, which is not limited in the embodiments of the present application.
It should be understood that the size and the display position of the schedule view can be adaptively adjusted according to the display position of the mark frame. For example, when the mark frame is positioned above the schedule view, the schedule view shown in the interface D in fig. 3 is reduced in size compared with the schedule view shown in the interface A in fig. 3.
Fig. 4 is a schematic view of another interface for displaying a schedule according to an embodiment of the present application.
Referring to the interface A in fig. 4, a schedule card D, a schedule card E, and a schedule card F are shown in the mark frame. The schedule card D corresponds to the schedule D, the schedule card E corresponds to the schedule E, and the schedule card F corresponds to the schedule F. The schedule D, the schedule E, and the schedule F are still displayed in the schedule view, but these three schedules are displayed in a folded manner, where only the subject and the start time of each schedule are shown.
By sliding leftward in the mark frame, the user may display other schedule cards that have been dragged into the mark frame but are currently hidden.
Referring to the interface B in fig. 4, in response to the operation of the user sliding leftward in the mark frame, the terminal device displays the hidden schedule card G in the mark frame, where the schedule card G corresponds to the schedule G. At this time, the schedule card E, the schedule card F, and the schedule card G are displayed in the mark frame, and the schedule card D is hidden.
Optionally, the schedule card displayed in the mark frame is matched with a corresponding background color.
As described above, each schedule has its origin: it is either created locally in the calendar application or synchronized from another application. The terminal device may determine the background color of the schedule card based on the source application of the schedule, where the source application is the application that provides the calendar application with the schedule information used to generate the schedule.
The process of matching the background color for the schedule card is described below with reference to an example.
Taking the schedule D as an example, assume that the schedule D is created by the calendar application by synchronizing an item input by the user in the memo. When the terminal device creates the schedule D in the calendar application, the terminal device may obtain the icon of the memo from the installation package of the memo, and analyze the color of the icon to obtain its main color. The terminal device then selects, from a plurality of preset colors, the color closest to the main color of the memo icon as the color mark of the schedule D. When the terminal device displays the schedule D in the form of a schedule card in the mark frame in response to the user's operation, the terminal device sets the color mark of the schedule D as the background color of the schedule card D.
Taking the schedule E as an example, assume that the schedule E is also created by the calendar application by synchronizing an item input by the user in the memo. The color marks of the schedule D and the schedule E are then the same, and the schedule card D and the schedule card E also have the same background color. That is, different schedules from the same application may have the same color mark.
By matching schedule cards from different sources with different background colors, schedules can be naturally distinguished by background color: schedules with the same background color belong to the same category, and schedules with different background colors belong to different categories. This allows the user to tell schedules of different categories apart at a glance.
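The color-matching step described for the schedule D can be sketched as follows: take the main color of the source application's icon and pick the nearest color from a preset palette as the color mark. The palette values, the averaging used to obtain a main color, and the RGB distance are all assumptions for illustration; this embodiment does not prescribe a particular algorithm.

    data class Rgb(val r: Int, val g: Int, val b: Int)

    // Assumed preset palette; the embodiment only refers to "a plurality of preset colors".
    val presetPalette = listOf(
        Rgb(242, 201, 76),  // yellow
        Rgb(76, 139, 242),  // blue
        Rgb(91, 185, 116),  // green
        Rgb(230, 96, 96)    // red
    )

    // Squared Euclidean distance in RGB space, used to find the closest preset color.
    fun distance(a: Rgb, b: Rgb): Int {
        val dr = a.r - b.r; val dg = a.g - b.g; val db = a.b - b.b
        return dr * dr + dg * dg + db * db
    }

    // A simple "main color": the average of the icon's pixels.
    fun mainColor(iconPixels: List<Rgb>): Rgb = Rgb(
        iconPixels.sumOf { it.r } / iconPixels.size,
        iconPixels.sumOf { it.g } / iconPixels.size,
        iconPixels.sumOf { it.b } / iconPixels.size
    )

    // The color mark of the schedule is the preset color nearest to the icon's main color.
    fun colorMarkFor(iconPixels: List<Rgb>): Rgb {
        val main = mainColor(iconPixels)
        return presetPalette.minByOrNull { distance(it, main) }!!
    }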
Optionally, an icon for identifying the content of the schedule is also displayed in the schedule card displayed in the mark frame.
Illustratively, the content of the schedule D and the content of the schedule E are related to the meeting, and the terminal device adds a "notebook" icon to the schedule card D and the schedule card E in the calendar application.
Illustratively, the content of the schedule F is related to travel, and the terminal device adds a "train" icon to the schedule card F in the calendar application.
Illustratively, the content of the schedule G is related to accommodation, and the terminal device adds a "building" icon to the schedule card G in the calendar application.
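The icon assignment in the three examples above can be read as a keyword lookup over the schedule content. The keywords and icon names in the following sketch are assumptions chosen to mirror the examples; they are not defined by this embodiment.

    // Maps schedule content to an icon name: meeting-related content gets a
    // "notebook" icon, travel-related content a "train" icon, and
    // accommodation-related content a "building" icon.
    fun iconForContent(content: String): String? {
        val rules = listOf(
            listOf("meeting", "conference") to "notebook",
            listOf("train", "flight", "travel") to "train",
            listOf("hotel", "check-in", "accommodation") to "building"
        )
        val lower = content.lowercase()
        return rules.firstOrNull { (keywords, _) -> keywords.any { it in lower } }?.second
    }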
Fig. 5 is a schematic view of another interface for displaying a schedule according to an embodiment of the present application.
Referring to the interface A in fig. 5, the terminal device displays the schedule card D, the schedule card E, and the schedule card F in the mark frame. The terminal device may sort the three schedule cards in the mark frame according to a default sorting manner.
In one possible implementation, the terminal device sorts the three schedule cards in the mark frame from left to right according to the start times of the schedule D, the schedule E, and the schedule F.
In another possible implementation manner, the terminal device sorts the three schedule cards in the mark frame from left to right in the calendar application according to the sequence in which the schedule D, the schedule E, and the schedule F are dragged into the mark frame.
In yet another possible implementation, the terminal device arranges schedule cards with the same background color adjacently in the calendar application.
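The three default sorting manners above can be sketched as follows in Kotlin. The card model and the timestamp recorded when a card is dragged into the mark frame are assumptions for illustration.

    import java.time.LocalDateTime

    data class Card(
        val subject: String,
        val start: LocalDateTime,    // start time of the corresponding schedule
        val draggedInAt: Long,       // assumed timestamp of when the card entered the mark frame
        val backgroundColor: Int
    )

    // Sort left to right by the start time of each schedule.
    fun sortByStartTime(cards: List<Card>) = cards.sortedBy { it.start }

    // Sort left to right by the order in which the cards were dragged into the mark frame.
    fun sortByDragOrder(cards: List<Card>) = cards.sortedBy { it.draggedInAt }

    // Cards with the same background color are arranged adjacently; ties keep start-time order.
    fun groupByColor(cards: List<Card>) =
        cards.sortedWith(compareBy({ it.backgroundColor }, { it.start }))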
Alternatively, the user may manually adjust the display positions of at least some of the plurality of schedule cards in the mark frame.
The user can long-press any one of the plurality of schedule cards displayed in the mark frame to put the plurality of schedule cards into an editable state. Referring to the interface A in fig. 5, the user long-presses the schedule card F among the three schedule cards displayed in the mark frame. In response to the operation of the user long-pressing the schedule card F displayed in the mark frame, the terminal device puts the schedule card D, the schedule card E, and the schedule card F in the mark frame into an editable state, and displays the interface B shown in fig. 5.
Referring to the interface B in fig. 5, a control 02 appears at the upper right of each of the schedule card D, the schedule card E, and the schedule card F in the mark frame. The control 02 is used for adjusting the display positions of at least some of the plurality of schedule cards.
It should be noted that the control 02 may also be displayed at any possible position, such as the upper left, the lower left, or the lower right of the schedule card, which is not limited in this embodiment of the application.
Referring to the interface C in fig. 5, in response to an operation of the user pressing and dragging the control 02 displayed on the schedule card F, the terminal device moves the schedule card F to a target display position following the dragging of the user's finger, where the target display position is the display position of a schedule card other than the schedule card F. In response to an operation of the user releasing the schedule card F at the display position of the schedule card D, the terminal device displays the interface D in fig. 5.
Referring to the interface D in fig. 5, the terminal device displays the schedule card F at the original display position of the schedule card D, and the display positions of the schedule card D and the schedule card E change correspondingly, so that the three schedule cards are arranged from left to right as the schedule card F, the schedule card D, and the schedule card E. After the user releases the card, the control 02 at the upper right of each schedule card disappears.
In the above example, the display positions of the schedule card D, the schedule card E, and the schedule card F all change after the display position of the schedule card F is adjusted. In some possible cases, after one of the plurality of schedule cards is adjusted, only the display positions of some of the schedule cards change. For example, the user presses the control 02 displayed on the schedule card D, drags the schedule card D to the display position of the schedule card E, and then releases it; the schedule card D is then displayed at the original display position of the schedule card E, the schedule card E is displayed at the original display position of the schedule card D, and the display position of the schedule card F is unchanged.
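The position adjustment described with reference to fig. 5 amounts to removing the dragged card from its slot and re-inserting it at the slot where it is released, with the cards in between shifting. The index-based Kotlin sketch below is an assumption; the embodiment speaks only of display positions.

    // Moves one card from its current slot to the slot where the user releases it.
    fun <T> reorder(cards: List<T>, fromIndex: Int, toIndex: Int): List<T> {
        val result = cards.toMutableList()
        val moved = result.removeAt(fromIndex)
        result.add(toIndex, moved)
        return result
    }

    fun main() {
        val cards = listOf("D", "E", "F")
        // Dragging schedule card F onto the position of schedule card D: all three shift.
        println(reorder(cards, fromIndex = 2, toIndex = 0)) // [F, D, E]
        // Dragging schedule card D onto the position of schedule card E: card F is unaffected.
        println(reorder(cards, fromIndex = 0, toIndex = 1)) // [E, D, F]
    }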
Fig. 6 is a schematic view of another interface for displaying a schedule according to an embodiment of the present application.
Referring to the interface A in fig. 6, the terminal device displays the schedule card D, the schedule card E, and the schedule card F in the mark frame, and the user can manually cancel the display of a schedule card in the mark frame.
Illustratively, in response to the user's operation of long-pressing the schedule card F, the terminal device puts the plurality of schedule cards in the mark frame into an editable state and displays the interface B in fig. 6. The interface B in fig. 6 is similar to the interface B in fig. 5 and is not described in detail here.
Illustratively, if the schedule F is cancelled or completed, the user may desire to delete the schedule card F from the mark frame. Referring to the interface C in fig. 6, in response to the user's operation of dragging an area of the schedule card F other than the control 02, the terminal device moves the schedule card F out of the mark frame following the dragging of the user's finger. In response to an operation of the user releasing the schedule card F outside the mark frame, the terminal device displays the interface D in fig. 6.
Referring to the interface D in fig. 6, the terminal device no longer displays the schedule card F in the mark frame, and unfolds the schedule F, which was displayed in a folded manner in the schedule view.
In the interface diagrams for displaying schedules described above with reference to fig. 3 to 6, the calendar application may add a mark frame above the schedule view, and a schedule that the user desires to focus on may be displayed in the mark frame in the form of a schedule card. Another schematic of an interface for displaying a schedule in the top-set mark frame is described below.
Fig. 7 is a schematic view of another interface for displaying a schedule according to an embodiment of the present application.
The interface A to the interface C shown in fig. 7 are similar to the interface A to the interface C shown in fig. 3: they introduce the operation of dragging the schedule D to trigger the display of the mark frame and releasing the schedule D in the mark frame, which is not described again.
After the user releases the schedule D in the mark frame, referring to the interface D in fig. 7, a single column in the mark frame may have a time axis on which the schedule D dragged into the mark frame is displayed. The schedule view still displays the schedule D, but the schedule D is displayed in a folded manner in the schedule view, where only the subject and the start time of the schedule D are shown.
Fig. 8 is a schematic view of another interface for displaying a schedule according to an embodiment of the present application.
As shown in the interface A in fig. 8, the schedule D, the schedule E, and the schedule F are shown in the mark frame. In this manner, the schedule D, the schedule E, and the schedule F in the mark frame are automatically sorted on the time axis of the mark frame by the start time of each schedule, and do not participate in the sorting in the schedule view. The schedule view continues to display the schedule D, the schedule E, and the schedule F, but the three schedules are displayed in a folded manner, where only the subject and the start time of each schedule are shown.
Alternatively, the user may slide a schedule in the mark frame to display the processing state of the schedule. The processing state may include one or more of the following: not started, ongoing, or ended. It should be understood that the direction of the slide operation in the embodiment of the present application is perpendicular to the direction of the time axis. The time axis runs up and down, and if a schedule in the mark frame is held and slid up or down, the terminal device responds by displaying other hidden schedules in the mark frame. Therefore, the user presses a schedule in the mark frame and slides left or right to display its processing state, so that gesture conflicts are avoided.
When the processing state is "not started", after the user slides the schedule, the terminal device may display the words "start after XX minutes" to remind the user of the time remaining until the start time of the schedule.
When the processing state is "ongoing", the terminal device may display the words "in progress" after the user slides the schedule.
When the processing state is "ended", the terminal device may display the word "ended" after the user slides the schedule.
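The three processing states above can be determined from the current time and the start and stop times of the schedule. The following Kotlin sketch is one possible determination, with the reminder wording taken from the examples; the state model itself is an assumption.

    import java.time.Duration
    import java.time.LocalDateTime

    sealed class ProcessingState {
        data class NotStarted(val minutesUntilStart: Long) : ProcessingState()
        object Ongoing : ProcessingState()
        object Ended : ProcessingState()
    }

    // A schedule is "not started" before its start time, "ongoing" until its end time,
    // and "ended" afterwards.
    fun stateOf(start: LocalDateTime, end: LocalDateTime, now: LocalDateTime): ProcessingState =
        when {
            now.isBefore(start) -> ProcessingState.NotStarted(Duration.between(now, start).toMinutes())
            now.isBefore(end) -> ProcessingState.Ongoing
            else -> ProcessingState.Ended
        }

    // Text shown after the user slides the schedule in the mark frame.
    fun stateText(state: ProcessingState): String = when (state) {
        is ProcessingState.NotStarted -> "start after ${state.minutesUntilStart} minutes"
        ProcessingState.Ongoing -> "in progress"
        ProcessingState.Ended -> "ended"
    }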
Referring to the interface A in fig. 8, the user may slide leftward on the schedule D in the mark frame. In response to the sliding operation of the user on the schedule D, the terminal device displays the interface B in fig. 8.
Referring to the interface B in fig. 8, the terminal device displays the processing state of the schedule D in the mark frame as "start after 10 minutes".
Optionally, in response to the sliding operation of the user on the schedule D, the terminal device may further display a control for cancelling the display of the schedule D in the mark frame, for example, a selection box labeled "cancel". When the user selects the "cancel" selection box, the schedule D is no longer displayed in the mark frame.
Alternatively, the user may cancel the display of the schedule D in the mark frame by the operation described above with respect to fig. 6: in response to the user pressing the schedule D, dragging it out of the mark frame, and releasing it outside the mark frame, the terminal device cancels the display of the schedule D in the mark frame. After the cancellation, the terminal device unfolds the folded part of the schedule D in the schedule view.
In summary, fig. 9 is a schematic flowchart of a method 900 for displaying a schedule according to an embodiment of the present application. The steps of the method 900 may be performed by a terminal device, and the terminal device may have a structure as shown in fig. 1 and/or fig. 2, but the embodiment of the present application is not limited thereto. The method 900 includes the following steps (a simplified sketch of the flow is given after the step list):
S901, displaying a day view of a target date in a calendar application, wherein the day view comprises at least one schedule arranged in chronological order;
S902, in response to an operation of a user dragging a first schedule in the at least one schedule, displaying a mark frame in the day view, wherein the mark frame is used for displaying a schedule selected by the user from the at least one schedule;
S903, in response to an operation of the user releasing the first schedule in the mark frame, displaying the first schedule in the mark frame.
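A compressed sketch of steps S901 to S903 is given below: the day view lists the schedules, a drag of any schedule brings up the mark frame, and releasing the schedule inside the mark frame pins it there. The controller class, its callbacks, and the use of plain strings for schedules are illustrative assumptions only.

    class DayViewController(private val schedules: List<String>) {
        private val markFrame = mutableListOf<String>()
        var markFrameVisible = false
            private set

        // S901: the day view shows the schedules of the target date in chronological order.
        fun visibleSchedules(): List<String> = schedules

        // S902: dragging any schedule in the day view brings up the mark frame.
        fun onScheduleDragStarted(schedule: String) {
            if (schedule in schedules) markFrameVisible = true
        }

        // S903: releasing the dragged schedule inside the mark frame displays it there.
        fun onScheduleReleasedInMarkFrame(schedule: String) {
            if (markFrameVisible && schedule !in markFrame) markFrame.add(schedule)
        }

        fun pinnedSchedules(): List<String> = markFrame
    }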
For the specific interface of the terminal device and the execution process of the method for displaying the schedule, reference may be made to fig. 3 to 8, which are not described herein again.
In the embodiment of the application, in response to the operation of the user dragging the first schedule in the at least one schedule, the terminal device displays the mark frame at a preset position in the day view. The terminal device can display the schedule selected by the user in the mark frame, and the schedule selected by the user and displayed in the mark frame is generally the schedule the user expects to focus on.
It should be understood that, the sequence numbers of the above processes do not imply any order of execution, and the order of execution of the processes should be determined by their functions and inherent logic, and should not limit the implementation process of the embodiments of the present application in any way.
The method for displaying a schedule according to an embodiment of the present application is described in detail above with reference to fig. 3 to 9, and the apparatus for displaying a schedule according to an embodiment of the present application is described in detail below with reference to fig. 10.
Fig. 10 is a schematic block diagram of an apparatus 1000 for displaying a schedule according to an embodiment of the present application. The apparatus 1000 includes a processing module 1010 and a touch module 1020.
Wherein the processing module 1010 is configured to: a day view of a target date in a calendar application is displayed, the day view including at least one schedule arranged in chronological order. The touch module 1020 is configured to: and receiving an operation of dragging a first schedule in the at least one schedule, which is input by a user. The processing module 1010 is configured to: and responding to the operation that the user drags a first schedule in the at least one schedule, and displaying a mark frame in the day view, wherein the mark frame is used for displaying the schedule selected by the user from the at least one schedule. The touch module 1020 is configured to: and receiving an operation of releasing the first schedule in the mark frame, which is input by a user. The processing module 1010 is configured to: in response to an operation of the user to release the first schedule in the mark frame, the first schedule is displayed in the mark frame.
Optionally, the processing module 1010 is configured to: and displaying the first schedule in the form of a schedule card in the mark frame.
Optionally, the schedule card is provided with a background color, the background color of the schedule card of the first schedule is determined according to an icon color of a source application of the first schedule, and the source application is an application providing schedule information for generating the first schedule to the calendar application.
Optionally, the mark frame displays a plurality of schedule cards, the plurality of schedule cards including a schedule card of the first schedule. The processing module 1010 is configured to: responding to the operation of a user for pressing any schedule card in the plurality of schedule cards, and displaying a first control on each schedule card in the plurality of schedule cards, wherein the first control is used for adjusting the display positions of at least part of schedule cards in the plurality of schedule cards; responding to the operation that a user presses a first control on a schedule card of a first schedule and drags the schedule card of the first schedule, and controlling the schedule card of the first schedule to move to a target display position, wherein the target display position is the display position of other schedule cards except the display position of the schedule card of the first schedule; and displaying the schedule cards of the first schedule at the target display position in response to the operation that the user releases the schedule cards of the first schedule at the target display position.
Optionally, the processing module 1010 is configured to: and displaying a time axis on the mark frame, wherein the time axis displays the first schedule.
Optionally, the processing module 1010 is configured to: in response to a sliding operation of the first schedule on the mark frame by the user, displaying a processing state of the first schedule, wherein the processing state comprises one or more of the following items: not started, ongoing or finished.
Optionally, the processing status of the first schedule is not started. The processing module 1010 is configured to: and displaying the time length of the current time from the starting time of the first schedule.
Optionally, the at least one schedule further includes a second schedule, and conflict marks are displayed in the first schedule and the second schedule, and the conflict marks are used for indicating that the start-stop period of the first schedule overlaps with the start-stop period of the second schedule. The processing module 1010 is configured to: canceling the conflict mark of the first schedule and the second schedule.
Optionally, the processing module 1010 is configured to: controlling the first schedule to move out of the mark frame in response to the operation that the user drags the first schedule out of the mark frame; and canceling the display of the first schedule in the mark frame in response to an operation of the user to release the first schedule outside the mark frame.
Optionally, the day view comprises a schedule view for displaying at least one schedule. The first schedule is displayed in the mark frame and also displayed in the schedule view.
Optionally, the processing module 1010 is configured to: folding and displaying a first schedule in a schedule view; and after the display of the first schedule in the mark frame is cancelled, unfolding and displaying the folded part of the first schedule in the schedule view.
Optionally, the mark frame is located above the schedule view.
In an alternative example, those skilled in the art may understand that the apparatus 1000 may be embodied as a terminal device, or the functions of the terminal device may be integrated in the apparatus 1000. The above functions may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above.
It should be appreciated that the apparatus 1000 herein is embodied in the form of functional modules. The term module herein may refer to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (e.g., a shared, dedicated, or group processor) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functionality. In an embodiment of the present application, the apparatus 1000 may also be a chip or a chip system, for example: system on chip (SoC).
The application also provides a computer-readable storage medium, in which computer-executable instructions are stored, and when executed by a processor, the computer-executable instructions can implement the method performed by the terminal device in any of the above method embodiments.
An embodiment of the present application further provides a computer program product, which includes a computer program, and when the computer program is executed by a processor, the method performed by the terminal device in any of the above method embodiments may be implemented.
The embodiment of the application also provides terminal equipment which can execute the method for displaying the schedule in any method embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It can be clearly understood by those skilled in the art that, for convenience and simplicity of description, the specific working processes of the system, the apparatus, and the module described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is merely a logical division, and in actual implementation, there may be other divisions, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific implementation of the present application, but the scope of the embodiments of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the embodiments of the present application, and all the changes or substitutions should be covered by the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. A method of displaying a schedule, comprising:
displaying a day view of a target date in a calendar application, the day view including at least one schedule arranged in chronological order;
responding to the operation that a user drags a first schedule in the at least one schedule, and displaying a mark frame in the day view, wherein the mark frame is used for displaying the schedule selected by the user from the at least one schedule;
responding to the operation that a user releases the first schedule in the mark frame, and displaying the first schedule in the mark frame in the form of a schedule card;
the schedule card is provided with a background color and an icon for identifying the content of the schedule, the background color of the schedule card of the first schedule is determined according to the icon color of a source application of the first schedule, and the source application is an application that provides the calendar application with schedule information for generating the first schedule.
2. The method of claim 1, wherein the mark frame displays a plurality of schedule cards, the plurality of schedule cards including a schedule card of the first schedule;
the method further comprises the following steps:
responding to an operation of a user for pressing any schedule card in the plurality of schedule cards for a long time, and displaying a first control on each schedule card in the plurality of schedule cards, wherein the first control is used for adjusting the display positions of at least part of the schedule cards in the plurality of schedule cards;
responding to the operation that a user presses the first control on the schedule card of the first schedule and drags the schedule card of the first schedule, and controlling the schedule card of the first schedule to move to a target display position, wherein the target display position is the display position of other schedule cards except the display position of the schedule card of the first schedule;
and responding to the operation that the user releases the schedule card of the first schedule at the target display position, and displaying the schedule card of the first schedule at the target display position.
3. The method of claim 1, wherein displaying the first schedule in the mark frame comprises:
and displaying a time axis on the mark frame, wherein the first schedule is displayed on the time axis.
4. The method of claim 3, wherein after the mark frame displays the first schedule, the method further comprises:
responding to the sliding operation of the user on the first schedule in the mark frame, and displaying the processing state of the first schedule, wherein the processing state comprises one or more of the following items: not started, ongoing or finished.
5. The method of claim 4, wherein the processing status of the first schedule is not started, and wherein displaying the processing status of the first schedule comprises:
and displaying the time length from the current time to the start time of the first schedule.
6. The method of claim 1, wherein the at least one schedule further comprises a second schedule; conflict marks are displayed in the first schedule and the second schedule, and the conflict marks are used for indicating that the start-stop period of the first schedule is overlapped with the start-stop period of the second schedule;
after the mark frame displays the first schedule, the method further includes:
canceling the conflict mark of the first schedule and the second schedule.
7. The method of claim 1, further comprising:
controlling the first schedule to move out of the mark frame in response to an operation of dragging the first schedule out of the mark frame by a user;
and canceling the display of the first schedule in the mark frame in response to the operation that the user releases the first schedule outside the mark frame.
8. The method of claim 7, wherein the day view comprises a schedule view for displaying the at least one schedule; the first schedule is displayed in the mark frame and is also displayed in the schedule view.
9. The method of claim 8, wherein after the mark frame displays the first schedule, the method further comprises:
folding and displaying the first schedule in the schedule view; and
after the canceling displays the first schedule at the markup frame, the method further includes:
and unfolding and displaying the folded part of the first schedule in the schedule view.
10. The method of any of claims 1 to 9, wherein the mark frame is located above a schedule view.
11. An apparatus for displaying a schedule, comprising means for performing the method of any of claims 1 to 10.
12. An apparatus for displaying a schedule, comprising: a processor and a memory, wherein,
the memory is used for storing a computer program;
the processor is configured to invoke and execute the computer program to cause the apparatus to perform the method of any one of claims 1 to 10.
13. A computer-readable storage medium for storing a computer program which, when run on a computer, causes the computer to perform the method of any one of claims 1 to 10.
CN202211146732.2A 2022-09-21 2022-09-21 Method for displaying schedule and related device Active CN115237297B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211146732.2A CN115237297B (en) 2022-09-21 2022-09-21 Method for displaying schedule and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211146732.2A CN115237297B (en) 2022-09-21 2022-09-21 Method for displaying schedule and related device

Publications (2)

Publication Number Publication Date
CN115237297A CN115237297A (en) 2022-10-25
CN115237297B true CN115237297B (en) 2023-03-24

Family

ID=83682024

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211146732.2A Active CN115237297B (en) 2022-09-21 2022-09-21 Method for displaying schedule and related device

Country Status (1)

Country Link
CN (1) CN115237297B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001029643A1 (en) * 1999-10-22 2001-04-26 Appcity Inc. Method and system for encapsulating an application program
CN112506594A (en) * 2020-09-25 2021-03-16 维沃移动通信有限公司 Application icon display method and device

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000057217A (en) * 1998-08-10 2000-02-25 Ibm Japan Ltd Schedule display method, schedule change method, schedule management system, schedule management device, storage medium storing schedule management program
CN1231861C (en) * 2000-07-03 2005-12-14 株式会社Ntt都科摩 Apparatus and method for displaying information concerning business schedule
US7757181B2 (en) * 2006-05-05 2010-07-13 Microsoft Corporation Agenda and day hybrid calendar view
KR100717246B1 (en) * 2006-09-07 2007-05-15 이일규 International Diary System
JP6174429B2 (en) * 2013-01-25 2017-08-02 麻喜子 中野 Date continuous notation calendar, solid calendar and notebook
JP6170740B2 (en) * 2013-05-30 2017-07-26 アズビル株式会社 Schedule setting method
CN105956820A (en) * 2016-04-22 2016-09-21 珠海格力电器股份有限公司 Method, system and terminal for presenting calendar event schedule
US10622022B2 (en) * 2016-05-12 2020-04-14 Lumanary Inc. Automated video bumper system
CN113687758B (en) * 2016-06-07 2024-04-12 北京三星通信技术研究有限公司 Operation method based on auxiliary display area and terminal equipment
CN106600229B (en) * 2016-12-15 2024-04-09 北京小米移动软件有限公司 Calendar item reminding method and device
US10552770B2 (en) * 2017-05-09 2020-02-04 Microsoft Technology Licensing, Llc Efficient schedule item creation
CN108646961B (en) * 2018-05-04 2021-08-27 腾讯科技(深圳)有限公司 Management method and device for tasks to be handled and storage medium
CN111240779A (en) * 2020-01-03 2020-06-05 北京小米移动软件有限公司 Calendar display method, device and storage medium
CN113556461B (en) * 2020-09-29 2022-07-26 华为技术有限公司 Image processing method, electronic equipment and computer readable storage medium
CN114527901A (en) * 2020-10-31 2022-05-24 华为技术有限公司 File dragging method and electronic equipment
CN112862464A (en) * 2021-03-18 2021-05-28 网易(杭州)网络有限公司 Method and device for processing schedule in calendar and electronic equipment
CN117896461A (en) * 2021-06-15 2024-04-16 荣耀终端有限公司 Schedule processing method and electronic device
CN115983818A (en) * 2021-08-09 2023-04-18 荣耀终端有限公司 Schedule conflict processing method, schedule conflict processing device, storage medium and software program product
CN113810540B (en) * 2021-08-11 2023-01-17 荣耀终端有限公司 Schedule management method, terminal device, chip system and storage medium
CN117420936A (en) * 2022-07-19 2024-01-19 荣耀终端有限公司 Calendar view display method, electronic device and readable storage medium


Also Published As

Publication number Publication date
CN115237297A (en) 2022-10-25

Similar Documents

Publication Publication Date Title
EP4293490A1 (en) Display method and related apparatus
WO2020238774A1 (en) Notification message preview method and electronic device
WO2020244622A1 (en) Notification prompt method, terminal and system
US20230418444A1 (en) Notification Message Management Method and Electronic Device
CN115334193B (en) Notification display method and device based on situation
EP4261680A1 (en) Widget display method and electronic device
CN111602108A (en) Application icon display method and terminal
US20220366327A1 (en) Information sharing method for smart scene service and related apparatus
CN114115512A (en) Information display method, terminal device, and computer-readable storage medium
CN111835904A (en) Method for starting application based on context awareness and user portrait and electronic equipment
CN114493470A (en) Schedule management method, electronic device and computer-readable storage medium
WO2022247383A1 (en) Prompt method, graphical user interface, and related apparatus
WO2022022335A1 (en) Method and apparatus for displaying weather information, and electronic device
CN116088715B (en) Message reminding method and electronic equipment
WO2021042881A1 (en) Message notification method and electronic device
CN115237297B (en) Method for displaying schedule and related device
CN113810533A (en) Information reminding method and electronic equipment
US20240385737A1 (en) Application icon display method and related apparatus
WO2023160179A9 (en) Magnification switching method and magnification switching apparatus
CN117784991A (en) Display method of latest task list and electronic equipment
CN117130527B (en) Schedule management method, electronic device, and computer-readable storage medium
CN117273687B (en) Card punching recommendation method and electronic equipment
WO2024041180A1 (en) Path planning method and apparatus
CN119027084A (en) Trip reminder method and device
CN119166029A (en) Schedule creation method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant