
CN110045895B - Information processing method, system, medium, and computing device

Info

Publication number
CN110045895B
Authority
CN
China
Prior art keywords
current interface
operation gesture
gesture
detection result
preset rule
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811528792.4A
Other languages
Chinese (zh)
Other versions
CN110045895A (en)
Inventor
任轶
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Netease Cloud Music Technology Co Ltd
Original Assignee
Hangzhou Netease Cloud Music Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Netease Cloud Music Technology Co Ltd
Priority to CN201811528792.4A
Publication of CN110045895A
Application granted
Publication of CN110045895B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 - Selection of displayed objects or displayed text elements

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present invention provides an information processing method, including: detecting a first operation gesture directed at a first object in a current interface to obtain a first detection result, wherein the first object is presented at a first position of the current interface, and clicking the first object switches the display to an interface carrying specific information; and, when the first detection result indicates that the first operation gesture conforms to a first preset rule, presenting a second object at a second position of the current interface, the second object displaying part or all of the specific information. In this way, without leaving the current interface, the display form of the current interface can be switched freely by means of an operation gesture that the user applies to the first object and that conforms to the preset rule, improving the intelligence of human-computer interaction. Embodiments of the present invention further provide an information processing system, a medium, and a computing device.

Description

Information processing method, system, medium, and computing device
Technical Field
The embodiment of the invention relates to the field of intelligent terminal application, in particular to an information processing method, an information processing system, an information processing medium and a computing device.
Background
This section is intended to provide a background or context to the embodiments of the invention that are recited in the claims. The description herein is not admitted to be prior art by inclusion in this section.
Across the application ecosystem of intelligent terminals, mobile application programs (APPs) developed for mobile clients are given increasingly rich functions, so more and more users are willing to install APPs with different functions on their mobile clients to meet needs ranging from work to entertainment. As hardware keeps being upgraded, the display screens of mobile terminals grow larger, yet the size of the interface that can be presented on the screen remains limited. How to carry diversified information on an interface of limited size and present a visually pleasing interactive interface, so as to give the user a rich sensory and interactive experience, has become an unavoidable topic in the design and development of an APP.
At present, the related art provides ways of carrying diversified information on an interface of limited size. For example, on an interface presented by an APP, the user is given an entry into an interface carrying specific information; the specific information does not have to be carried on the current interface, because the interface carrying it can be reached through the entry.
Disclosure of Invention
However, in the course of implementing the inventive concept, the inventors found at least the following problems in the related art: on an APP interface, the information presented to the user takes a single form and supports only one presentation mode. Either an entry is presented that can switch the display to the interface carrying the specific information, and the user must click the entry icon to reach that interface, or the interface carrying the specific information is presented directly. The cost of entering the interface carrying the specific information through a click operation is that the user has to leave the currently presented interface, even though the user may not want to leave it, which noticeably degrades the user experience.
Therefore, in the prior art, entering the interface carrying the specific information through a click operation forces the user to leave the currently presented interface even when the user does not want to, leading to technical problems such as a sharply degraded user experience and easy loss of users.
For this reason, an improved information processing method is highly needed to overcome the above technical problems caused by the information processing methods of the prior art and provide users with a good interactive experience.
In this context, embodiments of the present invention are intended to provide an information processing method, an information processing system, a medium, and a computing device.
In a first aspect of embodiments of the present invention, there is provided an information processing method including: detecting a first operation gesture for a first object in a current interface to obtain a first detection result, wherein the first object is presented at a first position of the current interface, and clicking the first object switches the display to an interface carrying specific information; and, when the first detection result indicates that the first operation gesture conforms to a first preset rule, presenting a second object at a second position of the current interface, wherein the second object displays part or all of the specific information.
In an embodiment of the present invention, before the detecting the first operation gesture for the first object in the current interface to obtain the first detection result, the method further includes: detecting whether a second operation gesture aiming at the first object in the current interface exists or not; and when the second operation gesture is detected to exist, starting to detect a first operation gesture aiming at the first object in the current interface.
In another embodiment of the present invention, the second operation gesture is one of the following: a continuous press whose duration exceeds a first threshold; at least two clicks within a time window of a second threshold.
In another embodiment of the present invention, in a case that the presence of the second operation gesture is detected, the method further includes: and displaying first prompt information at a third position of the current interface, wherein the first prompt information is used for prompting a user to execute the first operation gesture conforming to the first preset rule so as to present the second object at a second position of the current interface.
In another embodiment of the present invention, the first operation gesture includes a first drag operation, and the detecting the first operation gesture for the first object in the current interface to obtain the first detection result includes: detecting whether a first drag distance corresponding to the first drag operation reaches a first preset distance, wherein the first detection result indicates that the first operation gesture conforms to the first preset rule when the first drag distance reaches the first preset distance; and/or detecting whether the first object is dragged into a first preset range by the first drag operation, wherein the second object is to be displayed within the first preset range, and the first detection result indicates that the first operation gesture conforms to the first preset rule when the first object is dragged into the first preset range by the first drag operation.
In another embodiment of the present invention, in a case that the first detection result indicates that the first operation gesture conforms to a first preset rule, the method further includes: and displaying second prompt information at a fourth position of the current interface, wherein the second prompt information is used for prompting a user to release the first operation gesture so as to present the second object at the second position of the current interface.
In yet another embodiment of the present invention, the method further includes: and when the first detection result shows that the first operation gesture does not accord with the first preset rule, presenting the first object at the first position of the current interface, and not displaying the second object at the second position.
In yet another embodiment of the present invention, the method further includes: the first object is no longer displayed on the current interface.
In yet another embodiment of the present invention, the method further includes: detecting a third operation gesture aiming at the second object in the current interface to obtain a second detection result, wherein the second object is presented at the second position of the current interface and displays part or all of the specific information; and under the condition that the second detection result shows that the third operation gesture accords with a second preset rule, presenting the first object at the first position of the current interface, wherein the interface carrying the specific information can be switched and displayed by clicking the first object.
In yet another embodiment of the present invention, before the detecting the third operation gesture for the second object in the current interface to obtain the second detection result, the method further includes: detecting whether a fourth operation gesture aiming at the second object in the current interface exists or not; and when detecting that the fourth operation gesture exists, starting to detect a third operation gesture aiming at the second object in the current interface.
In another embodiment of the present invention, the fourth operation gesture is one of the following: a continuous press whose duration exceeds a third threshold; at least two clicks within a time window of a fourth threshold.
In another embodiment of the present invention, in a case that the presence of the fourth operation gesture is detected, the method further includes: displaying third prompt information at a fifth position of the current interface, where the third prompt information is used to prompt a user to execute the third operation gesture conforming to the second preset rule to present the first object at the first position of the current interface.
In another embodiment of the present invention, the third operation gesture includes a second drag operation, and the detecting the third operation gesture for the second object in the current interface to obtain the second detection result includes: detecting whether a second dragging distance corresponding to the second dragging operation reaches a second preset distance, wherein the second detection result shows that the third operation gesture accords with a second preset rule under the condition that the second dragging distance reaches the second preset distance; and/or detecting whether the second object is dragged to a second preset range by the second dragging operation, wherein the first object is displayed in the second preset range, and the second detection result shows that the third operation gesture conforms to the second preset rule under the condition that the second object is dragged to the second preset range by the second dragging operation.
In another embodiment of the present invention, in a case that the second detection result indicates that the third operation gesture conforms to a second preset rule, the method further includes: displaying fourth prompt information at a sixth position of the current interface, where the fourth prompt information is used to prompt a user to release the third operation gesture so as to present the first object at the first position of the current interface.
In yet another embodiment of the present invention, the method further includes: and when the second detection result shows that the third operation gesture does not accord with the second preset rule, presenting the second object at the second position of the current interface, and not displaying the first object at the first position.
In yet another embodiment of the present invention, the method further includes: the second object is no longer displayed on the current interface.
In another embodiment of the present invention, the first object is an audio playing page entry key, wherein the specific information includes audio playing details; the second object is an audio play bar, wherein the part or all of the specific information includes at least one of play, pause, fast forward, fast backward, volume up, volume down, next or previous key.
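To make this audio embodiment concrete, the following is a minimal Kotlin sketch of the two presentation forms; the control set comes from the paragraph above, while the type and field names are illustrative assumptions rather than part of the claimed method.

```kotlin
// Illustrative model of the two presentation forms in the audio embodiment.
// Type and field names are assumptions; the control set is taken from the text above.
enum class PlayBarControl {
    PLAY, PAUSE, FAST_FORWARD, FAST_BACKWARD, VOLUME_UP, VOLUME_DOWN, NEXT, PREVIOUS
}

sealed class AudioPresentation {
    /** First object: the audio playing page entry key; clicking it switches to the detail page. */
    object EntryKey : AudioPresentation()

    /** Second object: the audio play bar, exposing part or all of the specific information. */
    data class PlayBar(val controls: Set<PlayBarControl>) : AudioPresentation()
}
```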
In a second aspect of embodiments of the present invention, there is provided an information processing system including: the first detection module is used for detecting a first operation gesture aiming at a first object in a current interface so as to obtain a first detection result, wherein the first object is presented at a first position of the current interface, and an interface carrying specific information can be switched and displayed by clicking the first object; and the first presenting module is used for presenting a second object at a second position of the current interface under the condition that the first detection result shows that the first operation gesture accords with a first preset rule, wherein the second object displays part or all of the specific information.
In another embodiment of the present invention, the system further includes: the second detection module is used for detecting whether a second operation gesture aiming at the first object in the current interface exists or not; and the third detection module is used for starting to detect the first operation gesture aiming at the first object in the current interface under the condition that the second operation gesture is detected to exist.
In another embodiment of the present invention, the second operation gesture is one of the following: a continuous press whose duration exceeds a first threshold; at least two clicks within a time window of a second threshold.
In another embodiment of the present invention, the system further includes: the first display module is configured to display first prompt information at a third position of the current interface, where the first prompt information is used to prompt a user to execute the first operation gesture conforming to the first preset rule so as to present the second object at a second position of the current interface.
In another embodiment of the present invention, the first detecting module includes: a first detection submodule, configured to detect whether a first dragging distance corresponding to the first dragging operation reaches a first preset distance, where the first detection result indicates that the first operation gesture meets the first preset rule when the first dragging distance reaches the first preset distance; and/or a second detection submodule, configured to detect whether the first object is dragged to a first preset range by the first dragging operation, where the second object is displayed in the first preset range, and the first detection result indicates that the first operation gesture meets the first preset rule when the first object is dragged to the first preset range by the first dragging operation.
In another embodiment of the present invention, the system further includes: and a second display module, configured to display second prompt information at a fourth position of the current interface, where the second prompt information is used to prompt a user to release the first operation gesture so as to present the second object at the second position of the current interface.
In another embodiment of the present invention, the system further includes: and a second presenting module, configured to present the first object at the first position of the current interface and not display the second object at the second position when the first detection result indicates that the first operation gesture does not meet the first preset rule.
In another embodiment of the present invention, the first presenting module is further configured to: the first object is no longer displayed on the current interface.
In another embodiment of the present invention, the system further includes: a fourth detection module, configured to detect a third operation gesture for the second object in the current interface to obtain a second detection result, where the second object is displayed at the second position of the current interface and displays part or all of the specific information; and a third presenting module, configured to present the first object at the first position of the current interface when the second detection result indicates that the third operation gesture meets a second preset rule, where clicking the first object may switch and display an interface carrying the specific information.
In another embodiment of the present invention, the system further includes: a fifth detecting module, configured to detect whether a fourth operation gesture for the second object in the current interface exists; and a sixth detecting module, configured to, when it is detected that the fourth operation gesture exists, start detecting a third operation gesture for the second object in the current interface.
In another embodiment of the present invention, the fourth operation gesture is one of the following: a continuous press whose duration exceeds a third threshold; at least two clicks within a time window of a fourth threshold.
In another embodiment of the present invention, the system further includes: a third display module, configured to display third prompt information at a fifth position of the current interface, where the third prompt information is used to prompt a user to execute the third operation gesture meeting the second preset rule, so as to present the first object at the first position of the current interface.
In another embodiment of the present invention, the fourth detecting module includes: a third detection submodule, configured to detect whether a second dragging distance corresponding to the second dragging operation reaches a second preset distance, where the second detection result indicates that the third operation gesture meets the second preset rule when the second dragging distance reaches the second preset distance; and/or a fourth detection submodule, configured to detect whether the second object is dragged to a second preset range by the second dragging operation, where the first object is displayed in the second preset range, and the second detection result indicates that the third operation gesture meets the second preset rule when the second object is dragged to the second preset range by the second dragging operation.
In another embodiment of the present invention, the system further includes: a fourth display module, configured to display fourth prompt information at a sixth location of the current interface, where the fourth prompt information is used to prompt a user to release the third operation gesture so as to present the first object at the first location of the current interface.
In another embodiment of the present invention, the system further includes: and a fourth presenting module, configured to present the second object at the second position of the current interface and not display the first object at the first position when the second detection result indicates that the third operation gesture does not meet the second preset rule.
In another embodiment of the present invention, the first presenting module is further configured to: the second object is no longer displayed on the current interface.
In another embodiment of the present invention, the first object is an audio playing page entry key, wherein the specific information includes audio playing details; the second object is an audio play bar, wherein the part or all of the specific information includes at least one of play, pause, fast forward, fast backward, volume up, volume down, next or previous key.
In a third aspect of embodiments of the present invention, there is provided a medium storing computer-executable instructions for implementing any one of the above-described methods when executed by a processing unit.
In a fourth aspect of embodiments of the present invention, there is provided a computing device comprising: a processing unit; and a storage unit storing computer-executable instructions that, when executed by the processing unit, are adapted to implement any of the above-described methods.
According to the information processing method provided by the embodiments of the present invention, when the detected first operation gesture for the first object in the current interface conforms to the first preset rule, the second object is presented at the second position of the current interface and the first object is no longer displayed, so the user is not forced to leave the currently presented interface. Without leaving the current interface, the gesture operation applied to the first object and conforming to the preset rule freely switches the first object to the second object. This at least partially overcomes the problem that a user who needs to view or control part of the specific information can only leave the current page and switch to the interface carrying that information, and it improves the intelligence of human-computer interaction.
Drawings
The above and other objects, features and advantages of exemplary embodiments of the present invention will become readily apparent from the following detailed description read in conjunction with the accompanying drawings. Several embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
fig. 1A schematically illustrates an application scenario of an information processing method and system according to an embodiment of the present invention;
FIG. 1B schematically shows an application interface diagram of the information processing method and system according to an embodiment of the invention;
FIG. 2 schematically shows a first flow chart of an information processing method according to an embodiment of the invention;
FIG. 3A schematically illustrates a second flow chart of an information processing method according to an embodiment of the invention;
FIG. 3B schematically shows a flow chart three of an information processing method according to an embodiment of the present invention;
FIG. 3C is a flowchart schematically illustrating detecting a first operation gesture with respect to a first object in the current interface to obtain a first detection result, according to an embodiment of the present invention;
FIG. 3D schematically shows a fourth flowchart of an information processing method according to an embodiment of the invention;
FIG. 3E schematically shows a fifth flowchart of an information processing method according to an embodiment of the invention;
FIGS. 4A to 4E are schematic diagrams showing changes in the interface of the information processing method according to the embodiment of the present invention;
FIG. 5A schematically shows a sixth flowchart of an information processing method according to an embodiment of the present invention;
FIG. 5B schematically shows a seventh flowchart of an information processing method according to an embodiment of the invention;
FIG. 5C schematically shows a flow chart eight of an information processing method according to an embodiment of the present invention;
FIG. 5D schematically shows a flowchart for detecting a third operation gesture on a second object in the current interface to obtain a second detection result, according to an embodiment of the present invention;
FIG. 5E schematically shows a flow diagram nine of an information processing method according to an embodiment of the invention;
FIG. 5F schematically shows a flow chart ten of an information processing method according to an embodiment of the invention;
FIGS. 6A to 6E are schematic diagrams showing changes in the interface of the information processing method according to the embodiment of the present invention;
FIG. 7 schematically shows a first block diagram of an information handling system according to an embodiment of the present invention;
FIG. 8A schematically illustrates a block diagram two of an information handling system according to an embodiment of the present invention;
FIG. 8B schematically shows a third block diagram of an information handling system according to an embodiment of the present invention;
FIG. 8C schematically illustrates a block diagram of a first detection module according to an embodiment of the invention;
FIG. 8D schematically shows a fourth block diagram of an information handling system according to an embodiment of the present invention;
FIG. 8E schematically shows a block diagram five of an information handling system according to an embodiment of the present invention;
FIG. 9A schematically shows a block diagram six of an information handling system according to an embodiment of the present invention;
FIG. 9B schematically shows a block diagram of a first detection module according to an embodiment of the invention;
FIG. 9C schematically shows a block diagram seven of an information handling system according to an embodiment of the present invention;
FIG. 9D schematically illustrates a block diagram of a fourth detection module according to an embodiment of the invention;
FIG. 9E schematically illustrates a block diagram of a first detection module in accordance with an embodiment of the present invention;
FIG. 9F schematically shows a block diagram eight of an information handling system according to an embodiment of the present invention;
FIG. 10 schematically shows a schematic view of a computer-readable storage medium product according to an embodiment of the invention; and
FIG. 11 schematically shows a block diagram of a computing device according to an embodiment of the invention.
In the drawings, the same or corresponding reference numerals indicate the same or corresponding parts.
Detailed Description
The principles and spirit of the present invention will be described with reference to a number of exemplary embodiments. It is understood that these embodiments are given solely for the purpose of enabling those skilled in the art to better understand and to practice the invention, and are not intended to limit the scope of the invention in any way. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
As will be appreciated by one skilled in the art, embodiments of the present invention may be embodied as a system, apparatus, device, method, or computer program product. Thus, the present invention may be embodied in the form of: entirely hardware, entirely software (including firmware, resident software, micro-code, etc.), or a combination of hardware and software.
According to an embodiment of the present invention, a method, a medium, a system (apparatus) and a computing device for information processing are provided.
In this context, it is to be understood that the terminology involved in the present invention includes the APP client, the song mini-player, the song play entry, the song play bar, and the icon. The APP client is an application built on a mobile operating system, such as the iPhone version of an online music APP. When the user opens an interface of the APP client, the client sends a request to the product server, the server returns data to the client, and the final interface is presented to the user. When the user performs an input operation on an interface of the APP client, the client sends data to the product server, and the product server stores it. The song mini-player is a resident entry on a music APP that carries the currently playing song; it can show the playing state of the current song, such as playing or paused, and lets the user enter the song detail page at any time. The song play entry is one form of the song mini-player: a single click enters the song play detail page, for example the music playing icon at the top right corner of the navigation bar in the Cloud Music iPhone client, or the music playing icon in the bottom navigation of Xiami ("shrimp") Music. The song play bar is another form of the song mini-player that extends to other information and operations, such as the song name, singer name, lyrics, pause, previous, next, play, and menu. An icon is an icon-format element used for system icons, software icons, and the like. Moreover, any number of elements in the drawings is given by way of example and not by way of limitation, and any naming is used solely for differentiation and not by way of limitation.
The principles and spirit of the present invention are explained in detail below with reference to several representative embodiments of the invention.
Summary of The Invention
In implementing the concept of the present invention, the inventors found at least the following problems in the related art: on the one hand, the user must click the entry icon to enter the interface carrying the specific information; on the other hand, the cost of entering that interface through the click action is leaving the currently presented interface, which the user may not want to do, so the user experience is noticeably degraded.
An embodiment of the present invention provides an information processing method, including: detecting a first operation gesture aiming at a first object in a current interface to obtain a first detection result, wherein the first object is presented at a first position of the current interface, and an interface carrying specific information can be switched and displayed by clicking the first object; and under the condition that the first detection result shows that the first operation gesture accords with a first preset rule, presenting a second object at a second position of the current interface, wherein the second object displays part or all of the specific information.
Having described the general principles of the invention, various non-limiting embodiments of the invention are described in detail below.
Application scene overview
Referring first to fig. 1A, fig. 1A schematically illustrates an application scenario diagram 100 of an information processing method and system according to an embodiment of the present invention. It should be noted that fig. 1A is only an example of an application scenario diagram in which the embodiment of the present invention may be applied to help those skilled in the art understand the technical content of the present invention, and does not mean that the embodiment of the present invention may not be applied to other devices, systems, environments or scenarios.
As shown in fig. 1A, the application scenario diagram 100 according to the embodiment may include terminal devices 101, 102, 103, a network 104 and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, and so forth.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. The terminal devices 101, 102, 103 may have installed thereon various communication client applications, such as shopping-like applications, web browser applications, search-like applications, instant messaging tools, mailbox clients, social platform software, etc. (by way of example only).
The terminal devices 101, 102, 103 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like.
The server 105 may be a server providing various services, such as a background management server (for example only) providing support for websites browsed by users using the terminal devices 101, 102, 103. The background management server may analyze and perform other processing on the received data such as the user request, and feed back a processing result (e.g., a webpage, information, or data obtained or generated according to the user request) to the terminal device.
It should be noted that the information processing method provided in the embodiment of the present invention may be generally executed by the terminal device 101, 102, or 103, or may also be executed by another terminal device different from the terminal device 101, 102, or 103. Accordingly, the information processing system provided by the embodiment of the present invention may also be provided in the terminal device 101, 102, or 103, or in another terminal device different from the terminal device 101, 102, or 103.
It should be understood that the number of terminal devices, networks, and servers in FIG. 1A are merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Fig. 1B schematically shows an application interface diagram 111 of the information processing method and system according to the embodiment of the present invention. It should be noted that fig. 1B is only an example of an application interface to which the embodiment of the present invention may be applied to help those skilled in the art understand the technical content of the present invention, and it does not mean that the embodiment of the present invention may not be applied to other application interfaces.
The information processing method provided by the embodiment of the invention is suitable for the client application program. The client application may be any type of application installed in the electronic device, such as a music application, a reading application, or a video playing application, and the present invention is not limited thereto.
It should be noted that the display interface of the client application shown in fig. 1B has different presentation forms to meet different user requirements. Taking a music application as an example, the user may be offered two presentation modes: a song play entry 112 and a song play bar 113. The song play entry 112 is small, does not occupy much of the page, and leaves as much room as possible for content display, but it can only serve as an entry: if the user wants to view song playing details or control the song, the user can only switch from the current page into the song detail page. The song play bar 113 is larger and occupies more of the page, but it can show necessary or frequently used information for playing and controlling songs, so the user is not forced to leave the current interface. For simplicity, the following embodiments are described in detail with the song play entry 112 as the first object and the song play bar 113 as the second object.
It should be noted that the application scenario shown in fig. 1B does not limit the scenarios to which the information processing method and system of the present invention can be adapted, nor does it mean that the present invention cannot be applied to other application scenarios; other scenarios may be developed accordingly in the spirit of the embodiments of the present invention and are not described again here. The application scenarios of this embodiment are merely illustrative and are not intended to limit or narrow the scope of the present invention.
Exemplary method
A method of information processing according to an exemplary embodiment of the present invention is described below with reference to fig. 2, fig. 3A to 3E, fig. 4A to 4E, and fig. 5A to 5F in conjunction with the application interface diagram 111 illustrated in fig. 1B. It should be noted that the above-mentioned application interface schematic diagram is only shown for the convenience of understanding the spirit and principle of the present invention, and the embodiments of the present invention are not limited in any way in this respect. Rather, embodiments of the present invention may be applied to any scenario where applicable. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the invention. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present invention.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have a alone, B alone, C alone, a and B together, a and C together, B and C together, and/or A, B, C together, etc.). Where a convention analogous to "A, B or at least one of C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems that have a alone, B alone, C alone, a and B together, a and C together, B and C together, and/or A, B, C together, etc.).
An embodiment of the present invention provides an information processing method, including: detecting a first operation gesture aiming at a first object in a current interface to obtain a first detection result, wherein the first object is presented at a first position of the current interface, and an interface carrying specific information can be switched and displayed by clicking the first object; and under the condition that the first detection result shows that the first operation gesture accords with a first preset rule, presenting a second object at a second position of the current interface, wherein the second object displays part or all of the specific information.
Fig. 2 schematically shows a first flowchart of an information processing method according to an embodiment of the present invention.
As shown in fig. 2, the method may include operations S210 and S220. Wherein:
in operation S210, a first operation gesture for a first object in a current interface is detected to obtain a first detection result.
In operation S220, in a case that the first detection result indicates that the first operation gesture conforms to the first preset rule, a second object is presented at a second position of the current interface.
According to an exemplary embodiment of the present invention, when a user uses a client APP through an electronic terminal device, the diversified information of the APP is carried on the device's display interface of limited size so that the user can browse, click, and perform other specific operations according to personal preference. The first object and the second object are controls that show information to the user and on which specific operations can be performed; they are different presentation forms of the client APP. The first object is presented at the first position of the current interface, and clicking it switches the display to the interface carrying the specific information, but it can only serve as an entry: if the user wants to view song playing details or control the song, the user can only switch from the current page into the song detail page. The second object is presented at the second position of the current interface, displays part or all of the specific information, and can show necessary or frequently used information for playing and controlling songs.
It should be noted that the first location where the first object is located and the second location where the second object is located may be different according to the design of the client APP, and the display locations of the first object and the second object are not specifically limited in the present invention.
According to the embodiment of the invention, a first operation gesture can be executed on the first object, and under the condition that the first operation gesture conforms to a first preset rule, the second object is presented at the second position of the current interface, so that switching from the first object to the second object is realized.
In conjunction with the application scenario shown in fig. 1B, the first object may be the song play entry of a music player, displayed as an icon at the first position of the current interface, such as the upper right corner of the navigation bar or the bottom navigation bar; clicking the song play entry switches the display to the song play detail page. The second object may be the song play bar, which displays part or all of the above specific information and can show necessary or frequently used information for playing and controlling songs, such as the song name, singer name, lyrics, pause, previous, next, play, and menu.
According to the embodiment of the present invention, when the detected first operation gesture for the first object in the current interface conforms to the first preset rule, the second object is presented at the second position of the current interface. Thus, without leaving the current interface, a gesture operation conforming to the preset rule freely switches the first object to the second object, improving the intelligence of human-computer interaction.
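As a rough illustration of how operations S210 and S220 might fit together in a client APP, consider the following Kotlin sketch; the types, identifiers, and the idea of injecting the first preset rule as a predicate are assumptions made for illustration, not a definitive implementation.

```kotlin
// Minimal sketch of operations S210 and S220. All names are illustrative assumptions.
data class DragGesture(
    val targetId: String,        // which control the gesture acts on
    val distancePx: Float,       // how far the control has been dragged
    val dropRegionId: String?    // region the control was released over, if any
)

class SwitchController(
    // The "first preset rule" is injected as a predicate so that any of the checks
    // described later (drag distance, drop region, or both) can be plugged in.
    private val firstPresetRule: (DragGesture) -> Boolean
) {
    // "first_object": entry icon at the first position; "second_object": play bar at the second position.
    var currentForm: String = "first_object"
        private set

    /** S210 detects the gesture on the first object; S220 presents the second object
     *  (and stops displaying the first object) when the rule is met. */
    fun onGestureReleased(gesture: DragGesture) {
        if (gesture.targetId == "first_object" && firstPresetRule(gesture)) {
            currentForm = "second_object"
        }
    }
}

// Example usage with a distance-based rule (the 200 px value is an arbitrary assumption):
fun main() {
    val controller = SwitchController { it.distancePx >= 200f }
    controller.onGestureReleased(DragGesture("first_object", 240f, dropRegionId = null))
    println(controller.currentForm)   // prints "second_object"
}
```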
Fig. 3A schematically shows a flow chart two of an information processing method according to an embodiment of the present invention.
As shown in fig. 3A, the method includes operations S311 and S312 before the aforementioned operation S210 (detecting a first operation gesture for a first object in the current interface to obtain a first detection result), in addition to the aforementioned operations S210 and S220. Wherein:
in operation S311, it is detected whether there is a second operation gesture with respect to the first object in the current interface.
In operation S312, in a case that the presence of the second operation gesture is detected, the detection of the first operation gesture for the first object in the current interface is started.
According to an embodiment of the present invention, the second operation gesture is one of the following: a continuous press whose duration exceeds a first threshold, or at least two clicks within a time window of a second threshold.
According to the embodiment of the present invention, when a second operation gesture whose continuous pressing time exceeds the first threshold exists, or when a second operation gesture with at least two clicks within the time window of the second threshold exists, detection of the first operation gesture for the first object in the current interface begins. The second operation gesture puts the first object icon into an activated state, and the icon in the activated state supports the first operation gesture, i.e., the drag operation.
With this embodiment of the present invention, the first operation gesture for the first object in the current interface is detected only when the second operation gesture exists, which effectively avoids mistaken switching caused by accidental user operations and saves the computing resources of the electronic device.
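A sketch of how this activating "second operation gesture" could be recognized is shown below; the concrete threshold values and the class name are assumptions, since the text only requires a first threshold for press duration and a second threshold for the multi-click time window.

```kotlin
// Illustrative recognizer for the second operation gesture; threshold values are assumed.
class ActivationDetector(
    private val longPressThresholdMs: Long = 500L,  // "first threshold" (assumed value)
    private val multiTapWindowMs: Long = 300L       // "second threshold" (assumed value)
) {
    private val tapTimestamps = ArrayDeque<Long>()

    /** Form 1: the press has lasted longer than the first threshold. */
    fun isLongPress(pressDurationMs: Long): Boolean =
        pressDurationMs > longPressThresholdMs

    /** Form 2: at least two taps fall inside the second-threshold time window.
     *  Returns true when the tap just registered completes such a sequence. */
    fun registerTap(timestampMs: Long): Boolean {
        tapTimestamps.addLast(timestampMs)
        while (tapTimestamps.isNotEmpty() &&
            timestampMs - tapTimestamps.first() > multiTapWindowMs
        ) {
            tapTimestamps.removeFirst()
        }
        return tapTimestamps.size >= 2
    }
}
```

Either form returning true would put the first object icon into the activated state, after which detection of the drag (the first operation gesture) begins.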
Fig. 3B schematically shows a flowchart three of an information processing method according to an embodiment of the present invention.
As shown in fig. 3B, in addition to the aforementioned operations S210, S220, S311, and S312, in the case that the presence of the second operation gesture is detected, the method further includes an operation S321: and displaying the first prompt message at a third position of the current interface.
According to the embodiment of the present invention, the first prompt information prompts the user to perform the first operation gesture conforming to the first preset rule so as to present the second object at the second position of the current interface, for example a drag-destination guidance prompt such as "drag to this area" appearing at the bottom of the current interface.
According to the embodiment of the present invention, when the second operation gesture is detected, the first prompt information is displayed at the third position of the current interface, prompting the user to perform the first operation gesture that conforms to the first preset rule so that the second object is presented at the second position. This gives the user clear guidance and makes the interaction between the user and the electronic device smoother.
Fig. 3C schematically shows a flowchart for detecting a first operation gesture for a first object in the current interface to obtain a first detection result according to an embodiment of the present invention.
As shown in fig. 3C, the aforementioned operation S210 (detecting a first operation gesture for a first object in the current interface to obtain a first detection result) may include operations S331 and/or S332. Wherein:
in operation S331, it is detected whether a first dragging distance corresponding to the first dragging operation reaches a first preset distance.
In operation S332, it is detected whether the first object is dragged by the first dragging operation within a first preset range.
According to the embodiment of the present invention, the first operation gesture includes a first drag operation, and there are three methods for detecting the first operation gesture for the first object in the current interface to obtain the first detection result; any one of them may be selected according to the actual situation, and the present invention is not limited in this respect.
First method: detect whether the first drag distance corresponding to the first drag operation reaches the first preset distance; when it does, the first detection result indicates that the first operation gesture conforms to the first preset rule. Correspondingly, when the first drag distance does not reach the first preset distance, the first detection result indicates that the first operation gesture does not conform to the first preset rule.
Second method: detect whether the first object is dragged into the first preset range by the first drag operation, the first preset range being the range in which the second object is to be displayed; when it is, the first detection result indicates that the first operation gesture conforms to the first preset rule. Correspondingly, when the first object is not dragged into the first preset range, the first detection result indicates that the first operation gesture does not conform to the first preset rule.
Third method: detect both whether the first drag distance corresponding to the first drag operation reaches the first preset distance and whether the first object is dragged into the first preset range by the first drag operation; when both conditions are met, the first detection result indicates that the first operation gesture conforms to the first preset rule. Otherwise, the first detection result indicates that the first operation gesture does not conform to the first preset rule.
According to the embodiment of the present invention, whether the first operation gesture conforms to the first preset rule is detected by checking whether the first drag distance of the first drag operation reaches the first preset distance and/or whether the first object is dragged by the first drag operation into the first preset range where the second object is to be displayed. Providing several detection methods for determining whether the first gesture conforms to the first preset rule improves the accuracy of the detection result.
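The three checks can be written as three small predicates; the sketch below assumes, purely for brevity, that the first preset range is a one-dimensional band of vertical coordinates at the bottom of the screen, and every name and value in it is an illustrative assumption.

```kotlin
// Illustrative predicates for the three detection methods; values and the 1-D region are assumptions.
data class DragState(val distancePx: Float, val dropY: Float)

class FirstRuleChecker(
    private val presetDistancePx: Float,
    private val presetRegionY: ClosedFloatingPointRange<Float> // where the second object will appear
) {
    /** Method 1: only the drag distance is checked. */
    fun byDistance(drag: DragState): Boolean = drag.distancePx >= presetDistancePx

    /** Method 2: only the drop position is checked against the first preset range. */
    fun byRegion(drag: DragState): Boolean = drag.dropY in presetRegionY

    /** Method 3: both conditions must hold. */
    fun byBoth(drag: DragState): Boolean = byDistance(drag) && byRegion(drag)
}
```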
Fig. 3D schematically shows a fourth flowchart of an information processing method according to an embodiment of the present invention.
As shown in fig. 3D, in addition to the foregoing operations S210 and S220, in the case that the first detection result indicates that the first operation gesture conforms to the first preset rule, the method further includes operations S341: and displaying the second prompt message at the fourth position of the current interface.
According to the embodiment of the present invention, when the first detection result indicates that the first operation gesture conforms to the first preset rule, second prompt information is displayed at the fourth position of the current interface, prompting the user to release the first operation gesture so that the second object is presented at the second position. For example, when the first object is dragged near the guide area by the first operation gesture, a generation guidance prompt such as "release to generate the play bar" appears; once this prompt appears, releasing the finger generates the song play bar.
With this embodiment of the present invention, displaying at the fourth position of the current interface the second prompt information, which prompts the user to release the first operation gesture so that the second object is presented at the second position, gives the user a clear action guide, leads the user to the next operation, and improves the interactive experience.
Fig. 3E schematically shows a flow chart five of an information processing method according to an embodiment of the present invention.
As shown in fig. 3E, the method includes, in addition to the aforementioned operations S210 and S220, an operation S351: and under the condition that the first detection result shows that the first operation gesture does not accord with the first preset rule, presenting the first object at the first position of the current interface, and simultaneously not displaying the second object at the second position.
According to the embodiment of the present invention, when the first drag distance of the first drag operation does not reach the first preset distance, and/or the first object is not dragged by the first drag operation into the first preset range where the second object is to be displayed, the first operation gesture does not conform to the first preset rule. In that case, the first object continues to be presented at the first position of the current interface and the second object is not displayed at the second position; that is, the switch from the first object to the second object fails to execute, and the first object automatically reverts to the default, non-activated state.
According to the embodiment of the invention, under the condition that the first detection result shows that the first operation gesture does not accord with the first preset rule, the first object is presented at the first position of the current interface, and the second object is not displayed at the second position, so that a user can cancel the switching from the first object to the second object according to the actual condition, and the misoperation of the user is effectively avoided.
According to the embodiment of the invention, the first object is not displayed on the current interface any more while the second object is presented at the second position of the current interface.
Fig. 4A to 4E schematically show interface change diagrams of the information processing method according to the embodiment of the present invention.
As shown in the figures, the interface changes of the information processing method of the present invention are described by taking as an example a switching process in which the first object, located at the first position (upper right corner) of the current interface, is a song play entry and the second object, located at the second position (bottom) of the current interface, is a song play bar:
1) as shown in FIG. 4A, long-pressing the icon of the song playing entry in the upper right corner causes it to become active, and the icon in the active state can support dragging.
2) As shown in fig. 4B, a drag-destination guidance prompt of "drag to this area" appears at the bottom when dragging.
3) As shown in fig. 4C, when the drag is continued to the vicinity of the guide area, a generation guide prompt of "release generation play bar" appears.
4) As shown in FIG. 4D, if the finger is released before the state in 3) is reached, the icon automatically reverts to the default inactive state.
5) As shown in fig. 4E, once the generation guide prompt appears, releasing the finger generates the song play bar. The generated song play bar contains the album cover, song title, singer name, play/pause button, and play-list button. Sliding left or right switches to the previous/next song.
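A minimal sketch of the interaction flow in figs. 4A to 4E is given below, assuming a simple state machine; the states, class name and callback names are hypothetical and only summarize the five steps listed above.

```kotlin
// Hypothetical state machine for the flow in Figs. 4A-4E; names are assumptions.
enum class EntryDragState { INACTIVE, ACTIVE, DRAGGING, READY_TO_GENERATE }

class PlayEntryController(
    private val showDragHint: () -> Unit,        // "drag to this area" (Fig. 4B)
    private val showReleaseHint: () -> Unit,     // "release to generate the play bar" (Fig. 4C)
    private val generatePlayBar: () -> Unit,     // Fig. 4E
    private val resetToInactive: () -> Unit      // Fig. 4D
) {
    var state = EntryDragState.INACTIVE
        private set

    fun onLongPress() {                          // Fig. 4A: long press activates the entry icon
        state = EntryDragState.ACTIVE
    }

    fun onDrag(insideGuideArea: Boolean) {
        if (state == EntryDragState.INACTIVE) return
        state = if (insideGuideArea) EntryDragState.READY_TO_GENERATE else EntryDragState.DRAGGING
        if (insideGuideArea) showReleaseHint() else showDragHint()
    }

    fun onRelease() {
        if (state == EntryDragState.READY_TO_GENERATE) generatePlayBar() else resetToInactive()
        state = EntryDragState.INACTIVE
    }
}
```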
Fig. 5A schematically shows a sixth flowchart of an information processing method according to an embodiment of the present invention.
As shown in fig. 5A, the method includes operations S511 and S512 in addition to the aforementioned operations S210 and S220. Wherein:
in operation S511, a third operation gesture for a second object in the current interface is detected to obtain a second detection result.
In operation S512, in a case that the second detection result indicates that the third operation gesture conforms to the second preset rule, the first object is presented at the first position of the current interface.
According to the embodiment of the invention, not only the free switching from the first object to the second object can be realized, but also the free switching from the second object to the first object can be realized. As mentioned above, the second object is presented at the second position of the current interface and displays some or all of the specific information, and clicking the first object can switch to display the interface carrying the specific information.
According to the embodiment of the invention, the free switching from the second object to the first object can be realized under the condition that the second object is displayed on the current interface, and the condition that the second object is displayed on the current interface can be that the second object is displayed on the current interface under the default condition, or the second object is displayed on the current interface after the free switching from the first object to the second object is realized.
According to the embodiment of the invention, a third operation gesture can be executed on the second object, and the first object is presented at the first position of the current interface under the condition that the third operation gesture conforms to a second preset rule, so that switching from the second object to the first object is realized.
According to the embodiment of the invention, free switching between the first object and the second object is realized through gesture operation; that is, according to the operation gesture of the user, switching from the first object to the second object and switching from the second object back to the first object can both be realized. This allows user-defined operation according to the user's preference and multi-form switching, so that the interactive experience between the user and the electronic device is better; at the same time, it avoids the problem that the display position of the second object lies outside the thumb hot zone in the one-handed operation state and is therefore inconvenient to operate.
Fig. 5B schematically shows a flowchart seven of an information processing method according to an embodiment of the present invention.
As shown in fig. 5B, the method may further include operations S521 and S522 in addition to the aforementioned operations S210, S220, S511, and S512. Wherein:
in operation S521, it is detected whether a fourth operation gesture for the second object in the current interface exists.
In operation S522, in a case where it is detected that the fourth operation gesture exists, the third operation gesture for detecting the second object in the current interface is started.
According to an embodiment of the present invention, the fourth operation gesture is one of the following: the continuous pressing time exceeds a third threshold, or the number of clicks within a time range of a fourth threshold is at least two.
According to the embodiment of the invention, in the case that a fourth operation gesture with the continuous pressing time exceeding a third threshold exists, or in the case that the fourth operation gesture with the clicking times at least twice within the time range of the fourth threshold exists, the third operation gesture for detecting the second object in the current interface is started. The fourth operation gesture may cause the second object to be in an activated state, and the second object in the activated state may support a drag operation.
By the embodiment of the invention, under the condition that the fourth operation gesture exists, the third operation gesture aiming at the second object in the current interface is detected, so that the error switching caused by the error operation of the user can be effectively avoided, and the computing resource of the electronic equipment is saved.
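As a rough illustration only, the two activation gestures (continuous pressing exceeding the third threshold, or at least two taps within the fourth-threshold window) could be detected along the following lines; the class name, method names and threshold values shown are placeholders, not values taken from the embodiment.

```kotlin
// Hypothetical sketch of the two activation gestures described above.
class ActivationGestureDetector(
    private val longPressThresholdMs: Long = 500L,   // placeholder for the "third threshold"
    private val multiTapWindowMs: Long = 300L        // placeholder for the "fourth threshold"
) {
    private var pressStartMs = 0L
    private var lastTapMs = 0L

    fun onPressDown(nowMs: Long) { pressStartMs = nowMs }

    /** Long press: the continuous pressing time exceeds the threshold. */
    fun isLongPress(nowMs: Long): Boolean = nowMs - pressStartMs >= longPressThresholdMs

    /** Multi-tap: returns true once at least two taps fall within the time window. */
    fun onTap(nowMs: Long): Boolean {
        val isMultiTap = nowMs - lastTapMs <= multiTapWindowMs
        lastTapMs = nowMs
        return isMultiTap
    }
}
```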
Fig. 5C schematically shows a flowchart eight of an information processing method according to an embodiment of the present invention.
As shown in fig. 5C, in addition to the aforementioned operations S210, S220, S511, S512, S521 and S522, in the case that the presence of the fourth operation gesture is detected, the method may further include operation S531: displaying the third prompt information at a fifth position of the current interface.
According to the embodiment of the invention, the third prompt information is used for prompting the user to execute a third operation gesture conforming to the second preset rule so as to present the first object at the first position of the current interface. For example, a drag-destination guidance prompt of "drag to this area" may appear in the upper right corner of the navigation bar during dragging.
According to the embodiment of the invention, under the condition that the fourth operation gesture is detected, the third prompt message is displayed at the fifth position of the current interface, so that the user can be prompted to execute the third operation gesture conforming to the second preset rule to present the first object at the first position of the current interface, clear guide information is provided for the user, and the interaction experience between the user and the electronic equipment is better.
FIG. 5D schematically shows a flowchart for detecting a third operation gesture on a second object in the current interface to obtain a second detection result according to the embodiment of the present invention.
As shown in fig. 5D, the aforementioned operation S511 (detecting a third operation gesture for a second object in the current interface to obtain a second detection result) may include operations S541 and S542. Wherein:
in operation S541, it is detected whether a second dragging distance corresponding to the second dragging operation reaches a second preset distance.
In operation S542, it is detected whether the second object is dragged within a second preset range by the second drag operation.
According to the embodiment of the present invention, the third operation gesture includes the second drag operation, and there are three methods for detecting the third operation gesture for the second object in the current interface to obtain the second detection result, which may be selected according to the actual situation, and the present invention is not limited thereto.
The first method comprises the following steps: whether a second dragging distance corresponding to the second dragging operation reaches a second preset distance or not can be detected, and under the condition that the second dragging distance reaches the second preset distance, a second detection result shows that the third operation gesture accords with a second preset rule. Correspondingly, under the condition that the second dragging distance does not reach the second preset distance, the second detection result shows that the third operation gesture does not accord with the second preset rule.
And the second method comprises the following steps: the method may only detect whether the second object is dragged by the second dragging operation into a second preset range, where the second preset range is a range in which the first object is to be displayed, and the second detection result indicates that the third operation gesture conforms to a second preset rule under the condition that the second object is dragged by the second dragging operation into the second preset range. Correspondingly, under the condition that the second object is not dragged to be within the second preset range by the second dragging operation, the second detection result shows that the third operation gesture does not accord with the second preset rule.
The third method: it is possible to detect both whether the second drag distance corresponding to the second drag operation reaches the second preset distance and whether the second object is dragged into the second preset range by the second drag operation; in the case that both conditions are satisfied, the second detection result indicates that the third operation gesture conforms to the second preset rule. Correspondingly, in the case that at least one of the two conditions is not satisfied, the second detection result indicates that the third operation gesture does not conform to the second preset rule.
According to the embodiment of the invention, whether the third operation gesture conforms to the second preset rule is detected by checking whether the second drag distance corresponding to the second drag operation reaches the second preset distance and/or whether the second object is dragged by the second drag operation into the second preset range in which the first object is to be displayed. A plurality of detection methods are thus provided for obtaining the detection result of whether the third operation gesture conforms to the second preset rule, which improves the accuracy of the detection result.
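For completeness, a hedged sketch of the reverse-direction check (mirroring the forward check shown earlier, using only the combined third strategy) might look as follows; the function name and parameters are assumptions introduced for illustration.

```kotlin
// Hypothetical mirror of the forward check: both the "second preset distance"
// and the "second preset range" (where the first object reappears) must be met.
fun conformsToSecondRule(
    dragDistance: Float,                    // measured length of the second drag operation
    dropX: Float, dropY: Float,             // where the second object is released
    secondPresetDistance: Float,
    entryLeft: Float, entryTop: Float,      // bounds of the range in which the
    entryRight: Float, entryBottom: Float   // first object is to be displayed
): Boolean {
    val distanceOk = dragDistance >= secondPresetDistance
    val targetOk = dropX in entryLeft..entryRight && dropY in entryTop..entryBottom
    return distanceOk && targetOk
}
```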
Fig. 5E schematically shows a flowchart nine of an information processing method according to an embodiment of the present invention.
As shown in fig. 5E, in addition to the foregoing operations S210, S220, S511, and S512, in the case that the second detection result indicates that the third operation gesture conforms to the second preset rule, the method may further include operation S551: displaying the fourth prompt information at a sixth position of the current interface.
According to the embodiment of the invention, the fourth prompt information is used for prompting the user to release the third operation gesture so as to present the first object at the first position of the current interface. In the case that the second detection result indicates that the third operation gesture conforms to the second preset rule, the fourth prompt information is displayed at the sixth position of the current interface. For example, if the second object is dragged near the guide area by the third operation gesture, a generation guide prompt such as "release to generate the song play entry" appears to prompt the user that releasing the gesture will generate the song play entry; once this prompt is displayed, releasing the finger generates the song play entry. The generated song play entry occupies little space, and a single click enters the song playing detail page without any additional operation.
According to the embodiment of the invention, the fourth prompt information for prompting the user to release the third operation gesture to present the first object at the first position of the current interface is displayed at the sixth position of the current interface, so that a clear action guide prompt can be provided for the user, the user is guided to execute the next operation, and the interactive experience of the user is improved.
Fig. 5F schematically shows a flowchart ten of an information processing method according to an embodiment of the present invention.
As shown in fig. 5F, in addition to the aforementioned operations S210, S220, S511, and S512, the method may include operation S561: in the case that the second detection result indicates that the third operation gesture does not conform to the second preset rule, presenting the second object at the second position of the current interface while not displaying the first object at the first position.
According to the embodiment of the invention, in the case that the second drag distance corresponding to the second drag operation does not reach the second preset distance and/or the second object is not dragged by the second drag operation into the second preset range, the third operation gesture does not conform to the second preset rule. In this case the second object continues to be presented at the second position of the current interface and the first object is not displayed at the first position; that is, the switching from the second object to the first object fails, and the second object automatically reverts to the default inactive state.
According to the embodiment of the invention, under the condition that the second detection result shows that the third operation gesture does not accord with the second preset rule, the second object is presented at the second position of the current interface, and the first object is not displayed at the first position, so that a user can cancel the switching from the second object to the first object according to the actual condition, and the misoperation of the user is effectively avoided.
According to the embodiment of the invention, the first object is presented at the first position of the current interface, and meanwhile, the second object is not displayed on the current interface.
Fig. 6A to 6E schematically show interface change diagrams of an information processing method according to still another embodiment of the present invention.
As shown in the figure, the information processing method of the present invention is described by taking a switching process in which the second object is a song play bar and the first object is a song play entry as an example.
1) As shown in fig. 6A, a long press on the bottom song play bar causes the play bar to become active. The active playbar may support dragging.
2) As shown in fig. 6B, the drag destination guidance prompt appears in the upper right corner of the navigation bar when dragged.
3) As shown in fig. 6C, when the drag is continued to the vicinity of the guide area, the drag-destination guide prompt becomes the generation guide prompt.
4) As shown in fig. 6D, if the finger is released before the drag-destination guidance prompt has changed into the generation guidance prompt, the play bar automatically returns to the default inactive state.
5) As shown in fig. 6E, once the generation guidance prompt appears, releasing the finger generates the song play entry. The generated song play entry occupies little space, and a single click enters the song playing detail page without any additional operation.
According to an embodiment of the invention, in the method of any one of the above: the first object is an audio playing page entry key, wherein the specific information comprises audio playing details; the second object is an audio play bar, wherein part or all of the specific information comprises at least one of play, pause, fast forward, fast backward, volume up, volume down, next or previous key.
Through the embodiment of the invention, the page which displays more interactive information than the second object can be switched to by clicking the first object, so that richer interaction possibilities can be provided for a user, and the user experience is better.
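Purely as an illustration of the two objects described above, a hypothetical data model might look like the following; all field names are assumptions based on the description of figs. 4A to 4E and are not identifiers from the embodiment.

```kotlin
// Hypothetical models for the two switchable objects.
data class PlayEntry(
    val iconUrl: String,                 // small entry key shown in the upper right corner
    val opensDetailPage: Boolean = true  // a single click opens the audio playing detail page
)

data class PlayBar(
    val albumCoverUrl: String,
    val songTitle: String,
    val artistName: String,
    val isPlaying: Boolean,              // toggled by the play/pause control
    val hasPlaylistButton: Boolean = true
)
```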
Exemplary devices
Having described exemplary methods of exemplary embodiments of the present invention, an information processing system for implementing the information processing method of exemplary embodiments of the present invention will next be described in detail with reference to fig. 7, 8A to 8E, and 9A to 9F.
FIG. 7 schematically shows a first block diagram of an information handling system according to an embodiment of the present invention.
As shown in fig. 7, the information processing system 700 may include a first detection module 710 and a first rendering module 720. Wherein:
the first detection module 710 is configured to detect a first operation gesture for a first object in the current interface to obtain a first detection result.
And the first presenting module 720 is configured to present the second object at the second position of the current interface when the first detection result indicates that the first operation gesture conforms to the first preset rule.
The first rendering module 720 is further configured to: the first object is no longer displayed in the current interface while the second object is presented in the second position of the current interface.
According to the embodiment of the invention, under the condition that the detected first operation gesture for the first object in the current interface accords with the first preset rule, the second object is presented at the second position of the current interface, and meanwhile, the first object is not displayed on the current interface any more, so that the first object is freely switched to the second object by utilizing the gesture operation which accords with the preset rule under the condition that the first operation gesture does not leave the current interface, and the intelligent experience of man-machine interaction is improved.
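A non-authoritative sketch of how the detection/presentation split of the first detection module 710 and the first presenting module 720 might be expressed as Kotlin interfaces is given below; the interface and method names, and the gesture payload, are assumptions for illustration only.

```kotlin
// Minimal gesture payload used by this sketch.
data class OperationGesture(val startX: Float, val startY: Float, val endX: Float, val endY: Float)

// Hypothetical counterpart of the first detection module 710.
interface FirstDetectionModule {
    /** Detects the first operation gesture on the first object and returns whether it conforms to the first preset rule. */
    fun detect(gesture: OperationGesture): Boolean
}

// Hypothetical counterpart of the first presenting module 720.
interface FirstPresentationModule {
    /** Presents the second object at the second position and stops displaying the first object on the current interface. */
    fun switchToSecondObject()
}
```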
FIG. 8A schematically shows a block diagram two of an information handling system according to an embodiment of the present invention.
As shown in fig. 8A, the information processing system 700 may include a second detection module 811 and a third detection module 812 in addition to the first detection module 710 and the first rendering module 720. Wherein:
the second detecting module 811 is configured to detect whether a second operation gesture for the first object in the current interface exists.
And a third detecting module 812, configured to, in a case that it is detected that the second operation gesture exists, start to detect the first operation gesture for the first object in the current interface.
According to an embodiment of the present invention, the second operation gesture is one of the following: the continuous pressing time exceeds a first threshold value, and the number of clicks in a time range of a second threshold value is at least two.
By the embodiment of the invention, the first operation gesture aiming at the first object in the current interface is detected under the condition that the second operation gesture exists, so that the error switching caused by the error operation of the user can be effectively avoided, and the computing resource of the electronic equipment is saved.
FIG. 8B schematically shows a block diagram three of an information handling system according to an embodiment of the present invention.
As shown in fig. 8B, the information processing system 700 may further include, in addition to the first detection module 710, the first presentation module 720, the second detection module 811, and the third detection module 812, a first display module 821 configured to display the first prompt information at the third position of the current interface.
According to the embodiment of the invention, under the condition that the second operation gesture is detected to exist, the first prompt information is displayed at the third position of the current interface, so that the user can be prompted to execute the first operation gesture conforming to the first preset rule to present the second object at the second position of the current interface, clear guide information is provided for the user, and the interaction experience between the user and the electronic equipment is better.
FIG. 8C schematically shows a block diagram of a first detection module according to an embodiment of the invention.
As shown in fig. 8C, the first detection module 710 may include a first detection submodule 831 and/or a second detection submodule 832. Wherein:
the first detecting submodule 831 is configured to detect whether a first dragging distance corresponding to the first dragging operation reaches a first preset distance.
And the second detecting submodule 832 is configured to detect whether the first object is dragged within a first preset range by the first dragging operation.
According to the embodiment of the invention, whether the first operation gesture accords with the first preset rule is detected by utilizing whether the first dragging distance corresponding to the first dragging operation reaches the first preset distance and/or whether the first object is dragged to the first preset range to be displayed by the second object through the first dragging operation, a plurality of detection methods are provided to obtain the detection result whether the first gesture accords with the first preset rule, and the accuracy of the detection result is improved.
FIG. 8D schematically shows a fourth block diagram of an information handling system according to an embodiment of the present invention.
As shown in fig. 8D, the information processing system 700 may further include, in addition to the first detection module 710 and the first presentation module 720, a second display module 841 configured to display the second prompt information at the fourth position of the current interface.
Through the embodiment of the invention, the second prompt information for prompting the user to release the first operation gesture to present the second object at the second position of the current interface is displayed at the fourth position of the current interface, so that clear action guidance prompt can be provided for the user, the user is guided to execute the next operation, and the interactive experience of the user is improved.
FIG. 8E schematically shows a block diagram five of an information handling system according to an embodiment of the present invention.
As shown in fig. 8E, the information processing system 700 may further include, in addition to the first detection module 710 and the first presentation module 720, a second presentation module 851 configured to present the first object at the first position of the current interface and, at the same time, not display the second object at the second position when the first detection result indicates that the first operation gesture does not conform to the first preset rule.
According to the embodiment of the invention, under the condition that the first detection result shows that the first operation gesture does not accord with the first preset rule, the first object is presented at the first position of the current interface, and the second object is not displayed at the second position, so that a user can cancel the switching from the first object to the second object according to the actual condition, and the misoperation of the user is effectively avoided.
FIG. 9A schematically shows a block diagram six of an information handling system according to an embodiment of the present invention.
As shown in fig. 9A, the information processing system 900 may include a fourth detection module 911 and a third presentation module 912 in addition to the first detection module 710 and the first presentation module 720. Wherein:
a fourth detection module 911, configured to detect a third operation gesture for a second object in the current interface to obtain a second detection result.
And the third presenting module 912 is configured to present the first object at the first position of the current interface when the second detection result indicates that the third operation gesture conforms to the second preset rule.
The third rendering module 912 is further configured to: the second object is no longer displayed in the current interface while the first object is presented in the first position of the current interface.
According to the embodiment of the invention, the free switching between the first object and the second object is realized through gesture operation, namely according to the operation gesture of a user, the free switching between the first object and the second object can be realized, the free switching between the second object and the first object can also be realized, the user-defined operation according to the preference of the user is realized, the multi-form switching is realized, the interactive experience between the user and the electronic equipment is better, and meanwhile, the display position of the second object can be avoided, and the second object is positioned outside a thumb hot area in a one-hand operation state and is inconvenient to operate.
FIG. 9B schematically shows another block diagram of an information processing system according to an embodiment of the present invention.
As shown in fig. 9B, the information processing system 900 may include a fifth detection module 921 and a sixth detection module 922 in addition to the first detection module 710, the first presentation module 720, the fourth detection module 911, and the third presentation module 912. Wherein:
the fifth detecting module 921 is configured to detect whether a fourth operation gesture for the second object in the current interface exists.
And a sixth detecting module 922, configured to, in a case that it is detected that the fourth operation gesture exists, start to perform detection of a third operation gesture for the second object in the current interface.
According to an embodiment of the present invention, the fourth operation gesture is one of the following: the continuous pressing time exceeds a third threshold, or the number of clicks within a time range of a fourth threshold is at least two.
By the embodiment of the invention, under the condition that the fourth operation gesture exists, the third operation gesture aiming at the second object in the current interface is detected, so that the error switching caused by the error operation of the user can be effectively avoided, and the computing resource of the electronic equipment is saved.
Fig. 9C schematically shows a block diagram seven of an information processing system according to an embodiment of the present invention.
As shown in fig. 9C, the information processing system 900 may further include, in addition to the first detection module 710, the first presentation module 720, the fourth detection module 911, the third presentation module 912, the fifth detection module 921, and the sixth detection module 922, a third display module 931 configured to display the third prompt information at the fifth position of the current interface.
According to the embodiment of the invention, under the condition that the fourth operation gesture is detected, the third prompt message is displayed at the fifth position of the current interface, so that the user can be prompted to execute the third operation gesture conforming to the second preset rule to present the first object at the first position of the current interface, clear guide information is provided for the user, and the interaction experience between the user and the electronic equipment is better.
FIG. 9D schematically shows a block diagram of a fourth detection module according to an embodiment of the invention.
As shown in fig. 9D, the fourth detection module 911 may include a third detection sub-module 941 and a fourth detection sub-module 942. Wherein:
the third detecting sub-module 941 is configured to detect whether a second dragging distance corresponding to the second dragging operation reaches a second preset distance.
The fourth detecting sub-module 942 is configured to detect whether the second object is dragged to the second preset range by the second dragging operation.
According to the embodiment of the invention, whether the third operation gesture accords with the second preset rule is detected by utilizing whether the second dragging distance corresponding to the second dragging operation reaches the second preset distance and/or whether the second object is dragged to the second preset range to be displayed by the first object through the second dragging operation, so that various detection methods are provided to obtain the detection result whether the third gesture accords with the second preset rule, and the accuracy of the detection result is improved.
FIG. 9E schematically shows another block diagram of an information processing system according to an embodiment of the present invention.
As shown in fig. 9E, the information processing system 900 may further include, in addition to the first detection module 710, the first presentation module 720, the fourth detection module 911, and the third presentation module 912, a fourth display module 951 configured to display, at the sixth position of the current interface, the fourth prompt information for prompting the user to release the third operation gesture so as to present the first object at the first position of the current interface.
According to the embodiment of the invention, the fourth prompt information for prompting the user to release the third operation gesture to present the first object at the first position of the current interface is displayed at the sixth position of the current interface, so that a clear action guide prompt can be provided for the user, the user is guided to execute the next operation, and the interactive experience of the user is improved.
Fig. 9F schematically shows a block diagram eight of an information processing system according to an embodiment of the present invention.
As shown in fig. 9F, the information processing system 900 may further include, in addition to the first detection module 710, the first presentation module 720, the fourth detection module 911, and the third presentation module 912, a fourth presentation module 961 configured to present the second object at the second position of the current interface and, at the same time, not display the first object at the first position when the second detection result indicates that the third operation gesture does not conform to the second preset rule.
According to the embodiment of the invention, under the condition that the second detection result shows that the third operation gesture does not accord with the second preset rule, the second object is presented at the second position of the current interface, and the first object is not displayed at the first position, so that a user can cancel the switching from the second object to the first object according to the actual condition, and the misoperation of the user is effectively avoided.
According to an embodiment of the invention, in the system of any one of the above: the first object is an audio playing page entry key, wherein the specific information comprises audio playing details; the second object is an audio play bar, wherein part or all of the specific information comprises at least one of play, pause, fast forward, fast backward, volume up, volume down, next or previous key.
Through the embodiment of the invention, the page which displays more interactive information than the second object can be switched to by clicking the first object, so that richer interaction possibilities can be provided for a user, and the user experience is better.
According to an exemplary embodiment of the invention, any number of the modules, sub-modules, or at least part of the functionality of any number thereof may be implemented in one module. Any one or more of the modules, sub-modules according to exemplary embodiments of the present invention may be implemented by being divided into a plurality of modules. Any one or more of the modules, sub-modules according to exemplary embodiments of the present invention may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of three implementations, or in any suitable combination of any of these. Alternatively, one or more of the modules, sub-modules according to exemplary embodiments of the invention may be at least partly implemented as computer program modules, which, when executed, may perform corresponding functions.
For example, any number of the first detection module 710, the first presentation module 720, the second detection module 811, the third detection module 812, the first display module 821, the second display module 841, the second presentation module 851, the fourth detection module 911, the third presentation module 912, the fifth detection module 921, the sixth detection module 922, the third display module 931, the fourth display module 951, and the fourth presentation module 961 may be implemented in one module, or any one of them may be divided into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of the other modules and implemented in one module. According to an exemplary embodiment of the present invention, at least one of the first detection module 710, the first presentation module 720, the second detection module 811, the third detection module 812, the first display module 821, the second display module 841, the second presentation module 851, the fourth detection module 911, the third presentation module 912, the fifth detection module 921, the sixth detection module 922, the third display module 931, the fourth display module 951, and the fourth presentation module 961 may be at least partially implemented as a hardware circuit, such as Field Programmable Gate Arrays (FPGAs), Programmable Logic Arrays (PLAs), systems on a chip, systems on a substrate, systems on a package, Application Specific Integrated Circuits (ASICs), or in hardware or firmware in any other reasonable manner by integrating or packaging circuits, or in any one of the three implementations of software, hardware, and firmware, or in any suitable combination of any of them. Alternatively, at least one of the first detecting module 710, the first presenting module 720, the second detecting module 811, the third detecting module 812, the first display module 821, the second display module 841, the second presenting module 851, the fourth detecting module 911, the third presenting module 912, the fifth detecting module 921, the sixth detecting module 922, the third display module 931, the fourth display module 951, and the fourth presenting module 961 may be at least partially implemented as a computer program module that, when executed by a computer, may perform the functions of the respective modules.
Exemplary Medium
Having described exemplary apparatus of exemplary embodiments of the present invention, exemplary media for implementing information processing of exemplary embodiments of the present invention are described in detail below with reference to FIG. 10.
An embodiment of the present invention provides a medium storing computer-executable instructions that, when executed by a processing unit, cause the processing unit to perform any one of the above-described information processing methods in the above-described method embodiments.
In some possible embodiments, the various aspects of the present invention may also be implemented in a program product, which includes program code for causing a device to perform an operation (or step) in the information processing method according to various exemplary embodiments of the present invention described in the above section "exemplary method" of this specification when the program product runs on the device. For example, the device may perform operation S210 as shown in fig. 2, detecting a first operation gesture for a first object in the current interface to obtain a first detection result, and operation S220, in which, when the first detection result indicates that the first operation gesture conforms to the first preset rule, the second object is presented at the second position of the current interface.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
As shown in fig. 10, an information processing program product 100 according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a device, such as a personal computer. However, the program product of the present invention is not limited in this respect, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, or device.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Exemplary computing device
Having described the method, medium, and apparatus of exemplary embodiments of the present invention, a computing device for information processing of exemplary embodiments of the present invention is next described with reference to fig. 11.
The embodiment of the invention also provides the computing equipment. As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," module "or" system.
In some possible embodiments, a computing device according to the present invention may include at least one processing unit and at least one storage unit, wherein the storage unit stores program code which, when executed by the processing unit, causes the processing unit to perform the steps in the information processing methods according to various exemplary embodiments of the present invention described in the above section "exemplary methods" of this specification. For example, the processing unit may perform operation S210 as shown in fig. 2, detecting a first operation gesture with respect to a first object in the current interface to obtain a first detection result, and operation S220, in which, when the first detection result indicates that the first operation gesture conforms to the first preset rule, the second object is presented at the second position of the current interface.
A computing device 110 for information processing according to this embodiment of the present invention is described below with reference to fig. 11. The computing device 110 shown in FIG. 11 is only one example and should not impose any limitations on the functionality or scope of use of embodiments of the present invention.
As shown in fig. 11, computing device 110 is embodied in the form of a general purpose computing device. Components of computing device 110 may include, but are not limited to: the at least one processing unit 1101, the at least one memory unit 1102, and a bus 1103 connecting different system components (including the memory unit 1102 and the processing unit 1101).
The bus 1103 includes an address bus, a data bus, and a control bus.
The storage unit 1102 may include readable media in the form of volatile memory, such as Random Access Memory (RAM) 11021 and/or cache memory 11022, and may further include Read Only Memory (ROM) 11023.
The memory unit 1102 may also include a program/utility 11025 having a set (at least one) of program modules 11024, such program modules 11024 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The computing device 110 may also communicate with one or more external devices 1104 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.); such communication may take place via input/output (I/O) interfaces 1105. Also, the computing device 110 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet) via the network adapter 1106. As shown, the network adapter 1106 communicates with the other modules of the computing device 110 over the bus 1103. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the computing device 110, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
It should be noted that although in the above detailed description several units/modules or sub-units/modules of the apparatus are mentioned, such a division is merely exemplary and not mandatory. Indeed, the features and functionality of two or more of the units/modules described above may be embodied in one unit/module according to embodiments of the invention. Conversely, the features and functions of one unit/module described above may be further divided into embodiments by a plurality of units/modules.
Moreover, while the operations of the method of the invention are depicted in the drawings in a particular order, this does not require or imply that the operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
While the spirit and principles of the invention have been described with reference to several particular embodiments, it is to be understood that the invention is not limited to the particular embodiments disclosed, nor does the division into aspects mean that features in these aspects cannot be combined; this division is for convenience of description only. The invention is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (30)

1. An information processing method comprising:
detecting a first operation gesture aiming at a first object in a current interface to obtain a first detection result, wherein the first object is presented at a first position of the current interface, and clicking the first object can switch and display an interface carrying specific information, the first object is an audio playing page entry key, and the specific information comprises audio playing details;
and under the condition that the first detection result shows that the first operation gesture accords with a first preset rule, presenting a second object at a second position of the current interface, and simultaneously no longer displaying the first object on the current interface, wherein the second object is an audio play bar and displays part of the specific information, and the part of the specific information comprises at least one of playing, pausing, fast forwarding, fast rewinding, volume increasing, volume decreasing, next key or previous key.
2. The method of claim 1, wherein prior to the detecting a first operational gesture with respect to a first object in a current interface to obtain a first detection result, the method further comprises:
detecting whether a second operation gesture aiming at the first object in the current interface exists;
and starting to detect a first operation gesture aiming at a first object in the current interface under the condition that the second operation gesture exists.
3. The method of claim 2, wherein the second operational gesture is one of:
the continuous pressing time exceeds a first threshold;
the number of clicks is at least two within a time frame of a second threshold.
4. The method of claim 2, wherein, in the event that the presence of the second operational gesture is detected, the method further comprises:
displaying first prompt information at a third position of the current interface, wherein the first prompt information is used for prompting a user to execute the first operation gesture conforming to the first preset rule so as to present the second object at a second position of the current interface.
5. The method of claim 1, wherein the first operational gesture comprises a first drag operation, the detecting the first operational gesture for the first object in the current interface to obtain a first detection result comprising:
detecting whether a first dragging distance corresponding to the first dragging operation reaches a first preset distance or not, wherein the first detection result shows that the first operation gesture accords with a first preset rule under the condition that the first dragging distance reaches the first preset distance; and/or
Detecting whether the first object is dragged to a first preset range by the first dragging operation, wherein the second object is displayed in the first preset range, and the first detection result shows that the first operation gesture conforms to the first preset rule under the condition that the first object is dragged to the first preset range by the first dragging operation.
6. The method according to claim 1, wherein in case the first detection result indicates that the first operation gesture complies with a first preset rule, the method further comprises:
displaying second prompt information at a fourth position of the current interface, wherein the second prompt information is used for prompting a user to release the first operation gesture so as to present the second object at the second position of the current interface.
7. The method of claim 1, wherein the method further comprises:
and if the first detection result shows that the first operation gesture does not accord with the first preset rule, presenting the first object at the first position of the current interface, and simultaneously not displaying the second object at the second position.
8. The method of claim 1, wherein the method further comprises:
detecting a third operation gesture aiming at the second object in the current interface to obtain a second detection result, wherein the second object is presented at the second position of the current interface and is displayed with part or all of the specific information;
and under the condition that the second detection result shows that the third operation gesture accords with a second preset rule, presenting the first object at the first position of the current interface, and simultaneously no longer displaying the second object on the current interface, wherein clicking the first object can switch to display the interface carrying the specific information.
9. The method of claim 8, wherein prior to the detecting a third operational gesture for the second object in the current interface to obtain a second detection result, the method further comprises:
detecting whether a fourth operation gesture aiming at the second object in the current interface exists;
and in the case that the fourth operation gesture is detected to exist, starting to detect a third operation gesture aiming at the second object in the current interface.
10. The method of claim 9, wherein the fourth operational gesture is one of:
the continuous pressing time exceeds a third threshold;
the number of clicks is at least two within the time frame of the fourth threshold.
11. The method of claim 9, wherein, in the event that the presence of the fourth operational gesture is detected, the method further comprises:
displaying third prompt information at a fifth position of the current interface, wherein the third prompt information is used for prompting a user to execute the third operation gesture conforming to the second preset rule so as to present the first object at the first position of the current interface.
12. The method of claim 8, wherein the third operational gesture includes a second drag operation, the detecting the third operational gesture for the second object in the current interface to obtain a second detection result includes:
detecting whether a second dragging distance corresponding to the second dragging operation reaches a second preset distance, wherein the second detection result shows that the third operation gesture conforms to a second preset rule under the condition that the second dragging distance reaches the second preset distance; and/or
Detecting whether the second object is dragged to be within a second preset range by the second dragging operation, wherein the first object is displayed within the second preset range, and the second detection result shows that the third operation gesture conforms to the second preset rule under the condition that the second object is dragged to be within the second preset range by the second dragging operation.
13. The method according to claim 8, wherein in case that the second detection result indicates that the third operation gesture complies with a second preset rule, the method further comprises:
displaying fourth prompt information at a sixth position of the current interface, wherein the fourth prompt information is used for prompting a user to release the third operation gesture to present the first object at the first position of the current interface.
14. The method of claim 8, wherein the method further comprises:
and when the second detection result shows that the third operation gesture does not accord with the second preset rule, presenting the second object at the second position of the current interface, and simultaneously not displaying the first object at the first position.
15. An information processing system comprising:
the first detection module is used for detecting a first operation gesture aiming at a first object in a current interface to obtain a first detection result, wherein the first object is presented at a first position of the current interface, and an interface carrying specific information can be switched and displayed by clicking the first object, the first object is an entry key of an audio playing page, and the specific information comprises audio playing details;
and a first presenting module, configured to, when the first detection result indicates that the first operation gesture conforms to a first preset rule, present a second object at a second position of the current interface, and simultaneously no longer display the first object on the current interface, where the second object is an audio play bar on which a portion of the specific information is displayed, and the portion of the specific information includes at least one of play, pause, fast forward, fast backward, volume up, volume down, next key or previous key.
16. The system of claim 15, wherein the system further comprises:
the second detection module is used for detecting whether a second operation gesture aiming at the first object in the current interface exists or not;
and the third detection module is used for starting to detect the first operation gesture aiming at the first object in the current interface under the condition that the second operation gesture is detected to exist.
17. The system of claim 16, wherein the second operational gesture is one of:
the continuous pressing time exceeds a first threshold;
the number of clicks is at least two within a time frame of a second threshold.
18. The system of claim 16, wherein the system further comprises:
the first display module is used for displaying first prompt information at a third position of the current interface, wherein the first prompt information is used for prompting a user to execute the first operation gesture conforming to the first preset rule so as to present the second object at a second position of the current interface.
19. The system of claim 15, wherein the first operation gesture comprises a first drag operation, the first detection module comprising:
a first detection submodule, configured to detect whether a first dragging distance corresponding to the first dragging operation reaches a first preset distance, wherein, in a case where the first dragging distance reaches the first preset distance, the first detection result shows that the first operation gesture conforms to the first preset rule; and/or
a second detection submodule, configured to detect whether the first object is dragged into a first preset range by the first dragging operation, wherein the second object is displayed within the first preset range, and, in a case where the first object is dragged into the first preset range by the first dragging operation, the first detection result shows that the first operation gesture conforms to the first preset rule.
20. The system of claim 15, wherein the system further comprises:
a second display module, configured to display second prompt information at a fourth position of the current interface, wherein the second prompt information prompts a user to release the first operation gesture so as to present the second object at the second position of the current interface.
21. The system of claim 15, wherein the system further comprises:
and a second control module, configured to, in a case where the first detection result shows that the first operation gesture does not conform to the first preset rule, present the first object at the first position of the current interface while not displaying the second object at the second position.
22. The system of claim 15, wherein the system further comprises:
a fourth detection module, configured to detect a third operation gesture on the second object in the current interface to obtain a second detection result, wherein the second object is presented at the second position of the current interface and displays part or all of the specific information;
and a third control module, configured to, in a case where the second detection result shows that the third operation gesture conforms to a second preset rule, present the first object at the first position of the current interface while no longer displaying the second object on the current interface, wherein the interface carrying the specific information can be switched to and displayed by clicking the first object.
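Claims 22 through 28 mirror claims 15 through 21 in the opposite direction: a conforming gesture on the play bar restores the entry key. Reusing the hypothetical PlayerControl enum from the earlier sketch, both directions reduce to a single toggle; again, this is an illustration, not the claimed implementation.

    // Illustrative toggle covering both directions: entry key to play bar (claim 15)
    // and play bar back to entry key (claim 22). A non-conforming gesture leaves the
    // interface unchanged, as claims 21 and 28 describe.
    fun nextControl(current: PlayerControl, gestureConforms: Boolean): PlayerControl = when {
        !gestureConforms -> current
        current == PlayerControl.ENTRY_KEY -> PlayerControl.PLAY_BAR
        else -> PlayerControl.ENTRY_KEY
    }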
23. The system of claim 22, wherein the system further comprises:
a fifth detection module, configured to detect whether a fourth operation gesture for the second object in the current interface exists;
and a sixth detection module, configured to start detecting the third operation gesture for the second object in the current interface in a case where the fourth operation gesture is detected.
24. The system of claim 23, wherein the fourth operation gesture is one of:
a press whose duration exceeds a third threshold;
at least two clicks within a time window defined by a fourth threshold.
25. The system of claim 23, wherein the system further comprises:
a third display module, configured to display third prompt information at a fifth position of the current interface, wherein the third prompt information prompts a user to perform the third operation gesture conforming to the second preset rule so as to present the first object at the first position of the current interface.
26. The system of claim 22, wherein the third operation gesture comprises a second drag operation, the fourth detection module comprising:
a third detection submodule, configured to detect whether a second dragging distance corresponding to the second dragging operation reaches a second preset distance, wherein, in a case where the second dragging distance reaches the second preset distance, the second detection result indicates that the third operation gesture conforms to the second preset rule; and/or
a fourth detection submodule, configured to detect whether the second object is dragged into a second preset range by the second dragging operation, wherein the first object is displayed within the second preset range, and, in a case where the second object is dragged into the second preset range by the second dragging operation, the second detection result shows that the third operation gesture conforms to the second preset rule.
27. The system of claim 21, wherein the system further comprises:
a fourth display module, configured to display fourth prompt information at a sixth position of the current interface, wherein the fourth prompt information prompts a user to release the third operation gesture to present the first object at the first position of the current interface.
28. The system of claim 22, wherein the system further comprises:
and a fourth control module, configured to, in a case where the second detection result shows that the third operation gesture does not conform to the second preset rule, present the second object at the second position of the current interface while not displaying the first object at the first position.
29. A medium storing computer-executable instructions for implementing the method of any one of claims 1 to 14 when executed by a processing unit.
30. A computing device, comprising:
a processing unit; and
a storage unit storing computer-executable instructions for implementing the method of any one of claims 1 to 14 when executed by the processing unit.
CN201811528792.4A 2018-12-13 2018-12-13 Information processing method, system, medium, and computing device Active CN110045895B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811528792.4A CN110045895B (en) 2018-12-13 2018-12-13 Information processing method, system, medium, and computing device

Publications (2)

Publication Number Publication Date
CN110045895A CN110045895A (en) 2019-07-23
CN110045895B (en) 2021-05-18

Family

ID=67273715

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811528792.4A Active CN110045895B (en) 2018-12-13 2018-12-13 Information processing method, system, medium, and computing device

Country Status (1)

Country Link
CN (1) CN110045895B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112883358A (en) * 2021-02-25 2021-06-01 中国工商银行股份有限公司 Device unlocking method, device, electronic device, medium, and program product

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1924860A (en) * 2006-10-08 2007-03-07 网之易信息技术(北京)有限公司 Search engine based search result fast pre-reading device
CN102929557A (en) * 2012-11-08 2013-02-13 东莞宇龙通信科技有限公司 Terminal and terminal manipulation method
CN106095269A (en) * 2016-06-02 2016-11-09 腾讯科技(深圳)有限公司 Method for information display, Apparatus and system
CN106201632A (en) * 2016-07-29 2016-12-07 维沃移动通信有限公司 The access method of a kind of application program and mobile terminal
CN106844019A (en) * 2015-12-04 2017-06-13 阿里巴巴集团控股有限公司 Application control method, application program redirect associated configuration method and device
CN107741815A (en) * 2017-10-26 2018-02-27 上海哔哩哔哩科技有限公司 Gesture operation method and equipment for player

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8499254B2 (en) * 2008-10-27 2013-07-30 Microsoft Corporation Surfacing and management of window-specific controls
US9015641B2 (en) * 2011-01-06 2015-04-21 Blackberry Limited Electronic device and method of providing visual notification of a received communication
US9367959B2 (en) * 2012-06-05 2016-06-14 Apple Inc. Mapping application with 3D presentation
US20160188196A1 (en) * 2014-12-30 2016-06-30 Airwatch Llc Floating media player
CN104918095A (en) * 2015-05-19 2015-09-16 乐视致新电子科技(天津)有限公司 Multimedia stream data preview display method and device
CN105988686A (en) * 2015-06-10 2016-10-05 乐视致新电子科技(天津)有限公司 Play interface display method and device as well as terminal
CN107291341B (en) * 2017-07-11 2021-03-09 广州飞傲电子科技有限公司 Method and system for selecting music player through touch turntable rotation
CN108388628B (en) * 2018-02-12 2022-02-22 腾讯科技(深圳)有限公司 Webpage audio playing method and device
CN108762852A (en) * 2018-06-10 2018-11-06 北京酷我科技有限公司 A kind of implementation method of interception Audio Controls and lyrics control linkage effect

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant