CN106462327B - Method, system, and medium for generating an arc-shaped path traveled by a user interface element - Google Patents
Method, system, and medium for generating an arc-shaped path traveled by a user interface element
- Publication number: CN106462327B (application CN201580031303.2A)
- Authority: CN (China)
- Prior art keywords: user interface, arc, path, interface element, arc angle
- Prior art date: 2014-06-24
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06F—ELECTRIC DIGITAL DATA PROCESSING
  - G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
    - G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    - G06F3/04817—Interaction techniques using icons
    - G06F3/04842—Selection of displayed objects or displayed text elements
    - G06F3/04845—Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    - G06F3/0486—Drag-and-drop
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
  - G06T11/203—Drawing of straight lines or curves
Abstract
Systems, methods, and computer-readable media for generating an arced path for a user interface element to travel are provided herein. According to one implementation, a method comprising operations performed by at least one processor is provided. The operations include determining coordinates of a start point and an end point of the user interface element, each corresponding to a location on a user interface. The method may also include determining a minimum arc angle and a maximum arc angle for the arc. Additionally, the method may generate an arced path based on the start point coordinates, the end point coordinates, the minimum arc angle, and the maximum arc angle. The method also generates a command to move the user interface element along the arced path in the user interface.
Description
Cross Reference to Related Applications
This application claims the benefit of U.S. provisional patent application No. 62/016,636, filed June 24, 2014, the disclosure of which is hereby incorporated by reference in its entirety.
Technical Field
The present disclosure generally relates to systems, methods, and computer-readable media for generating an arced path for a user interface element to travel.
Background
The present disclosure relates generally to the field of user interfaces and computerized animations of user interface elements. More particularly, and not by way of limitation, the present disclosure describes methods, systems, and computer-readable media for generating an arced path for a user interface element to travel.
User interface elements, such as icons, windows, and widgets, are common in modern user interfaces for computers, laptops, smart phones, personal digital assistants, and other devices. In some cases, a user interface element may be moved from one location to another location in a user interface. For example, the device may automatically rearrange icons on the display screen based on alphabetical order or frequency of use.
Conventional computing interfaces typically use linear paths to move or animate user interface elements. For example, when moving an icon from one location to another, the shortest path, i.e., a straight line, is used to travel from the starting point to the end point.
Disclosure of Invention
According to embodiments of the present disclosure, computer-implemented systems, methods, and computer-readable media are provided for generating an arced path for a user interface element to travel.
According to one embodiment, a computerized method comprising operations performed by at least one processor is provided. Operations of the method may include determining coordinates of a start point and an end point of a user interface element corresponding to a location on a user interface. The method may also include determining a minimum arc angle and a maximum arc angle for the arc. Additionally, the method may generate the arc path based on the start point coordinates, the end point coordinates, the minimum arc angle, and the maximum arc angle. The method may also generate commands for moving or animating user interface elements along an arc-shaped path in the user interface.
In an embodiment, the method may include generating a velocity profile representing the velocity of the user interface element moving along the arc-shaped path, based on the distance and slope of the arc-shaped path. The command may move or animate the user interface element at a speed corresponding to the velocity profile.
In an embodiment, the method may determine the concavity of the arc. The arced path may also be generated based on the concavity.
The minimum arc angle, the maximum arc angle, and the concavity may be based on at least one or more of stored user preferences, system default settings, a size of the user interface, and an orientation of the user interface. Likewise, the arced path may be generated in response to a determination that the coordinates of the start point and the end point are not vertically aligned. In addition, the minimum arc angle and the maximum arc angle may be changed to avoid intersecting an existing arc path for another user interface element. The coordinates of the start point and the end point may correspond to the location of the centroid of the user interface element.
In the present disclosure, computer-readable media and systems for generating an arced path for a user interface element to travel may also be provided. Additional embodiments and related features of the disclosure are described herein.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate several embodiments and aspects of the disclosure and together with the description, serve to explain certain principles of the presently disclosed embodiments.
FIG. 1 illustrates a flow diagram of an example arced travel path process consistent with embodiments of the present disclosure.
FIG. 2 illustrates a view of an example user interface with an arced path consistent with embodiments of the present disclosure.
FIG. 3 illustrates a view of an example user interface with an exception path consistent with embodiments of the present disclosure.
FIG. 4 illustrates a view of the geometry of an example arced path consistent with embodiments of the present disclosure.
FIG. 5 illustrates an example system for implementing embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will now be described. Further embodiments are disclosed in fig. 1-5, which may be implemented together or separately.
Embodiments of the present disclosure generally relate to systems, methods, and computer-readable media for determining an arc path traveled by a user interface element. According to particular embodiments, a user interface (e.g., monitor, display, touch screen, window, container, etc.) has at least one user interface element (e.g., icon, button, control, label, menu item, text box, checkbox, image, video, highlight, etc.). For example, a smartphone with a touch screen can display a row of icons. The user may select (e.g., through interaction with the touch screen) a new location for one of the icons. The user interface may use the determined arc path to simulate natural motion when moving the icon to the new location.
The user interface may include any human-machine interface, such as a Graphical User Interface (GUI), that allows a user to interact with a computing device. An example user interface may allow a user to control a computing device and receive feedback from it. For example, the user interface may receive input from a user and provide output to the user. An example user interface may allow a user to input data; for example, the user interface may include selectable numbers or a keyboard for entering a telephone number into the computer. The example user interface may be responsive to user input; for example, in response to a user selection, the user interface may highlight the selected text. The user interface may include one or more user interface elements of various types.
A user interface element may be any portion or component of the user interface. For example, an example user interface may include an icon, button, window, controller, widget, label, menu item, text box, checkbox, image, or video. A user interface element may identify an area that receives input from a user, provides output to a user, or both. For example, the user interface may include elements for providing input, such as entering a number or pressing an arrow to increase a numerical value (e.g., the volume of system audio, the contrast of a display, the scroll bars of a window, etc.). Example output user interface elements may include, for example, boxes that display data from sensors, such as charts showing power consumption over time or displaying the time of day. User interface elements may also function as mechanisms both for receiving user input and for providing output to the user. For example, the user may select an icon to launch a program; when the icon is selected, the interface may highlight it (e.g., circle, discolor, or bold it), providing confirmation to the user that the input was received.
FIG. 1 illustrates a flow diagram of an example arced travel path process 100 consistent with embodiments of the present disclosure. The example process 100 may be implemented by one or more of the components shown in FIG. 5, described below, but other arrangements are possible. Additionally, the process steps may be performed using one or more processors, storage devices, and user interfaces. It should be understood that the steps of process 100 may be performed in any order to achieve the objectives of the present disclosure, and that the order illustrated in FIG. 1 is merely an example.
In step 102, the processor may receive an input. In an embodiment, the input may correspond to input provided by a user through an input device. For example, a user may select a location on the user interface by clicking a mouse or pressing a touch screen. The user may use other input devices, such as a trackball, keyboard, stylus, camera, microphone, or joystick. For example, a user may press an arrow key on a keyboard to highlight a particular icon on the user interface. In embodiments, an application may provide the input, such as an application highlighting a particular user interface element. For example, a "how-to" manual application may provide input to the user interface to highlight a particular menu item. In another example, the user may select a widget to expand, such as a widget that displays the weather. The processor may move surrounding user interface elements (e.g., widgets and icons) to make room for the enlarged weather widget. Any input related to a location on the user interface may be used.
In step 104, the processor may determine start and end coordinates of the user interface element. The start and end points may be taken directly from the received input (step 102) or derived from it. For example, the input may be the user selecting an icon and then selecting a new location for the icon. The input may directly identify the icon, from which the starting point can be derived, and directly indicate the end point. In embodiments, the starting point and ending point may be received from a programming interface, such as a user interface action that requires movement of one or more user interface elements. For example, the user may choose to expand the weather widget. The processor may determine that the weather widget, when expanded, will overlap two icons. The processor may also determine that the icons should be moved, and determine the new locations to which they should be moved to make way for the expanding weather widget. The current position of each icon provides the corresponding start point coordinates, and the determined new position provides the corresponding end point.
In step 106, the processor may determine whether the start and end coordinates constitute an exception. Certain relative positions may not be traveled using an arced path. For example, when the start and end points are vertically aligned, the user experience may be better served by a straight travel path, such as a straight descending or ascending path. This allows the user interface to simulate the appearance of a falling object.
In an embodiment, the processor may consider the orientation of the user interface when determining the exception. For example, when a tablet computer is held horizontally in the air, coordinates that would fall within the vertical-alignment exception are actually aligned horizontally on the unadjusted user interface axes. In an embodiment, when the processor determines that the user interface is lying flat on a surface (e.g., a table or desk), the processor may not trigger any exception and may instead determine an arced path. This behavior can simulate the chaotic motion of a water droplet landing on a flat surface, moving in a curve rather than a straight line. The processor may determine the orientation using one or more sensors, including, for example, an accelerometer or an ambient light sensor.
In step 108, the processor determines the maximum and minimum arc angles of the arcuate path of travel. The processor may identify the maximum and minimum arc angles stored in the system settings or user preferences. For example, the processor may query user preferences to determine whether minimum and maximum arc angle settings exist and what their corresponding values are.
In an embodiment, the processor may determine the minimum and maximum arc angles based on the orientation of the user interface, the spacing between the start and end points, and the size of the user interface. For example, a more gradual curve may be used when the user interface is held vertically, while a more acute curve may be used when the user interface is oriented at a steeper angle. In another example, when the start and end points are close together, the processor may determine a lower maximum arc angle to ensure that the path is smooth and free of sharp curves. The processor may also use the size of the user interface when determining the maximum and minimum arc angles: a small display may require sharper curves to fit the motion into the smaller user interface. For example, the processor may determine the resolution of the display used to provide the user interface (e.g., 320 pixels by 240 pixels). When the display is small, the processor may increase the maximum arc angle to accommodate the smaller space for moving user interface elements.
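As a rough illustration of how these factors might combine, the following Python sketch derives arc angle bounds from the point spacing and display size. The thresholds, scaling factors, and function names are assumptions chosen for illustration, not values taken from this disclosure.

```python
import math

def arc_angle_bounds(start, end, display_w_px, display_h_px,
                     default_min_deg=10.0, default_max_deg=90.0):
    """Illustrative sketch of step 108: derive minimum and maximum arc
    angles (degrees) from point spacing and display size."""
    min_deg, max_deg = default_min_deg, default_max_deg

    # Closer start/end points -> lower maximum arc angle, keeping the
    # path smooth and free of sharp curves.
    if math.dist(start, end) < 100:        # 100 px threshold (assumed)
        max_deg = min(max_deg, 45.0)

    # Smaller displays -> raise the maximum arc angle so the motion
    # fits into the smaller space available for moving the element.
    if display_w_px * display_h_px <= 320 * 240:
        max_deg = min(max_deg * 1.5, 180.0)

    return min_deg, max_deg
```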
In step 110, the processor may generate an arced path for the user interface element to travel. The processor may determine the geometry of the arced path by fitting the start point and the end point on the circumference of a circle. For example, the processor may calculate the center of the circle from the start point and the end point using geometric principles. The processor may calculate the angle formed by the start point, the center of the circle, and the end point to determine the arc angle. The processor may then determine whether this arc angle is within the bounds of the minimum and maximum arc angles (e.g., from step 108). The processor may modify the circle by moving its center while keeping the start and end points on the circumference, thereby increasing or decreasing the arc angle between them.
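The circle fitting described above can be made concrete: for a chosen arc angle, the chord between the start and end points fixes the radius, and the center lies on the chord's perpendicular bisector. A minimal Python sketch of this geometry follows; the helper name and the sign convention used to select concavity are illustrative assumptions.

```python
import math

def circle_for_arc(start, end, arc_angle_deg, concave_up=True):
    """Sketch of the step-110 geometry: place start and end on the
    circumference of a circle whose arc between them subtends
    arc_angle_deg at the center (assumes 0 < arc_angle_deg <= 180)."""
    ax, ay = start
    bx, by = end
    chord = math.hypot(bx - ax, by - ay)
    theta = math.radians(arc_angle_deg)

    # From chord = 2 * r * sin(theta / 2):
    r = chord / (2.0 * math.sin(theta / 2.0))

    # The center sits on the perpendicular bisector of the chord, at
    # distance h = r * cos(theta / 2) from the chord midpoint.
    mx, my = (ax + bx) / 2.0, (ay + by) / 2.0
    h = r * math.cos(theta / 2.0)
    nx, ny = -(by - ay) / chord, (bx - ax) / chord  # unit normal to chord
    side = 1.0 if concave_up else -1.0              # which side -> concavity
    return (mx + side * h * nx, my + side * h * ny), r
```

Note that a larger arc angle yields a smaller distance h from the chord, i.e., moving the center toward the chord increases the arc angle, matching the behavior described above.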
In an embodiment, the processor may determine the concavity of the arced path. The processor may receive a system setting indicating a rule or preference for concavity; for example, the user may enter a setting to make all paths concave up. The processor may also determine the concavity based on the orientation of the user interface. The processor may receive a signal from a sensor indicating the orientation of the user interface, and then generate an arced path that is concave relative to that orientation, so that the user interface always displays a concave arced path regardless of how the user interface is tilted.
When the processor determines in step 106 that the relative positions of the start and end points fall under an exception, the processor may generate a linear, or straight, path in step 112. In some embodiments, the processor may generate a straight-line path from the start point to the end point. For example, when the end point is directly below the start point, the processor may determine a straight downward path by keeping the x coordinates of the path the same.
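A minimal sketch of this exception handling, assuming the vertical-alignment test is a simple comparison of x coordinates (the tolerance and step count below are illustrative):

```python
def straight_exception_path(start, end, tolerance_px=0, steps=20):
    """Sketch of steps 106/112: return a straight ascending or
    descending path when start and end are vertically aligned,
    or None to signal that an arced path should be generated."""
    if abs(start[0] - end[0]) > tolerance_px:
        return None  # no exception: caller generates an arced path (step 110)
    # Keep the x coordinate constant and interpolate y from start to end.
    return [(start[0], start[1] + (end[1] - start[1]) * t / steps)
            for t in range(steps + 1)]
```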
In step 114, the processor may determine a velocity profile for the movement of the user interface element along the path (e.g., the path generated in step 110 or step 112). The velocity profile may be a representation of the magnitude of the velocity of the user interface element at a plurality of points along the path. The velocity profile may be static, such that the user interface element moves at the same velocity from the start point to the end point. The velocity profile may accelerate linearly or exponentially. The velocity profile may also decelerate, decreasing as the user interface element approaches the end point.
In an embodiment, the processor determines the velocity profile based on the slope of the path, the orientation of the user interface, and/or system settings (e.g., user preferences). The processor may determine a perceived slope based on the slope of the path and the angle of the user interface's orientation. For example, the processor may generate a velocity profile that increases more rapidly when the path is sharply curved or the user interface is held vertically. The processor may use the perceived slope to generate a velocity profile that simulates the natural acceleration provided by Earth's gravity.
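Purely as an illustration, a perceived slope could drive a constant acceleration, as for a body sliding under gravity. The gain constant and profile shape in the sketch below are assumptions of ours, not values from this disclosure.

```python
import math

def velocity_profile(path_slope_deg, ui_tilt_deg, base_speed=10.0):
    """Sketch of step 114: build a speed-vs-distance function from a
    perceived slope combining path slope and device orientation."""
    perceived = math.radians(path_slope_deg + ui_tilt_deg)
    accel = 500.0 * max(math.sin(perceived), 0.0)  # px/s^2 (assumed gain)

    def speed(distance_px):
        # Constant acceleration from base_speed: v = sqrt(v0^2 + 2*a*d).
        return math.sqrt(base_speed ** 2 + 2.0 * accel * distance_px)

    return speed
```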
In step 116, the processor uses the velocity profile (e.g., the velocity profile determined in step 114) to generate instructions to move or animate the user interface element along a path (e.g., the path generated in step 110 or step 112). The instructions may be a series of commands that indicate the user interface element, the path of travel, and the speed at which the user interface element travels. The processor may describe the path of travel using an enumerated list of points to be traversed, or by defining the geometric elements that make up the path. For example, a path may be described as an arced path based on a circle having its center at (100, 300), a radius of 50, a starting point at (100, 250), and an arc extending 90 degrees counterclockwise.
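The example path description in the preceding paragraph could be encoded in a simple structure; the following hypothetical sketch uses field names of our own choosing:

```python
from dataclasses import dataclass

@dataclass
class ArcMoveCommand:
    """Illustrative encoding of a step-116 move command."""
    element_id: str            # which user interface element to move
    center: tuple              # (x, y) center of the underlying circle
    radius: float              # radius in pixels
    start: tuple               # (x, y) start point on the circumference
    sweep_deg: float           # positive values sweep counterclockwise

# The example from the text: circle centered at (100, 300), radius 50,
# start at (100, 250), arc extending 90 degrees counterclockwise.
cmd = ArcMoveCommand("browser_icon", (100, 300), 50.0, (100, 250), 90.0)
```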
In an embodiment, the processor generates instructions indicating the speed at which the user interface element should move. The speed instructions may indicate the speed at various points along the path. The processor may generate a command expressing the velocity profile as a function of distance or time. For example, the processor may issue the command "speed(distance) = -1 × (distance)^2 + 10". The speed may be indicated in units of pixels per second.
In an embodiment, the processor may combine the path and velocity instructions by using a command based on a series of vectors indicating the direction and acceleration of the user interface element.
In step 118, the processor may execute the instructions (e.g., the instructions generated in step 116). The instructions may cause the user interface element to move from the starting point to the ending point on the user interface. For example, an icon for a web browser may move along an arc to a new location after a weather widget expands into the area where the icon was originally located.
In an embodiment, the processor may ignore conflicts or path intersections when computing the travel paths and velocity profiles for multiple user interface elements. Elements may "pass" over or under each other; the instructions for each user interface element may be generated without modification, regardless of other user interface elements.
The processor may receive and store user-defined preferences in the system settings. By way of example, the system settings may determine the exceptions (step 106), the maximum and minimum arc angles (step 108), the path behavior when multiple paths intersect (steps 110 and 112, as described above), and the velocity profile (step 114). The user interface may receive user input, including the previously listed examples, which is sent to the processor to interpret and modify the corresponding settings. For example, the user may enter "90 degrees" in the text box for the maximum arc angle.
FIG. 2 illustrates a view of an example user interface 201 of a handheld device or system 200 consistent with embodiments of the present disclosure. As shown in FIG. 2, the system 200 may include a user interface 201. The user interface may be a Graphical User Interface (GUI) including one or more user interface elements (e.g., the first user interface element 210, the second user interface element 241, and the third user interface element 240). For example, the user interface 201 may be a touch screen of a smartphone that includes a plurality of icons that may be selected to launch applications. The user interface may include other user interface elements, including those described above.
The user interface element may have a center or centroid (e.g., first user interface element centroid 211). The centroid coordinates of a user interface element may represent the location of the user interface element, such as the user interface element locations used in the steps of process 100 described above. The centroid may be calculated by the processor or determined from the properties of the user interface element itself. For example, the first user interface element 210 may include a set of properties, which may include the coordinates of its centroid. Other, non-centroid coordinates may also be used as the location of the user interface element in the process. The starting point of the user interface element may be the centroid of the user interface element identified for the motion.
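For a rectangular element, the centroid might simply be taken as the center of the bounding box, as in this short sketch (the bounding-box assumption is ours, not the disclosure's):

```python
def element_centroid(x, y, width, height):
    """Centroid of a rectangular user interface element, taken as the
    center of its bounding box."""
    return (x + width / 2.0, y + height / 2.0)

# e.g., a 64x64 icon whose top-left corner is at (20, 40):
assert element_centroid(20, 40, 64, 64) == (52.0, 72.0)
```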
Following steps 108 and 110, the processor may generate an arced path 220. The arced path may be based on, for example, the size and orientation of the user interface 201.
As shown in the exemplary embodiment of FIG. 2, the arced path 220 may be concave upward. The concavity of the arced path 220 may be chosen to avoid intersecting other user interface elements, such as the second user interface element 241. The arced path 220 may be more or less sharply curved based on user preferences or system settings corresponding to the minimum and maximum arc angles.
FIG. 3 illustrates a view of an example user interface 201 of a handheld device or system 200 consistent with embodiments of the present disclosure. As shown in FIG. 3, the system 200 may include the user interface 201. The user interface 201 may be a Graphical User Interface (GUI) including one or more user interface elements (e.g., a first user interface element 310, a second user interface element 341, and a third user interface element 340). For example, the user interface 201 may be a touch screen of a smartphone that includes a plurality of icons that may be selected to launch applications. The user interface may include other user interface elements, including those described above.
The user interface element may have a center or centroid (e.g., first user interface element centroid 311). The centroid coordinates of the user interface elements may represent the location of the user interface elements, such as the user interface element locations used in the steps of process 100 described above. As described above, the centroid may be calculated by the processor or determined by the characteristics of the user interface element itself. The starting point of the user interface element may be the centroid of the user interface element identified for the motion.
The user interface 201 may receive an input indicating an endpoint 331 corresponding to the destination location 330. For example, a user may press a touch screen and the location may be recorded by the touch screen and sent to the processor in accordance with steps 102 and 104 of process 100. Other forms of input may be used, such as a stylus, mouse, keyboard, gestures, eye gaze, and trackball. In an embodiment, an application or program may identify endpoint 331.
Following steps 106 and 112, the processor may generate an exception path 320. The first user interface element centroid 311 (e.g., the start point) and the end point 331 are vertically aligned, one directly above the other. The processor may determine that this relative position is an exception to using an arced path and generate the linear exception path 320.
In an embodiment, the exception determination (step 106) may take into account surrounding user interface elements, including any potential intersections, when determining whether the processor should generate a linear path. For example, if the third user interface element 340 were moved to the destination location 330, the processor would determine that they are vertically aligned. However, if a linear path were used, the exception path would intersect the second user interface element 341 and the first user interface element 310. The processor may then override the exception and generate an arced path to avoid intersecting the surrounding user interface elements. Other exceptions may be overridden similarly to avoid conflicts. In other embodiments, the processor may ignore intersections when generating the path (steps 110 and 112) and determining the exception (step 106).
FIG. 4 illustrates example geometry for determining an arced path. Two different arced paths for traveling from point A to point B are shown. Arc ADB is a portion of circle F; arc AEB is a portion of circle G. The arc angle of the path from A to B increases as the radius of the circle underlying the arc decreases: for example, angle AFB is less than angle AGB. The arc angle also increases as the center of the circle moves from point F or G closer to point C. When the processor generates an arced path (step 110), it may generate a plurality of potential circles to derive a plurality of candidate arcs. The processor may reject those arcs whose arc angles fall outside the limits of the maximum and minimum arc angles. The processor may also cull potential arc paths based on other simultaneous paths or potential intersections with user interface elements. Even if the center of the circle on which an arc is based is extremely far from points A and B, the resulting path between A and B will still be curved, if only slightly. When a straight path is desired, the processor may instead determine an exception (step 106) and generate a linear path (step 112) rather than use the arc path calculation.
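The candidate-and-cull procedure described for FIG. 4 might look like the following sketch; the sampling strategy and values are assumptions for illustration.

```python
def candidate_arc_angles(min_deg, max_deg, samples=8):
    """Sketch of the FIG. 4 culling step: sweep candidate arc angles
    (a larger angle corresponds to a smaller underlying circle) and
    keep only those within the [min_deg, max_deg] bounds."""
    candidates = [5.0 + i * 175.0 / (samples - 1) for i in range(samples)]
    return [a for a in candidates if min_deg <= a <= max_deg]

# e.g., with bounds of 10 and 90 degrees:
# candidate_arc_angles(10.0, 90.0) -> [30.0, 55.0, 80.0]
```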
FIG. 5 illustrates an example system 500 for implementing embodiments consistent with the present disclosure. Variations of the system 500 may be used to implement the components or devices of the disclosed embodiments. The system 500 may be, for example, a desktop, laptop, tablet, hybrid tablet laptop, smartphone, wrist-worn device, set-top box, or television. It should be understood that the components and features shown in FIG. 5 may be duplicated, omitted, or modified.
As shown in FIG. 5, an example system 500 consistent with the present disclosure may include a central processing unit 501 (also referred to as an electronic processor or CPU) to manage and process data and perform operations. (CPU 501 may be implemented as one or more processors.) The system 500 may also include one or more storage devices 503. The storage device 503 may include optical, magnetic, signal, and/or any other type of storage device. The system 500 may also include a network adapter 505. The network adapter 505 may allow the system 500 to connect to an electronic network, such as the Internet, a local area network, a wide area network, a cellular network, a wireless network, or any other type of network. The system 500 also includes a power supply unit 506, which may enable the system 500 and its components to receive power and operate fully.
In some embodiments, the system 500 may also include an input device 512, which may receive input from a user and/or a module or device. These modules or devices may include, but are not limited to, keyboards, mice, trackballs, trackpads, scanners, cameras, and devices connected via a Universal Serial Bus (USB), serial, parallel, infrared, wireless, wired, or other connection. The system 500 may also include an output device 514 that transmits data to a user and/or a module or device. These modules or devices may include, but are not limited to, computer monitors, televisions, screens, projectors, printers, plotters, and other recording/display devices connected by wired or wireless connections.
The system 500 may include a user interface 516, which may facilitate interaction with a user. Example user interfaces may include, for example, a touch screen (e.g., a resistive or capacitive touch screen), a display (e.g., an LCD monitor), an LED array, or any other display.
In the present disclosure, various embodiments have been described with reference to the accompanying drawings and examples. It will, however, be understood that various modifications and changes may be made, and additional embodiments may be implemented, without departing from the disclosure. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
For example, advantageous results could still be achieved if steps of the disclosed methods were performed in a different order, and/or components in the disclosed systems were combined in a different manner and/or replaced or supplemented by other components. Other implementations are also within the scope of the present disclosure.
It is to be understood that the foregoing general description is provided for purposes of illustration and explanation only, and is not limiting. Furthermore, the accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate embodiments of the disclosure and, together with the description, are likewise not limiting.
Claims (20)
1. A computer-implemented system for generating an arced path of travel for a user interface element, comprising:
a memory device storing instructions; and
at least one processor that executes the instructions to:
determining coordinates of a start point and an end point of the user interface element corresponding to different locations on a user interface;
determining a minimum arc angle and a maximum arc angle of an arc based on an orientation of the user interface, wherein the orientation of the user interface is measured relative to a plane of a surface of the user interface and a ground surface, and wherein a steepness of the arc is inversely proportional to an interface orientation angle;
generating an arc path based on the start point coordinates, the end point coordinates, the minimum arc angle, and the maximum arc angle; and
generating a command to move the user interface element along the arc-shaped path in the user interface.
2. The computer-implemented system of claim 1, wherein the instructions further cause the processor to:
generating a velocity profile representing a velocity at which the user interface element is moved along the arcuate path based on the arcuate path distance and slope of the arcuate path,
wherein the command moves the user interface element at a speed corresponding to the velocity profile.
3. The computer-implemented system of claim 1, wherein the instructions further cause the processor to:
determining a concavity of the arc,
wherein the arcuate path is also generated based on the concavity.
4. The computer-implemented system of claim 3, wherein the minimum arc angle, the maximum arc angle, and the concavity are based on at least one or more of stored user preferences, system default settings, and a size of the user interface.
5. The computer-implemented system of claim 1, wherein the arced path is generated in response to a determination that coordinates of the start and end points are not vertically aligned.
6. The computer-implemented system of claim 1, wherein the minimum arc angle and the maximum arc angle are modified to avoid intersecting an existing arc path for another user interface element.
7. The computer-implemented system of claim 1, wherein the coordinates of the start point and the end point correspond to a location of a centroid of the user interface element.
8. A method for generating an arced path for a user interface element to travel, the method comprising the following operations performed by at least one processor:
determining coordinates of a start point and an end point of the user interface element corresponding to different locations on a user interface;
determining a minimum arc angle and a maximum arc angle of an arc based on an orientation of the user interface, wherein the orientation of the user interface is measured relative to a plane of a surface of the user interface and a ground surface, and wherein a steepness of the arc is inversely proportional to an interface orientation angle;
generating an arc path based on the start point coordinates, the end point coordinates, the minimum arc angle, and the maximum arc angle; and
generating a command to move the user interface element along the arc-shaped path in the user interface.
9. The method of claim 8, further comprising the following operations performed by the at least one processor:
generating a velocity profile representing a velocity at which the user interface element is moved along the arcuate path based on the arcuate path distance and slope of the arcuate path,
wherein the command moves the user interface element at a speed corresponding to the velocity profile.
10. The method of claim 8, further comprising the following operations performed by the at least one processor:
determining a concavity of the arc,
wherein the arcuate path is also generated based on the concavity.
11. The method of claim 10, wherein the minimum arc angle, the maximum arc angle, and the concavity are based on at least one or more of stored user preferences, system default settings, and a size of the user interface.
12. The method of claim 8, wherein the arced path is generated in response to a determination that the coordinates of the start and end points are not vertically aligned.
13. The method of claim 8, wherein the minimum arc angle and the maximum arc angle are modified to avoid intersecting an existing arc path for another user interface element.
14. The method of claim 8, wherein the coordinates of the start point and the end point correspond to a location of a centroid of the user interface element.
15. A non-transitory, computer-readable medium storing instructions that, when executed by at least one processor, cause the at least one processor to perform operations comprising:
determining coordinates of a start point and an end point of the user interface element corresponding to different locations on a user interface;
determining a minimum arc angle and a maximum arc angle of an arc based on an orientation of the user interface, wherein the orientation of the user interface is measured relative to a plane of a surface of the user interface and a ground surface, and wherein a steepness of the arc is inversely proportional to an interface orientation angle;
generating an arc path based on the start point coordinates, the end point coordinates, the minimum arc angle, and the maximum arc angle; and
generating a command to move the user interface element along the arc-shaped path in the user interface.
16. The computer-readable medium of claim 15, wherein the operations further comprise:
generating a velocity profile representing a velocity at which the user interface element is moved along the arcuate path based on the arcuate path distance and slope of the arcuate path,
wherein the command moves the user interface element at a speed corresponding to the velocity profile.
17. The computer-readable medium of claim 15, wherein the operations further comprise:
determining a concavity of the arc,
wherein the arcuate path is also generated based on the concavity.
18. The computer-readable medium of claim 17, wherein the minimum arc angle, the maximum arc angle, and the concavity are based at least on one or more of stored user preferences, system default settings, and a size of the user interface.
19. The computer-readable medium of claim 15, wherein the arced path is generated in response to a determination that the coordinates of the start and end points are not vertically aligned.
20. The computer-readable medium of claim 15, wherein the minimum arc angle and the maximum arc angle are modified to avoid intersecting an existing arc path for another user interface element.
Applications Claiming Priority (3)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201462016636P | 2014-06-24 | 2014-06-24 | |
| US62/016,636 | 2014-06-24 | | |
| PCT/US2015/037149 (WO2015200303A1) | 2014-06-24 | 2015-06-23 | User interface with quantum curves and quantum arcs |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN106462327A | 2017-02-22 |
| CN106462327B | 2020-06-05 |
Family ID: 53539928

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201580031303.2A (Active) | Method, system, and medium for generating an arc-shaped path traveled by a user interface element | 2014-06-24 | 2015-06-23 |
Country Status (4)

| Country | Link |
|---|---|
| US (1) | US10423314B2 |
| EP (1) | EP3161602A1 |
| CN (1) | CN106462327B |
| WO (1) | WO2015200303A1 |
Families Citing this family (1)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112055842B * | 2018-05-08 | 2024-12-06 | Google LLC | Drag gesture animation |
Patent Citations (3)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101819498A * | 2009-02-27 | 2010-09-01 | 瞬联讯通科技(北京)有限公司 | Screen display-controlling method facing to slide body of touch screen |
| CN101901098A * | 2009-05-26 | 2010-12-01 | 鸿富锦精密工业(深圳)有限公司 | Electronic display device and icon display method thereof |
| WO2014014242A1 * | 2012-07-16 | 2014-01-23 | Samsung Electronics Co., Ltd. | Method and apparatus for moving object in mobile terminal |
Family Cites Families (9)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7093192B2 * | 1999-07-30 | 2006-08-15 | Microsoft Corporation | Establishing and displaying dynamic grids |
| US6650328B1 * | 2001-11-30 | 2003-11-18 | Microsoft Corporation | System and method for placing splines using refinement and obstacle avoidance techniques |
| JP5142510B2 | 2005-11-25 | 2013-02-13 | Océ-Technologies B.V. | Graphical user interface providing method and system |
| US8487931B2 * | 2006-09-07 | 2013-07-16 | Adobe Systems Incorporated | Dynamic feedback and interaction for parametric curves |
| KR101484826B1 * | 2009-08-25 | 2015-01-20 | Google Inc. | Direct manipulation gestures |
| US9245368B2 * | 2011-06-05 | 2016-01-26 | Apple Inc. | Device and method for dynamically rendering an animation |
| KR20130070506A * | 2011-12-19 | 2013-06-27 | Samsung Electronics Co., Ltd. | Method for displaying page shape and display apparatus |
| KR101397685B1 * | 2012-02-29 | 2014-05-26 | Pantech Co., Ltd. | User terminal and method for displaying screen |
| US20140053113A1 * | 2012-08-15 | 2014-02-20 | Prss Holding BV | Processing user input pertaining to content movement |
Filings (2015):
- 2015-06-23: US application US15/317,790, granted as US10423314B2 (active)
- 2015-06-23: PCT application PCT/US2015/037149, published as WO2015200303A1 (application filing)
- 2015-06-23: EP application EP15736096.7A, published as EP3161602A1 (withdrawn)
- 2015-06-23: CN application CN201580031303.2A, granted as CN106462327B (active)
Non-Patent Citations (1)

| Title |
|---|
| gzyshen, "任意两点间鼠标弧线移动,觉得有用的赞一个哦!~~" ("Moving the mouse along an arc between any two points; give it a like if you find it useful!"), http://bbs.anjian.com/showtopic-286577-1.aspx, 2012-02-21, pp. 1-3 * |
Also Published As

| Publication number | Publication date |
|---|---|
| US20170102858A1 | 2017-04-13 |
| WO2015200303A1 | 2015-12-30 |
| US10423314B2 | 2019-09-24 |
| EP3161602A1 | 2017-05-03 |
| CN106462327A | 2017-02-22 |
Legal Events

| Code | Title | Description |
|---|---|---|
| C06 / PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| CB02 | Change of applicant information | Address after: California, USA. Applicant after: Google LLC. Address before: California, USA. Applicant before: Google Inc. |
| GR01 | Patent grant | |