
CN107340964A - Method and device for realizing the animation effect of a view - Google Patents


Info

Publication number
CN107340964A
Authority
CN
China
Prior art keywords
view
gesture
offset distance
animation effect
animation
Prior art date
Legal status
Pending
Application number
CN201710406260.2A
Other languages
Chinese (zh)
Inventor
张磊
张文明
陈少杰
Current Assignee
Wuhan Douyu Network Technology Co Ltd
Original Assignee
Wuhan Douyu Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Wuhan Douyu Network Technology Co Ltd filed Critical Wuhan Douyu Network Technology Co Ltd
Priority to CN201710406260.2A priority Critical patent/CN107340964A/en
Publication of CN107340964A publication Critical patent/CN107340964A/en


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention provides a method and device for realizing the animation effect of a view, applied to a touch terminal. The method includes: S1, monitoring gesture operation events on a view of the touch terminal; S2, obtaining view animation effect parameters according to the gesture sliding offset distance on the view and the mapping relation between the gesture sliding offset distance and the view animation effect parameters; S3, performing animation display on the view according to the view animation effect parameters. Aiming at the dynamic-effect requirements of products in a client, the invention proposes a set of gesture-based elastic animation effect implementation methods that endow the animation effect with a physical-inertia special effect, making the animation more vivid and realistic.

Description

Method and device for realizing animation effect of view
Technical Field
The invention relates to the technical field of view animation, and in particular to a method and a device for realizing the animation effect of a view.
Background
In the process of developing views at a client, animation effects are often used. At present, the common practice is to set the animation effect of each view using the animation effects already available in the system, for example flying a view into the screen, rotating it in, or zooming it in.
For different animation effects, multiple dynamic-effect parameters need to be set for the same view; the operation is complex and affects working efficiency. Meanwhile, the process of setting the animation effect parameters of a view involves zero interaction with the user, so the user experience is poor.
Disclosure of Invention
The present invention provides a method and an apparatus for realizing the animation effect of a view that overcome, or at least partially solve, the above-mentioned problems.
According to a first aspect of the present invention, there is provided a method for implementing animation effect of a view, applied to a touch terminal, including:
S1, monitoring gesture operation events on a view of the touch terminal;
S2, obtaining view animation effect parameters according to the gesture sliding offset distance on the view and the mapping relation between the gesture sliding offset distance and the view animation effect parameters;
S3, performing animation display on the view according to the view animation effect parameters.
The invention has the following beneficial effects: aiming at the dynamic-effect requirements of products in a client, a set of gesture-based elastic animation special-effect implementation methods is provided; different animation effects are given to a view according to the gesture operations performed on it, making the animations more vivid and realistic.
On the basis of the technical scheme, the invention can be further improved as follows.
Further, the gesture operation event comprises a gesture pressing event, a gesture sliding event and a gesture lifting event;
in the step S2, the gesture sliding offset distance of the view is calculated as follows:
S21a, monitoring a gesture pressing event, and recording the horizontal and vertical coordinates (Ax, Ay) of the pressing point;
S22a, monitoring a gesture sliding event, and recording the horizontal and vertical coordinates (Ex, Ey) of the finger's current position during the slide;
S23a, obtaining the sliding offset distance Δx = Ex − Ax on the horizontal axis and the sliding offset distance Δy = Ey − Ay on the vertical axis, and taking the larger of Δx and Δy as the gesture sliding offset distance for the view.
Further, in step S2, a mapping relationship between the gesture sliding offset distance and the view animation effect parameter for the view is established as follows:
S21b, monitoring a gesture lifting event, and recording the horizontal and vertical coordinates of the lifting point;
S22b, determining the value range of the gesture sliding offset distance according to the horizontal and vertical coordinates of the gesture pressing point and those of the lifting point;
S23b, establishing the mapping relation between the gesture sliding offset distance and the view animation effect parameters according to the determined value range of the gesture sliding offset distance and the preset value range of the view animation effect parameters.
Further, the step S22b specifically includes:
and (5) setting the horizontal and vertical coordinates of the lifting point as (Gx, Gy), respectively acquiring a gesture sliding offset distance delta x '= Gx-Ax of the horizontal axis and a gesture sliding offset distance delta y' = Gy-Ay of the vertical axis, and determining the value range of the gesture sliding offset distance according to the larger value of the delta x 'and the delta y'.
Further, the animation effect parameters include a view scale and a view transparency.
Furthermore, the preset value range of the view scaling ratio P is 0 < P ≤ 1.1, and the value range of the view transparency T is 0 < T ≤ 1.
Further, a second-order curve relationship exists between the gesture sliding offset distance and the view scaling, and a first-order linear relationship exists between the gesture sliding offset distance and the view transparency.
Further, the step S3 specifically includes:
during the gesture slide on the view, obtaining the corresponding view scaling and transparency in real time according to the current gesture sliding offset distance, and performing animation display on the view; or,
when a gesture lifting event is monitored, obtaining the corresponding view scaling and transparency according to the gesture sliding offset distance at the moment the gesture is lifted, and performing animation display on the view.
According to a second aspect of the present invention, there is provided an animation effect implementation apparatus for a view, including:
the monitoring module is used for monitoring a gesture operation event on a view of the touch terminal;
the acquisition module is used for acquiring view animation effect parameters according to the gesture sliding offset distance of the view and the mapping relation between the gesture sliding offset distance and the view animation effect parameters;
and the display module is used for carrying out animation display on the view according to the view animation effect parameters.
According to a third aspect, there is provided an apparatus for the view animation effect implementation method, comprising a processor, a memory and a bus;
the processor and the memory complete mutual communication through the bus;
the memory stores program instructions executable by the processor, the processor invoking the program instructions to perform a method comprising:
monitoring a gesture operation event of an upper view of the touch type terminal; obtaining a view animation effect parameter according to the gesture sliding offset distance of the view and the mapping relation between the gesture sliding offset distance and the view animation effect parameter; and performing animation display on the view according to the view animation effect parameters.
Drawings
FIG. 1 is a flow chart of a method for animation effect implementation of a view in one embodiment of the invention;
FIG. 2 is a graph illustrating a mapping between a gesture swipe offset distance and a view zoom ratio, in accordance with one embodiment of the present invention;
FIG. 3 is a graph illustrating a mapping between a gesture slide offset distance and a view transparency according to one embodiment of the present invention;
FIG. 4 is a block diagram of an animation effects implementation connection of another embodiment of the present invention;
FIG. 5 is an overall connection block diagram of an animation effect realization apparatus according to a view of one embodiment of the present invention;
fig. 6 is a device connection block diagram of a method for implementing animation effects of views according to still another embodiment of the present invention.
Detailed Description
The following detailed description of embodiments of the present invention is provided in connection with the accompanying drawings and examples. The following examples are intended to illustrate the invention but are not intended to limit the scope of the invention.
Referring to fig. 1, fig. 1 illustrates a method for implementing the animation effect of a view, applied to a touch terminal, which can give different animation effects to a view according to the user's gesture operation events on it, making the view animation more vivid and realistic. The method comprises the following steps: S1, monitoring gesture operation events on a view of the touch terminal; S2, obtaining view animation effect parameters according to the gesture sliding offset distance on the view and the mapping relation between the gesture sliding offset distance and the view animation effect parameters; S3, performing animation display on the view according to the view animation effect parameters.
When a user wants that the view shows different animation effects, the user can perform gesture operation on the view on the touch screen, wherein the touch terminal can be a touch mobile phone or a touch computer. When the touch terminal detects a gesture operation event on the view, the animation effect parameter corresponding to the gesture sliding offset distance is obtained according to the gesture sliding offset distance of the view and the pre-established mapping relation between the gesture sliding offset distance and the view animation effect parameter in the process of the view operation of the user. And then, displaying different animation effects on the view according to the animation effect parameters of the view.
According to the embodiment, a set of elastic animation special effect implementation method based on gestures is provided for the dynamic effect requirements of the view in the client, different animation effects are given to the view according to the gesture operation on the view, and the animation is more vivid and real.
In one embodiment of the invention, the gesture operation event comprises a gesture pressing event, a gesture sliding event and a gesture lifting event.
In step S2, the gesture sliding offset distance of the view is calculated as follows: S21a, monitoring a gesture pressing event, and recording the horizontal and vertical coordinates (Ax, Ay) of the pressing point; S22a, monitoring a gesture sliding event, and recording the horizontal and vertical coordinates (Ex, Ey) of the current position during the slide; S23a, obtaining the sliding offset distance Δx = Ex − Ax on the horizontal axis and the sliding offset distance Δy = Ey − Ay on the vertical axis, and taking the larger of Δx and Δy as the gesture sliding offset distance for the view.
During a user's gesture operation on a view of the touch terminal, the gesture operation events comprise a gesture pressing event, a gesture sliding event and a gesture lifting event. In this embodiment, gesture detection on the view is the basis of the whole animation effect, and the subsequent animation effect parameters are acquired based on gesture changes. In this scheme, gesture changes are detected through the onTouch method of the Android system: onTouch returns the event information of the current gesture operation, and the current gesture state is judged from that event information.
When a view is operated by gesture, for example when zooming, two fingers are usually spread apart or pinched together, and the zoom generally follows the change in distance between the two touch points. In the specific process here, a gesture pressing event on the view is monitored, and the horizontal and vertical coordinates (Ax, Ay) of the pressing point are recorded as the starting point of the gesture slide. A gesture sliding event is then monitored, and the horizontal and vertical coordinates (Ex, Ey) of the finger's current position during the slide are recorded. The sliding offset distance Δx = Ex − Ax on the horizontal axis and the sliding offset distance Δy = Ey − Ay on the vertical axis are obtained, and the larger of the two is taken as the gesture sliding offset distance for the view. The larger value is chosen because the offsets on the two axes may differ during sliding; to reflect the user's sliding action, the larger of Δx and Δy is used as the user's gesture sliding offset distance on the touch terminal, and if the two are equal either value may be used.
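As a minimal, plain-Java sketch of steps S21a to S23a — the class and method names are invented for illustration and are not from the patent's implementation:

```java
// Sketch: given the pressing point (Ax, Ay) and the current finger
// position (Ex, Ey), compute the gesture sliding offset distance as the
// larger of the per-axis offset magnitudes.
public class GestureOffset {

    // Returns max(|Ex - Ax|, |Ey - Ay|): the per-axis offset with the
    // larger magnitude is used as the gesture sliding offset distance.
    public static float slidingOffsetDistance(float ax, float ay,
                                              float ex, float ey) {
        float dx = Math.abs(ex - ax); // horizontal-axis offset |Δx|
        float dy = Math.abs(ey - ay); // vertical-axis offset |Δy|
        return Math.max(dx, dy);
    }
}
```

Using the magnitudes of the signed offsets matches the later range expression max(|Δx'|, |Δy'|) used when the value range is determined.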
In another embodiment of the present invention, the mapping relationship between the gesture sliding offset distance to the view and the view animation effect parameter is established in step S2 by: s21b, monitoring a gesture lifting event, and recording horizontal and vertical coordinates of a lifting point; s22b, determining the value range of the gesture sliding offset distance according to the horizontal and vertical coordinates of the gesture pressing point and the horizontal and vertical coordinates of the lifting point; and S23b, designing a mapping relation between the gesture sliding offset distance and the view animation effect parameter according to the determined value range of the gesture sliding offset distance and the preset value range of the view animation effect parameter.
After the gesture pressing event and the gesture sliding event of the user are detected, the gesture sliding offset distance of the user to the view is calculated according to the horizontal and vertical coordinates of the gesture pressing point and the horizontal and vertical coordinates of the current position in the gesture sliding process. And then, detecting the gesture lifting event, recording horizontal and vertical coordinates (Gx, Gy) of the lifting point, and determining the value range of the gesture sliding offset distance according to the horizontal and vertical coordinates (Ax, Ay) of the gesture pressing point and the horizontal and vertical coordinates (Gx, Gy) of the lifting point. The embodiment also designs the value range of the view animation effect parameters according to specific requirements, and designs the corresponding relation between the gesture sliding offset distance and the view animation effect parameters under the condition that the value range of the gesture sliding offset distance and the value range of the view animation effect parameters are both satisfied. In the process that the actual user slides the gesture, the view animation effect parameters can be obtained according to the gesture sliding offset distance and the corresponding relation.
In an embodiment of the present invention, the determining in step S22b of the value range of the gesture sliding offset distance according to the horizontal and vertical coordinates of the gesture pressing point and of the lifting point specifically includes: let the horizontal and vertical coordinates of the lifting point be (Gx, Gy); obtain the gesture sliding offset Δx' = Gx − Ax on the horizontal axis and the gesture sliding offset Δy' = Gy − Ay on the vertical axis, and determine the value range of the gesture sliding offset distance according to the larger of Δx' and Δy'.
In the above embodiment, the horizontal and vertical coordinates (Ax, Ay) of the finger pressing point and (Gx, Gy) of the finger lifting point are detected and recorded; the gesture sliding offset Δx' = Gx − Ax on the horizontal axis and Δy' = Gy − Ay on the vertical axis are then calculated from them, and the value range of the gesture sliding offset distance is determined from the larger of Δx' and Δy', so that the gesture sliding offset distance ranges from 0 to max(|Δx'|, |Δy'|).
In one embodiment of the invention, the animation effect parameters include view scale and view transparency; the preset value range of the view scaling ratio P is more than 0 and less than or equal to 1.1, and the value range of the view transparency T is more than 0 and less than or equal to 1.
In this embodiment, the view animation effect parameters mainly refer to view scaling and view transparency. During a user's gesture operation on a view, the scaling and transparency corresponding to the view are obtained from the gesture sliding offset distance, and the view is displayed with the animation effect of the corresponding scaling and transparency. The value range of the view scaling P is designed as 0 < P ≤ 1.1, with the maximum 1.1 chosen to realize a physical-inertia effect: the animation goes from small to large and back down, the scaling ratio passing from 0 through 1.1 to 1, which gives the animation an elastic feel, the so-called physical-inertia special effect. The value range of the view transparency T is 0 < T ≤ 1; unlike scaling, which produces a zoom-in/zoom-out special effect with overshoot, the change in transparency needs no inertia effect, so its range is simply designed as 0 to 1.
In another embodiment of the present invention, the gesture sliding offset distance and the view scaling value have a second order curve relationship, and the gesture sliding offset distance and the view transparency have a linear relationship.
In the above embodiment, it is described that the mapping relationship between the gesture sliding offset distance and the view animation effect parameter is determined according to the value range of the gesture sliding offset distance and the value range of the view animation effect parameter. It can be known from the above embodiments that the range of the gesture sliding offset distance is 0 to max (| Δ x '|, | Δ y' |), and the range of the designed view scaling is 0 to 1.1, and since the acceleration of the user sliding on the view is different, the mapping relationship between the gesture sliding offset distance and the view scaling designed in this embodiment is a second-order curve relationship, and the second-order curve relationship can be designed as the following function equation:
y = ax² + bx + c;
where a, b and c are constants, x is the gesture sliding offset distance with value range 0 to max(|Δx'|, |Δy'|), and y has value range 0 to 1.1, excluding 0. For ease of understanding, the designed functional relationship graph is shown in fig. 2, where X1 to X2 is the value range of the X axis and the Y axis represents the view scaling value, ranging from 0 to 1.1. Under the constraints of the value ranges of the gesture sliding offset distance and of the view scaling, a second-order curve can be drawn; the constant coefficients a, b and c can then be solved from 3 points on the curve, giving the specific functional relation, i.e. the functional mapping between the gesture sliding offset distance and the view scaling.
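The coefficient-solving step can be sketched in plain Java via Newton's divided differences; the three sample points used in the test are invented for illustration, since the patent does not fix concrete values:

```java
// Sketch: recover the coefficients a, b, c of the second-order curve
// y = a*x^2 + b*x + c that passes through three (x, y) sample points,
// e.g. three points read off the designed offset-to-scale curve.
public class ScaleCurve {

    /** Returns {a, b, c} of y = a x^2 + b x + c through the three points. */
    public static double[] fit(double x1, double y1,
                               double x2, double y2,
                               double x3, double y3) {
        // Newton divided differences, then expand to monomial form.
        double d1 = (y2 - y1) / (x2 - x1);
        double d2 = ((y3 - y2) / (x3 - x2) - d1) / (x3 - x1);
        double a = d2;
        double b = d1 - d2 * (x1 + x2);
        double c = y1 - d1 * x1 + d2 * x1 * x2;
        return new double[] { a, b, c };
    }

    /** Evaluates y = a x^2 + b x + c at x. */
    public static double eval(double[] abc, double x) {
        return abc[0] * x * x + abc[1] * x + abc[2];
    }
}
```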
Similarly, for the mapping relationship between the gesture sliding offset distance and the view transparency, in this embodiment, the view transparency and the gesture sliding offset distance are linearly related, and the functional mapping relationship between the gesture sliding offset distance and the view transparency may be designed as follows:
y = dx + e;
where d and e are constants, x is the gesture sliding offset distance with value range 0 to max(|Δx'|, |Δy'|), and y is the view transparency, with value range 0 to 1, excluding 0. For ease of understanding, the designed functional relationship graph is shown in fig. 3: the value range of the X axis is X1 to X2, the corresponding Y-axis range is 0 to 1, and the relationship is linear.
Under the constraints of the value ranges of the gesture sliding offset distance and of the view transparency, a straight line can be drawn according to the specific requirements; the constant coefficients d and e can then be solved from two points on the line, giving the specific functional relation, i.e. the functional mapping between the gesture sliding offset distance and the view transparency.
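A hedged plain-Java sketch of solving the line's coefficients d and e from two sample points (the points in the test are invented for illustration):

```java
// Sketch: recover d and e of the first-order line y = d*x + e that maps
// the gesture sliding offset distance to the view transparency, from two
// (x, y) sample points on the designed line.
public class TransparencyLine {

    /** Returns {d, e} of y = d x + e through (x1, y1) and (x2, y2). */
    public static double[] fit(double x1, double y1, double x2, double y2) {
        double d = (y2 - y1) / (x2 - x1); // slope
        double e = y1 - d * x1;           // intercept
        return new double[] { d, e };
    }

    /** Evaluates y = d x + e at x. */
    public static double eval(double[] de, double x) {
        return de[0] * x + de[1];
    }
}
```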
By the method, the mapping relation between the gesture sliding offset distance and the view scaling is designed, the mapping relation between the gesture sliding offset distance and the view transparency is designed, and the relation between the view scaling and the transparency and the gesture sliding offset distance is established.
In another embodiment of the present invention, the step S3 specifically includes: in the process of sliding the view by the gesture, obtaining the corresponding view scaling and transparency according to the sliding offset distance of the current gesture in real time, and performing animation display on the view; or when a gesture lifting event is monitored, obtaining the corresponding view scaling and transparency according to the gesture sliding offset distance when the gesture is lifted, and performing animation display on the view.
In this embodiment, the mapping relations between the gesture sliding offset distance and the view scaling and the view transparency have been obtained. During the user's gesture slide on the view, the view's real-time scaling and transparency can be obtained from the real-time gesture sliding offset distance, and the view displayed with the corresponding animation. Denote the zooming animation Anima1 and the transparency animation Anima2. During the gesture slide on the view, the corresponding view scaling and view transparency are obtained in real time according to the current gesture sliding offset distance, and the view is displayed accordingly; or, when a gesture lifting event is monitored, the corresponding view scaling and view transparency are obtained according to the gesture sliding offset distance at the moment the gesture is lifted, and the view is displayed accordingly.
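As an illustrative, self-contained sketch of this real-time computation — all concrete coefficients and the maximum offset are invented assumptions; the patent fixes only the ranges 0 < P ≤ 1.1 and 0 < T ≤ 1:

```java
// Hypothetical step-S3 computation: map the current gesture sliding
// offset distance x to a (scaling, transparency) pair, clamped to the
// preset ranges 0 < P <= 1.1 and 0 < T <= 1.
public class ViewEffectParams {

    static final double X_MAX = 200.0; // assumed maximum offset distance

    // Second-order curve: rises to the 1.1 overshoot mid-gesture and
    // comes back to 1.0 at X_MAX, giving the elastic "physical inertia".
    public static double scale(double x) {
        double a = -0.00005, b = 0.014, c = 0.2; // invented coefficients
        double p = a * x * x + b * x + c;
        return Math.min(Math.max(p, 0.01), 1.1); // keep within (0, 1.1]
    }

    // First-order line: transparency grows linearly with the offset.
    public static double alpha(double x) {
        double t = 0.3 + (0.7 / X_MAX) * x;     // invented coefficients
        return Math.min(Math.max(t, 0.01), 1.0); // keep within (0, 1]
    }
}
```

In a real Android view, the pair would then be applied each frame of the slide, or once on the lifting event, as the two branches above describe.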
To start the animation effects of Anima1 and Anima2 simultaneously, the AnimationSet class can be used; through this utility class, different animation special effects can be started for the same view. Specifically, the add method in the AnimationSet class is called to add the Anima1 and Anima2 animations into the AnimationSet, and then the start method of the AnimationSet is called, so that the Anima1 and Anima2 animations execute simultaneously.
Referring to fig. 4, there is provided an animation effect implementation apparatus for a view according to another embodiment of the present invention, including a listening module 21, an obtaining module 22, and a presentation module 23.
The monitoring module 21 is configured to monitor a gesture operation event of an upper view of the touch screen terminal;
the obtaining module 22 is configured to obtain a view animation effect parameter according to the gesture sliding offset distance of the view and a mapping relationship between the gesture sliding offset distance and the view animation effect parameter;
and the display module 23 is configured to perform animation display on the view according to the view animation effect parameter.
Referring to fig. 5, the animation effect implementation apparatus of the view further includes a calculation module 24, a determination module 25 and an establishing module 26.
The monitoring module 21 is further configured to monitor a gesture pressing event, and record horizontal and vertical coordinates (Ax, Ay) of a pressing point; and monitoring the gesture sliding event, and recording the horizontal and vertical coordinates (Ex, Ey) of the current position in the gesture sliding process.
And the calculating module 24 is configured to calculate a sliding offset distance Δ x on the horizontal axis as Ex-Ax and a sliding offset distance Δ y on the vertical axis as Ey-Ay, respectively, and use the larger of Δ x and Δ y as the sliding offset distance of the gesture to the view.
The monitoring module 21 is further configured to monitor a gesture lifting event, and record horizontal and vertical coordinates of a lifting point.
The determining module 25 is configured to determine a value range of the gesture sliding offset distance according to the horizontal and vertical coordinates of the gesture pressing point and the horizontal and vertical coordinates of the lifting point.
The establishing module 26 is configured to establish a mapping relationship between the gesture sliding offset distance and the view animation effect parameter according to the determined value range of the gesture sliding offset distance and the preset value range of the view animation effect parameter.
The determining module 25 is further configured to calculate a gesture sliding offset distance Δ x 'of a horizontal axis and a gesture sliding offset distance Δ y' of a vertical axis, which are Gx-Ax and Gy-Ay, respectively, and determine a value range of the gesture sliding offset distance according to a larger value of Δ x 'and Δ y', where horizontal and vertical coordinates of the lifting point are (Gx, Gy).
The display module 23 is specifically configured to:
in the process of sliding the view by the gesture, obtaining the corresponding view scaling and transparency according to the sliding offset distance of the current gesture in real time, and performing animation display on the view; or,
and when a gesture lifting event is monitored, obtaining the corresponding view scaling and transparency according to the gesture sliding offset distance when the gesture is lifted, and performing animation display on the view.
Fig. 6 is a block diagram illustrating a structure of an apparatus of a method for implementing animation effects of views according to an embodiment of the present application.
Referring to fig. 6, the apparatus of the animation effect implementation method of the view includes: a processor (processor)601, a memory (memory)602, and a bus 603; wherein, the processor 601 and the memory 602 complete the communication with each other through the bus 603.
The processor 601 is configured to call the program instructions in the memory 602 to perform the methods provided by the above method embodiments, for example, including: monitoring a gesture operation event of an upper view of the touch terminal; obtaining view animation effect parameters according to the gesture sliding offset distance of the view and the mapping relationship between the gesture sliding offset distance and the view animation effect parameters; and performing animation display on the view according to the view animation effect parameters.
The present embodiment discloses a computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to perform the method provided by the above method embodiments, for example, including: monitoring a gesture operation event of an upper view of the touch terminal; obtaining view animation effect parameters according to the gesture sliding offset distance of the view and the mapping relationship between the gesture sliding offset distance and the view animation effect parameters; and performing animation display on the view according to the view animation effect parameters.
The present embodiment provides a non-transitory computer-readable storage medium storing computer instructions that cause a computer to perform the methods provided by the above method embodiments, for example, including: monitoring a gesture operation event of an upper view of the touch terminal; obtaining view animation effect parameters according to the gesture sliding offset distance of the view and the mapping relationship between the gesture sliding offset distance and the view animation effect parameters; and performing animation display on the view according to the view animation effect parameters.
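The three steps that recur throughout these embodiments (monitor a gesture event, map the sliding offset distance to animation parameters, apply them to the view) can be sketched in plain Java. All names here are illustrative, and the quadratic and linear coefficients are assumptions, since the embodiments fix only the parameter ranges and curve shapes:

```java
// Minimal gesture-to-animation pipeline: a press records the start point,
// a slide or lift computes the sliding offset distance and maps it to a
// view scaling and a transparency value.
class ViewAnimator {
    enum Action { DOWN, MOVE, UP }

    private float ax, ay;            // press point (Ax, Ay)
    private final float maxOffset;   // upper bound of the offset value range
    float scale = 1f;                // view scaling P
    float alpha = 1f;                // view transparency T

    ViewAnimator(float maxOffset) { this.maxOffset = maxOffset; }

    void onEvent(Action action, float x, float y) {
        if (action == Action.DOWN) {            // gesture press: record (Ax, Ay)
            ax = x;
            ay = y;
            return;
        }
        // gesture slide or lift: the offset is the larger of dx and dy
        float d = Math.max(x - ax, y - ay);
        float t = Math.max(0f, Math.min(1f, d / maxOffset));
        scale = 1.0f + 0.1f * t * t;            // quadratic mapping (assumed coefficients)
        alpha = 1.0f - 0.5f * t;                // linear mapping (assumed coefficients)
        // a real Android implementation would now call view.setScaleX/setScaleY
        // and view.setAlpha with these values
    }
}
```

In an Android app the same logic would live in an `onTouchEvent` handler keyed on `MotionEvent` actions; the plain enum above stands in for that so the sketch has no framework dependency.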
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be implemented by program instructions running on related hardware. The program may be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments. The aforementioned storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disk.
The above-described apparatus embodiments of the view animation effect implementation method are merely illustrative. Units described as separate components may or may not be physically separate, and components displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the solution without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware. Based on this understanding, the above technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as a ROM/RAM, a magnetic disk, or an optical disk, and which includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute the methods of the embodiments or parts thereof.
According to the view animation implementation method and apparatus provided by the invention, a gesture operation event on the touch terminal is monitored, and two mapping relationships are established between the gesture sliding offset distance and, respectively, the scaling ratio and the transparency of the view. By combining the two mappings, the scaling and transparency animation effects of the view are realized at the same time, so that the view acquires the required elastic animation effect with a physical feel. By skillfully establishing the mapping between gestures and animation effects, the user experience and the friendliness of the program's human-computer interaction are greatly improved.
Finally, it should be noted that the above embodiments are only preferred embodiments of the present application and are not intended to limit the protection scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall be included in the protection scope of the present invention.

Claims (10)

1. A method for implementing an animation effect of a view, applied to a touch terminal, comprising the following steps:
S1, monitoring a gesture operation event of an upper view of the touch terminal;
S2, obtaining view animation effect parameters according to the gesture sliding offset distance of the view and the mapping relationship between the gesture sliding offset distance and the view animation effect parameters;
and S3, performing animation display on the view according to the view animation effect parameters.
2. The method of claim 1, wherein the gesture operation events include a gesture press event, a gesture slide event, and a gesture lift event;
in the step S2, the gesture sliding offset distance of the view is calculated as follows:
s21a, monitoring a gesture pressing event, and recording the horizontal and vertical coordinates (Ax, Ay) of a pressing point;
s22a, monitoring a gesture sliding event, and recording horizontal and vertical coordinates (Ex, Ey) of the current position of the finger in the gesture sliding process;
in S23a, the slide offset distance Δ x on the horizontal axis is Ex-Ax and the slide offset distance Δ y on the vertical axis is Ey-Ay, and the larger of Δ x and Δ y is taken as the gesture slide offset distance of the opposite view.
3. The method for implementing an animation effect of a view according to claim 2, wherein the mapping relationship between the gesture sliding offset distance of the view and the view animation effect parameters in step S2 is established as follows:
s21b, detecting a gesture lifting event, and recording horizontal and vertical coordinates of a lifting point;
s22b, determining the value range of the gesture sliding offset distance according to the horizontal and vertical coordinates of the gesture pressing point and the horizontal and vertical coordinates of the lifting point;
and S23b, establishing a mapping relation between the gesture sliding offset distance and the view animation effect parameter according to the determined value range of the gesture sliding offset distance and the preset value range of the view animation effect parameter.
4. The method for realizing animation effects of views according to claim 3, wherein the step S22b specifically comprises:
let the horizontal and vertical coordinates of the lift point be (Gx, Gy); when the gesture is lifted, obtain the gesture sliding offset distance Δx′ = Gx − Ax on the horizontal axis and Δy′ = Gy − Ay on the vertical axis, and determine the value range of the gesture sliding offset distance according to the larger of Δx′ and Δy′.
5. The method of claim 3, wherein the animation effect parameters include a view scale and a view transparency.
6. The method for implementing animation effects of a view according to claim 5, wherein the preset value range of the view scaling ratio P is 0 < P ≤ 1.1, and the preset value range of the view transparency T is 0 < T ≤ 1.
7. The method of claim 6, wherein the gesture sliding offset distance has a second-order (quadratic) curve relationship with the view scaling, and a first-order linear relationship with the view transparency.
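Claims 6 and 7 fix only the parameter ranges (0 < P ≤ 1.1, 0 < T ≤ 1) and the curve shapes (quadratic for scaling, linear for transparency), not the coefficients. One plausible parameterization over a normalized offset d/dmax, with endpoint values chosen purely for illustration, would be:

```java
// Maps a gesture sliding offset distance onto the two animation parameters.
// The endpoints (scale growing from 1.0 to 1.1, transparency fading from
// 1.0 to 0.5) are illustrative assumptions; the claims only fix the ranges
// and the quadratic/linear curve shapes.
class AnimationMapping {
    private final float maxOffset;  // upper bound of the offset value range (claim 3)

    AnimationMapping(float maxOffset) {
        this.maxOffset = maxOffset;
    }

    // Second-order (quadratic) mapping: offset -> view scaling P, with P <= 1.1.
    float scale(float offset) {
        float t = clamp01(offset / maxOffset);
        return 1.0f + 0.1f * t * t;
    }

    // First-order (linear) mapping: offset -> view transparency T, with T <= 1.
    float transparency(float offset) {
        float t = clamp01(offset / maxOffset);
        return 1.0f - 0.5f * t;
    }

    private static float clamp01(float v) {
        return Math.max(0f, Math.min(1f, v));
    }
}
```

The quadratic curve makes the scaling change slowly at the start of the drag and faster near the end, which is what gives the animation its "elastic" feel, while the transparency fades at a constant rate.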
8. The method for realizing animation effects of views according to claim 7, wherein the step S3 specifically includes:
during the gesture slide on the view, obtaining the corresponding view scaling and transparency in real time according to the current gesture sliding offset distance, and performing animation display on the view; or,
when a gesture lift event is monitored, obtaining the corresponding view scaling and transparency according to the gesture sliding offset distance at the moment of lifting, and performing animation display on the view.
9. An apparatus for implementing an animation effect of a view, comprising:
a monitoring module, configured to monitor a gesture operation event of an upper view of a touch terminal;
an obtaining module, configured to obtain view animation effect parameters according to the gesture sliding offset distance of the view and the mapping relationship between the gesture sliding offset distance and the view animation effect parameters;
and a display module, configured to perform animation display on the view according to the view animation effect parameters.
10. An apparatus for implementing an animation effect of a view, comprising a processor, a memory, and a bus;
the processor and the memory complete mutual communication through the bus;
the memory stores program instructions executable by the processor, the processor invoking the program instructions to perform the method of any one of claims 1-8.
CN201710406260.2A 2017-06-02 2017-06-02 The animation effect implementation method and device of a kind of view Pending CN107340964A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710406260.2A CN107340964A (en) 2017-06-02 2017-06-02 The animation effect implementation method and device of a kind of view

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710406260.2A CN107340964A (en) 2017-06-02 2017-06-02 The animation effect implementation method and device of a kind of view

Publications (1)

Publication Number Publication Date
CN107340964A true CN107340964A (en) 2017-11-10

Family

ID=60221478

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710406260.2A Pending CN107340964A (en) 2017-06-02 2017-06-02 The animation effect implementation method and device of a kind of view

Country Status (1)

Country Link
CN (1) CN107340964A (en)



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160231895A1 (en) * 2013-07-24 2016-08-11 Innoventions, Inc. Tilt-Based View Scrolling with Baseline Update for Proportional and Dynamic Modes
CN104780093A (en) * 2014-01-15 2015-07-15 阿里巴巴集团控股有限公司 Method and device for processing expression information in instant messaging process
CN103942050A (en) * 2014-04-15 2014-07-23 Tcl集团股份有限公司 Implementation method and system for applying animation to Android platform
CN105373291A (en) * 2015-11-11 2016-03-02 北京麒麟合盛网络技术有限公司 Interface switching method and device
CN106445337A (en) * 2016-09-13 2017-02-22 广州视睿电子科技有限公司 Method and device for realizing spotlight effect

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107967102A (en) * 2017-12-29 2018-04-27 北京酷我科技有限公司 A kind of view control in android system
CN108628455A (en) * 2018-05-14 2018-10-09 中北大学 A kind of virtual husky picture method for drafting based on touch-screen gesture identification
CN108628455B (en) * 2018-05-14 2021-10-15 中北大学 Virtual sand painting drawing method based on touch screen gesture recognition
CN110704138A (en) * 2018-06-25 2020-01-17 马上消费金融股份有限公司 Animation effect realization method and device and computer readable storage medium
CN110704138B (en) * 2018-06-25 2021-04-23 马上消费金融股份有限公司 Method, device and computer-readable storage medium for realizing animation effect
CN110619615A (en) * 2018-12-29 2019-12-27 北京时光荏苒科技有限公司 Method and apparatus for processing image
WO2022062985A1 (en) * 2020-09-25 2022-03-31 荣耀终端有限公司 Method and apparatus for adding special effect in video, and terminal device
CN112214856A (en) * 2020-11-04 2021-01-12 上海理工大学 Precision machine tool rigidity optimization design method for overall structure
CN112882637A (en) * 2021-02-23 2021-06-01 上海哔哩哔哩科技有限公司 Interaction method for multi-layer animation display and browser
CN112882638A (en) * 2021-02-23 2021-06-01 上海哔哩哔哩科技有限公司 Multi-layer animation display method and device
WO2024098713A1 (en) * 2022-11-11 2024-05-16 中兴通讯股份有限公司 Terminal desktop display method, terminal and computer-readable medium

Similar Documents

Publication Publication Date Title
CN107340964A (en) The animation effect implementation method and device of a kind of view
CN109308469B (en) Method and apparatus for generating information
CN109242765B (en) Face image processing method and device and storage medium
CN109621418B (en) Method and device for adjusting and making expression of virtual character in game
CN105260103B (en) A kind of picture Zoom method and electronic equipment
RU2598802C2 (en) Animation playing method, device and apparatus
US20160239186A1 (en) Systems and methods for automated generation of graphical user interfaces
WO2017032078A1 (en) Interface control method and mobile terminal
CN104267931B (en) A kind of information processing method and electronic equipment
CN113359995B (en) Human-computer interaction method, device, device and storage medium
US10019087B2 (en) Touch input method and apparatus
CN109213668A (en) Operation note method, apparatus and terminal
CN108604142B (en) A touch screen device operation method and touch screen device
WO2015176376A1 (en) Method and device for automatically adjusting valid touch point, and computer storage medium
WO2016018682A1 (en) Processing image to identify object for insertion into document
KR101944454B1 (en) Information processing program and information processing method
CN102707917B (en) Method and device for visualizing high-dimensional data
US20150054847A1 (en) Status display controller, status display control method, and recording medium that stores program
CN105653249A (en) Character size adjusting method and device
CN106959991B (en) Dynamic presentation method and device for large data visualization analysis and terminal
CN117058565A (en) Interface rolling speed determining method and device, intelligent equipment and storage medium
CN115798026A (en) Eyeball living body detection method, device, storage medium and computer equipment
CN105320421A (en) Message displaying method and device, and terminal
CN117274876B (en) Video clip scoring method and device, electronic equipment and medium
CN114723855B (en) Image generation methods, apparatus, devices and media

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20171110

RJ01 Rejection of invention patent application after publication