
CN112367533B - Interactive service processing method, device, equipment and computer readable storage medium - Google Patents

Info

Publication number
CN112367533B
CN112367533B (application CN202011248528.2A)
Authority
CN
China
Prior art keywords
interaction
target
competition
interactive
group
Prior art date
Legal status (an assumption, not a legal conclusion)
Active
Application number
CN202011248528.2A
Other languages
Chinese (zh)
Other versions
CN112367533A (en)
Inventor
谢晨 (Xie Chen)
Current Assignee (the listed assignees may be inaccurate)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (an assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202011248528.2A
Publication of CN112367533A
Application granted
Publication of CN112367533B
Current legal status: Active
Anticipated expiration


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4781 Games
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788 Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/8166 Monomedia components thereof involving executable data, e.g. software
    • H04N21/8173 End-user applications, e.g. Web browser, game

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application discloses an interactive service processing method, apparatus, and device, and a computer-readable storage medium. The method includes: displaying, on a play page of a competitive interaction video, interaction controls respectively corresponding to at least two competitive interaction groups; and displaying, in the play page, a target interaction effect corresponding to a target competitive interaction group based on a trigger instruction for the interaction control corresponding to that group, where the target group is any one of the at least two groups, different groups correspond to different interaction controls, and different interaction controls correspond to different target interaction effects. Because multiple interaction controls are provided for an interactive object to trigger, and different target interaction effects are displayed when different controls are triggered, the interactive service can be processed in a richer variety of ways. Moreover, because target interaction effects map one-to-one to competitive interaction groups, the displayed effects are more targeted, which helps improve the interaction rate.

Description

Interactive service processing method, device, equipment and computer readable storage medium
Technical Field
Embodiments of this application relate to the field of computer technology, and in particular to an interactive service processing method, apparatus, and device, and a computer-readable storage medium.
Background
With the development of computer technology, more and more applications can play competitive interaction videos for interactive objects (users) to watch. A competitive interaction video is a video in which at least two competitive interaction groups compete and interact with each other, for example, a live video of a game match between team A and team B.
In the related art, a single universal interaction control is displayed on the play page of a competitive interaction video, and when a trigger instruction for that control is detected, the terminal displays a default interaction effect. Because the universal control and the default effect are shared by all competitive interaction groups, the displayed effect is poorly targeted, the interactive service can be processed in only one way, and the interaction rate is low.
Disclosure of Invention
Embodiments of this application provide an interactive service processing method, apparatus, and device, and a computer-readable storage medium, which can improve the targeting of the displayed interaction effect. The technical solutions are as follows:
in one aspect, an embodiment of the present application provides a method for processing an interactive service, where the method includes:
displaying interactive controls respectively corresponding to at least two competition interactive groups on a playing page of a competition interactive video, wherein the competition interactive video is a video for performing competition interaction on the at least two competition interactive groups;
and displaying a target interaction effect corresponding to the target competition interaction group in the playing page based on a trigger instruction of an interaction control corresponding to the target competition interaction group, wherein the target competition interaction group is any one of the at least two competition interaction groups, different competition interaction groups correspond to different interaction controls, and different interaction controls correspond to different target interaction effects.
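The claimed two-step flow can be sketched in code. The following Python sketch is illustrative only; the class and effect names are hypothetical and not part of the claims. It shows the one-to-one mapping the claims require: each competitive interaction group has its own interaction control and its own target interaction effect, and a trigger instruction for a group's control displays that group's effect.

```python
class PlayPage:
    """Hypothetical model of the play page of a competitive interaction video."""

    def __init__(self, groups):
        # One interaction control and one target interaction effect per
        # competitive interaction group (one-to-one mapping per the claims).
        self.controls = {g: f"control:{g}" for g in groups}
        self.effects = {g: f"effect:{g}" for g in groups}
        self.displayed = []

    def on_trigger(self, group):
        # A trigger instruction for a group's control displays that
        # group's target interaction effect in the play page.
        effect = self.effects[group]
        self.displayed.append(effect)
        return effect

page = PlayPage(["team_a", "team_b"])
assert page.on_trigger("team_a") == "effect:team_a"
assert page.on_trigger("team_b") == "effect:team_b"
```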
In another aspect, an apparatus for processing an interactive service is provided, where the apparatus includes:
the display unit is used for displaying interactive controls corresponding to at least two competition interactive groups on a playing page of a competition interactive video, and the competition interactive video is a video for performing competition interaction on the at least two competition interactive groups;
and the display unit is used for displaying a target interaction effect corresponding to the target competition interaction group in the playing page based on a trigger instruction of an interaction control corresponding to the target competition interaction group, wherein the target competition interaction group is any one of the at least two competition interaction groups, different competition interaction groups correspond to different interaction controls, and different interaction controls correspond to different target interaction effects.
In one possible implementation, the target interaction effect corresponding to the target competitive interaction group is a first interaction effect or a second interaction effect. The display unit is configured to: in response to the trigger instruction for the interaction control corresponding to the target group not satisfying a hit condition, display the first interaction effect in the play page, where the first interaction effect includes at least one of a target lighting effect corresponding to the target group and a first dynamic icon effect corresponding to the target group; and in response to the trigger instruction satisfying the hit condition, display the second interaction effect in the play page, where the second interaction effect includes a second dynamic icon effect corresponding to the target group in addition to at least one of the target lighting effect and the first dynamic icon effect.
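The hit-condition branch described above can be sketched as follows. This is a minimal Python illustration with hypothetical effect names; the patent does not specify the hit condition itself here, so it is passed in as a boolean.

```python
def select_interaction_effect(hit_condition_met: bool) -> set:
    """Choose between the claimed first and second interaction effects.

    The first effect combines the target lighting effect and the first
    dynamic icon effect; the second effect adds the second dynamic icon
    effect on top of the first. Names are illustrative only.
    """
    first_effect = {"target_lighting", "first_dynamic_icon"}
    if not hit_condition_met:
        return first_effect
    # Hit condition satisfied: superimpose the second dynamic icon effect.
    return first_effect | {"second_dynamic_icon"}

assert select_interaction_effect(False) == {"target_lighting", "first_dynamic_icon"}
assert "second_dynamic_icon" in select_interaction_effect(True)
```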
In a possible implementation manner, the display unit is further configured to dynamically display lighting of a target color in a first target area in the playing page; dynamically displaying a first icon of the target color in a second target area in the playing page; dynamically displaying a second icon of the target color in a third target area in the playing page; the target color is a color corresponding to the target competition interaction group, and different competition interaction groups correspond to different colors.
In a possible implementation manner, the playing page further displays support rate information corresponding to each of the at least two competitive interaction groups; the device further comprises:
the obtaining unit is used for obtaining updated support rate information corresponding to the target competition interaction group based on a trigger instruction of an interaction control corresponding to the target competition interaction group;
and the replacing unit is used for replacing the support rate information corresponding to the target competition interaction group displayed in the playing page with the updated support rate information.
In a possible implementation manner, a support ratio bar is further displayed on the playing page, and the support ratio bar is used for indicating support ratios respectively corresponding to the at least two competitive interaction groups;
the obtaining unit is used for obtaining updated support rate information corresponding to the target competition interaction group based on a trigger instruction of an interaction control corresponding to the target competition interaction group;
the device further comprises:
a determining unit, configured to determine an updated support ratio bar based on the updated support rate information and the current support rate information of the competition interaction groups other than the target competition interaction group;
the replacing unit is further configured to replace the support ratio bar displayed in the playing page with the updated support ratio bar.
In one possible implementation, the apparatus further includes:
the receiving unit is used for receiving the interaction information corresponding to the target competition interaction group sent by the server;
the display unit is further configured to display a reference interaction effect corresponding to the target competition interaction group in the play page based on the interaction information, and different competition interaction groups correspond to different reference interaction effects.
In a possible implementation manner, the receiving unit is configured to receive interaction information corresponding to the target competition interaction group sent by a server;
the display unit is further configured to display, in the play page, a superimposed interaction effect corresponding to the target competition interaction group based on the interaction information and a trigger instruction for the interaction control corresponding to the target group, where the superimposed interaction effect is the overlay of the target interaction effect corresponding to the target group and the reference interaction effect corresponding to the target group, and different competition interaction groups correspond to different reference interaction effects.
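The superimposed effect in this implementation can be sketched as combining the locally triggered target effect with the reference effect carried by the server-sent interaction information. All names in this Python sketch are hypothetical.

```python
def superimpose(target_effect: list, reference_effect: list) -> list:
    """Overlay the locally triggered target interaction effect with the
    reference interaction effect carried by the server's interaction
    information (layers listed bottom to top; names illustrative)."""
    return target_effect + reference_effect

layers = superimpose(["lighting:red", "icon:red"], ["reference:team_a_cheer"])
assert layers == ["lighting:red", "icon:red", "reference:team_a_cheer"]
```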
In another aspect, a computer device is provided, where the computer device includes a processor and a memory, where the memory stores at least one computer program, and the at least one computer program is loaded and executed by the processor to implement any one of the above interactive service processing methods.
In another aspect, a computer-readable storage medium is provided, in which at least one computer program is stored; the at least one computer program is loaded and executed by a processor to implement any of the above interactive service processing methods.
In another aspect, a computer program product or computer program is provided, including computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the storage medium and executes them, causing the computer device to perform any of the above interactive service processing methods.
The technical scheme provided by the embodiment of the application at least has the following beneficial effects:
in the embodiments of this application, interaction controls corresponding to the respective competitive interaction groups are displayed on the play page of a competitive interaction video, and different target interaction effects are displayed in the play page in response to trigger instructions for the controls of different groups. Multiple interaction controls are thus provided for the interactive object to trigger, different target interaction effects are displayed when different controls are triggered, and the interactive service can be processed in a rich variety of ways. Moreover, because target interaction effects map one-to-one to competitive interaction groups, the displayed effects are more targeted, which helps improve the interaction experience of the interactive object and thus the interaction rate.
Drawings
To describe the technical solutions in the embodiments of this application more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of this application, and those of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of a method for processing an interactive service according to an embodiment of the present application;
fig. 2 is a flowchart of a method for processing an interactive service according to an embodiment of the present application;
fig. 3 is a schematic diagram of a play page of a competitive interaction video provided by an embodiment of the present application;
fig. 4 is a schematic diagram of a play page of another competitive interaction video provided by an embodiment of the present application;
fig. 5 is a schematic diagram of a play page of another competitive interaction video provided by an embodiment of the present application;
fig. 6 is a schematic diagram of a process for processing interactive services corresponding to team A according to an embodiment of the present application;
fig. 7 is a schematic diagram of a process for processing interactive services corresponding to team B according to an embodiment of the present application;
fig. 8 is a schematic diagram of a processing apparatus for interactive services according to an embodiment of the present application;
fig. 9 is a schematic diagram of a processing apparatus for interactive services according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a server provided in an embodiment of the present application;
fig. 11 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It should be noted that the terms "first", "second", and the like in the specification and claims of this application are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It should be understood that data so used are interchangeable where appropriate, so that the embodiments described herein can be implemented in orders other than those illustrated or described. The implementations described in the following exemplary embodiments do not represent all implementations consistent with this application; rather, they are merely examples of apparatuses and methods consistent with some aspects of this application, as detailed in the appended claims.
Referring to fig. 1, a schematic diagram of an implementation environment of a method for processing an interactive service provided in an embodiment of the present application is shown. The implementation environment includes: a terminal 11 and a server 12.
An application program or web page capable of playing competitive interaction videos is installed on the terminal 11. When the application or web page needs to process an interactive object's interactive service on the play page of a competitive interaction video, the method provided by the embodiments of this application can be applied.
The server 12 provides background services for the application or web page on the terminal 11 that plays competitive interaction videos. In one possible implementation, the server 12 undertakes the primary computing work and the terminal 11 the secondary computing work; or the server 12 undertakes the secondary computing work and the terminal 11 the primary computing work; or the server 12 and the terminal 11 compute cooperatively using a distributed computing architecture.
In one possible implementation, the terminal 11 is any electronic product capable of human-computer interaction with a user through one or more of a keyboard, touch pad, touch screen, remote controller, voice interaction, or handwriting device, for example a PC (Personal Computer), mobile phone, smartphone, PDA (Personal Digital Assistant), wearable device, handheld portable game device, pocket PC, tablet computer, smart in-vehicle terminal, smart television, or smart speaker. The server 12 is a single server, a server cluster composed of multiple servers, or a cloud computing service center. The terminal 11 establishes a communication connection with the server 12 through a wired or wireless network.
Those skilled in the art should understand that the above terminal 11 and server 12 are only examples; other existing or future terminals or servers that are applicable to this application also fall within its protection scope.
Based on the implementation environment shown in fig. 1, an embodiment of this application provides an interactive service processing method, described here as applied to the terminal 11. As shown in fig. 2, the method includes the following steps:
in step 201, interactive controls corresponding to at least two competition interaction groups are displayed on a playing page of the competition interaction video, and the competition interaction video is a video for performing competition interaction on the at least two competition interaction groups.
A competitive interaction video is a video in which at least two competitive interaction groups compete and interact. Each group includes at least one competitive object, and different groups may have the same or different numbers of objects. A competitive object may be a real object or a virtual object, which is not limited in the embodiments of this application. Illustratively, when the competitive objects are real, a competitive interaction group is a group of real objects, and the competitive interaction video is a video capturing the real competitive interaction of such groups. For example, the groups are game teams, each made up of at least one player, and the video captures a real match between at least two teams.
In an exemplary embodiment, the competitive objects may also be virtual objects. In that case, a competitive interaction group is a group of virtual objects, and the competitive interaction video captures the virtual competitive interaction of such groups, that is, the interaction of the groups within a virtual environment, with the video shot of that virtual environment.
It should be noted that the embodiments of this application do not limit the number of competitive interaction groups, as long as it is at least 2. Illustratively, the number of groups is 2; alternatively, the number of groups is 3.
When the interactive object using the terminal wants to watch a competitive interaction video, the application or web page on the terminal plays the video. In an exemplary embodiment, the video may be a live stream or an on-demand video, which is not limited here. The video is played inside a play frame, and the page containing that play frame is referred to in the embodiments of this application as the play page of the competitive interaction video. The interactive object can watch the video in the play frame and can also interact while watching. Interaction controls respectively corresponding to the at least two competitive interaction groups can be displayed on the play page to support such interaction.
In an exemplary embodiment, the embodiments of this application do not limit when the interaction controls corresponding to the at least two competitive interaction groups are displayed on the play page. Illustratively, when the terminal starts playing the competitive interaction video, the controls are displayed automatically; alternatively, when the terminal receives a display instruction for the interaction controls, the controls are displayed in response to that instruction.
In an exemplary embodiment, an interaction-control display entry is shown on the play page. In response to detecting the interactive object's trigger instruction on this entry, the terminal obtains a display instruction for the interaction controls and then displays the controls corresponding to the at least two groups on the play page. The position and display form of the entry are not limited in the embodiments of this application; for example, the entry is placed at the bottom of the play page as a button or a triggerable icon.
In the play page of the competitive interaction video, the positional relationship between the play frame and the interaction controls of the groups is set according to experience or flexibly adjusted according to the number of groups, which is not limited here. Illustratively, as shown in fig. 3, in the play page 31, the play frame 311 is located near the top of the page, and the interaction controls of the two groups, team A and team B, are located near the bottom: the control 312 corresponding to team A on the left and the control 313 corresponding to team B on the right.
The display form of the interaction control is not limited in the embodiment of the application, for example, the display form of the interaction control is a button or a triggerable icon. In an exemplary embodiment, different competitive interaction groups respectively correspond to different colors, and the interaction controls corresponding to the different competitive interaction groups are displayed by using the different colors. In this way, different interactive controls corresponding to different competitive interactive groups can be visually distinguished by using colors.
In one possible implementation, the colors corresponding to the competitive interaction groups are preconfigured by the server. The server can send the terminal a preset color configuration table containing each group's color, so that the terminal can determine the colors based on the table and then display the differently colored interaction controls on the play page.
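The server-configured color table can be sketched as a simple lookup the terminal applies when rendering each group's control. In this Python sketch the table contents, color values, and helper function are all hypothetical.

```python
# Hypothetical color configuration table pushed by the server to the terminal;
# each competitive interaction group maps to a distinct display color.
COLOR_CONFIG = {
    "team_a": "#E53935",  # illustrative red
    "team_b": "#1E88E5",  # illustrative blue
}

def control_color(group: str) -> str:
    """Look up the display color for a group's interaction control."""
    return COLOR_CONFIG[group]

# Different groups get visually distinct controls.
assert control_color("team_a") != control_color("team_b")
```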
In one possible implementation, the play page of the competitive interaction video may be used by a single interactive object watching alone, or by multiple interactive objects watching together. In the latter case, the play page is displayed in a group conversation room, and all interactive objects who enter the room can watch the video together. Illustratively, a group conversation room is a group chat room, a chat room, a live-streaming room, or the like. All interactive objects in the room can socialize online, and the interactive service processing method provided by the embodiments of this application can be applied to this multi-person online social setting. In an exemplary embodiment, all interactive objects in the room belong to one social group; for example, the social group is an Instant Messaging (IM) group, each interactive object is a member of the IM group, and the group members can watch the competitive interaction video together.
In an exemplary embodiment, when multiple interactive objects enter a group conversation room and activate the function of watching the competitive interaction video together, the interaction controls corresponding to the at least two competitive interaction groups are displayed on the play page.
In an exemplary embodiment, in addition to the interaction controls corresponding to the at least two competitive interaction groups, at least one of the support rate information and the support ratio bar corresponding to those groups can be displayed in the play page.
The support rate information corresponding to any competition interaction group is used for indicating the support rate corresponding to that competition interaction group. Illustratively, the support rate corresponding to any competition interaction group refers to the amount of support that the competition interaction group has received from all interactive objects viewing the competitive interaction video together. In one possible implementation manner, the support rate corresponding to any competition interaction group is represented by the number of times that all interactive objects viewing the competitive interaction video together have triggered the interaction control corresponding to that competition interaction group. In an exemplary embodiment, the support rate information corresponding to a certain competition interaction group includes the support rate corresponding to the competition interaction group and the group identifier of the competition interaction group. The group identifier uniquely identifies the competition interaction group; for example, the group identifier of the competition interaction group is the group name of the competition interaction group, or the group identifier of the competition interaction group is the group avatar of the competition interaction group, and the like.
The support ratio bar is used for indicating the support rate proportions respectively corresponding to the at least two competition interaction groups. The support rate proportion corresponding to any competition interaction group refers to the ratio of the support rate corresponding to that competition interaction group to the sum of the support rates corresponding to all the competition interaction groups. In an exemplary embodiment, the support ratio bar includes support ratio sub-bars corresponding to the respective competition interaction groups. The ratio of the length of the support ratio sub-bar corresponding to any competition interaction group to the total length of the support ratio bar equals the support rate proportion corresponding to that competition interaction group.
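The relationship between support rates and sub-bar lengths described above can be sketched as follows; the function name, the fixed total bar length, and the even split when no support exists yet are illustrative assumptions, not details of the embodiment:

```python
def sub_bar_lengths(support_rates, total_length):
    """Compute the length of each support ratio sub-bar.

    Each group's sub-bar length relative to the total bar length
    equals that group's support rate divided by the sum of all
    groups' support rates, as described above.
    """
    total_support = sum(support_rates)
    if total_support == 0:
        # No support yet (assumption): split the bar evenly among the groups.
        return [total_length / len(support_rates)] * len(support_rates)
    return [total_length * rate / total_support for rate in support_rates]

# Example with the support rates from the fig. 3 description:
# team A has 3455 supports, team B has 1240.
lengths = sub_bar_lengths([3455, 1240], total_length=100)
```

With these values, team A's sub-bar occupies roughly 73.6% of the bar, so it renders visibly longer than team B's sub-bar, matching the fig. 3 description.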
In a possible implementation manner, for the situation in which different competition interaction groups correspond to different colors, when the support rate information respectively corresponding to the at least two competition interaction groups is displayed, the entire support rate information of each competition interaction group is displayed in the color corresponding to that group; or only the support rate within the support rate information of each competition interaction group is displayed in the color corresponding to that group, and the like.
In one possible implementation manner, for the case that different competition interaction groups correspond to different colors, when the support ratio bar is displayed, each support ratio sub-bar included in the support ratio bar is displayed in the color corresponding to the competition interaction group to which that sub-bar corresponds.
Displaying the support rate information (or the support rate within the support rate information) of different competition interaction groups in distinguishing colors, and displaying the support ratio sub-bars of different competition interaction groups in distinguishing colors, improves the visual experience of the interactive object.
For example, as shown in fig. 3, it is assumed that the competitive interaction video played in the play box 311 relates to two competition interaction groups, namely team A and team B. Assuming that the color corresponding to team A is black and the color corresponding to team B is gray, the interaction control 312 corresponding to team A is displayed in black and the interaction control 313 corresponding to team B is displayed in gray. Furthermore, the support rate information 314 corresponding to team A is displayed in black, and the support rate information 315 corresponding to team B is displayed in gray.
In the support ratio bar 316, the support ratio sub-bar 3161 corresponding to team A is displayed in black, and the support ratio sub-bar 3162 corresponding to team B is displayed in gray. The support ratio sub-bar 3161 corresponding to team A and the support ratio sub-bar 3162 corresponding to team B together form the entire support ratio bar 316. Since the support rate information 314 shows that the support rate corresponding to team A is 3455 and the support rate information 315 shows that the support rate corresponding to team B is 1240, the support ratio sub-bar 3161 corresponding to team A is displayed longer than the support ratio sub-bar 3162 corresponding to team B.
In step 202, a target interaction effect corresponding to the target competition interaction group is displayed in the play page based on a trigger instruction of an interaction control corresponding to the target competition interaction group, the target competition interaction group is any one of at least two competition interaction groups, different competition interaction groups correspond to different interaction controls, and different interaction controls correspond to different target interaction effects.
After the interaction controls respectively corresponding to the at least two competition interaction groups are displayed on the playing page of the competitive interaction video, the interactive object can trigger the interaction control corresponding to one competition interaction group to show support for that group.
In an exemplary embodiment, in response to detecting that an interactive object triggers the interaction control corresponding to the target competition interaction group, the terminal acquires a trigger instruction of the interaction control corresponding to the target competition interaction group. At this time, the terminal considers that the interactive object supports the target competition interaction group, and the terminal displays the target interaction effect corresponding to the target competition interaction group in the playing page, so that the target interaction effect informs the interactive object that its support of the target competition interaction group has succeeded. The target competition interaction group is any one of the at least two competition interaction groups. The interactive object can trigger the interaction controls corresponding to different competition interaction groups at different moments. Different competition interaction groups correspond to different interaction controls, and different interaction controls correspond to different target interaction effects; that is, different competition interaction groups correspond to different target interaction effects. When the interactive object triggers the interaction controls corresponding to different competition interaction groups, the target interaction effects displayed in the playing page by the terminal differ, thereby improving the interactive experience of the interactive object.
The type of the trigger operation of the interactive control is set by a developer, or is flexibly adjusted according to the type of the terminal, which is not limited in the embodiment of the present application. Illustratively, the triggering operation of the interactive control refers to a clicking operation of the interactive control; or, the triggering operation of the interaction control refers to a touch operation of the interaction control, and the like.
It should be noted that, in the embodiment of the present application, a target interaction effect corresponding to a target competition interaction group is shown as an example for description. Because different competitive interaction groups correspond to different target interaction effects, when the target competitive interaction group changes, the displayed target interaction effect also changes.
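The per-group correspondence described above (each competition interaction group with its own interaction control, color, and target interaction effect) can be sketched as a simple lookup; the identifiers below are illustrative assumptions drawn from the fig. 3 example, not part of the embodiment:

```python
# Hypothetical registry: each competition interaction group maps to its own
# interaction control and color, as the text requires.
GROUPS = {
    "team_a": {"control_id": 312, "color": "black"},
    "team_b": {"control_id": 313, "color": "gray"},
}

def target_effect_for(group_id):
    """Return the target interaction effect descriptor for the group whose
    interaction control was triggered; distinct groups yield distinct effects
    because each group has its own color."""
    color = GROUPS[group_id]["color"]
    return {"light_color": color, "icon_color": color}

effect = target_effect_for("team_a")
```

Because the effect is derived from per-group attributes, changing the target competition interaction group automatically changes the displayed target interaction effect, as the text notes.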
In one possible implementation manner, the target interaction effect corresponding to the target competition interaction group is the first interaction effect or the second interaction effect. Based on the trigger instruction of the interaction control corresponding to the target competition interaction group, the modes for displaying the target interaction effect corresponding to the target competition interaction group in the playing page include the following two modes:
The first mode is as follows: in response to the trigger instruction of the interaction control corresponding to the target competition interaction group not meeting the hit condition, a first interaction effect is displayed in the playing page, where the first interaction effect includes at least one of a target light effect corresponding to the target competition interaction group and a first dynamic icon effect corresponding to the target competition interaction group.
The trigger instruction of the interaction control corresponding to the target competition interaction group is obtained based on one trigger operation performed by the interactive object on that interaction control. After the trigger instruction is obtained, whether the trigger instruction meets the hit condition is judged. When it is determined that the trigger instruction of the interaction control corresponding to the target competition interaction group does not meet the hit condition, the first interaction effect is displayed based on the first mode; and when the trigger instruction of the interaction control corresponding to the target competition interaction group meets the hit condition, the second interaction effect is displayed based on the second mode.
In a possible implementation manner, the trigger instruction of the interaction control corresponding to the target competition interaction group meeting the hit condition means that, at the time the trigger instruction is obtained, the number of consecutive trigger operations performed by the interactive object on the interaction control corresponding to the target competition interaction group hits a reference number of times. The reference number of times is set empirically or flexibly adjusted according to the application scenario, which is not limited in the embodiments of the present application.
The number of reference counts may be one or more. When there is more than one reference count, if the number of consecutive trigger operations performed by the interactive object on the interaction control corresponding to the target competition interaction group, at the time the trigger instruction is obtained, hits any one of the reference counts, it is determined that the trigger instruction meets the hit condition. Illustratively, the reference counts include, but are not limited to, 6 times, 66 times, and the like. If, when the trigger instruction of the interaction control corresponding to the target competition interaction group is obtained, the number of consecutive trigger operations on that interaction control hits 6 times or 66 times, it is determined that the trigger instruction meets the hit condition.
In an exemplary embodiment, when a time interval between any two triggering operations of the interaction object on the interaction control corresponding to the target competition interaction group is smaller than a time interval threshold, the any two triggering operations are determined to be continuous triggering operations. The time interval threshold is set empirically or flexibly adjusted according to an application scenario, which is not limited in the embodiment of the present application.
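The consecutive-trigger rule and hit condition described above can be sketched as follows. The reference counts 6 and 66 come from the example in the text; the 0.5-second interval threshold and all names are illustrative assumptions:

```python
class HitDetector:
    """Track consecutive trigger operations on one interaction control."""

    def __init__(self, interval_threshold, reference_counts):
        self.interval_threshold = interval_threshold  # seconds (assumed value)
        self.reference_counts = set(reference_counts)
        self.last_time = None
        self.count = 0

    def trigger(self, timestamp):
        """Record one trigger operation; return True when the hit condition is met."""
        if self.last_time is None or timestamp - self.last_time >= self.interval_threshold:
            # Gap not below the threshold: the run of consecutive triggers restarts.
            self.count = 1
        else:
            self.count += 1
        self.last_time = timestamp
        # Hit condition: the consecutive count hits any reference count.
        return self.count in self.reference_counts

detector = HitDetector(interval_threshold=0.5, reference_counts={6, 66})
# Seven rapid triggers, 0.1 s apart: only the sixth meets the hit condition.
hits = [detector.trigger(0.1 * i) for i in range(1, 8)]
```

On the sixth rapid trigger the second interaction effect would be shown; every other trigger in the run falls through to the first interaction effect.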
And when the triggering instruction of the interaction control corresponding to the target competition interaction group does not meet the hit condition, displaying a first interaction effect in the playing page. The first interaction effect comprises at least one of a target light effect corresponding to the target competition interaction group and a first dynamic icon effect corresponding to the target competition interaction group. That is, the first interaction effect only includes the target lamp light effect; or, the first interactive effect includes only the first dynamic icon effect; alternatively, the first interactive effect comprises a target light effect and a first dynamic icon effect.
In an exemplary embodiment, the target light effect corresponding to the competitive interaction group is displayed in the following manner: and dynamically displaying the light of the target color in a first target area in the playing page. The target color is the color corresponding to the target competition interaction group, and different competition interaction groups correspond to different colors. The first target area is set empirically or flexibly adjusted according to the screen size, which is not limited in the embodiments of the present application. Illustratively, as shown in (1) in fig. 4, the first target area is left and right side edge areas 401 in the play page.
In one possible implementation manner, the manner of dynamically displaying the light of the target color in the first target area in the playing page is as follows: the light of the target color is displayed at a target rhythm in the first target area in the playing page. The target rhythm refers to the flickering rhythm of the light of the target color. In an exemplary embodiment, for the case that the first interaction effect includes the target light effect, for one trigger instruction of the interaction control corresponding to the target competition interaction group, the display duration of the target light effect is a first duration, and the first duration is set empirically or flexibly adjusted according to the application scenario. That is, the light of the target color is displayed in the first target area in the playing page at the target rhythm until the display duration reaches the first duration, completing one display of the target light effect.
The first dynamic icon effect corresponding to the target competition interaction group refers to a basic support effect corresponding to the target competition interaction group. In an exemplary embodiment, the display manner of the first dynamic icon effect corresponding to the target competition interaction group is as follows: and dynamically displaying the first icon of the target color in a second target area in the playing page. The second target area is set according to experience or flexibly adjusted according to an application scene. Illustratively, the second target area refers to an area near the interaction control corresponding to the target competition interaction group. The first icon is set by the developer.
Illustratively, the manner of dynamically displaying the first icon of the target color in the second target area in the playing page is as follows: the first icon of the target color is displayed in the second target area in the playing page in a gradually shrinking manner. Illustratively, for the case that the first interaction effect includes the first dynamic icon effect, for one trigger instruction of the interaction control corresponding to the target competition interaction group, the duration from appearance to disappearance of the first icon of the target color is a second duration, and the second duration is set empirically or flexibly adjusted according to the application scenario. That is, the first icon of the target color is displayed at a gradually decreasing size and disappears when the second duration elapses. In an exemplary embodiment, in the process of displaying the first icon of the target color in this gradually shrinking and fading manner, the position of the first icon of the target color is gradually moved along a target track, and the target track is set by the developer.
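The gradual shrink of the first icon over the second duration might be keyframed as a simple interpolation; the linear easing, the function name, and the sample sizes below are assumptions, since the embodiment leaves the animation curve open:

```python
def icon_size_at(t, initial_size, second_duration):
    """Size of the first icon t seconds after it appears.

    The icon shrinks linearly from initial_size down to 0 over
    second_duration, after which it has disappeared (size 0).
    """
    if t >= second_duration:
        return 0.0
    return initial_size * (1.0 - t / second_duration)

# Sample the animation: full size at t=0, half size midway, gone at the end.
sizes = [icon_size_at(t, initial_size=32.0, second_duration=2.0)
         for t in (0.0, 1.0, 2.0)]
```

A renderer would evaluate this per frame, optionally also advancing the icon's position along the target track mentioned in the text.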
For example, as shown in (1) in fig. 4, it is assumed that the target competition interaction group is team A, the color corresponding to team A is black, the interaction control corresponding to the target competition interaction group is the interaction control 402 displayed in black, and the second target area 403 is located above the interaction control 402. The figure shows a schematic view of the first icon (an icon of the numeral 666) displayed in the second target area 403 in a large-to-small fade-out fashion.
In an exemplary embodiment, the second target areas are different for the first dynamic icon effects corresponding to different competitive interaction groups, and the first icons presented in the different second target areas may be the same, but the colors of the presented first icons are different.
In one possible implementation manner, the first interaction effect includes a target light effect and a first dynamic icon effect, and then the manner of displaying the first interaction effect in the playing page is as follows: dynamically displaying the light of the target color in a first target area in the playing page, and dynamically displaying the first icon of the target color in a second target area in the playing page. In an exemplary embodiment, the first target area and the second target area are different areas, so that the diversity of interaction effects is increased, and the interaction experience of an interaction object is improved. In an exemplary embodiment, the first time period for displaying the target lamp light effect and the second time period for displaying the first dynamic icon effect may be the same or different, and this is not limited in this application.
The second mode is as follows: in response to the trigger instruction of the interaction control corresponding to the target competition interaction group meeting the hit condition, a second interaction effect is displayed in the playing page, where the second interaction effect includes at least one of the target light effect and the first dynamic icon effect, together with a second dynamic icon effect corresponding to the target competition interaction group.
When the trigger instruction of the interaction control corresponding to the target competition interaction group meets the hit condition, the number of consecutive trigger operations performed by the interactive object on that interaction control has hit the reference number of times at the time the trigger instruction is obtained, and in this case the second interaction effect is displayed in the playing page. The second interaction effect includes at least one of the target light effect and the first dynamic icon effect, together with the second dynamic icon effect corresponding to the target competition interaction group. That is, the second interaction effect includes the target light effect and the second dynamic icon effect; or the second interaction effect includes the first dynamic icon effect and the second dynamic icon effect; or the second interaction effect includes the target light effect, the first dynamic icon effect, and the second dynamic icon effect.
The display manners of the target light effect and the first dynamic icon effect are described in the first mode above and will not be repeated here. The second dynamic icon effect is an enhanced support effect triggered when the trigger instruction of the interaction control corresponding to the target competition interaction group meets the hit condition. In one possible implementation manner, the display manner of the second dynamic icon effect is: a second icon of the target color is dynamically displayed in a third target area in the playing page. The third target area is different from the second target area used for displaying the first icon of the target color, and the second icon is different from the first icon. For example, as shown in (1) of fig. 4, the third target area 404 and the second target area 403 are different areas, and the black second icon displayed in the third target area 404 is different from the black first icon displayed in the second target area 403.
In one possible implementation manner, the manner of dynamically displaying the second icon of the target color in the third target area in the playing page is as follows: and displaying the second icon of the target color in a third target area in the playing page in an animation mode.
In an exemplary embodiment, for the second dynamic icon effects corresponding to different competitive interaction groups, the third target areas are the same, and the second icons displayed in the same third target areas may be the same, but the colors of the displayed second icons are different.
In an exemplary embodiment, for the case that the second interaction effect includes the target light effect, the first dynamic icon effect, and the second dynamic icon effect, the manner of presenting the second interaction effect in the playing page is: the light of the target color is dynamically displayed in the first target area in the playing page, the first icon of the target color is dynamically displayed in the second target area, and the second icon of the target color is dynamically displayed in the third target area. Compared with the first interaction effect, the second interaction effect additionally includes at least the second dynamic icon effect, and this difference visually prompts that the number of consecutive trigger operations performed by the interactive object on the interaction control corresponding to the target competition interaction group has hit the reference number of times.
Illustratively, different competition interaction groups correspond to different target interaction effects. It is assumed that the competitive interaction video relates to two competition interaction groups, namely team A and team B, where team A corresponds to black and team B corresponds to gray; the display of the target interaction effect corresponding to team A is shown in (1) in fig. 4, and the display of the target interaction effect corresponding to team B is shown in fig. 5. In fig. 5, the first target area 501 for realizing the target light effect corresponding to team B is the same as the first target area 401 for realizing the target light effect corresponding to team A in (1) in fig. 4, but gray light is dynamically displayed in the first target area 501, whereas black light is dynamically displayed in the first target area 401.
In addition, in fig. 5, the second target area 503 for implementing the first dynamic icon effect corresponding to team B is located above the interaction control 502 corresponding to team B, and the gray first icon is dynamically displayed in the second target area 503; both the position of the second target area and the color of the first icon differ from (1) in fig. 4. The third target area 504 for realizing the second dynamic icon effect corresponding to team B is the same as the third target area 404 for realizing the second dynamic icon effect corresponding to team A in (1) in fig. 4, but the gray second icon is dynamically displayed in the third target area 504, whereas the black second icon is dynamically displayed in the third target area 404.
In one possible implementation manner, the interaction effect is displayed in real time based on the trigger instruction of the interaction control. When the target interaction effect corresponding to the target competition interaction group is to be displayed in the playing page based on a trigger instruction, the target interaction effect displayed based on the previous trigger instruction may not yet have finished displaying; in this case, the new target interaction effect is displayed in a superimposed manner while the unfinished target interaction effect continues to be displayed.
In a possible implementation manner, for a situation that the play page displays, in addition to the interaction controls corresponding to the at least two competition interaction groups, the support rate information corresponding to the at least two competition interaction groups, respectively, after acquiring the trigger instruction of the interaction control corresponding to the target competition interaction group, the method further includes: and based on a trigger instruction of an interaction control corresponding to the target competition interaction group, acquiring updated support rate information corresponding to the target competition interaction group, and replacing the support rate information corresponding to the target competition interaction group displayed in the playing page with the updated support rate information.
In one possible implementation manner, based on the trigger instruction of the interaction control corresponding to the target competition interaction group, the manner of obtaining the updated support rate information corresponding to the target competition interaction group is as follows: the terminal sends a support rate increasing request to the server based on a trigger instruction of an interactive control corresponding to the target competition interactive group; the server receives the support rate increasing request and verifies that the support rate increasing request is effective, and then sends response information to the terminal, wherein the response information carries updated support rate information corresponding to the target competition interaction group; and after receiving the response information sent by the server, the terminal acquires updated support rate information corresponding to the target competition interaction group from the response information. And after the updated support rate information corresponding to the target competition interaction group is obtained, replacing the support rate information corresponding to the target competition interaction group displayed in the playing page with the updated support rate information. That is, the latest support rate information corresponding to the target competition interaction group is always displayed in the playing page.
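The terminal-server exchange above can be sketched with in-memory stand-ins (no real network; the class names, fields, and the one-unit increment per valid request are invented for illustration):

```python
class Server:
    """Holds the authoritative support rates per competition interaction group."""

    def __init__(self, support_rates):
        self.support_rates = dict(support_rates)

    def handle_increase_request(self, group_id):
        """Verify that the support rate increase request is effective and
        respond with the updated support rate information for the group."""
        if group_id not in self.support_rates:
            return {"valid": False}
        self.support_rates[group_id] += 1  # assumed: one trigger adds one support
        return {"valid": True, "group": group_id,
                "support_rate": self.support_rates[group_id]}

class Terminal:
    """Displays support rate information and keeps it in sync with the server."""

    def __init__(self, server, displayed):
        self.server = server
        self.displayed = dict(displayed)

    def on_control_triggered(self, group_id):
        # Send the support rate increase request and, on a valid response,
        # replace the displayed support rate with the updated one.
        response = self.server.handle_increase_request(group_id)
        if response["valid"]:
            self.displayed[response["group"]] = response["support_rate"]

server = Server({"team_a": 3455, "team_b": 1240})
terminal = Terminal(server, {"team_a": 3455, "team_b": 1240})
terminal.on_control_triggered("team_a")  # trigger on team A's control
```

After the trigger, only the target group's displayed support rate changes, so the playing page always shows the latest value, as the text describes.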
It should be noted that the above description is only an exemplary implementation manner of updating the support rate information corresponding to the target competition interaction group displayed in the play page, and the embodiment of the present application is not limited thereto. In an exemplary embodiment, for a case that a current terminal does not obtain a trigger instruction of an interaction control corresponding to a target competition interaction group, other interaction objects watching a competition interaction video simultaneously with the interaction object corresponding to the current terminal can trigger the interaction control corresponding to the target competition interaction group at terminals of the other interaction objects, so that the other terminals send a support rate increase request to a server based on the trigger instruction of the interaction control corresponding to the target competition interaction group. In this case, the current terminal can receive a support rate synchronization instruction sent by the server, where the support rate synchronization instruction carries updated support rate information corresponding to the target competition interaction group, and the current terminal can still replace the support rate information corresponding to the target competition interaction group displayed in the play page with the updated support rate information.
It should be further noted that, because the target competition interaction group is any one of the at least two competition interaction groups, the support rate information corresponding to each competition interaction group can be updated in real time in the play page based on the manner provided by the embodiment of the present application.
In a possible implementation manner, for a situation that a play page displays, in addition to interactive controls corresponding to at least two competition interaction groups respectively, a support ratio proportion bar, after acquiring a trigger instruction of an interactive control corresponding to a target competition interaction group, the method further includes: acquiring updated support rate information corresponding to the target competition interaction group based on a trigger instruction of an interaction control corresponding to the target competition interaction group; determining an updated support rate ratio bar based on the updated support rate information and current support rate information corresponding to other competition interaction groups except the target competition interaction group; and replacing the support ratio bar displayed in the playing page with the updated support ratio bar.
The current support rate information corresponding to the competition interaction groups other than the target competition interaction group refers to the support rate information of those groups currently displayed in the playing page. In one possible implementation manner, based on the updated support rate information and the current support rate information corresponding to the other competition interaction groups, the manner of determining the updated support ratio bar is as follows: determining the updated support rate proportions respectively corresponding to the at least two competition interaction groups based on the updated support rate information and the current support rate information corresponding to the other competition interaction groups; determining the updated support ratio sub-bars respectively corresponding to the at least two competition interaction groups based on the updated support rate proportions; and obtaining the updated support ratio bar based on the updated support ratio sub-bars respectively corresponding to the at least two competition interaction groups.
In an exemplary embodiment, compared with the support ratio bar before updating, the total length of the updated support ratio bar is unchanged, while the length proportions of the support ratio sub-bars included in it change. In an exemplary embodiment, for the case that each support ratio sub-bar included in the support ratio bar is displayed in the color corresponding to its competition interaction group, even if the length proportion of a support ratio sub-bar changes, its display color does not change.
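Rebuilding the support ratio bar after the target group's support rate is updated, with the total bar length held constant as noted above, could look like the following sketch (the function name, key names, and sample values are assumptions):

```python
def updated_ratio_bar(current_rates, group_id, new_rate, total_length):
    """Rebuild the per-group sub-bar lengths after group_id's support rate
    is replaced by new_rate; the total bar length stays unchanged and only
    the per-group length proportions move."""
    rates = dict(current_rates)
    rates[group_id] = new_rate  # replace with the updated support rate
    total = sum(rates.values())
    return {gid: total_length * rate / total for gid, rate in rates.items()}

# Team A's support rate rises from 3455 to 3460; the bar stays 100 units long.
bar = updated_ratio_bar({"team_a": 3455, "team_b": 1240},
                        "team_a", 3460, total_length=100)
```

The sub-bar colors are untouched by this recomputation; only the lengths handed to the renderer change, matching the behavior described above.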
It should be noted that the above is only an exemplary implementation manner of updating the support ratio bar displayed in the playing page, and the embodiment of the present application is not limited thereto; that is, the terminal may also update the support ratio bar displayed in the playing page in other implementation manners. Illustratively, the terminal may update, in real time, the support rate information corresponding to each competition interaction group displayed in the playing page, determine, whenever any support rate information changes, an updated support ratio bar based on the latest support rate information corresponding to each competition interaction group, and then replace the support ratio bar displayed in the playing page with the updated support ratio bar.
In a possible implementation manner, for a situation that, in addition to displaying interactive controls corresponding to at least two competition interaction groups respectively on a play page, support rate information and a support rate ratio bar corresponding to at least two competition interaction groups respectively are also displayed, after acquiring a trigger instruction of an interactive control corresponding to a target competition interaction group, the method further includes: acquiring updated support rate information corresponding to the target competition interaction group based on a trigger instruction of an interaction control corresponding to the target competition interaction group; determining an updated support rate ratio bar based on the updated support rate information and current support rate information corresponding to other competition interaction groups except the target competition interaction group; replacing the support rate information corresponding to the target competition interaction group displayed in the playing page with updated support rate information; and replacing the support ratio bar displayed in the playing page with the updated support ratio bar.
It should be noted that, in the case that the terminal acquires the trigger instruction of the interaction control corresponding to the target competition interaction group, the processes of displaying the target interaction effect corresponding to the target competition interaction group, updating the support rate information corresponding to the target competition interaction group displayed in the playing page, and updating the support ratio bar displayed in the playing page may be executed simultaneously, or may be executed sequentially according to a reference sequence, which is not limited in the embodiments of the present application. The reference sequence is set according to experience or flexibly adjusted according to the application scene.
In a possible implementation manner, the terminal can display a target interaction effect corresponding to the target competition interaction group in a playing page based on a trigger instruction of an interaction control corresponding to the target competition interaction group, and can also receive interaction information corresponding to the target competition interaction group sent by the server, and then display a reference interaction effect corresponding to the target competition interaction group in the playing page based on the interaction information, wherein different competition interaction groups correspond to different reference interaction effects.
The interaction information of the target competition interaction group sent by the server is used for indicating that interactive objects other than the one corresponding to the current terminal, who are simultaneously watching the competitive interaction video, have triggered the interaction control corresponding to the target competition interaction group. In this case, the terminal displays the reference interaction effect corresponding to the target competition interaction group in the playing page based on the interaction information. The reference interaction effect corresponding to the target competition interaction group refers to the interaction effect for that group that needs to be displayed on a guest terminal; a guest terminal is any terminal other than the host terminal, and the host terminal is the terminal that generates the trigger instruction of the interaction control corresponding to the target competition interaction group.
In an exemplary embodiment, the target interaction effect corresponding to the target competition interaction group refers to the interaction effect for that group that needs to be displayed on the host terminal. The reference interaction effect corresponding to the target competition interaction group differs from the target interaction effect, so that the effect to be displayed on guest terminals can be distinguished from the effect to be displayed on the host terminal.
Illustratively, in the case that the target interaction effect corresponding to the target competition interaction group is the first interaction effect, and the first interaction effect includes the target light effect and the first dynamic icon effect, the reference interaction effect corresponding to the target competition interaction group is the first dynamic icon effect. Illustratively, in the case that the target interaction effect corresponding to the target competition interaction group is the second interaction effect, and the second interaction effect includes the target light effect, the first dynamic icon effect and the second dynamic icon effect, the reference interaction effect corresponding to the target competition interaction group is the first dynamic icon effect. For example, the target competitive interaction group is team a, the display effect of the target interaction effect corresponding to team a is shown as (1) in fig. 4, and the display effect of the reference interaction effect corresponding to team a is shown as (2) in fig. 4.
In a possible implementation manner, if the interaction information corresponding to the target competition interaction group sent by the server is received and a trigger instruction of the interaction control corresponding to the target competition interaction group is also acquired, a superimposed interaction effect corresponding to the target competition interaction group is displayed in the playing page based on the interaction information and the trigger instruction. The superimposed interaction effect is obtained by superimposing the target interaction effect corresponding to the target competition interaction group and the reference interaction effect corresponding to the target competition interaction group. Different competition interaction groups correspond to different reference interaction effects.
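The host/guest effect selection described in the preceding paragraphs reduces to a simple decision: a local trigger instruction yields the target interaction effect, server-sent interaction information yields the reference interaction effect, and both together yield the superimposed effect. The Python sketch below is a hypothetical illustration; the function name `effects_to_show` and the set-based effect model are assumptions, not part of the patent.

```python
def effects_to_show(local_trigger, server_interaction,
                    target_effects, reference_effects):
    """Pick which interaction effects to render for one competition group.

    local_trigger: True if this terminal's user tapped the group's
        interaction control (this terminal is the host terminal).
    server_interaction: True if the server reported that other viewers
        triggered the group's control (this terminal acts as a guest).
    target_effects / reference_effects: per-group effect descriptors.
    Returns the set of effects to display in the playing page.
    """
    shown = set()
    if local_trigger:
        shown |= set(target_effects)     # host-side target effect
    if server_interaction:
        shown |= set(reference_effects)  # guest-side reference effect
    # When both apply, the union models the superimposed effect.
    return shown

# Example descriptors for one group (hypothetical effect names).
target = {"light", "icon1", "icon2"}
reference = {"icon1"}
```

With only `server_interaction` set, the terminal behaves as a guest terminal and shows just the reference effect; with both set, the superimposed effect appears.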
It should be noted that the above embodiment describes only the display of the interaction effect corresponding to the target competition interaction group. The playing page of the terminal can also display, in real time, the interaction effects corresponding to other competition interaction groups as the situation requires. For example, in the case that the competitive interaction video is a video of a game match between team A and team B, the terminal can display the target interaction effect A corresponding to team A in the playing page based on the trigger instruction of the interaction control A corresponding to team A, and simultaneously display the reference interaction effect B corresponding to team B in the playing page based on the interaction information corresponding to team B sent by the server. In an exemplary embodiment, where the target interaction effect A and the reference interaction effect B conflict, the conflicting effect in the target interaction effect A takes precedence and is displayed.
Illustratively, the target competition interaction group is any one of at least two competition interaction groups, that is, each competition interaction group can be used as a target competition interaction group, and then an interaction service corresponding to the target competition interaction group is processed.
The description will be given by taking an example in which the competitive interaction video relates to two competitive interaction groups, namely, a team a and a team B.
When the target competition interaction group is team A, the process of processing the interactive service corresponding to team A is shown in fig. 6. The color corresponding to team A is color A, and the interaction control A corresponding to team A is displayed in the playing page of the competitive interaction video. When the interactive object has not triggered the interaction control A, the current terminal is a guest terminal, and the interaction effect to be displayed on a guest terminal (guest effect for short) is displayed in the playing page. When the interactive object triggers the interaction control A, the support rate information corresponding to team A in the playing page is updated, and the trigger instruction A of the interaction control A corresponding to team A is generated. When the trigger instruction A does not meet the hit condition, the first dynamic icon effect A and the target light effect A corresponding to team A are displayed in the playing page, and the second dynamic icon effect A corresponding to team A is not displayed. When the trigger instruction A meets the hit condition, the first dynamic icon effect A and the target light effect A corresponding to team A are displayed in the playing page, and the second dynamic icon effect A corresponding to team A is also displayed. The first dynamic icon effect A, the target light effect A, and the second dynamic icon effect A are all rendered based on color A.
When the target competition interaction group is team B, the process of processing the interactive service corresponding to team B is shown in fig. 7. The color corresponding to team B is color B, and the interaction control B corresponding to team B is displayed in the playing page of the competitive interaction video. When the interactive object has not triggered the interaction control B, the current terminal is a guest terminal, and the interaction effect to be displayed on a guest terminal (guest effect for short) is displayed in the playing page. When the interactive object triggers the interaction control B, the support rate information corresponding to team B in the playing page is updated, and the trigger instruction B of the interaction control B corresponding to team B is generated. When the trigger instruction B does not meet the hit condition, the first dynamic icon effect B and the target light effect B corresponding to team B are displayed in the playing page, and the second dynamic icon effect B corresponding to team B is not displayed. When the trigger instruction B meets the hit condition, the first dynamic icon effect B and the target light effect B corresponding to team B are displayed in the playing page, and the second dynamic icon effect B corresponding to team B is also displayed. The first dynamic icon effect B, the target light effect B, and the second dynamic icon effect B are all rendered based on color B.
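The two parallel flows for team A and team B condense into one rule: the first dynamic icon effect and the target light effect are always shown on the host terminal after the control is triggered, and the second dynamic icon effect is added only when the trigger instruction meets the hit condition, with every effect rendered in the team's own color. A minimal Python sketch under those assumptions (the function name `team_effects` and the string-encoded effects are illustrative, not from the patent):

```python
def team_effects(team, hit):
    """Effects shown on the host terminal after a team's control is triggered.

    The first dynamic icon effect and the target light effect are always
    shown; the second dynamic icon effect is appended only when the
    trigger instruction meets the hit condition. All effects are rendered
    in the color corresponding to the team, so differently colored light
    distinguishes the two teams' interactive services.
    """
    color = {"team_a": "color_a", "team_b": "color_b"}[team]
    effects = [f"first_icon({color})", f"target_light({color})"]
    if hit:
        effects.append(f"second_icon({color})")
    return effects
```

For example, a hit trigger for team B yields all three effects in color B, while a non-hit trigger for team A yields only the first icon and light effects in color A.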
In an exemplary embodiment, different target interaction effects are displayed based on the trigger instructions of the interaction controls corresponding to different competition interaction groups, so that the interaction effect changes in real time as the triggered interaction control changes. Illustratively, different target interaction effects include different target light effects, and the different target light effects are realized by displaying light of different colors. The light effects are thus combined with the interactive feedback of the interactive object, and the differently colored light corresponding to different competition interaction groups distinguishes the processing procedures of the interactive services of those groups, thereby improving the visual experience of the interactive object.
In addition, different competition interaction groups correspond to different target interaction effects, which highlights the association between a competition interaction group and its interaction effect and facilitates targeted, more immersive interaction and emotional expression by the interactive objects. For the situation where multiple interactive objects in a social group watch a competitive interaction video together, the method provided by the embodiments of the present application can broaden the ways in which interactive objects in the social group express emotions, enhance the interactive atmosphere in the social group, increase the possibility of interaction in the social group, and improve the activity level in the social group.
In the embodiments of the present application, the interaction controls corresponding to the various competition interaction groups are displayed on the playing page of the competitive interaction video, and different target interaction effects corresponding to different competition interaction groups are displayed in the playing page in response to the trigger instructions of the corresponding interaction controls. In this process, multiple interaction controls are provided for the interactive object to trigger, and different target interaction effects are displayed in the playing page when different interaction controls are triggered, so the processing modes of the interactive service are rich. In addition, the target interaction effects correspond one-to-one to the competition interaction groups, and the displayed interaction effect is more targeted, which helps improve the interaction experience of the interactive object and further increase the interaction rate.
Referring to fig. 8, an embodiment of the present application provides an apparatus for processing an interactive service, where the apparatus includes:
the display unit 801 is configured to display interactive controls corresponding to at least two competition interaction groups on a playing page of the competition interaction video, where the competition interaction video is a video for performing competition interaction on the at least two competition interaction groups;
the display unit 802 is configured to display a target interaction effect corresponding to a target competition interaction group in a playing page based on a trigger instruction of an interaction control corresponding to the target competition interaction group, where the target competition interaction group is any one of at least two competition interaction groups, different competition interaction groups correspond to different interaction controls, and different interaction controls correspond to different target interaction effects.
In one possible implementation manner, the target interaction effect corresponding to the target competition interaction group is a first interaction effect or a second interaction effect. The display unit 802 is configured to display the first interaction effect in the playing page in response to the trigger instruction of the interaction control corresponding to the target competition interaction group not meeting the hit condition, where the first interaction effect includes at least one of the target light effect corresponding to the target competition interaction group and the first dynamic icon effect corresponding to the target competition interaction group; and to display the second interaction effect in the playing page in response to the trigger instruction meeting the hit condition, where the second interaction effect includes at least one of the target light effect and the first dynamic icon effect, and the second dynamic icon effect corresponding to the target competition interaction group.
In a possible implementation manner, the display unit 802 is further configured to dynamically display a light of a target color in a first target area in the playing page; dynamically displaying a first icon of a target color in a second target area in the playing page; dynamically displaying a second icon of the target color in a third target area in the playing page; the target color is the color corresponding to the target competition interaction group, and different competition interaction groups correspond to different colors.
In a possible implementation manner, the playing page further displays support rate information corresponding to at least two competitive interaction groups respectively; referring to fig. 9, the apparatus further comprises:
an obtaining unit 803, configured to obtain updated support rate information corresponding to the target competition interaction group based on a trigger instruction of an interaction control corresponding to the target competition interaction group;
a replacing unit 804, configured to replace the support rate information corresponding to the target competition interaction group displayed in the play page with the updated support rate information.
In a possible implementation manner, a support ratio bar is further displayed on the playing page, and the support ratio bar is used for indicating the support rate ratios respectively corresponding to the at least two competition interaction groups;
an obtaining unit 803, configured to obtain updated support rate information corresponding to the target competition interaction group based on a trigger instruction of an interaction control corresponding to the target competition interaction group;
referring to fig. 9, the apparatus further comprises:
a determining unit 805, configured to determine an updated support rate ratio bar based on the updated support rate information and current support rate information corresponding to other competition interaction groups except the target competition interaction group;
the replacing unit 804 is further configured to replace the support ratio bar displayed in the playing page with the updated support ratio bar.
In one possible implementation, referring to fig. 9, the apparatus further includes:
a receiving unit 806, configured to receive interaction information corresponding to the target competition interaction group sent by the server;
the display unit 802 is further configured to display, in the playing page, a reference interaction effect corresponding to the target competition interaction group based on the interaction information, where different competition interaction groups correspond to different reference interaction effects.
In a possible implementation manner, the receiving unit 806 is configured to receive interaction information corresponding to a target competition interaction group sent by a server;
the display unit 802 is further configured to display, in the playing page, a superimposed interaction effect corresponding to the target competition interaction group based on the interaction information and a trigger instruction of the interaction control corresponding to the target competition interaction group, where the superimposed interaction effect is obtained by superimposing the target interaction effect corresponding to the target competition interaction group and the reference interaction effect corresponding to the target competition interaction group, and different competition interaction groups correspond to different reference interaction effects.
In the embodiments of the present application, the interaction controls corresponding to the various competition interaction groups are displayed on the playing page of the competitive interaction video, and different target interaction effects corresponding to different competition interaction groups are displayed in the playing page in response to the trigger instructions of the corresponding interaction controls. In this process, multiple interaction controls are provided for the interactive object to trigger, and different target interaction effects are displayed in the playing page when different interaction controls are triggered, so the processing modes of the interactive service are rich. In addition, the target interaction effects correspond one-to-one to the competition interaction groups, and the displayed interaction effect is more targeted, which helps improve the interaction experience of the interactive object and further increase the interaction rate.
It should be noted that, when the apparatus provided in the foregoing embodiment implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
Fig. 10 is a schematic structural diagram of a server according to an embodiment of the present application. The server may vary considerably depending on configuration or performance, and may include one or more processors (CPUs) 1001 and one or more memories 1002, where the one or more memories 1002 store at least one computer program, and the at least one computer program is loaded and executed by the one or more processors 1001 to implement the processing method of the interactive service provided by the foregoing method embodiments. Of course, the server may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface for performing input and output, and the server may further include other components for implementing device functions, which are not described herein again.
Fig. 11 is a schematic structural diagram of a terminal according to an embodiment of the present application. Illustratively, the terminal is: a smartphone, a tablet, a laptop, or a desktop computer. A terminal may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, etc.
Generally, a terminal includes: a processor 1101 and a memory 1102.
The processor 1101 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1101 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1101 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1101 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1101 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1102 may include one or more computer-readable storage media, which may be non-transitory. Memory 1102 can also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 1102 is used to store at least one instruction for execution by the processor 1101 to implement the processing method of the interactive service provided by the method embodiments of the present application.
In some embodiments, the terminal may further include: a peripheral interface 1103 and at least one peripheral. The processor 1101, memory 1102 and peripheral interface 1103 may be connected by a bus or signal lines. Various peripheral devices may be connected to the peripheral interface 1103 by buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1104, display screen 1105, camera assembly 1106, audio circuitry 1107, positioning assembly 1108, and power supply 1109.
The peripheral interface 1103 may be used to connect at least one peripheral associated with I/O (Input/Output) to the processor 1101 and the memory 1102. In some embodiments, the processor 1101, memory 1102, and peripheral interface 1103 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1101, the memory 1102 and the peripheral device interface 1103 may be implemented on separate chips or circuit boards, which is not limited by this embodiment.
The Radio Frequency circuit 1104 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 1104 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1104 converts an electric signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electric signal. Optionally, the radio frequency circuit 1104 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1104 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 1104 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1105 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1105 is a touch display screen, the display screen 1105 also has the ability to capture touch signals on or over the surface of the display screen 1105. The touch signal may be input to the processor 1101 as a control signal for processing. At this point, the display screen 1105 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display screen 1105 may be one, disposed on the front panel of the terminal; in other embodiments, the display screens 1105 may be at least two, respectively disposed on different surfaces of the terminal or in a folded design; in other embodiments, the display 1105 may be a flexible display disposed on a curved surface or on a folded surface of the terminal. Even further, the display screen 1105 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The Display screen 1105 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), and the like.
Camera assembly 1106 is used to capture images or video. Optionally, camera assembly 1106 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1106 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 1107 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1101 for processing or inputting the electric signals to the radio frequency circuit 1104 to achieve voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones can be arranged at different parts of the terminal respectively. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1101 or the radio frequency circuit 1104 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 1107 may also include a headphone jack.
The positioning component 1108 is used to locate the current geographic location of the terminal to implement navigation or LBS (Location Based Service). The positioning component 1108 may be a positioning component based on the United States' GPS (Global Positioning System), the Chinese BeiDou system, the Russian GLONASS system, or the European Union's Galileo system.
The power supply 1109 is used to supply power to the various components in the terminal. The power supply 1109 may be alternating current, direct current, disposable or rechargeable. When the power supply 1109 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal also includes one or more sensors 1110. The one or more sensors 1110 include, but are not limited to: acceleration sensor 1111, gyro sensor 1112, pressure sensor 1113, fingerprint sensor 1114, optical sensor 1115, and proximity sensor 1116.
The acceleration sensor 1111 may detect the magnitude of acceleration on three coordinate axes of a coordinate system established with the terminal. For example, the acceleration sensor 1111 may be configured to detect components of the gravitational acceleration in three coordinate axes. The processor 1101 may control the display screen 1105 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1111. The acceleration sensor 1111 may also be used for acquisition of motion data of a game or a user.
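As a rough illustration of how a processor might derive the landscape/portrait decision from the gravity components collected by the acceleration sensor, the sketch below treats whichever screen axis carries the larger share of gravity as the "down" axis. The function name and the simplifications (no hysteresis, no handling of a near-flat device) are assumptions for illustration, not the patent's method.

```python
def orientation_from_gravity(gx, gy):
    """Choose a display orientation from accelerometer gravity components.

    gx, gy: gravitational acceleration along the screen's horizontal and
    vertical axes (m/s^2). If gravity lies mostly along the vertical
    axis, the device is upright and a portrait view is used; otherwise a
    landscape view is used. Real systems add hysteresis so the UI does
    not flip at the 45-degree boundary.
    """
    return "portrait" if abs(gy) >= abs(gx) else "landscape"
```

For example, a device held upright reports gravity mostly on the vertical axis and yields a portrait view.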
The gyro sensor 1112 may detect the body direction and rotation angle of the terminal, and may cooperate with the acceleration sensor 1111 to acquire the 3D motion of the user with respect to the terminal. Based on the data collected by the gyro sensor 1112, the processor 1101 may implement the following functions: motion sensing (such as changing the UI according to a tilting operation by the user), image stabilization during photographing, game control, and inertial navigation.
The pressure sensor 1113 may be disposed on a side frame of the terminal and/or at a lower layer of the display screen 1105. When the pressure sensor 1113 is disposed on the side frame of the terminal, a holding signal of the user on the terminal can be detected, and the processor 1101 performs left/right-hand recognition or a shortcut operation according to the holding signal collected by the pressure sensor 1113. When the pressure sensor 1113 is disposed at the lower layer of the display screen 1105, the processor 1101 controls an operability control on the UI interface according to a pressure operation of the user on the display screen 1105. The operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
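The left/right-hand recognition mentioned above can be sketched as a simple heuristic. The pressure values, the threshold-free comparison, and the palm-side assumption are all illustrative guesses, not the patent's algorithm:

```python
def infer_holding_hand(left_frame_pressure: float,
                       right_frame_pressure: float) -> str:
    """Guess which hand holds the terminal from pressure readings on
    the two side frames, assuming the palm side presses harder than
    the fingertip side."""
    # Palm resting against the right frame suggests a right-hand grip.
    return "right" if right_frame_pressure > left_frame_pressure else "left"

print(infer_holding_hand(1.0, 3.0))  # right
print(infer_holding_hand(3.0, 1.0))  # left
```

In practice such a classifier would use the full pressure distribution along each frame rather than a single scalar per side.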
The fingerprint sensor 1114 is configured to collect a fingerprint of the user, and the processor 1101 identifies the user according to the fingerprint collected by the fingerprint sensor 1114, or the fingerprint sensor 1114 itself identifies the user according to the collected fingerprint. When the user's identity is recognized as a trusted identity, the processor 1101 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 1114 may be disposed on the front, back, or side of the terminal. When a physical key or a vendor logo is provided on the terminal, the fingerprint sensor 1114 may be integrated with the physical key or the vendor logo.
Optical sensor 1115 is used to collect ambient light intensity. In one embodiment, the processor 1101 may control the display brightness of the display screen 1105 based on the ambient light intensity collected by the optical sensor 1115. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1105 is increased; when the ambient light intensity is low, the display brightness of the display screen 1105 is reduced. In another embodiment, processor 1101 may also dynamically adjust the shooting parameters of camera assembly 1106 based on the ambient light intensity collected by optical sensor 1115.
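The brightness adjustment described above amounts to a monotonic mapping from ambient light intensity to display brightness. The lux ceiling, brightness-level range, and linear shape in the sketch below are assumed values for illustration, not figures from the patent:

```python
def brightness_for_ambient(lux: float,
                           min_level: int = 10,
                           max_level: int = 255,
                           max_lux: float = 1000.0) -> int:
    """Map a measured ambient light intensity (lux) to a display
    brightness level: brighter surroundings -> brighter screen."""
    lux = max(0.0, min(float(lux), max_lux))  # clamp to the supported range
    return round(min_level + (max_level - min_level) * lux / max_lux)

print(brightness_for_ambient(0))     # 10  (dark room: minimum brightness)
print(brightness_for_ambient(1000))  # 255 (direct light: maximum brightness)
```

A non-linear (e.g., logarithmic) curve would track human brightness perception more closely; the linear form is kept here only for clarity.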
A proximity sensor 1116, also referred to as a distance sensor, is typically provided on the front panel of the terminal. The proximity sensor 1116 is used to capture the distance between the user and the front face of the terminal. In one embodiment, when the proximity sensor 1116 detects that the distance between the user and the front face of the terminal is gradually decreasing, the display screen 1105 is controlled by the processor 1101 to switch from the bright-screen state to the screen-off state; when the proximity sensor 1116 detects that the distance between the user and the front face of the terminal is gradually increasing, the display screen 1105 is controlled by the processor 1101 to switch from the screen-off state to the bright-screen state.
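The screen-state switching driven by the proximity sensor can be sketched as a small state machine over successive distance readings. The distance unit, state names, and initial state are assumptions for illustration:

```python
class ProximityScreenController:
    """Track successive proximity-sensor readings and switch the
    display between the bright-screen and screen-off states."""

    def __init__(self, state: str = "bright"):
        self.state = state
        self.last_distance = None  # no reading seen yet

    def on_distance(self, distance_cm: float) -> str:
        """Feed one distance reading; return the resulting screen state."""
        if self.last_distance is not None:
            if distance_cm < self.last_distance:
                self.state = "off"     # approaching the face: screen off
            elif distance_cm > self.last_distance:
                self.state = "bright"  # moving away: screen back on
        self.last_distance = distance_cm
        return self.state
```

A production controller would hysterically threshold the distance rather than react to every monotonic change, to avoid flicker from sensor noise.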
Those skilled in the art will appreciate that the configuration shown in fig. 11 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
In an exemplary embodiment, a computer device is also provided, the computer device comprising a processor and a memory, the memory having at least one computer program stored therein. The at least one computer program is loaded and executed by one or more processors to implement any of the above-described interactive service processing methods.
In an exemplary embodiment, there is also provided a computer-readable storage medium having at least one computer program stored therein, the at least one computer program being loaded and executed by a processor of a computer device to implement any one of the above interactive service processing methods.
In one possible implementation, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program product or computer program is also provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions, so that the computer device executes any one of the above interactive service processing methods.
It should be understood that reference to "a plurality" herein means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, both A and B exist, or B exists alone. The character "/" generally indicates an "or" relationship between the preceding and following associated objects.
The above description is only exemplary of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like that are made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A method for processing an interactive service, the method comprising:
displaying interactive controls respectively corresponding to at least two competition interactive groups on a playing page of a competition interactive video, wherein the competition interactive video is a video for performing competition interaction on the at least two competition interactive groups;
displaying a target interaction effect corresponding to a target competition interaction group in the playing page based on a trigger instruction of an interaction control corresponding to the target competition interaction group, wherein the target competition interaction group is any one of the at least two competition interaction groups, different competition interaction groups correspond to different interaction controls, different interaction controls correspond to different target interaction effects, the target interaction effect corresponding to the target competition interaction group comprises at least one of a target light effect corresponding to the target competition interaction group and a first dynamic icon effect corresponding to the target competition interaction group, and a first target area displaying the target light effect is different from a second target area displaying the first dynamic icon effect;
the target light effect and the first dynamic icon effect are achieved based on a target color, the target color is a color corresponding to the target competition interaction group, and different competition interaction groups correspond to different colors.
2. The method according to claim 1, wherein the target interaction effect corresponding to the target competition interaction group is a first interaction effect or a second interaction effect; the displaying of the target interaction effect corresponding to the target competition interaction group in the playing page based on the trigger instruction of the interaction control corresponding to the target competition interaction group includes:
in response to a trigger instruction of the interaction control corresponding to the target competition interaction group not meeting a hit condition, displaying the first interaction effect in the playing page;
and in response to a trigger instruction of the interaction control corresponding to the target competition interaction group meeting the hit condition, displaying the second interaction effect in the playing page, wherein the second interaction effect further comprises a second dynamic icon effect corresponding to the target competition interaction group.
3. The method of claim 2, wherein the target light effect is displayed in a manner comprising: dynamically displaying the light of the target color in the first target area in the playing page;
the display mode of the first dynamic icon effect comprises the following steps: dynamically displaying a first icon of the target color in the second target area in the playing page;
the display mode of the second dynamic icon effect comprises the following steps: and dynamically displaying a second icon of the target color in a third target area in the playing page.
4. The method according to any one of claims 1 to 3, wherein the playing page further displays support rate information corresponding to each of the at least two competition interaction groups; after the interactive controls respectively corresponding to the at least two competition interaction groups are displayed on the playing page of the competition interactive video, the method further comprises the following steps:
acquiring updated support rate information corresponding to the target competition interaction group based on a trigger instruction of an interaction control corresponding to the target competition interaction group;
replacing the support rate information corresponding to the target competition interaction group displayed in the playing page with the updated support rate information.
5. The method according to any one of claims 1 to 3, wherein a support ratio bar is further displayed on the playing page, and the support ratio bar is used for indicating the support ratio corresponding to each of the at least two competition interaction groups;
after the interactive controls respectively corresponding to the at least two competition interaction groups are displayed on the playing page of the competition interaction video, the method further comprises the following steps:
acquiring updated support rate information corresponding to the target competition interaction group based on a trigger instruction of an interaction control corresponding to the target competition interaction group;
determining an updated support ratio bar based on the updated support rate information and the current support rate information corresponding to the competition interaction groups other than the target competition interaction group;
and replacing the support ratio bar displayed in the playing page with the updated support ratio bar.
6. The method according to any one of claims 1-3, further comprising:
receiving interaction information corresponding to the target competition interaction group sent by a server;
and displaying a reference interaction effect corresponding to the target competition interaction group in the playing page based on the interaction information, wherein different competition interaction groups correspond to different reference interaction effects.
7. The method according to any one of claims 1-3, further comprising:
receiving interaction information corresponding to the target competition interaction group sent by a server;
and displaying a superposed interactive effect corresponding to the target competition interactive group in the playing page based on the interactive information and a triggering instruction of an interactive control corresponding to the target competition interactive group, wherein the superposed interactive effect corresponding to the target competition interactive group is a superposed effect of the target interactive effect corresponding to the target competition interactive group and a superposed reference interactive effect corresponding to the target competition interactive group, and different competition interactive groups correspond to different reference interactive effects.
8. An apparatus for processing interactive services, the apparatus comprising:
the display unit is used for displaying interactive controls corresponding to at least two competition interactive groups on a playing page of a competition interactive video, and the competition interactive video is a video for performing competition interaction on the at least two competition interactive groups;
the display unit is used for displaying a target interaction effect corresponding to a target competition interaction group in the playing page based on a trigger instruction of an interaction control corresponding to the target competition interaction group, the target competition interaction group is any one of the at least two competition interaction groups, different competition interaction groups correspond to different interaction controls, different interaction controls correspond to different target interaction effects, the target interaction effect corresponding to the target competition interaction group comprises at least one of a target light effect corresponding to the target competition interaction group and a first dynamic icon effect corresponding to the target competition interaction group, and a first target area displaying the target light effect is different from a second target area displaying the first dynamic icon effect;
the target light effect and the first dynamic icon effect are achieved based on a target color, the target color is a color corresponding to the target competition interaction group, and different competition interaction groups correspond to different colors.
9. A computer device, characterized in that it comprises a processor and a memory, in which at least one computer program is stored, which is loaded and executed by the processor to implement the method of processing an interactive service according to any one of claims 1 to 7.
10. A computer-readable storage medium, in which at least one computer program is stored, which is loaded and executed by a processor, to implement the method of processing an interactive service according to any one of claims 1 to 7.
CN202011248528.2A 2020-11-10 2020-11-10 Interactive service processing method, device, equipment and computer readable storage medium Active CN112367533B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011248528.2A CN112367533B (en) 2020-11-10 2020-11-10 Interactive service processing method, device, equipment and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN112367533A CN112367533A (en) 2021-02-12
CN112367533B true CN112367533B (en) 2022-01-11

Family

ID=74509271

Country Status (1)

Country Link
CN (1) CN112367533B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113419800B (en) * 2021-06-11 2023-03-24 北京字跳网络技术有限公司 Interaction method, device, medium and electronic equipment
CN115686290A (en) * 2022-11-07 2023-02-03 北京字跳网络技术有限公司 Interaction method, interaction device, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7536300B2 (en) * 1998-10-09 2009-05-19 Enounce, Inc. Method and apparatus to determine and use audience affinity and aptitude
CN104333775A (en) * 2014-11-25 2015-02-04 广州华多网络科技有限公司 Virtual goods interaction method, device and system in live channel
CN104363475A (en) * 2014-11-14 2015-02-18 广州华多网络科技有限公司 Audience grouping association method, device and system
CN105307036A (en) * 2015-10-26 2016-02-03 天脉聚源(北京)科技有限公司 Method and device for displaying support rates of competing teams in real time

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8928811B2 (en) * 2012-10-17 2015-01-06 Sony Corporation Methods and systems for generating ambient light effects based on video content
CN106060676B (en) * 2016-05-17 2019-06-07 腾讯科技(深圳)有限公司 Online interaction method and apparatus based on live streaming
CN107172488A (en) * 2017-04-01 2017-09-15 武汉斗鱼网络科技有限公司 Present animated show method and system in a kind of network direct broadcasting
CN116761007A (en) * 2017-12-29 2023-09-15 广州方硅信息技术有限公司 Method for giving virtual gift to multicast live broadcasting room and electronic equipment
CN109756747B (en) * 2019-03-25 2023-02-28 广州方硅信息技术有限公司 Interactive live broadcasting method and system for multiple anchor
CN111881940B (en) * 2020-06-29 2023-11-24 广州方硅信息技术有限公司 Live broadcast continuous wheat matching method and device, electronic equipment and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Exploration and Practice of HD Virtual Studio Construction; Lin Huang et al.; Research and Exploration in Laboratory; 2019-12-15 (Issue 12); pp. 282-285, 295 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code: ref country code: HK; ref legal event code: DE; ref document number: 40039038; country of ref document: HK
GR01 Patent grant