
CN110191304B - Data processing method, device and storage medium - Google Patents

Data processing method, device and storage medium

Info

Publication number
CN110191304B
CN110191304B (application CN201910314606.5A)
Authority
CN
China
Prior art keywords
video
data packet
image transmission
streaming media
live broadcast
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910314606.5A
Other languages
Chinese (zh)
Other versions
CN110191304A (en)
Inventor
谢文龙
赵虎彪
李云鹏
沈军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Visionvera Information Technology Co Ltd
Original Assignee
Visionvera Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Visionvera Information Technology Co Ltd filed Critical Visionvera Information Technology Co Ltd
Priority to CN201910314606.5A priority Critical patent/CN110191304B/en
Publication of CN110191304A publication Critical patent/CN110191304A/en
Application granted granted Critical
Publication of CN110191304B publication Critical patent/CN110191304B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 Network streaming of media packets
    • H04L65/75 Media network packet handling
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L69/00 Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
    • H04L69/08 Protocols for interworking; Protocol conversion
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/15 Conference systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Security & Cryptography (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)

Abstract



The present invention provides a data processing method, device and storage medium. The method includes: after an image transmission device logs in to a streaming media server, the streaming media server receives an internet protocol-based live broadcast data packet sent by the image transmission device in a publish-live manner; after the image transmission device joins a video networking video conference, the streaming media server obtains the internet protocol-based live broadcast data packet; the streaming media server converts the internet protocol-based live broadcast data packet into a video networking protocol-based live broadcast data packet; and the streaming media server sends the video networking protocol-based live broadcast data packet, via the video networking server, to the video networking terminals that have joined the video networking video conference. The invention enables an image transmission device on the internet to join a video networking video conference as a participant terminal, with the streaming media server performing the conversion between the video networking protocol and the internet protocol, which increases the diversity of participant terminals.


Description

Data processing method, device and storage medium
Technical Field
The present invention relates to the field of video networking technologies, and in particular, to a data processing method, apparatus, and storage medium.
Background
With the rapid development of network technologies, bidirectional communications such as video conferences, video teaching, video phones, and the like are widely popularized in the aspects of life, work, learning, and the like of users.
Video conferencing refers to a conference in which people at two or more locations have a face-to-face conversation via communication devices and a network. Video conferences can be divided into point-to-point conferences and multipoint conferences according to the number of participating sites. Individuals in daily life have no strict requirements on the security of conversation content, conference quality or conference scale, and can simply use video software for video chat. Commercial video conferences of government organs and enterprises, however, require a stable and secure network, reliable conference quality, a formal conference environment and the like, so professional video conference equipment is used to establish a dedicated video conference system.
In the prior art, only professional video conference equipment can be added into a video conference as a participant terminal, and the participant terminal can speak and also can receive speaking data of the speaking participant terminal. However, other devices, such as image transmission devices, cannot join the video conference as participating terminals.
Disclosure of Invention
In view of the above problems, embodiments of the present invention are proposed to provide a data processing method, apparatus and storage medium that overcome or at least partially solve the above problems.
In a first aspect, an embodiment of the present invention discloses a data processing method, where the method is applied to a video conference in a video network, the video network includes a streaming media server, a video network server and a video network terminal, and the internet includes an image transmission device, and the method includes:
the streaming media server receives a live broadcast data packet which is sent by the image transmission equipment in a live broadcast releasing mode and is based on an internet protocol after the image transmission equipment logs in the streaming media server;
the streaming media server acquires the live broadcast data packet based on the Internet protocol after the image transmission equipment joins the video conference of the video network;
the streaming media server converts the live broadcast data packet based on the Internet protocol into a live broadcast data packet based on a video networking protocol;
and the streaming media server sends the live broadcast data packet based on the video networking protocol to a video networking terminal joining the video networking video conference through the video networking server.
Optionally, the method further comprises: after the image transmission equipment logs in the streaming media server, the streaming media server establishes a live broadcast channel corresponding to the image transmission equipment; the step of receiving the live broadcast data packet based on the internet protocol sent by the image transmission equipment comprises the following steps: and receiving a live broadcast data packet which is sent by the image transmission equipment in a live broadcast mode and is based on an internet protocol through the live broadcast channel corresponding to the image transmission equipment.
Optionally, the method further comprises: the streaming media server creates a middleware after the image transmission equipment joins the video conference of the video network; the step of obtaining the live broadcast data packet based on the internet protocol comprises the following steps: and pulling the live broadcast data packet based on the Internet protocol from the live broadcast channel corresponding to the image transmission equipment in a live broadcast watching mode through the middleware.
Optionally, the method further comprises: when the streaming media server receives a speech data packet based on a video networking protocol sent by a video networking terminal joining the video networking video conference, screening an audio speech data packet from the speech data packet based on the video networking protocol; the streaming media server converts the audio speech data packet into an audio speech data packet based on an internet protocol; and the streaming media server sends the audio speech data packet based on the Internet protocol to the image transmission equipment.
Optionally, the method further comprises: the streaming media server establishes an audio channel corresponding to the image transmission equipment after the image transmission equipment joins the video conference of the video network; the step of sending the audio speech data packet based on the internet protocol to the image transmission device by the streaming media server includes: and the streaming media server transmits the audio speech data packet based on the Internet protocol to the audio channel and sends the audio speech data packet to the image transmission equipment through the audio channel.
Optionally, the step of screening out an audio speech packet from the speech packets based on the video networking protocol includes: analyzing the speech data packet based on the video networking protocol, and extracting a field representing the data type carried in a packet header of the video networking; the data types comprise an audio type and a video type; and determining the speech data packet with the data type represented by the extracted field as the audio type as the audio speech data packet.
In a second aspect, an embodiment of the present invention discloses a data processing apparatus, where the apparatus is applied to a video conference over a video network, the video network includes a streaming media server, a video network server, and a video network terminal, the internet includes an image transmission device, and the streaming media server includes:
the receiving module is used for receiving a live broadcast data packet which is sent by the image transmission equipment in a live broadcast releasing mode and is based on an internet protocol after the image transmission equipment logs in the streaming media server;
the acquisition module is used for acquiring the live broadcast data packet based on the Internet protocol after the image transmission equipment joins the video conference of the video network;
the first conversion module is used for converting the live broadcast data packet based on the Internet protocol into a live broadcast data packet based on a video networking protocol;
and the first sending module is used for sending the live broadcast data packet based on the video networking protocol to a video networking terminal joining the video networking video conference through the video networking server.
Optionally, the streaming media server further includes: the first establishing module is used for establishing a live channel corresponding to the image transmission equipment after the image transmission equipment logs in the streaming media server; the receiving module is used for receiving a live broadcast data packet which is sent by the image transmission equipment in a live broadcast mode and is based on an internet protocol through the live broadcast channel corresponding to the image transmission equipment after the image transmission equipment logs in the streaming media server.
Optionally, the streaming media server further includes: the creating module is used for creating a middleware after the image transmission equipment joins the video conference of the video network; the acquisition module is used for pulling the live broadcast data packet based on the internet protocol from the live broadcast channel corresponding to the image transmission equipment in a live broadcast watching mode through the middleware after the image transmission equipment joins the video conference of the video network.
Optionally, the streaming media server further includes: the screening module is used for screening out audio speaking data packets from the speaking data packets based on the video networking protocol when receiving the speaking data packets based on the video networking protocol sent by the video networking terminals joining the video networking video conference; the second conversion module is used for converting the audio speech data packet into an audio speech data packet based on an internet protocol; and the second sending module is used for sending the audio speech data packet based on the internet protocol to the image transmission equipment.
Optionally, the streaming media server further includes: the second establishing module is used for establishing an audio channel corresponding to the image transmission equipment after the image transmission equipment joins the video conference of the video network; the second sending module is configured to transmit the internet protocol-based audio speech packet to the audio channel, and send the internet protocol-based audio speech packet to the image transmission device through the audio channel.
Optionally, the screening module comprises: the extraction unit is used for analyzing the speech data packet based on the video networking protocol and extracting a field representing the data type carried in a video networking packet header; the data types comprise an audio type and a video type; and the determining unit is used for determining that the speech data packet with the data type represented by the extracted field as the audio type is an audio speech data packet.
In a third aspect, an embodiment of the present invention discloses an apparatus, including: one or more processors; and one or more machine readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the data processing method of any of the above.
In a fourth aspect, an embodiment of the present invention discloses a computer-readable storage medium storing a computer program for causing a processor to execute a data processing method as described in any one of the above.
In the embodiment of the invention, a streaming media server receives a live broadcast data packet which is sent by an image transmission device in a broadcast live broadcast mode and is based on an internet protocol after the image transmission device logs in; the streaming media server acquires the live broadcast data packet based on the Internet protocol after the image transmission equipment joins the video conference of the video network; the streaming media server converts the live broadcast data packet based on the Internet protocol into a live broadcast data packet based on a video networking protocol; and the streaming media server sends the live broadcast data packet based on the video networking protocol to a video networking terminal joining the video networking video conference through the video networking server. Therefore, in the embodiment of the invention, the image transmission equipment in the internet is used as the participant terminal to be added into the video conference of the video network, the streaming media server is used for realizing the conversion between the video networking protocol and the internet protocol, and the image transmission equipment can be used as a speaking party to send live data in a live broadcasting mode in the video conference of the video network, so that the diversity of the participant terminal is increased, the video network terminal added into the video conference of the video network can watch the live data, the live data can be discussed according to the live data, and the like.
Drawings
FIG. 1 is a schematic networking diagram of a video network of the present invention;
FIG. 2 is a schematic diagram of a hardware architecture of a node server according to the present invention;
fig. 3 is a schematic diagram of a hardware structure of an access switch of the present invention;
fig. 4 is a schematic diagram of a hardware structure of an ethernet protocol conversion gateway according to the present invention;
FIG. 5 is a flow chart of steps of a data processing method according to a first embodiment of the present invention;
FIG. 6 is a flowchart illustrating steps of a data processing method according to a second embodiment of the present invention;
fig. 7 is a block diagram of a data processing apparatus according to a third embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
The video networking is an important milestone in network development. It is a real-time network that can realize real-time transmission of high-definition video, pushing many internet applications toward high-definition, face-to-face video.
The video networking adopts real-time high-definition video switching technology and can integrate dozens of services that users need, such as video, voice, pictures, text, communication and data, on one network platform: high-definition video conferencing, video monitoring, intelligent monitoring analysis, emergency command, digital broadcast television, time-shifted television, network teaching, live broadcast, VOD on demand, television mail, Personal Video Recorder (PVR), intranet (self-office) channels, intelligent video broadcast control, information distribution and the like, realizing high-definition quality video playback through a television or a computer.
To better understand the embodiments of the present invention, the video networking is described below:
some of the technologies applied in the video networking are as follows:
network Technology (Network Technology)
Network technology innovation in the video networking improves the traditional Ethernet to cope with the potentially huge video traffic on the network. Unlike pure network Packet Switching or network Circuit Switching, the video networking technology adopts Packet Switching to meet streaming requirements. The video networking technology has the flexibility, simplicity and low cost of packet switching while providing the quality and security guarantees of circuit switching, realizing seamless connection of whole-network switched virtual circuits and data formats.
Switching Technology (Switching Technology)
The video network adopts two advantages of asynchronism and packet switching of the Ethernet, eliminates the defects of the Ethernet on the premise of full compatibility, has end-to-end seamless connection of the whole network, is directly communicated with a user terminal, and directly bears an IP data packet. The user data does not require any format conversion across the entire network. The video networking is a higher-level form of the Ethernet, is a real-time exchange platform, can realize the real-time transmission of the whole-network large-scale high-definition video which cannot be realized by the existing Internet, and pushes a plurality of network video applications to high-definition and unification.
Server Technology (Server Technology)
The server technology on the video networking and unified video platform is different from the traditional server, the streaming media transmission of the video networking and unified video platform is established on the basis of connection orientation, the data processing capacity of the video networking and unified video platform is independent of flow and communication time, and a single network layer can contain signaling and data transmission. For voice and video services, the complexity of video networking and unified video platform streaming media processing is much simpler than that of data processing, and the efficiency is greatly improved by more than one hundred times compared with that of a traditional server.
Storage Technology (Storage Technology)
The super-high speed storage technology of the unified video platform adopts the most advanced real-time operating system in order to adapt to the media content with super-large capacity and super-large flow, the program information in the server instruction is mapped to the specific hard disk space, the media content is not passed through the server any more, and is directly sent to the user terminal instantly, and the general waiting time of the user is less than 0.2 second. The optimized sector distribution greatly reduces the mechanical motion of the magnetic head track seeking of the hard disk, the resource consumption only accounts for 20% of that of the IP internet of the same grade, but concurrent flow which is 3 times larger than that of the traditional hard disk array is generated, and the comprehensive efficiency is improved by more than 10 times.
Network Security Technology (Network Security Technology)
The structural design of the video network completely eliminates the network security problem troubling the internet structurally by the modes of independent service permission control each time, complete isolation of equipment and user data and the like, generally does not need antivirus programs and firewalls, avoids the attack of hackers and viruses, and provides a structural carefree security network for users.
Service Innovation Technology (Service Innovation Technology)
The unified video platform integrates services and transmission; whether for a single user, a private network user or a network aggregate, it is only a matter of one automatic connection. The user terminal, set-top box or PC connects directly to the unified video platform to obtain a wide variety of multimedia video services. The unified video platform adopts a menu-style configuration table instead of traditional complex application programming, so complex applications can be realized with very little code, enabling endless new service innovation.
Networking of the video network is as follows:
the video network is a centralized control network structure, and the network can be a tree network, a star network, a ring network and the like, but on the basis of the centralized control node, the whole network is controlled by the centralized control node in the network.
As shown in fig. 1, the video network is divided into an access network and a metropolitan network.
The devices of the access network part can be mainly classified into 3 types: node server, access switch, terminal (including various set-top boxes, coding boards, memories, etc.). The node server is connected to an access switch, which may be connected to a plurality of terminals and may be connected to an ethernet network.
The node server is a node which plays a centralized control function in the access network and can control the access switch and the terminal. The node server can be directly connected with the access switch or directly connected with the terminal.
Similarly, devices of the metropolitan network portion may also be classified into 3 types: a metropolitan area server, a node switch and a node server. The metro server is connected to a node switch, which may be connected to a plurality of node servers.
The node server is a node server of the access network part, namely the node server belongs to both the access network part and the metropolitan area network part.
The metropolitan area server is a node which plays a centralized control function in the metropolitan area network and can control a node switch and a node server. The metropolitan area server can be directly connected with the node switch or directly connected with the node server.
Therefore, the whole video network is a network structure with layered centralized control, and the network controlled by the node server and the metropolitan area server can be in various structures such as tree, star and ring.
The access network part can form a unified video platform (the part in the dotted circle), and a plurality of unified video platforms can form a video network; each unified video platform may be interconnected via metropolitan area and wide area video networking.
1. Video networking device classification
1.1 Devices in the video network of the embodiment of the present invention can be mainly classified into 3 types: servers, switches (including Ethernet protocol conversion gateways), and terminals (including various set-top boxes, coding boards, memories, etc.). The video network as a whole can be divided into a metropolitan area network (or national network, global network, etc.) and an access network.
1.2 The devices of the access network part can be mainly classified into 3 types: node servers, access switches (including Ethernet protocol conversion gateways), and terminals (including various set-top boxes, coding boards, memories, etc.).
The specific hardware structure of each access network device is as follows:
a node server:
as shown in fig. 2, the system mainly includes a network interface module 201, a switching engine module 202, a CPU module 203, and a disk array module 204;
packets coming from the network interface module 201, the CPU module 203, and the disk array module 204 all enter the switching engine module 202; the switching engine module 202 looks up the address table 205 for each incoming packet, thereby obtaining the direction information of the packet, and stores the packet in the queue of the corresponding packet buffer 206 based on that direction information; if the queue of the packet buffer 206 is nearly full, the packet is discarded; the switching engine module 202 polls all packet buffer queues and forwards a packet if the following conditions are met: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero. The disk array module 204 mainly implements control over the hard disk, including initialization, read-write, and other operations; the CPU module 203 is mainly responsible for protocol processing with the access switch and the terminal (not shown in the figure), configuring the address table 205 (including a downlink protocol packet address table, an uplink protocol packet address table, and a data packet address table), and configuring the disk array module 204.
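The enqueue-and-poll behaviour of the switching engine described above can be illustrated with a minimal sketch; the class names, packet fields and queue threshold below are illustrative assumptions rather than details from the patent:

```python
from collections import deque

QUEUE_LIMIT = 1024  # assumed "nearly full" threshold for a packet buffer queue


class Port:
    """Minimal stand-in for a physical port with a bounded send buffer."""
    def __init__(self, capacity=256):
        self.buffer = deque()
        self.capacity = capacity

    def send_buffer_full(self):
        return len(self.buffer) >= self.capacity

    def send(self, packet):
        self.buffer.append(packet)


class SwitchingEngine:
    def __init__(self, address_table, ports):
        self.address_table = address_table          # destination address -> output port id
        self.ports = ports                          # port id -> Port
        self.queues = {p: deque() for p in ports}   # one packet buffer queue per port

    def on_packet(self, packet):
        # Look up the address table to obtain the packet's direction (output port).
        port_id = self.address_table.get(packet["da"])
        if port_id is None:
            return
        queue = self.queues[port_id]
        # Drop the packet when the corresponding queue is nearly full.
        if len(queue) >= QUEUE_LIMIT:
            return
        queue.append(packet)

    def poll(self):
        # Poll every queue; forward only if 1) the port send buffer is not full
        # and 2) the queue packet counter is greater than zero.
        for port_id, queue in self.queues.items():
            if queue and not self.ports[port_id].send_buffer_full():
                self.ports[port_id].send(queue.popleft())
```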
The access switch:
as shown in fig. 3, the network interface module mainly includes a network interface module (a downlink network interface module 301 and an uplink network interface module 302), a switching engine module 303 and a CPU module 304;
wherein a packet (uplink data) coming from the downlink network interface module 301 enters the packet detection module 305; the packet detection module 305 detects whether the Destination Address (DA), the Source Address (SA), the packet type, and the packet length of the packet meet the requirements; if so, a corresponding stream identifier (stream-id) is allocated and the packet enters the switching engine module 303, otherwise the packet is discarded; a packet (downlink data) coming from the uplink network interface module 302 enters the switching engine module 303 directly; a data packet coming from the CPU module 304 also enters the switching engine module 303; the switching engine module 303 looks up the address table 306 for each incoming packet, thereby obtaining the direction information of the packet; if the packet entering the switching engine module 303 goes from the downlink network interface to the uplink network interface, the packet is stored in the queue of the corresponding packet buffer 307 in association with the stream-id; if the packet entering the switching engine module 303 does not go from the downlink network interface to the uplink network interface, the packet is stored in the queue of the corresponding packet buffer 307 according to its direction information; in either case, if the queue of the packet buffer 307 is nearly full, the packet is discarded.
The switching engine module 303 polls all packet buffer queues and may include two cases:
if the queue is from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queued packet counter is greater than zero; 3) obtaining a token generated by a code rate control module;
if the queue is not from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero.
The rate control module 308 is configured by the CPU module 304, and generates tokens for packet buffer queues from all downstream network interfaces to upstream network interfaces at programmable intervals to control the rate of upstream forwarding.
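A minimal sketch of such token generation is given below; the interval and burst values are assumptions, not figures from the patent:

```python
import time


class RateControl:
    """Token source configured by the CPU module: one bucket per packet buffer
    queue going from a downlink network interface to an uplink network interface."""

    def __init__(self, interval_s=0.001, burst=32):
        self.interval = interval_s   # programmable token-generation interval (assumed)
        self.burst = burst           # maximum stored tokens per queue (assumed)
        self.tokens = {}             # queue id -> available tokens
        self.last = {}               # queue id -> time of last refill

    def register_queue(self, queue_id):
        self.tokens[queue_id] = 0
        self.last[queue_id] = time.monotonic()

    def try_take_token(self, queue_id):
        # Refill one token per elapsed interval, capped at the burst size,
        # then consume a token if one is available (forwarding is allowed).
        now = time.monotonic()
        fresh = int((now - self.last[queue_id]) / self.interval)
        if fresh:
            self.tokens[queue_id] = min(self.burst, self.tokens[queue_id] + fresh)
            self.last[queue_id] = now
        if self.tokens[queue_id] > 0:
            self.tokens[queue_id] -= 1
            return True
        return False
```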
The CPU module 304 is mainly responsible for protocol processing with the node server, configuration of the address table 306, and configuration of the code rate control module 308.
Ethernet protocol conversion gateway
As shown in fig. 4, the apparatus mainly includes a network interface module (a downlink network interface module 401 and an uplink network interface module 402), a switching engine module 403, a CPU module 404, a packet detection module 405, a rate control module 408, an address table 406, a packet buffer 407, a MAC adding module 409, and a MAC deleting module 410.
Wherein a data packet coming from the downlink network interface module 401 enters the packet detection module 405; the packet detection module 405 detects whether the Ethernet MAC DA, the Ethernet MAC SA, the Ethernet length or frame type, the video network destination address DA, the video network source address SA, the video network packet type, and the packet length of the packet meet the requirements, and if so, allocates a corresponding stream identifier (stream-id); the MAC deletion module 410 then strips the MAC DA, MAC SA, and length or frame type (2 bytes), and the packet enters the corresponding receiving buffer; otherwise the packet is discarded;
the downlink network interface module 401 detects the sending buffer of the port, and if there is a packet, obtains the ethernet MAC DA of the corresponding terminal according to the destination address DA of the packet, adds the ethernet MAC DA of the terminal, the MAC SA of the ethernet protocol gateway, and the ethernet length or frame type, and sends the packet.
The other modules in the ethernet protocol gateway function similarly to the access switch.
A terminal:
the system mainly comprises a network interface module, a service processing module and a CPU module; for example, the set-top box mainly comprises a network interface module, a video and audio coding and decoding engine module and a CPU module; the coding board mainly comprises a network interface module, a video and audio coding engine module and a CPU module; the memory mainly comprises a network interface module, a CPU module and a disk array module.
1.3 Devices of the metropolitan area network part can be mainly classified into 3 types: node servers, node switches, and metropolitan area servers. The node switch mainly comprises a network interface module, a switching engine module and a CPU module; the metropolitan area server mainly comprises a network interface module, a switching engine module and a CPU module.
2. Video networking packet definition
2.1 Access network packet definition
The data packet of the access network mainly comprises the following parts: destination Address (DA), Source Address (SA), reserved bytes, payload (pdu), CRC.
As shown in the following table, the data packet of the access network mainly includes the following parts:
DA | SA | Reserved | Payload | CRC
wherein:
the Destination Address (DA) is composed of 8 bytes (byte), the first byte represents the type of the data packet (such as various protocol packets, multicast data packets, unicast data packets, etc.), there are 256 possibilities at most, the second byte to the sixth byte are metropolitan area network addresses, and the seventh byte and the eighth byte are access network addresses;
the Source Address (SA) is also composed of 8 bytes (byte), defined as the same as the Destination Address (DA);
the reserved byte consists of 2 bytes;
the length of the payload part depends on the type of datagram: it is 64 bytes for the various types of protocol packets and 32+1024 = 1056 bytes for a unicast data packet; of course, the length is not limited to these 2 cases;
the CRC consists of 4 bytes and is calculated in accordance with the standard ethernet CRC algorithm.
2.2 metropolitan area network packet definition
The topology of a metropolitan area network is a graph, and there may be 2, or even more than 2, connections between two devices, i.e., there may be more than 2 connections between a node switch and a node server, between a node switch and a node switch, and between a node switch and a node server. However, the metro network address of a metro network device is unique, so in order to accurately describe the connection relationship between metro network devices, a parameter is introduced in the embodiment of the present invention: a label, which uniquely describes a metropolitan area network device.
In this specification, the definition of the label is similar to that of the label in MPLS (Multi-Protocol Label Switching). Assuming that there are two connections between device A and device B, there are 2 labels for packets going from device A to device B, and 2 labels for packets going from device B to device A. Labels are classified into incoming labels and outgoing labels; assuming that the label of a packet entering device A (the incoming label) is 0x0000, the label of the packet when it leaves device A (the outgoing label) may become 0x0001. The network access process of the metro network is a network access process under centralized control, that is, both address allocation and label allocation for the metro network are dominated by the metro server, with the node switches and node servers executing passively; this differs from label allocation in MPLS, where labels are the result of mutual negotiation between the switch and the server.
As shown in the following table, the data packet of the metro network mainly includes the following parts:
DA | SA | Reserved | Label | Payload | CRC
Namely: Destination Address (DA), Source Address (SA), Reserved bytes (Reserved), label, payload (PDU), and CRC. The format of the label may be defined as follows: the label is 32 bits, with the upper 16 bits reserved and only the lower 16 bits used, and it is located between the reserved bytes and the payload of the packet.
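As a small sketch of how the 32-bit label sits between the reserved bytes and the payload, and of a centrally dictated label swap, the helpers below assume the access-network layout above taken before the CRC is appended; the label values themselves are only examples:

```python
import struct

LABEL_OFFSET = 18  # DA(8) + SA(8) + Reserved(2); the label precedes the payload


def insert_label(packet_body: bytes, label: int) -> bytes:
    """Turn an access-network packet body (before CRC) into a metro packet body
    by inserting the 32-bit label; only the lower 16 bits are used."""
    return (packet_body[:LABEL_OFFSET]
            + struct.pack("!I", label & 0xFFFF)
            + packet_body[LABEL_OFFSET:])


def swap_label(metro_packet_body: bytes, out_label: int) -> bytes:
    """Replace the incoming label with the outgoing label assigned by the metro
    server, e.g. a packet entering a device with label 0x0000 may leave with 0x0001."""
    return (metro_packet_body[:LABEL_OFFSET]
            + struct.pack("!I", out_label & 0xFFFF)
            + metro_packet_body[LABEL_OFFSET + 4:])
```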
Based on the characteristics of the video network, the data processing scheme provided by the embodiment of the invention follows the protocol of the video network, so that the image transmission equipment can be used as a participant terminal to join in the video conference of the video network.
The data processing scheme of the embodiment of the invention can be applied to video conferences of the video network. The devices involved in the video conference may include devices in the video network and devices in the internet. The video network can comprise a streaming media server, a video network server (which can be the node server described above) and a video network terminal. The internet can include an image transmission device.
The terminal of the video networking is a terminal for performing services based on the video networking protocol, and the terminal of the video networking may be various Set Top Boxes (STBs) and the like based on the video networking protocol. The video network terminal registers to the video network server to perform normal service, and the video network server allocates a video network number, a Media Access Control (MAC) address and the like for the video network terminal after successful registration. And the video network terminal logs in the video network server according to the video network number so as to be connected with the video network server. In the video network, each video network terminal can be distinguished through the video network number and the video network MAC address of the video network terminal.
The streaming media server may be connected to a plurality of image transmission devices on the internet. The image transmission device is also called a wireless video monitoring device or a wireless video server, or may be called a network camera. A typical application is vehicle-mounted mobile monitoring: the image transmission device is introduced on top of the traditional vehicle-mounted hard disk/SD card video recorder, so that while images are recorded throughout the journey, back-end monitoring center personnel can also view the images in real time during driving and respond immediately to emergencies, for example in police cars, road enforcement vehicles, armored cash-transport vehicles, buses, hazardous freight vehicles, river law enforcement boats, and the like. The image transmission device can be a 3G image transmission device, a 4G image transmission device, etc. The image transmission device may log in to the streaming media server, and after successful login the streaming media server may store information about each image transmission device, which may include the device's name, geographic location (e.g., longitude and latitude), IP address, and the like.
The streaming media server can be understood as a gateway responsible for connecting the image transmission devices of the internet into the video network. The streaming media server can comprise a plurality of virtual terminals, which serve as the bridge between the internet and the video network. Each virtual terminal registers with the video network server in order to perform normal services, and after successful registration the video network server allocates a video network number, a video network MAC address and the like to each virtual terminal. Each virtual terminal then logs in to the video network server according to its video network number so as to maintain the connection between the access server and the video network server. Within the video network, the virtual terminals can be distinguished by their video network numbers and video network MAC addresses.
Example one
Referring to fig. 5, a flowchart illustrating steps of a data processing method according to a first embodiment of the present invention is shown.
The data processing method of the embodiment of the invention can comprise the following steps:
step 501, after the image transmission device logs in the streaming media server, the streaming media server receives a live broadcast data packet based on an internet protocol, which is sent by the image transmission device in a release live broadcast manner.
The image transmission device can register on the streaming media server in a user mode, and information such as an account number, a password and the like is obtained after the registration is successful. The image transmission device may log in the streaming media server by using an account and a password, and after the login is successful, the image transmission device and the streaming media server may interact based on an Internet Protocol, such as an IP (Internet Protocol).
The image transmission equipment supports a live broadcast mode: it can collect data such as external audio and video streams, encode the collected data as live broadcast data, and encapsulate the live broadcast data into a live broadcast data packet based on an internet protocol. The live broadcast data comprises audio live broadcast data and video live broadcast data, and the live broadcast data packets comprise audio live broadcast data packets and video live broadcast data packets. The internet protocol-based live broadcast data packet may include an internet packet header and an internet data body. The internet packet header may include information such as the IP address of the image transmission device, the IP address of the streaming media server, and a field characterizing the data type, where the data type includes an audio type and a video type. The internet data body contains the actual live broadcast data.
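A minimal sketch of such an internet protocol-based live packet is given below; the patent does not specify the exact wire format, so the JSON header, field names and type values here are purely illustrative assumptions:

```python
import json

AUDIO_TYPE, VIDEO_TYPE = 0, 1  # assumed encodings of the data-type field


def build_live_packet(device_ip: str, server_ip: str, data_type: int, media: bytes) -> bytes:
    """Internet packet header (device IP, server IP, data type) followed by the
    encoded live data body."""
    header = json.dumps({"src_ip": device_ip, "dst_ip": server_ip,
                         "data_type": data_type}).encode()
    return len(header).to_bytes(2, "big") + header + media


def parse_live_packet(packet: bytes):
    """Split a live packet back into its internet header and live data body."""
    header_len = int.from_bytes(packet[:2], "big")
    header = json.loads(packet[2:2 + header_len])
    return header, packet[2 + header_len:]
```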
And after successfully logging in the streaming media server, the image transmission equipment sends the live broadcast data packet based on the Internet protocol to the streaming media server in a live broadcast releasing mode.
In an optional implementation manner, after the image transmission device logs in to the streaming media server, the streaming media server may also establish a live channel corresponding to the image transmission device, where each image transmission device corresponds to one live channel. Therefore, after the image transmission device successfully logs in to the streaming media server, the image transmission device sends the internet protocol-based live broadcast data packet, in a publish-live manner, to the live channel corresponding to the image transmission device in the streaming media server. The streaming media server receives the internet protocol-based live broadcast data packet sent by the image transmission device in the publish-live manner through the live channel corresponding to the image transmission device.
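The per-device live channel can be sketched as follows; the class and method names are assumptions chosen for illustration:

```python
class StreamingMediaServer:
    """One live channel per logged-in image transmission device; packets the
    device publishes in live mode are appended to its own channel."""

    def __init__(self):
        self.live_channels = {}  # device id -> list of queued internet live packets

    def on_device_login(self, device_id):
        # Establish the live channel corresponding to this image transmission device.
        self.live_channels.setdefault(device_id, [])

    def on_publish_live(self, device_id, ip_live_packet):
        # Receive an internet protocol-based live packet sent in publish-live mode.
        self.live_channels[device_id].append(ip_live_packet)
```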
Step 502, the streaming media server acquires the live broadcast data packet based on the internet protocol after the image transmission device joins the video conference of the video network.
The video networking video conference can be established through the conference control software of the video network. For example, the conference control software is used to select the participant terminals for the conference, set the roles of the participant terminals in the conference (chairman, speaker, participant, etc.), switch the roles of the participant terminals during the conference, and control participant terminals to leave the conference. The conference control software may be installed on devices such as PCs (personal computers), mobile phones and tablets.
The conference control software can send a conference invitation to the video networking terminal and the image transmission equipment so that the video networking terminal and the image transmission equipment are used as participant terminals to join in the video networking video conference. The conference control software sends an invitation to join the conference to the video network server. And the video networking server sends the meeting invitation aiming at the video networking terminal to the video networking terminal according to the downlink communication link configured for the video networking terminal. And the video network server sends the meeting invitation aiming at the image transmission equipment to the streaming media server according to the downlink communication link configured for the streaming media server, and then the streaming media server sends the meeting invitation to the image transmission equipment. And after the video network terminal and the image transmission equipment accept the invitation to join, returning an invitation accepting response according to the respective corresponding paths, thereby joining the video network terminal and the image transmission equipment as a participant terminal in the video conference of the video network. The video networking server can store information such as video networking numbers, video networking MAC addresses and roles of all the participant terminals in the video networking video conference, so that the video networking server can know which participant terminals the data is sent to after receiving the data sent by the speaking party.
After the image transmission equipment joins the video conference of the video network as a participant terminal, it can act as a speaking party, so the streaming media server can obtain the internet protocol-based live broadcast data packets sent by the image transmission equipment after login and then transmit the live broadcast data in the video conference of the video network.
In an alternative embodiment, the streaming media server may obtain, through the middleware, a live data packet based on an internet protocol, which is sent by the image transmission device in a live manner. Therefore, the streaming media server can create the middleware after the image transmission device joins the video conference of the video network. Middleware is software that provides a connection between system software and application software, and includes a set of services to facilitate communication between the various components of the software, particularly the centralized logic of the application software with respect to the system software.
The step of obtaining the live data packet based on the internet protocol may include: pulling, through the middleware and in a live-viewing manner, the internet protocol-based live broadcast data packet from the live channel corresponding to the image transmission equipment. Watching the live broadcast through the middleware means that the middleware is added to the watch list of the live broadcast source (namely, the image transmission equipment), and the live broadcast source pushes the stream to the members of its watch list by traversing the list. The middleware plays two roles: as a live viewer it pulls the live data, and as a conference data source it sends the live data out.
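The dual role of the middleware, pulling from the watch list on one side and feeding the conference on the other, can be sketched as below; the callbacks and class names are illustrative assumptions:

```python
class LiveChannel:
    """Live source side: the channel pushes each published packet to every
    member of its watch list by traversing the list."""

    def __init__(self):
        self.watchers = []

    def add_watcher(self, watcher):
        self.watchers.append(watcher)

    def publish(self, ip_live_packet):
        for watcher in self.watchers:
            watcher.on_live_packet(ip_live_packet)


class Middleware:
    """Viewer side and conference data source: pulls live data as a watcher and
    forwards the converted packets toward the video networking server."""

    def __init__(self, convert, send_to_vn_server):
        self.convert = convert                    # internet packet -> video networking packet
        self.send_to_vn_server = send_to_vn_server

    def on_live_packet(self, ip_live_packet):
        self.send_to_vn_server(self.convert(ip_live_packet))
```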
Step 503, the streaming media server converts the live data packet based on the internet protocol into a live data packet based on the video networking protocol.
The streaming media server and the video networking server are interacted based on a video networking protocol, and the streaming media server converts the live broadcast data packet based on the Internet protocol into a live broadcast data packet based on the video networking protocol.
In the implementation, the streaming media server analyzes the live data packet based on the internet protocol to obtain a video networking data body (namely, live data), and then encapsulates the live data based on the video networking protocol to obtain the live data packet based on the video networking protocol. The live data packet based on the video network protocol can comprise a video network packet header and a video network data body. The video network header may include information such as a video network number, a video network MAC address, and a field for characterizing a data type of a streaming server (specifically, a virtual terminal currently processing a live data packet), where the data type includes an audio type and a video type. The video network data body comprises specific live broadcast data.
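A minimal sketch of this re-encapsulation is given below; since the patent does not publish the binary header formats, dictionaries stand in for the internet and video networking headers, and all field names are assumptions:

```python
def ip_to_vn_live_packet(ip_packet: dict, vn_number: str, vn_mac: str) -> dict:
    """Parse the internet protocol-based live packet to recover the live data body,
    then wrap it in a video networking header carrying the virtual terminal's video
    networking number, video networking MAC address and the data-type field."""
    live_data = ip_packet["body"]
    data_type = ip_packet["header"]["data_type"]   # audio or video
    return {
        "header": {"vn_number": vn_number, "vn_mac": vn_mac, "data_type": data_type},
        "body": live_data,
    }
```

In this sketch, a virtual terminal of the streaming media server would apply the conversion to every packet pulled by the middleware before handing it to the video networking server.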
And step 504, the streaming media server sends the live broadcast data packet based on the video networking protocol to a video networking terminal joining the video networking video conference through the video networking server.
And the streaming media server sends the live broadcast data packet based on the video networking protocol to a video networking server, and then the video networking server forwards the live broadcast data packet based on the video networking protocol to the video networking terminal added to the video networking video conference according to a downlink communication link configured for the video networking terminal added to the video networking video conference.
In practical applications, the video network is a network with a centralized control function, and includes a master control server and lower-level network devices, where the lower-level network devices include terminals. One of the core concepts of the video network is that the master control server notifies the switching devices to configure a table for the downlink communication link of the current service, and the data packets are then transmitted based on the configured table.
Namely, the communication method in the video network includes:
and the master control server configures the downlink communication link of the current service.
And transmitting the data packet of the current service sent by the source terminal to the target terminal according to the downlink communication link.
In the embodiment of the present invention, configuring the downlink communication link of the current service includes: and informing the switching equipment related to the downlink communication link of the current service to allocate the table.
Further, transmitting according to the downlink communication link includes: the configured table is consulted, and the switching equipment transmits the received data packet through the corresponding port.
In particular implementations, the services include unicast communication services and multicast communication services. That is, whether for multicast communication or unicast communication, the core concept of configuring tables and looking up tables can be adopted to realize communication in the video network.
As mentioned above, the video network includes an access network portion, in which the master server is a node server and the lower-level network devices include an access switch and a terminal.
For the unicast communication service in the access network, the step of configuring the downlink communication link of the current service by the master server may include the following steps:
and a substep S11, the main control server obtains the downlink communication link information of the current service according to the service request protocol packet initiated by the source terminal, wherein the downlink communication link information includes the downlink communication port information of the main control server and the access switch participating in the current service.
In the substep S12, the main control server sets a downlink port to which a packet of the current service is directed in a packet address table inside the main control server according to the downlink communication port information of the main control server; and sending a port configuration command to the corresponding access switch according to the downlink communication port information of the access switch.
In sub-step S13, the access switch sets the downstream port to which the packet of the current service is directed in its internal packet address table according to the port configuration command.
For a multicast communication service (e.g., video conference) in the access network, the step of the master server obtaining downlink information of the current service may include the following sub-steps:
in sub-step S21, the main control server obtains a service request protocol packet initiated by the target terminal and applying for the multicast communication service, where the service request protocol packet includes service type information, service content information, and an access network address of the target terminal.
Wherein, the service content information includes a service number.
And a substep S22, the main control server extracts the access network address of the source terminal in a preset content-address mapping table according to the service number.
In the substep of S23, the main control server obtains the multicast address corresponding to the source terminal and distributes the multicast address to the target terminal; and acquiring the communication link information of the current multicast service according to the service type information and the access network addresses of the source terminal and the target terminal.
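The multicast link-configuration sub-steps S21 to S23 above can be sketched as follows; the table contents and field names are placeholders, not data from the patent:

```python
class NodeServer:
    """Master control server side of the multicast service request."""

    def __init__(self, content_address_map, multicast_map):
        self.content_address_map = content_address_map  # service number -> source access-network address
        self.multicast_map = multicast_map               # source access-network address -> multicast address

    def handle_multicast_request(self, request: dict) -> dict:
        # S21: the protocol packet from the target terminal carries the service type,
        # the service content (service number) and the target's access-network address.
        service_number = request["service_number"]
        target_address = request["target_access_address"]
        # S22: extract the source terminal's access-network address by service number.
        source_address = self.content_address_map[service_number]
        # S23: obtain the multicast address for the source, distribute it to the target,
        # and derive the communication link info for the current multicast service.
        multicast_address = self.multicast_map[source_address]
        return {
            "service_type": request["service_type"],
            "source": source_address,
            "target": target_address,
            "multicast_address": multicast_address,
        }
```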
And after receiving the live broadcast data packet, the video networking terminal joining the video networking video conference decodes and plays the live broadcast data packet.
In the embodiment of the invention, the image transmission equipment in the internet is used as a participant terminal to be added into the video conference of the video network, the streaming media server is used for realizing the conversion between the video networking protocol and the internet protocol, and the image transmission equipment can be used as a speaking party to send live data in a live broadcasting mode in the video conference of the video network, so that the diversity of the participant terminal is increased, the video network terminal added into the video conference of the video network can watch the live data, and the discussion and the like are carried out on the live data.
Example two
Referring to fig. 6, a flowchart illustrating steps of a data processing method according to a second embodiment of the present invention is shown.
The data processing method of the embodiment of the invention can comprise the following steps:
step 601, after the image transmission device logs in the streaming media server, the streaming media server receives a live broadcast data packet based on an internet protocol, which is sent by the image transmission device in a release live broadcast manner.
Step 602, the streaming media server acquires the live broadcast data packet based on the internet protocol after the image transmission device joins the video conference.
Step 603, the streaming media server converts the live data packet based on the internet protocol into a live data packet based on the video networking protocol.
And step 604, the streaming media server sends the live broadcast data packet based on the video networking protocol to a video networking terminal joining the video networking video conference through the video networking server.
Steps 602-604 describe the data transmission process when the image transmission device is used as the speaking party in the video networking video conference. The following steps 605 to 607 describe a data transmission process when the video network terminal is a speaking party in the video network video conference.
And 605, when receiving a talk data packet based on a video networking protocol sent by a video networking terminal joining the video networking video conference, the streaming media server screens out an audio talk data packet from the talk data packet based on the video networking protocol.
If a video networking terminal speaks in the video networking video conference, the video networking terminal collects the speech data, encodes the speech data, and encapsulates it into a speech data packet based on the video networking protocol. The speech data comprises audio speech data and video speech data, and the speech data packets comprise audio speech data packets and video speech data packets. The video networking protocol-based speech data packet may include a video networking packet header and a video networking data body. The video networking header can include information such as the video networking number and video networking MAC address of the speaking video networking terminal and a field characterizing the data type, where the data type comprises an audio type and a video type. The video networking data body contains the actual speech data.
The video network terminal sends the speech data packet based on the video network protocol to the video network server, and the video network server forwards the speech data packet to other participant terminals including other video network terminals and image transmission equipment participating in the video network conference.
For the other video networking terminals participating in the video networking video conference, the video networking server sends the speech data packets based on the video networking protocol to them over the downlink communication links configured for those terminals.
For the image transmission device, the device is connected to the streaming media server, and the streaming media server is connected to the video networking server. The video networking server therefore sends the speech data packets based on the video networking protocol to the streaming media server over the downlink communication link configured for the streaming media server; specifically, the packets may be sent to an idle virtual terminal of the streaming media server.
Because an image transmission device typically provides video capture together with audio input and output, but no video playback, the streaming media server screens out the audio speech data packets after it receives the speech data packets based on the video networking protocol. The video speech data packets may be discarded.
In an alternative embodiment, the step of screening out the audio speech data packets from the speech data packets based on the video networking protocol may include: parsing the speech data packets based on the video networking protocol and extracting the field representing the data type carried in the video networking packet header, where the data types include an audio type and a video type; and determining the speech data packets whose extracted field represents the audio type to be the audio speech data packets.
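A minimal sketch of this screening step follows, assuming a hypothetical header layout in which the data-type field has already been parsed into a dictionary; the real video networking packet format is not specified here.

```python
from dataclasses import dataclass
from typing import List

AUDIO_TYPE = 0x01   # assumed value of the data-type field for audio
VIDEO_TYPE = 0x02   # assumed value of the data-type field for video


@dataclass
class SpeechPacket:
    header: dict    # parsed video networking packet header
    body: bytes     # speech data carried in the data body


def data_type_of(packet: SpeechPacket) -> int:
    # Extract the field representing the data type from the packet header.
    return packet.header["data_type"]


def screen_audio_packets(packets: List[SpeechPacket]) -> List[SpeechPacket]:
    # Keep only the audio speech packets; video speech packets are discarded.
    return [p for p in packets if data_type_of(p) == AUDIO_TYPE]
```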
Step 606, the streaming media server converts the audio speech data packet into an audio speech data packet based on the internet protocol.
The streaming media server and the image transmission device interact based on an internet protocol, and the streaming media server converts an audio speech data packet based on the video networking protocol into an audio speech data packet based on the internet protocol.
In implementation, the streaming media server parses the audio speech data packet based on the video networking protocol to obtain the video networking data body (that is, the audio speech data), and then encapsulates the audio speech data according to the internet protocol to obtain an audio speech data packet based on the internet protocol. The internet-protocol audio speech data packet may include an internet packet header and an internet data body. The internet packet header may include information such as the IP address of the streaming media server, the IP address of the image transmission device, and a field representing the data type, where the data types include an audio type and a video type. The internet data body carries the actual audio speech data.
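A minimal sketch of this re-encapsulation is shown below, again with assumed type and field names (VNAudioPacket, InternetAudioPacket) rather than the actual protocol structures.

```python
from dataclasses import dataclass


@dataclass
class VNAudioPacket:
    vn_header: dict   # video networking packet header
    vn_body: bytes    # audio speech data


@dataclass
class InternetAudioPacket:
    src_ip: str       # IP address of the streaming media server
    dst_ip: str       # IP address of the image transmission device
    data_type: str    # "audio"
    body: bytes       # audio speech data


def to_internet_packet(pkt: VNAudioPacket,
                       server_ip: str,
                       device_ip: str) -> InternetAudioPacket:
    audio_data = pkt.vn_body                       # parse out the data body
    return InternetAudioPacket(src_ip=server_ip,   # re-wrap with an internet header
                               dst_ip=device_ip,
                               data_type="audio",
                               body=audio_data)
```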
Step 607, the streaming media server sends the audio speech data packet based on the internet protocol to the image transmission device.
The streaming media server may establish an audio channel corresponding to the image transmission device after the device joins the video networking video conference; this channel is used by the streaming media server to send audio speech data to the corresponding device. The streaming media server therefore transmits the internet-protocol audio speech data packet to the audio channel corresponding to the image transmission device, and sends it to the device through that channel.
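The per-device audio channel can be pictured with the following minimal sketch, which uses an in-memory queue as a stand-in for the real channel mechanism; the registry and function names are hypothetical.

```python
from queue import Queue
from typing import Dict

audio_channels: Dict[str, Queue] = {}   # device id -> audio channel


def open_audio_channel(device_id: str) -> None:
    # Established when the image transmission device joins the conference.
    audio_channels[device_id] = Queue()


def send_audio(device_id: str, internet_audio_packet: bytes) -> None:
    # Step 607: push the internet-protocol audio speech packet onto the
    # device's audio channel; the device then decodes and plays it.
    audio_channels[device_id].put(internet_audio_packet)
```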
After receiving the audio speech data packet, the image transmission device decodes and plays it.
This embodiment of the invention enables the image transmission device to speak in the video networking video conference in a live broadcast manner and, as a participant, to receive the speech of the other speaking parties.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Embodiment 3
Referring to fig. 7, a block diagram of a data processing apparatus according to a third embodiment of the present invention is shown. The data processing apparatus of this embodiment can be applied to a video networking video conference, where the video networking includes a streaming media server, a video networking server and a video networking terminal, and the internet includes an image transmission device.
The data processing apparatus of this embodiment may include the following modules:
the streaming media server includes:
a receiving module 701, configured to receive a live broadcast data packet based on an internet protocol, sent by the image transmission device in a live broadcast publishing manner after the image transmission device logs in the streaming media server;
an obtaining module 702, configured to obtain the live broadcast data packet based on the internet protocol after the image transmission device joins the video networking video conference;
a first conversion module 703, configured to convert the live broadcast packet based on the internet protocol into a live broadcast packet based on a video networking protocol;
a first sending module 704, configured to send the live broadcast data packet based on the video networking protocol to a video networking terminal joining the video networking video conference via the video networking server.
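As a rough illustration of how these four modules could be composed, the following Python sketch mirrors the method steps; the class names are hypothetical and do not describe the actual apparatus implementation.

```python
class ReceivingModule:
    def receive(self, live_packet): ...         # internet-protocol live packet (step 601)


class ObtainingModule:
    def obtain(self, live_channel): ...         # pull packet after the device joins (step 602)


class FirstConversionModule:
    def convert(self, live_packet): ...         # internet -> video networking protocol (step 603)


class FirstSendingModule:
    def send(self, vn_packet, vn_server): ...   # forward via the video networking server (step 604)


class StreamingMediaServer:
    def __init__(self):
        self.receiving_module = ReceivingModule()
        self.obtaining_module = ObtainingModule()
        self.first_conversion_module = FirstConversionModule()
        self.first_sending_module = FirstSendingModule()
```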
In an optional embodiment, the streaming server further comprises: the first establishing module is used for establishing a live channel corresponding to the image transmission equipment after the image transmission equipment logs in the streaming media server; the receiving module is used for receiving a live broadcast data packet which is sent by the image transmission equipment in a live broadcast mode and is based on an internet protocol through the live broadcast channel corresponding to the image transmission equipment after the image transmission equipment logs in the streaming media server.
In an optional embodiment, the streaming server further comprises: the creating module is used for creating a middleware after the image transmission equipment joins the video conference of the video network; the acquisition module is used for pulling the live broadcast data packet based on the internet protocol from the live broadcast channel corresponding to the image transmission equipment in a live broadcast watching mode through the middleware after the image transmission equipment joins the video conference of the video network.
In an optional embodiment, the streaming server further comprises: the screening module, which is used for screening out the audio speech data packets from the speech data packets based on the video networking protocol when speech data packets based on the video networking protocol sent by the video networking terminals joining the video networking video conference are received; the second conversion module, which is used for converting the audio speech data packets into audio speech data packets based on the internet protocol; and the second sending module, which is used for sending the audio speech data packets based on the internet protocol to the image transmission device.
In an optional embodiment, the streaming server further comprises: the second establishing module is used for establishing an audio channel corresponding to the image transmission equipment after the image transmission equipment joins the video conference of the video network; the second sending module is configured to transmit the internet protocol-based audio speech packet to the audio channel, and send the internet protocol-based audio speech packet to the image transmission device through the audio channel.
In an alternative embodiment, the screening module comprises: the extraction unit, which is used for parsing the speech data packets based on the video networking protocol and extracting the field representing the data type carried in the video networking packet header, where the data types include an audio type and a video type; and the determining unit, which is used for determining the speech data packets whose extracted field represents the audio type to be the audio speech data packets.
In this embodiment of the invention, the image transmission device on the internet likewise joins the video networking video conference as a participant terminal, with the streaming media server handling the conversion between the video networking protocol and the internet protocol. The image transmission device can act as the speaking party by sending live data in a live broadcast manner, which increases the diversity of participant terminals and allows the video networking terminals that have joined the conference to watch and discuss the live data.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
In an embodiment of the invention, an apparatus is also provided. The apparatus may include one or more processors and one or more machine-readable media storing instructions, such as an application program, which, when executed by the one or more processors, cause the apparatus to perform the data processing method described above.
In an embodiment of the invention, there is also provided a non-transitory computer readable storage medium, such as a memory, comprising instructions executable by a processor of an electronic device to perform the above-described data processing method. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The data processing method, data processing apparatus and storage medium provided by the present invention have been described in detail above. Specific examples are used herein to explain the principle and implementation of the present invention, and the description of the above embodiments is only intended to help in understanding the method and its core idea. Meanwhile, for those skilled in the art, there may be variations in the specific embodiments and the application scope according to the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (14)

1. A data processing method, applied to a video networking video conference, wherein the video networking comprises a streaming media server, a video networking server and a video networking terminal, the internet comprises an image transmission device, the video networking terminal and the image transmission device join the video networking video conference as participant terminals, and the image transmission device speaks in the video networking video conference in a live broadcast manner and, as a participant, receives the speech of other speaking parties, the method comprising the following steps:
after the image transmission device logs in to the streaming media server, the streaming media server receives a live broadcast data packet based on the internet protocol, sent by the image transmission device in a live broadcast publishing manner, wherein the image transmission device registers with the streaming media server as a user, obtains information such as an account and a password after successful registration, and logs in to the streaming media server using the account and the password;
the streaming media server acquires the live broadcast data packet based on the Internet protocol after the image transmission equipment joins the video conference of the video network;
the streaming media server converts the live broadcast data packet based on the Internet protocol into a live broadcast data packet based on a video networking protocol;
and the streaming media server sends the live broadcast data packet based on the video networking protocol to a video networking terminal joining the video networking video conference through the video networking server.
2. The method of claim 1,
the method further comprises the following steps: after the image transmission equipment logs in the streaming media server, the streaming media server establishes a live broadcast channel corresponding to the image transmission equipment;
the step of receiving the live broadcast data packet based on the internet protocol sent by the image transmission equipment comprises the following steps: and receiving a live broadcast data packet which is sent by the image transmission equipment in a live broadcast mode and is based on an internet protocol through the live broadcast channel corresponding to the image transmission equipment.
3. The method of claim 2,
the method further comprises the following steps: the streaming media server creates a middleware after the image transmission equipment joins the video conference of the video network;
the step of obtaining the live broadcast data packet based on the internet protocol comprises the following steps: and pulling the live broadcast data packet based on the Internet protocol from the live broadcast channel corresponding to the image transmission equipment in a live broadcast watching mode through the middleware.
4. The method of claim 1, further comprising:
when the streaming media server receives a speech data packet based on a video networking protocol sent by a video networking terminal joining the video networking video conference, screening an audio speech data packet from the speech data packet based on the video networking protocol;
the streaming media server converts the audio speech data packet into an audio speech data packet based on an internet protocol;
and the streaming media server sends the audio speech data packet based on the Internet protocol to the image transmission equipment.
5. The method of claim 4,
the method further comprises the following steps: the streaming media server establishes an audio channel corresponding to the image transmission equipment after the image transmission equipment joins the video conference of the video network;
the step of sending the audio speech data packet based on the internet protocol to the image transmission device by the streaming media server includes: and the streaming media server transmits the audio speech data packet based on the Internet protocol to the audio channel and sends the audio speech data packet to the image transmission equipment through the audio channel.
6. The method of claim 4, wherein the step of screening out audio talk packets from the talk packets based on the video networking protocol comprises:
analyzing the speech data packet based on the video networking protocol, and extracting a field representing the data type carried in a packet header of the video networking; the data types comprise an audio type and a video type;
and determining the speech data packet with the data type represented by the extracted field as the audio type as the audio speech data packet.
7. A data processing apparatus, applied to a video networking video conference, wherein the video networking comprises a streaming media server, a video networking server and a video networking terminal, the internet comprises an image transmission device, the video networking terminal and the image transmission device join the video networking video conference as participant terminals, and the image transmission device speaks in the video networking video conference in a live broadcast manner and, as a participant, receives the speech of other speaking parties, wherein the streaming media server comprises:
the receiving module, which is used for receiving, after the image transmission device logs in to the streaming media server, a live broadcast data packet based on the internet protocol sent by the image transmission device in a live broadcast publishing manner, wherein the image transmission device registers with the streaming media server as a user, obtains information such as an account and a password after successful registration, and logs in to the streaming media server using the account and the password;
the acquisition module is used for acquiring the live broadcast data packet based on the Internet protocol after the image transmission equipment joins the video conference of the video network;
the first conversion module is used for converting the live broadcast data packet based on the Internet protocol into a live broadcast data packet based on a video networking protocol;
and the first sending module is used for sending the live broadcast data packet based on the video networking protocol to a video networking terminal joining the video networking video conference through the video networking server.
8. The apparatus of claim 7,
the streaming media server further comprises: the first establishing module is used for establishing a live channel corresponding to the image transmission equipment after the image transmission equipment logs in the streaming media server;
the receiving module is used for receiving a live broadcast data packet which is sent by the image transmission equipment in a live broadcast mode and is based on an internet protocol through the live broadcast channel corresponding to the image transmission equipment after the image transmission equipment logs in the streaming media server.
9. The apparatus of claim 8,
the streaming media server further comprises: the creating module is used for creating a middleware after the image transmission equipment joins the video conference of the video network;
the acquisition module is used for pulling the live broadcast data packet based on the internet protocol from the live broadcast channel corresponding to the image transmission equipment in a live broadcast watching mode through the middleware after the image transmission equipment joins the video conference of the video network.
10. The apparatus of claim 7, wherein the streaming server further comprises:
the screening module is used for screening out audio speaking data packets from the speaking data packets based on the video networking protocol when receiving the speaking data packets based on the video networking protocol sent by the video networking terminals joining the video networking video conference;
the second conversion module is used for converting the audio speech data packet into an audio speech data packet based on an internet protocol;
and the second sending module is used for sending the audio speech data packet based on the internet protocol to the image transmission equipment.
11. The apparatus of claim 10,
the streaming media server further comprises: the second establishing module is used for establishing an audio channel corresponding to the image transmission equipment after the image transmission equipment joins the video conference of the video network;
the second sending module is configured to transmit the internet protocol-based audio speech packet to the audio channel, and send the internet protocol-based audio speech packet to the image transmission device through the audio channel.
12. The apparatus of claim 11, wherein the screening module comprises:
the extraction unit is used for analyzing the speech data packet based on the video networking protocol and extracting a field representing the data type carried in a video networking packet header; the data types comprise an audio type and a video type;
and the determining unit is used for determining that the speech data packet with the data type represented by the extracted field as the audio type is an audio speech data packet.
13. An apparatus, comprising:
one or more processors; and
one or more machine readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the data processing method of any of claims 1-6.
14. A computer-readable storage medium, characterized in that it stores a computer program causing a processor to execute the data processing method according to any one of claims 1 to 6.
CN201910314606.5A 2019-04-18 2019-04-18 Data processing method, device and storage medium Active CN110191304B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910314606.5A CN110191304B (en) 2019-04-18 2019-04-18 Data processing method, device and storage medium


Publications (2)

Publication Number Publication Date
CN110191304A (en) 2019-08-30
CN110191304B (en) 2021-06-29

Family

ID=67714600

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910314606.5A Active CN110191304B (en) 2019-04-18 2019-04-18 Data processing method, device and storage medium

Country Status (1)

Country Link
CN (1) CN110191304B (en)




Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address
Address after: 33rd Floor, No.1 Huasheng Road, Yuzhong District, Chongqing 400013
Patentee after: VISIONVERA INFORMATION TECHNOLOGY Co.,Ltd.
Country or region after: China
Address before: 100000 Beijing Dongcheng District Qinglong Hutong 1 Song Hua Building A1103-1113
Patentee before: VISIONVERA INFORMATION TECHNOLOGY Co.,Ltd.
Country or region before: China