CN110475089A - Multimedia data processing method and video networking terminal - Google Patents
Multimedia data processing method and video networking terminal
- Publication number
- CN110475089A CN110475089A CN201810443414.XA CN201810443414A CN110475089A CN 110475089 A CN110475089 A CN 110475089A CN 201810443414 A CN201810443414 A CN 201810443414A CN 110475089 A CN110475089 A CN 110475089A
- Authority
- CN
- China
- Prior art keywords
- video
- data
- chip
- video data
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
- H04L12/18—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
- H04L12/1813—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
- H04L12/1827—Network arrangements for conference optimisation or adaptation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L69/00—Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
- H04L69/08—Protocols for interworking; Protocol conversion
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/42—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
Abstract
Embodiments of the invention provide a multimedia data processing method and a video networking terminal. In the method, a camera collects original video data and sends it to an encoding chip; the encoding chip encodes the original video data into target video data and sends both the original video data and the target video data to a switch; the switch sends the original video data to a decoding chip and sends the target video data to a video networking server through a video networking interface, so that the target video data can be forwarded to another video networking terminal according to a downlink communication link configured for that terminal; the decoding chip generates characteristic video data from the received original video data and sends it to a projector; and the projector projects the characteristic video data. Service requirements such as video conferencing are thus met within a single terminal, which reduces the number of connected devices, simplifies operation, lowers the probability of error, and reduces cost.
Description
Technical Field
The present invention relates to the field of video networking technologies, and in particular, to a multimedia data processing method and a video networking terminal.
Background
With the rapid development of network technologies, multimedia data communication such as video conferencing and video teaching has become widely used in users' daily life, work, and study.
At present, in the communication of multimedia data such as video data and audio data, a video networking terminal is generally connected to external devices such as a television and a camera to realize a complete video networking function, such as a video conference. Connecting these devices is cumbersome, error-prone, and costly.
Disclosure of Invention
In view of the above problems, embodiments of the present invention are proposed to provide a multimedia data processing method and a video network terminal that overcome or at least partially solve the above problems.
According to one aspect of the invention, a method for processing multimedia data is provided, wherein a video network terminal located in a video network is provided with a camera, a projector, an encoding chip, a switch and a decoding chip, the camera is connected to the encoding chip, the projector is connected to the decoding chip, the encoding chip and the decoding chip are connected to the switch, and the switch is provided with a video network interface;
the method comprises the following steps:
the method comprises the steps that a camera collects original video data and sends the original video data to a coding chip;
the encoding chip encodes the original video data into target video data and sends the original video data and the target video data to the switch;
the switch sends the original video data to a decoding chip, and sends the target video data to a video networking server through the video networking interface so as to send the target video data to another video networking terminal according to a downlink communication link configured for the other video networking terminal;
the decoding chip generates characteristic video data according to the received original video data and sends the characteristic video data to the projector;
and projecting the characteristic video data by a projector.
Optionally, the encoding chip has a data input interface, and the peripheral input device is connected to the data input interface, and the method further includes:
the encoding chip receives original video data and/or original audio data from the peripheral input equipment through the data input interface;
the encoding chip encodes the original audio data into target audio data and sends the original audio data and the target audio data to the switch;
and the switch sends the original audio data to a decoding chip, and sends the target audio data to a video networking server through the video networking interface so as to send the target audio data to another video networking terminal according to a downlink communication link configured for the other video networking terminal.
Optionally, the decoding chip has a data output interface, the peripheral output device is connected to the data output interface, the video networking terminal is further provided with a communication chip, and the method further includes:
the decoding chip generates characteristic audio data according to the received original audio data;
the decoding chip sends the characteristic video data and/or the characteristic audio data to peripheral output equipment through the data output interface for playing;
or,
the decoding chip sends the characteristic audio data to a communication chip;
and the communication chip sends the characteristic audio data to a connected audio player for playing.
Optionally, the video networking terminal is further provided with a communication chip, and the method further includes:
the switch sends the received target video data and/or target audio data to a decoding chip;
the decoding chip sends the target video data and/or the target audio data to a communication chip;
and the communication chip sends the target video data and/or the target audio data to a storage server positioned in an IP network through a mobile network so as to forward the target video data and/or the target audio data to another video network terminal.
Optionally, the method further comprises:
the switch receives target video data and/or target audio data of another video networking terminal, which are sent to the video networking terminal by the video networking server according to a downlink communication link configured for the video networking terminal, through the video networking interface;
the switch sends the target video data and/or the target audio data to a decoding chip;
and the decoding chip decodes the target video data into original video data and/or decodes the target audio data into original audio data.
Optionally, the video networking terminal is further provided with a communication chip, and the method further includes:
the communication chip receives, from a storage server located in an IP network through a mobile network, target video data and/or target audio data of another video networking terminal that has undergone protocol conversion from the video network to the IP network;
and the communication chip sends the target video data and/or the target audio data to a decoding chip.
According to another aspect of the invention, a video network terminal is provided, which is located in a video network and is provided with a camera, a projector, an encoding chip, a switch and a decoding chip, wherein the camera is accessed to the encoding chip, the projector is accessed to the decoding chip, the encoding chip and the decoding chip are accessed to the switch, and the switch is provided with a video network interface;
the camera is used for acquiring original video data and sending the original video data to the coding chip;
the encoding chip is used for encoding the original video data into target video data and sending the original video data and the target video data to the switch;
the switch is used for sending the original video data to a decoding chip, sending the target video data to a video networking server through the video networking interface, and sending the target video data to another video networking terminal according to a downlink communication link configured for the other video networking terminal;
the decoding chip is used for generating characteristic video data according to the received original video data and sending the characteristic video data to the projector;
and the projector is used for projecting the characteristic video data.
Optionally, the encoding chip has a data input interface, and the peripheral input device is accessed to the data input interface;
the encoding chip is also used for receiving original video data and/or original audio data from the peripheral input equipment through the data input interface;
the encoding chip is also used for encoding the original audio data into target audio data and sending the original audio data and the target audio data to the switch;
and the switch is also used for sending the original audio data to a decoding chip, sending the target audio data to a video networking server through the video networking interface, and sending the target audio data to another video networking terminal according to a downlink communication link configured for the other video networking terminal.
Optionally, the decoding chip has a data output interface, the peripheral output device is connected to the data output interface, and the video networking terminal is further provided with a communication chip;
the decoding chip is also used for generating characteristic audio data according to the received original audio data;
the decoding chip is also used for sending the characteristic video data and/or the characteristic audio data to peripheral output equipment through the data output interface for playing;
or,
the decoding chip is also used for sending the characteristic audio data to the communication chip;
and the communication chip is also used for sending the characteristic audio data to a connected audio player for playing.
Optionally, the video networking terminal is further provided with a communication chip;
the switch is also used for sending the received target video data and/or target audio data to the decoding chip;
the decoding chip is also used for sending the target video data and/or the target audio data to a communication chip;
and the communication chip is also used for sending the target video data and/or the target audio data to a storage server positioned in an IP network through a mobile network so as to forward the data to another video network terminal.
Optionally, the switch is further configured to receive, through the video networking interface, target video data and/or target audio data of another video networking terminal, which is sent to the video networking terminal by the video networking server according to the downlink communication link configured for the video networking terminal;
the switch is also used for sending the target video data and/or the target audio data to a decoding chip;
the decoding chip is further used for decoding the target video data into original video data and/or decoding the target audio data into original audio data.
Optionally, the video networking terminal is further provided with a communication chip;
the communication chip is also used for receiving, through a mobile network, target video data and/or target audio data of another video networking terminal that has undergone protocol conversion from the video network to the IP network;
and the communication chip is also used for sending the target video data and/or the target audio data to a decoding chip.
The embodiment of the invention has the following advantages:
in a video networking terminal, the camera collects original video data and sends it to the encoding chip; the encoding chip encodes the original video data into target video data and sends both to the switch; the switch sends the original video data to the decoding chip and sends the target video data to the video networking server through the video networking interface, so that it can be forwarded to another video networking terminal according to the downlink communication link configured for that terminal; the decoding chip generates characteristic video data from the received original video data and sends it to the projector; and the projector projects the characteristic video data. Because the video networking terminal integrates the camera and the projector, service requirements such as video conferencing can be met within a single terminal, which reduces the number of connected devices, simplifies operation, lowers the probability of error, and reduces cost.
Drawings
FIG. 1 is a networking diagram of a video network, according to one embodiment of the invention;
FIG. 2 is a diagram illustrating a hardware architecture of a node server according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a hardware structure of an access switch according to an embodiment of the present invention;
fig. 4 is a schematic hardware structure diagram of an ethernet protocol conversion gateway according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a video network terminal according to an embodiment of the present invention;
FIG. 6 is a flow chart illustrating steps of a method for processing multimedia data according to an embodiment of the present invention;
FIG. 7 is a flow chart of steps of another method of processing multimedia data in accordance with one embodiment of the present invention;
fig. 8 is a flowchart illustrating steps of a method for processing multimedia data according to another embodiment of the present invention;
fig. 9 is a flowchart illustrating steps of a method for processing multimedia data according to an embodiment of the present invention;
fig. 10 is a block diagram of a terminal of a video network according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
The video network is an important milestone in network development. It is a real-time network that enables real-time transmission of high-definition video and pushes many Internet applications toward high-definition, face-to-face interaction.
The video network adopts real-time high-definition video switching technology and can integrate dozens of required services, such as video, voice, pictures, text, communication, and data, on one network platform: high-definition video conferencing, video surveillance, intelligent monitoring and analysis, emergency command, digital broadcast television, delayed television, network teaching, live broadcasting, VOD on demand, television mail, personal video recording (PVR), intranet (self-run) channels, intelligent video broadcast control, information distribution, and so on, delivering high-definition-quality video through a television or a computer.
To better understand the embodiments of the present invention, the video network is described below:
some of the technologies applied in the video networking are as follows:
network Technology (Network Technology)
The network technology innovation of the video network improves upon traditional Ethernet to handle the potentially enormous video traffic on the network. Unlike pure packet switching or pure circuit switching, the video networking technology employs packet switching while meeting streaming requirements. It retains the flexibility, simplicity, and low cost of packet switching while providing the quality and security guarantees of circuit switching, achieving seamless, switched virtual circuits and seamless data-format connection across the whole network.
Switching Technology (Switching Technology)
The video network takes advantage of the asynchrony and packet switching of Ethernet while eliminating Ethernet's shortcomings under the premise of full compatibility. It offers end-to-end seamless connectivity across the whole network, communicates directly with user terminals, and directly carries IP data packets. User data requires no format conversion anywhere in the network. The video network is a higher-level form of Ethernet: a real-time switching platform that enables whole-network, large-scale, real-time transmission of high-definition video, which the existing Internet cannot achieve, and pushes many network video applications toward high definition and unification.
Server Technology (Server Technology)
Server technology on the video networking unified video platform differs from traditional server technology: its streaming media transmission is built on a connection-oriented basis, its data processing capability is independent of traffic and communication time, and a single network layer can carry both signaling and data transmission. For voice and video services, streaming media processing on the video networking unified video platform is much simpler than general data processing, and efficiency is improved by more than a hundred times over a traditional server.
Storage Technology (Storage Technology)
The ultra-high-speed storage technology of the unified video platform adopts the most advanced real-time operating system in order to handle very large media content and very large traffic. Program information in server instructions is mapped to specific hard-disk space, and media content no longer passes through the server but is sent directly and instantly to the user terminal, with a typical user waiting time of less than 0.2 seconds. Optimized sector distribution greatly reduces the mechanical seek movement of the hard-disk head; resource consumption is only 20% of that of an IP Internet system of the same class, yet concurrent throughput is three times that of a traditional hard-disk array, improving overall efficiency by more than ten times.
Network Security Technology (Network Security Technology)
The structural design of the video network eliminates the network security problems that plague the Internet by means such as independent authorization of each service and complete isolation of devices and user data. It generally requires no antivirus software or firewall, avoids hacker and virus attacks, and provides users with a structurally worry-free, secure network.
Service Innovation Technology (Service Innovation Technology)
The unified video platform integrates services with transmission: whether for a single user, a private-network user, or an entire network aggregate, connection is established automatically. User terminals, set-top boxes, or PCs connect directly to the unified video platform to obtain a wide variety of multimedia video services. The unified video platform replaces traditional complex application programming with a menu-style configuration table, so complex applications can be realized with very little code and unlimited new services can be created.
Networking of the video network is as follows:
the video network is a centralized control network structure, and the network can be a tree network, a star network, a ring network and the like, but on the basis of the centralized control node, the whole network is controlled by the centralized control node in the network.
As shown in fig. 1, the video network is divided into an access network and a metropolitan network.
The devices of the access network part can be mainly classified into 3 types: node server, access switch, terminal (including various set-top boxes, coding boards, memories, etc.). The node server is connected to an access switch, which may be connected to a plurality of terminals and may be connected to an ethernet network.
The node server is a node which plays a centralized control function in the access network and can control the access switch and the terminal. The node server can be directly connected with the access switch or directly connected with the terminal.
Similarly, devices of the metropolitan network portion may also be classified into 3 types: a metropolitan area server, a node switch and a node server. The metro server is connected to a node switch, which may be connected to a plurality of node servers.
The node server here is the same node server as in the access network part; that is, the node server belongs to both the access network and the metropolitan area network.
The metropolitan area server is a node which plays a centralized control function in the metropolitan area network and can control a node switch and a node server. The metropolitan area server can be directly connected with the node switch or directly connected with the node server.
Therefore, the whole video network is a network structure with layered centralized control, and the network controlled by the node server and the metropolitan area server can be in various structures such as tree, star and ring.
The access network part can form a unified video platform (the part in the dotted circle), and a plurality of unified video platforms can form a video network; each unified video platform may be interconnected via metropolitan area and wide area video networking.
1. Video networking device classification
1.1 devices in the video network of the embodiment of the present invention can be mainly classified into 3 types: servers, switches (including ethernet gateways), terminals (including various set-top boxes, code boards, memories, etc.). The video network as a whole can be divided into a metropolitan area network (or national network, global network, etc.) and an access network.
1.2 wherein the devices of the access network part can be mainly classified into 3 types: node servers, access switches (including ethernet gateways), terminals (including various set-top boxes, code boards, memories, etc.).
The specific hardware structure of each access network device is as follows:
a node server:
as shown in fig. 2, the system mainly includes a network interface module 201, a switching engine module 202, a CPU module 203, and a disk array module 204;
the network interface module 201, the CPU module 203, and the disk array module 204 all enter the switching engine module 202; the switching engine module 202 performs an operation of looking up the address table 205 on the incoming packet, thereby obtaining the direction information of the packet; and stores the packet in a queue of the corresponding packet buffer 206 based on the packet's steering information; if the queue of the packet buffer 206 is nearly full, it is discarded; the switching engine module 202 polls all packet buffer queues for forwarding if the following conditions are met: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero. The disk array module 204 mainly implements control over the hard disk, including initialization, read-write, and other operations on the hard disk; the CPU module 203 is mainly responsible for protocol processing with an access switch and a terminal (not shown in the figure), configuring an address table 205 (including a downlink protocol packet address table, an uplink protocol packet address table, and a data packet address table), and configuring the disk array module 204.
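For illustration only, the following Python sketch mimics the forwarding behavior of the node server's switching engine described above: an address-table lookup directs each packet into a per-port buffer queue, a packet is dropped when its queue is nearly full, and polling forwards a packet only when the port send buffer is not full and the queue packet counter is greater than zero. The Port and SwitchingEngine classes, the queue depth, and the packet layout (a dict with a "da" key) are hypothetical stand-ins, not part of the patent.

```python
from collections import deque

QUEUE_LIMIT = 1024   # hypothetical queue depth; the patent does not specify one

class Port:
    """Minimal stand-in for a physical port with a bounded send buffer."""
    def __init__(self, name, buffer_size=64):
        self.name = name
        self.send_buffer = deque(maxlen=buffer_size)

    def send_buffer_full(self):
        return len(self.send_buffer) == self.send_buffer.maxlen

    def send(self, packet):
        self.send_buffer.append(packet)

class SwitchingEngine:
    def __init__(self, address_table, ports):
        self.address_table = address_table          # DA -> port name (address table 205)
        self.ports = ports                          # port name -> Port
        self.queues = {n: deque() for n in ports}   # packet buffer 206: one queue per port

    def receive(self, packet):
        """Look up the packet's direction and store it in the matching queue."""
        port_name = self.address_table.get(packet["da"])
        if port_name is None:
            return                                  # no table entry: drop
        queue = self.queues[port_name]
        if len(queue) >= QUEUE_LIMIT:               # queue nearly full: discard
            return
        queue.append(packet)

    def poll(self):
        """Forward from every queue while 1) the port send buffer is not full
        and 2) the queue packet counter is greater than zero."""
        for name, queue in self.queues.items():
            port = self.ports[name]
            while queue and not port.send_buffer_full():
                port.send(queue.popleft())

engine = SwitchingEngine({"terminal-A": "port0"}, {"port0": Port("port0")})
engine.receive({"da": "terminal-A", "payload": b"..."})
engine.poll()
```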
The access switch:
as shown in fig. 3, the access switch mainly includes a network interface module (a downlink network interface module 301 and an uplink network interface module 302), a switching engine module 303, and a CPU module 304;
wherein a packet (uplink data) arriving from the downlink network interface module 301 enters the packet detection module 305; the packet detection module 305 checks whether the destination address (DA), source address (SA), packet type, and packet length of the packet meet the requirements; if so, it allocates a corresponding stream identifier (stream-id) and passes the packet to the switching engine module 303, otherwise the packet is discarded; a packet (downlink data) arriving from the uplink network interface module 302 enters the switching engine module 303; a packet coming from the CPU module 304 also enters the switching engine module 303; the switching engine module 303 looks up the address table 306 for each incoming packet to obtain its direction information; if the packet entering the switching engine module 303 travels from a downlink network interface to the uplink network interface, it is stored in the queue of the corresponding packet buffer 307 in association with its stream-id; if that queue is nearly full, the packet is discarded; if the packet does not travel from a downlink network interface to the uplink network interface, it is stored in the queue of the corresponding packet buffer 307 according to its direction information; if that queue is nearly full, the packet is discarded.
The switching engine module 303 polls all packet buffer queues, which in this embodiment of the present invention is divided into two cases:
if the queue is from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queued packet counter is greater than zero; 3) obtaining a token generated by a code rate control module;
if the queue is not from the downlink network interface to the uplink network interface, the following conditions are met for forwarding: 1) the port send buffer is not full; 2) the queue packet counter is greater than zero.
The rate control module 308 is configured by the CPU module 304 and, at programmable intervals, generates tokens for all packet buffer queues running from downlink network interfaces to the uplink network interface, in order to control the rate of upstream forwarding.
The CPU module 304 is mainly responsible for protocol processing with the node server, configuration of the address table 306, and configuration of the code rate control module 308.
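As a minimal sketch of the token-gated forwarding described for queues running from the downlink network interfaces to the uplink network interface, the Python below adds the third condition (a token from the rate control module) on top of the two ordinary conditions. The class names, the token interval, and the driving loop are hypothetical.

```python
class RateControlledQueue:
    """Packet buffer queue from a downlink network interface to the uplink
    network interface; forwarding additionally consumes a token."""
    def __init__(self):
        self.packets = []
        self.tokens = 0

    def add_token(self):
        # issued by the rate control module at the programmed interval
        self.tokens += 1

    def try_forward(self, send_buffer_full):
        # forwarding conditions: 1) send buffer not full,
        # 2) packet counter > 0, 3) a token has been obtained
        if send_buffer_full or not self.packets or self.tokens == 0:
            return None
        self.tokens -= 1
        return self.packets.pop(0)

TOKEN_INTERVAL = 10   # hypothetical: one token every 10 polling cycles

def run(queue, cycles):
    forwarded = []
    for cycle in range(cycles):
        if cycle % TOKEN_INTERVAL == 0:
            queue.add_token()
        packet = queue.try_forward(send_buffer_full=False)
        if packet is not None:
            forwarded.append(packet)
    return forwarded

q = RateControlledQueue()
q.packets = [b"pkt-%d" % i for i in range(5)]
print(run(q, 50))   # at most one packet forwarded per token interval
```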
Ethernet protocol conversion gateway:
As shown in fig. 4, the apparatus mainly includes a network interface module (a downlink network interface module 401 and an uplink network interface module 402), a switching engine module 403, a CPU module 404, a packet detection module 405, a rate control module 408, an address table 406, a packet buffer 407, a MAC adding module 409, and a MAC deleting module 410.
Wherein a data packet arriving from the downlink network interface module 401 enters the packet detection module 405; the packet detection module 405 checks whether the Ethernet MAC DA, Ethernet MAC SA, Ethernet length or frame type, video networking destination address DA, video networking source address SA, video networking packet type, and packet length of the packet meet the requirements; if so, a corresponding stream identifier (stream-id) is allocated, the MAC deletion module 410 strips the MAC DA, MAC SA, and length or frame type (2 bytes), and the packet enters the corresponding receive buffer; otherwise the packet is discarded;
the downlink network interface module 401 checks the send buffer of the port; if a packet is present, it obtains the Ethernet MAC DA of the corresponding terminal according to the packet's video networking destination address DA, prepends the Ethernet MAC DA of the terminal, the MAC SA of the Ethernet protocol conversion gateway, and the Ethernet length or frame type, and sends the packet.
The other modules in the ethernet protocol gateway function similarly to the access switch.
A terminal:
the system mainly comprises a network interface module, a service processing module and a CPU module; for example, the set-top box mainly comprises a network interface module, a video and audio coding and decoding engine module and a CPU module; the coding board mainly comprises a network interface module, a video and audio coding engine module and a CPU module; the memory mainly comprises a network interface module, a CPU module and a disk array module.
1.3 Devices of the metropolitan area network part can be mainly classified into 3 types: node server, node switch, and metropolitan area server. The node switch mainly comprises a network interface module, a switching engine module, and a CPU module; the metropolitan area server mainly comprises a network interface module, a switching engine module, and a CPU module.
2. Video networking packet definition
2.1 Access network packet definition
The data packet of the access network mainly comprises the following parts: destination Address (DA), Source Address (SA), reserved bytes, payload (pdu), CRC.
As shown in the following table, the data packet of the access network mainly includes the following parts:
DA | SA | Reserved | Payload | CRC |
wherein:
the Destination Address (DA) is composed of 8 bytes (byte), the first byte represents the type of the data packet (such as various protocol packets, multicast data packets, unicast data packets, etc.), there are 256 possibilities at most, the second byte to the sixth byte are metropolitan area network addresses, and the seventh byte and the eighth byte are access network addresses;
the Source Address (SA) is also composed of 8 bytes (byte), defined as the same as the Destination Address (DA);
the reserved byte consists of 2 bytes;
the payload part has different lengths according to different types of datagrams, and is 64 bytes if the datagram is various types of protocol packets, and is 32+1024 or 1056 bytes if the datagram is a unicast packet, of course, the length is not limited to the above 2 types;
the CRC consists of 4 bytes and is calculated in accordance with the standard ethernet CRC algorithm.
2.2 metropolitan area network packet definition
The topology of the metropolitan area network is a graph, and there may be two or even more connections between two devices; that is, more than two connections may exist between a node switch and a node server, or between two node switches. However, the metropolitan area network address of each metro device is unique. In order to accurately describe the connection relationship between metro devices, the embodiment of the present invention introduces a parameter: a label, which uniquely describes a metropolitan area network device.
In this specification, the definition of the label is similar to that of an MPLS (Multi-Protocol Label Switching) label. Assuming there are two connections between device A and device B, a packet from device A to device B has 2 labels, and a packet from device B to device A likewise has 2 labels. Labels are divided into incoming labels and outgoing labels: assuming the label of a packet entering device A (the incoming label) is 0x0000, the label of the packet leaving device A (the outgoing label) may become 0x0001. The network access process of the metro network is carried out under centralized control; that is, both address allocation and label allocation of the metro network are dominated by the metropolitan area server, and the node switches and node servers passively execute them. This differs from MPLS label allocation, where labels are negotiated between the switch and the server.
As shown in the following table, the data packet of the metro network mainly includes the following parts:
DA | SA | Reserved | Label | Payload | CRC |
Namely Destination Address (DA), Source Address (SA), Reserved bytes (Reserved), label, payload (PDU), and CRC. The format of the label may be defined as follows: the label is 32 bits, with the upper 16 bits reserved and only the lower 16 bits used, and it sits between the reserved bytes and the payload of the packet.
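A companion sketch for the metro-network packet shows the 32-bit label (only the lower 16 bits used) inserted between the reserved bytes and the payload, plus an in-label to out-label swap of the kind the metro server assigns centrally. The byte order, CRC coverage, and helper names are assumptions.

```python
import struct
import zlib

LABEL_OFFSET = 18   # DA(8) + SA(8) + Reserved(2): the label sits just before the payload

def build_metro_packet(da, sa, label, payload):
    """DA(8) | SA(8) | Reserved(2) | Label(4) | Payload | CRC(4); the label is
    32 bits, the upper 16 bits reserved, only the lower 16 bits used."""
    assert len(da) == 8 and len(sa) == 8
    body = da + sa + b"\x00\x00" + struct.pack(">I", label & 0xFFFF) + payload
    return body + struct.pack(">I", zlib.crc32(body) & 0xFFFFFFFF)

def relabel(packet, label_map):
    """Swap the incoming label for the outgoing label assigned by the metro server
    (e.g. in-label 0x0000 -> out-label 0x0001) and recompute the CRC."""
    in_label = struct.unpack(">I", packet[LABEL_OFFSET:LABEL_OFFSET + 4])[0] & 0xFFFF
    out_label = label_map[in_label]
    body = (packet[:LABEL_OFFSET]
            + struct.pack(">I", out_label & 0xFFFF)
            + packet[LABEL_OFFSET + 4:-4])
    return body + struct.pack(">I", zlib.crc32(body) & 0xFFFFFFFF)

pkt = build_metro_packet(bytes(8), bytes(8), 0x0000, b"payload")
print(struct.unpack(">I", relabel(pkt, {0x0000: 0x0001})[18:22])[0])   # 1
```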
Referring to fig. 5, a schematic structural diagram of a video networking terminal according to an embodiment of the present invention is shown.
As shown, the video network terminal 500 is provided with a camera 501, a projector 502, an encoding chip 503, a switch 504, a decoding chip 505, and a communication chip 506.
The camera 501 is connected to the encoding chip 503, the projector 502 is connected to the decoding chip 505, the communication chip 506 is connected to the decoding chip 505, the encoding chip 503 and the decoding chip 505 are connected to the switch 504, the encoding chip 503 is provided with a data input interface 5031, the switch 504 is provided with a video network interface 5041, and the decoding chip 505 is provided with a data output interface 5051.
The camera 501 is configured to capture video data, and is configured with a pan/tilt head, which can rotate.
The projector 502 is configured to project video data, and is provided with a pan/tilt head that can rotate.
The encoding chip 503 is configured to encode video data and audio data.
Taking the HI3516A as an example of the encoding chip 503, the HI3516A acquires one channel of HDMI (High-Definition Multimedia Interface) video data through an HDMI_IN interface, with a processing capability of 1080p@60fps; the acquired video data can further be H.264-encoded inside the HI3516A and then sent to the switch 504 through one port.
The switch 504 is configured to forward the received video data and audio data to the corresponding components.
The video networking interface 5041 is used for communicating with devices in the video network, such as a video networking server.
The decoding chip 505 is configured to decode encoded video data and audio data and to output the decoded data to the corresponding component for processing, such as transmission or playback.
Taking the DM8148 as an example of the decoding chip 505, the DM8148 receives encoded video data sent by the switch 504 through a port, decodes it, and outputs the decoded video data through two HDMI_OUT interfaces to a connected television for display, with a decoding capability of 1080p@60fps.
The communication chip 506 may include a mobile communication chip such as a 4G chip for communicating with the mobile internet, or may include a short-range communication chip such as bluetooth for connecting to a local audio player.
Referring to fig. 6, a flowchart illustrating steps of a method for processing multimedia data according to an embodiment of the present invention is shown, which may specifically include the following steps:
step 601, a camera collects original video data and sends the original video data to a coding chip.
In scenarios such as a video conference, the camera can be turned toward the speaking party according to the user's operation to collect original video data, which it then sends to the encoding chip.
Step 602, an encoding chip encodes the original video data into target video data, and sends the original video data and the target video data to a switch.
The encoding chip performs two kinds of processing on the original video data collected by the camera: it sends the original video data directly to the switch, and it also encodes the original video data into target video data according to an encoding format such as H.264 or H.265 and sends the target video data to the switch.
Meanwhile, the encoding chip may encapsulate the data packet of the target video data according to the 2000 specification of the video networking protocol.
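The dual-path handling in step 602 can be pictured with the short Python sketch below: the same captured frame fans out as an untouched "original" stream and as an encoded "target" stream. The switch, port numbers, and encoder stub are hypothetical, and the 2000-specification encapsulation itself is not reproduced here.

```python
RAW_PORT, ENCODED_PORT = 0, 1          # illustrative port numbers only

class DummySwitch:
    """Stand-in for the on-board switch 504."""
    def __init__(self):
        self.sent = []
    def send(self, port, data):
        self.sent.append((port, data))

def fake_h264_encode(frame):
    """Placeholder for the chip's hardware H.264/H.265 encoder."""
    return b"h264:" + frame

def handle_camera_frame(raw_frame, switch, encode=fake_h264_encode):
    """Fan the captured frame out on the two paths described above."""
    switch.send(RAW_PORT, raw_frame)               # original video data -> decoding chip
    switch.send(ENCODED_PORT, encode(raw_frame))   # target video data -> video networking server

switch = DummySwitch()
handle_camera_frame(b"frame-0", switch)
print(switch.sent)
```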
step 603, the switch sends the original video data to a decoding chip, and sends the target video data to a video networking server through the video networking interface, so as to send the target video data to another video networking terminal according to a downlink communication link configured for the other video networking terminal.
The switch receives two paths of video streams sent by the coding chip, namely original video data and target video data.
For raw video data, the switch may send directly to the decoding chip.
For the target video data, the switch can send the target video data to the video network server through the video network interface according to the protocol of the video network, and the video network server can send the target video data to the video network terminal at the other end for decoding and displaying according to the downlink communication link configured for another video network terminal (such as another video network terminal participating in the video conference).
In practical applications, the video network is a network with centralized control; it includes a master control server and lower-level network devices, the latter including terminals. One of the core concepts of the video network is that the master control server notifies the switching devices to configure tables for the downlink communication link of the current service, and data packets are then transmitted based on the configured tables.
Namely, the communication method in the video network includes:
and the master control server configures the downlink communication link of the current service.
And transmitting the data packet of the current service sent by the source terminal to a target terminal (such as another video network terminal) according to the downlink communication link.
In the embodiment of the present invention, configuring the downlink communication link of the current service includes: and informing the switching equipment related to the downlink communication link of the current service to allocate the table.
Further, transmitting according to the downlink communication link includes: the configured table is consulted, and the switching equipment transmits the received data packet through the corresponding port.
In particular implementations, the services include unicast communication services and multicast communication services. That is, whether for multicast or unicast communication, the core concept of configuring tables and transmitting according to those tables can be used to realize communication in the video network.
As mentioned above, the video network includes an access network portion, in which the master server is a node server and the lower-level network devices include an access switch and a terminal.
For the unicast communication service in the access network, the step of configuring the downlink communication link of the current service by the master server may include the following steps:
and a substep S11, the main control server obtains the downlink communication link information of the current service according to the service request protocol packet initiated by the source terminal, wherein the downlink communication link information includes the downlink communication port information of the main control server and the access switch participating in the current service.
In sub-step S12, the master control server sets, in its internal packet address table, the downlink port to which packets of the current service are directed according to its own downlink communication port information, and sends a port configuration command to the corresponding access switch according to the downlink communication port information of that access switch.
In sub-step S13, the access switch sets the downstream port to which the packet of the current service is directed in its internal packet address table according to the port configuration command.
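A minimal sketch of sub-steps S12 and S13 is given below: the master control server writes the downlink port for the current service into its own packet address table and issues port configuration commands so that each participating access switch does the same. The data structure carrying the link information and all names are hypothetical.

```python
class AccessSwitch:
    """Minimal stand-in for an access switch with an internal packet address table."""
    def __init__(self):
        self.address_table = {}

    def on_port_config_command(self, service_id, downlink_port):
        # sub-step S13: point the service's packets at the configured downlink port
        self.address_table[service_id] = downlink_port

def configure_unicast_downlink(master_address_table, switches, link_info):
    """Sub-steps S11-S12: link_info (hypothetical) carries the downlink port of the
    master control server and of every access switch participating in the service."""
    service_id = link_info["service_id"]
    master_address_table[service_id] = link_info["server_downlink_port"]
    for switch_id, port in link_info["switch_downlink_ports"].items():
        switches[switch_id].on_port_config_command(service_id, port)

switches = {"sw1": AccessSwitch(), "sw2": AccessSwitch()}
master_table = {}
configure_unicast_downlink(master_table, switches, {
    "service_id": 42,
    "server_downlink_port": 3,
    "switch_downlink_ports": {"sw1": 1, "sw2": 7},
})
print(master_table, switches["sw1"].address_table)
```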
For a multicast communication service (e.g., video conference) in the access network, the step of the master server obtaining downlink information of the current service may include the following sub-steps:
in sub-step S21, the main control server obtains a service request protocol packet initiated by the target terminal and applying for the multicast communication service, where the service request protocol packet includes service type information, service content information, and an access network address of the target terminal.
Wherein, the service content information includes a service number.
And a substep S22, the main control server extracts the access network address of the source terminal in a preset content-address mapping table according to the service number.
In the substep of S23, the main control server obtains the multicast address corresponding to the source terminal and distributes the multicast address to the target terminal; and acquiring the communication link information of the current multicast service according to the service type information and the access network addresses of the source terminal and the target terminal.
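The multicast sub-steps S21 to S23 can likewise be sketched as a lookup from the service number to the source terminal's access network address, followed by distribution of the corresponding multicast address to the target. The mapping tables and address strings below are illustrative assumptions only.

```python
class MasterControlServer:
    """Minimal sketch of sub-steps S21-S23 for a multicast service request."""
    def __init__(self, content_address_map, multicast_addresses):
        self.content_address_map = content_address_map    # service number -> source access-network address
        self.multicast_addresses = multicast_addresses    # source address -> multicast address

    def handle_multicast_request(self, request):
        # S21: the protocol packet carries service type, service content (number),
        # and the target terminal's access network address
        service_number = request["service_number"]
        target_address = request["target_access_address"]
        # S22: look up the source terminal's access network address
        source_address = self.content_address_map[service_number]
        # S23: obtain the multicast address for the source, hand it to the target,
        # and derive the communication link from the service type and both addresses
        multicast_address = self.multicast_addresses[source_address]
        link = {"type": request["service_type"], "source": source_address,
                "target": target_address, "multicast": multicast_address}
        return multicast_address, link

server = MasterControlServer({1001: "src-access-addr"}, {"src-access-addr": "mcast-2A"})
print(server.handle_multicast_request({
    "service_type": "video-conference",
    "service_number": 1001,
    "target_access_address": "dst-access-addr",
}))
```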
And step 604, generating characteristic video data by the decoding chip according to the received original video data, and sending the characteristic video data to the projector.
In practical application, the decoding chip may receive one or more paths of original video data, and for these original video data, feature video data may be generated according to a negotiated processing mode and sent to the projector.
Further, the video network terminal may generate a play option, and generate the feature video data according to the play option when receiving a selection operation of a user for a certain play option.
The playing option may be to play one of the original video data (i.e., to select one of the original video data as the feature video data), or may be to play two or more of the original video data (i.e., to combine two or more of the original video data into the feature video data).
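The play-option logic of step 604 can be illustrated with the small sketch below: a single selected path passes through as the characteristic video, while two or more selected paths are composited. The compositor and stream representation are placeholders, not the patent's actual implementation.

```python
def compose_side_by_side(frames):
    """Placeholder compositor: a real terminal would tile the decoded pictures."""
    return b"|".join(frames)

def make_feature_video(raw_streams, play_option):
    """raw_streams: {source id -> frame}; play_option: source ids chosen by the user.
    One selected path passes through unchanged; two or more are composited."""
    chosen = [raw_streams[source] for source in play_option]
    if len(chosen) == 1:
        return chosen[0]
    return compose_side_by_side(chosen)

frames = {"camera": b"cam-frame", "pc": b"pc-frame"}
print(make_feature_video(frames, ["camera"]))          # single-path characteristic video
print(make_feature_video(frames, ["camera", "pc"]))    # composited characteristic video
```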
Step 605, the projector projects the characteristic video data.
After the video networking terminal is powered on, the projector performs automatic detection. Once detection is complete, the camera sweeps its surroundings to collect image data; the collected image data is analyzed to obtain the orientation of the projection screen, and the projector is guided to project the received characteristic video data in that direction, onto the screen.
Further, because the projection screen has obvious features, such as a rectangular white area, feature data of the screen can be trained in advance and then detected in the collected image data.
If the feature data is detected, the orientation of the curtain is determined based on the image data to which the feature data belongs, and the projector is rotated according to the orientation to face the curtain.
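One possible way to detect the bright rectangular screen area and derive a pan offset for the projector is sketched below using OpenCV (an assumed library not named in the patent; the OpenCV 4 findContours signature, the threshold, the area cutoff, and the field-of-view value are all assumptions). The patent's trained feature detection could be substituted for this crude brightness-and-shape test.

```python
import cv2
import numpy as np

def find_screen_center(image_bgr):
    """Return the pixel centre of the largest bright rectangular region, or None."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)   # white screen is bright
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    best = None
    for contour in contours:
        approx = cv2.approxPolyDP(contour, 0.02 * cv2.arcLength(contour, True), True)
        if len(approx) == 4 and cv2.contourArea(contour) > 10000:  # roughly rectangular, large
            if best is None or cv2.contourArea(contour) > cv2.contourArea(best):
                best = contour
    if best is None:
        return None
    x, y, w, h = cv2.boundingRect(best)
    return (x + w // 2, y + h // 2)

def pan_offset_degrees(center_x, frame_width, horizontal_fov=60.0):
    """Map the detected centre to a pan offset for the projector's pan/tilt head."""
    return (center_x - frame_width / 2) / frame_width * horizontal_fov

frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame[100:300, 200:500] = 255          # synthetic white "screen" for the example
center = find_screen_center(frame)
if center is not None:
    print(center, pan_offset_degrees(center[0], 640))
```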
In a video networking terminal, the camera collects original video data and sends it to the encoding chip; the encoding chip encodes the original video data into target video data and sends both to the switch; the switch sends the original video data to the decoding chip and sends the target video data to the video networking server through the video networking interface, so that it can be forwarded to another video networking terminal according to the downlink communication link configured for that terminal; the decoding chip generates characteristic video data from the received original video data and sends it to the projector; and the projector projects the characteristic video data. Because the video networking terminal integrates the camera and the projector, service requirements such as video conferencing can be met within a single terminal, which reduces the number of connected devices, simplifies operation, lowers the probability of error, and reduces cost.
Referring to fig. 7, a flowchart illustrating steps of another method for processing multimedia data according to an embodiment of the present invention is shown, which may specifically include the following steps:
step 701, the encoding chip receives original video data and/or original audio data from the peripheral input device through the data input interface.
In the embodiment of the invention, the video network terminal can be accessed to other peripheral input devices, such as a microphone, a personal computer, a mobile terminal and the like, so that a user can play documents, presentations and the like in scenes such as video conferences conveniently.
At this time, the encoding chip receives original video data and/or original audio data from the external input device through the data input interface.
Step 702, an encoding chip encodes the original audio data into target video data, and sends the original audio data and the target audio data to a switch.
The encoding chip performs two kinds of processing on the original audio data from the peripheral input device: it transmits the original audio data directly to the switch, and it also encodes the original audio data into target audio data according to an encoding format such as AAC (Advanced Audio Coding) and transmits the target audio data to the switch.
Meanwhile, the encoding chip may encapsulate the data packet of the target audio data according to the 2001 specification of the video networking protocol.
and 703, the switch sends the original audio data to a decoding chip, and sends the target audio data to a video networking server through the video networking interface so as to send the target audio data to another video networking terminal according to a downlink communication link configured for the other video networking terminal.
The exchanger receives two paths of audio streams sent by the coding chip, namely original audio data and target audio data.
For raw audio data, the switch may send directly to the decoding chip.
For the target audio data, the switch can send it to the video networking server through the video networking interface according to the video networking protocol, and the video networking server can send the target audio data to the video networking terminal at the other end for decoding and playback according to the downlink communication link configured for that other terminal (such as another video networking terminal participating in the video conference).
Step 704, the decoding chip generates characteristic audio data according to the received original audio data.
In practical application, the decoding chip may receive one or more paths of original audio data, and for these original audio data, the characteristic audio data may be generated according to a negotiated processing mode.
Further, the video network terminal may generate a play option, and generate the characteristic audio data according to the play option when receiving a selection operation of a user for a certain play option.
The playing option may be to play one of the two or more paths of original audio data (i.e., to select one of the two or more paths of original audio data as the characteristic audio data), or may be to play two or more paths of original audio data (i.e., to synthesize two or more paths of original audio data into the characteristic audio data).
Step 705, the decoding chip sends the characteristic video data and/or the characteristic audio data to an external output device through the data output interface for playing.
In the embodiment of the present invention, the video network terminal may access other peripheral output devices, for example, a personal computer, a mobile terminal, a television, and the like, and the decoding chip may send the feature video data and/or the feature audio data to the peripheral output devices through the data output interface for playing.
And step 706, the decoding chip sends the characteristic audio data to the communication chip.
And step 707, the communication chip sends the characteristic audio data to a connected audio player for playing.
In the embodiment of the present invention, the communication chip (e.g., a short-range communication chip) may be connected to a local audio player, such as a bluetooth speaker.
The decoding chip may send the characteristic audio data to a communication chip, and the communication chip (e.g., a short-range communication chip) may send the characteristic audio data to a local audio player, which plays the characteristic audio data.
Referring to fig. 8, a flowchart illustrating steps of a method for processing multimedia data according to another embodiment of the present invention is shown, which may specifically include the following steps:
in step 801, the switch sends the received target video data and/or target audio data to a decoding chip.
Step 802, the decoding chip sends the target video data and/or the target audio data to a communication chip.
And step 803, the communication chip sends the target video data and/or the target audio data to a storage server located in an IP network through a mobile network so as to forward the target video data and/or the target audio data to another video network terminal.
In the embodiment of the invention, if the switch fails to send the target video data and/or the target audio data through the video networking interface, the coding chip can package the target video data and/or the target audio data into a data packet according to the protocol of the IP network and send the data packet to the switch, the switch sends the target video data and/or the target audio data to the decoding chip, and the decoding chip sends the target video data and/or the target audio data to the communication chip.
The communication chip (e.g., mobile communication chip) transmits the target video data and/or the target audio data to a storage server located in an IP network through a mobile network.
The storage server sends the target video data and/or the target audio data to the protocol conversion server, the protocol conversion server carries out protocol conversion from an IP network to a video network on the target video data and/or the target audio data, and sends the target video data and/or the target audio data after the protocol conversion to the video network server, and the video network server can send the target video data and/or the target audio data to the video network terminal at the other end for decoding, displaying/playing according to a downlink communication link configured for another video network terminal (such as another video network terminal participating in a video conference) so as to ensure normal operation of services such as the video conference.
In addition, the video networking terminal at the other end may also directly receive the target video data and/or target audio data from the storage server through its communication chip (such as a mobile communication chip).
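The fallback path described above, preferring the video networking interface and falling back to the mobile network and the storage server, can be summarized with the following hedged sketch; the interface classes, the upload call, and the storage server URL are hypothetical placeholders.

```python
class VideoNetworkingInterface:
    """Stand-in for the switch's video networking interface 5041."""
    def __init__(self, available=True):
        self.available = available
    def send(self, packet):
        if not self.available:
            raise IOError("video networking link unavailable")

class MobileCommunicationChip:
    """Stand-in for the 4G communication chip uploading to the storage server."""
    def __init__(self):
        self.uploaded = []
    def upload(self, url, body):
        self.uploaded.append((url, body))

STORAGE_SERVER_URL = "http://storage.example/upload"   # hypothetical address

def send_target_stream(packet, vn_interface, mobile_chip):
    """Prefer the video networking path; on failure, hand the data to the
    communication chip, which uploads it via the mobile network to a storage
    server in the IP network for onward forwarding."""
    try:
        vn_interface.send(packet)
        return "video-networking"
    except IOError:
        mobile_chip.upload(STORAGE_SERVER_URL, packet)
        return "mobile-fallback"

print(send_target_stream(b"target-av", VideoNetworkingInterface(available=False),
                         MobileCommunicationChip()))
```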
Referring to fig. 9, a flowchart illustrating steps of a method for processing multimedia data according to another embodiment of the present invention is shown, which may specifically include the following steps:
step 901, the switch receives, through the video networking interface, target video data and/or target audio data of another video networking terminal, which is sent to the video networking terminal by the video networking server according to the downlink communication link configured for the video networking terminal.
And step 902, the switch sends the target video data and/or the target audio data to a decoding chip.
In scenarios such as a video conference, the video networking terminal at the other end collects video data and/or audio data, encodes them, and sends the encoded target video data and/or target audio data to the video networking server; the video networking server then sends the data to the local video networking terminal.
The switch receives the target video data and/or the target audio data coded by the video networking terminal at the other end through the video networking interface.
In addition, if the video networking terminal at the other end fails to transmit the target video data and/or the target audio data in the video network, it transmits them to a storage server located in the IP network.
The storage server sends the target video data and/or the target audio data to the protocol conversion server, which converts them from the IP network protocol to the video networking protocol and forwards the converted data to the video networking server. The video networking server can then send the target video data and/or the target audio data, according to the downlink communication link configured for the local video networking terminal, to the local terminal for decoding and display/playback, so that services such as the video conference keep running normally.
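The sketch below illustrates, in very reduced form, the server-side relay just described; the 4-byte length header and the "V2V" tag are inventions of this example rather than the actual video networking protocol, and the function names are assumptions.

```python
# A minimal sketch, under assumed framing, of storage server -> protocol conversion
# server -> video networking server -> local terminal.

def ip_to_v2v(ip_packet: bytes) -> bytes:
    """Protocol conversion server: strip the assumed IP framing and re-tag the payload."""
    payload = ip_packet[4:]           # drop the example 4-byte length header
    return b"V2V" + payload

def relay_to_local_terminal(ip_packet: bytes) -> bytes:
    """Return the frame the video networking server would deliver along the configured downlink."""
    v2v_frame = ip_to_v2v(ip_packet)  # performed on the protocol conversion server
    return v2v_frame

if __name__ == "__main__":
    packet = (16).to_bytes(4, "big") + b"\x01" * 16
    print(relay_to_local_terminal(packet))
```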
And the switch sends the received target video data and/or target audio data coded by the video networking terminal at the other end to the decoding chip.
Step 903, the communication chip receives, from a storage server located in the IP network through the mobile network, target video data and/or target audio data of another video networking terminal after the video networking server has performed protocol conversion from the video network to the IP network.
Step 904, the communication chip sends the target video data and/or the target audio data to a decoding chip.
In the embodiment of the invention, if the video networking terminal at the other end fails to send the target video data and/or the target audio data in the video network, it sends them to the storage server located in the IP network.
The communication chip (such as a mobile communication chip) of the video network terminal at the local end can directly receive the target video data and/or the target audio data of the video network terminal at the other end from the storage server, and send the received target video data and/or the received target audio data coded by the video network terminal at the other end to the decoding chip.
Step 905, the decoding chip decodes the target video data into original video data and/or decodes the target audio data into original audio data.
The decoding chip decodes the target video data and/or target audio data encoded by the video networking terminal at the other end into original video data and/or original audio data according to coding formats such as H.264, H.265, and AAC (Advanced Audio Coding); subsequently, a projector, a peripheral output device, a local audio player, or the like can be selected for display/playback.
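As a non-authoritative illustration of dispatching step 905 by coding format, the sketch below routes incoming target data to per-codec decoders; the decoder functions are placeholders (a real terminal would use its hardware decoder or a library such as FFmpeg), and all names are assumptions of this example.

```python
# Minimal dispatch sketch for step 905; decoder back-ends are placeholders only.

def decode_h264(data: bytes) -> bytes:  # placeholder for the hardware/software video decoder
    return data

def decode_h265(data: bytes) -> bytes:  # placeholder for the hardware/software video decoder
    return data

def decode_aac(data: bytes) -> bytes:   # placeholder for the audio decoder
    return data

DECODERS = {"h264": decode_h264, "h265": decode_h265, "aac": decode_aac}

def decode_target_data(codec: str, target_data: bytes) -> bytes:
    """Decode target video/audio data back into original (raw) data according to its coding format."""
    try:
        return DECODERS[codec.lower()](target_data)
    except KeyError:
        raise ValueError(f"unsupported coding format: {codec}")

if __name__ == "__main__":
    print(len(decode_target_data("H264", b"\x00" * 128)))
```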
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 10, a block diagram of a video networking terminal according to an embodiment of the present invention is shown. The video networking terminal is located in a video network and is provided with a camera 1001, a projector 1002, an encoding chip 1003, a switch 1004, and a decoding chip 1005; the camera 1001 is connected to the encoding chip 1003, the projector 1002 is connected to the decoding chip 1005, the encoding chip 1003 and the decoding chip 1005 are connected to the switch 1004, and the switch 1004 is provided with a video networking interface;
the video camera 1001 is used for acquiring original video data and sending the original video data to the encoding chip 1003;
the encoding chip 1003 is configured to encode the original video data into target video data, and send the original video data and the target video data to the switch 1004;
the switch 1004 is configured to send the original video data to the decoding chip 1005 and send the target video data to a video networking server through the video networking interface, so that the target video data is sent to another video networking terminal according to a downlink communication link configured for the other video networking terminal;
the decoding chip 1005 is configured to generate characteristic video data according to the received original video data, and send the characteristic video data to the projector 1002;
a projector 1002, configured to project the characteristic video data.
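The sketch below models the fig. 10 data flow (camera to encoding chip, switch fan-out to the decoding chip/projector on one side and the video networking uplink on the other). It is only an illustrative Python mock-up: the classes, the "H264" tag, and the print statements are all assumptions of the example, not the terminal's real components or encoding.

```python
# Illustrative model of the integrated terminal's data flow; every name is invented.

class Projector:
    def project(self, frame: bytes) -> None:
        print(f"projector: {len(frame)} bytes on screen")

class V2VInterface:
    def send(self, target: bytes) -> None:
        print(f"v2v uplink: {len(target)} bytes to the video networking server")

class DecodingChip:
    def __init__(self, projector: Projector):
        self.projector = projector
    def receive_original(self, original: bytes) -> None:
        characteristic = original              # e.g. scaled to the projector's resolution
        self.projector.project(characteristic)

class Switch:
    def __init__(self, decoder: DecodingChip, v2v: V2VInterface):
        self.decoder, self.v2v = decoder, v2v
    def route(self, original: bytes, target: bytes) -> None:
        self.decoder.receive_original(original)  # local display path
        self.v2v.send(target)                    # uplink path

class EncodingChip:
    def __init__(self, switch: Switch):
        self.switch = switch
    def encode(self, original: bytes) -> None:
        target = b"H264" + original              # stand-in for real H.264 encoding
        self.switch.route(original, target)

if __name__ == "__main__":
    switch = Switch(DecodingChip(Projector()), V2VInterface())
    EncodingChip(switch).encode(b"\x00" * 2048)  # one camera frame through the terminal
```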
In an embodiment of the present invention, the encoding chip 1003 has a data input interface, and a peripheral input device is connected to the data input interface;
the encoding chip 1003 is further configured to receive original video data and/or original audio data from the peripheral input device through the data input interface;
the encoding chip 1003 is further configured to encode the original audio data into target audio data, and send the original audio data and the target audio data to the switch 1004;
the switch 1004 is further configured to send the original audio data to the decoding chip 1005, and send the target audio data to the video network server through the video network interface, so as to send the target audio data to another video network terminal according to a downlink communication link configured for the other video network terminal.
In an embodiment of the present invention, the decoding chip 1005 has a data output interface, a peripheral output device is connected to the data output interface, and the video networking terminal is further provided with a communication chip;
the decoding chip 1005 is further configured to generate characteristic audio data according to the received original audio data;
the decoding chip 1005 is further configured to send the characteristic video data and/or the characteristic audio data to a peripheral output device through the data output interface for playing;
or,
the decoding chip 1005 is further configured to send the characteristic audio data to a communication chip;
and the communication chip is also used for sending the characteristic audio data to a connected audio player for playing.
In one embodiment of the invention, the video network terminal is further provided with a communication chip;
the switch 1004 is further configured to send the received target video data and/or target audio data to the decoding chip 1005;
the decoding chip 1005 is further configured to send the target video data and/or the target audio data to a communication chip;
and the communication chip is also used for sending the target video data and/or the target audio data to a storage server positioned in an IP network through a mobile network so as to forward the data to another video network terminal.
In an embodiment of the present invention, the switch 1004 is further configured to receive, through the video networking interface, target video data and/or target audio data of another video networking terminal, which is sent by the video networking server to the video networking terminal according to the configured downlink communication link for the video networking terminal;
the switch 1004 is further configured to send the target video data and/or the target audio data to a decoding chip 1005;
the decoding chip 1005 is further configured to decode the target video data into original video data, and/or decode the target audio data into original audio data.
In one embodiment of the invention, the video network terminal is further provided with a communication chip;
the communication chip is further configured to receive, through a mobile network, target video data and/or target audio data of another video networking terminal after the video networking server has performed protocol conversion from the video network to the IP network;
the communication chip is further configured to send the target video data and/or the target audio data to the decoding chip 1005.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
In a video networking terminal, the camera collects original video data and sends it to the encoding chip; the encoding chip encodes the original video data into target video data and sends both the original video data and the target video data to the switch; the switch sends the original video data to the decoding chip and sends the target video data to the video networking server through the video networking interface, so that it can be sent to another video networking terminal according to the downlink communication link configured for that terminal; the decoding chip generates characteristic video data from the received original video data and sends it to the projector, which projects the characteristic video data. Because the video networking terminal integrates the camera and the projector, service requirements such as video conferencing can be met by a single terminal, which reduces the amount of connected equipment, improves the simplicity and convenience of operation, lowers the probability of error, and cuts cost.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The foregoing describes in detail the multimedia data processing method and the video networking terminal provided by the present invention. Specific examples are used herein to explain the principle and implementation of the present invention, and the description of the foregoing embodiments is only intended to help understand the method and its core idea. Meanwhile, for a person skilled in the art, there may be variations in the specific embodiments and the application scope according to the idea of the present invention. In summary, the content of this specification should not be construed as a limitation of the present invention.
Claims (10)
1. A processing method of multimedia data is characterized in that a video network terminal located in a video network is provided with a camera, a projector, an encoding chip, a switch and a decoding chip, wherein the camera is connected to the encoding chip, the projector is connected to the decoding chip, the encoding chip and the decoding chip are connected to the switch, and the switch is provided with a video network interface;
the method comprises the following steps:
the method comprises the steps that a camera collects original video data and sends the original video data to a coding chip;
the encoding chip encodes the original video data into target video data and sends the original video data and the target video data to the switch;
the switch sends the original video data to a decoding chip, and sends the target video data to a video networking server through the video networking interface so as to send the target video data to another video networking terminal according to a downlink communication link configured for the other video networking terminal;
the decoding chip generates characteristic video data according to the received original video data and sends the characteristic video data to the projector;
and projecting the characteristic video data by a projector.
2. The method of claim 1, wherein the encoding chip has a data input interface, and wherein a peripheral input device is coupled to the data input interface, the method further comprising:
the encoding chip receives original video data and/or original audio data from the peripheral input equipment through the data input interface;
the encoding chip encodes the original audio data into target audio data and sends the original audio data and the target audio data to the switch;
and the switch sends the original audio data to a decoding chip, and sends the target audio data to a video networking server through the video networking interface so as to send the target audio data to another video networking terminal according to a downlink communication link configured for the other video networking terminal.
3. The method according to claim 1 or 2, wherein the decoding chip has a data output interface, the data output interface is accessed by a peripheral output device, the video network terminal is further provided with a communication chip, and the method further comprises:
the decoding chip generates characteristic audio data according to the received original audio data;
the decoding chip sends the characteristic video data and/or the characteristic audio data to peripheral output equipment through the data output interface for playing;
or,
the decoding chip sends the characteristic audio data to a communication chip;
and the communication chip sends the characteristic audio data to a connected audio player for playing.
4. The method of claim 1, wherein the video networking terminal is further provided with a communication chip, the method further comprising:
the switch sends the received target video data and/or target audio data to a decoding chip;
the decoding chip sends the target video data and/or the target audio data to a communication chip;
and the communication chip sends the target video data and/or the target audio data to a storage server positioned in an IP network through a mobile network so as to forward the target video data and/or the target audio data to another video network terminal.
5. The method according to any one of claims 1-4, further comprising:
the switch receives target video data and/or target audio data of another video networking terminal, which are sent to the video networking terminal by the video networking server according to a downlink communication link configured for the video networking terminal, through the video networking interface;
the switch sends the target video data and/or the target audio data to a decoding chip;
and the decoding chip decodes the target video data into original video data and/or decodes the target audio data into original audio data.
6. The method of claim 5, wherein the video networking terminal is further provided with a communication chip, the method further comprising:
the communication chip receives, through a storage server located in the IP network, target video data and/or target audio data of another video networking terminal after the video networking server has performed protocol conversion from the video network to the IP network;
and the communication chip sends the target video data and/or the target audio data to a decoding chip.
7. A video networking terminal, characterized in that the video networking terminal is located in a video network and is provided with a camera, a projector, an encoding chip, a switch and a decoding chip, wherein the camera is connected to the encoding chip, the projector is connected to the decoding chip, the encoding chip and the decoding chip are connected to the switch, and the switch is provided with a video networking interface;
the camera is used for acquiring original video data and sending the original video data to the coding chip;
the encoding chip is used for encoding the original video data into target video data and sending the original video data and the target video data to the switch;
the switch is used for sending the original video data to a decoding chip, sending the target video data to a video networking server through the video networking interface, and sending the target video data to another video networking terminal according to a downlink communication link configured for the other video networking terminal;
the decoding chip is used for generating characteristic video data according to the received original video data and sending the characteristic video data to the projector;
and the projector is used for projecting the characteristic video data.
8. The video networking terminal of claim 7, wherein the encoding chip has a data input interface, and the data input interface is accessed by a peripheral input device;
the encoding chip is also used for receiving original video data and/or original audio data from the peripheral input equipment through the data input interface;
the encoding chip is also used for encoding the original audio data into target audio data and sending the original audio data and the target audio data to the switch;
and the switch is also used for sending the original audio data to a decoding chip, sending the target audio data to a video networking server through the video networking interface, and sending the target audio data to another video networking terminal according to a downlink communication link configured for the other video networking terminal.
9. The video networking terminal according to claim 7 or 8, wherein the decoding chip has a data output interface, the data output interface is accessed by an external output device, and the video networking terminal is further provided with a communication chip;
the decoding chip is also used for generating characteristic audio data according to the received original audio data;
the decoding chip is also used for sending the characteristic video data and/or the characteristic audio data to peripheral output equipment through the data output interface for playing;
or,
the decoding chip is also used for sending the characteristic audio data to the communication chip;
and the communication chip is also used for sending the characteristic audio data to a connected audio player for playing.
10. The video networking terminal of claim 7, wherein the video networking terminal is further provided with a communication chip;
the switch is also used for sending the received target video data and/or target audio data to the decoding chip;
the decoding chip is also used for sending the target video data and/or the target audio data to a communication chip;
and the communication chip is also used for sending the target video data and/or the target audio data to a storage server positioned in an IP network through a mobile network so as to forward the data to another video network terminal.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810443414.XA CN110475089B (en) | 2018-05-10 | 2018-05-10 | Multimedia data processing method and video networking terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810443414.XA CN110475089B (en) | 2018-05-10 | 2018-05-10 | Multimedia data processing method and video networking terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110475089A true CN110475089A (en) | 2019-11-19 |
CN110475089B CN110475089B (en) | 2021-10-19 |
Family
ID=68503928
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810443414.XA Active CN110475089B (en) | 2018-05-10 | 2018-05-10 | Multimedia data processing method and video networking terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110475089B (en) |
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140085402A1 (en) * | 2012-09-21 | 2014-03-27 | Hon Hai Precision Industry Co., Ltd. | Conference terminal and method for processing videos from other conference terminals |
CN104184983A (en) * | 2013-05-21 | 2014-12-03 | 中兴通讯股份有限公司 | Conference terminal and service processing method thereof |
CN204090061U (en) * | 2014-10-23 | 2015-01-07 | 长沙楚风数码科技有限公司 | A Distributed Signal Switching and Remote Control System |
CN106341563A (en) * | 2015-07-06 | 2017-01-18 | 北京视联动力国际信息技术有限公司 | Terminal communication based echo suppression method and device |
CN107959818A (en) * | 2016-10-17 | 2018-04-24 | 北京视联动力国际信息技术有限公司 | The data processing method of integrated terminal and integrated terminal |
CN107968928A (en) * | 2016-10-19 | 2018-04-27 | 北京视联动力国际信息技术有限公司 | A kind of method and apparatus of terminal communication |
CN107979563A (en) * | 2016-10-21 | 2018-05-01 | 北京视联动力国际信息技术有限公司 | A kind of information processing method and device based on regarding networking |
CN107995069A (en) * | 2016-10-26 | 2018-05-04 | 北京视联动力国际信息技术有限公司 | A kind of method and apparatus of terminal video push |
Non-Patent Citations (2)
Title |
---|
PAN Jia-min et al.: "Construction of Extensible and Compatible Network Video Conference System", Video Engineering *
WANG Yanhui et al.: "Implementation of the '视联天眼' Integrated Video Conference Platform", Cable TV Technology *
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022143059A1 (en) * | 2020-12-31 | 2022-07-07 | ZTE Corporation | Terminal and terminal application method |
Also Published As
Publication number | Publication date |
---|---|
CN110475089B (en) | 2021-10-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106549912B (en) | A kind of playback method and system of video data | |
CN106341515B (en) | A kind of monitoring method and device of terminal | |
CN109640029B (en) | Method and device for displaying video stream on wall | |
CN108737768B (en) | Monitoring method and monitoring device based on monitoring system | |
CN109640028B (en) | Method and device for carrying out conference combining on multiple video networking terminals and multiple Internet terminals | |
CN110460804B (en) | Conference data transmitting method, system, device and computer readable storage medium | |
CN109120879B (en) | Video conference processing method and system | |
CN109547163B (en) | Method and device for controlling data transmission rate | |
CN110049273B (en) | Video networking-based conference recording method and transfer server | |
CN108809921B (en) | Audio processing method, video networking server and video networking terminal | |
CN111131743A (en) | Video call method and device based on browser, electronic equipment and storage medium | |
CN108574816B (en) | Video networking terminal and communication method and device based on video networking terminal | |
CN110769297A (en) | Audio and video data processing method and system | |
CN110149305B (en) | A method and transfer server for multi-party playing audio and video based on video networking | |
CN109905616B (en) | Method and device for switching video pictures | |
CN110087147B (en) | Audio and video stream transmission method and device | |
CN110769179B (en) | Audio and video data stream processing method and system | |
CN109743284B (en) | Video processing method and system based on video network | |
CN110611639A (en) | Audio data processing method and device for streaming media conference | |
CN110049069B (en) | Data acquisition method and device | |
CN110475089B (en) | Multimedia data processing method and video networking terminal | |
CN110661749A (en) | Video signal processing method and video networking terminal | |
CN110149306B (en) | Media data processing method and device | |
CN110460811B (en) | A kind of multimedia data processing method and system based on video networking | |
CN110475143B (en) | Switching method and device of HDMI video data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CP03 | Change of name, title or address | ||
Address after: 33rd Floor, No.1 Huasheng Road, Yuzhong District, Chongqing 400013 Patentee after: VISIONVERA INFORMATION TECHNOLOGY Co.,Ltd. Country or region after: China Address before: 100000 Beijing Dongcheng District gogoa building A1103-1113 Patentee before: VISIONVERA INFORMATION TECHNOLOGY Co.,Ltd. Country or region before: China |