CN113596571A - Screen sharing method, device, system, storage medium and computer equipment - Google Patents
- Publication number
- CN113596571A (application number CN202110849522.9A)
- Authority
- CN
- China
- Prior art keywords
- frame image
- screen frame
- canvas
- screen
- sharing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440263—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
The embodiments of the present application provide a screen sharing method, device, system, storage medium and computer equipment, wherein the method comprises the following steps: a sharing publishing terminal, in response to a screen sharing trigger operation, acquires a screen frame image corresponding to a screen area to be shared; creates a canvas with a fixed resolution, pastes the screen frame image onto the canvas, and obtains identification information of the screen frame image on the canvas; encodes the canvas and the identification information in an encoding mode corresponding to the canvas resolution to obtain video encoded data to be shared; and sends the video encoded data to a sharing receiving terminal. The sharing receiving terminal receives the video encoded data and decodes it in a decoding mode corresponding to the canvas resolution to obtain the canvas and the identification information; it then obtains the screen frame image from the canvas according to the identification information and displays it. The method and device can improve both the stability and the real-time performance of screen sharing.
Description
Technical Field
The embodiment of the application relates to the field of screen sharing, in particular to a screen sharing method, device and system, a storage medium and computer equipment.
Background
With the development of video conferencing, online education and the like, screen sharing has become widely used. During screen sharing, the shared screen area may be adjusted as needed. For example, when the shared screen area is a window, that is, when a window is shared, the size of the window may be adjusted in real time, so the resolution of the window image may also change in real time; when the window image is shared, it must be encoded into video encoded data by an encoder.
In some technologies, while the resolution of the window image is changing, capture and encoding are suspended until the resolution stabilizes, and the encoder is then restarted to continue capturing and encoding; this approach causes poor real-time performance of screen sharing. Correspondingly, when the receiving end decodes the window image, the decoding resolution must be kept consistent with the resolution of the window image, which likewise leads to poor stability and poor real-time performance of screen sharing.
Disclosure of Invention
In order to overcome the problems in the related art, the application provides a screen sharing method, device, system, storage medium and computer equipment, which can improve the stability and real-time performance of screen sharing.
According to a first aspect of an embodiment of the present application, there is provided a screen sharing method, including the steps of:
a sharing publishing terminal, in response to a screen sharing trigger operation, acquires a screen frame image corresponding to a screen area to be shared; creates a canvas with a fixed resolution, pastes the screen frame image onto the canvas, and obtains identification information of the screen frame image on the canvas; encodes the canvas and the identification information in an encoding mode corresponding to the canvas resolution to obtain video encoded data to be shared; and sends the video encoded data to at least one sharing receiving terminal; wherein the identification information comprises position information of the screen frame image in the canvas;
the sharing receiving terminal receives the video encoded data and decodes it in a decoding mode corresponding to the canvas resolution to obtain the canvas and the identification information; and obtains the screen frame image from the canvas according to the identification information and displays it.
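The paste-and-identify step above can be sketched as follows. This is an illustrative model only, not code from the patent: pixel buffers are nested lists, and the 1920×1080 canvas size and all names are assumptions.

```python
# Illustrative sketch of the publisher-side step: paste the captured screen
# frame onto a fixed-resolution canvas and record identification information.
# Names and the 1920x1080 canvas size are assumptions, not from the patent.

CANVAS_W, CANVAS_H = 1920, 1080  # fixed canvas resolution (assumed)

def paste_on_canvas(frame, frame_w, frame_h):
    """Paste a frame (list of pixel rows) at the canvas origin and return
    (canvas, identification_info); the identification information records
    the frame's position so a receiver can crop it back out."""
    canvas = [[0] * CANVAS_W for _ in range(CANVAS_H)]  # black background
    for y in range(frame_h):
        canvas[y][:frame_w] = frame[y][:frame_w]
    identification_info = {"x": 0, "y": 0, "w": frame_w, "h": frame_h}
    return canvas, identification_info
```

A 640×360 window frame, for instance, occupies only the top-left corner of the canvas; resizing the window changes only the identification information, never the canvas resolution that the encoder sees.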
According to a second aspect of the embodiments of the present application, there is provided a screen sharing method applied to a sharing and publishing terminal, including the following steps:
responding to the trigger operation of screen sharing, and acquiring a screen frame image corresponding to a screen area to be shared;
creating a canvas with a fixed resolution, pasting the screen frame image on the canvas, and obtaining identification information of the screen frame image on the canvas; wherein the identification information comprises location information of the screen frame image in the canvas;
encoding the canvas and the identification information in an encoding mode corresponding to the canvas resolution to obtain video encoded data to be shared;
and sending the video encoded data to at least one sharing receiving terminal, so that the sharing receiving terminal decodes the video encoded data in a decoding mode corresponding to the canvas resolution to obtain the screen frame image.
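The point of encoding at the fixed canvas resolution is that the encoder is configured once and never restarted when the shared window is resized. A toy illustration of this property, with a hypothetical encoder class rather than a real codec API:

```python
# Toy stand-in for a video encoder configured once for the fixed canvas size.
# Class and method names are illustrative, not a real codec API.
class FixedResolutionEncoder:
    init_count = 0  # counts how many times an encoder had to be (re)created

    def __init__(self, width, height):
        FixedResolutionEncoder.init_count += 1
        self.width, self.height = width, height

    def encode(self, canvas):
        return ("encoded", self.width, self.height)  # placeholder payload

encoder = FixedResolutionEncoder(1920, 1080)  # created once, for the canvas
# The shared window is resized between frames, but every frame is pasted onto
# the same 1920x1080 canvas, so the encoder is never torn down and restarted.
for window_size in [(640, 360), (800, 600), (1280, 720)]:
    canvas = [[0] * 1920 for _ in range(1080)]  # frame pasting elided here
    packet = encoder.encode(canvas)
```

By contrast, the scheme described in the Background section would recreate the encoder on every resolution change, pausing capture each time.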
According to a third aspect of the embodiments of the present application, there is provided a screen sharing method applied to a sharing receiving terminal, including the steps of:
receiving video coding data; wherein the video encoding data comprises a canvas and identification information of a screen frame image on the canvas; the canvas resolution is fixed, and the identification information comprises position information of the screen frame image in the canvas;
decoding the video encoded data in a decoding mode corresponding to the canvas resolution to obtain the canvas and the identification information of the screen frame image on the canvas;
acquiring a screen frame image from the canvas according to the identification information;
and displaying the screen frame image.
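Decoding aside, the receiver-side steps amount to cropping the frame back out of the canvas using the position information. A minimal sketch, with assumed field names matching the publisher-side description:

```python
def extract_frame(canvas, identification_info):
    """Crop the shared screen frame out of the decoded fixed-resolution
    canvas, using the position information carried alongside it.
    Field names (x, y, w, h) are assumptions for illustration."""
    x, y = identification_info["x"], identification_info["y"]
    w, h = identification_info["w"], identification_info["h"]
    return [row[x:x + w] for row in canvas[y:y + h]]
```

Because the crop uses only the identification information, the receiver's decoder configuration stays fixed even while the shared window's size fluctuates.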
According to a fourth aspect of the embodiments of the present application, a screen sharing system is provided, which includes a sharing publishing terminal and a sharing receiving terminal;
the sharing publishing terminal, in response to the screen sharing trigger operation, acquires a screen frame image corresponding to a screen area to be shared; creates a canvas with a fixed resolution, pastes the screen frame image onto the canvas, and obtains identification information of the screen frame image on the canvas; encodes the canvas and the identification information in an encoding mode corresponding to the canvas resolution to obtain video encoded data to be shared; and sends the video encoded data to at least one sharing receiving terminal; wherein the identification information comprises position information of the screen frame image in the canvas;
the sharing receiving terminal receives the video encoded data and decodes it in a decoding mode corresponding to the canvas resolution to obtain the canvas and the identification information; and obtains the screen frame image from the canvas according to the identification information and displays it.
According to a fifth aspect of the embodiments of the present application, there is provided a screen sharing device, which is applied to a sharing and publishing terminal, the device including:
the screen frame image acquisition module is used for responding to the trigger operation of screen sharing and acquiring a screen frame image corresponding to a screen area to be shared;
the identification information acquisition module is used for creating a canvas with fixed resolution, pasting the screen frame image on the canvas and acquiring identification information of the screen frame image on the canvas; wherein the identification information comprises location information of the screen frame image in the canvas;
the coded data acquisition module is used for coding the canvas and the identification information to acquire video coded data to be shared;
and the data sending module is used for sending the video coded data to at least one sharing receiving terminal.
According to a sixth aspect of the embodiments of the present application, there is provided a screen sharing apparatus applied to a sharing receiving terminal, the apparatus including:
the data receiving module is used for receiving video coding data; wherein the video encoding data comprises a canvas and identification information of a screen frame image on the canvas; the canvas resolution is fixed, and the identification information comprises position information of the screen frame image in the canvas;
the decoding module is used for decoding the video encoded data in a decoding mode corresponding to the canvas resolution to obtain the canvas and the identification information of the screen frame image on the canvas;
the screen frame acquisition module is used for acquiring a screen frame image from the canvas according to the identification information;
and the display module is used for displaying the screen frame image.
According to a seventh aspect of embodiments of the present application, there is provided a computer device comprising a processor and a memory; the memory stores a computer program adapted to be loaded by the processor and to perform the screen sharing method as described above.
According to an eighth aspect of embodiments of the present application, there is provided a computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the screen sharing method as described above.
According to the method and device of the present application, a canvas with a fixed resolution is created, the screen frame image corresponding to the screen area to be shared is pasted onto the canvas, and identification information of the screen frame image on the canvas is obtained. Encoding is then performed on the fixed-resolution canvas together with the identification information to obtain video encoded data to be shared, and the video encoded data is sent to at least one sharing receiving terminal. As a result, there is no need to adjust the encoding resolution or the decoding resolution in real time according to the resolution of the screen frame image, which avoids the instability caused by repeatedly restarting the encoder and decoder, thereby improving the stability of screen sharing while also improving its real-time performance. Furthermore, during encoding and transmission no compression or amplification of the screen frame image is needed, and no complex programming logic is required, so the screen frame image captured by the sharing publishing terminal can be transmitted intact to the sharing receiving terminal, ensuring faithful reproduction of the screen frame image from the sharing publishing terminal.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
For a better understanding and practice, the invention is described in detail below with reference to the accompanying drawings.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic block diagram of an application environment of a screen sharing method according to an embodiment of the present disclosure;
fig. 2 is a flowchart illustrating a screen sharing method according to a first embodiment of the present application;
FIG. 3 is a schematic diagram of a video conference interface shown in an embodiment of the present application;
fig. 4 is a schematic view of a video conference interface after a trigger operation in response to screen sharing according to an embodiment of the present application;
fig. 5 is a schematic diagram illustrating a screen area to be shared determined by a user according to an embodiment of the present application;
FIG. 6 is a diagram illustrating a position of a screen frame image in a canvas according to an embodiment of the present application;
FIG. 7 is a flow chart illustrating a method of encoding video data according to a first embodiment of the present application;
FIG. 8 is a flowchart illustrating a method of displaying a screen frame image according to a first embodiment of the present application;
FIG. 9 is a diagram illustrating a screen frame image overlaid on a rendering window to display the screen frame image according to an embodiment of the present application;
FIG. 10 is a diagram illustrating a rendering window overlaid on a screen frame image to display the screen frame image according to an embodiment of the present application;
FIG. 11 is a schematic view of the state of FIG. 10 after sliding a screen frame image within the rendering window;
fig. 12 is a schematic structural diagram of a viewing mode control according to an embodiment of the present application;
fig. 13 is a schematic effect diagram illustrating a case where a screen frame image is displayed at a resolution matched with a viewing window according to an embodiment of the present application;
fig. 14 is another schematic effect diagram of displaying a screen frame image at a resolution matched with a viewing window according to an embodiment of the present application;
fig. 15 is a diagram illustrating still another exemplary effect when a screen frame image is displayed at a resolution matched with a viewing window according to an embodiment of the present application;
fig. 16 is a flowchart illustrating a screen sharing method according to a second embodiment of the present application;
Fig. 17 is a flowchart illustrating a screen sharing method according to a third embodiment of the present application;
fig. 18 is a schematic block diagram of a screen sharing system according to a fourth embodiment of the present application;
fig. 19 is a schematic block diagram of a screen sharing apparatus according to a fifth embodiment of the present application;
fig. 20 is a schematic block diagram of a screen sharing apparatus according to a sixth embodiment of the present application;
fig. 21 is a schematic structural diagram of a computer device according to a seventh embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It should be understood that the embodiments described are only some embodiments of the present application, and not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without any creative effort belong to the protection scope of the embodiments in the present application.
When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. In the description of the present application, it is to be understood that the terms "first", "second", "third" and the like are used solely to distinguish one element from another; they do not necessarily describe a particular order or sequence, nor should they be construed as indicating or implying relative importance. The specific meaning of these terms in the present application can be understood by those of ordinary skill in the art as appropriate. As used in this application and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
Further, in the description of the present application, "a plurality" means two or more unless otherwise specified. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
As will be understood by those skilled in the art, a "terminal" as used herein includes both wireless signal transmitter devices, which have only a wireless transmitter with transmitting capability, and wireless signal receiver devices, which have only a wireless receiver with receiving capability, and may also include devices with receiving and transmitting hardware capable of two-way communication over a two-way communication link. Such a device may include: cellular or other communication devices, such as personal computers and tablets, with or without a single-line or multi-line display; a PCS (Personal Communications Service) device, which may combine voice, data processing, facsimile and/or data communication capabilities; a PDA (Personal Digital Assistant), which may include a radio frequency receiver, a pager, Internet/intranet access, a web browser, a notepad, a calendar and/or a GPS (Global Positioning System) receiver; or a conventional laptop and/or palmtop computer or other device having and/or including a radio frequency receiver. As used herein, a "terminal" or "terminal device" may be portable, transportable, installed in a vehicle (aeronautical, maritime and/or land-based), or situated and/or configured to operate locally and/or in a distributed fashion at any other location(s) on earth and/or in space. The "terminal" or "terminal device" used herein may also be a communication terminal, a web terminal or a music/video playing terminal, for example a PDA, an MID (Mobile Internet Device) and/or a mobile phone with a music/video playing function, or a smart TV, a set-top box and the like.
The hardware referred to by the names "terminal", "terminal device", etc. is essentially a computer device with the performance of a personal computer, etc., and is a hardware device having necessary components disclosed by von neumann principles such as a central processing unit (including an arithmetic unit and a controller), a memory, an input device, an output device, etc., wherein a computer program is stored in the memory, and the central processing unit calls a program stored in an external memory into the internal memory to run, executes instructions in the program, and interacts with the input and output devices, thereby completing a specific function.
Please refer to fig. 1, which is a schematic block diagram of an application environment of a screen sharing method according to an embodiment of the present disclosure. As shown in fig. 1, an application environment of the screen sharing method includes a sharing publishing terminal 120 and at least one sharing receiving terminal 140; the sharing publishing terminal 120 and the sharing receiving terminal 140 may be directly or indirectly connected through a network, which is not limited herein. The network may be a communication medium of various connection types capable of providing a communication link between the sharing publishing terminal 120 and the sharing receiving terminal 140, for example, the network may be a wired communication link, a wireless communication link, or an optical fiber cable, and the application is not limited herein. It should be noted that the screen sharing method of the present application may be specifically applied to education and meeting scenes, and may also be applied to any scene that can use the screen sharing method of the present application, such as live broadcast.
The sharing publishing terminal 120 is one end of the sharing screen; the sharing and publishing terminal 120 may be a personal electronic device or a public electronic device such as a desktop, a notebook, a tablet computer, or a smart phone, which is not limited herein. In some examples, when the screen sharing method is used for education, the sharing publishing terminal 120 may be a personal electronic device used by a teacher; when the screen sharing method is used for a conference, the sharing publishing terminal 120 may be a personal electronic device used by conference personnel sharing a screen, such as a conference host.
The sharing receiving terminal 140 is the end that views the shared screen; it may also be a personal electronic device or a public electronic device such as a desktop, a notebook, a tablet computer, or a smart phone, which is not limited herein. In some examples, when the screen sharing method is used for education, the sharing receiving terminal 140 may be a personal electronic device used by a student; when the screen sharing method is used for a conference, the sharing receiving terminal 140 may be a personal electronic device used by conference participants who are not sharing their screen.
In this embodiment, the sharing publishing terminal 120 and the sharing receiving terminal 140 may both run the same screen sharing application, which relies on a network to realize the interaction between the sharing publishing terminal 120 and the sharing receiving terminal 140. The screen sharing application may run as an independent application or be embedded into another application as part of it; for example, in a conference scene, the screen sharing application is embedded into a video conference application, and screen sharing is realized according to actual needs. Specifically, the sharing publishing terminal 120 and the sharing receiving terminals 140 may join the same conference room through the video conference application, and the screen of the sharing publishing terminal 120 is then shared through the screen sharing application to all sharing receiving terminals 140 that have joined the conference room, so that the sharing receiving terminals 140 can synchronously view the screen content of the sharing publishing terminal 120.
Optionally, the application environment of the screen sharing method may further include a server (not shown), where the sharing publishing terminal 120 and the sharing receiving terminal 140 may access the internet through a network access manner, establish a data communication link with the server, and then implement interaction between the sharing publishing terminal 120 and the sharing receiving terminal 140 through the server. The server may be used as a service server, and is responsible for further connecting a related audio streaming server, a video streaming server, and other servers providing related support, so as to form a logically associated server cluster to provide services for related terminal devices, for example, the sharing publishing terminal 120 and the sharing receiving terminal 140.
In order to better understand the solution of the present application, some technical solutions will be described below.
During screen sharing, the shared screen area may be adjusted as needed. For example, when the shared screen area is a window, that is, when a window is shared, the size of the window may be adjusted in real time, so the resolution of the window image may also change in real time; when the window image is shared, it must be encoded into video encoded data by an encoder. In some technologies, while the resolution of the window image is changing, capture and encoding are suspended until the resolution stabilizes, and the encoder is then restarted to continue capturing and encoding; this approach causes poor real-time performance of screen sharing.
Correspondingly, when the sharing receiving terminal 140 receives and decodes the video encoded data sent by the sharing publishing terminal 120, the decoding resolution needs to be kept consistent with the resolution of the window image, and further, the problems of poor stability and poor real-time performance of screen sharing may also be caused.
In addition, with the development of video conferencing and online education, screen sharing is used more and more widely, and the popularization of smart phones and tablets has turned screen sharing, once confined to computer equipment, into a scene of mixed sharing among smart phones, tablets and computers. Because different devices have different display resolutions or display scales, the screen images shared by the sharing publishing terminal 120 may not be displayed consistently on the sharing receiving terminal 140. For example, suppose the sharing publishing terminal 120 is a mobile phone whose screen is used vertically at a resolution of 720 × 1080, an aspect ratio of 2:3, while the sharing receiving terminal 140 is a computer with a resolution of 1920 × 720, an aspect ratio of 8:3. The screen image shared by the sharing publishing terminal 120 then does not match the display resolution of the display screen of the sharing receiving terminal 140, and only a partial image of the screen image can be displayed on the sharing receiving terminal 140. If instead the screen image is zoomed to adapt to the different display resolutions, image details become blurred by the zooming; especially in scenes where the shared screen image contains text, web pages or teaching material, where details are important, the zoomed screen image becomes blurred on the sharing receiving terminal 140 and cannot be displayed clearly, failing to meet the requirement of displaying details.
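The mismatch in the example above can be made concrete: fitting the frame into the window while preserving aspect ratio forces a uniform scale factor, and any factor below 1 discards detail. A hypothetical helper, not from the patent:

```python
def fit_scale(src_w, src_h, dst_w, dst_h):
    """Largest uniform scale at which the source fits entirely inside the
    destination while preserving its aspect ratio (the rest is letterboxed)."""
    return min(dst_w / src_w, dst_h / src_h)

# The 720x1080 portrait phone frame shown in a 1920x720 desktop window must
# be shrunk to 2/3 of its size, blurring small text and other fine detail.
scale = fit_scale(720, 1080, 1920, 720)  # = min(8/3, 2/3) = 2/3
```

This is exactly the trade-off the text describes: either crop (scale 1, partial image) or zoom (full image, blurred detail).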
Based on the above, the application provides a screen sharing method, device, system, storage medium and computer equipment.
Referring to fig. 2, fig. 2 is a flowchart of a screen sharing method according to a first embodiment of the present application, which includes the following steps:
step S101: the method comprises the steps that a sharing and issuing terminal responds to a trigger operation of screen sharing to obtain a screen frame image corresponding to a screen area to be shared; creating a canvas with a fixed resolution, pasting the screen frame image on the canvas, and obtaining identification information of the screen frame image on the canvas; coding the canvas and the identification information according to a coding mode corresponding to the resolution of the canvas to obtain video coding data to be shared; sending the video coding data to at least one sharing receiving terminal; wherein the identification information includes location information of the screen frame image in the canvas.
The screen sharing triggering operation refers to a request operation for sharing the desktop of the sharing and publishing terminal.
In an embodiment, the screen sharing triggering operation may be a request operation for desktop sharing at the sharing publishing terminal of a teacher in an education scene. For example, when the education scene is a teacher giving lessons via live broadcast, the teacher clicks a sharing control on the sharing publishing terminal to trigger screen sharing, and the sharing publishing terminal where the teacher is located responds to the screen sharing triggering operation to perform desktop sharing.
In another embodiment, the screen sharing triggering operation may be a request operation for desktop sharing at the sharing publishing terminal of the host in a conference scene. For example, in a conference, the host clicks a sharing control on the sharing publishing terminal to trigger screen sharing, and the sharing publishing terminal where the host is located responds to the screen sharing triggering operation to perform desktop sharing. Specifically, as shown in fig. 3, the sharing publishing terminal embeds a screen sharing application program into a conference application program and sets a screen sharing control in the toolbar 11 of the conference display interface 10; the triggering operation of screen sharing is realized by clicking the screen sharing control. As also shown in fig. 3, it can be understood that conventional controls used in a video conference, such as a microphone control, a camera control, a recording control, a member inviting control, and a live broadcast opening control, are still provided in the toolbar 11 of the conference display interface 10, and the screen sharing control and the above-mentioned controls may exist independently of each other, each implementing its corresponding function.
The screen area to be shared may be the full-screen desktop of the sharing publishing terminal, a partial area of the desktop, or a virtual desktop, among others. The desktop may be a window opened on the sharing publishing terminal or a split-screen area. Specifically, after the user clicks the screen sharing control to trigger screen sharing, the sharing publishing terminal responds to the triggering operation and displays, in list form, all windows opened on the terminal, preset windows and/or screen areas of a split-screen desktop for the user to select. As shown in fig. 4, after the sharing publishing terminal responds to the triggering operation of screen sharing, the windows available for sharing are displayed on the sharing publishing terminal for the user to select. When the user clicks one of the options in the list to determine the screen area to be shared, for example after clicking a certain window, the sharing publishing terminal pops up a prompt box around the window to indicate the area being shared; as shown in fig. 5, the bold black box below "area being shared" marks the prompted area. Optionally, to better display the screen area to be shared, after the screen area to be shared is determined, the conference display interface 10 displays the screen area to be shared and the toolbar 11 is hidden.
The screen frame image may be an image obtained by capturing the screen area to be shared at the sharing publishing terminal, optionally at preset time intervals. It can be understood that when the size of the screen area to be shared changes, the size of the screen frame image changes as well, that is, the resolution of the screen frame image changes; for example, when the screen area to be shared is a window and the user drags the window to change its size, the resolution of the screen frame image corresponding to the window changes accordingly.
The canvas used for encoding may be created based on the canvas element; in particular, graphics can be drawn by defining a canvas element and creating a canvas object. In this embodiment, the screen frame image is pasted onto the canvas through a canvas object. Optionally, the canvas is a blank canvas, to reduce data transmission.
It should be noted that each frame of screen frame image corresponds to one canvas, so that each frame of screen frame image can be pasted onto a canvas and then encoded based on that canvas. Because the resolution of the canvas is fixed, encoding based on the canvas always uses a fixed resolution, namely the resolution of the canvas, and the encoding resolution does not need to be adjusted according to the resolution of the screen frame image. This avoids the problem of unstable screen sharing caused by continuously restarting the encoder, and also improves the real-time performance of screen sharing.
The location information is used for identifying the location of the screen frame image in the canvas so as to identify the screen frame image in the canvas according to the location information.
In one embodiment, the location information includes the resolution of the screen frame image and the coordinates of a preset point of the screen frame image in the canvas. The resolution of the screen frame image comprises a first horizontal pixel point number and a first vertical pixel point number. A coordinate system may be established for the canvas; for example, a rectangular coordinate system may be established with a point at the lower right corner of the canvas as the origin, the horizontal direction of the canvas as the X-axis, and the vertical direction of the canvas as the Y-axis. When the screen frame image is pasted on the canvas, any point of the screen frame image has a corresponding coordinate on the canvas, so the coordinates of the preset point of the screen frame image in the canvas can be obtained. Optionally, the point at the top left corner of the screen frame image may be used as the preset point, and the coordinates of that point in the canvas are then obtained. As shown in fig. 6, the screen frame image is pasted on the canvas, the coordinates of its upper left corner are (x, y), and the first horizontal pixel point number and the first vertical pixel point number are w and h, respectively.
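The pasting step and the resulting position information can be sketched as follows. This is a minimal Python model, not the application's implementation: the canvas resolution, the `paste_on_canvas` helper, and a top-left coordinate convention are assumptions for illustration.

```python
CANVAS_W, CANVAS_H = 1920, 1080  # fixed canvas resolution (assumed here)

def paste_on_canvas(frame_w, frame_h, x, y):
    """Paste a screen frame image of resolution (frame_w, frame_h) onto the
    fixed-resolution canvas at (x, y) and return its identification
    information: the coordinates of the preset (here, top-left) point plus
    the first horizontal and first vertical pixel point numbers."""
    if x + frame_w > CANVAS_W or y + frame_h > CANVAS_H:
        raise ValueError("frame does not fit on the canvas at this position")
    return {"x": x, "y": y, "w": frame_w, "h": frame_h}

# The encoder always sees a 1920x1080 canvas; only the identification
# information changes when the shared window is resized.
ident = paste_on_canvas(1280, 720, 0, 0)
assert ident == {"x": 0, "y": 0, "w": 1280, "h": 720}
```

The point of the sketch is that resizing the shared window changes only `ident`, never the canvas resolution the encoder is configured with.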
The sharing receiving terminal is a terminal which establishes a screen sharing channel with the sharing publishing terminal. Specifically, in a conference scene, the sharing receiving terminal and the sharing publishing terminal may both install a video conference application program, wherein a screen sharing application program is embedded in the video conference application program; the sharing and publishing terminal where the host is located can establish a conference room in a video conference application program to form a conference ID, each participant can enter the same conference room to participate in the conference by opening the video conference application program in the respective sharing and receiving terminal and inputting the conference ID, at the moment, the sharing and receiving terminal establishes a screen sharing channel with the sharing and publishing terminal, and then the sharing and publishing terminal can send video coding data corresponding to screen frame images to all sharing and receiving terminals which are added into the conference room.
Step S102: the sharing receiving terminal receives the video coded data and decodes the video coded data in a decoding mode corresponding to the resolution of the canvas to obtain the canvas and the identification information; and obtaining the screen frame image from the canvas according to the identification information, and displaying the screen frame image.
After the sharing receiving terminal receives the video coded data, the canvas and the identification information can be obtained by decoding according to a preset decoding algorithm. And because the identification information comprises the position information of the screen frame image, the position of the screen frame image can be determined from the canvas, and the screen frame image is obtained. Because the resolution of the canvas is fixed, the decoding resolution of the decoder during decoding is also fixed, so that the problem of unstable screen sharing caused by continuous restarting of the decoder is avoided. It can be understood that the sharing issuing terminal and the sharing receiving terminal may set a specific coding and decoding algorithm in advance in a unified manner to implement the coding and decoding operations of the screen frame image, and the coding and decoding algorithm may be any algorithm that can implement the method of the present application, which is not limited in the present application.
According to the present application, a canvas with a fixed resolution is created, the screen frame image corresponding to the screen area to be shared is pasted onto the canvas, and identification information of the screen frame image on the canvas is obtained. Encoding is then performed based on the fixed-resolution canvas and the identification information to obtain the video encoded data to be shared, which is sent to at least one sharing receiving terminal. It is therefore unnecessary to adjust the encoding resolution or the decoding resolution in real time according to the resolution of the screen frame image, which avoids the problem of unstable screen sharing caused by continuously restarting the encoder and the decoder, improves the stability of screen sharing, and at the same time improves its real-time performance. Further, during encoding and transmission, no compression or amplification of the screen frame image and no complex programming logic are needed, so the screen frame image captured by the sharing publishing terminal can be transmitted intact to the sharing receiving terminal, guaranteeing the fidelity of the screen frame image from the sharing publishing terminal.
In an embodiment, the resolution of the canvas is a resolution corresponding to a full-screen desktop of the sharing and issuing terminal. Considering that the size of the screen area to be shared is changed, and the maximum screen area to be shared is a full screen desktop, in order to paste all screen frame images corresponding to the screen area to be shared on the canvas, the resolution of the canvas is set to the resolution corresponding to the full screen desktop of the sharing and issuing terminal.
Referring to fig. 7, in an embodiment, in step S101, the encoding, by the sharing and publishing terminal, the canvas and the identification information according to an encoding method corresponding to a resolution of the canvas includes:
step S1011: the canvas is encoded by adopting a fixed encoding resolution ratio to obtain first encoded data; wherein the encoding resolution is the same size as the resolution of the canvas.
Since the resolution of the canvas is fixed, the encoding resolution is also fixed, and the canvas is always encoded at this fixed resolution, which avoids the instability caused by continuously restarting the encoder and improves the real-time performance of encoding.
Step S1012: and coding the identification information by adopting an SEI mode to obtain second coded data.
SEI (Supplemental Enhancement Information) is a bitstream-level coding mechanism that provides a way to add information into video encoded data. Specifically, the identification information is encoded in the SEI manner to form the second encoded data, which is then combined with the first encoded data corresponding to the canvas and added into the video encoded data, providing a basis for identifying and displaying the screen frame image within the canvas. It should be noted that when the resolution of the screen frame image changes, neither the method nor the complexity of encoding the identification information in the SEI manner is affected.
It can be understood that before the identification information is encoded in the SEI manner to obtain the second encoded data, the identification information needs to be serialized into a character string, converting it into data that an encoder can recognize and encode.
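The serialization into a character string can be sketched as follows. The application does not specify a serialization format; JSON is used here purely as an illustrative assumption.

```python
import json

def serialize_identification(x, y, w, h):
    """Serialize the position information of the screen frame image into a
    character string so it can be carried as an SEI payload."""
    return json.dumps({"x": x, "y": y, "w": w, "h": h})

def deserialize_identification(payload):
    """Inverse operation performed at the sharing receiving terminal."""
    d = json.loads(payload)
    return d["x"], d["y"], d["w"], d["h"]

payload = serialize_identification(100, 50, 1280, 720)
assert deserialize_identification(payload) == (100, 50, 1280, 720)
```

Any format with a well-defined inverse would do; the only requirement implied by the method is that the receiving terminal can recover the exact coordinates and resolution.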
Step S1013: and packaging the first coded data and the second coded data to obtain the video coded data to be shared.
In the present application, the first encoded data obtained by encoding the canvas and the second encoded data obtained by encoding the identification information in the SEI manner are packaged and transmitted to the sharing receiving terminal, realizing synchronous transmission of the canvas and the identification information, so that the sharing receiving terminal can obtain the screen frame image from the canvas according to the identification information corresponding to the canvas.
It can be understood that the sharing receiving terminal receives the video encoded data and decodes it in a decoding manner corresponding to the resolution of the canvas to obtain the canvas and the identification information; the decoding process is the inverse of the encoding process and may specifically include: decompressing the video encoded data to obtain the first encoded data and the second encoded data; decoding the first encoded data with a fixed decoding resolution to obtain the canvas, wherein the decoding resolution is the same size as the resolution of the canvas; and decoding the second encoded data in the SEI manner to obtain the identification information.
In one embodiment, when the position information includes the resolution of the screen frame image and the coordinates of the preset point of the screen frame image in the canvas, the obtaining, by the sharing and receiving terminal in step S102, the screen frame image from the canvas according to the identification information includes: and intercepting the screen frame image from the canvas according to the coordinates of the preset points of the screen frame image in the canvas and the resolution of the screen frame image. The position of the screen frame image in the canvas can be uniquely determined due to the resolution of the screen frame image and the coordinates of the preset point of the screen frame image in the canvas, so that the screen frame image can be conveniently and accurately obtained from the canvas.
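The interception step amounts to a simple crop. In this sketch the decoded canvas is modeled as a nested list of pixel values, and the coordinates are assumed to be relative to the top-left corner of the canvas; a real implementation would crop the decoded video frame instead.

```python
def crop_frame(canvas_pixels, x, y, w, h):
    """Intercept the screen frame image from the decoded canvas using the
    preset-point coordinates (x, y) and the frame resolution (w, h)."""
    return [row[x:x + w] for row in canvas_pixels[y:y + h]]

# A 4x3 canvas; the shared frame is the 2x2 block whose preset point is (1, 1).
canvas_pixels = [
    [0, 0, 0, 0],
    [0, 1, 2, 0],
    [0, 3, 4, 0],
]
frame = crop_frame(canvas_pixels, x=1, y=1, w=2, h=2)
assert frame == [[1, 2], [3, 4]]
```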
In one embodiment, when the position information includes a resolution of the screen frame image, the displaying the screen frame image by the sharing receiving terminal in step S102 includes:
step S1021: and creating a rendering window, and adjusting the stacking relation between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is displayed in the rendering window without scaling according to the resolution of the screen frame image.
The display resolutions of the sharing and issuing terminal and the sharing and receiving terminal are different, so that the resolution of the screen frame image and the resolution of the rendering window are possibly different, the screen frame image shared by the sharing and issuing terminal is not matched with the size of the rendering window of the sharing and receiving terminal, and therefore the screen frame image is displayed in the rendering window without scaling according to the resolution of the screen frame image by adjusting the stacking relation between the screen frame image and the rendering window.
The resolution of the screen frame image comprises a first horizontal pixel point number and a first vertical pixel point number; the resolution of the rendering window comprises a second transverse pixel point number and a second longitudinal pixel point number; referring to fig. 8, in step S1021, the sharing and receiving terminal adjusts a stacking relationship between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is displayed in the rendering window without scaling according to the resolution of the screen frame image, including:
Step S10211: and when the number of the second transverse pixel points is greater than or equal to that of the first transverse pixel points and the number of the second longitudinal pixel points is greater than or equal to that of the first longitudinal pixel points, overlapping the screen frame image on the rendering window, and displaying the screen frame image in the rendering window.
As shown in fig. 9, when the number of the second horizontal pixel points is greater than or equal to the number of the first horizontal pixel points and the number of the second vertical pixel points is greater than or equal to the number of the first vertical pixel points, it is indicated that the rendering window is large enough to completely display the screen frame image.
Optionally, when the screen frame image is overlaid on the rendering window, and the screen frame image is displayed in the rendering window, because the screen frame image is smaller than the rendering window, the screen frame image does not completely cover the rendering window when overlaid on the rendering window, and therefore, a region of the rendering window that is not covered by the screen frame image may be filled with a preset color to improve a viewing effect, for example, a region of the rendering window that is not covered by the screen frame image may be filled with a color such as black or white.
Step S10212: when the number of the second transverse pixel points is less than that of the first transverse pixel points and/or the number of the second longitudinal pixel points is less than that of the first longitudinal pixel points, the rendering window is overlapped on the screen frame image in a sliding mode, and the screen frame image is partially displayed in the rendering window; displaying, within the rendering window, a corresponding position of the slid screen frame image when responding to sliding of the screen frame image within the rendering window.
When the number of the second horizontal pixel points is less than the number of the first horizontal pixel points and/or the number of the second vertical pixel points is less than the number of the first vertical pixel points, the rendering window is not large enough, and part of the screen frame image may not be displayed. Specifically, as shown in fig. 10, only part of the screen frame image is displayed in the rendering window; for example, an "XX" character is displayed, while a "YY" character located at the right side is not displayed because the rendering window and the screen frame image do not overlap there. When the user slides the screen frame image within the rendering window, the corresponding position of the slid screen frame image is displayed in the rendering window, so that images at other positions in the screen frame image can be seen; as shown in fig. 11, after the rendering window is slid to the right, the "YY" character on the right can be displayed in the rendering window. The screen frame image can thus be viewed completely and clearly through the rendering window, preventing the detail loss that zooming the screen frame image would cause.
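The pixel-count comparison in steps S10211 and S10212 can be sketched as a small decision function; the function name and return values are illustrative assumptions, and the actual display logic is outside this sketch.

```python
def choose_stacking(frame_w, frame_h, win_w, win_h):
    """Decide how to stack the screen frame image and the rendering window.

    Returns "overlay" when the window can show the whole unscaled frame
    (step S10211), and "slide" when only part of the frame fits so the
    window must be slid over it (step S10212)."""
    if win_w >= frame_w and win_h >= frame_h:
        return "overlay"  # frame shown unscaled; uncovered area may be filled
    return "slide"        # window slides over the larger frame

assert choose_stacking(1280, 720, 1920, 1080) == "overlay"
assert choose_stacking(1920, 1080, 1280, 720) == "slide"
assert choose_stacking(1920, 720, 1280, 1080) == "slide"  # wider in one axis only
```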
In one embodiment, the sharing receiving terminal is provided with a viewing mode selection control for adjusting the display resolution of the screen frame image; optionally, when the sharing receiving terminal is started or just receives the video encoded data, the viewing mode selection control may be popped up, so that the user may select whether to trigger the viewing mode selection control. Further, after the screen frame image is displayed, the viewing mode selection control may also be displayed in the toolbar 11 of the rendering window, so that the user may select to adjust the display effect of the screen frame image by triggering the viewing mode control.
Optionally, the viewing mode selection control comprises a first mode control; the first mode control is to instruct display of the screen frame image without scaling according to a resolution of the screen frame image; the sharing receiving terminal adjusts the stacking relationship between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is not zoomed according to the resolution of the screen frame image before being displayed in the rendering window, and the sharing receiving terminal further comprises: and receiving the triggering operation of the first mode control. Optionally, when the sharing receiving terminal is started or just receives the video encoded data, the viewing mode selection control may be popped up, so that a user may select whether to trigger the first mode control. Optionally, the first mode control may also be used as a default trigger operation, so that when the sharing receiving terminal receives the video encoded data, the screen frame image is displayed without zooming according to the resolution of the screen frame image by default. As shown in fig. 12, on the user side, the first mode control is a control corresponding to the "100%" identifier.
Optionally, the viewing mode selection control further includes a second mode control, where the second mode control is configured to instruct to enlarge and display the screen frame image; the screen sharing method further includes: the sharing receiving terminal receives the triggering operation of the second mode control, stacks the screen frame image on the rendering window, keeps the ratio of the first horizontal pixel point number to the first vertical pixel point number unchanged, and enlarges the screen frame image according to a first preset scaling factor, so that the screen frame image is displayed in the rendering window in an enlarged manner. After the screen frame image is displayed, the viewing mode selection control may also be displayed on the toolbar 11 of the rendering window, so that the user can choose whether to enlarge the screen frame image by triggering the second mode control, so as to see more details. For example, when the number of the second horizontal pixel points is greater than or equal to the number of the first horizontal pixel points and the number of the second vertical pixel points is greater than or equal to the number of the first vertical pixel points, the screen frame image is stacked on the rendering window; when displayed there, the viewing effect may be poor because the screen frame image is too small, and the second mode control can then be triggered to enlarge it and improve the viewing effect. As shown in fig. 12, on the user side, the second mode control is the control identified as "Zoom +". It can be understood that each time the second mode control is triggered, the screen frame image is enlarged once according to the first preset scaling factor, so that the screen frame image is enlarged in equal proportion.
Optionally, the viewing mode selection control further includes a third mode control, where the third mode control is configured to instruct to display the screen frame image in a reduced size; the screen sharing method further includes: the sharing receiving terminal receives the triggering operation of the third mode control, stacks the screen frame image on the rendering window, keeps the ratio of the first horizontal pixel point number to the first vertical pixel point number unchanged, and reduces the screen frame image according to a second preset scaling factor, so that the screen frame image is displayed in the rendering window in a reduced manner. After the screen frame image is displayed, the viewing mode selection control may also be displayed on the toolbar of the rendering window, so that the user can choose whether to reduce the screen frame image by triggering the third mode control, so as to obtain an overall view of the screen frame image. For example, when the number of the second horizontal pixel points is less than the number of the first horizontal pixel points and/or the number of the second vertical pixel points is less than the number of the first vertical pixel points, the rendering window is overlapped on the screen frame image in a sliding manner and the screen frame image is only partially displayed; the third mode control can then be triggered to reduce the screen frame image so as to view it as a whole. As shown in fig. 12, on the user side, the third mode control is the control identified as "Zoom-". It can be understood that each time the third mode control is triggered, a reduction operation is performed on the screen frame image according to the second preset scaling factor, so that the screen frame image is reduced in equal proportion.
Optionally, the viewing mode selection control further includes a fourth mode control, where the fourth mode control is configured to instruct to display the screen frame image at a preset ratio; the screen sharing method further includes: and the sharing receiving terminal receives the triggering operation of the fourth mode control, stacks the screen frame image on the rendering window, keeps the ratio of the number of the first horizontal pixels to the number of the first longitudinal pixels unchanged, reduces or enlarges the screen frame image according to a third preset scaling factor, and displays the screen frame image in the rendering window according to the preset ratio. Similarly, after the screen frame image is displayed, the viewing mode selection control may also be displayed on the toolbar 11 of the rendering window, so that the user may select whether to adjust and display the screen frame image by triggering the fourth mode control, so as to obtain an expected display effect of the screen frame image. When the preset proportion is that the screen frame image is displayed in a manner of zooming by 50%, as shown in fig. 12, on the user side, the fourth mode control is a control corresponding to a mark of "50%". It is to be understood that, each time the fourth mode control is triggered, the screen frame image is reduced or enlarged according to the third preset scaling factor, and the screen frame image is reduced or enlarged according to the preset scale and displayed in the rendering window.
Optionally, the viewing mode selection control further comprises a fifth mode control; the fifth mode control is configured to instruct to display the screen frame image at a resolution matched with the rendering window; the screen sharing method further includes: the sharing receiving terminal receives the triggering operation of the fifth mode control, stacks the screen frame image on the rendering window, keeps the ratio of the first horizontal pixel point number to the first vertical pixel point number unchanged, and enlarges or reduces the screen frame image so that it is displayed in the rendering window at a resolution matched with the rendering window. Optionally, when the sharing receiving terminal is started or has just received the video encoded data, the viewing mode selection control may be popped up so that the user can choose whether to trigger the fifth mode control. Optionally, the fifth mode control may serve as the default, so that when the sharing receiving terminal receives the video encoded data, the screen frame image is displayed by default at a resolution matched with the rendering window. As shown in fig. 12, on the user side, the fifth mode control is the control identified as "Max".
Specifically, the step of keeping the ratio of the first horizontal pixel point number to the first vertical pixel point number unchanged and enlarging or reducing the screen frame image so that it is displayed in the rendering window at a resolution matched with the rendering window includes: calculating the ratio of the first horizontal pixel point number to the first vertical pixel point number of the screen frame image to obtain first ratio data; and calculating the ratio of the second horizontal pixel point number to the second vertical pixel point number of the rendering window to obtain second ratio data. As shown in fig. 13, when the first ratio data is greater than the second ratio data, the screen frame image is enlarged or reduced until the first horizontal pixel point number of the screen frame image equals the second horizontal pixel point number of the rendering window, so that the scaled screen frame image is displayed in the rendering window. As shown in fig. 14, when the first ratio data is smaller than the second ratio data, the screen frame image is enlarged or reduced until the first vertical pixel point number of the screen frame image equals the second vertical pixel point number of the rendering window, so that the scaled screen frame image is displayed in the rendering window. As shown in fig. 15, when the first ratio data is equal to the second ratio data, the screen frame image is enlarged or reduced until its first horizontal pixel point number equals the second horizontal pixel point number of the rendering window and its first vertical pixel point number equals the second vertical pixel point number of the rendering window, so that the scaled screen frame image is displayed in the rendering window.
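The ratio comparison above amounts to an aspect-ratio-preserving fit. A minimal sketch, with the function name assumed for illustration:

```python
def fit_to_window(frame_w, frame_h, win_w, win_h):
    """Scale the frame, keeping its aspect ratio, to match the rendering
    window: fit to the window width when the first ratio data is greater
    than the second, to the window height when it is smaller, and to both
    when the ratios are equal."""
    first_ratio = frame_w / frame_h   # first ratio data (screen frame image)
    second_ratio = win_w / win_h      # second ratio data (rendering window)
    if first_ratio >= second_ratio:
        scale = win_w / frame_w       # width-limited (or exact match)
    else:
        scale = win_h / frame_h       # height-limited
    return round(frame_w * scale), round(frame_h * scale)

assert fit_to_window(720, 1080, 1920, 1080) == (720, 1080)   # taller: fit height
assert fit_to_window(1920, 720, 1280, 720) == (1280, 480)    # wider: fit width
assert fit_to_window(1280, 720, 1920, 1080) == (1920, 1080)  # equal ratios
```

In each branch the other dimension stays within the window automatically, which is why only one pixel count needs to be matched unless the ratios are equal.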
Referring to fig. 16, fig. 16 is a schematic flow chart of a screen sharing method according to a second embodiment of the present application, where the screen sharing method is applied to a sharing publishing terminal, and includes the following steps:
step S201: responding to the trigger operation of screen sharing, and acquiring a screen frame image corresponding to a screen area to be shared;
step S202: creating a canvas with a fixed resolution, pasting the screen frame image on the canvas, and obtaining identification information of the screen frame image on the canvas; wherein the identification information comprises location information of the screen frame image in the canvas;
step S203: coding the canvas and the identification information according to a coding mode corresponding to the resolution of the canvas to obtain video coding data to be shared;
step S204: and sending the video coded data to at least one sharing receiving terminal, so that the sharing receiving terminal decodes the video coded data according to a decoding mode corresponding to the canvas resolution, and a screen frame image is obtained.
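Steps S201 to S203 on the publishing side can be sketched as follows. The canvas size, the top-left paste position, and the dictionary layout of the identification information are illustrative assumptions (the method only requires a fixed canvas resolution and position information), and pixel values are plain integers for brevity:

```python
CANVAS_W, CANVAS_H = 1920, 1080  # fixed canvas resolution (assumed value)

def paste_to_canvas(frame):
    """Step S202: paste a captured screen frame (a list of pixel rows)
    onto a fixed-resolution canvas and build the identification
    information that is encoded alongside the canvas."""
    h, w = len(frame), len(frame[0])
    if w > CANVAS_W or h > CANVAS_H:
        raise ValueError("frame exceeds the fixed canvas resolution")
    canvas = [[0] * CANVAS_W for _ in range(CANVAS_H)]
    for y in range(h):
        canvas[y][:w] = frame[y]  # anchor at an assumed preset point (0, 0)
    ident = {"x": 0, "y": 0, "width": w, "height": h}
    return canvas, ident
```

Because every canvas has the same shape, the encoder can be configured once for the canvas resolution and reused for all frames, which is what removes the need to restart it when the captured region changes.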
In the method and device of the present application, a canvas with a fixed resolution is created, the screen frame image corresponding to the screen area to be shared is pasted onto the canvas, and the identification information of the screen frame image on the canvas is obtained. The canvas and the identification information, based on the fixed resolution, are then encoded to obtain the video encoded data to be shared, and the video encoded data is sent to at least one sharing receiving terminal. Because the encoding resolution does not need to be adjusted in real time according to the resolution of the screen frame image, the problem of unstable screen sharing caused by continually restarting the encoder is avoided, which improves both the stability and the real-time performance of screen sharing. Furthermore, no compression or enlargement of the screen frame image is needed during encoding and transmission, and no complex programming logic is required, so the screen frame image captured by the sharing publishing terminal can be transmitted intact to the sharing receiving terminal, guaranteeing the fidelity of the screen frame image of the sharing publishing terminal.
The present embodiment describes the screen sharing method from the sharing publishing terminal side. For specific implementation manners, reference may be made to the description of the steps executed by the sharing publishing terminal in the first embodiment, which is not repeated here.
Referring to fig. 17, fig. 17 is a schematic flowchart of a screen sharing method according to a third embodiment of the present application, where the screen sharing method is applied to a sharing receiving terminal, and includes the following steps:
step S301: receiving video coding data; wherein the video encoding data comprises a canvas and identification information of a screen frame image on the canvas; the canvas resolution is fixed, and the identification information includes position information of the screen frame image in the canvas.
Step S302: and decoding the video coded data according to a decoding mode corresponding to the canvas resolution to obtain the canvas and the identification information of the screen frame image on the canvas.
Step S303: and obtaining a screen frame image from the canvas according to the identification information.
Step S304: and displaying the screen frame image.
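Step S303 is the inverse of the paste on the publishing side: the frame is cropped back out of the decoded canvas using the position information. A minimal sketch, assuming the canvas is a list of pixel rows and the identification information is a dictionary carrying the frame's top-left coordinates and resolution (both representations are assumptions for illustration):

```python
def extract_frame(canvas, ident):
    """Step S303: cut the original screen frame back out of the decoded
    fixed-resolution canvas using the identification information."""
    x, y = ident["x"], ident["y"]
    w, h = ident["width"], ident["height"]
    return [row[x:x + w] for row in canvas[y:y + h]]
```

The crop discards the padding area of the canvas, so the receiving terminal displays exactly the pixels that were captured, never the canvas border.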
In this embodiment of the application, the canvas with the fixed resolution is decoded to obtain the screen frame image, so the decoding resolution does not need to be adjusted in real time according to the resolution of the screen frame image. This avoids the problem of unstable screen sharing caused by continually restarting the decoder, improves the stability and real-time performance of screen sharing, and preserves the fidelity of the screen frame image.
In one embodiment, the position information includes a resolution of the screen frame image;
in step S304, the displaying the screen frame image by the sharing receiving terminal includes:
step S3021: and creating a rendering window, and adjusting the stacking relation between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is displayed in the rendering window without scaling according to the resolution of the screen frame image.
In one embodiment, the resolution of the screen frame image includes a first number of horizontal pixel points and a first number of vertical pixel points; the resolution of the rendering window comprises a second transverse pixel point number and a second longitudinal pixel point number;
in step S3021, the adjusting, by the sharing receiving terminal, the stacking relationship between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is displayed in the rendering window without being scaled according to the resolution of the screen frame image, includes:
step S30211: when the number of the second transverse pixel points is larger than or equal to that of the first transverse pixel points and the number of the second longitudinal pixel points is larger than or equal to that of the first longitudinal pixel points, the screen frame image is overlapped on the rendering window, so that the screen frame image is displayed in the rendering window;
Step S30212: when the number of the second transverse pixel points is less than the number of the first transverse pixel points and/or the number of the second longitudinal pixel points is less than the number of the first longitudinal pixel points, the rendering window is slidably overlaid on the screen frame image, and the screen frame image is partially displayed in the rendering window; in response to the screen frame image being slid within the rendering window, the corresponding position of the slid screen frame image is displayed within the rendering window.
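The two branches above reduce to a single size comparison: either the whole frame fits inside the window and is overlaid on it, or the window acts as a sliding viewport over the larger frame. A minimal Python sketch (the function and mode names are illustrative assumptions):

```python
def stacking_mode(img_w, img_h, win_w, win_h):
    """Decide the stacking relationship between the screen frame image
    and the rendering window so that the frame is never scaled."""
    if win_w >= img_w and win_h >= img_h:
        # step S30211: the whole frame fits, overlay it on the window
        return "image_on_window"
    # step S30212: the window becomes a sliding viewport over the frame
    return "window_on_image"
```

Note that exceeding the window in either dimension alone is enough to trigger the sliding-viewport mode, matching the "and/or" condition of step S30212.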
According to the embodiment of the application, the screen frame image can be completely and clearly seen in the rendering window by adjusting the stacking relation between the screen frame image and the rendering window, and the detail loss caused by zooming the screen frame image is prevented.
The present embodiment describes the screen sharing method from the sharing receiving terminal side. For specific implementation manners, reference may be made to the description of the steps executed by the sharing receiving terminal in the first embodiment, which is not repeated here.
Referring to fig. 18, fig. 18 is a schematic block diagram of a screen sharing system according to a fourth embodiment of the present application, in which the screen sharing system 400 includes a sharing publishing terminal 401 and a sharing receiving terminal 402.
The sharing and issuing terminal 401 responds to the trigger operation of screen sharing, and acquires a screen frame image corresponding to a screen area to be shared; creating a canvas with a fixed resolution, pasting the screen frame image on the canvas, and obtaining identification information of the screen frame image on the canvas; coding the canvas and the identification information according to a coding mode corresponding to the resolution of the canvas to obtain video coding data to be shared; sending the video coding data to at least one sharing receiving terminal; wherein the identification information comprises location information of the screen frame image in the canvas;
the sharing receiving terminal 402 receives the video coded data, and decodes the video coded data in a decoding manner corresponding to the resolution of the canvas to obtain the canvas and the identification information; and obtaining the screen frame image from the canvas according to the identification information, and displaying the screen frame image.
The present embodiment describes the screen sharing method from the perspective of the system. For specific implementation manners, reference may be made to the relevant descriptions of the steps of the screen sharing method in the first embodiment, which are not repeated here.
Referring to fig. 19, fig. 19 is a schematic block diagram of a screen sharing device according to a fifth embodiment of the present application, in which the screen sharing device 500 is applied to a sharing publishing terminal, and the screen sharing device 500 includes:
a screen frame image obtaining module 501, configured to respond to a trigger operation of screen sharing, and obtain a screen frame image corresponding to a screen area to be shared;
an identification information obtaining module 502, configured to create a canvas with a fixed resolution, paste the screen frame image onto the canvas, and obtain identification information of the screen frame image on the canvas; wherein the identification information comprises location information of the screen frame image in the canvas;
the coded data obtaining module 503 is configured to code the canvas and the identification information according to a coding mode corresponding to a resolution of the canvas to obtain video coded data to be shared;
a data sending module 504, configured to send the video encoded data to at least one sharing receiving terminal, so that the sharing receiving terminal decodes the video encoded data according to a decoding manner corresponding to the canvas resolution, and obtains a screen frame image.
It should be noted that when the screen sharing apparatus provided in the foregoing embodiment executes the screen sharing method, the division into the above functional modules is merely illustrative; in practical applications, the functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the screen sharing apparatus provided in the above embodiment and the method executed by the sharing publishing terminal in the screen sharing method of the first embodiment belong to the same concept; details of the implementation process are given in the method embodiment and are not repeated here.
Specifically, referring to fig. 20, fig. 20 is a schematic block diagram of a screen sharing device according to a sixth embodiment of the present application, where the screen sharing device may be implemented as all or a part of a computer device through software, hardware, or a combination of the two. The screen sharing apparatus 600 is applied to a sharing receiving terminal, and the screen sharing apparatus 600 includes:
a data receiving module 601, configured to receive video coded data; wherein the video encoding data comprises a canvas and identification information of a screen frame image on the canvas; the canvas resolution is fixed, and the identification information comprises position information of the screen frame image in the canvas;
A decoding module 602, configured to decode the video encoded data according to a decoding manner corresponding to the canvas resolution, to obtain the canvas and the identification information of the screen frame image on the canvas;
a screen frame obtaining module 603, configured to obtain a screen frame image from the canvas according to the identification information;
a display module 604, configured to display the screen frame image.
It should be noted that when the screen sharing apparatus provided in the foregoing embodiment executes the screen sharing method, the division into the above functional modules is merely illustrative; in practical applications, the functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the screen sharing apparatus provided in the above embodiment and the method executed by the sharing receiving terminal in the screen sharing method of the first embodiment belong to the same concept; details of the implementation process are given in the method embodiments and are not repeated here.
The screen sharing apparatuses of the fifth and sixth embodiments of the present application may be applied to computer equipment, and the apparatus embodiments may be implemented by software, by hardware, or by a combination of hardware and software. Taking a software implementation as an example, as a logical device, the apparatus is formed by the processor of the computer device in which it resides reading the corresponding computer program instructions from the nonvolatile memory into the memory for execution. From a hardware perspective, the computer device may include a processor, a network interface, a memory, and a nonvolatile memory, which are connected to each other via a data bus or in other known manners.
Please refer to fig. 21, which is a schematic structural diagram of a computer apparatus according to a seventh embodiment of the present application. As shown in fig. 21, the computer device 700 may include: a processor 701, a network interface 702, a memory 703, and a nonvolatile memory 704, connected to each other via a data bus 705. In addition to the processor 701, the network interface 702, the memory 703, and the nonvolatile memory 704 shown in fig. 21, the computer device described in this application may also include other hardware, which is not described again. The memory 703 or the nonvolatile memory 704 stores a computer program, such as the screen sharing method; the processor 701 implements the steps in the first to third embodiments described above when executing the computer program. The computer device also serves as a carrier for the screen sharing apparatuses of the fifth and sixth embodiments.
The processor 701 may include one or more processing cores. The processor 701 connects various parts of the computer device 700 using various interfaces and lines, and performs the various functions of the computer device 700 and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory 703 or the nonvolatile memory 704 and by calling data in the memory 703 or the nonvolatile memory 704. Optionally, the processor 701 may be implemented in at least one hardware form of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), or Programmable Logic Array (PLA). The processor 701 may integrate one or a combination of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is used for rendering and drawing the content to be displayed by the touch display screen; and the modem is used to handle wireless communications. It is understood that the modem may also not be integrated into the processor 701 and may instead be implemented by a single chip.
The memory 703 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). The memory 703 may be used to store instructions, programs, code sets, or instruction sets. The memory 703 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as touch instructions), instructions for implementing the various method embodiments described above, and the like; the data storage area may store the data referred to in the above method embodiments. The memory 703 may alternatively be at least one storage device located remotely from the processor 701.
An eighth embodiment of the present application further provides a computer storage medium, where the computer storage medium may store a plurality of instructions, and the instructions are suitable for being loaded by a processor and executing the method steps in the foregoing embodiments, and a specific execution process may refer to specific descriptions of the foregoing embodiments and is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the apparatus/terminal device embodiments described above are merely illustrative; the division of the modules or units is merely a logical function division, and there may be other division manners in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be in an electrical, mechanical, or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow in the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium and used by a processor to implement the steps of the above-described embodiments of the method. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc.
The present invention is not limited to the above-described embodiments, and various modifications and variations of the present invention are intended to be included within the scope of the claims and the equivalent technology of the present invention if they do not depart from the spirit and scope of the present invention.
Claims (18)
1. A screen sharing method is characterized by comprising the following steps:
the method comprises the steps that a sharing and issuing terminal responds to a trigger operation of screen sharing to obtain a screen frame image corresponding to a screen area to be shared; creating a canvas with a fixed resolution, pasting the screen frame image on the canvas, and obtaining identification information of the screen frame image on the canvas; coding the canvas and the identification information according to a coding mode corresponding to the resolution of the canvas to obtain video coding data to be shared; sending the video coding data to at least one sharing receiving terminal; wherein the identification information comprises location information of the screen frame image in the canvas;
the sharing receiving terminal receives the video coded data and decodes the video coded data in a decoding mode corresponding to the resolution of the canvas to obtain the canvas and the identification information; and obtaining the screen frame image from the canvas according to the identification information, and displaying the screen frame image.
2. The screen sharing method according to claim 1, wherein:
and the resolution of the canvas is the resolution corresponding to the full-screen desktop of the sharing publishing terminal.
3. The screen sharing method of claim 1,
the sharing and publishing terminal encodes the canvas and the identification information according to an encoding mode corresponding to the resolution of the canvas to obtain video encoding data to be shared, and the method comprises the following steps:
encoding the canvas using a fixed encoding resolution to obtain first encoded data; wherein the encoding resolution is the same as the resolution of the canvas;
coding the identification information in an SEI mode to obtain second coded data;
and packaging the first coded data and the second coded data to obtain the video coded data to be shared.
4. The screen sharing method according to claim 1, wherein:
the position information includes a resolution of the screen frame image;
the sharing receiving terminal displays the screen frame image, and the method comprises the following steps:
and creating a rendering window, and adjusting the stacking relation between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is displayed in the rendering window without scaling according to the resolution of the screen frame image.
5. The screen sharing method of claim 4, wherein:
the resolution of the screen frame image comprises a first transverse pixel point number and a first longitudinal pixel point number;
the resolution of the rendering window comprises a second transverse pixel point number and a second longitudinal pixel point number;
the sharing receiving terminal adjusts the stacking relation between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is displayed in the rendering window without scaling according to the resolution of the screen frame image, and the method comprises the following steps:
when the number of the second transverse pixel points is larger than or equal to that of the first transverse pixel points and the number of the second longitudinal pixel points is larger than or equal to that of the first longitudinal pixel points, the screen frame image is overlapped on the rendering window, so that the screen frame image is displayed in the rendering window;
when the number of the second transverse pixel points is less than that of the first transverse pixel points and/or the number of the second longitudinal pixel points is less than that of the first longitudinal pixel points, the rendering window is overlapped on the screen frame image in a sliding mode, and the screen frame image is partially displayed in the rendering window; displaying, within the rendering window, a corresponding position of the slid screen frame image when responding to sliding of the screen frame image within the rendering window.
6. The screen sharing method of claim 5, wherein:
the sharing receiving terminal is provided with a watching mode selection control; the viewing mode selection control comprises a first mode control; the first mode control is to instruct display of the screen frame image without scaling according to a resolution of the screen frame image;
the sharing receiving terminal adjusts the stacking relationship between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is not zoomed according to the resolution of the screen frame image before being displayed in the rendering window, and the sharing receiving terminal further comprises: and receiving the triggering operation of the first mode control.
7. The screen sharing method of claim 6, wherein:
the viewing mode selection control further comprises a second mode control for instructing enlarged display of the screen frame image;
the screen sharing method further includes: the sharing receiving terminal receives the triggering operation of the second mode control, overlaps the screen frame image on the rendering window, keeps the proportion of the first horizontal pixel point and the first longitudinal pixel point unchanged, and enlarges the screen frame image according to a first preset scaling factor to display the screen frame image in the rendering window in an enlarged manner;
And/or the viewing mode selection control further comprises a third mode control for instructing reduced display of the screen frame image;
the screen sharing method further includes: the sharing receiving terminal receives the triggering operation of the third mode control, overlaps the screen frame image on the rendering window, keeps the proportion of the first horizontal pixel point and the first longitudinal pixel point unchanged, reduces the screen frame image according to a second preset scaling coefficient, and reduces and displays the screen frame image in the rendering window;
and/or the viewing mode selection control further comprises a fourth mode control for instructing to display the screen frame image at a preset scale;
the screen sharing method further includes: and the sharing receiving terminal receives the triggering operation of the fourth mode control, stacks the screen frame image on the rendering window, keeps the ratio of the number of the first horizontal pixels to the number of the first longitudinal pixels unchanged, reduces or enlarges the screen frame image according to a third preset scaling factor, and displays the screen frame image in the rendering window according to a preset ratio.
8. The screen sharing method of claim 6, wherein:
the viewing mode selection control further comprises a fifth mode control; the fifth mode control is to instruct display of the screen frame image at a resolution that matches the rendering window;
the screen sharing method further includes: the sharing receiving terminal receives the triggering operation of the fifth mode control, stacks the screen frame image on the rendering window, keeps the proportion of the first horizontal pixel points and the first longitudinal pixel points unchanged, and enlarges or reduces the screen frame image to display the screen frame image in the rendering window at a resolution matched with that of the rendering window.
9. The screen sharing method according to any one of claims 4 to 8,
the position information further comprises coordinates of preset points of the screen frame image in the canvas;
the sharing receiving terminal obtains the screen frame image from the canvas according to the identification information, and the method comprises the following steps:
and intercepting the screen frame image from the canvas according to the coordinates of the preset points of the screen frame image in the canvas and the resolution of the screen frame image.
10. A screen sharing method is applied to a sharing and publishing terminal and is characterized by comprising the following steps:
responding to the trigger operation of screen sharing, and acquiring a screen frame image corresponding to a screen area to be shared;
creating a canvas with a fixed resolution, pasting the screen frame image on the canvas, and obtaining identification information of the screen frame image on the canvas; wherein the identification information comprises location information of the screen frame image in the canvas;
coding the canvas and the identification information according to a coding mode corresponding to the resolution of the canvas to obtain video coding data to be shared;
and sending the video coded data to at least one sharing receiving terminal, so that the sharing receiving terminal decodes the video coded data according to a decoding mode corresponding to the canvas resolution, and a screen frame image is obtained.
11. A screen sharing method is applied to a sharing receiving terminal and is characterized by comprising the following steps:
receiving video coding data; wherein the video encoding data comprises a canvas and identification information of a screen frame image on the canvas; the canvas resolution is fixed, and the identification information comprises position information of the screen frame image in the canvas;
Decoding the video coded data according to a decoding mode corresponding to the canvas resolution to obtain the canvas and the identification information of the screen frame image on the canvas;
acquiring a screen frame image from the canvas according to the identification information;
and displaying the screen frame image.
12. The screen sharing method of claim 11, wherein:
the position information includes a resolution of the screen frame image;
the displaying the screen frame image includes:
and creating a rendering window, and adjusting the stacking relation between the screen frame image and the rendering window according to the resolution of the screen frame image and the resolution of the rendering window, so that the screen frame image is displayed in the rendering window without scaling according to the resolution of the screen frame image.
13. The screen sharing method of claim 12, wherein:
the resolution of the screen frame image comprises a first transverse pixel point number and a first longitudinal pixel point number;
the resolution of the rendering window comprises a second transverse pixel point number and a second longitudinal pixel point number;
the adjusting, according to the resolution of the screen frame image and the resolution of the rendering window, the stacking relationship between the screen frame image and the rendering window to enable the screen frame image to be displayed in the rendering window without scaling according to the resolution of the screen frame image includes:
when the number of the second transverse pixel points is greater than or equal to the number of the first transverse pixel points and the number of the second longitudinal pixel points is greater than or equal to the number of the first longitudinal pixel points, superimposing the screen frame image on the rendering window, so that the screen frame image is displayed in the rendering window;
when the number of the second transverse pixel points is less than the number of the first transverse pixel points and/or the number of the second longitudinal pixel points is less than the number of the first longitudinal pixel points, slidably superimposing the rendering window on the screen frame image, so that the screen frame image is partially displayed in the rendering window; and in response to sliding of the screen frame image within the rendering window, displaying the corresponding position of the slid screen frame image within the rendering window.
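The stacking rule of this claim reduces to a pair of resolution comparisons: if the rendering window is at least as large as the frame in both dimensions, the whole frame is shown unscaled; otherwise a slidable window exposes part of the frame. A hedged sketch of that decision logic (the function names, mode strings, and clamping of the slide offset are illustrative assumptions):

```python
# Sketch of the claim-13 stacking rule: compare frame resolution against
# rendering-window resolution and pick which element is overlaid on which.

def stacking_mode(frame_w, frame_h, window_w, window_h):
    if window_w >= frame_w and window_h >= frame_h:
        return "frame-on-window"  # whole frame shown in the window, no scaling
    return "window-on-frame"      # window slides over the frame, partial view

def visible_region(frame_w, frame_h, window_w, window_h, slide_x=0, slide_y=0):
    """Unscaled frame region visible after sliding by (slide_x, slide_y),
    clamped so the window never leaves the frame."""
    x0 = max(0, min(slide_x, frame_w - window_w)) if frame_w > window_w else 0
    y0 = max(0, min(slide_y, frame_h - window_h)) if frame_h > window_h else 0
    return (x0, y0, min(frame_w, window_w), min(frame_h, window_h))

print(stacking_mode(1280, 720, 1920, 1080))   # frame-on-window
print(stacking_mode(2560, 1440, 1920, 1080))  # window-on-frame
print(visible_region(2560, 1440, 1920, 1080, slide_x=400, slide_y=200))
# (400, 200, 1920, 1080)
```

Either branch displays the frame at its native resolution, which is what lets the receiving terminal avoid the blurring that scaling a mismatched frame would introduce.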
14. A screen sharing system, comprising a sharing publishing terminal and a sharing receiving terminal, characterized in that:
the sharing publishing terminal acquires, in response to a trigger operation for screen sharing, a screen frame image corresponding to a screen area to be shared; creates a canvas with a fixed resolution, pastes the screen frame image on the canvas, and obtains identification information of the screen frame image on the canvas; encodes the canvas and the identification information in an encoding mode corresponding to the resolution of the canvas to obtain video encoded data to be shared; and sends the video encoded data to at least one sharing receiving terminal; wherein the identification information comprises position information of the screen frame image in the canvas;
the sharing receiving terminal receives the video encoded data and decodes the video encoded data in a decoding mode corresponding to the resolution of the canvas to obtain the canvas and the identification information; and obtains the screen frame image from the canvas according to the identification information and displays the screen frame image.
15. A screen sharing device, applied to a sharing publishing terminal, characterized in that the device comprises:
the screen frame image acquisition module is used for responding to the trigger operation of screen sharing and acquiring a screen frame image corresponding to a screen area to be shared;
the identification information acquisition module is used for creating a canvas with fixed resolution, pasting the screen frame image on the canvas and acquiring identification information of the screen frame image on the canvas; wherein the identification information comprises location information of the screen frame image in the canvas;
the encoded data acquisition module is used for encoding the canvas and the identification information in an encoding mode corresponding to the resolution of the canvas to obtain video encoded data to be shared;
and the data sending module is used for sending the video encoded data to at least one sharing receiving terminal, so that the sharing receiving terminal decodes the video encoded data in a decoding mode corresponding to the canvas resolution to obtain the screen frame image.
16. A screen sharing device, applied to a sharing receiving terminal, characterized in that the device comprises:
the data receiving module is used for receiving video encoded data; wherein the video encoded data comprises a canvas and identification information of a screen frame image on the canvas; the resolution of the canvas is fixed, and the identification information comprises position information of the screen frame image in the canvas;
the decoding module is used for decoding the video encoded data in a decoding mode corresponding to the canvas resolution to obtain the canvas and the identification information of the screen frame image on the canvas;
the screen frame acquisition module is used for acquiring a screen frame image from the canvas according to the identification information;
and the display module is used for displaying the screen frame image.
17. A computer device, comprising a processor and a memory; characterized in that the memory stores a computer program adapted to be loaded by the processor to execute the screen sharing method according to any one of claims 1 to 13.
18. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the screen sharing method according to any one of claims 1 to 13.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110849522.9A CN113596571B (en) | 2021-07-27 | 2021-07-27 | Screen sharing method, device, system, storage medium and computer equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113596571A true CN113596571A (en) | 2021-11-02 |
CN113596571B CN113596571B (en) | 2024-03-12 |
Family
ID=78250298
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110849522.9A Active CN113596571B (en) | 2021-07-27 | 2021-07-27 | Screen sharing method, device, system, storage medium and computer equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113596571B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114356263A (en) * | 2021-12-29 | 2022-04-15 | 威创集团股份有限公司 | Bar screen information display method, bar screen information display device, bar screen information display equipment and readable storage medium |
CN115209117A (en) * | 2022-07-20 | 2022-10-18 | 北京字跳网络技术有限公司 | Screen projection method and device |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130169644A1 (en) * | 2011-08-11 | 2013-07-04 | Dexdyne Limited | Apparatus and Method for Displaying Telemetry Data |
US9451197B1 (en) * | 2010-04-12 | 2016-09-20 | UV Networks, Inc. | Cloud-based system using video compression for interactive applications |
CN110770785A (en) * | 2017-06-29 | 2020-02-07 | 皇家Kpn公司 | Screen sharing for display in VR |
CN110806846A (en) * | 2019-10-11 | 2020-02-18 | 北京字节跳动网络技术有限公司 | Screen sharing method, screen sharing device, mobile terminal and storage medium |
CN110852946A (en) * | 2019-10-30 | 2020-02-28 | 北京字节跳动网络技术有限公司 | Picture display method and device and electronic equipment |
CN111694603A (en) * | 2019-03-12 | 2020-09-22 | 腾讯科技(深圳)有限公司 | Screen sharing method and device, computer equipment and storage medium |
CN112367521A (en) * | 2020-10-27 | 2021-02-12 | 广州华多网络科技有限公司 | Display screen content sharing method and device, computer equipment and storage medium |
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9451197B1 (en) * | 2010-04-12 | 2016-09-20 | UV Networks, Inc. | Cloud-based system using video compression for interactive applications |
US20130169644A1 (en) * | 2011-08-11 | 2013-07-04 | Dexdyne Limited | Apparatus and Method for Displaying Telemetry Data |
CN110770785A (en) * | 2017-06-29 | 2020-02-07 | 皇家Kpn公司 | Screen sharing for display in VR |
US20200401362A1 (en) * | 2017-06-29 | 2020-12-24 | Koninklijke Kpn N.V. | Screen sharing for display in vr |
CN111694603A (en) * | 2019-03-12 | 2020-09-22 | 腾讯科技(深圳)有限公司 | Screen sharing method and device, computer equipment and storage medium |
CN110806846A (en) * | 2019-10-11 | 2020-02-18 | 北京字节跳动网络技术有限公司 | Screen sharing method, screen sharing device, mobile terminal and storage medium |
CN110852946A (en) * | 2019-10-30 | 2020-02-28 | 北京字节跳动网络技术有限公司 | Picture display method and device and electronic equipment |
CN112367521A (en) * | 2020-10-27 | 2021-02-12 | 广州华多网络科技有限公司 | Display screen content sharing method and device, computer equipment and storage medium |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114356263A (en) * | 2021-12-29 | 2022-04-15 | 威创集团股份有限公司 | Bar screen information display method, bar screen information display device, bar screen information display equipment and readable storage medium |
CN115209117A (en) * | 2022-07-20 | 2022-10-18 | 北京字跳网络技术有限公司 | Screen projection method and device |
CN115209117B (en) * | 2022-07-20 | 2024-06-18 | 北京字跳网络技术有限公司 | Screen projection method and device |
Also Published As
Publication number | Publication date |
---|---|
CN113596571B (en) | 2024-03-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107168674B (en) | Screen casting annotation method and system | |
US9723359B2 (en) | Low latency wireless display for graphics | |
CN112235626B (en) | Video rendering method and device, electronic equipment and storage medium | |
CN108108140B (en) | Multi-screen cooperative display method, storage device and equipment supporting 3D display | |
US20090257730A1 (en) | Video server, video client device and video processing method thereof | |
EP4231650A1 (en) | Picture display method and apparatus, and electronic device | |
CN110874959A (en) | Multi-terminal same-screen teaching system and teaching method | |
CN112261434A (en) | Interface layout control and processing method and corresponding device, equipment and medium | |
CN103974007A (en) | Superposition method and device of on-screen display (OSD) information | |
CN113596571B (en) | Screen sharing method, device, system, storage medium and computer equipment | |
CN107331222B (en) | Image data processing method and device | |
CN114710633A (en) | Display apparatus and video signal display method | |
EP3764216B1 (en) | Display device and control method thereof | |
CN112866784A (en) | Large-screen local playback control method, control system, equipment and storage medium | |
CN112583821A (en) | Display method, display system, electronic device, and computer-readable storage medium | |
CN114374853B (en) | Content display method, device, computer equipment and storage medium | |
CN102770827A (en) | Method for displaying multimedia content on a screen of a terminal | |
CN110944140A (en) | Remote display method, remote display system, electronic device and storage medium | |
CN110990109A (en) | Spliced screen redisplay method, terminal, system and storage medium | |
CN114095772B (en) | Virtual object display method, system and computer equipment under continuous wheat direct sowing | |
CN116248889A (en) | Image encoding and decoding method and device and electronic equipment | |
CN114630184A (en) | Video rendering method, apparatus, device, and computer-readable storage medium | |
CN115396717B (en) | Display device and display image quality adjusting method | |
CN119629491A (en) | Image optimization method, device, equipment and storage medium | |
CN115756709A (en) | Remote control method, device and equipment of display screen input source and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||