
CN118689427A - Multi-screen collaborative control method, electronic device and system - Google Patents

Multi-screen collaborative control method, electronic device and system

Info

Publication number
CN118689427A
CN118689427A (application CN202310312265.4A)
Authority
CN
China
Prior art keywords
application
electronic device
event
window
control information
Prior art date
Legal status
Pending
Application number
CN202310312265.4A
Other languages
Chinese (zh)
Inventor
赵笑天
倪维江
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority application: CN202310312265.4A
PCT application: PCT/CN2024/078622 (published as WO2024193301A1)
Publication: CN118689427A
Legal status: Pending


Classifications

    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1423: Controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/1431: Controlling a plurality of local displays using a single graphics controller
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations
    • G06F 3/0486: Drag-and-drop
    • G06F 3/0488: GUI interaction using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract


The present application discloses a multi-screen collaborative control method, electronic device, and system in the field of terminal technology. The method enables simultaneous operation input from multiple terminals, solving the problem that only one terminal's input can be handled in a multi-screen collaboration scenario. A first electronic device displays a first window containing an application interface of a first application, and a second electronic device displays a second window containing the same application interface as projected by the first electronic device. The first electronic device detects a first operation acting on the first window and receives first control information from the second electronic device, where the first control information is generated by the second electronic device from a second operation acting on the second window. Through a first program corresponding to the first application, the first electronic device performs preset processing on the first operation and the first control information to obtain a first event. In response to the first event, the first electronic device executes, on the first application, the operation instruction corresponding to that event.

Description

Multi-screen collaborative control method, electronic device and system
Technical Field
Embodiments of this application relate to the field of terminal technology, and in particular to a multi-screen collaborative control method, electronic device, and system.
Background
With the development of electronic devices, a user may own multiple devices, such as a mobile phone and a personal computer (PC), at the same time, and data can be shared among them.
To make data sharing between different electronic devices more convenient, one possible approach is multi-screen collaboration. Taking collaboration between a mobile phone and a PC as an example: after the phone and the PC are connected in a wired or wireless manner, a window mirroring the phone's interface, which may be called a phone window, can be displayed on the PC's screen. During collaborative operation, the phone's operation interface is shown on the PC's display through this phone window, and the user can operate applications installed on the phone from within the phone window using the PC's keyboard, mouse, or touch screen. The PC may also run several applications installed on the phone in multiple independent windows simultaneously, that is, different application interfaces are displayed through multiple phone windows.
At present, during multi-screen collaboration a user can use an application on the phone either by operating the PC or by operating the phone directly, but not both at once: when operation inputs arrive from the PC and the phone at the same time, the application cannot serve them simultaneously. For example, if the phone starts receiving input before an input from the PC has finished, the ongoing input event may be preempted or interrupted. Simultaneous multi-terminal input therefore cannot be achieved, which degrades the user experience.
Disclosure of Invention
This application provides a multi-screen collaborative control method, electronic device, and system that enable simultaneous operation input from multiple terminals, solving the problem that only one terminal's input can be handled in a multi-screen collaboration scenario.
To achieve the above purpose, the embodiments of this application adopt the following technical solutions:
In a first aspect, a multi-screen collaborative control method is provided, applied to a first electronic device with a multi-screen collaboration function. The method includes: displaying a first window, where the first window includes an application interface of a first application; detecting a first operation acting on the first window, and receiving first control information from a second electronic device, where the first control information is generated by the second electronic device according to a second operation acting on a second window, and the second window includes the application interface of the first application as projected by the first electronic device; performing preset processing on the first operation and the first control information through a first program corresponding to the first application to obtain a first event, where the preset processing instructs the first electronic device to respond to both the first operation and the first control information; and, in response to the first event, executing on the first application the operation instruction corresponding to the first event.
That is, in a multi-screen collaboration scenario, when one user operates an application through the first electronic device while another operates it through the second electronic device, the first electronic device is triggered to perform preset processing on all inputs to that application through a specific program (such as a service, thread, or process) corresponding to the application, ensuring that the first electronic device can respond to inputs generated simultaneously. Simultaneous multi-terminal input is thus realized, solving the problem that only one terminal's input can be handled in a multi-screen collaboration scenario. This meets users' needs for simultaneous multi-terminal touch control, for example when several users cooperate in a game, supplement each other's file materials, or annotate a document together.
In one possible implementation, the method further includes: when the application interface of the first application is projected to the second window of the second electronic device for display, creating the first program corresponding to the first application. Thus, when the first electronic device casts the interface of a launched application to the second electronic device for display, the specific program for that application is created accordingly.
In one possible implementation, before the first program corresponding to the first application is created, the method further includes: detecting that the first application supports a multi-touch mode, i.e., a mode that allows multiple operation events to be input simultaneously. In this way, a corresponding specific program is created only for applications that support the multi-touch mode.
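As a minimal sketch, the decision to create a per-application merging program at projection time might look as follows. `InputMerger`, `on_project`, and the `supports_multi_touch` flag are illustrative names only, not the patent's actual implementation:

```python
class InputMerger:
    """Per-application program that pre-processes concurrent inputs
    to one projected application (hypothetical type for illustration)."""
    def __init__(self, app_name: str):
        self.app_name = app_name


def on_project(app_name: str, supports_multi_touch: bool):
    """Called when app_name is cast to the second device's window.
    A merging program is created only for apps that declare support
    for a multi-touch mode; other apps keep the default input path."""
    if supports_multi_touch:
        return InputMerger(app_name)  # create the per-app program
    return None                       # fall back to single-input handling


merger = on_project("DocumentApp", supports_multi_touch=True)
legacy = on_project("LegacyApp", supports_multi_touch=False)
```

An application that does not declare multi-touch support simply keeps the existing single-input flow, with no merging program attached.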
In one possible implementation, the first operation is a touch operation corresponding to a first operation position in the first application, the first control information corresponds to a touch operation at a second operation position in the first application, and the first event includes the first operation acting on the first operation position and the first control information acting on the second operation position. In this way, when the position operated through the first electronic device differs from the position operated through the second electronic device within the same application, the specific program treats the two operations as two concurrent operation events (equivalent to multi-touch).
In one possible implementation, executing, in response to the first event, the operation instruction corresponding to the first event on the first application includes: in response to the first operation acting on the first operation position and the first control information acting on the second operation position, displaying the updated application interface of the first application in the first window, and projecting the updated interface to the second window of the second electronic device for display. After the specific program processes the two-terminal operations as two concurrent operation events, the first application can respond to both operations at the same time and display the corresponding updated interface in its application window.
For example, when a mobile phone collaboratively displays its document application on a notebook computer, if user 1 uses the phone to insert annotation 1 at document position 1 while user 2 uses the notebook to insert annotation 2 at document position 2 in the mirrored document window, the phone's document application can respond to both insertion operations at once: the phone displays annotation 1 at position 1 and annotation 2 at position 2 in the document interface, and the same annotations are shown synchronously in the mirrored document window on the notebook. Both user 1 and user 2 experience their own inputs being responded to.
In one possible implementation, when the first operation and the first control information are the same touch operation acting on the same operation position in the first application, the first event includes either the first operation or the first control information acting on that position. Thus, when identical operations (same operation position and same operation type) are performed through the first and second electronic devices, the specific program keeps only one of them (equivalent to deduplicating repeated input at one position).
In one possible implementation, executing, in response to the first event, the operation instruction corresponding to the first event on the first application includes: in response to the first operation or the first control information acting on the same operation position, displaying the updated application interface of the first application in the first window, and projecting the updated interface to the second window of the second electronic device for display. After the specific program retains one of the two operations, the first application responds to the retained operation and displays the corresponding updated interface in its application window.
For example, when a mobile phone collaboratively displays its game application on a notebook computer, if user 1 taps control position 1 on the phone while user 2 taps the same control position 1 in the mirrored game window on the notebook, the phone's game application responds to only one of the two taps; the phone displays the updated interface, which is shown synchronously in the mirrored game window on the notebook. Again, both users experience their own inputs being responded to.
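The two merging rules above (concurrent touch points at different positions, deduplication of identical input at one position) can be sketched as follows. `Touch` and `preset_process` are hypothetical names used only for illustration; the source does not specify data structures:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Touch:
    x: int       # operation position
    y: int
    action: str  # operation type, e.g. "tap" or "annotate"


def preset_process(local: Touch, remote: Touch) -> list[Touch]:
    """Merge one local and one remote operation into a single event.
    Different positions: keep both as concurrent touch points
    (the app sees multi-touch). Identical operation at the same
    position: keep just one copy."""
    if local == remote:
        return [local]
    return [local, remote]


# Two annotations at different document positions: both survive.
event_a = preset_process(Touch(1, 1, "annotate"), Touch(9, 9, "annotate"))
# Two identical taps on the same game control: collapsed to one.
event_b = preset_process(Touch(5, 5, "tap"), Touch(5, 5, "tap"))
```

Here `event_a` carries two concurrent touch points and `event_b` a single one, matching the two examples in the text.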
In one possible implementation, the first operation is a touch operation corresponding to a third operation position in the first application together with first pressure information, and the first control information includes a touch operation corresponding to the third operation position together with second pressure information. The first event is a new touch operation acting on the third operation position with third pressure information, where the third pressure information is obtained by superposing the first and second pressure information. Thus, when pressure-carrying operations arrive at the same position in an application through both devices, the specific program handles them as one new operation at that position carrying the superposed pressure (equivalent to a new operation with new pressure information).
In one possible implementation, the first operation is a touch operation corresponding to a third operation position in the first application together with first color information, and the first control information includes a touch operation corresponding to the third operation position together with second color information. The first event is a new touch operation acting on the third operation position with third color information, where the third color information is obtained by fusing the first and second color information. Thus, when color-carrying operations arrive at the same position in an application through both devices, the specific program handles them as one new operation at that position carrying the fused color (equivalent to a new operation with new color information).
In one possible implementation, executing, in response to the first event, the operation instruction corresponding to the first event on the first application includes: in response to the first event acting on the third operation position, displaying the updated application interface of the first application in the first window, and projecting the updated interface to the second window of the second electronic device for display. After the specific program merges the two-terminal operations into one new operation whose carried information is superposed, the first application responds to that new operation and displays the corresponding updated interface in its application window.
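A hedged sketch of the superposition and fusion rules, assuming normalized pressure values (0.0 to 1.0) and 8-bit RGB colors; the patent specifies neither representation, and `Stylus` and `fuse_same_position` are illustrative names:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Stylus:
    x: int
    y: int
    pressure: float               # assumed normalized to 0.0 - 1.0
    rgb: tuple[int, int, int]     # assumed 8-bit RGB


def fuse_same_position(local: Stylus, remote: Stylus) -> Stylus:
    """Two operations at the same position become one new touch:
    pressures are superposed (capped at 1.0 here, an assumed policy)
    and colors fused by channel-wise averaging (also assumed)."""
    assert (local.x, local.y) == (remote.x, remote.y)
    pressure = min(1.0, local.pressure + remote.pressure)
    rgb = tuple((a + b) // 2 for a, b in zip(local.rgb, remote.rgb))
    return Stylus(local.x, local.y, pressure, rgb)


fused = fuse_same_position(
    Stylus(3, 3, 0.4, (255, 0, 0)),   # red stroke from the phone
    Stylus(3, 3, 0.3, (0, 0, 255)),   # blue stroke from the PC
)
# fused.pressure is approximately 0.7; fused.rgb == (127, 0, 127)
```

Capping and averaging are design choices of this sketch; the source only says the pressures are "superposed" and the colors "fused".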
In one possible implementation, the first application does not support the multi-touch mode, and after detecting the first operation on the first window and receiving the first control information from the second electronic device, the method further includes: selecting one of the first operation and the first control information as a target event; and, in response to the target event, executing on the first application the operation instruction corresponding to the target event. For an application that does not support the multi-touch mode, the first electronic device need not create a specific program; simultaneous multi-terminal inputs to such an application are still preempted at the application framework layer, so the first application responds to only one input.
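For an application without multi-touch support, the fallback reduces to choosing a single target event. The selection policy below (prefer the local operation) is an assumption for illustration, since the source does not specify how the target event is chosen:

```python
def pick_target_event(local_op: str, remote_op: str,
                      prefer_local: bool = True) -> str:
    """No merging program exists for this app: exactly one of the
    simultaneous inputs becomes the target event, the other is
    dropped (preemption at the framework layer). The preference
    flag is a hypothetical policy, not taken from the source."""
    return local_op if prefer_local else remote_op


target = pick_target_event("tap from phone", "tap from PC")
# target == "tap from phone" under the assumed prefer-local policy
```

Whichever input is chosen, the application then executes only the operation instruction corresponding to that single target event.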
In one possible implementation, the method further includes: detecting a third operation acting on the first window, and receiving second control information from the second electronic device, where the second control information is generated by the second electronic device according to a fourth operation acting on a third window, and the third window includes an application interface of a second application projected by the first electronic device; performing preset processing on the third operation through the first program corresponding to the first application to obtain a second event, where this preset processing instructs the first electronic device to respond to the third operation; performing preset processing on the second control information through a second program corresponding to the second application to obtain a third event, where this preset processing instructs the first electronic device to respond to the second control information; and, in response to the second and third events, executing on the first application the operation instruction corresponding to the second event and executing on the second application the operation instruction corresponding to the third event. When the first electronic device projects multiple applications to the second electronic device, it can create a corresponding specific program for each projected application, so the inputs each application receives are preset-processed separately and responses to simultaneous multi-way inputs can be executed accurately.
In one possible implementation, the second application does not support the multi-touch mode, and after detecting the third operation on the first window and receiving the second control information from the second electronic device, the method further includes: performing preset processing on the third operation through the first program corresponding to the first application to obtain the second event; and, in response to the second event and the second control information, executing on the first application the operation instruction corresponding to the second event and executing on the second application the operation instruction corresponding to the second control information. Thus, when the first electronic device projects multiple applications to the second electronic device, it can decide for each application whether to create a specific program according to whether that application supports the multi-touch mode: a specific program is created for a projected application that supports the multi-touch mode to preset-process its input, while input to a projected application that does not support it is handled by the existing flow.
In one possible implementation, the first program corresponds to a first queue, and performing preset processing on the first operation and the first control information through the first program to obtain the first event includes: storing the first operation and the first control information in the first queue; and performing preset processing on the first operation and the first control information in the first queue through the first program corresponding to the first application to obtain the first event. In this way, the first electronic device can store multi-terminal inputs from the same time period (treated as simultaneous) in the queue, so that the specific program can preset-process the inputs of each time period from the queue.
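The queue-based batching could be sketched as follows, assuming a fixed time window groups "simultaneous" inputs; the window length and the `MergeQueue` API are illustrative assumptions, not from the source:

```python
import queue


class MergeQueue:
    """Per-application queue: inputs arriving in the same time
    window are collected together, then handed to the app's
    merging program as one batch (the window policy is assumed)."""

    def __init__(self):
        self._q = queue.Queue()

    def put(self, op, timestamp_ms: int):
        """Enqueue an operation with its arrival timestamp."""
        self._q.put((timestamp_ms, op))

    def drain_window(self, start_ms: int, window_ms: int = 10) -> list:
        """Return queued ops whose timestamps fall in
        [start_ms, start_ms + window_ms); others stay queued."""
        batch, rest = [], []
        while not self._q.empty():
            ts, op = self._q.get()
            (batch if start_ms <= ts < start_ms + window_ms else rest).append((ts, op))
        for item in rest:          # re-queue ops outside the window
            self._q.put(item)
        return [op for _, op in sorted(batch)]


mq = MergeQueue()
mq.put("local tap", 100)
mq.put("remote tap", 105)
mq.put("later tap", 200)
batch = mq.drain_window(100)   # the 100 ms and 105 ms ops batch together
```

Each drained batch would then be passed to the per-application program for the merging rules described above.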
In a second aspect, a multi-screen collaborative control method is provided, applied to a multi-screen collaboration system including a first electronic device and a second electronic device. The method includes: the first electronic device displays a first window, where the first window includes an application interface of a first application; the second electronic device displays a second window, where the second window includes the application interface of the first application as projected by the first electronic device; the first electronic device detects a first operation acting on the first window and receives first control information from the second electronic device, where the first control information is generated by the second electronic device according to a second operation acting on the second window; the first electronic device performs preset processing on the first operation and the first control information through a first program corresponding to the first application to obtain a first event, where the preset processing instructs the first electronic device to respond to both the first operation and the first control information; and, in response to the first event, the first electronic device executes on the first application the operation instruction corresponding to the first event.
In one possible implementation manner, the multi-screen cooperative control method further includes: when the first electronic device projects an application interface of the first application to a second window of the second electronic device for display, the first electronic device creates a first program corresponding to the first application.
In one possible implementation manner, before the first program corresponding to the first application is created, the multi-screen cooperative control method further includes: the first electronic device detects that the first application supports a multi-touch mode.
In one possible implementation, the first operation is a touch operation corresponding to a first operation position in the first application, the first control information is a touch operation corresponding to a second operation position in the first application, and the first event includes the first operation acting on the first operation position and the first control information acting on the second operation position.
In one possible implementation, when the first operation and the first control information are the same touch operation acting on the same operation position in the first application, the first event includes the first operation or the first control information acting on the same operation position.
In one possible implementation, the first operation is a touch operation corresponding to a third operation position in the first application together with first pressure information, the first control information is a touch operation corresponding to the third operation position together with second pressure information, the first event is a new touch operation acting on the third operation position with third pressure information, and the third pressure information is obtained by superposing the first and second pressure information; and/or the first operation is a touch operation corresponding to the third operation position together with first color information, the first control information is a touch operation corresponding to the third operation position together with second color information, the first event is a new touch operation acting on the third operation position with third color information, and the third color information is obtained by fusing the first and second color information.
In one possible implementation, the method further includes: the second electronic device displays a third window, where the third window includes an application interface of a second application projected by the first electronic device; the first electronic device detects a third operation acting on the first window and receives second control information from the second electronic device, where the second control information is generated by the second electronic device according to a fourth operation acting on the third window; the first electronic device performs preset processing on the third operation through the first program corresponding to the first application to obtain a second event, where this preset processing instructs the first electronic device to respond to the third operation; the first electronic device performs preset processing on the second control information through a second program corresponding to the second application to obtain a third event, where this preset processing instructs the first electronic device to respond to the second control information; and, in response to the second and third events, the first electronic device executes on the first application the operation instruction corresponding to the second event and executes on the second application the operation instruction corresponding to the third event.
In a third aspect, an electronic device is provided, including a display module, a detection module, a processing module, and an execution module. The display module is configured to display a first window, where the first window includes an application interface of a first application. The detection module is configured to detect a first operation acting on the first window and receive first control information from a second electronic device, where the first control information is generated by the second electronic device according to a second operation acting on a second window, and the second window includes the application interface of the first application as projected by the first electronic device. The processing module is configured to perform preset processing on the first operation and the first control information through a first program corresponding to the first application to obtain a first event, where the preset processing instructs the first electronic device to respond to both the first operation and the first control information. The execution module is configured to, in response to the first event, execute on the first application the operation instruction corresponding to the first event.
In one possible implementation, the electronic device further includes a creation module for creating a first program corresponding to the first application when the application interface of the first application is projected to be displayed in the second window of the second electronic device.
In one possible implementation, the electronic device further includes a determining module configured to detect whether the first application supports the multi-touch mode before the first program corresponding to the first application is created. When it is detected that the multi-touch mode is supported, the creation module is run.
In one possible implementation manner, when the first operation is a touch operation corresponding to a first operation position in the first application and the first control information is a touch operation corresponding to a second operation position in the first application, the first event processed by the processing module includes the first operation acting on the first operation position and the first control information acting on the second operation position. At this time, the execution module may be configured to respond to the first operation acting on the first operation position and the first control information acting on the second operation position, display an application interface updated by the first application in the first window, and project the application interface updated by the first application to the second window of the second electronic device for display.
In one possible implementation manner, when the first operation and the first control information are the same touch operation applied to the same operation position in the first application, the first event processed by the processing module includes the first operation or the first control information applied to the same operation position. At this time, the execution module may be configured to respond to the first operation or the first control information acting on the same operation position, display the application interface updated by the first application in the first window, and project the application interface updated by the first application to the second window of the second electronic device for display.
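The two cases above can be illustrated with a minimal sketch. This is not the patent's implementation; the names `TouchOp` and `fuse_touch_inputs` are invented for illustration, and it simply assumes that inputs at different positions become one multi-touch event while an identical duplicated input is collapsed into a single-pointer event.

```python
# Hypothetical sketch of how a per-application "program" might fuse a local
# touch operation with control information received from a peer device.

from dataclasses import dataclass


@dataclass(frozen=True)
class TouchOp:
    action: str      # e.g. "down", "move", "up"
    position: tuple  # (x, y) in the application's coordinate space


def fuse_touch_inputs(local_op, remote_op):
    """Fuse a local operation and remote control information into one event.

    - Different positions: both pointers are kept, yielding a multi-touch
      event that the application handles as simultaneous input.
    - Same operation at the same position: the duplicate is dropped and a
      single-pointer event is injected.
    """
    if local_op == remote_op:
        return [local_op]             # deduplicate identical input
    return [local_op, remote_op]      # multi-touch: keep both pointers
```

Under these assumptions, the application receives one consistent event regardless of whether the two ends touched the same point or different points.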
In one possible implementation manner, the first operation is a touch operation corresponding to a third operation position and first pressure information in the first application, the first control information includes a touch operation corresponding to the third operation position and second pressure information in the first application, the first event obtained by the processing module is a new touch operation acting on the third operation position with third pressure information, and the third pressure information is pressure information obtained by superposing the first pressure information and the second pressure information;
and/or
the first operation is a touch operation corresponding to a third operation position and first color information in the first application, the first control information includes a touch operation corresponding to the third operation position and second color information in the first application, the first event obtained by the processing module is a new touch operation acting on the third operation position with third color information, and the third color information is color information obtained by fusing the first color information and the second color information.
At this time, the execution module may be configured to respond to the first event acting on the third operation position, display the application interface updated by the first application in the first window, and project the application interface updated by the first application to the second window of the second electronic device for display.
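The superposition and fusion described above can be sketched as follows. The patent does not specify the exact formulas, so the cap on superposed pressure and the per-channel color average are assumptions chosen only to make the idea concrete.

```python
# Illustrative sketch of combining two same-position touch inputs into one
# event. The cap value and the averaging rule are assumed, not from the patent.

def superpose_pressure(p1, p2, cap=1.0):
    """Third pressure information: superpose the two pressures, bounded by
    an assumed maximum reportable pressure."""
    return min(p1 + p2, cap)


def fuse_color(c1, c2):
    """Third color information: fuse two RGB colors; a simple per-channel
    average is one possible fusion rule."""
    return tuple((a + b) // 2 for a, b in zip(c1, c2))
```

For example, two medium-pressure strokes at the same point yield one heavier stroke, and a red input fused with a blue input yields a purple one.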
In one possible implementation manner, when the first application does not support the multi-touch mode, the execution module is further configured to select one of the first operation and the first control information as a target event, and execute, in response to the target event, an operation instruction corresponding to the target event on the first application.
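A tiny sketch of this fallback: the patent only states that one of the two inputs is selected, so the earliest-timestamp policy used here is purely an assumed example policy.

```python
# Assumed selection policy when the application cannot accept multi-touch:
# the input with the earlier timestamp wins. Each op is (timestamp_ms, payload).

def select_target_event(local_op, remote_op):
    return min(local_op, remote_op, key=lambda op: op[0])[1]
```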
In one possible implementation, the detection module may also be configured to detect a third operation acting on the first window, and receive second control information from the second electronic device, where the second control information is generated by the second electronic device according to a fourth operation acting on a third window, and the third window includes an application interface of the second application projected by the first electronic device.
At this time, the processing module may be configured to perform preset processing on the third operation through a first program corresponding to the first application, to obtain a second event, where the preset processing is used to instruct the first electronic device to respond to the third operation; and carrying out preset processing on the second control information through a second program corresponding to the second application to obtain a third event, wherein the preset processing is used for indicating the first electronic equipment to respond to the second control information.
The execution module is configured to, in response to the second event and the third event, execute the operation instruction corresponding to the second event on the first application, and execute the operation instruction corresponding to the third event on the second application.
In one possible implementation manner, when the second application does not support the multi-touch mode, the execution module may be configured to respond to the second event and the second control information, execute an operation instruction corresponding to the second event on the first application, and execute an operation instruction corresponding to the second control information on the second application.
In one possible implementation, the first program corresponds to a first queue, and the processing module is configured to store the first operation and the first control information to the first queue; and carrying out preset processing on the first operation and the first control information in the first queue through a first program corresponding to the first application to obtain a first event.
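The queue-based variant above can be sketched as follows. This is a hedged illustration, not the patent's code: the class and method names are invented, and "preset processing" is reduced here to draining the queue and de-duplicating in arrival order.

```python
# Minimal sketch of a per-application program bound to its own queue:
# the first operation and the first control information are stored in the
# queue, then fused into a single event during preset processing.

from collections import deque


class AppEventProgram:
    def __init__(self):
        self.queue = deque()   # the "first queue" bound to this program

    def submit(self, op):
        # the first operation / first control information land here
        self.queue.append(op)

    def drain(self):
        """Preset processing: take everything queued in this cycle and fuse
        it into one event (here: an order-preserving de-duplicated list)."""
        seen, fused = set(), []
        while self.queue:
            op = self.queue.popleft()
            if op not in seen:
                seen.add(op)
                fused.append(op)
        return fused
```

Queuing decouples the two input sources from the application: both ends can submit concurrently, and the application only ever sees the fused result.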
In a fourth aspect, a multi-screen collaboration system is provided, where the system includes a first electronic device and a second electronic device as in the first aspect, where the first electronic device and the second electronic device may perform, through interaction, a multi-screen collaboration control method as in any one of the possible implementations of the first aspect.
In a fifth aspect, an electronic device is provided, where the electronic device may be the first electronic device in the foregoing first aspect, or may be the second electronic device in the foregoing first aspect. The electronic device includes a memory, a display screen, a communication module, and one or more processors, where the memory, the display screen, and the processor are coupled. The memory is configured to store computer program code, and the computer program code includes computer instructions. The processor is configured to execute the computer instructions stored in the memory when the electronic device runs, to cause the electronic device to perform the method in any one of the possible implementations of the first aspect.
In a sixth aspect, a multi-screen collaboration apparatus is provided, where the apparatus is included in an electronic device, and the apparatus has a function of implementing the behavior of the electronic device in any one of the foregoing first aspect and possible implementation manners of the first aspect. The functions can be realized by hardware, and can also be realized by executing corresponding software by hardware. The hardware or software includes one or more modules or units corresponding to the functions described above.
In a seventh aspect, a chip system is provided, the chip system being applied to an electronic device. The system-on-chip includes one or more interface circuits and one or more processors. The interface circuit and the processor are interconnected by a wire. The interface circuit is for receiving a signal from a memory of the electronic device and transmitting the signal to the processor, the signal including computer instructions stored in the memory. When the processor executes the computer instructions, the electronic device performs the multi-screen cooperative control method in any one of the possible implementation manners of the first aspect.
In an eighth aspect, there is provided a computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the multi-screen cooperative control method in any of the possible implementations of the first aspect.
In a ninth aspect, a computer program product is provided which, when run on an electronic device, causes the electronic device to perform the multi-screen cooperative control method in any of the possible implementations of the first aspect.
It can be understood that, for the beneficial effects achieved by the multi-screen cooperative control method according to the second aspect and any one of its possible implementations, the electronic device according to the third aspect, the multi-screen collaboration system according to the fourth aspect, the electronic device according to the fifth aspect, the apparatus according to the sixth aspect, the chip system according to the seventh aspect, the computer storage medium according to the eighth aspect, and the computer program product according to the ninth aspect, reference may be made to the beneficial effects in the first aspect and any one of its possible implementations, and details are not repeated herein.
Drawings
Fig. 1 is a schematic diagram of a multi-screen collaboration system according to an embodiment of the present application;
Fig. 2 is a first schematic diagram of an application scenario of a multi-screen cooperative control method according to an embodiment of the present application;
Fig. 3 is a second schematic diagram of an application scenario of a multi-screen cooperative control method according to an embodiment of the present application;
Fig. 4 is a schematic structural diagram of a first electronic device according to an embodiment of the present application;
Fig. 5 is a schematic structural diagram of a second electronic device according to an embodiment of the present application;
Fig. 6 is a schematic architecture diagram of an operating system in a first electronic device and a second electronic device according to an embodiment of the present application;
Fig. 7 is a first schematic diagram of a multi-screen collaboration process according to an embodiment of the present application;
Fig. 8 is a second schematic diagram of a multi-screen collaboration process according to an embodiment of the present application;
Fig. 9 is a schematic flowchart of a multi-screen cooperative control method according to an embodiment of the present application;
Fig. 10 is a schematic diagram of program creation in a multi-screen cooperative control method according to an embodiment of the present application;
Fig. 11 is a schematic diagram of program processing in a multi-screen cooperative control method according to an embodiment of the present application;
Fig. 12 is a schematic diagram of fusion processing in a multi-screen cooperative control method according to an embodiment of the present application.
Detailed Description
In the description of the present application, unless otherwise indicated, "/" indicates that the associated objects are in an "or" relationship; for example, A/B may represent A or B. The term "and/or" in the present application merely describes an association relationship between associated objects, and indicates that three relationships may exist; for example, A and/or B may represent the following three cases: A alone, both A and B, and B alone, where A and B may be singular or plural.
In the description of the present application, unless otherwise indicated, "a plurality of" means two or more. "At least one of the following items" or a similar expression means any combination of these items, including any combination of a single item or a plurality of items. For example, at least one of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be singular or plural.
In addition, to clearly describe the technical solutions of the embodiments of the present application, words such as "first" and "second" are used in the embodiments of the present application to distinguish between identical or similar items having substantially the same functions and effects. For example, "first electronic device" and "second electronic device" are merely used to indicate different electronic devices, "first window" and "second window" are merely used to indicate different display windows, and "first operation" and "second operation" are merely used to indicate operations at different times or for different purposes. A person skilled in the art may understand that the words "first", "second", and the like do not limit a quantity or an execution order, and do not indicate that the described objects are necessarily different.
In embodiments of the application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion that may be readily understood.
The particular features, structures, or characteristics of the application may be combined in any suitable manner in one or more embodiments. In various embodiments of the present application, the sequence number of each process does not mean the sequence of execution sequence, and the execution sequence of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Some optional features of the embodiments of the present application may be implemented independently without depending on other features in some scenarios, so as to solve corresponding technical problems, achieve corresponding effects, and may also be combined with other features according to requirements in some scenarios.
In the present application, the same or similar parts between the embodiments may be referred to each other unless specifically stated otherwise. In the various embodiments of the application, if there is no specific description or logical conflict, terms and/or descriptions between the various embodiments will be consistent and will reference each other. The embodiments of the present application do not limit the scope of the present application.
In addition, the service scenario described in the embodiment of the present application is for more clearly describing the technical solution of the embodiment of the present application, and does not constitute a limitation on the technical solution provided in the embodiment of the present application, and as a person of ordinary skill in the art can know that, with the evolution of the network architecture and the appearance of a new service scenario, the technical solution provided in the embodiment of the present application is applicable to similar technical problems.
With the development of terminal technology, various electronic devices have started to play an important role in people's lives, such as the mobile phone in a pocket and the PC (including a desktop computer or a notebook computer) on a desk, which provide people with more convenient, efficient, and intelligent services at all times. At present, to further integrate and improve the capabilities of these electronic devices, the video stream, audio stream, input stream, and the like of one electronic device can be migrated to another electronic device through the multi-screen collaboration technology, breaking the situation in which electronic devices work relatively independently.
However, in the multi-screen collaborative operation process, if a touch operation performed by a user at one end (such as the PC end) has not yet finished, a touch operation started by a user at the other end (such as the mobile phone end) will preempt the ongoing touch event. Consequently, users cannot truly perform simultaneous touch input at different ends, which fails to meet the needs of users who require simultaneous touch, for example, multiple users annotating the same document together, or multiple users operating the same game together.
To solve the above problems, an embodiment of the present application provides a multi-screen cooperative control method, which can create a corresponding program according to the application type involved in multi-screen collaboration, use the program to aggregate and fuse the inputs made to the application from multiple ends, and then inject the fused result into the application for processing. The multi-screen cooperative control method provided in the embodiments of the present application can implement simultaneous operation input from multiple ends, and solves the problem that only one end can respond to operation input in a multi-screen collaboration scenario.

It can be understood that, when multi-screen collaboration involves a plurality of applications, the multi-screen cooperative control method provided in the embodiments of the present application can create a corresponding program according to the application type of each application, use each application's own independent program to fuse the inputs made to that application from multiple ends, and then inject the fused result into the corresponding application for processing. Therefore, in a multi-application screen projection scenario of multi-screen collaboration, simultaneous multi-end touch input to each of the plurality of applications can be processed by a different created program, so that the simultaneous multi-end touch input to each application is fused without mutual interference. This allows multiple ends to simultaneously input operations to one application in a multi-application screen projection scenario, and solves the problem that only one end's operation input to one application can be responded to in such a scenario.
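The per-application isolation described above can be sketched in a few lines. This is only a conceptual illustration under assumed names (`InputDispatcher`, `create_program`, `route`): each projected application gets its own program, and inputs from any end are routed to the program of the application whose window they act on.

```python
# Hedged sketch of the multi-application case: one independent program per
# projected application, so inputs to different applications are fused
# separately and do not interfere with one another.

class InputDispatcher:
    def __init__(self):
        self.programs = {}   # app id -> pending inputs for that app's program

    def create_program(self, app_id):
        # called when an application's interface is projected to a window
        self.programs.setdefault(app_id, [])

    def route(self, app_id, op):
        # local operations and remote control information both land in the
        # program of the application their window belongs to
        self.programs[app_id].append(op)

    def process(self, app_id):
        # fuse and hand the result to the application; here "fusion" is
        # simply collecting this cycle's inputs
        ops, self.programs[app_id] = self.programs[app_id], []
        return ops
```

Because each application's inputs live in a separate program, simultaneous touches on two different projected applications never preempt each other.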
The multi-screen cooperative control method provided by the embodiment of the application is described below with reference to the accompanying drawings.
The multi-screen cooperative control method provided by the embodiment of the application can be applied to the multi-screen cooperative system 100 shown in fig. 1. As shown in fig. 1, a first electronic device 101 and a second electronic device 102 may be included in the multi-screen collaboration system 100.
The first electronic device 101 and the second electronic device 102 may be a mobile phone, a PC (including a desktop computer or a notebook computer), a tablet computer, a smart television, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a Personal Digital Assistant (PDA), a wearable electronic device, a vehicle-mounted device, a virtual reality device, and the like, which have a display function, and the specific types of the first electronic device and the second electronic device are not limited in the embodiments of the present application.
The first electronic device 101 and the second electronic device 102 may be the same type of electronic device or different types of electronic devices. For example, the first electronic device 101 and the second electronic device 102 may be mobile phones, or the first electronic device 101 and the second electronic device 102 may be notebook computers, or the first electronic device 101 may be a mobile phone or a tablet computer, the second electronic device 102 may be a notebook computer or a smart television, or the like.
In some embodiments, the first electronic device 101 may establish a communication connection with the second electronic device 102 to enable a multi-screen collaborative interaction function across devices, across systems, through the established communication connection.
In one scenario, a wireless communication connection may be established between the first electronic device 101 and the second electronic device 102 by means of "tap to connect" (bumping the devices together), "scan to connect" (e.g., scanning a two-dimensional code or bar code), "automatic discovery in proximity" (e.g., via Bluetooth or wireless fidelity (Wi-Fi)), and so on.
The first electronic device 101 and the second electronic device 102 may follow a wireless transmission protocol and transmit information through wireless connection transceivers. The wireless transmission protocol may include, but is not limited to, a Bluetooth (BT) transmission protocol, a Wi-Fi transmission protocol, or the like. For example, the Wi-Fi transmission protocol may be a Wi-Fi P2P transmission protocol. The wireless connection transceivers include, but are not limited to, Bluetooth transceivers, Wi-Fi transceivers, and the like. Information transmission between the first electronic device 101 and the second electronic device 102 is implemented through wireless pairing. In the embodiment of the present application, the information transmitted between the first electronic device 101 and the second electronic device 102 includes, but is not limited to, content data to be displayed (such as a standard video stream) and control instructions.
In one scenario, a wired communication connection may also be established between the first electronic device 101 and the second electronic device 102. For example, a wired communication connection is established between the first electronic device 101 and the second electronic device 102 through a video graphics array (VGA) interface, a digital visual interface (DVI), a high-definition multimedia interface (HDMI), a data transmission line, or the like. Information transmission between the first electronic device 101 and the second electronic device 102 is implemented through the established wired communication connection. The present application does not limit the specific communication connection manner between the first electronic device 101 and the second electronic device 102.
In some embodiments, after the first electronic device 101 establishes a communication connection with the second electronic device 102, the second electronic device 102 may display an interface of the first electronic device 101. For example, the second electronic device 102 displays an interface of a desktop application of the first electronic device 101 through one window, or the second electronic device 102 displays interfaces of a plurality of applications of the first electronic device 101 through a plurality of windows.
In one scenario, the first electronic device 101 may be a source device (or referred to as a source end), and the second electronic device 102 may be a target device (or referred to as a sink end) of the first electronic device 101. The first electronic device 101 may project content in the first electronic device 101 (such as a picture, a video, audio, a document, an application, or a task in an application) to the display screen of the second electronic device 102 for display through a screen projection protocol such as the Digital Living Network Alliance (DLNA) protocol, the Miracast protocol, or the Chromecast protocol, thereby implementing a multi-screen collaborative display function, so that the user can use the relevant functions provided by the first electronic device 101 on the second electronic device 102.
Optionally, there are one or more second electronic devices 102. The first electronic device 101 may project the content in the first electronic device 101 to the display screens of one or more second electronic devices 102 for display, thereby implementing a multi-terminal, multi-screen collaborative display function, so that different users can use the relevant functions provided by the first electronic device 101 on different second electronic devices 102.
In one scenario, the first electronic device 101, as the source device, may project the interface of the application it is currently displaying to the display screen of the second electronic device 102 for display, so that the second electronic device 102 can synchronously display, in the form of a window, the interface of the application currently displayed by the first electronic device 101. The window displayed by the second electronic device 102 may be referred to as a screen projection window or a collaborative window.
In one scenario, the first electronic device 101, as the source device, may also project the interfaces of other applications not displayed on the first electronic device 101 to the display screen of the second electronic device 102 for display, so that the second electronic device 102 can display, in the form of a plurality of windows, the interfaces of a plurality of applications projected by the first electronic device 101. That is, the number of screen projection windows or collaborative windows displayed on the second electronic device 102 may be one or more.
In one scenario, a change in the content currently displayed by the first electronic device 101 may be synchronously displayed on the second electronic device 102. Meanwhile, the user can also operate the content projected by the first electronic device 101 through the second electronic device 102, so that the first electronic device 101 responds to the operation of the second electronic device 102 to display the corresponding screen content.
For example, the user may operate on the projected content via hardware (e.g., keyboard, mouse, touch screen) of the second electronic device 102, etc. The second electronic device 102 may send the operation of the user to the first electronic device 101, trigger the first electronic device 101 to respond to the operation of the user, update the related content, and project the updated content to the display screen of the second electronic device 102 again for display.
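The reverse-control loop just described can be sketched conceptually. This is not the patent's actual protocol; the state dictionary and the `launch`/update actions are invented stand-ins for the control instructions and interface updates exchanged between the two devices.

```python
# Conceptual sketch of reverse control: the target device packages a user
# operation as control information, the source device responds to it,
# updates the related content, and re-projects the updated interface.

def handle_remote_operation(source_state, control_info):
    """source_state: dict mapping app id -> interface version.
    control_info: dict naming the target app and the requested action."""
    app = control_info["app"]
    if control_info["action"] == "launch":
        source_state[app] = 1        # app started, first interface version
    else:
        source_state[app] += 1       # interface updated by the operation
    # returning the state stands in for projecting the updated interface back
    return source_state
```

Each round trip leaves the source device's state changed, which is what the target device then displays after re-projection.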
The following will exemplarily describe a multi-screen collaboration process by taking the first electronic device 101 as a mobile phone and the second electronic device 102 as a PC.
In one scenario, when a user needs to project data such as an application or a document in the mobile phone (i.e., the first electronic device 101) to the PC for display, the near field communication (NFC) function of the mobile phone may be turned on, so that the mobile phone approaches or touches an electronic tag (or NFC chip) on the PC. In this way, when the mobile phone and the electronic tag (or NFC chip) are close to each other, the mobile phone can read the device information of the PC from the electronic tag (or NFC chip) by transmitting a near field signal. Furthermore, the mobile phone can establish a wireless communication connection with the PC according to the device information of the PC. The wireless communication connection may be a Bluetooth connection, a Wi-Fi connection, or the like, which is not limited in the embodiments of the present application.
Of course, besides establishing wireless communication connection with the PC by touching the electronic tag on the PC, those skilled in the art may also design other ways to establish communication connection between the mobile phone and the PC, which is not limited in the embodiment of the present application. For example, a user may connect a cell phone to a PC using a data line, thereby establishing a communication connection between the cell phone and the PC. For another example, the mobile phone may acquire device information of the PC by scanning a two-dimensional code or a bar code displayed on the PC, and establish a wireless communication connection with the PC.
After the communication connection for multi-screen collaboration is established between the mobile phone and the PC, in response to a user's operation of starting one or more applications on the mobile phone, the mobile phone may project the interfaces of the one or more applications to the second electronic device 102 in the form of free floating windows. The first electronic device 101 may render the application interfaces of the one or more free floating windows onto a virtual screen (virtual display). Then, the first electronic device 101 may encode the Surface corresponding to the virtual screen into a standard video stream and transmit it to the second electronic device 102, so as to complete the multi-screen collaborative display, on the second electronic device 102, of the one or more application interfaces on the first electronic device 101.
In one scenario, after the communication connection for multi-screen collaboration is established between the mobile phone and the PC, a screen projection window consistent with the current display interface of the mobile phone may be displayed on the PC. Optionally, as shown in fig. 2 (a), when the interface currently displayed by the mobile phone is the interface of the desktop application, the interface of the desktop application displayed on the mobile phone is synchronously displayed on the PC in the form of a free floating window 103. Then, the user can input operations on the free floating window 103 on the PC and/or on the interface on the mobile phone, so as to realize the cooperative work of the PC and the mobile phone.
Optionally, when the user clicks the icon of the chat application on the interface of the desktop application displayed on the mobile phone to start the chat application, in response to the start operation on the chat application, as shown in fig. 2 (b), the mobile phone may update its currently displayed interface to the interface of the chat application, and at the same time, the free floating window 103 on the PC may also synchronously update to display the interface of the chat application.
Optionally, the user's operation of starting the chat application on the mobile phone may also be performed on the PC. Illustratively, the user may click, by means of an external input device such as a mouse or by tapping the touch screen, the icon of the chat application in the free floating window 103 on the PC that displays the interface of the desktop application of the mobile phone, to start the chat application. In response to the click on the icon of the chat application, the PC may send a reverse control instruction to the mobile phone through the communication link between the PC and the mobile phone, where the reverse control instruction may be used to indicate starting of the chat application. After receiving the reverse control instruction, the mobile phone may determine, according to the reverse control instruction, that the interface content to be displayed is the interface of the chat application. The mobile phone may then update its display to the interface of the chat application, and instruct the PC to synchronously update the interface of the chat application displayed in the free floating window 103.
That is, the user can operate the interface of the desktop application on the mobile phone through any one of the mobile phone and the PC. It will be appreciated that when the mobile phone and the PC cooperatively display interfaces of other applications, the user may also operate the interfaces of other applications through any one of the devices in the mobile phone and the PC.
In one scenario, after a communication connection for multi-screen collaboration is established between the mobile phone and the PC, a plurality of screen-projection windows consistent with the interfaces of a plurality of applications started on the mobile phone may also be displayed on the PC. As shown in fig. 3 (a), while the interface of the desktop application currently displayed by the mobile phone is synchronously displayed on the PC in the form of a free floating window 103, the interface of the gallery application started on the mobile phone is also displayed in the form of a free floating window 104. The user can then input operations on the free floating window 103 or the free floating window 104 on the PC and/or on the interface displayed on the mobile phone, so as to realize cooperative work of the PC and the mobile phone.
Optionally, after the mobile phone and the PC establish a communication connection for multi-screen collaboration, the interface of the desktop application currently displayed on the mobile phone screen may be automatically displayed on the display screen of the PC in the form of a free floating window 103. The user can click the icon of the gallery application in the free floating window by using an external input device such as a mouse or by tapping the touch screen.
As one embodiment, in response to the click operation on the icon of the gallery application, the PC may directly pop up a free floating window 104 displaying the interface of the gallery application.
As another embodiment, in response to the click operation on the icon of the gallery application, the PC may also switch the interface of the desktop application in the free floating window 103 to the interface of the gallery application, and the user may separate the free floating window 104, on which the interface of the gallery application is displayed, from the free floating window 103 by inputting a window separation operation.
In one scenario, in response to the click operation on the icon of the gallery application, the free floating window 104 displaying the interface of the gallery application may be displayed in the free floating window 103 in the form of a tab. The user can separate an independent free floating window 104 from the free floating window 103 by inputting a window separation operation, such as a drag operation performed with an external input device such as a mouse, or a select-and-drag operation on the touch screen.
It can be understood that a plurality of application interfaces can be displayed in the free floating window 103 in the form of tabs. When the user clicks the tab of an application, the area of the free floating window 103 currently used for displaying the screen content of the mobile phone can display the application interface corresponding to that tab, and at the same time, the mobile phone can synchronously update its display to the application interface corresponding to that tab.
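The tabbed-window behavior and the window separation operation can be modeled with a small amount of state. The class and field names below (`TabbedWindow`, the shared `handset_state` dictionary) are illustrative assumptions rather than the embodiment's actual data structures:

```python
# Illustrative model of the tabbed collaboration window: several application
# interfaces share one window, selecting a tab updates both the window area
# and the handset display, and detaching a tab yields an independent window.

class TabbedWindow:
    def __init__(self, handset_state):
        self.tabs = []                       # application names, in open order
        self.active = None
        self.handset_state = handset_state   # state shared with the "phone"

    def open_tab(self, app):
        if app not in self.tabs:
            self.tabs.append(app)
        self.select_tab(app)

    def select_tab(self, app):
        self.active = app
        # Keep the handset's displayed interface in sync with the active tab.
        self.handset_state["interface"] = app

    def detach_tab(self, app):
        # Window separation: the tab becomes an independent free floating
        # window, and the main window falls back to the most recently
        # shown remaining tab (the "previous interface").
        self.tabs.remove(app)
        if self.active == app:
            self.select_tab(self.tabs[-1])
        return app                           # stands in for the new window

state = {"interface": "desktop"}
win = TabbedWindow(state)
win.open_tab("desktop")
win.open_tab("gallery")
detached = win.detach_tab("gallery")
print(win.active, state["interface"], detached)  # desktop desktop gallery
```

Note how detaching the gallery tab automatically restores the previously displayed desktop interface in both the window and the shared handset state, matching the fallback behavior described above.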
In one scenario, in response to the click operation on the icon of the gallery application, as shown in fig. 3 (b), the PC may switch the interface of the desktop application in the free floating window 103 to the interface of the gallery application, and at this time, the mobile phone may synchronously update its display to the interface of the gallery application. The free floating window 103 has a first control 105 ("+" button) displayed thereon for indicating window separation. The user can click the first control by using an external input device such as a mouse or by tapping the touch screen. In response to the click operation, the PC may separate the interface of the gallery application currently displayed in the free floating window 103 from the free floating window 103 in the form of an independent free floating window 104, as shown in fig. 3 (a), and at the same time, the PC may display the previous interface in the free floating window 103, at which point the mobile phone may synchronously update its display to the previous interface. The previous interface may refer to the interface displayed last, i.e., the interface of the desktop application.
It will be appreciated that the interface displayed last in the free floating window 103 may be an interface of another application; when the PC separates the interface currently displayed in the free floating window 103 from the free floating window 103 in the form of an independent free floating window, the PC may display in the free floating window 103 the interface of the other application that was displayed last.
In one scenario, the user may continue clicking on the chat application's icon in the free floating window 103 to launch the chat application in the manner described above, and in response to the clicking operation on the chat application, the PC may also pop up a new free floating window 106 with the chat application's interface displayed, as shown in fig. 3 (c).
Alternatively, in response to the click operation on the icon of the chat application, as shown in fig. 3 (d), the PC may also switch the interface of the desktop application in the free floating window 103 to the interface of the chat application, and the free floating window 103 has a first control 105 ("+" button) displayed thereon for indicating window separation. At this time, the mobile phone can synchronously update its display to the interface of the chat application. Thereafter, in response to the user's click operation on the first control 105, the PC may separate the interface of the chat application currently displayed in the free floating window 103 from the free floating window 103 in the form of an independent free floating window 106, as shown in fig. 3 (c); at the same time, the PC may display the previously displayed interface, that is, the interface of the desktop application, in the free floating window 103, and the mobile phone may synchronously update its display to the interface of the desktop application.
In one scenario, the user may also click the icon of the chat application directly on the interface of the desktop application displayed by the mobile phone to start the chat application. In response to the start operation of the chat application, the mobile phone may update its display to the interface of the chat application, and at the same time, the free floating window 103 on the PC may also synchronously update to display the interface of the chat application, with a first control 105 ("+" button) for indicating window separation displayed on the free floating window 103, as shown in fig. 3 (d). At this time, the user may input operations to the chat application in the free floating window 103 on the PC, input operations to the gallery application in the free floating window 104, or input operations to the chat application on the mobile phone.
When the user needs the PC to display screen-projection windows of more applications, the user can click the first control 105 of the free floating window 103 on the PC to separate the interface of the chat application currently displayed in the free floating window 103 from the free floating window 103 in the form of an independent free floating window 106, as shown in fig. 3 (c); at the same time, the PC can display the previously displayed interface, that is, the interface of the desktop application, in the free floating window 103, and the mobile phone can synchronously update its display to the interface of the desktop application. Thereafter, the user can continue to click the icon of a new application on the interface of the desktop application displayed by the mobile phone in the above manner, so that the new application is started and its interface is projected to the PC for display.
It should be noted that the possible manners listed above in which one or more screen-projection windows may be displayed by the PC are only for illustration, and the manner in which one or more screen-projection windows are displayed by the PC is not limited by the embodiments of the present application.
It can be appreciated that multi-screen collaboration can provide a convenient use experience for the user. For example, since the display screen of the PC is often larger than that of the mobile phone, the viewing experience of the user can be improved; or, by displaying a plurality of application interfaces of the mobile phone on the PC, the user can conveniently drag data between different applications, such as quickly dragging a picture in the gallery application to a chat interface in the chat application. As another example, the mouse of the PC may act as the user's finger, enabling more accurate touch operations on the interface of the chat application. As another example, the large physical keyboard of the PC can replace the small virtual input method window on the mobile phone display screen, so as to realize a better text input experience. As another example, the multi-channel stereo speakers of the PC may replace the speaker of the mobile phone to output audio from the mobile phone (such as a voice message from the interface of the chat application, a music file being played by the mobile phone, etc.), thereby enhancing volume and sound quality.
Referring to fig. 4, fig. 4 is a schematic diagram of the hardware structure of a first electronic device according to an embodiment of the present application, taking a mobile phone as an example.
As shown in fig. 4, the first electronic device may include a processor 410, an external memory interface 420, an internal memory 421, a universal serial bus (universal serial bus, USB) interface 430, a charge management module 440, a power management module 441, a battery 442, an antenna 1, an antenna 2, a mobile communication module 450, a wireless communication module 460, an audio module 470, a speaker 470A, a receiver 470B, a microphone 470C, an earphone interface 470D, a sensor module 480, keys 490, a motor 491, an indicator 492, a camera 493, a display screen 494, and a subscriber identity module (subscriber identification module, SIM) card interface 495, etc. Among other things, the sensor module 480 may include a pressure sensor 480A, a gyroscope sensor 480B, an air pressure sensor 480C, a magnetic sensor 480D, an acceleration sensor 480E, a distance sensor 480F, a proximity light sensor 480G, a fingerprint sensor 480H, a temperature sensor 480J, a touch sensor 480K, an ambient light sensor 480L, a bone conduction sensor 480M, and the like.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the first electronic device. In other embodiments, the first electronic device may include more or fewer components than shown, or may combine certain components, or may split certain components, or may have a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
In the embodiment of the present application, the processor 410 may control the mobile communication module 450 and the wireless communication module 460 to send the content to be screen-cast in the first electronic device, such as a picture, a video, an audio, a document, an application, or a task in an application, to the second electronic device. In one scenario, the processor 410 may obtain the interface currently displayed on the display screen 494, or obtain the interfaces of one or more applications started by the first electronic device, encode the display data, and then transmit the encoded data to the second electronic device through the mobile communication module 450 and the wireless communication module 460, where it is decoded and displayed on the display screen of the second electronic device, so as to realize collaborative display of the first electronic device and the second electronic device.
The display screen 494 is used to display a graphical user interface (GUI). For convenience of description, the graphical user interface is simply referred to as a user interface. The first electronic device presents or displays corresponding content, such as video, text, images, etc., to the user by displaying a user interface on the display screen 494. In an embodiment of the present application, the display screen 494 may be used to display an application interface, such as the interface of the desktop application, the interface of the gallery application, or the interface of another application.
The pressure sensor 480A is used for sensing a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, pressure sensor 480A may be disposed on display screen 494. The pressure sensor 480A is of various types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. When a touch operation is applied to the display screen 494, the first electronic device detects the touch operation intensity according to the pressure sensor 480A. The first electronic device may also calculate the location of the touch based on the detection signal of the pressure sensor 480A.
In an embodiment of the present application, pressure sensor 480A may be used to detect a user's manipulation of an application window.
The touch sensor 480K is also referred to as a "touch panel". The touch sensor 480K may be disposed on the display screen 494, and the touch sensor 480K and the display screen 494 form a touch screen, also called a "touch-controlled screen". The touch sensor 480K is used to detect touch operations (e.g., long press, slide up, slide left, single click, double click, etc.) acting on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. In an embodiment of the present application, the touch sensor 480K may be used to detect a touch operation acting on an application window. In the embodiment of the present application, the touch operation detected by the touch sensor 480K may be an operation performed on or near the touch screen by a finger, or an operation performed on or near the touch screen by the user using a stylus, a touch pen, a touch ball, or another touch auxiliary tool, which is not limited by the present application.
Referring to fig. 5, fig. 5 is a schematic diagram of a hardware structure of a second electronic device according to an embodiment of the present application, taking a PC as an example.
As shown in fig. 5, the second electronic device may include a processor 510, an external memory interface 520, an internal memory 521, a USB interface 530, a charge management module 540, a power management module 541, a battery 542, an antenna 3, an antenna 4, a mobile communication module 550, a wireless communication module 560, an audio module 570, a speaker 570A, a receiver 570B, a microphone 570C, an earphone interface 570D, a sensor module 580, keys 590, a motor 591, an indicator 592, a camera 593, a display screen 594, a SIM card interface 595, and the like. The sensor module 580 may include a pressure sensor 580A, a gyroscope sensor 580B, an air pressure sensor 580C, a magnetic sensor 580D, an acceleration sensor 580E, a distance sensor 580F, a proximity light sensor 580G, a fingerprint sensor 580H, a temperature sensor 580J, a touch sensor 580K, an ambient light sensor 580L, a bone conduction sensor 580M, and the like.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the second electronic device. In other embodiments, the second electronic device may include more or fewer components than shown, or may combine certain components, or may split certain components, or may have a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
In an embodiment of the present application, the processor 510 may control the mobile communication module 550 and the wireless communication module 560 to receive the content to be screened from the first electronic device, and then decode the content to be screened and display the content on the display screen 594.
The USB interface 530 may be used to transfer data between the second electronic device and the input/output device. Wherein the input/output devices are such as a wired mouse, a wired keyboard, etc.
In the embodiment of the present application, the processor 510 may obtain an input operation (such as a clicking operation) of the user on the application window through the wired mouse through the USB interface 530, or obtain an input operation of the user on the application window through the wired keyboard.
In some embodiments, the second electronic device may interact with an input device through the wireless communication module 560 based on wireless communication technology, such as receiving an input operation triggered by the user through the input device. Illustratively, the input device may be a wireless mouse, a wireless keyboard, or the like, and the user input operation may be, for example, an operation in which the user clicks a control (button).
The display screen 594 is used to display a user interface to present or display corresponding content, such as video, text, images, etc., to the user. The display screen 594 includes a display panel. The display panel may employ an LCD, OLED, AMOLED, FLED, Mini-LED, Micro-LED, Micro-OLED, QLED, or the like.
In the embodiment of the present application, the display screen 594 may be used to display the content to be screen-cast sent by the first electronic device. Optionally, the display screen 594 is used to display the interface of the application currently displayed by the first electronic device in the form of one window, or to display the interfaces of a plurality of applications started on the first electronic device in the form of a plurality of independent windows. In addition, the display screen 594 may also be used to display interfaces of applications that are not currently displayed by the first electronic device.
In embodiments of the present application, key 590 may comprise a keyboard, which may comprise a physical keyboard, a touch keyboard, or the like. The second electronic device may receive input from a keyboard, generating key signal inputs related to user settings and function control of the second electronic device.
For example, the software systems of the first electronic device and the second electronic device provided by the embodiments of the present application may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. For example, the software system may include, but is not limited to, an operating system such as Symbian, Android, Apple iOS, BlackBerry, or HarmonyOS (Harmony); the present application is not limited in this respect.
The software architecture of the first electronic device and the second electronic device provided by the application will be described below by taking the first electronic device as a mobile phone and the second electronic device as a PC as an example.
As an example, fig. 6 is a software architecture block diagram of an example of a mobile phone and a PC according to an embodiment of the present application. The software structure block diagram shows the software structure of the Android system with the layered architecture of the mobile phone and the Windows system of the PC, and can support multi-screen cooperative operation between the mobile phone and the PC.
For the software structure of the mobile phone, the layered structure divides the software into a plurality of layers, and each layer has clear roles and division. The layers communicate with each other through a software interface. In some embodiments, the Android system includes, from top to bottom, an Application (APP) layer, an application framework (frame) layer, and a kernel layer (also referred to as a driver layer), respectively.
The application layer may include a series of application packages (application package, APK). For example, the application packages may include collaborative applications, memos, music, video, gallery, bluetooth, short messages, desktop (desktop) applications, and the like. For convenience of description, an application program will be hereinafter simply referred to as an application. The application on the mobile phone may be a native application (for example, an application installed in the mobile phone when the operating system is installed before the mobile phone leaves the factory), or may be a third party application (for example, an application installed by a user through an application store), which is not limited in the embodiment of the present application.
A collaborative application is illustrated in fig. 6. The collaborative application may be used to implement multi-screen collaboration between multiple electronic devices. In one scenario, when the mobile phone and the PC perform multi-screen collaboration, the collaborative application can be used to realize information interaction between the mobile phone and the PC, window layout display, monitoring of user operations sent by the PC in a certain collaboration window, and the like.
In some embodiments, the collaborative application may be an application in a cell phone, which may be a multi-screen collaborative assistant. The collaborative application may also be program code preset for the mobile phone, and there may be no corresponding desktop application icon. The collaborative application may also be a service provided in an application program in the handset, which may be a handset manager, a computer assistant, or other application program, and embodiments of the present application are not limited.
In one scenario, the collaborative application includes a window state monitoring module, which may be configured to monitor an operation state of an application on a mobile phone, or monitor an operation state change of the application in a certain collaborative window sent by a PC, so as to keep synchronization between interface content of the application on the mobile phone and interface content of the collaborative window corresponding to the application on the PC.
Optionally, the window state monitoring module may be further configured to monitor an input event of a user on the mobile phone, or monitor an input event of a user in a certain collaboration window sent by the PC, so as to respond to a user operation input on the mobile phone and/or a user operation input on the PC, and keep the interface content of the corresponding collaboration window on the PC synchronous with the interface content after the response on the mobile phone.
In one scenario, the collaborative application further includes a data input processing module that may be configured to process an input event reported by the framework layer in response to the input event. For example, in response to the input event, an Activity window corresponding to the input event is started, where an Activity window can be understood as an application window.
In some embodiments, the collaborative application further includes a video codec module that may perform encoding, decoding, etc. of video. In the embodiment of the application, information between the mobile phone and the PC can be exchanged in the form of a video stream. For example, the video codec module of the mobile phone encodes the display data of the interface of the mobile phone's desktop application to be screen-cast into a video stream and sends the video stream to the PC; the video codec module of the PC then decodes the video stream to obtain the display data of the interface of the desktop application, and further displays the interface of the desktop application in a multi-screen collaboration window of the PC according to that display data.
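The encode, transmit, decode round trip described above can be sketched as follows. Here `zlib` stands in for the real video codec (the embodiment does not specify one), and the function names are illustrative assumptions:

```python
# Simplified stand-in for the video codec module: the handset compresses the
# interface display data into a "stream", and the PC recovers the display
# data for drawing in the collaboration window. zlib replaces a real codec.
import zlib

def encode_display_data(display_data: bytes) -> bytes:
    # Handset side: encode the raw interface display data into a stream.
    return zlib.compress(display_data)

def decode_display_data(stream: bytes) -> bytes:
    # PC side: decode the stream back into display data for the window.
    return zlib.decompress(stream)

frame = b"pixels of the desktop application interface"
stream = encode_display_data(frame)
assert decode_display_data(stream) == frame  # lossless round trip
```

The essential property illustrated is that the PC reconstructs exactly the display data the handset emitted; a real implementation would use a lossy video codec and a screen-cast transport instead of in-process byte strings.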
The application framework layer may provide an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions. As shown in FIG. 6, the application framework layer may include a multi-screen framework that may be used to manage multi-screen collaboration related transactions for control and implementation of inter-device multi-screen collaboration functions.
In the embodiment of the application, the multi-screen framework can comprise a program input processing module, which can be used to independently start a program for each application involved in multi-screen collaboration on the mobile phone. The program may be a service, or a process or thread with lifecycle management. The program started for each application may be configured to fuse the input events of that application coming from multiple ends (e.g., the mobile phone end and the PC end), and transfer the fused input events to the application layer.
In one scenario, each application-initiated program may pass the fused input event to the data input processing module of the application layer, which responds to the fused input event.
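One plausible way to "fuse" input events arriving from multiple ends is to merge the per-end streams into a single time-ordered stream before delivery to the application. The event dictionary shape and timestamps below are assumptions for illustration:

```python
# Merge already time-ordered input event streams from several ends
# (e.g. the handset touch screen and the PC mouse) into one ordered stream,
# as a sketch of what the per-application program might do before dispatch.
import heapq

def fuse_input_events(*streams):
    """Each stream must already be sorted by its 'ts' timestamp."""
    return list(heapq.merge(*streams, key=lambda e: e["ts"]))

phone_events = [{"ts": 1, "src": "phone", "action": "down"},
                {"ts": 4, "src": "phone", "action": "up"}]
pc_events    = [{"ts": 2, "src": "pc", "action": "down"},
                {"ts": 3, "src": "pc", "action": "up"}]

fused = fuse_input_events(phone_events, pc_events)
print([e["ts"] for e in fused])  # [1, 2, 3, 4]
```

A real implementation would also have to handle clock skew between the two ends and conflicting gestures; the merge step only establishes a single delivery order.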
In the embodiment of the application, the multi-screen framework can also comprise an application type identification module used to identify the type of each application involved in multi-screen collaboration on the mobile phone. Optionally, the application type identification module is configured to identify whether each application involved in the multi-screen collaboration supports a multi-touch mode. The multi-touch mode may refer to a mode that allows multiple input events to occur simultaneously, i.e., scenarios in which a plurality of users operate at the same time, such as a plurality of users cooperatively playing a game, supplementing file materials for each other, or annotating a document.
In the embodiment of the application, the multi-screen framework can also comprise a multi-display management module for managing mirror display of a plurality of displays of various display types, including the local display type of the mobile phone, the HDMI display type, the display type supporting the Wi-Fi Display protocol, and the like, so that the logical displays of the multi-screen framework are controlled according to the currently connected external display device. That is, the multi-display management module may create and manage one or more logical displays (which may be referred to hereinafter as virtual screens, or display modules). One display module can store the display data of an application's Activity window on the mobile phone.
In some embodiments, the multi-display management module may obtain the size of a window to be displayed on the mobile phone, obtain the interface content or display data of the window to be displayed, and so on. It should be understood that the to-be-displayed window of the mobile phone may include an Activity window being displayed on the screen of the mobile phone, and may also include an Activity window of one or more applications running in the background of the mobile phone.
In some embodiments, the multi-screen framework may further include an application launch management module for managing transactions related to application launches, for example, determining which applications the mobile phone has currently started, the mode in which an application window is displayed when started (such as a floating window or full screen), the stack of the started window, the width and height information in the application start config, and the like.
In some embodiments, the application framework layer may also include an input event management server (IMS), content provider, view system, phone manager, resource manager, notification manager, etc. (not shown in fig. 6).
The input event management server (input manager service, IMS) may be configured to translate, package, etc. the original input event, to obtain an input event containing more information.
In the embodiment of the application, the IMS can call the application type identification module of the multi-screen framework to determine the application type of the application targeted by the input event. When it is determined that the application supports the multi-touch mode, the IMS can send the input event to the program input processing module of the multi-screen framework, which specially processes the input event for that application and distributes the processed input event to the corresponding application. Optionally, when it is determined that the application does not support the multi-touch mode, the IMS may directly distribute the input event to the corresponding application.
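The dispatch decision described above reduces to a simple branch on the identified application type. The registry contents and function names below are illustrative assumptions:

```python
# Sketch of the IMS dispatch decision: events targeting an application that
# supports the multi-touch mode are routed through the per-application
# program of the multi-screen framework; others are delivered directly.

# Hypothetical result of the application type identification module.
MULTI_TOUCH_APPS = {"game": True, "document": True, "gallery": False}

def dispatch_input_event(event, deliver_direct, deliver_via_program):
    app = event["app"]
    if MULTI_TOUCH_APPS.get(app, False):
        # Needs fusion of concurrent inputs from several ends first.
        return deliver_via_program(event)
    return deliver_direct(event)

route = dispatch_input_event({"app": "game"},
                             deliver_direct=lambda e: "direct",
                             deliver_via_program=lambda e: "program")
print(route)  # program
```

Defaulting unknown applications to the direct path mirrors the optional behavior above, where only applications positively identified as multi-touch capable take the special-processing route.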
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications.
The kernel layer is a layer between hardware and software that can provide WLAN, bluetooth capabilities, and basic communication protocols. In the embodiment of the application, the kernel layer can comprise a bottom layer capability module for multi-screen transmission, and the bottom layer capability module is used for providing the transmission capability of reverse control and the transmission capability of video stream in the multi-screen cooperation process. As shown in fig. 6, the multi-screen transmission underlying capability module may include an input transmission module and a video streaming transmission module.
The input transmission module is used to transmit input events in the multi-screen collaboration process. In the embodiment of the application, when the mobile phone and the PC perform multi-screen collaboration, the input transmission module can be used to receive, during the multi-screen collaboration process, a user input event in a certain collaboration window sent by the PC.
The video stream transmission module is used to transmit the content to be screen-cast of the mobile phone in the multi-screen collaboration process. In the embodiment of the application, after the video codec module in the collaborative application encodes the display data of the content to be screen-cast of the mobile phone into a video stream, the video stream transmission module in the underlying capability module can be called to transmit the video stream to the PC through a screen-cast protocol. The screen-cast protocol may include, but is not limited to: the AirPlay technology, the DLNA protocol, the Miracast protocol, the Chromecast protocol, and the Cast+ protocol.
In some embodiments, the kernel layer may also contain display drivers, input/output device drivers (e.g., keyboard, touch screen, headphones, speakers, microphones, etc.), audio drivers, etc. (not shown in FIG. 6).
It should be noted that the software architecture of the mobile phone shown in fig. 6 is only one implementation manner of the embodiment of the present application, and in practical application, the mobile phone may further include more or fewer software modules, which is not limited herein. For example, the software architecture of the handset may also include a system library. The system library may include a plurality of functional modules. For example: layer integrator (Surface Flinger), surface Manager (Surface Manager), media library (Media Libraries), three-dimensional graphics processing library (e.g., openGL ES), 2D graphics engine (e.g., SGL), etc.
For the software architecture of the PC, a multi-layer architecture may be included, such as a PC software layer, a kernel layer, an operating system (OS) layer, and the like. The PC software layer may run executable files (exe). For example, in the present application, the PC may be preset with an exe program for implementing multi-screen collaboration, which may be embodied on the PC in the form of an App, such as an EMUI desktop or a PC manager. Alternatively, the PC implements the function of displaying windows of the mobile phone by running the multi-screen collaboration exe program. In the present application, the exe program provided on the PC for implementing the function of displaying windows of the mobile phone is referred to as the collaborative application.
The collaborative application on the PC may include an input processing module, configured to receive an input event from a user on the PC, and send the input event to the mobile phone for processing.
Since the coordinate information of an original input event received by the PC is relative to the display screen of the PC, the input processing module may be further configured to: when determining that the original input event acts on the mobile phone's display content shown on the PC's display screen, send the original input event and its coordinate information to the input transmission module (in the underlying capability module) to be preprocessed and transmitted to the mobile phone; and when determining that the original input event acts on the PC's own display content, send the original input event and its coordinate information to the processor of the PC to process the input event.
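The routing decision described above can be sketched, purely illustratively, in a few lines of Python. The function name `route_input_event` and the window/event field names are hypothetical; they are not part of the embodiment, and the sketch only shows the hit-test plus the translation from PC-screen coordinates into coordinates local to the collaboration window:

```python
def route_input_event(event, collab_windows):
    """Decide whether an original input event is forwarded to the phone
    or handled locally on the PC.

    event: dict with PC-screen coordinates 'x', 'y'.
    collab_windows: list of dicts with 'left', 'top', 'width', 'height'
    describing the collaboration windows mirrored from the phone."""
    for win in collab_windows:
        inside_x = win["left"] <= event["x"] < win["left"] + win["width"]
        inside_y = win["top"] <= event["y"] < win["top"] + win["height"]
        if inside_x and inside_y:
            # Translate PC-screen coordinates into window-local coordinates
            # before handing the event to the input transmission module.
            local = dict(event, x=event["x"] - win["left"],
                         y=event["y"] - win["top"])
            return ("forward_to_phone", local)
    # The event acts on the PC's own content: process it on the PC.
    return ("handle_on_pc", event)
```

A click at (150, 200) inside a collaboration window anchored at (100, 100) would thus be forwarded with local coordinates (50, 100), while a click outside every collaboration window stays on the PC.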
The collaborative application on the PC may further include a window status synchronization module configured to determine an operating status of an application in a collaborative window on a display of the PC, so as to keep interface content of the collaborative window on the PC synchronized with interface content of a corresponding application on the mobile phone.
The collaborative application on the PC may further include a multi-window layout management module, configured to receive display data of an application window sent by the mobile phone, and display one or more collaborative windows according to the display data of the application window, where the display data of the application window may include coordinate information of the application window on the mobile phone, content information of a display interface of the application window on the mobile phone, and so on. The multi-window layout management module can determine to display one or more collaborative windows consistent with the application window on the mobile phone on the display screen of the PC according to the coordinate information of the application window on the mobile phone.
In some embodiments, the collaborative application further includes a video codec module that can encode and decode video. In the embodiment of the application, the video codec module of the PC can decode the video stream of the to-be-cast content sent by the mobile phone to obtain the display data of the mobile phone's to-be-cast content, which is then displayed on the PC's display screen in the form of a window.
In some embodiments, the software architecture of the PC may also include an underlying capability module that provides the transmission capability of the reverse control and the transmission capability of the video stream in a multi-screen collaboration process. As shown in fig. 6, the PC may include an underlying capability module of a multi-screen transmission, which may include an input transmission module and a video streaming transmission module.
The input transmission module is used to transmit input events during multi-screen collaboration. In the embodiment of the application, when the mobile phone and the PC are in multi-screen collaboration, the input transmission module can preprocess an original input event received by the PC into an input event that the mobile phone can recognize, and then transmit the processed input event to the mobile phone end for processing through the screen-casting protocol.
In some embodiments, when the original input event received by the PC carries the operation information such as pressure intensity and color, the input transmission module may also transmit the operation information such as pressure intensity and color and the processed input event to the mobile phone end for processing through the screen projection protocol.
The video stream transmission module can be used for receiving the video stream of the content to be screened sent by the mobile phone, sending the video stream to the PC software layer, and decoding the video stream by the video encoding and decoding module of the PC software layer.
It should be noted that the software architecture of the PC shown in fig. 6 is only one implementation manner of the embodiment of the present application, and in practical applications, the PC may further include more or fewer software modules, which is not limited herein.
Based on the software modules, the flow of the collaboration between the mobile phone and the PC multi-screen provided by the embodiment of the application can be seen in fig. 7.
When the user performs input operation on the mobile phone, for example, the user clicks an icon of an application on the mobile phone to open the application, or the user instructs to open an interface through voice, the kernel layer can generate a corresponding input event according to the input operation, and report the event to the IMS of the application framework layer. The IMS distributes incoming events to the corresponding applications. The application calls the window management module to start the Activity corresponding to the input event and start the drawing flow of the window corresponding to the Activity.
In one scenario, the drawing process of the window corresponding to the Activity may be as follows: the Activity creates a window and loads a main layout container (DecorView) into the window to host the various components the application may display, where the components are usually organized as a containment relationship between control containers (ViewGroup) and controls (View). The three phases of measure, layout, and draw are then called in sequence to calculate the width and height of each View and draw the View at a specific position within its ViewGroup on the screen. The screen here refers to a canvas (Surface); SurfaceFlinger composites the final picture, i.e., the interface of the application, and presents the interface of the application on the corresponding local display device (LocalDeviceDisplay). LocalDeviceDisplay can be understood as a locally existing physical display device, referred to here as the display screen of the mobile phone.
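The three-phase pipeline above can be simulated with a small, self-contained Python sketch (not Android source code; the class and method names merely mirror the concepts of View, ViewGroup, measure, layout, and draw). The "surface" is stood in for by a plain list collecting what each View would paint:

```python
class View:
    """A leaf control with a fixed requested size."""
    def __init__(self, name, width, height):
        self.name, self.width, self.height = name, width, height
        self.pos = None

    def measure(self):
        return self.width, self.height

    def layout(self, x, y):
        self.pos = (x, y)           # position assigned by the parent

    def draw(self, surface):
        surface.append((self.name, self.pos, (self.width, self.height)))

class ViewGroup(View):
    """A control container that stacks its children vertically."""
    def __init__(self, name):
        super().__init__(name, 0, 0)
        self.children = []

    def measure(self):
        # As wide as the widest child, as tall as the whole stack.
        self.width = max((c.measure()[0] for c in self.children), default=0)
        self.height = sum(c.measure()[1] for c in self.children)
        return self.width, self.height

    def layout(self, x, y):
        self.pos = (x, y)
        cy = y
        for c in self.children:
            c.layout(x, cy)
            cy += c.height

    def draw(self, surface):
        for c in self.children:
            c.draw(surface)

# Measure -> layout -> draw, then the "surface" holds the final picture.
root = ViewGroup("DecorView")
root.children = [View("title", 100, 20), View("body", 100, 80)]
root.measure()
root.layout(0, 0)
surface = []                        # stands in for the canvas Surface
root.draw(surface)
```

In the real framework the drawn Surface is then composited by SurfaceFlinger and presented on LocalDeviceDisplay, as the paragraph above describes.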
After the mobile phone and the PC cooperate successfully in multi-screen mode, the mobile phone can copy the application interface rendered on LocalDeviceDisplay to the virtual screen (VirtualDisplay). The mobile phone then encodes the Surface corresponding to the application interface rendered on the virtual screen into a video stream and transmits the video stream to the PC through the screen-casting technology. After the PC decodes the video stream, a mobile phone window consistent with the application interface on the mobile phone can be displayed on the PC's display screen.
In some scenarios, when the user starts multiple applications on the mobile phone, the mobile phone may also copy the application interface renderings of the multiple applications onto the virtual screen (VirtualDisplay). The application interfaces of these applications may include the interface of an application currently shown on the mobile phone's display screen, as well as interfaces of applications not shown on the display screen, i.e., interfaces of applications running in the background of the mobile phone. The mobile phone then encodes the Surfaces corresponding to the application interfaces of the multiple applications rendered on the virtual screen into video streams and transmits them to the PC through the screen-casting technology. After the PC decodes the video streams of the multiple applications, multiple screen-casting windows consistent with the interfaces of those applications on the mobile phone can be displayed on the PC's display screen. As shown in fig. 3 (d), the mobile phone displays the interface of a chat application, while the PC display shows two screen-casting windows: one displays the interface of the mobile phone's chat application, and the other displays the interface of the mobile phone's gallery application.
After one or more screen-casting windows are displayed on the PC, the user may perform input operations on both the mobile phone and the PC. For example, the user can perform an input operation on the application window currently displayed on the mobile phone's display screen, or on a screen-casting window on the PC.
It should be understood that when an input operation targets an application, for example, when an input operation on a screen-casting window of the PC is transferred to the application framework layer of the mobile phone through the screen-casting technology, or when an input operation on the mobile phone is transferred to the application framework layer through the IMS, the drawing process of that application's window is re-triggered. That is, the ViewGroup and its subordinate Views re-run the three phases of measure, layout, and draw; the drawing result is transferred to the Surface; and the final picture composited by SurfaceFlinger is presented on LocalDeviceDisplay or on the virtual screen, i.e., displayed on the mobile phone's display screen or in the screen-casting window of the PC.
In some scenarios, as shown in fig. 8, when one user or multiple users perform input operations on a mobile phone and a PC at the same time, an input event of the mobile phone is transferred to an application framework layer of the mobile phone through IMS, and an input event of the PC is also transferred to the application framework layer of the mobile phone through a bottom capability module of multi-screen transmission. When the input events of the mobile phone end and the PC end are input to different applications at the same time, the two input events are respectively transmitted to the corresponding applications through the application framework layer for processing.
For example, the application targeted by the input event at the mobile phone end composites and renders the content corresponding to that input event to LocalDeviceDisplay, copies it to the corresponding virtual screen, and transmits it to the PC through the screen-casting technology, so that the display content of the mobile phone's display screen is updated and the collaborative window of that application on the PC is updated synchronously. The application targeted by the input event at the PC end renders the content corresponding to that input event to the corresponding virtual screen and transmits it to the PC through the screen-casting technology, so that the mobile phone updates the application's content in the background and the collaborative window of that application on the PC is updated synchronously.
However, when input events from the mobile phone end and the PC end are input to the same application at the same time, the two input events preempt each other; for example, the later input event preempts the earlier one, and finally only the input event of one end is delivered to the application. The application then composites and renders the content corresponding to that end's input event to LocalDeviceDisplay, copies it to the corresponding virtual screen, and transmits it to the PC through the screen-casting technology, while the other input event is interrupted and receives no response. That is, the application can only receive the input event of one end for processing; simultaneous input from different ends to the same application cannot be achieved, so multi-end collaborative operation is possible only if one end stops operating and waits.
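The preemption problem described in this paragraph can be illustrated with a minimal, purely hypothetical Python sketch (the names `PreemptiveChannel`, `submit`, and `deliver` are illustrative, not part of the embodiment): a single input channel per application holds only one pending event, so a later event displaces an earlier one before delivery:

```python
class PreemptiveChannel:
    """Single-slot input channel: the later event preempts the earlier one."""
    def __init__(self):
        self.pending = None

    def submit(self, event):
        self.pending = event        # overwrites whatever was waiting

    def deliver(self):
        event, self.pending = self.pending, None
        return event

ch = PreemptiveChannel()
ch.submit({"source": "phone", "type": "slide"})  # phone input arrives first
ch.submit({"source": "pc", "type": "click"})     # PC input arrives before delivery
delivered = ch.deliver()                         # only the PC event survives
```

The phone-side slide never reaches the application; this is exactly the interrupted, unresponded input described above.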
To solve the above problem, an embodiment of the present application provides a multi-screen collaboration control method. After a first electronic device (such as a mobile phone) and a second electronic device (such as a PC) establish multi-screen collaboration successfully, the application framework layer of the first electronic device creates a program for each application in response to a user's operation of starting one or more applications on the first electronic device. The program can fuse input events that act on its corresponding application from both the first electronic device and the second electronic device at the same time, and then deliver the fused input events to the application. In this way, the first electronic device and the second electronic device can both provide input to the same application at the same time, meeting the user's need for simultaneous multi-end touch control.
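The fusion behaviour of the per-application program can be sketched as follows. This is an illustrative Python simulation under assumed names (`FusionService`, `submit`, `drain`); the key point is that events from both ends are queued and ordered rather than preempting one another:

```python
import heapq

class FusionService:
    """Per-application program that fuses input events from both devices."""
    def __init__(self, app_name):
        self.app_name = app_name
        self._queue = []          # min-heap of (timestamp, seq, event)
        self._seq = 0             # tie-breaker so dicts are never compared

    def submit(self, event):
        # Events from the phone (via IMS) and from the PC (via the input
        # transmission module) both land here instead of racing for a
        # single channel.
        heapq.heappush(self._queue, (event["time"], self._seq, event))
        self._seq += 1

    def drain(self):
        """Deliver all fused events to the application in time order."""
        merged = []
        while self._queue:
            merged.append(heapq.heappop(self._queue)[2])
        return merged

svc = FusionService("game")
svc.submit({"source": "pc", "time": 2, "type": "click"})
svc.submit({"source": "phone", "time": 1, "type": "slide"})
events = svc.drain()   # both events survive, ordered by timestamp
```

Unlike the preemptive path, neither end's input is dropped: the application receives both events, with the phone's earlier slide delivered first.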
The multi-screen collaboration control method provided by the embodiment of the application can be applied to scenarios that require simultaneous multi-end operation under multi-screen collaboration. For example, friends cooperating to play a game, colleagues supplementing each other's documents, or a teacher and students annotating an article together; the method is applicable to work, education, entertainment, and other scenarios, allowing a mobile phone to collaborate with various devices as desired.
The multi-screen cooperative control method provided by the embodiment of the application is described in detail below with reference to the accompanying drawings.
After the first electronic device and the second electronic device establish a communication connection for multi-screen collaboration, when the first electronic device starts one or more applications, the first electronic device can project an application interface generated when the one or more applications are run to the second electronic device for display, so that multi-screen collaborative display of the interface of the one or more applications on the first electronic device on the second electronic device is completed.
In the embodiment of the application, the application program framework layer of the first electronic device can create a corresponding program for each projected application. All operations performed on an application, such as the operation of a user on the application at a first electronic device and the operation of a user on the application at a second electronic device, enter a program of the application, and are subjected to centralized processing.
The process of multi-screen cooperative control according to the embodiment of the present application will be described below by taking a first electronic device to start a first application and a second application as an example.
Fig. 9 is a schematic flow chart of a multi-screen cooperative control method between a first electronic device and a second electronic device according to an embodiment of the present application, as shown in fig. 9. The method may include:
s900, the first electronic device displays a first window, wherein the first window comprises an application interface of a first application.
The first application is an application running in the foreground of the first electronic device, which may be a desktop application, or may be any application on the first electronic device, such as a gallery application, and the embodiment of the present application does not limit the type of the first application.
Optionally, when the first electronic device starts the first application, the first electronic device runs the first application in the foreground. At this time, the first electronic device may display an application window of the first application, that is, a first window, on the display screen according to display data of the application interface generated when the first application is running. The first window includes an application interface of the first application.
Optionally, the first electronic device initiates the other application before the first electronic device initiates the first application. The first electronic device may run the other application in the background when the other application is not closed or is not exited. At this time, the first electronic device may cache display data of the application interface generated when running other applications.
In one scenario, a first electronic device initiates a second application before the first electronic device initiates the first application. When the first electronic device runs the first application in the foreground, the first electronic device may run the second application in the background.
In one scenario, when a user desires multi-screen collaboration between a first electronic device and a second electronic device, the user may, for example, click an option on the second electronic device to enable the multi-screen collaboration function. In response to the user's operation, the second electronic device scans for nearby electronic devices (such as the first electronic device) capable of establishing a multi-screen collaborative connection and initiates a multi-screen collaborative connection request to the scanned devices. If the discovered electronic device is not the first electronic device with which the user wants to perform multi-screen collaboration, the user can click a "scan code connection" option in a prompt box on the second electronic device, so that the second electronic device initiates the multi-screen collaborative connection request to the designated first electronic device by scanning a code. After receiving the multi-screen collaboration request sent by the second electronic device, the first electronic device displays a prompt box on its display interface. Illustratively, the prompt box may include, but is not limited to: an option identifying the device with which to establish the multi-screen collaborative connection, a "cancel" option, and a "connect" option. When the user clicks the "connect" option, the view system of the first electronic device establishes the multi-screen collaborative connection with the second electronic device in response to the user's operation.
In the embodiment of the application, after the first electronic device and the second electronic device establish multi-screen cooperative connection, the first electronic device can create a virtual screen (virtual display) and render the application interface of the first application running on the front stage of the first electronic device on the virtual screen, so that the first electronic device can project the application interface of the first application rendered on the virtual screen to the second electronic device, and the second electronic device can display the application interface of the first application.
When the first electronic device needs to project a plurality of applications onto the second electronic device, the first electronic device can create a corresponding virtual screen according to each application to be projected. The first electronic device can render the drawn application interface according to each virtual screen and project the application interface onto the second electronic device, so that the second electronic device correspondingly displays the application interface of each application.
Alternatively, the first electronic device may create the first virtual screen display1 through the multi-display management module. The display1 corresponds to the display screen of the first electronic device, that is, the interface displayed on the display screen of the first electronic device is consistent with the display interface drawn in display1. The first electronic device may render the application interface of the first application running in the foreground to display1. At this point the first application is bound to display1. That is, when the content of the application interface of the first application is updated, the first electronic device synchronously re-renders and draws the updated application interface of the first application to display1.
Alternatively, the first electronic device may create the second virtual screen display2 through the multi-display management module. The first electronic device may render the application interface of the second application running in the background to display2. The second application is now bound to display2. That is, when the content of the application interface of the second application is updated, the first electronic device synchronously re-renders and draws the updated application interface of the second application to display2.
It may be appreciated that each of the one or more applications started by the first electronic device may be bound to its own virtual screen, so that the first electronic device can transmit the rendered display data on each virtual screen to the second electronic device through a screen-casting protocol, such as the Cast+ protocol. In this way, the application interfaces on the virtual screens of the first electronic device are cast to the second electronic device for display.
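The one-virtual-screen-per-application scheme above can be sketched with a small, hypothetical Python model (the `MultiDisplayManager` class and its method names are illustrative only): each cast application is bound to its own display, re-rendered when the app's interface updates, and the per-display frames are what would be handed to the video codec and transmission modules:

```python
class MultiDisplayManager:
    """Illustrative stand-in for the multi-display management module."""
    def __init__(self):
        self.displays = {}
        self._next_id = 1

    def create_virtual_display(self, app):
        name = f"display{self._next_id}"
        self._next_id += 1
        # Bind the application to its own virtual screen.
        self.displays[name] = {"bound_app": app, "frame": None}
        return name

    def render(self, display_name, frame):
        # Re-invoked whenever the bound app's interface content updates.
        self.displays[display_name]["frame"] = frame

    def cast_frames(self):
        # These frames would be encoded and transmitted per display.
        return {name: d["frame"] for name, d in self.displays.items()}

mgr = MultiDisplayManager()
d1 = mgr.create_virtual_display("first_app")    # foreground app
d2 = mgr.create_virtual_display("second_app")   # background app
mgr.render(d1, "first_app_ui_v1")
mgr.render(d2, "second_app_ui_v1")
```

Because each application has its own display, updating one app's interface only re-renders and re-casts that app's frame, matching the per-window synchronization described above.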
S910, the second electronic device displays a second window and a third window, wherein the second window comprises an application interface of a first application projected by the first electronic device, and the third window comprises an application interface of a second application projected by the first electronic device.
In the embodiment of the application, after the first electronic device and the second electronic device are connected in a multi-screen cooperative manner, the first electronic device can send the first display data of the application interface generated when the foreground is running the first application to the second electronic device. After receiving the first display data from the first electronic device, the second electronic device may display a second window according to the first display data, where the second window includes an application interface of a first application that is running in the foreground of the first electronic device.
For example, when the first application running in the foreground of the mobile phone (first electronic device) is a desktop application, that is, when the mobile phone currently displays a desktop interface, the mobile phone may cast the desktop application running in the foreground to the PC (second electronic device). At this time, the mobile phone can send display data of the desktop interface generated when the desktop application is currently running to the PC. After the PC receives the display data, a desktop interface of the desktop application can be drawn by display modules in the PC such as the GPU and the graphics card. At this time, as shown in fig. 2 (a), the PC may draw the desktop interface 103 of the desktop application on the screen from the display data; this may also be referred to as a screen-casting window, on the PC, of the desktop application on the mobile phone, and this screen-casting window is the second window. Because the application interface may be updated in real time while the mobile phone operates the desktop application (such as switching desktops left and right), the mobile phone can send new display data to the PC in real time, so that the PC can continuously redraw, in real time, the application interface displayed by the desktop application on its screen.
In one scenario, the first electronic device may also send second display data of an application interface generated when the second application is run in the background to the second electronic device. After receiving the second display data from the first electronic device, the second electronic device may display a third window according to the second display data, where the third window includes an application interface of a second application running in the background of the first electronic device. Thus, the first electronic device can display the started multiple applications on the second electronic device in a screen-projection mode.
For example, a mobile phone (first electronic device) casts a gallery application running in the background to a PC (second electronic device). If the display data sent by the mobile phone is of an application interface generated when the gallery application runs in the background, the PC can, after receiving the display data, draw an application interface of the gallery application by display modules in the PC such as the GPU and the graphics card. At this time, as shown in fig. 3 (a), the PC may draw the application interface 104 of the gallery application on the screen from the display data; this may also be referred to as a screen-casting window, on the PC, of the gallery application running in the background on the mobile phone, and this screen-casting window is the third window.
S920, the first electronic device creates a first program corresponding to the first application and a second program corresponding to the second application.
In the embodiment of the application, the first electronic device can create a program for each application to be cast, and the program is used for processing, in a centralized manner, operation events directed at that application's window (activity), including operation events performed by the user on the first electronic device and operation events performed by the user on the second electronic device. In this way, when there is an operation input to the application from the first electronic device and an operation input to the application from the second electronic device at the same time, the operation inputs of the two ends are not preempted but are processed centrally by the created dedicated program.
In one scenario, when a first electronic device starts a first application and determines that the first application is projected onto a second electronic device for collaborative display, an application framework layer of the first electronic device may create a first program corresponding to the first application. The first program is used for intensively processing the operation event aiming at the first application window activity 1. The method comprises the steps of carrying out operation events on the first application window activity1 by a user on the first electronic equipment, and also comprises the step of carrying out operation events on the first application window activity1 by the user on the second electronic equipment.
In one scenario, when the first electronic device determines that the started second application is projected onto the second electronic device for collaborative display, the application framework layer of the first electronic device may also create a second program corresponding to the second application. The second program is used for intensively processing the operation event aiming at the second application window activity 2. The method comprises the steps of performing operation events on the second application window activity2 on the first electronic device by a user, and performing operation events on the second application window activity2 on the second electronic device by the user.
Optionally, the first program and the second program are processes or threads whose lifecycle is managed by the system, such as services (service). In the following, the first program is referred to as the first service (service1) and the second program as the second service (service2).
In one scenario, an application framework layer of the first electronic device may bind service1 with the first application and a virtual screen on which the first application projects, that is, bind service1 with windows activity1 and display1 of the first application.
In one scenario, an application framework layer of a first electronic device may bind service2 with a second application and a virtual screen on which the second application projects, that is, bind service2 with windows activity2 and display2 of the second application.
It can be appreciated that when the first electronic device determines that more applications, such as a third application, are projected onto the second electronic device for collaborative display, the application framework layer of the first electronic device may also create a third service3 corresponding to the third application, where the service3 is bound to the window activity3 of the third application and the virtual screen display3 corresponding to the third application. As shown in fig. 10, each of the applications being screened corresponds to one service, and a virtual screen display for screening is performed.
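The one-to-one-to-one binding shown in fig. 10 can be expressed as a simple registry. This is an illustrative Python sketch (the function `bind_cast_services` and the key names are assumptions, not part of the embodiment), mapping each cast application to its service, window activity, and virtual screen:

```python
def bind_cast_services(apps):
    """Bind serviceN <-> activityN <-> displayN for each cast application,
    mirroring the per-application binding in fig. 10."""
    bindings = {}
    for i, app in enumerate(apps, start=1):
        bindings[app] = {
            "service": f"service{i}",    # centralizes the app's input events
            "activity": f"activity{i}",  # the app's window
            "display": f"display{i}",    # the app's casting virtual screen
        }
    return bindings

bindings = bind_cast_services(["first_app", "second_app", "third_app"])
```

Any operation event carrying an application's identifier can then be looked up in this registry and routed to that application's dedicated service.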
In some embodiments, before the first electronic device creates the service, an application framework layer of the first electronic device may determine whether the screened application supports the multi-touch mode. The multi-touch mode may be a mode that allows multiple operation events to be input simultaneously. For example, a plurality of users cooperate to operate a game, a plurality of users supplement file materials to each other, a plurality of users annotate a document, and the like, and a scene in which a plurality of users operate at the same time exists.
Optionally, if the application framework layer of the first electronic device determines that the first application supports the multi-touch mode, a service1 corresponding to the first application may be created. If the first application is judged not to support the multi-touch mode, the service1 corresponding to the first application may not be created. The embodiment of the application is described by taking the first application as an application supporting a multi-touch mode as an example. For example, the first application may be a game application commonly operated by a plurality of users, or may be a document application commonly operated by a plurality of users.
Optionally, if the application framework layer of the first electronic device determines that the second application supports the multi-touch mode, a service2 corresponding to the second application may be created. If the second application is judged not to support the multi-touch mode, the service2 corresponding to the second application may not be created. The embodiment of the application is described by taking the second application as an application supporting the multi-touch mode as an example. For example, the second application may be a game-like application operated by a plurality of users, or may be a document-like application operated by a plurality of users.
It will be appreciated that when the cast application does not support the multi-touch mode, the application can only receive and respond to one operation event at a time, and there is no need for it to receive and respond to multiple operation events simultaneously, so a service for that application may not be created. In this case, if there is an operation input to the application from the first electronic device and an operation input to the application from the second electronic device at the same time, the operation inputs of the two electronic devices still preempt each other in the application framework layer of the first electronic device, and the application framework layer then delivers the operation input of whichever end prevails in the preemption to the application for processing.
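The decision described in this passage, create a fusion service only for applications that support the multi-touch mode, can be sketched as follows. The function and field names (`prepare_input_path`, `supports_multi_touch`) are hypothetical stand-ins for whatever capability flag the framework would actually check:

```python
def prepare_input_path(app_name, supports_multi_touch):
    """Choose the input-dispatch path for a cast application."""
    if supports_multi_touch:
        # Multi-touch apps get a dedicated fusion service so inputs from
        # both devices are merged instead of preempting each other.
        return {"app": app_name, "path": "fusion_service",
                "service": f"service_{app_name}"}
    # Other apps keep the default single-channel (preemptive) path.
    return {"app": app_name, "path": "preemptive", "service": None}

game = prepare_input_path("game", True)    # e.g. co-op game application
clock = prepare_input_path("clock", False) # single-user application
```

Only `game` is given a service; input to `clock` continues to follow the preemptive behaviour described above.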
S930, the first electronic device detects a first operation acting on the first window, and receives first control information from the second electronic device, where the first control information is generated by the second electronic device according to a second operation acting on the second window.
In the embodiment of the application, after the application framework layer of the first electronic device successfully creates the service1 corresponding to the first application, the application framework layer can simultaneously receive operation inputs for the first application from the first electronic device and from the second electronic device.
Optionally, when the input device driver of the first electronic device detects a first operation of the user acting on the first window, the input device driver reports a first operation event to the application framework layer through the IMS. The first operation may be a click, slide, or drag operation in the first window, or a click, long-press, or similar operation on a control in the first window.
Wherein the first operation event may include an operation type, an operation time, a position coordinate, and the like of the first operation. The operation type includes clicking operation, long-press operation, sliding operation, and the like, and the operation time can be understood as the time when the operation occurs, and the position coordinates of the operation refer to the position coordinates of the operation on the display screen.
Alternatively, the first operation event may also carry parameter information such as pressure intensity, color, and the like.
Optionally, the first operation event may further carry identification information of the first application corresponding to the first window that the first operation acts on, so that the application framework layer of the first electronic device can determine which application's window the first operation event targets.
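As an illustrative sketch of the operation-event contents described above, the fields (operation type, operation time, position coordinates, optional pressure and color parameters, and the target application's identification) can be modeled as a small record. All class and field names here are hypothetical, not taken from any real framework API:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical sketch of an operation event per the description above;
# names are illustrative, not from any real device framework.
@dataclass
class OperationEvent:
    op_type: str                        # e.g. "click", "long_press", "slide"
    timestamp_ms: int                   # time at which the operation occurred
    x: float                            # position coordinates on the display
    y: float
    pressure: Optional[float] = None    # optional pressure intensity parameter
    color: Optional[Tuple[int, int, int]] = None  # optional RGB parameter
    app_id: Optional[str] = None        # identifies the target application window

ev = OperationEvent("click", 1000, 120.0, 340.0, pressure=0.5, app_id="app1")
print(ev.op_type, ev.app_id)
```

The `app_id` field corresponds to the identification information that lets the framework layer route the event to the correct application window.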
Optionally, when the input device driver of the second electronic device detects a second operation of the user acting on the second window, the input device driver may send a second operation event to an application layer of the second electronic device, such as a collaborative application. The collaborative application of the second electronic device may then generate first control information according to the second operation event and send it to the first electronic device through a screen-casting protocol, for example a Cast protocol.
In one scenario, the collaborative application of the second electronic device may send the original second operation event to an input transmission module, which calls an underlying capability module to pre-process the original second operation event into first control information identifiable by the first electronic device, and then sends the first control information to the first electronic device through the screen-casting protocol. The first control information comprises a second operation event that the first electronic device can identify.
The original second operation event may include an operation type, operation time, position coordinates, etc. of the second operation, may also include parameter information such as pressure intensity, color, etc., and may also include identification information of the first application corresponding to the second window operated by the original second operation event.
It will be appreciated that the location coordinates in the original second operation event generally refer to the location coordinates of the second operation on the display screen of the second electronic device, and therefore, the location coordinates need to be converted into location coordinates on the display screen of the first electronic device, so that the first electronic device can accurately identify the active location of the second operation within the application window of the first application.
For example, the second electronic device may establish a rectangular coordinate system with the upper left corner of the display screen as the origin O. Because the application interface displayed on the screen of the second electronic device corresponds to the application interface when the mobile phone runs the first application, the second electronic device can convert any coordinate in the screen into the corresponding coordinate when the mobile phone runs the application interface of the first application according to the corresponding relation. When the second electronic device detects that a user inputs operations such as clicking, double clicking, dragging, mouse scrolling or keyboard inputting in the display screen, the second electronic device can generate a corresponding control message and send the corresponding control message to the first electronic device. For example, when the operation type of the second operation is a click operation, the second electronic device may carry the coordinates of the click operation in its second window in the corresponding control message instead of the coordinates of the click operation on the screen of the second electronic device.
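The coordinate conversion described above can be sketched minimally as below, assuming the projected window is a uniformly scaled copy of the source application interface. The function and parameter names are hypothetical, not part of any real screen-casting API:

```python
def to_source_coords(x, y, win_origin, win_size, app_size):
    """Map a point on the sink device's display into coordinates inside the
    source device's application window (illustrative sketch).

    win_origin: (left, top) of the projected window on the sink display.
    win_size:   (width, height) of that projected window.
    app_size:   (width, height) of the application interface on the source.
    """
    wx, wy = win_origin
    ww, wh = win_size
    aw, ah = app_size
    # Translate into window-local coordinates, then scale to the source size.
    return ((x - wx) * aw / ww, (y - wy) * ah / wh)

# A click at (300, 500) inside a 540x960 projected window placed at (100, 100)
# on the sink, projected from a 1080x1920 source application interface:
print(to_source_coords(300, 500, (100, 100), (540, 960), (1080, 1920)))
# → (400.0, 800.0)
```

With such a mapping, the control message can carry coordinates in the source application's window rather than raw sink-screen coordinates, as the paragraph above describes.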
The first control information transmitted by the second electronic device through the screen-casting protocol is also delivered to the application framework layer of the first electronic device. Therefore, when an operation input to the first application from the first electronic device and one from the second electronic device arrive at the same time, the application framework layer of the first electronic device may receive both the reverse control information (the first control information) transmitted by the second electronic device through the screen-casting protocol and the operation event (the first operation event) reported within the first electronic device through the IMS. At this time, the application framework layer of the first electronic device may transfer the operations for the first application to the first service corresponding to the first application.
S940, the first electronic device performs fusion processing on the first operation and the first control information through a first program corresponding to the first application to obtain a first fusion event.
In the embodiment of the application, after receiving the first control information from the second electronic device and the first operation event from the first electronic device, the first service corresponding to the first application may perform fusion processing on the first operation event and the first control information to obtain a first fused event.
In one scenario, when the position coordinates of the first operation event and of the second operation event in the first control information fall at the same position in the application window, the fusion processing performed by the first service may retain only one operation event, that is, keep one of the multiple operation events at the same position. The resulting first fused event is then either the first operation event or the second operation event, which is not limited here. For example, the event retained may be chosen according to the operation times of the events. This achieves de-duplication of overlapping operation inputs.
In one scenario, when the position coordinates of the first operation event and of the second operation event in the first control information fall at the same position, the fusion processing performed by the first service may instead fuse the parameter information of the two operation events, that is, superimpose the parameter information of the multiple operation events at the same position. For example, when the first operation event and the second operation event both carry pressure intensity information, their pressure intensity values may be superimposed, giving the users at both ends the experience of pressing jointly at the same location. For another example, when the first operation event and the second operation event both carry RGB color information, their RGB color values may be superimposed, giving the users at both ends the experience of mixing operation colors at the same location. The resulting first fused event is an operation event carrying new information obtained by fusing the first operation event and the second operation event.
In one scenario, when the position coordinates of the first operation event and of the second operation event in the first control information fall at different positions, the fusion processing performed by the first service may treat the two operation events as two operations performed concurrently by a single user on the first electronic device, so that from the application's perspective there is only one source of operation input.
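The three fusion cases above — de-duplication at the same position, parameter superposition at the same position, and concurrent delivery at different positions — can be sketched as follows. The event dictionaries and key names are illustrative assumptions, not the patented implementation:

```python
def fuse(e1, e2):
    """Fuse two simultaneous operation events (illustrative sketch).

    e1, e2: dicts with keys "x", "y", "t" (operation time) and optionally
    "pressure". Returns the list of fused events handed to the application.
    """
    if (e1["x"], e1["y"]) == (e2["x"], e2["y"]):
        if "pressure" in e1 and "pressure" in e2:
            # Same position with pressure info: superimpose the intensities.
            merged = dict(e1)
            merged["pressure"] = e1["pressure"] + e2["pressure"]
            return [merged]
        # Same position, nothing to superimpose: keep the earlier event.
        return [min((e1, e2), key=lambda e: e["t"])]
    # Different positions: deliver both, as if performed concurrently by a
    # single input source from the application's point of view.
    return [e1, e2]

a = {"x": 10, "y": 20, "t": 1, "pressure": 0.4}
b = {"x": 10, "y": 20, "t": 2, "pressure": 0.3}
print(fuse(a, b))  # a single event whose pressure is the superimposed sum
```

Color superposition would follow the same shape as the pressure branch, combining the RGB parameters instead.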
In some embodiments, the first service may store the operation events within the same time period in a queue, and may create a thread that processes the operation events stored for each time period in the queue. The time period may be a relatively short unit of time, and the operation events within it may be regarded as occurring at the same moment. It will be appreciated that the operation events within such a short period are typically few, so the thread can process them quickly.
As shown in fig. 11, the first service may store the operation events from the first electronic device and/or the second electronic device per unit time in a queue. Meanwhile, the first service creates a thread, processes the operation events in each unit time in parallel, and fuses the operation events.
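The queue-plus-thread arrangement can be sketched as below, assuming events are dicts carrying an operation time `"t"` in milliseconds, grouped into fixed unit time slots. All names, the slot width, and the placeholder fusion rule are hypothetical:

```python
import queue
import threading
from collections import defaultdict

SLOT_MS = 10          # unit time period; events in one slot count as simultaneous
event_queue = queue.Queue()
fused_per_slot = {}   # slot index -> fused events for that slot

def worker():
    # Drain the queue, grouping events by unit-time slot, then fuse each slot.
    slots = defaultdict(list)
    while True:
        ev = event_queue.get()
        if ev is None:                     # sentinel: no more events
            break
        slots[ev["t"] // SLOT_MS].append(ev)
    for slot, events in slots.items():
        # Placeholder fusion: keep one event per position inside the slot.
        by_pos = {}
        for e in events:
            by_pos.setdefault((e["x"], e["y"]), e)
        fused_per_slot[slot] = list(by_pos.values())

t = threading.Thread(target=worker)
t.start()
for ev in [{"t": 1, "x": 5, "y": 5}, {"t": 3, "x": 5, "y": 5}, {"t": 14, "x": 7, "y": 8}]:
    event_queue.put(ev)
event_queue.put(None)
t.join()
print(sorted(fused_per_slot))  # slots 0 and 1 were populated
```

A production design would run one such worker per unit time period so that slots are fused in parallel, as the text describes.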
In some embodiments, the thread's fusion processing of the operation events stored within the same time period may simulate a canvas represented as a matrix, where each point on the canvas represents the pixels within a certain range and receives the operation events stored within the unit time period. The canvas thus maps the coordinate point of each operation event on the display (the display screen of the first electronic device or the application window of the first application) onto a point of the canvas. In this way, the operation events can be stored in a matrix that simulates the screen, using a new processing logic.
As shown in FIG. 12, during thread processing, the canvas may record the increment of operation events from the first electronic device and/or the second electronic device within this unit time period. When processing the increment, events at the same operation position may be reduced to the event from one end. If the operation events carry a pressure parameter, an algorithm may be used to superimpose the pressure parameters. If the operation events carry color information, events with different color information can be fused physically and combined into a new operation event. For another example, events at different operation positions may be combined, as in the previous embodiment, into two operation events performed concurrently.
It will be appreciated that since the number of operation events per unit time is small while the matrix is large, the matrix the thread processes is typically sparse, and various existing optimization algorithms can be used to optimize its storage and computation.
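Because only a handful of canvas points are touched per unit time, the matrix can be stored sparsely — for example as a dictionary keyed by coordinates, a common sparse-storage optimization. The class below is an illustrative sketch under that assumption, not the patented implementation:

```python
class SparseCanvas:
    """Sparse canvas for one unit time period: only touched points are stored,
    instead of a full screen-sized matrix (illustrative sketch)."""

    def __init__(self):
        self.points = {}  # (x, y) -> accumulated event at that canvas point

    def record(self, event):
        key = (event["x"], event["y"])
        if key in self.points and "pressure" in event:
            # Overlapping position: superimpose the pressure parameter.
            self.points[key]["pressure"] = (
                self.points[key].get("pressure", 0.0) + event["pressure"]
            )
        else:
            self.points[key] = dict(event)

    def flush(self):
        # Hand the fused events for this time slice to the application,
        # then clear the canvas for the next unit time period.
        events, self.points = list(self.points.values()), {}
        return events

canvas = SparseCanvas()
canvas.record({"x": 10, "y": 10, "pressure": 0.4})
canvas.record({"x": 10, "y": 10, "pressure": 0.5})
canvas.record({"x": 30, "y": 40})
print(len(canvas.flush()))  # 2 fused events, one per touched point
```

Storing only touched coordinates keeps both memory and per-slot computation proportional to the few events per unit time rather than to the screen size.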
Therefore, the operation event fusion processing provided by the embodiment of the application can de-duplicate operation events input simultaneously from multiple ends at overlapping positions, and can also superimpose and enhance press events input simultaneously from multiple ends at overlapping positions. Meanwhile, the parallel processing of two-dimensional data has lower latency and is smoother than processing plain one-dimensional data.
In some embodiments, after the first service obtains the first fused event through the thread's fusion processing, it may pass the first fused event to the application layer, where the first application processes and responds to it.
S950, the first electronic device executes the operation instruction corresponding to the first fusion event on the first application.
In the embodiment of the application, after the application framework layer of the first electronic device obtains, through the first program, the first fusion event resulting from fusing the first operation and the first control information, it can report the first fusion event to the corresponding application in the application layer, namely the first application. The first application may then call a corresponding function to execute the operation instruction corresponding to the first fusion event.
In some embodiments, the first application may generate new display data (e.g., a new application interface) when executing the operation instruction corresponding to the first fusion event. The first electronic device may then render the first virtual screen display1 according to the new display data, and the updated application interface of the first application drawn on the first virtual screen may be displayed on the display screen of the first electronic device. Meanwhile, the first electronic device may continue to send the new display data to the second electronic device according to the screen projection method, triggering the second electronic device to update the application interface of the first application in the second window.
In some embodiments, at a given moment the first electronic device may receive only the first control information from the second electronic device. In that case, the first electronic device may process the first control information through the first program corresponding to the first application. Because there is no operation event from another electronic device, no fusion is performed, and the second operation event in the first control information is output directly to the first application in the application layer for processing.
In some embodiments, the first electronic device simultaneously receives operations from the first electronic device and the second electronic device, but the operations act on windows of different applications. The first electronic device may then process the operations of each application through the program corresponding to that application.
In one scenario, the first electronic device detects a third operation acting on the first window and receives second control information from the second electronic device, the second control information being generated by the second electronic device according to a fourth operation acting on the third window. The first electronic device can process the third operation through the first program corresponding to the first application to obtain a second fusion operation, and process the second control information through the second program corresponding to the second application to obtain a third fusion operation. The first electronic device then executes the operation instruction corresponding to the second fusion operation on the first application, and executes the operation instruction corresponding to the third fusion operation on the second application.
It will be appreciated that since the operation input for the first application comes only from the first electronic device at this moment (i.e. there is only the operation input to the first window), there is no input from another source to fuse. Therefore, when the operation input of the first electronic device is passed to the first program corresponding to the first application, the first program may directly transfer the operation input to the first application without further processing.
Similarly, since the operation input for the second application comes only from the second electronic device at this moment (i.e. there is only the operation input to the third window), there is no input from another source to fuse. Therefore, when the operation input of the second electronic device is passed to the second program corresponding to the second application, the second program may directly transfer the operation input to the second application without further processing.
Therefore, when the first electronic device projects a plurality of applications onto the second electronic device for display, the first electronic device can fuse the multi-end operation inputs of the plurality of applications accurately and without mutual interference through the programs corresponding to those applications. This realizes simultaneous multi-end operation input in a multi-application screen-projection scenario and meets users' need for simultaneous multi-end touch, for example when several users need to annotate the same document together, or to operate the same game together.
In some embodiments, when the first application is an application that does not support the multi-touch mode and the second application is an application that supports the multi-touch mode, the first electronic device may create a corresponding second program for only the second application.
In one scenario, when the operation inputs from the first electronic device and the second electronic device are both directed at the first application at the same moment, the two inputs still contend with each other in the application framework layer of the first electronic device, and the application framework layer then transmits the operation input of the electronic device that wins the contention to the first application for processing.
In another scenario, when the operation input from the first electronic device is directed at the first application at the same moment while the operation input from the second electronic device is directed at the second application, the operation input from the first electronic device is transferred directly to the first application for processing through the application framework layer of the first electronic device, while the operation input from the second electronic device is processed by the second program corresponding to the second application in the application framework layer of the first electronic device and then transferred to the second application for processing.
In some embodiments, when the first application is an application supporting the multi-touch mode and the second application is an application that does not support the multi-touch mode, the first electronic device may create the corresponding first program only for the first application.
In one scenario, when the operation inputs from the first electronic device and the second electronic device are both directed at the first application at the same moment, the inputs of the two electronic devices are passed, in the application framework layer of the first electronic device, to the first program corresponding to the first application for fusion processing, and the processed result is then transferred to the first application for processing. In another scenario, when the operation input from the first electronic device is directed at the first application at the same moment while the operation input from the second electronic device is directed at the second application, the operation input from the first electronic device is passed, in the application framework layer of the first electronic device, to the first program corresponding to the first application for processing, and the processed result is then transferred to the first application. The operation input from the second electronic device is transferred directly to the second application for processing through the application framework layer of the first electronic device.
In this way, the application framework layer of the first electronic device may determine whether to create a corresponding dedicated program according to the types of the applications projected to the second electronic device. Thereafter, according to the source of a received operation input and the target application (corresponding display) on which it acts, the application framework layer can determine which inputs need to be processed by a dedicated program before being transferred to the application, and which inputs can be transferred directly to the application according to the existing procedure. Thus, in a multi-application screen-projection scenario, the reporting and processing flows of operation inputs can be differentiated according to the types of the different applications.
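The routing decision just described — a dedicated fusion program for applications that support the multi-touch mode, direct pass-through for those that do not — can be sketched as below. All names are hypothetical, and the toy fusion rule stands in for the fusion processing described earlier:

```python
def route_inputs(events, fusion_programs):
    """Split incoming operation events between dedicated per-application
    fusion programs and the direct (original) delivery path (sketch).

    events: iterable of dicts, each with an "app_id" key.
    fusion_programs: app_id -> callable that fuses that app's event list;
        an app has an entry only if it supports the multi-touch mode.
    Returns (directly delivered events, fused events per application).
    """
    direct, pending = [], {}
    for ev in events:
        app = ev["app_id"]
        if app in fusion_programs:
            pending.setdefault(app, []).append(ev)   # needs fusion first
        else:
            direct.append(ev)                        # original procedure
    fused = {app: fusion_programs[app](evs) for app, evs in pending.items()}
    return direct, fused

# app1 supports multi-touch (it has a fusion program); app2 does not.
programs = {"app1": lambda evs: evs[:1]}             # toy fusion: keep first
direct, fused = route_inputs(
    [{"app_id": "app1", "t": 1}, {"app_id": "app1", "t": 2}, {"app_id": "app2", "t": 3}],
    programs,
)
print(len(direct), len(fused["app1"]))  # 1 1
```

The two simultaneous events for app1 pass through its fusion program, while the single event for app2 follows the direct path, mirroring the differentiated reporting flows above.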
The embodiment of the application provides a multi-screen cooperative control method that can create a corresponding program according to the application type of each application in multi-screen collaboration, so that the independent program of each application fuses the inputs made to that application by multiple ends, and the fused result is then injected into the application for processing. This allows multiple ends to perform operation input simultaneously, solving the problem that only one end's operation input can be responded to in a multi-screen collaborative scenario.
In a multi-application screen-projection scenario with multi-screen collaboration, the simultaneous multi-end touch inputs of each of the plurality of applications can be processed through the different programs created for them. The fusion of operation inputs is thus established per application and per screen display, so inputs originating from other devices (sink ends) can be fused for content projected by the source device, and the fusion is not limited to what the source device itself displays. Therefore, the simultaneous multi-end touch of each application can be fused without mutual interference, multiple ends can simultaneously provide operation input to one application in a multi-application screen-projection scenario, and the problem that only one end's operation input can be responded to in such a scenario is solved.
It will be appreciated that the electronic device, in order to achieve the above-described functions, includes corresponding hardware and/or software modules that perform the respective functions. The present application can be implemented in hardware or a combination of hardware and computer software, in conjunction with the example algorithm steps described in connection with the embodiments disclosed herein. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application in conjunction with the embodiments, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The present embodiment may divide the functional modules of the electronic device according to the above method example; for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated modules described above may be implemented in hardware. It should be noted that, in this embodiment, the division of the modules is schematic and is merely a division of logical functions; other division manners may be used in actual implementation.
The embodiment of the application also provides electronic equipment, which comprises a memory, a processor and a computer program stored in the memory and capable of running on the processor, wherein when the processor executes the computer program, the electronic equipment realizes the functions or steps executed by the electronic equipment in the above method embodiments.
The embodiment of the application also provides a multi-screen cooperative control device which can be applied to the electronic equipment. The apparatus is configured to perform the functions or steps performed by the electronic device in the method embodiments described above.
The embodiment of the application also provides a chip system which comprises at least one processor and at least one interface circuit. The processors and interface circuits may be interconnected by wires. The interface circuit may read the instructions stored in the memory and send the instructions to the processor. The instructions, when executed by the processor, may cause the electronic device to perform the various functions or steps performed by the electronic device in the method embodiments described above. Of course, the system-on-chip may also include other discrete devices, which are not particularly limited in the embodiments of the present application.
The embodiment of the application also provides a computer storage medium, which comprises computer instructions, when the computer instructions are run on the electronic device, the electronic device is caused to execute the functions or steps executed by the electronic device in the embodiment of the method.
The embodiment of the application also provides a computer program product, which when run on an electronic device, causes the electronic device to execute the functions or steps executed by the electronic device in the above-mentioned method embodiment.
The electronic device, the multi-screen cooperative control apparatus, the computer storage medium, the computer program product or the chip provided in this embodiment are all configured to execute the corresponding method provided above, so that the beneficial effects achieved by the electronic device, the multi-screen cooperative control apparatus, the computer storage medium, the computer program product or the chip can refer to the beneficial effects in the corresponding method provided above, and are not repeated herein.
It will be apparent to those skilled in the art from this description that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (may be a single-chip microcomputer, a chip or the like) or a processor (processor) to perform all or part of the steps of the method described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read Only Memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely illustrative of specific embodiments of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (24)

1. A multi-screen cooperative control method, characterized in that it is applied to a first electronic device, the method comprising:
Displaying a first window, wherein the first window comprises an application interface of a first application;
Detecting a first operation acting on the first window, and receiving first control information from second electronic equipment, wherein the first control information is generated by the second electronic equipment according to a second operation acting on a second window, and the second window comprises an application interface of the first application projected by the first electronic equipment;
The first operation and the first control information are subjected to preset processing through a first program corresponding to the first application to obtain a first event, and the preset processing is used for indicating the first electronic equipment to respond to the first operation and the first control information;
and responding to the first event, and executing an operation instruction corresponding to the first event on the first application.
2. The method according to claim 1, wherein the method further comprises:
And when the application interface of the first application is projected to the second window of the second electronic device for display, creating a first program corresponding to the first application.
3. The method of claim 2, wherein prior to the creating the first program corresponding to the first application, the method further comprises:
The first application is detected to support a multi-touch mode.
4. A method according to any of claims 1-3, wherein the first operation is a touch operation corresponding to a first operation position in the first application, the first control information is a touch operation corresponding to a second operation position in the first application, and the first event comprises the first operation acting on the first operation position and the first control information acting on the second operation position.
5. The method of claim 4, wherein executing the operation instruction corresponding to the first event on the first application in response to the first event comprises:
in response to the first operation acting on the first operation position and the first control information acting on the second operation position, displaying the updated application interface of the first application in the first window, and projecting the updated application interface of the first application to the second window of the second electronic device for display.
6. The method according to any one of claims 1-3, wherein, when the first operation and the first control information are the same touch operation acting on the same operation position in the first application, the first event comprises the first operation or the first control information acting on the same operation position.
7. The method of claim 6, wherein executing the operation instruction corresponding to the first event on the first application in response to the first event comprises:
in response to the first operation or the first control information acting on the same operation position, displaying the updated application interface of the first application in the first window, and projecting the updated application interface of the first application to the second window of the second electronic device for display.
8. The method according to any one of claims 1-3, wherein the first operation is a touch operation corresponding to a third operation position in the first application and first pressure information, the first control information comprises a touch operation corresponding to the third operation position in the first application and second pressure information, the first event is a new touch operation acting on the third operation position and third pressure information, and the third pressure information is pressure information obtained by superimposing the first pressure information and the second pressure information;
and/or
the first operation is a touch operation corresponding to a third operation position in the first application and first color information, the first control information comprises a touch operation corresponding to the third operation position and second color information, the first event is a new touch operation acting on the third operation position and third color information, and the third color information is color information obtained by fusing the first color information and the second color information.
9. The method of claim 8, wherein executing the operation instruction corresponding to the first event on the first application in response to the first event comprises:
in response to the first event acting on the third operation position, displaying the updated application interface of the first application in the first window, and projecting the updated application interface of the first application to the second window of the second electronic device for display.
10. The method according to any one of claims 1-9, wherein the first application does not support the multi-touch mode, and after the detecting the first operation acting on the first window and receiving the first control information from the second electronic device, the method further comprises:
selecting one of the first operation and the first control information as a target event; and
in response to the target event, executing an operation instruction corresponding to the target event on the first application.
11. The method according to any one of claims 1-10, further comprising:
detecting a third operation acting on the first window, and receiving second control information from the second electronic device, wherein the second control information is generated by the second electronic device according to a fourth operation acting on a third window, and the third window comprises an application interface of a second application projected by the first electronic device;
performing preset processing on the third operation through the first program corresponding to the first application to obtain a second event, wherein the preset processing is used to instruct the first electronic device to respond to the third operation;
performing preset processing on the second control information through a second program corresponding to the second application to obtain a third event, wherein the preset processing is used to instruct the first electronic device to respond to the second control information; and
in response to the second event and the third event, executing an operation instruction corresponding to the second event on the first application, and executing an operation instruction corresponding to the third event on the second application.
12. The method according to claim 11, wherein the second application does not support the multi-touch mode, and after the detecting the third operation acting on the first window and receiving the second control information from the second electronic device, the method further comprises:
performing preset processing on the third operation through the first program corresponding to the first application to obtain the second event; and
in response to the second event and the second control information, executing an operation instruction corresponding to the second event on the first application, and executing an operation instruction corresponding to the second control information on the second application.
13. The method according to any one of claims 1-12, wherein the first program corresponds to a first queue, and the performing preset processing on the first operation and the first control information through the first program corresponding to the first application to obtain the first event comprises:
storing the first operation and the first control information in the first queue; and
performing preset processing on the first operation and the first control information in the first queue through the first program corresponding to the first application to obtain the first event.
14. A multi-screen collaborative control method, the method comprising:
a first electronic device displays a first window, wherein the first window comprises an application interface of a first application;
a second electronic device displays a second window, wherein the second window comprises the application interface of the first application projected by the first electronic device;
the first electronic device detects a first operation acting on the first window, and receives first control information from the second electronic device, wherein the first control information is generated by the second electronic device according to a second operation acting on the second window;
the first electronic device performs preset processing on the first operation and the first control information through a first program corresponding to the first application to obtain a first event, wherein the preset processing is used to instruct the first electronic device to respond to both the first operation and the first control information; and
in response to the first event, the first electronic device executes an operation instruction corresponding to the first event on the first application.
15. The method of claim 14, wherein the method further comprises:
when the first electronic device projects the application interface of the first application to the second window of the second electronic device for display, the first electronic device creates a first program corresponding to the first application.
16. The method of claim 15, wherein prior to the creating the first program corresponding to the first application, the method further comprises:
The first electronic device detects that the first application supports a multi-touch mode.
17. The method according to any one of claims 14-16, wherein the first operation is a touch operation corresponding to a first operation position in the first application, the first control information is a touch operation corresponding to a second operation position in the first application, and the first event comprises the first operation acting on the first operation position and the first control information acting on the second operation position.
18. The method according to any one of claims 14-16, wherein, when the first operation and the first control information are the same touch operation acting on the same operation position in the first application, the first event comprises the first operation or the first control information acting on the same operation position.
19. The method according to any one of claims 14 to 16, wherein the first operation is a touch operation corresponding to a third operation position in the first application and first pressure information, the first control information is a touch operation corresponding to the third operation position in the first application and second pressure information, the first event is a new touch operation acting on the third operation position and third pressure information, and the third pressure information is pressure information obtained by superimposing the first pressure information and the second pressure information;
and/or
the first operation is a touch operation corresponding to a third operation position in the first application and first color information, the first control information is a touch operation corresponding to the third operation position and second color information, the first event is a new touch operation acting on the third operation position and third color information, and the third color information is color information obtained by fusing the first color information and the second color information.
20. The method according to any one of claims 14-19, further comprising:
the second electronic device displays a third window, wherein the third window comprises an application interface of a second application projected by the first electronic device;
the first electronic device detects a third operation acting on the first window, and receives second control information from the second electronic device, wherein the second control information is generated by the second electronic device according to a fourth operation acting on the third window;
the first electronic device performs preset processing on the third operation through a first program corresponding to the first application to obtain a second event, wherein the preset processing is used to instruct the first electronic device to respond to the third operation;
the first electronic device performs preset processing on the second control information through a second program corresponding to the second application to obtain a third event, wherein the preset processing is used to instruct the first electronic device to respond to the second control information; and
in response to the second event and the third event, the first electronic device executes an operation instruction corresponding to the second event on the first application, and executes an operation instruction corresponding to the third event on the second application.
21. An electronic device, comprising a memory and one or more processors, wherein the memory is coupled to the processor; the memory is configured to store computer program code, the computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the method according to any one of claims 1-20.
22. A chip system, applied to an electronic device, the chip system comprising one or more interface circuits and one or more processors, wherein the interface circuit and the processor are interconnected through a line; the interface circuit is configured to receive a signal from a memory of the electronic device and to send the signal to the processor, the signal comprising computer instructions stored in the memory; and when the processor executes the computer instructions, the electronic device performs the method according to any one of claims 1-20.
23. A computer-readable storage medium, comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method according to any one of claims 1-20.
24. A computer program product which, when run on an electronic device, causes the electronic device to perform the method according to any one of claims 1-20.
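As an editorial illustration of the event merging described in claims 1, 4, 6, 8 and 13, the sketch below shows one way the "preset processing" could combine a locally detected operation with remote control information: events at different operation positions become one multi-touch event, a duplicated event at the same position is kept once, and pressure information at the same position is superimposed, with a per-application queue buffering events before processing. All names (`TouchEvent`, `preset_process`) and the data layout are hypothetical; the patent does not specify an implementation.

```python
from dataclasses import dataclass
from queue import Queue


@dataclass
class TouchEvent:
    source: str       # "local" (first operation) or "remote" (first control information)
    position: tuple   # (x, y) operation position in the application
    pressure: float = 0.0


def preset_process(events):
    """Merge pending local/remote touch events into a single event list.

    - Different positions: both kept, forming one multi-touch event (claim 4).
    - Same position: deduplicated to one event (claim 6), with pressure
      information superimposed (claim 8).
    """
    merged = {}
    for ev in events:
        if ev.position in merged:
            # same operation position: superimpose pressure, keep one event
            merged[ev.position].pressure += ev.pressure
        else:
            merged[ev.position] = TouchEvent("merged", ev.position, ev.pressure)
    return list(merged.values())


# Per-application queue (claim 13): the first operation and the first control
# information are stored first, then processed together by the application's
# corresponding program.
q = Queue()
q.put(TouchEvent("local", (10, 20), pressure=0.4))   # first operation
q.put(TouchEvent("remote", (30, 40), pressure=0.5))  # first control information
q.put(TouchEvent("remote", (10, 20), pressure=0.3))  # same position as local

pending = []
while not q.empty():
    pending.append(q.get())

first_event = preset_process(pending)
for ev in first_event:
    print(ev.position, round(ev.pressure, 2))  # -> (10, 20) 0.7 then (30, 40) 0.5
```

Under this reading, the merged list is dispatched to the application as one event, after which both the local first window and the projected second window are refreshed with the updated interface.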
CN202310312265.4A 2023-03-21 2023-03-21 Multi-screen collaborative control method, electronic device and system Pending CN118689427A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202310312265.4A CN118689427A (en) 2023-03-21 2023-03-21 Multi-screen collaborative control method, electronic device and system
PCT/CN2024/078622 WO2024193301A1 (en) 2023-03-21 2024-02-26 Multi-screen collaborative control method, electronic device, and system

Publications (1)

Publication Number Publication Date
CN118689427A true CN118689427A (en) 2024-09-24

Family ID: 92763403

Country Status (2)

Country Link
CN (1) CN118689427A (en)
WO (1) WO2024193301A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN119645338B (en) * 2025-02-19 2025-06-24 深圳市维图实业有限公司 Display screen projection control method, display screen and storage medium

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
CN104506650B (en) * 2015-01-04 2018-07-03 华为技术有限公司 A kind of user equipment cooperative control method, user equipment and communication system
US10564918B2 (en) * 2017-11-01 2020-02-18 Amzetta Technologies, Llc Techniques of remotely providing user input to thin client
CN111880870B (en) * 2020-06-19 2024-06-07 维沃移动通信有限公司 Method and device for controlling electronic equipment and electronic equipment
CN114764318A (en) * 2021-01-14 2022-07-19 上海擎感智能科技有限公司 Interaction method, terminal and computer readable storage medium
CN114217754A (en) * 2021-11-05 2022-03-22 维沃移动通信有限公司 Screen projection control method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2024193301A1 (en) 2024-09-26

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination