
CN107643884B - Method and system for interaction between at least two computers and a screen - Google Patents

Method and system for interaction between at least two computers and a screen

Info

Publication number
CN107643884B
CN107643884B
Authority
CN
China
Prior art keywords
screen
control signal
server
computers
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710790514.5A
Other languages
Chinese (zh)
Other versions
CN107643884A (en)
Inventor
宋志标
周志云
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zen Ai Technology Co ltd
Original Assignee
Beijing Zen Ai Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zen Ai Technology Co ltd filed Critical Beijing Zen Ai Technology Co ltd
Priority to CN201710790514.5A
Publication of CN107643884A
Application granted
Publication of CN107643884B
Legal status: Active
Anticipated expiration

Links

Images

Landscapes

  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract



The invention discloses a method for at least two computers to interact with one screen. The method includes: after a sensor collects a first gesture operation of a user, the collected signal is sent to a server; the server performs image processing on the signal, generates a first control signal corresponding to the first gesture operation, and controls the display of the content of the at least two computers on the screen according to the first control signal. A user can thus interact with multiple computers through one screen, concentrating information scattered across multiple places or multiple computers onto a single screen for display and interaction. This greatly improves the efficiency with which users process data, and the entire operation is simple, clear, and quick to carry out.


Description

Method and system for interaction between at least two computers and a screen
Technical Field
The invention relates to the technical field of information interaction, in particular to a method and a system for realizing interaction between a plurality of computers and one screen.
Background
In the prior art, one computer needs one display device, and N computers need N display devices, which causes waste of resources and space. In addition, information content located in multiple locations or content of multiple computers cannot be shared and retrieved efficiently.
Disclosure of Invention
In view of the above problems, the present application provides the following inventions, which address the waste and unreasonable utilization of resources in the prior art, as well as the problem that signals from multiple locations or multiple computers cannot be conveniently shared and retrieved.
Some embodiments of the invention provide a method for at least two computers to interact with a screen, the method comprising: a sensor collects a first gesture operation of a user and sends the collected signal to a server; the server performs image processing on the signal, generates a first control signal corresponding to the first gesture operation, and controls the display of the content of the at least two computers on the screen according to the first control signal.
Other embodiments of the present invention provide a system for at least two computers to interact with a screen, the system comprising: at least two computers; a screen; a server; and a sensor. The sensor collects a first gesture operation and sends the collected signal to the server; the server performs image processing on the signal, generates a first control signal corresponding to the first gesture operation, and controls the display of the content of the at least two computers on the screen according to the first control signal.
By means of the invention, a user can interact with multiple computers through one screen, and information scattered across multiple places or multiple computers is concentrated on one screen for display and retrieval. This greatly improves the user's data-processing efficiency, and the entire operation is simple, clear, and quick to carry out.
Drawings
FIG. 1 is a system for at least two computers interacting with a screen according to some embodiments of the invention;
FIG. 2 is a system for at least two computers interacting with a screen according to some embodiments of the invention;
FIG. 3 is a method for at least two computers to interact with a screen according to some embodiments of the invention.
Detailed Description
The technical means of the present invention will be described in further detail with reference to specific embodiments. It should be understood that the detailed description and specific examples, while indicating the invention, are intended for purposes of illustration only and are not intended to limit the scope of the invention.
In this application, those skilled in the art will appreciate from the context that a "connection" between some components referred to herein may refer to either a wired connection or a wireless connection.
FIG. 1 shows a system for at least two computers to interact with a screen according to some embodiments of the invention. As shown in FIG. 1, the interactive system comprises: at least two computers (three computers 21, 22, and 23 in the figure); a screen 10; a projector 20, which projects the content to be displayed by a computer onto the screen; a server 30; and a sensor 9, which acquires gesture operations performed on the screen 10 and transmits the acquired signals to the server 30. After the sensor 9 collects the user's gesture operation on the screen 10, the collected signal is transmitted to the server 30. The server 30 recognizes the gesture operation through an image recognition algorithm and generates a corresponding control signal according to the recognition result; the control signal activates the corresponding computer, so that the content of that computer is output to the projector and displayed on the screen. For example, in some embodiments, a correspondence table between gesture operations and control signals is stored in advance in the server (for example, gesture operations 1, 2, and 3 may respectively indicate that computers 1, 2, and 3 are to be activated), and the server queries the correspondence table to generate the corresponding control signal. The control signal may also operate on the content displayed on the screen, including paging, zooming in, zooming out, deleting, and annotating the content.
In the above embodiment, the user's operation position and operation gesture on the screen are captured by the sensor, and the gesture may also be performed away from the screen. By contrast, in some embodiments the user's touch location and gesture on the screen may be captured by the screen itself, for example when the screen is a capacitive, resistive, infrared, or surface-acoustic-wave touch screen. In this case, the content of the activated one of the three computers can likewise be displayed on the screen by a single projector. In some embodiments, the screen itself may serve as a display screen that accepts input signals; in that case the activated computer's content is input directly to the display device through the server, without a projector.
Accordingly, some embodiments of the present invention provide methods for at least two computers to interact with a screen. The method comprises the following steps: after the sensor 9 collects the user's gesture operation on the screen 10, the collected signal is transmitted to the server 30; the server 30 recognizes the gesture operation through an image recognition algorithm and generates a corresponding control signal according to the recognition result; the control signal activates the corresponding computer, so that the content of that computer is output to the projector and displayed on the screen. In some embodiments, a correspondence table between gesture operations and control signals is stored in advance in the server (for example, gesture operations 1, 2, and 3 may respectively indicate that computers 1, 2, and 3 are to be activated), and the server queries the correspondence table to generate the corresponding control signal.
In the above embodiment, the user's operation position and operation gesture on the screen are captured by the sensor. By contrast, in some embodiments the user's touch location and gesture operations on the screen may be captured by the screen itself, for example when the screen is a capacitive, resistive, infrared, or surface-acoustic-wave touch screen. In this case, the content of the activated one of the three computers can likewise be displayed on the screen by the projector. In some embodiments, the screen itself may serve as a display screen that accepts input signals; in that case the activated computer's content is input to the display device for display without the need for a projector.
FIG. 2 shows a system for interaction of at least two computers with a screen according to some embodiments of the present invention. The interactive system comprises: at least two computers (three computers 201, 202, and 203 in the figure); a screen 100; projectors 200 (three are shown in the figure) for projecting the content to be displayed by the computers onto the screen; a sensor 109 for collecting gesture operations on the screen 100 and transmitting the collected signals to the server 301; a multi-screen splicing processor 300; and a server 301. The multi-screen splicing processor 300 is connected to the computers 201, 202, and 203, the server, and the projectors 200; it can receive the content of the computers 201, 202, and 203 and project that content onto the screen 100 through the projectors for display. The server controls the multi-screen splicing processor according to the control signal, so that the screen is displayed as a whole or divided into a plurality of windows in which content is projected. For example, the multi-screen splicing processor 300 can divide the screen 100 into a plurality of windows, such as windows 101, 102, and 103, so that window 101 displays the content of computer 201, window 102 displays the content of computer 202, and window 103 displays the content of computer 203; alternatively, window 101 may display the content of all three computers. Since window adjustment and layout by a multi-screen splicing processor are well known to those skilled in the art, the details are omitted here.
In some embodiments of the present invention, the sensor 109 collects the user's operation gesture on the screen 100 and the window in which the operation occurs, and transmits the collected signal to the server 301. The server 301 recognizes the gesture signal through an image recognition algorithm to obtain the gesture operation input by the user and the window in which it was performed, and controls the multi-screen splicing processor accordingly. A correspondence table between gesture operations and control signals is pre-stored in the server, which queries the table to generate the corresponding control signal: for example, gesture W may indicate that the multi-screen splicing processor is to enter a window-layout-adjustment mode, gesture > may indicate that a window is to be enlarged, gesture < that it is to be reduced, a leftward gesture that the current window is to be moved left, and a rightward gesture that it is to be moved right.
For example, when gesture W acts on window 101 on the screen, the sensor 109 captures the position and motion of the gesture and transmits the collected signal to the server; the server recognizes the signal, determines that gesture W acted on window 101, and accordingly controls the multi-screen splicing processor to enter the window-layout-adjustment mode. The user can then continue to perform gesture operations (>, <, move left, move right, etc.), which are again collected by the sensor and sent to the server, so that the server controls the splicing processor to enlarge, reduce, move left, move right, or dismiss window 101. When window 101 is enlarged to occupy the entire screen, only the content of computer 201 is displayed; when window 102 is enlarged to occupy the entire screen, only the content of computer 202 is displayed. Accordingly, the foregoing embodiments allow 3, 2, or 1 computers to display their content on the screen 100 simultaneously or alternately. In some embodiments, when the user operates away from the screen, it is not possible to determine which window a gesture operation falls in, and the window to be adjusted may instead be designated by other, slightly more complex gestures. In some embodiments, after the server controls the multi-screen splicing processor to enter the window-layout-adjustment mode, the server may also adjust together the three windows originally divided by the splicing processor according to a certain gesture, for example when the gesture input at this point is M.
In the above embodiment, the user's operation position and operation gesture on the screen are captured by the sensor, and the gesture may also be performed away from the screen. In some embodiments, the screen is a projection screen on which an infrared light curtain is arranged; when a hand acts on the screen, the sensor detects the change in the light curtain, a control signal corresponding to the gesture operation is generated, and the display of the computers' content on the screen is controlled according to that signal. By contrast, in some embodiments the user's touch location and gesture on the screen may be captured by the screen itself, for example when the screen is a capacitive, resistive, infrared, or surface-acoustic-wave touch screen. In that case, the server controls the multi-screen splicing processor according to the control signal as described above, so that the screen is displayed as a whole or divided into a plurality of windows in which the content of all three computers, or of one of them, is projected through one or more projectors.
According to one embodiment of the invention, the screen is a projection screen, the sensor is an infrared sensor, the infrared sensor is positioned in front of or behind the screen, an infrared light curtain is arranged on the screen, when a gesture acts on the screen, the infrared sensor collects light curtain signals before and after the gesture acts, and the processor generates the control signal according to the signals collected by the infrared sensor.
According to one embodiment of the invention, the screen is a projection screen, the sensor is an infrared sensor, an infrared light curtain is arranged on the screen, the infrared sensor is located behind the screen, when a gesture acts on the screen, the infrared sensor collects infrared light transmitted through the screen due to the gesture, and the processor generates the control signal according to the signal collected by the infrared sensor.
The projection screen is a screen capable of meeting the projection imaging requirements, and comprises a screen with a certain transmittance.
In some embodiments, the screen may be a plurality of spliced screens. The system then further comprises a multi-screen splicing processor connected to the server, the computers, and the spliced screens; the server controls the multi-screen splicing processor according to the control signal, so that the screens are displayed as one whole screen or divided into a plurality of windows in which content is displayed.
Methods for at least two computers to interact with a screen according to some embodiments of the present invention are described below with reference to FIG. 2 and FIG. 3. As shown in FIG. 3: in step S1, the user performs a gesture operation on the screen 100; in step S2, the sensor collects the gesture operation and sends the collected signal to the server 301; in step S3, the server 301 performs image processing on the gesture signal, determines the screen position of the gesture and the corresponding gesture operation, and queries the correspondence table between gesture operations and control signals to generate the corresponding control signal; in step S4, the server controls the multi-screen splicing processor to enter the window-layout-adjustment mode according to the control signal; in step S5, the user adjusts one of the windows, for example the window in which the gesture operation was performed. In some embodiments, after the server controls the multi-screen splicing processor to enter the window-layout-adjustment mode, the server may also adjust together the three windows originally divided by the splicing processor according to a certain gesture, for example when the gesture input at this point is M, rather than adjusting only the window in which the operation was performed.
In the embodiments described above with reference to FIGS. 1-3, after a computer's content is displayed on the screen, the user may further operate on the displayed content. For example, the user may perform a page-turning operation on the displayed content; the operation is captured by the sensor and transmitted to the server, which determines the window (or computer) in which the operation was performed and the corresponding control signal, and then controls or edits the corresponding content on the corresponding computer.
The invention enables the content of a plurality of computers to be viewed, edited, or controlled through one screen, solving the prior-art problem that at least two computers require at least two displays and thus waste resources, as well as the problem that content scattered across multiple computers is difficult to share and retrieve.
The above embodiments are all preferred embodiments of the present invention, and therefore do not limit the scope of the present invention. Any equivalent structural and equivalent procedural changes made to the present disclosure without departing from the spirit and scope of the present disclosure are within the scope of the present disclosure as claimed.

Claims (5)

1. A system for at least two computers to interact with a screen, the system comprising:
at least two computers;
a screen;
a server;
a sensor;
the sensor collects a first gesture operation and then sends the collected signal to the server;
the server performs image processing on the signal, generates a first control signal corresponding to the first gesture operation according to the operation position and gesture obtained by the analysis, and controls the display of the content of the at least two computers on the screen according to the first control signal;
the system comprises a server, a computer and a multi-screen splicing processor, wherein the multi-screen splicing processor is connected with the server, the computer and the splicing screen, and the server controls the multi-screen splicing processor according to a control signal to enable the screen to be used as a whole screen to be displayed or divided into a plurality of windows to display contents;
the sensor collects a second gesture operation and then sends the collected signal to the server; the server performs image processing on the signal to generate a second control signal corresponding to the second gesture operation, and controls the multi-screen splicing processor according to the second control signal so that one of the windows enters a window-adjustment mode;
and after the sensor collects a third gesture operation, the collected signal is sent to the server; the server performs image processing on the signal to generate a third control signal corresponding to the third gesture operation, and controls the multi-screen splicing processor according to the third control signal so that the window in the window-adjustment mode is enlarged, reduced, moved, or dismissed.
2. The system for interaction between at least two computers and a screen according to claim 1, wherein the server stores a corresponding relationship table between gesture operations and control signals in advance, and the server queries the corresponding relationship table to generate corresponding control signals.
3. The system for interaction between at least two computers and a screen according to claim 1, wherein the system includes one or more projectors, and further includes a multi-screen splicing processor, the multi-screen splicing processor is connected to the server, the computers and the projectors, and the server controls the multi-screen splicing processor to display the screen as a whole screen or to divide the screen into a plurality of windows for projecting display contents according to the control signals.
4. The system for at least two computers to interact with a screen according to claim 1, wherein the screen is a capacitive, resistive, infrared, or surface-acoustic-wave touch screen; or the screen is a projection screen on which an infrared light curtain is arranged, and when a hand acts on the screen, the sensor collects the light-curtain signals before and after the gesture acts, and the processor generates the first, second, third, or fourth control signal according to the signals collected by the infrared sensor, the fourth control signal corresponding to the content displayed on the screen.
5. A system for at least two computers to interact with a screen, the system comprising:
at least two computers;
a screen;
a server;
a sensor;
the sensor collects a first gesture operation and then sends the collected signal to the server;
the server performs image processing on the signal and, according to the operation position and gesture obtained by the analysis, generates a first control signal corresponding to the first gesture operation, and controls the display of the content of the at least two computers on the screen according to the first control signal;
the system comprises one or more projectors and a multi-screen splicing processor, wherein the multi-screen splicing processor is connected with a server, a computer and the projectors, and the server controls the multi-screen splicing processor according to a control signal to enable a screen to be used as a whole screen to be displayed or be divided into a plurality of windows to project display contents;
the sensor collects a second gesture operation and then sends the collected signal to the server; the server performs image processing on the signal to generate a second control signal corresponding to the second gesture operation, and controls the multi-screen splicing processor according to the second control signal so that one of the windows enters a window-adjustment mode;
the server processes the image of the third control signal to generate the third control signal corresponding to the third gesture operation, and controls the multi-screen splicing processor according to the third control signal to enlarge, reduce, move and disappear the window entering the window adjustment mode.
CN201710790514.5A 2017-09-05 2017-09-05 Method and system for interaction between at least two computers and a screen Active CN107643884B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710790514.5A CN107643884B (en) 2017-09-05 2017-09-05 Method and system for interaction between at least two computers and a screen

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710790514.5A CN107643884B (en) 2017-09-05 2017-09-05 Method and system for interaction between at least two computers and a screen

Publications (2)

Publication Number Publication Date
CN107643884A CN107643884A (en) 2018-01-30
CN107643884B true CN107643884B (en) 2021-02-23

Family

ID=61111304

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710790514.5A Active CN107643884B (en) 2017-09-05 2017-09-05 Method and system for interaction between at least two computers and a screen

Country Status (1)

Country Link
CN (1) CN107643884B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110147169A (en) * 2018-02-13 2019-08-20 北京仁光科技有限公司 At least one display window distributing adjustment and touch-control automatic calibrating method and system
CN109101173B (en) * 2018-02-13 2020-10-30 北京仁光科技有限公司 Screen layout control method, apparatus, device, and computer-readable storage medium
CN110362259A (en) * 2018-04-10 2019-10-22 北京仁光科技有限公司 Command dispatching system, method, equipment and computer readable storage medium
CN111221482B (en) * 2018-11-23 2022-10-25 北京仁光科技有限公司 The interface layout adjustment method based on the podium
CN111142820B (en) * 2019-12-25 2023-09-19 上海联影医疗科技股份有限公司 Remote control method, device and system based on multiple screens
CN113835650A (en) * 2020-06-23 2021-12-24 明基智能科技(上海)有限公司 Screen positioning system and screen positioning method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105045552A (en) * 2015-08-26 2015-11-11 小米科技有限责任公司 Multi-screen splicing display method and apparatus
CN105446689A (en) * 2015-12-16 2016-03-30 广州视睿电子科技有限公司 Method and system for remote annotation synchronization
CN105807549A (en) * 2014-12-30 2016-07-27 联想(北京)有限公司 Electronic device
CN105808181A (en) * 2014-12-31 2016-07-27 中强光电股份有限公司 Image intermediary device, interactive display system and operation method thereof
CN205670292U (en) * 2016-05-10 2016-11-02 安徽大学 Wireless replication and extended display interactive system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105807549A (en) * 2014-12-30 2016-07-27 联想(北京)有限公司 Electronic device
CN105808181A (en) * 2014-12-31 2016-07-27 中强光电股份有限公司 Image intermediary device, interactive display system and operation method thereof
CN105045552A (en) * 2015-08-26 2015-11-11 小米科技有限责任公司 Multi-screen splicing display method and apparatus
CN105446689A (en) * 2015-12-16 2016-03-30 广州视睿电子科技有限公司 Method and system for remote annotation synchronization
CN205670292U (en) * 2016-05-10 2016-11-02 安徽大学 Wireless replication and extended display interactive system

Also Published As

Publication number Publication date
CN107643884A (en) 2018-01-30

Similar Documents

Publication Publication Date Title
CN107643884B (en) Method and system for interaction between at least two computers and a screen
US9195345B2 (en) Position aware gestures with visual feedback as input method
US11294495B2 (en) Electronic whiteboard, method for image processing in electronic whiteboard, and recording medium containing computer program of electronic whiteboard
EP2498485B1 (en) Automated selection and switching of displayed information
JP6372487B2 (en) Information processing apparatus, control method, program, and storage medium
JP5344651B2 (en) Information processing apparatus, control method, program, and information processing system
US8184065B2 (en) Efficient mode switching in a video processor system
CN109101172B (en) Multi-screen linkage system and interactive display method thereof
US9880721B2 (en) Information processing device, non-transitory computer-readable recording medium storing an information processing program, and information processing method
CN110618780A (en) Interaction device and interaction method for interacting multiple signal sources
CN207198834U (en) For at least two computers and the system of a screen interaction
US20170229102A1 (en) Techniques for descriptor overlay superimposed on an asset
CN108170299B (en) Navigation device and image display system
CN107239178A (en) Display system, information processor, projecting apparatus and information processing method
CN110851011A (en) System and method for interacting large-screen multi-signal-source complex display contents
US20140035816A1 (en) Portable apparatus
JP2020197865A (en) Information processing apparatus, information processing method, information processing system, and program
JP6392573B2 (en) Multi display system
CN103150110A (en) Display system and control method thereof
CN106066775B (en) Interactive control system, touch display device and control method thereof
CN103853353A (en) Image projection system
TWI607369B (en) System and method for adjusting image display
JP6816487B2 (en) Remote control device, remote control system and program
JP2020013472A (en) Image output device, control method and program
EP4400942A1 (en) Touchless user interface control method, system, computer program and computer-readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 201, 2nd Floor, Building 1, No. 28 Shangdi Chuangye Middle Road, Haidian District, Beijing

Patentee after: BEIJING ZEN-AI TECHNOLOGY Co.,Ltd.

Country or region after: China

Address before: 100084 2nd floor, East District, Shangdi science and technology complex building, 22 Shangdi Information Road, Haidian District, Beijing

Patentee before: BEIJING ZEN-AI TECHNOLOGY Co.,Ltd.

Country or region before: China