TW201142466A - Interactive projection system and system control method thereof - Google Patents
Interactive projection system and system control method thereof
- Publication number
- TW201142466A TW201142466A TW99115927A TW99115927A TW201142466A TW 201142466 A TW201142466 A TW 201142466A TW 99115927 A TW99115927 A TW 99115927A TW 99115927 A TW99115927 A TW 99115927A TW 201142466 A TW201142466 A TW 201142466A
- Authority
- TW
- Taiwan
- Prior art keywords
- light
- screen
- spots
- spot
- light spot
- Prior art date
Landscapes
- Projection Apparatus (AREA)
- Controls And Circuits For Display Device (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
201142466 HD-2010-0005-TW 34022twf.doc/I

VI. Description of the Invention

[Technical Field]

The present invention relates to a projection system and a control method thereof, and more particularly to an interactive projection system that can be operated through multiple light spots, and to a system control method for such a system.

[Prior Art]

In today's information age, people rely on electronic products more with each passing day. Notebook computers, mobile phones, portable digital audio players, optical projectors and similar electronic products have all become indispensable tools in modern life and work.
Most of these electronic products have an input interface through which the user enters commands, so that the processing system of the product can execute those commands automatically. To make operation more intuitive, manufacturers have begun to equip electronic devices with touch pads or touch panels, allowing the user to issue commands through the touch pad or touch panel. With a touch interface, the user generally touches the interface with a finger or a stylus; the electronic device detects the coordinate changes or the number of touch points produced by the contact or sensing behavior between the finger (or stylus) and the touch interface, uses them to define the user's touch gesture, and then performs the operation corresponding to that gesture.

In an optical projection system, however, requiring the user to operate through a touch pad or a touch panel is impractical. If the user could instead operate the projection system directly on its screen, for example through the projection of a laser light spot, this would not only overturn the traditional keyboard input mode but also greatly improve the convenience of using the projection system.

[Summary of the Invention]

The invention provides an interactive projection system. By identifying multiple light spots on the screen of the projection system, it allows several users to perform system operations on the screen of the projection system at the same time.

The invention also provides a system control method. By identifying multiple light spots on the screen of the projection system, the method allows several users to perform system operations on the screen of the projection system at the same time.

The invention provides an interactive projection system adapted to identify at least two light spots respectively produced on a screen by at least two light emitters. The interactive projection system includes a projection unit, an image sensing unit and a control unit. The projection unit projects an image frame onto the screen. The image sensing unit senses the light spots on the screen, where each light spot has a parameter and the parameters of the light spots differ from one another. The control unit identifies and distinguishes the light spots according to their parameters, and performs a system operation according to at least one of the light spots.

The invention further provides a system control method suitable for an interactive projection system, where the interactive projection system identifies at least two light spots respectively produced on a screen by at least two light emitters. The system control method includes the following steps: projecting an image frame onto a screen; sensing the light spots on the screen, where each light spot has a parameter and the parameters of the light spots differ from one another; identifying and distinguishing the light spots according to their parameters; and performing a corresponding system operation according to at least one of the light spots.
Based on the above, in the embodiments of the invention the interactive projection system identifies multiple light spots on the screen of the projection system, so that several users can perform system operations on the screen at the same time. This not only overturns the traditional keyboard input mode but also greatly improves the convenience of using the projection system.

To make the above features and advantages of the invention more comprehensible, embodiments are described in detail below with reference to the accompanying drawings.

[Embodiments]

FIG. 1 illustrates an interactive projection system according to an embodiment of the invention. Referring to FIG. 1, in this embodiment the interactive projection system 100 is adapted to identify at least two light spots P1, P2 respectively produced on a screen 110 by at least two light emitters LP1, LP2. The interactive projection system 100 includes a projection unit 120, an image sensing unit 130 and a control unit 140. Here, the control unit 140 is connected to the projection unit 120 and the image sensing unit 130 through, for example, a data bus, a wired network or a wireless network, and thereby controls their operation.

In this embodiment, the control unit 140 includes a host system 142 and an image decoding unit 144 (position decoder). The host system 142 receives multimedia data and controls the projection unit 120 to project the image frame corresponding to the multimedia data onto the screen 110.
Here, the host system 142 receives the multimedia data through an input interface such as a universal serial bus (USB) or Video Graphics Array (VGA) signal line, or reads the multimedia data from a Secure Digital card (SD card), but the invention is not limited thereto.

In this embodiment, the host system 142 is, for example, a computer system such as a notebook computer or a desktop personal computer. Although the image decoding unit 144 is here arranged independently outside the host system 142, the invention is not limited to this configuration; in other embodiments the image decoding unit 144 may also be integrated into the host system 142.
In this embodiment, the projection unit 120 projects the image frame that the host system 142 intends to display onto the screen 110. Here, the projection unit 120 is, for example, a projection device such as a liquid crystal display projector (LCD projector), a digital light processing projector (DLP projector) or a liquid crystal on silicon projector (LCOS projector).

In this embodiment, multiple users U1, U2 can project the light spots P1, P2 onto the screen 110 with light emitters LP1, LP2 such as laser pointers, in order to perform system operations and thereby interact with each other during the operation. Here, a system operation is, for example, a user selecting an icon of the image frame on the screen 110 with the light spot P1 or P2, so that the host system 142 selects that icon or performs the operation associated with it. Alternatively, a system operation is, for example, the users U1, U2 moving the light spots P1, P2 along a window scroll bar of the image frame displayed on the screen 110, so that the host system 142 scrolls the content of that window on the screen 110. As yet another example, a system operation is the users U1, U2 selecting an object of the image frame on the screen 110 with the light spot P1 or P2 and then moving that light spot, so that the host system 142 moves the object on the screen correspondingly, following the movement trace of the light spot.

In detail, in this embodiment the image sensing unit 130 senses the image frame displayed on the screen 110 and the light spots P1, P2 on the screen. It is worth noting that the light spots P1, P2 of this embodiment have different shapes or different colors. Here, when the light spots P1, P2 have the same shape, they may have different colors, and when they have the same color, they may have different shapes; of course, the light spots P1, P2 may also be designed to have both different shapes and different colors at the same time. The control unit 140 can then identify and distinguish the different light spots P1, P2 projected on the screen according to their shapes or colors, so that the host system 142 can perform the system operation corresponding to the light spot P1 or P2.

In other words, the image sensing unit 130 senses the image frame and a first light spot P1 and a second light spot P2 on the screen 110, where the first light spot P1 has a first parameter (a shape or a color) and the second light spot P2 has a second parameter (a shape or a color different from that of the first light spot P1). According to the first parameter and the second parameter, the control unit 140 identifies and distinguishes the first light spot P1 and the second light spot P2 respectively, so as to perform the system operation corresponding to the first light spot P1 or to the second light spot P2. It should be noted that the parameters by which the control unit 140 identifies the light spots are not limited to the shapes or colors exemplified here; in the invention, the light spot parameter used by the control unit may be any parameter sufficient to distinguish the multiple light spots projected on the screen.

In this embodiment, the at least two light spots respectively produced on the screen by the at least two light emitters are exemplified by two light spots (the first light spot P1 and the second light spot P2), but the invention is not limited thereto. In other embodiments, the control unit may identify more light spots, where every light spot projected on the screen has a parameter and the parameters of the light spots differ from one another. The control unit identifies and distinguishes the different light spots projected on the screen according to their respective parameters, and therefore performs the corresponding system operation according to at least one of the multiple light spots.

FIG. 2 illustrates different light spots projected on the screen by the light emitters of FIG. 1. Referring to FIG. 1 and FIG. 2, in this embodiment the shape of the light spot P1 or P2 is, for example, a circle, a cross or a triangle as depicted in FIG. 2, and the color of the light spot P1 or P2 is, for example, red, green or blue. Here, the shapes or colors of the light spots P1, P2 may be the same or different, but it should be noted that the light spots P1, P2 of this embodiment differ in at least one of shape and color.

Accordingly, the image sensing unit 130 can sense the image frame displayed on the screen 110 and the parameters (for example the shape, color or brightness) of the light spots P1, P2 on the screen, and transmit them to the control unit 140. The control unit 140 then parses, through its image decoding unit 144, the image frame displayed on the screen 110 and the parameters of the light spots P1, P2, so that the host system 142 can identify and distinguish the different light spots P1, P2 projected on the screen according to the parsing result and perform the system operation corresponding to the light spot P1 or P2. In addition, in this embodiment the host system 142 may include an image recognition program that works with the image sensing unit 130 and the image decoding unit 144 to identify and distinguish the image frame displayed on the screen 110 and the light spots P1, P2 on the screen.

In this embodiment, the image sensing unit 130 is, for example, a sensing device such as a charge coupled device (CCD) or a complementary metal-oxide-semiconductor sensor (CMOS sensor).
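As a purely illustrative aside (not part of the original disclosure), the shape- and color-based identification described above can be sketched in a few lines of Python. The names `Spot`, `classify_color`, `classify_shape` and `identify_spot`, and the 0.8 circularity cut-off, are invented for the example, and it assumes the light spots have already been segmented out of the camera frame.

```python
import math
from dataclasses import dataclass

# Illustrative sketch only: each segmented blob is reduced to a mean color and
# a rough circularity measure before being labelled.
@dataclass
class Spot:
    x: float            # centroid in camera-image coordinates
    y: float
    mean_rgb: tuple     # (R, G, B) averaged over the blob's pixels
    area: float         # blob area in pixels
    perimeter: float    # blob contour length in pixels
    brightness: float   # peak (or mean) intensity of the blob

def classify_color(spot: Spot) -> str:
    """Label the spot by its dominant color channel."""
    names = ("red", "green", "blue")
    return names[max(range(3), key=lambda i: spot.mean_rgb[i])]

def classify_shape(spot: Spot) -> str:
    """Rough shape label from circularity = 4*pi*area / perimeter**2."""
    circularity = 4.0 * math.pi * spot.area / (spot.perimeter ** 2)
    return "circle" if circularity > 0.8 else "cross_or_triangle"

def identify_spot(spot: Spot) -> tuple:
    """Combine both parameters into an identity such as ('red', 'circle')."""
    return (classify_color(spot), classify_shape(spot))

# Two spots that share a shape are still told apart by color:
p1 = Spot(x=100, y=80, mean_rgb=(240, 30, 25), area=50, perimeter=26, brightness=230)
p2 = Spot(x=400, y=200, mean_rgb=(20, 220, 40), area=52, perimeter=27, brightness=210)
print(identify_spot(p1), identify_spot(p2))   # ('red', 'circle') ('green', 'circle')
```

Any parameter sufficient to tell the spots apart could replace these two classifiers, which is exactly the latitude the paragraph above leaves open.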
In addition, in this embodiment the projection unit 120 and the image sensing unit 130 are integrated into one body and are disposed at substantially the same position. Moreover, the projection focal length of the projection unit 120 is substantially the same as the sensing focal length of the image sensing unit 130. Here, the projection focal length of the projection unit 120 is, for example, the focal length of its projection lens, while the sensing focal length of the image sensing unit 130 is, for example, the focal length of the lens of the CCD or of the CMOS sensing element. As a result, the image sensing unit 130 does not need an image recognition program to redefine the size of the projected frame, which reduces the complexity of the image decoding unit 144 in parsing the image frame and the light spot parameters, as well as the processing complexity of the image recognition program.

In this embodiment, the projection unit 120 and the image sensing unit 130 are integrated into one body. In another embodiment, the control unit may further be integrated with the projection unit and the image sensing unit into an interactive projector.

FIG. 3 illustrates an interactive projection system according to another embodiment of the invention. Referring to FIG. 3, in this embodiment the control unit is further integrated with the projection unit and the image sensing unit into an interactive projection device.

FIG. 4 is a block diagram of the interactive projection device of FIG. 3. Referring to FIG. 3 and FIG. 4, in this embodiment the interactive projection device 350 includes a projection unit 320, an image sensing unit 330 and a control unit 340. Here, the control unit 340 includes a processing unit 342, an image decoding unit 344, an input interface 346 and a network connection device 348.

In this embodiment, the processing unit 342 works with the image decoding unit 344 to identify and distinguish the image frame and the light spot parameters on the screen and to perform the system operations. The input interface 346 is, for example, a universal serial bus port, a Video Graphics Array signal line or a Secure Digital memory card slot, through which the control unit 340 can read or receive the multimedia data. The network connection device 348 connects to a wired network or a wireless network, so that the control unit 340 can thereby connect to other electronic devices.

The parts of this embodiment that are the same as or similar to the embodiment of FIG. 1 can be sufficiently understood from the description of the embodiment of FIG. 1, and are therefore not repeated here.

FIG. 5 illustrates an interactive projection system according to another embodiment of the invention. Referring to FIG. 5, in this embodiment the control unit 540 is, for example, a notebook computer, and the image sensing unit 530 is, for example, a webcam integrated in the notebook computer or a webcam externally connected to the notebook computer.

FIG. 6 is a block diagram of the interactive projection system of FIG. 5. For convenience of description, FIG. 6 depicts the notebook computer 542, the image decoding unit 544 and the webcam 530 as separate functional blocks, but in substance the image decoding unit 544 and the webcam 530 are integrated in the notebook computer 542. Referring to FIG. 6, in this embodiment the notebook computer 542 senses, through the webcam 530, the image frame displayed on the screen 510 and the light spots P1, P2, and parses the image frame and the parameters of the light spots P1, P2 through its image decoding unit 544, so that the notebook computer 542 can identify and distinguish the different light spots P1, P2 projected on the screen according to the parsing result and perform the system operation corresponding to the light spot P1 or P2. In addition, in this embodiment the notebook computer 542 may include an image recognition program that works with the webcam 530 and the image decoding unit 544 to identify the image frame displayed on the screen 510 and the light spots P1, P2 on the screen.

It should be noted that in this embodiment the projection focal length of the projection unit 520 is not the same as the sensing focal length of the webcam 530. For example, the projection range of the projection unit 520 is smaller than the sensing range of the webcam 530 (the image sensing unit). The notebook computer 542 can therefore identify and capture the information within the projection range from the signal transmitted by the webcam 530. In other words, the image recognition program installed in the notebook computer 542 can automatically recognize the size of the projected frame so as to accommodate different projection distances.

The parts of this embodiment that are the same as or similar to the embodiment of FIG. 1 can likewise be understood from the description of the embodiment of FIG. 1 and are not repeated here.
FIG. 7 is a flowchart of a system operation method according to an embodiment of the invention. Referring to FIG. 1 and FIG. 7, first, in step S700, the projection unit 120 projects the image frame that the control unit 140 intends to display onto the screen 110, and the users project the light spots P1, P2 onto the screen 110 with the light emitters LP1, LP2. Next, in step S702, the image sensing unit 130 senses the brightness and the color or shape of the image frame displayed on the screen 110 and of the light spots P1, P2.

Then, taking the light spot P1 as an example, in step S704, if the brightness of the light spot P1 is greater than a brightness threshold, the image sensing unit 130 determines that the light spot P1 is a valid light spot and transmits the sensed image frame and light spot to the control unit 140 for analysis, as in step S706. Similarly, in step S704, if the brightness of the light spot P2 is greater than the brightness threshold, the image sensing unit 130 determines that the light spot P2 is a valid light spot and transmits the sensed image frame and light spot to the control unit 140 for analysis, as in step S706. It should be noted that, in this embodiment, as long as the brightness of either light spot P1 or P2 exceeds the brightness threshold, the method proceeds to step S706. Conversely, if the brightness of both light spots P1 and P2 is smaller than the brightness threshold, the image sensing unit 130 returns to step S702 and continues sensing the image frame and the light spots.

Next, in step S708, the image decoding unit 144 of the control unit 140 parses the image frame as well as the brightness and the color or shape of at least one of the light spots P1, P2, together with the position of at least one of the light spots P1, P2 in the image frame. Then, in step S710, the host system 142 of the control unit 140 identifies and distinguishes the different light spots P1, P2 on the screen according to the parsing result of the image decoding unit 144, and performs the system operation corresponding to at least one of the light spots P1, P2.

It should also be noted that, in this embodiment, the host system 142 may decide whether or not to perform the system operations corresponding to the light spots P1 and P2 according to their brightness, or may decide the order in which the operations corresponding to the light spots P1 and P2 are performed according to their brightness. For example, when the brightness of the light spot P1 is greater than that of the light spot P2, the host system 142 performs the system operation corresponding to the light spot P1 and may or may not perform the system operation corresponding to the light spot P2. When the host system 142 chooses to also perform the operation corresponding to the light spot P2, it performs the system operation corresponding to the light spot P1 and then the system operation corresponding to the light spot P2, in order of decreasing brightness.

In this embodiment, the system operation method is exemplified with the interactive projection system 100 of FIG. 1, in which the projection focal length of the projection unit 120 is substantially the same as the sensing focal length of the image sensing unit 130, so that the image sensing unit 130 does not need to redefine the size of the projected frame; the invention, however, is not limited thereto. In another embodiment, the system operation method of the invention may also be implemented in the interactive projection system 500 of FIG. 6.
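As a rough, non-authoritative sketch of the flow of FIG. 7 (steps S700 to S710), the fragment below assumes each sensed spot has already been reduced to an (identity, position, brightness) record; the threshold value, the function names and the dispatch callback are illustrative placeholders rather than details taken from the patent.

```python
BRIGHTNESS_THRESHOLD = 128   # illustrative value; the patent does not fix a number

def process_frame(detections, execute_operation):
    """One pass of the sensing loop.

    detections: list of (identity, (x, y), brightness) tuples produced by the
    sensing and decoding stage. Spots at or below the threshold are treated as
    invalid and ignored (S704); the remaining spots are handled from brightest
    to dimmest, mirroring the brightness-priority example above (S706-S710).
    """
    valid = [d for d in detections if d[2] > BRIGHTNESS_THRESHOLD]
    if not valid:
        return False          # nothing valid: keep sensing (back to S702)
    for identity, position, _ in sorted(valid, key=lambda d: d[2], reverse=True):
        execute_operation(identity, position)
    return True

# Example: the red spot is brighter, so its operation is dispatched first.
process_frame(
    [(("green", "circle"), (510, 260), 150), (("red", "circle"), (120, 80), 220)],
    lambda ident, pos: print("operate:", ident, "at", pos),
)
```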
FIG. 8 is a flowchart of a system operation method according to another embodiment of the invention. Referring to FIG. 6 and FIG. 8, in this embodiment the system operation method is implemented, for example, in the interactive projection system 500 of FIG. 6.

In the interactive projection system 500 of FIG. 6, the projection focal length of the projection unit 520 is not the same as the sensing focal length of the webcam 530. Therefore, before the step of sensing the brightness and the color or shape of the image frame displayed on the screen 510 and of the light spots P1, P2, the system operation method of this embodiment must first recognize the size of the projected frame so as to accommodate different projection distances.

In step S800, the projection unit 520 first displays a calibration frame. Next, in step S802, the webcam 530 senses the calibration frame projected onto the screen 510. In step S804, if the webcam 530 does not sense the calibration frame, the notebook computer 542 adjusts the shooting angle of the webcam 530 so as to cover the calibration frame to be sensed, as in step S806. The method flow of this embodiment then returns to step S802 to sense the calibration frame again.

In step S804, if the calibration frame is sensed, the webcam 530 transmits the sensed calibration frame to the notebook computer 542. After that, in step S808, the notebook computer 542 calculates the size of the calibration frame through the image decoding unit 544, and automatically adjusts and scales its parameters so as to map the image frame that will be displayed on the screen 510.

Next, in step S810, the projection unit 520 projects the image frame that the notebook computer 542 intends to display onto the screen 510, and the users project the light spots P1, P2 onto the screen 510 with the light emitters LP1, LP2.

In this embodiment, the remaining steps that are the same as or similar to the flow of FIG. 7 can be sufficiently understood from the description of the embodiment of FIG. 7 and are therefore not repeated here.
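Step S808 can be pictured as a rescaling from camera coordinates into the coordinates of the projected frame. The sketch below makes the simplifying assumption that the calibration frame appears in the camera image as an axis-aligned rectangle; a practical implementation would more likely fit a perspective (homography) transform, and none of the names or numbers below come from the patent.

```python
def make_camera_to_frame_mapper(cal_rect, frame_size):
    """cal_rect: (left, top, width, height) of the detected calibration frame in
    camera pixels; frame_size: (width, height) of the projected image in its own
    pixels. Returns a function mapping a camera-space spot position into
    projected-frame coordinates."""
    left, top, cam_w, cam_h = cal_rect
    frame_w, frame_h = frame_size

    def to_frame(x, y):
        u = (x - left) / cam_w * frame_w
        v = (y - top) / cam_h * frame_h
        return u, v

    return to_frame

# Example: a 1024x768 projected frame seen as a 400x300 rectangle at (120, 80)
# in the camera image. A spot at the camera-space centre of that rectangle maps
# to the centre of the projected frame.
to_frame = make_camera_to_frame_mapper((120, 80, 400, 300), (1024, 768))
print(to_frame(320, 230))   # -> (512.0, 384.0)
```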
Several exemplary embodiments are described below to illustrate system operations on the image frame of the projection screen, such as selecting an icon, dragging an object and scrolling a window scroll bar. FIG. 9 shows an image frame projected on the screen according to an embodiment of the invention. The image frame 900 includes a window 910 of an operating system, the window 910 includes a window scroll bar 912, and an object 920 is, for example, a shortcut created on the desktop or a file stored on the desktop.

Referring to FIG. 1 and FIG. 9, taking the image frame 900 as an example, if the image sensing unit 130 determines that the light spot P1 or P2 is a valid light spot and the position of the light spot P1 or P2, as parsed by the image decoding unit 144, falls on the object 920 for a specific period of time, the control unit 140 performs the operation corresponding to the object 920. For example, if the object 920 is a file stored on the desktop and the user selects the object 920 with the light spot P1, the control unit 140 opens that file. Similarly, if the object 920 is a shortcut created on the desktop and the user selects the object 920 with the light spot P1, the control unit 140 opens the file, or executes the program, at the location associated with that shortcut. In other words, the control unit performs a selection (click) operation corresponding to a light spot according to the position at which at least one of the light spots is projected in the image frame.

In addition, if the image sensing unit 130 determines that the light spot P1 or P2 is a valid light spot that has selected the object 920 and then moves on the screen 110, the control unit 140 performs a drag operation corresponding to the light spot P1 or P2 according to the movement trace S1 of that light spot in the image frame 900. For example, if the user selects the object 920 with the light spot P1 and moves the light spot P1 along the movement trace S1, the control unit 140 moves the object 920 correspondingly to the position 920'. In other words, the control unit performs a drag operation corresponding to a light spot according to the movement trace of at least one of the light spots in the image frame.

Furthermore, if the image sensing unit 130 determines that the light spot P1 or P2 is a valid light spot and that light spot is projected on the window scroll bar 912 and moves, the control unit 140 performs a scroll operation corresponding to the light spot P1 or P2 according to the moving direction D1 of the light spot on the scroll bar. For example, if the user selects the window scroll bar 912 with the light spot P1 and moves the light spot P1 downward, the control unit 140 scrolls the window scroll bar 912 downward so as to change the displayed content. In other words, the control unit performs a scroll operation corresponding to a light spot according to the moving direction of at least one of the light spots on the scroll bar.
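The click, drag and scroll behaviours described with FIG. 9 amount to interpreting the trajectory of each valid spot against whatever lies beneath it in the image frame. The sketch below is a loose paraphrase of that logic, not an implementation from the patent; the dwell time, the jitter tolerance and the hit-testing helpers `hit_object` and `hit_scrollbar` are invented placeholders.

```python
DWELL_TIME = 1.0   # seconds a spot must rest on an object to count as a click (illustrative)

def interpret_spot(track, hit_object, hit_scrollbar):
    """Interpret the trajectory of one identified, valid spot.

    track: list of (timestamp, (x, y)) samples, already mapped into
    projected-frame coordinates. hit_object(pos) and hit_scrollbar(pos) return
    the icon/file or the scroll bar under pos, or None. Returns a
    (gesture, target, data) tuple, or None when no gesture is recognised.
    """
    if not track:
        return None
    t0, start = track[0]
    t1, end = track[-1]
    moved = abs(end[0] - start[0]) + abs(end[1] - start[1]) > 5   # tolerate small jitter

    bar = hit_scrollbar(start)
    if bar is not None and moved:
        return ("scroll", bar, "down" if end[1] > start[1] else "up")

    obj = hit_object(start)
    if obj is not None and moved:
        return ("drag", obj, end)                 # move the object to the end point
    if obj is not None and (t1 - t0) >= DWELL_TIME:
        return ("click", obj, start)              # open the file or run the shortcut
    return None

# Example: a spot resting on "report.doc" for 1.2 s produces a click.
print(interpret_spot(
    [(0.0, (300, 200)), (1.2, (302, 201))],
    hit_object=lambda pos: "report.doc" if 280 <= pos[0] <= 320 else None,
    hit_scrollbar=lambda pos: None,
))
```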
In this embodiment, the various system operations are exemplified with the interactive projection system 100 of FIG. 1, but the invention is not limited thereto. In another embodiment, the various system operations may also be implemented in the interactive projection system 500 of FIG. 6. In addition, the parts of this embodiment that are the same as or similar to the embodiments of FIG. 1 to FIG. 8 can be sufficiently understood from the description of those embodiments and are therefore not repeated here.

In summary, in the embodiments of the invention the interactive projection system identifies multiple light spots on the screen of the projection system, so that several users can perform system operations on the screen of the projection system at the same time. This not only overturns the traditional keyboard input mode but also greatly improves the convenience of using the projection system.

Although the invention has been disclosed above by way of embodiments, they are not intended to limit the invention. Anyone with ordinary knowledge in the relevant technical field may make some changes and refinements without departing from the spirit and scope of the invention; the scope of protection of the invention is therefore defined by the appended claims.

[Brief Description of the Drawings]

FIG. 1 illustrates an interactive projection system according to an embodiment of the invention.
FIG. 2 is a schematic view of different light spots projected on the screen by the light emitters of FIG. 1.
FIG. 3 illustrates an interactive projection system according to another embodiment of the invention.
FIG. 4 is a block diagram of the interactive projection device of FIG. 3.
FIG. 5 illustrates an interactive projection system according to another embodiment of the invention.
FIG. 6 is a block diagram of the interactive projection system of FIG. 5.
FIG. 7 is a flowchart of a system operation method according to an embodiment of the invention.
FIG. 8 is a flowchart of a system operation method according to another embodiment of the invention.
FIG. 9 shows an image frame projected on the screen according to an embodiment of the invention.

[Description of Reference Numerals]

- 100, 300, 500: interactive projection system
- 110, 310, 510: screen
- 120, 320, 520: projection unit
- 130, 330: image sensing unit
- 140, 340, 540: control unit
- 142: host system
- 144, 344, 544: image decoding unit
- 342: processing unit
- 346: input interface
- 348: network connection device
- 350: interactive projection device
- 530: image sensing unit (webcam)
- 542: notebook computer
- 900: image frame
- 910: window
- 912: window scroll bar
- 912': position
- 920: object
- U1, U2: user
- LP1, LP2: light emitter
- P1, P2: light spot
- S1: movement trace
- D1: moving direction
- S700, S702, S704, S706, S708, S710, S800, S802, S804, S806, S808, S810, S812, S814, S816, S818, S820: steps of the system operation method
Claims (1)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW99115927A TWI408488B (en) | 2010-05-19 | 2010-05-19 | Interactive projection system and system control method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW99115927A TWI408488B (en) | 2010-05-19 | 2010-05-19 | Interactive projection system and system control method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
TW201142466A true TW201142466A (en) | 2011-12-01 |
TWI408488B TWI408488B (en) | 2013-09-11 |
Family
ID=46765060
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
TW99115927A TWI408488B (en) | 2010-05-19 | 2010-05-19 | Interactive projection system and system control method thereof |
Country Status (1)
Country | Link |
---|---|
TW (1) | TWI408488B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104375629A (en) * | 2013-08-14 | 2015-02-25 | 纬创资通股份有限公司 | Electronic device and control method thereof |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI506483B (en) * | 2013-12-13 | 2015-11-01 | Ind Tech Res Inst | Interactive writing device and operating method thereof using adaptive color identification mechanism |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7242388B2 (en) * | 2001-01-08 | 2007-07-10 | Vkb Inc. | Data input device |
JP2002365721A (en) * | 2001-06-12 | 2002-12-18 | Nec Viewtechnology Ltd | Projection type display device and image detecting method therefor |
JP2003044221A (en) * | 2001-07-31 | 2003-02-14 | Fuji Photo Optical Co Ltd | Presentation system |
US20030222849A1 (en) * | 2002-05-31 | 2003-12-04 | Starkweather Gary K. | Laser-based user input device for electronic projection displays |
TWM249135U (en) * | 2003-12-05 | 2004-11-01 | Kye System Corp | Pointer input device |
TWI305892B (en) * | 2005-11-23 | 2009-02-01 | Inst Information Industry | Apparatus, computer equipment, method and computer readable media for simultaneously controlling a cursor and an optical pointer |
TW200737116A (en) * | 2006-03-31 | 2007-10-01 | Jyh-Horng Chen | Controlling system for displacement of computer cursor by laser light spot |
-
2010
- 2010-05-19 TW TW99115927A patent/TWI408488B/en not_active IP Right Cessation
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104375629A (en) * | 2013-08-14 | 2015-02-25 | 纬创资通股份有限公司 | Electronic device and control method thereof |
TWI498806B (en) * | 2013-08-14 | 2015-09-01 | Wistron Corp | Electronic devices and methods for controlling electronic devices |
CN104375629B (en) * | 2013-08-14 | 2017-05-17 | 纬创资通股份有限公司 | Electronic device and control method thereof |
Also Published As
Publication number | Publication date |
---|---|
TWI408488B (en) | 2013-09-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9965039B2 (en) | Device and method for displaying user interface of virtual input device based on motion recognition | |
US10120454B2 (en) | Gesture recognition control device | |
US10228848B2 (en) | Gesture controlled adaptive projected information handling system input and output devices | |
KR102649254B1 (en) | Display control method, storage medium and electronic device | |
US9348420B2 (en) | Adaptive projected information handling system output devices | |
US9535595B2 (en) | Accessed location of user interface | |
US9588673B2 (en) | Method for manipulating a graphical object and an interactive input system employing the same | |
US10276133B2 (en) | Projector and display control method for displaying split images | |
US20150268773A1 (en) | Projected Information Handling System Input Interface with Dynamic Adjustment | |
EP2680110A1 (en) | Method and apparatus for processing multiple inputs | |
CN112947825B (en) | Display control method, display control device, electronic equipment and medium | |
CN111052063B (en) | Electronic device and control method thereof | |
US20200142495A1 (en) | Gesture recognition control device | |
US20140340344A1 (en) | Display processor and display processing method | |
CN109101172B (en) | Multi-screen linkage system and interactive display method thereof | |
US20140333585A1 (en) | Electronic apparatus, information processing method, and storage medium | |
US10521101B2 (en) | Scroll mode for touch/pointing control | |
EP2965164B1 (en) | Causing specific location of an object provided to a device | |
US10990344B2 (en) | Information processing apparatus, information processing system, and information processing method | |
US20150268731A1 (en) | Interactive Projected Information Handling System Support Input and Output Devices | |
JP2014033381A (en) | Information processing apparatus and program, and image processing system | |
EP4485167A1 (en) | Multi-screen interaction method and electronic device | |
CN107239178A (en) | Display system, information processor, projecting apparatus and information processing method | |
JP6286836B2 (en) | Projection system, projection apparatus, projection method, and projection program | |
TW201142466A (en) | Interactive projection system and system control method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
MM4A | Annulment or lapse of patent due to non-payment of fees |