TWI870744B - Control device and control method - Google Patents
- Publication number: TWI870744B
- Application number: TW111149610A
- Authority: TW (Taiwan)
- Prior art keywords: control, user, point, ray, control point
Landscapes
- Position Input By Displaying (AREA)
- Image Generation (AREA)
Abstract
Description
The present invention relates to a control device, and more particularly, to a control device and a control method.
In order to provide users with immersive experiences, various technologies are continuously being developed, such as augmented reality (AR) and virtual reality (VR). AR technology allows users to bring virtual elements into the real world. VR technology allows users to enter an entirely new virtual world and experience a different life. Moreover, wearable devices are often used to provide these immersive experiences.
The present invention provides a control device and a control method that enable a user to precisely control an object in a virtual world.
The control device of the present invention is adapted to control an object in a virtual world. The control device includes a display and a controller. The display is configured to display the virtual world. The controller is coupled to the display and is configured to perform the following functions: forming a control surface around the user in the virtual world; emitting a first ray from the object; forming a first control point on the control surface based on the first ray; and performing a first control on the object according to the first control point.
The control method of the present invention is adapted to control an object in a virtual world. The control method includes: forming a control surface around a user; emitting a first ray from the object; forming a first control point on the control surface based on the first ray; and controlling the object according to the first control point.
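The claimed steps can be sketched in code. This is a minimal illustration only: the class and method names are hypothetical, and the patent does not prescribe any particular data structure or API.

```python
from dataclasses import dataclass, field

@dataclass
class ControlDevice:
    """Sketch of the claimed method: form a control surface around the
    user, emit a first ray from the object, form a first control point
    where the ray meets the surface, then control the object via it."""
    surface_radius: float = 0.4          # the 40 cm preset distance from the description
    control_points: dict = field(default_factory=dict)

    def form_control_point(self, object_id, intersection):
        # The intersection of the first ray with the control surface
        # becomes this object's control point.
        self.control_points[object_id] = intersection
        return intersection

    def control(self, touched_position, actions):
        # When the virtual hand touches a control point, apply the
        # control associated with the corresponding object.
        for object_id, point in self.control_points.items():
            if point == touched_position:
                return actions[object_id]()
        return None
```

A nearby control point thus acts as a proxy for a distant object: touching the point triggers the object's control handler.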
Based on the above, by emitting rays from objects, the user can precisely control objects in the virtual world.
In order to make the content of the present invention easier to understand, embodiments are described below as examples according to which the present invention can indeed be implemented. In addition, wherever possible, the same reference numerals are used in the drawings and embodiments to denote the same or similar components.
Furthermore, unless otherwise defined, all terms used herein (including technical and scientific terms) have the same meanings as commonly understood by one of ordinary skill in the art to which the present invention belongs. It will be further understood that terms such as those defined in commonly used dictionaries should be interpreted as having meanings consistent with their meanings in the context of the relevant art and the present invention, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The present invention can be understood by referring to the following detailed description taken in conjunction with the accompanying drawings. It should be noted that, to make the description easy to follow and to keep the drawings simple, each drawing of the present invention depicts only part of the electronic device, and specific elements in the drawings are not drawn to scale. In addition, the number and size of each element in the drawings are illustrative only and are not intended to limit the scope of the present invention.
It should be noted that, without departing from the spirit of the present invention, the technical features of several different embodiments below may be substituted, recombined, or mixed to form other embodiments. Moreover, in the following specification and claims, the words "comprise" and "include" are open-ended terms and should therefore be interpreted as meaning "including but not limited to...".
In order to provide users with immersive experiences, various technologies are continuously being developed, such as augmented reality (AR) and virtual reality (VR). AR technology allows users to bring virtual elements into the real world. VR technology allows users to enter an entirely new virtual world and experience a different life. Moreover, wearable devices are often used to provide these immersive experiences.
In a virtual world, to control an object, a ray may be emitted from a control device or from the user's position, for example from any part of the user's body (wrist, finger), and the object pointed at by the ray is the object being controlled. However, this control method must first specify and identify the starting position of the ray and accurately judge the direction in which the user is pointing; the demand for recognition accuracy is high, which increases the complexity of designing the control device. Moreover, the user often needs to hold a specific posture or gesture (such as a clenched or open fist) for control to work correctly; that is, when the pointing direction is unclear or the specific posture or gesture is not maintained, misjudgment may occur. Furthermore, because the user must maintain a specific gesture, the hands cannot be placed freely, which degrades the user experience. In addition, when the ray is moved via a control device or gesture, the device or gesture is easily misrecognized, and the ray may accidentally trigger other objects as it sweeps across them. Therefore, how to control objects in the virtual world simply, effectively, and precisely while the user remains in a comfortable posture has long been a goal pursued by those skilled in the art.
FIG. 1A is a schematic diagram of a control device according to an embodiment of the present invention. Referring to FIG. 1A, the control device 100 is adapted to control an object in a virtual world. The control device 100 includes a display 120 and a controller 110. The display 120 is configured to display the virtual world. The controller 110 is coupled to the display 120.
It is worth noting that the controller 110 can be configured to perform the following functions. First, a control surface is formed around the user in the virtual world displayed by the display 120. Next, a first ray is emitted from the object in the virtual world displayed by the display 120. In one embodiment, the first ray may be emitted from the surface of the object, and the first ray may be a straight line or a curve; the present invention is not limited in this regard. Then, based on the first ray, a first control point is formed on the control surface in the virtual world displayed by the display 120. According to the first control point, a first control is performed on the object in the virtual world displayed by the display 120. In this way, the user can control the object precisely, simply, and intuitively via the first control point on the control surface.
In one embodiment, the controller 110 may form the first control point on the control surface at the intersection of the first ray emitted from the object and the control surface around the user. However, the present disclosure is not limited thereto. In one embodiment, the control device 100 is, for example, a head-mounted display (HMD), wearable glasses (e.g., AR/VR goggles), an electronic device, another similar device, or a combination of these devices. However, the present disclosure is not limited thereto.
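For the spherical control surfaces discussed later, the intersection of the first ray with the surface reduces to a standard ray-sphere test. The sketch below covers only that spherical case (the surface may take other shapes, and the function name is illustrative):

```python
import math

def first_control_point(obj_pos, direction, user_pos, radius):
    """Return the nearest intersection of a ray emitted from the object
    with a spherical control surface centered on the user, or None if
    the ray misses the surface."""
    # Normalize the ray direction.
    n = math.sqrt(sum(d * d for d in direction))
    d = [c / n for c in direction]
    # Solve |obj_pos + t*d - user_pos|^2 = radius^2 for t.
    oc = [o - u for o, u in zip(obj_pos, user_pos)]
    b = 2.0 * sum(di * ci for di, ci in zip(d, oc))
    c = sum(ci * ci for ci in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the control surface
    roots = sorted(((-b - math.sqrt(disc)) / 2.0, (-b + math.sqrt(disc)) / 2.0))
    for t in roots:
        if t >= 0:  # nearest hit in front of the ray origin
            return tuple(o + t * di for o, di in zip(obj_pos, d))
    return None
```

For example, an object 2 m in front of the user, emitting a ray toward the user, produces a control point on a 0.4 m control surface directly between them.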
In one embodiment, the controller 110 is, for example, a microcontroller unit (MCU), a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a programmable controller, a programmable logic device (PLD), another similar device, or a combination of these devices; the present invention is not limited in this regard. Moreover, in one embodiment, the functions of the controller 110 may be implemented as multiple pieces of program code, stored in a memory and executed by the controller 110. Alternatively, in one embodiment, the functions of the controller 110 may be implemented as one or more circuits. The present invention does not limit whether the functions of the controller 110 are implemented in software or hardware.
In one embodiment, the display 120 includes, for example, an organic light-emitting diode (OLED) display device, a mini LED display device, a micro LED display device, a quantum dot (QD) LED display device, a liquid-crystal display (LCD) device, a tiled display device, a foldable display device, or an electronic paper display (EPD). However, the present disclosure is not limited thereto.
FIG. 1B is a schematic diagram of a control system according to an embodiment of the present invention. The control system 10 includes the control device 100 and a camera 130. For details of the control device 100, refer to the description of FIG. 1A, which is not repeated here.
In one embodiment, the camera 130 includes, for example, a complementary metal oxide semiconductor (CMOS) camera or a charge coupled device (CCD) camera, and an auxiliary light unit includes an infrared illumination unit. However, the present disclosure is not limited thereto. In addition, in one embodiment, the camera 130 may be disposed outside the control device 100, so that the control system 10 is formed as a whole. In another embodiment, the camera 130 may be disposed in the control device 100. That is, the present disclosure does not limit where the camera 130 is disposed.
In one embodiment, the camera 130 may be used to capture images of the user in the real world, and the controller 110 may perform image processing on the captured images to determine the user's intention. That is, the controller 110 may generate a user command related to the user's intention according to the images captured by the camera 130.
In one embodiment, the camera 130 may capture a user image. The user image includes the user's hand in the real world (also referred to as the real hand) or a handheld remote controller. The controller 110 may receive the user image from the camera 130 and, according to the user image, determine whether the user's virtual hand or control anchor in the virtual world touches the first control point on the control surface. For example, the user's virtual hand or control anchor in the virtual world may correspond to the hand or handheld remote controller in the real world, so that operations can be performed in the virtual world. When the user's virtual hand or control anchor touches the first control point, the user can perform various controls on the object via the first control point. In other words, in response to the virtual hand or control anchor touching the first control point, the controller 110 may perform the first control on the object. It should be added that, for convenience, the following description uses the virtual hand, but the present disclosure does not limit the manipulation in the virtual world to a virtual hand, a control anchor, or other objects with similar functions.
In one embodiment, when the user's virtual hand approaches or touches the first control point, the first control point may be displayed in a highlighted (to-be-selected) state. In one embodiment, when the user's virtual hand approaches or touches the first control point, the image around the object corresponding to the first control point, or the first ray emitted by the object, may display a prompt indication, such as a color change, increased brightness, outline enhancement, displayed text or an animated explanation, a played sound effect, or a change of the first ray such as a color change, highlighting, or pulsing. In this way, the user can clearly know that the object can currently be controlled via the first control point.
In one embodiment, the control surface may be, for example, a plane, a cube surface, a cylindrical surface, a spherical surface, or an ellipsoidal surface, but the present disclosure is not limited thereto. In one embodiment, the control surface distance between the control surface and the user may be a preset distance, for example 40 cm, but the present disclosure is not limited thereto. In one embodiment, the control surface distance may be determined according to the user's personal parameters. In one embodiment, the user's personal parameters may include, for example, the user's arm length, forearm length, height, head length, or other body data; the present disclosure is not limited thereto. Since human body proportions follow certain regularities, the control surface distance best suited to the user can be estimated from any one or a combination of such body data. In one embodiment, the center (e.g., the center of gravity, the center of a circle, or the center of a sphere) or a focal point of the cube surface, cylindrical surface, spherical surface, or ellipsoidal surface may coincide with the user. For example, the control surface may be a sphere whose center coincides with the user's center point and whose radius is less than or equal to the user's arm length or forearm length. Alternatively, the control surface may be two spheres whose centers are located at the center points of the user's left shoulder and right shoulder, respectively, and whose radii are less than or equal to the user's arm length or forearm length. That is, the control surface can be formed around the user so that the user can easily touch it. In addition, the user can choose a suitable control surface shape according to his or her own posture or application scenario. Furthermore, the user can also adjust the position and range (size) of the control surface according to the actual usage scenario; implementations are described in detail in subsequent embodiments.
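The choice of control surface distance described above can be sketched as a small helper. The preset 40 cm and the "radius no larger than arm or forearm length" constraint come from the description; the height-based ratio is an illustrative anthropometric assumption, not from the patent.

```python
def control_surface_distance(arm_length=None, forearm_length=None,
                             height=None, preset=0.40):
    """Pick a control surface distance in meters: fall back to the 40 cm
    preset, but never exceed the user's reach when body data is known."""
    if forearm_length is not None:
        return min(preset, forearm_length)   # keep the surface within easy reach
    if arm_length is not None:
        return min(preset, arm_length)
    if height is not None:
        # Illustrative assumption: forearm length is roughly 0.15 x height.
        return min(preset, 0.15 * height)
    return preset
```

A short forearm thus shrinks the surface toward the user, while users with long arms simply keep the comfortable 40 cm preset.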
In one embodiment, a sensor is provided in the control device 100, and the sensor can detect the position at which the user wears the control device 100, thereby automatically obtaining the user's personal parameters. In another embodiment, the camera 130 may capture a posture image containing the user's posture, and the controller 110 may automatically calculate the user's personal parameters based on the posture image. In yet another embodiment, the user may input the personal parameters into the control device 100 manually. In other words, the present disclosure does not limit how the user's personal parameters are obtained.
It is worth mentioning that the user controls the object via a control point on the nearby control surface rather than controlling the distant object directly. In this way, the user can control the object easily and accurately, improving the user experience.
FIG. 2 is a schematic diagram of a usage scenario according to an embodiment of the present invention. Referring to FIG. 1A to FIG. 2, the scenario 200 includes, in the virtual world, a user U, an object 220 (also referred to as a cat), and an object 230 (also referred to as a button area). The object 220 and the object 230 may each include multiple control regions, and the control regions of the object 220 and the object 230 may respectively emit rays 221, 222, 223, 231, 232, 233 toward the user U. In addition, a control surface 210 is formed around the user U.
In one embodiment, the control surface 210 is a virtual surface, and/or any of the rays 221, 222, 223, 231, 232, 233 is a virtual ray. That is, the user U may normally not see the control surface 210 or the rays 221, 222, 223, 231, 232, 233 at all. Only when the hand of the user U approaches or touches the hidden control surface 210 or a control point do the control surface 210 or the rays 221, 222, 223, 231, 232, 233 appear. And when the hand of the user U moves away from the control surface 210 or the control point, the control surface 210 or the rays 221, 222, 223, 231, 232, 233 may return to the hidden state. In other words, the controller 110 may decide whether to show or hide the control surface 210 or the rays 221, 222, 223, 231, 232, 233 according to the hand distance between the virtual hand of the user U and the control surface 210, or between the virtual hand of the user U and a control point, in the virtual world. In this way, the control surface 210 or the rays 221, 222, 223, 231, 232, 233 appear only when the user U needs to control an object, and thus do not disturb the user's visual perception. In one embodiment, displaying the control surface 210 and the rays 221, 222, 223, 231, 232, 233 may be replaced by image effects on the object itself or by playing sound effects. In other words, the present disclosure does not limit how the control surface 210 and the rays 221, 222, 223, 231, 232, 233 are actually presented; FIG. 2 is merely one implementation shown for convenience of description.
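The show/hide decision driven by the hand distance can be sketched as follows. The thresholds and the hysteresis band are illustrative assumptions added here to avoid flicker at the boundary; the patent only specifies that visibility depends on the hand distance.

```python
def surface_visible(hand_distance, was_visible,
                    show_within=0.10, hide_beyond=0.15):
    """Decide whether the hidden control surface (and its rays) should be
    shown, based on the distance in meters between the virtual hand and
    the surface or control point."""
    if hand_distance <= show_within:
        return True          # hand is close: reveal surface and rays
    if hand_distance >= hide_beyond:
        return False         # hand moved away: hide them again
    return was_visible       # inside the hysteresis band: keep current state
```

Because the surface stays hidden beyond the outer threshold, the virtual hand can rest anywhere without triggering control points, matching the behavior described next.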
It is worth mentioning that while the control surface 210 and the rays 221, 222, 223, 231, 232, 233 are in the hidden state, the virtual hand of the user U may be placed anywhere. That is, because the control surface 210 and the rays 221, 222, 223, 231, 232, 233 are hidden at that time, the virtual hand of the user U does not trigger control of the control points on the control surface 210. In this way, when no control is needed, the virtual hand of the user U can rest comfortably without restriction, improving the user experience.
In one embodiment, the object 220 (the cat) may include multiple control regions. For example, the head of the object 220 is a first control region, the body of the object 220 is a second control region, and the feet of the object 220 are a third control region. The first control region may emit the ray 221 (also referred to as a first ray), the second control region may emit the ray 222 (also referred to as a second ray), and the third control region may emit the ray 223 (also referred to as a third ray). In one embodiment, the starting points of the rays 221, 222, 223 may be located at the center points of the first, second, and third control regions, respectively, but the present disclosure is not limited thereto. The rays 221, 222, 223 may respectively intersect the control surface 210 to form a first control point, a second control point, and a third control point.
It should be added that, in some embodiments, the user U may click the first control point so that the object 220 reacts as if its head were being patted, click the second control point so that the object 220 reacts as if it were being tickled, and click the third control point to move or rotate the object 220. When the object 220 moves or rotates, the multiple control points corresponding to the object 220 move on the control surface 210 accordingly. That is, the controller 110 may perform different controls on the object 220 according to the different control points corresponding to the object 220, and may move the positions of those control points on the control surface 210 according to the movement or rotation of the object 220. In this way, the user U can exert various controls on the object 220 via its multiple control points.
In one embodiment, the object 230 (the button area) may include multiple control regions. For example, from top to bottom, the object 230 may include a first button, a second button, and a third button. The first button may emit the ray 231 (also referred to as a first ray), the second button may emit the ray 232 (also referred to as a second ray), and the third button may emit the ray 233 (also referred to as a third ray). The rays 231, 232, 233 may respectively intersect the control surface 210 to form a first control point, a second control point, and a third control point.
It should be added that, in some embodiments, the first, second, and third control points corresponding to the object 230 may be organized into a control group so that they are adjusted synchronously (for example, adjusting the display mode, the control point distances between control points, or the control point position of an individual control point), making selection and control easier for the user.
For example, the control points in a control group may be shown or hidden synchronously. For instance, when the control group is displayed and the user's virtual hand approaches, touches, or selects the control group, the controller 110 may fade out everything on the control surface 210 other than the control group. That is, when a control group is selected, the control surface 210 may display only the selected control group or groups, helping the user U focus on the content of the control group.
As another example, a magnification may be set for the control group synchronously (equivalent to setting the control point distances). For instance, when the control group is selected, the controller 110 may enlarge the control point distances within the group synchronously so that the control points are spread out more than before, for example so that each control point distance is at least 5 cm, but the present disclosure is not limited thereto. How control point distances are adjusted is described in detail in subsequent embodiments.
As yet another example, offset positions of the control points (control point positions) may be set for the control group synchronously. For instance, when the control group is selected, the controller 110 may move the control points in the group synchronously to a more spacious or fixed area of the control surface 210. In this way, by adjusting the control point positions within the control group, the user U can control the control points more precisely.
As a further example, the control points in a control group may be displayed on the control surface 210 in the form of a keyboard, allowing the user to conveniently touch each key of the keyboard and thereby control each button of the object 230. In this way, the user U can intuitively input various commands (control commands) to the object 230, increasing the convenience of manipulation in the virtual world and improving the user's input speed.
FIG. 3A, FIG. 3B, and FIG. 3C are schematic diagrams of manipulation gestures according to embodiments of the present invention. Referring to FIG. 1A to FIG. 3C, the gestures 300A, 300B, and 300C respectively illustrate various postures of the virtual hand H of the user U in the virtual world during control.
In one embodiment, the camera 130 may capture images of the posture of the real hand of the user U in the real world, and the controller 110 may perform image processing based on the images captured by the camera 130. The controller 110 may then transmit signals to the display 120 so that the virtual hand H of the user U in the virtual world reproduces the motion of the real hand in the real world. However, the present disclosure is not limited thereto.
Each of the gestures 300A, 300B, and 300C includes a finger F1 (also referred to as a first finger or thumb), a finger F2 (also referred to as a second finger or index finger), and a pinch point PP. In one embodiment, the pinch point PP may be located on the line connecting the finger F1 and the finger F2, at a position closer to the finger F1. Generally, when the virtual hand H of the user U pinches, the finger F1 moves less than the finger F2. As a result, the user U can complete a pinch on the pinch point PP more easily.
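Placing the pinch point closer to the thumb can be sketched as a biased interpolation along the thumb-index segment. The specific 0.3 bias is an illustrative assumption; the patent only states that the point lies closer to the finger F1 because the thumb moves less during a pinch.

```python
def pinch_point(thumb_tip, index_tip, bias=0.3):
    """Place the pinch point PP on the segment from the thumb tip (F1)
    to the index tip (F2); bias < 0.5 puts it closer to the thumb."""
    return tuple(t + bias * (i - t) for t, i in zip(thumb_tip, index_tip))
```

With a 0.3 bias, the thumb covers 30% of the gap while the index finger covers the remaining 70%, matching the observation that the thumb moves less.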
In one embodiment, the user U may pinch the pinch point PP to select or confirm the object 220 or the object 230 in the virtual world. For example, when the pinch point PP overlaps, or is the single closest point to, the control point corresponding to the object 220, the user U may pinch the pinch point PP to select or confirm the object 220. In one embodiment, the user U may hold the pinch for a preset duration, or pinch and then release, to bring up a sub-window function or to confirm. In other words, the user U may pinch the pinch point PP to trigger the control point corresponding to the object 220 and thereby control the object 220.
It is worth mentioning that while the user U pinches the pinch point PP, the fingers other than the finger F1 and the finger F2 may be held in any posture. That is, the user U may freely choose a posture that feels comfortable (as in FIG. 3A, FIG. 3B, or another gesture) and perform the pinch. Moreover, in one embodiment, as shown in FIG. 3C, after the finger F1 and the finger F2 of the user U touch the pinch point PP, the fingers F1 and F2 may continue to curl toward the palm into a fist, and by holding the fist posture the user U can keep the pinch point PP in the selected state. However, the present disclosure is not limited thereto. For example, the user may use other gestures, such as sliding the hand in any direction (up, down, left, right, forward, or backward), to select, trigger, or deselect the control point corresponding to the object 220. In this way, the user U is not constrained by gestures in the virtual world and can perform control in the most comfortable posture, improving the user experience.
FIG. 4 is a schematic diagram of a manipulation scenario according to an embodiment of the present invention. Referring to FIG. 1A to FIG. 4, the scenario 400 illustrates a drag performed after the virtual hand H of the user U pinches in the virtual world. In one embodiment, after the user U pinches, the fingers F1 and F2 of the virtual hand H may remain in the pinched state so as to drag the controlled target (for example, the object 220).
It should be noted that, as shown in FIG. 4, the wrist of the virtual hand H may rotate freely while the user U drags. That is, the wrist does not need to be held at a specific angle; if dragging required the wrist to stay at a specific angle, the posture of the user U would be constrained. In other words, maintaining the pinched state by judging only the relative relationship (for example, the distance) between the finger F1 and the finger F2 allows the user U to manipulate more flexibly. In this way, the user U can perform various manipulations ergonomically, improving the user experience.
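Judging the pinch purely from the thumb-index distance, with no wrist constraint, can be sketched as follows. The engage/release thresholds (and the use of two thresholds to keep the state stable mid-drag) are illustrative assumptions.

```python
def update_pinch(thumb_tip, index_tip, pinched,
                 engage=0.015, release=0.030):
    """Maintain the pinch state from the thumb-index distance in meters
    only, ignoring wrist angle, so the wrist can rotate freely while
    dragging."""
    d = sum((a - b) ** 2 for a, b in zip(thumb_tip, index_tip)) ** 0.5
    if d <= engage:
        return True          # fingers closed: pinch engaged
    if d >= release:
        return False         # fingers opened: pinch released
    return pinched           # between thresholds: keep current state
```

Because the test depends only on the two fingertip positions, rotating the whole hand (wrist included) leaves the pinch state unchanged.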
FIG. 5A is a schematic diagram of a manipulation scenario according to an embodiment of the present invention. Referring to FIG. 1A to FIG. 5A, the scenario 500A schematically illustrates the relationship among the user U, the control surface CTR, and the object OBJ. As shown in FIG. 5A, the object OBJ may include a control region E1 (also referred to as a first control region), a control region E2 (also referred to as a second control region), a control region E3 (also referred to as a third control region), and a control region E4 (also referred to as a fourth control region). The control regions E1 to E4 may be located on the surface of the object OBJ, but the present disclosure is not limited thereto. Multiple rays (for example, a first ray to a fourth ray) may be emitted from the control regions E1 to E4, respectively, in the direction facing the user U. In one embodiment, the starting points of the rays may be located at the center points of the control regions E1 to E4, respectively, but the present disclosure is not limited thereto. The rays corresponding to the control regions E1 to E4 may respectively intersect the control surface CTR to form a control point E1' (also referred to as a first control point), a control point E2' (also referred to as a second control point), a control point E3' (also referred to as a third control point), and a control point E4' (also referred to as a fourth control point).
Furthermore, the distances between the control regions E1 to E4 may be referred to as control region distances, and the distances between the control points E1' to E4' may be referred to as control point distances. For example, the distance between the control region E1 and the control region E2 may be denoted as a control region distance D1A, and the distance between the control point E1' and the control point E2' may be denoted as a control point distance D2A.
In addition, the rays corresponding to the control regions E1 to E4 may all converge at a projection point P. For example, the ray emitted by the control region E1 (also referred to as the first ray) and the ray emitted by the control region E2 (also referred to as the second ray) both converge at the projection point P. That is, the starting points of the rays corresponding to the control regions E1 to E4 are the centers of the control regions E1 to E4, respectively, and the end points of these rays are all the projection point P.
值得一提的是,控制點E1’~E4’是基於對應的射線交會於控制面而形成,且控制點E1’~E4’對應的射線是基於作為起點的控制區域E1~E4的中心以及作為終點的投影點P而形成。換句話說,當投影點P的位置改變時,控制點E1’~E4’之間的控制點距離(例如控制點距離D2A)也會相應地跟著改變。It is worth mentioning that the control points E1'~E4' are formed based on the intersection of the corresponding rays on the control surface, and the rays corresponding to the control points E1'~E4' are formed based on the center of the control area E1~E4 as the starting point and the projection point P as the end point. In other words, when the position of the projection point P changes, the control point distance between the control points E1'~E4' (for example, the control point distance D2A) will also change accordingly.
For example, when the projection point P is located behind the user and moves further behind the user (for example, to the left in FIG. 5A), that is, farther from the user U than the original projection point P, the control points E1' to E4' move toward one another; in other words, the control point distances between them (for example, the control point distance D2A) decrease. Conversely, when the projection point P is located behind the user and moves toward the front of the user (for example, to the right in FIG. 5A), that is, closer to the user U than the original projection point P, the control points E1' to E4' move away from one another; in other words, the control point distances between them (for example, the control point distance D2A) increase. Therefore, in one embodiment, the user U can adjust the control point distances between the control points E1' to E4' displayed on the control surface CTR (that is, the display magnification) by adjusting the position of the projection point P (that is, the projection point distance between the user U and the projection point P). In other words, the controller 110 may determine the control point distance D2A between the control point E1' and the control point E2' based on the positions of the projection point P and the user U.
In addition, since the control surface CTR adjacent to the user U is located between the projection point P and the object OBJ, the control point distances between the control points E1' to E4' (for example, the control point distance D2A) are smaller than the control region distances between the control regions E1 to E4 (for example, the control region distance D1A). That is, in response to the user U being located between the projection point P and the object OBJ, the controller 110 may set the control point distance D2A between the first control point and the second control point to be smaller than the control region distance D1A between the control region E1 and the second control region E2.
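The relation D2A &lt; D1A follows from similar triangles: all rays converge at P, so spacing measured on the control surface scales by the ratio of the distances from P. The sketch below idealizes the geometry to distances measured along the P-to-object axis; it is an illustration, not the patent's exact construction.

```python
def control_point_distance(region_distance, p_to_surface, p_to_object):
    """Similar-triangles sketch for FIG. 5A: rays start at the control
    regions and all end at projection point P, so the control point
    spacing on the surface scales by dist(P, surface) / dist(P, object)."""
    return region_distance * p_to_surface / p_to_object
```

For instance, with P one meter behind the user, the control surface 0.4 m in front, and the object 5 m in front, a 1 m control region distance maps to roughly 0.23 m on the surface, smaller than the region distance because the surface lies between P and the object.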
Incidentally, the position and range (size) of the control surface CTR can be further adjusted. For example, the control surface CTR of FIG. 5A may include an origin C and a limit range, and rays and corresponding control points outside the limit range are neither displayed nor controllable. In one embodiment, the origin C may be located at the center of the control surface CTR, but the present disclosure is not limited thereto. In one embodiment, when the user U wants to adjust the position of the control surface CTR, the user may drag the origin C of the control surface CTR with the virtual hand H of one hand (or both hands). For example, the user U may pull the control surface CTR closer to, or push it farther from, the user U. For instance, when both hands of the user U extend forward at the same time, the position of the control surface CTR (including the origin C) may be moved forward of the user U (that is, farther from the user U than the original control surface CTR). That is, the controller 110 may set the control surface distance between the control surface CTR and the user U in response to the position of the origin C.
As another example, the user U may move the control surface CTR in any direction that keeps its distance from the user U unchanged (for example, up, down, left, or right of the user U). For instance, when both hands of the user U move to the left at the same time, the position of the control surface CTR (including the origin C) may be moved to the left of the user U. As a further example, when the two hands move in different directions, the control surface CTR may be moved in the direction of the midpoint of the two hands: when the left hand of the user U moves to the left and the right hand moves upward, the control surface CTR may be moved to the upper left. In this way, the user U can move the position of the control surface CTR by dragging the origin C according to the actual usage scenario, thereby adjusting the range of control points covered by the control surface CTR.
Furthermore, in one embodiment, when the user U wants to adjust the size of the range of the control surface CTR, the user may scale the limit range of the control surface CTR with the virtual hand H of one hand or both hands. For example, the user U may enlarge or shrink the range of the control surface CTR. For instance, when the two hands of the user U move away from each other, the limit range of the control surface CTR may be expanded to cover more control points (for example, control points other than the control points E1' to E4' in FIG. 5A). That is, in response to the user U expanding the limit range of the control surface CTR, invalid control points originally outside the limit range may be brought inside it and thereby be set as valid control points.
As another example, when the two hands of the user U move toward each other, the limit range of the control surface CTR may be shrunk to cover fewer control points (for example, reduced to cover only the control points E2' and E3'). That is, in response to the user U shrinking the limit range of the control surface CTR, valid control points originally inside the limit range may fall outside it and thereby be set as invalid control points. In this way, the user U can adjust the position and range of the control surface CTR according to the actual usage scenario so as to select control points on the control surface CTR more precisely and quickly.
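The valid/invalid classification as the limit range grows or shrinks can be sketched as follows. The limit range is modeled here as a circle of a given radius around the origin C, which is an illustrative simplification of the patent's limit range.

```python
import math

def classify_control_points(points, origin, limit_radius):
    """Split control points into valid (inside the limit range, shown and
    controllable) and invalid (outside, neither shown nor controllable)."""
    valid, invalid = [], []
    for p in points:
        d = math.dist(p, origin)
        (valid if d <= limit_radius else invalid).append(p)
    return valid, invalid
```

Expanding the radius (hands moving apart) moves points from the invalid list to the valid one; shrinking it (hands moving together) does the reverse.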
FIG. 5B is a schematic diagram of a control scenario according to an embodiment of the present invention. Referring to FIG. 1A to FIG. 5B, the scenario 500B schematically illustrates the relationship among the user U, the control surface CTR, and the object OBJ. In contrast to FIG. 5A, in which the user U is disposed between the projection point P and the object OBJ, in FIG. 5B the object OBJ is disposed between the user U and the projection point P. The distance between the control area E1 and the control area E2 may be denoted as the control area distance D1B, and the distance between the control point E1' and the control point E2' may be denoted as the control point distance D2B. In addition, FIG. 5B illustrates the restricted range R, which will be described in a subsequent embodiment. For details of the user U, the control surface CTR, and the object OBJ, reference may be made to the description of FIG. 5A, which is not repeated here.
Similar to FIG. 5A, when the position of the projection point P in FIG. 5B changes, the control point distances among the control points E1'~E4' (for example, the control point distance D2B) also change accordingly. For example, when the projection point P is located in front of the user and moves farther forward (for example, to the right in FIG. 5B), that is, farther from the user U than the original projection point P, the control points E1'~E4' move closer to each other. In other words, the control point distances among the control points E1'~E4' (for example, the control point distance D2B) decrease. Conversely, when the projection point P is located in front of the user and moves toward the rear of the user (for example, to the left in FIG. 5B), that is, closer to the user U than the original projection point P, the control points E1'~E4' move away from each other. In other words, the control point distances among the control points E1'~E4' (for example, the control point distance D2B) increase. Therefore, in an embodiment, the user U can adjust the control point distances among the control points E1'~E4' displayed on the control surface CTR (i.e., the display magnification) by adjusting the position of the projection point P (i.e., the projection point distance between the user U and the projection point P). In other words, the controller 110 can determine the control point distance D2B between the control point E1' and the control point E2' based on the positions of the projection point P and the user U.
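The dependence of the control point spacing on the projection point distance follows directly from the ray geometry. Below is an illustrative sketch under simplifying assumptions of mine (not the patent's implementation): the control surface is the plane x = 0 next to the user U, the projection point P lies on the x axis at (xp, 0), and each control area on the object OBJ is a point (xo, yo) with 0 < xo < xp, i.e., the object lies between the user and P as in FIG. 5B:

```python
# Hypothetical sketch: a control point is the intersection of the ray from
# P = (xp, 0) through a control area with the control surface plane x = 0.

def project_to_control_surface(xp, area):
    """Return the y coordinate where the ray from P through `area` hits x = 0."""
    xo, yo = area
    t = xp / (xp - xo)          # ray parameter at the plane x = 0
    return yo * t

def control_point_distance(xp, area1, area2):
    """Spacing of two projected control points on the control surface."""
    y1 = project_to_control_surface(xp, area1)
    y2 = project_to_control_surface(xp, area2)
    return abs(y1 - y2)

# Control areas E1, E2 are 1.0 apart on an object at xo = 2.0.
e1, e2 = (2.0, 0.0), (2.0, 1.0)
print(control_point_distance(4.0, e1, e2))   # magnified spacing
print(control_point_distance(8.0, e1, e2))   # P moved farther: spacing shrinks
```

The magnification factor xp / (xp - xo) shrinks toward 1 as P moves away, matching the "control points move closer" case, and stays above 1 whenever the object lies between P and the surface, matching the next paragraph.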
In addition, since the object OBJ is disposed between the projection point P and the control surface CTR adjacent to the user U, the control point distances among the control points E1'~E4' (for example, the control point distance D2B) are greater than the control area distances among the control areas E1~E4 (for example, the control area distance D1B). In other words, in response to the object OBJ being located between the projection point P and the user U, the controller 110 can set the control point distance D2B between the first control point and the second control point to be greater than the control area distance D1B between the control area E1 and the control area E2.
In addition, in an embodiment, the projection point P can be disposed between the control surface CTR (adjacent to the user U) and the object OBJ, so as to obtain control points whose distribution is vertically and horizontally reversed relative to the corresponding control areas on the object OBJ. Alternatively, in an embodiment, the projection point P can be set at infinity in front of or behind the user U, so as to obtain control points distributed identically to the corresponding control areas on the object OBJ; that is, the control point distance between any two control points is the same as the distance between the corresponding control areas on the object OBJ. In this way, by adjusting the position of the projection point P, and thus the projection point distance between the user U and the projection point P, the user U can adjust the display magnification of the control surface CTR or of any control group on the control surface CTR, i.e., the control point distances on the control surface CTR (for example, the control point distance D2A in FIG. 5A or the control point distance D2B in FIG. 5B), thereby obtaining a better visual experience while simply and effectively performing precise control of objects in the virtual world. Furthermore, by adjusting the position or range of the control surface CTR (for example, through the origin C of the control surface CTR), the user U can adjust the control surface distance between the user U and the control surface CTR as well as the range of control points covered by the control surface CTR (the restricted range R), so that the user U can manipulate the control surface CTR in a comfortable posture, which further improves the user experience.
It should be noted that, for convenience of description, the control surface CTR in FIG. 5B is illustrated with a finite length (a finite plane). However, in some embodiments, the control surface CTR is an infinitely extending plane. In addition, FIG. 5B illustrates the restricted range R, which indicates the effective control area or effective display area of the control surface CTR. In an embodiment, the restricted range R is the coverage of the valid control points on the control surface CTR. For example, the control points within the restricted range R are valid control points, while the control points outside the restricted range R are invalid control points. That is, in response to the virtual hand of the user U touching a valid control point within the restricted range R in the virtual world, the controller 110 accepts the control command of the user U; and in response to the virtual hand of the user U touching an invalid control point outside the restricted range R in the virtual world, the controller 110 does not accept the control command of the user U.
For example, FIG. 5B illustrates a dashed line (a ray) starting from the projection point P and passing through the object OBJ, which intersects the control surface CTR outside the restricted range R, thereby forming an invalid control point. Since the invalid control point is located outside the restricted range R of the control surface CTR, the controller 110 does not perform the operation of the user U on the invalid control point. In this way, by providing the restricted range R, erroneous operations can be avoided, which improves the user experience.
FIG. 6 is a flowchart of a control method according to an embodiment of the present invention. Referring to FIG. 1 and FIG. 6, the control method 600 schematically illustrates the operation flow of the control device 100, but the disclosure is not limited thereto. In this embodiment, the control method 600 includes steps S610 to S640.
In step S610, the controller 110 can transmit a signal to the display 120, so that a control surface 210 is formed around the user U in the virtual world displayed by the display 120. In step S620, the controller 110 can transmit a signal to the display 120, so that an object (for example, the object 220 or the object 230) in the virtual world displayed by the display 120 emits a first ray (for example, one of the rays 221~233). In step S630, the controller 110 can transmit a signal to the display 120, so that the display 120 forms (displays) a first control point on the control surface 210 based on the first ray. In step S640, the controller 110 can transmit a signal to the display 120, so that the user U can control the object (for example, the object 220 or the object 230) according to the first control point. It should be noted that, for related implementation details, reference may be made to the descriptions of FIG. 1 to FIG. 5B, which are not repeated here. In this way, the user U can control the object precisely, simply, and intuitively through the first control point on the control surface 210.
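The four steps S610 to S640 above can be sketched as a plain function pipeline. This end-to-end sketch reuses the planar geometry assumed earlier (the control surface as the plane x = 0, a ray from the projection point through a control area on the object); all data structures and names are illustrative stand-ins, not the patent's implementation:

```python
# Hypothetical end-to-end sketch of steps S610-S640.

def step_s610_form_surface():
    """S610: form a control surface (here, the plane x = 0) around the user."""
    return {"plane_x": 0.0, "points": []}

def step_s620_emit_ray(projection_p, control_area):
    """S620: the object emits a ray from P through one of its control areas."""
    return (projection_p, control_area)

def step_s630_form_point(surface, ray):
    """S630: intersect the ray with the control surface to form a control point."""
    (xp, yp), (xo, yo) = ray
    t = (surface["plane_x"] - xp) / (xo - xp)
    point = (surface["plane_x"], yp + t * (yo - yp))
    surface["points"].append(point)
    return point

def step_s640_control(point, obj_state):
    """S640: the user acts on the object through the control point."""
    obj_state["selected"] = point
    return obj_state

surface = step_s610_form_surface()
ray = step_s620_emit_ray((4.0, 0.0), (2.0, 1.0))
point = step_s630_form_point(surface, ray)
state = step_s640_control(point, {"selected": None})
print(point, state)
```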
In summary, the control device and the control method of the present invention employ a control surface to control objects in the virtual world. Therefore, the user can maintain a comfortable posture while simply and effectively performing precise control of objects in the virtual world.
10: Control system
100: Control device
110: Controller
120: Display
130: Camera
200, 400, 500A, 500B: Scenario
210, CTR: Control surface
220, 230, OBJ: Object
221, 222, 223, 231, 232, 233: Ray
300A, 300B, 300C: Gesture
600: Control method
C: Origin
D1A, D1B: Control area distance
D2A, D2B: Control point distance
E1, E2, E3, E4: Control area
E1', E2', E3', E4': Control point
F1, F2: Finger
H: Virtual hand
P: Projection point
PP: Pinch point
R: Restricted range
S610, S620, S630, S640: Step
U: User
FIG. 1A is a schematic diagram of a control device according to an embodiment of the present invention.
FIG. 1B is a schematic diagram of a control system according to an embodiment of the present invention.
FIG. 2 is a schematic diagram of a usage scenario according to an embodiment of the present invention.
FIG. 3A is a schematic diagram of a control gesture according to an embodiment of the present invention.
FIG. 3B is a schematic diagram of a control gesture according to an embodiment of the present invention.
FIG. 3C is a schematic diagram of a control gesture according to an embodiment of the present invention.
FIG. 4 is a schematic diagram of a control scenario according to an embodiment of the present invention.
FIG. 5A is a schematic diagram of a control scenario according to an embodiment of the present invention.
FIG. 5B is a schematic diagram of a control scenario according to an embodiment of the present invention.
FIG. 6 is a flowchart of a control method according to an embodiment of the present invention.
Claims (14)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW111149610A TWI870744B (en) | 2022-12-23 | 2022-12-23 | Control device and control method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW111149610A TWI870744B (en) | 2022-12-23 | 2022-12-23 | Control device and control method |
Publications (2)
Publication Number | Publication Date |
---|---|
TW202427164A TW202427164A (en) | 2024-07-01 |
TWI870744B true TWI870744B (en) | 2025-01-21 |
Family
ID=92929032
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
TW111149610A TWI870744B (en) | 2022-12-23 | 2022-12-23 | Control device and control method |
Country Status (1)
Country | Link |
---|---|
TW (1) | TWI870744B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102760308A (en) * | 2012-05-25 | 2012-10-31 | 任伟峰 | Method and device for node selection of object in three-dimensional virtual reality scene |
CN111309142A (en) * | 2018-12-11 | 2020-06-19 | 托比股份公司 | Method and device for switching input modality of display device |
US10747336B1 (en) * | 2017-07-31 | 2020-08-18 | Amazon Technologies, Inc. | Defining operating areas for virtual reality systems using sensor-equipped operating surfaces |
CN112181551A (en) * | 2020-08-31 | 2021-01-05 | 华为技术有限公司 | Information processing method and related equipment |
CN111880648B (en) * | 2020-06-19 | 2022-01-28 | 华为技术有限公司 | Three-dimensional element control method and terminal |
TW202217516A (en) * | 2020-10-29 | 2022-05-01 | 未來市股份有限公司 | Method and system of modifying position of cursor |
2022
- 2022-12-23: TW TW111149610A patent/TWI870744B/en active
Also Published As
Publication number | Publication date |
---|---|
TW202427164A (en) | 2024-07-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7649396B2 (en) | DEVICE, METHOD AND GRAPHICAL USER INTERFACE FOR INTERACTING WITH A THREE-DIMENSIONAL ENVIRONMENT - Patent application | |
US12086379B2 (en) | Devices, methods, and graphical user interfaces for providing computer-generated experiences | |
US12288301B2 (en) | Devices, methods, and graphical user interfaces for interacting with three-dimensional environments | |
US12130967B2 (en) | Integration of artificial reality interaction modes | |
US12299251B2 (en) | Devices, methods, and graphical user interfaces for presenting virtual objects in virtual environments | |
US12131011B2 (en) | Virtual interactions for machine control | |
JP2022547930A (en) | Devices, methods, and graphical user interfaces for interacting with a three-dimensional environment | |
US12124674B2 (en) | Devices, methods, and graphical user interfaces for interacting with three-dimensional environments | |
JP2010145861A (en) | Head mount display | |
US20250044911A1 (en) | Devices, Methods, and Graphical User Interfaces for Interacting with Three-Dimensional Environments | |
TWI870744B (en) | Control device and control method | |
US12056269B2 (en) | Control device and control method | |
CN118244882A (en) | Control device and control method | |
CN117957581A (en) | Apparatus, method and graphical user interface for interacting with a three-dimensional environment |