
TWI870744B - Control device and control method - Google Patents

Control device and control method

Info

Publication number
TWI870744B
Authority
TW
Taiwan
Prior art keywords
control
user
point
ray
control point
Application number
TW111149610A
Other languages
Chinese (zh)
Other versions
TW202427164A (en)
Inventor
鍾維人
Original Assignee
宏達國際電子股份有限公司 (HTC Corporation)
Application filed by 宏達國際電子股份有限公司
Priority to TW111149610A
Publication of TW202427164A
Application granted
Publication of TWI870744B

Landscapes

  • Position Input By Displaying (AREA)
  • Image Generation (AREA)

Abstract

A control device is provided. The control device is adapted to control an object in a virtual world. The control device includes a display and a controller. The display is configured to display the virtual world. The controller is coupled to the display. The controller is configured to perform the following functions. In the virtual world, a control surface is formed around a user. A first ray is emitted from the object. Based on the first ray, a first control point is formed on the control surface. According to the first control point, a first control is performed on the object.

Description

Control device and control method

The present invention relates to a control device, and more particularly to a control device and a control method.

In order to provide users with immersive experiences, various technologies are continually being developed, such as augmented reality (AR) and virtual reality (VR). AR technology allows users to bring virtual elements into the real world, while VR technology allows users to enter an entirely new virtual world and experience a different life. In addition, wearable devices are often used to deliver these immersive experiences.

The present invention provides a control device and a control method that enable a user to accurately control an object in a virtual world.

The control device of the present invention is adapted to control an object in a virtual world. The control device includes a display and a controller. The display is configured to display the virtual world. The controller is coupled to the display and is configured to perform the following functions. In the virtual world, a control surface is formed around the user. A first ray is emitted from the object. Based on the first ray, a first control point is formed on the control surface. According to the first control point, a first control is performed on the object.

The control method of the present invention is adapted to control an object in a virtual world. The control method includes: forming a control surface around a user; emitting a first ray from the object; forming a first control point on the control surface based on the first ray; and controlling the object according to the first control point.

Based on the above, by emitting rays from the object, the user can precisely control the object in the virtual world.

In order to make the content of the present invention easier to understand, the following embodiments are provided as examples by which the present invention can indeed be implemented. In addition, wherever possible, the same reference numerals in the drawings and embodiments denote the same or similar elements/components/steps.

Furthermore, unless otherwise defined, all terms used herein (including technical and scientific terms) have the same meaning as commonly understood by one of ordinary skill in the art to which the present invention belongs. It will be further understood that terms such as those defined in commonly used dictionaries should be interpreted as having meanings consistent with their meanings in the context of the relevant art and the present invention, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.

The present invention can be understood by referring to the following detailed description in conjunction with the accompanying drawings. It should be noted that, to facilitate the reader's understanding and to keep the drawings simple, the drawings of the present invention depict only a portion of the electronic device, and specific elements in the drawings are not drawn to scale. In addition, the number and size of each element in the drawings are illustrative only and are not intended to limit the scope of the present invention.

It should be noted that, in the embodiments below, the technical features of several different embodiments may be replaced, recombined, or mixed to form other embodiments without departing from the spirit of the present invention. In addition, in the following specification and claims, the words "comprise" and "include" are open-ended terms and should therefore be interpreted to mean "including but not limited to...".

In order to provide users with immersive experiences, various technologies are continually being developed, such as augmented reality (AR) and virtual reality (VR). AR technology allows users to bring virtual elements into the real world, while VR technology allows users to enter an entirely new virtual world and experience a different life. In addition, wearable devices are often used to deliver these immersive experiences.

In the virtual world, to control an object, a ray can be emitted from the control device or from the user's position, such as any part of the user's body (wrist, finger), and the object that the ray points at is the object being controlled. However, this control method must first specify and identify the starting position of the ray and accurately judge the direction the user is pointing. It places high demands on recognition accuracy, which increases the complexity of designing the control device. In addition, the user often needs to maintain a specific posture or gesture (such as a clenched or open fist) to perform control correctly. That is, when the direction the user is pointing is unclear, or a specific posture or gesture is not maintained, misjudgment may occur. Furthermore, since the user must maintain a specific gesture, the hand cannot be positioned freely, which degrades the user experience. In addition, when the ray is moved by a control device or gesture, the control device or gesture is easily misrecognized, and the ray can easily trigger accidental touches as it sweeps across other objects. Therefore, how to precisely control objects in the virtual world in a simple and effective manner, while keeping the user in a comfortable posture, has long been a goal pursued by those skilled in the art.

FIG. 1A is a schematic diagram of a control device according to an embodiment of the present invention. Referring to FIG. 1A, the control device 100 is adapted to control an object in a virtual world. The control device 100 includes a display 120 and a controller 110. The display 120 is configured to display the virtual world. The controller 110 is coupled to the display 120.

It is worth noting that the controller 110 can be configured to perform the following functions. First, in the virtual world displayed by the display 120, a control surface is formed around the user. Next, in the virtual world displayed by the display 120, a first ray is emitted from an object. In one embodiment, the first ray may be emitted from the surface of the object, and the first ray may be a straight line or a curve; the present invention is not limited in this regard. Then, in the virtual world displayed by the display 120, a first control point is formed on the control surface based on the first ray. In the virtual world displayed by the display 120, a first control is performed on the object according to the first control point. In this way, the user can simply and intuitively perform precise control of the object through the first control point on the control surface.

In one embodiment, the controller 110 may form the first control point on the control surface based on the intersection of the first ray emitted from the object and the control surface around the user. However, the disclosure is not limited thereto. In one embodiment, the control device 100 is, for example, a head-mounted display (HMD), wearable glasses (e.g., AR/VR goggles), an electronic device, another similar device, or a combination of these devices. However, the disclosure is not limited thereto.
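The intersection described above can be sketched as a standard ray-sphere test, assuming (as in one embodiment described later) a spherical control surface centered on the user. The function name, coordinate conventions, and the unit-length direction requirement are illustrative assumptions, not details fixed by the patent:

```python
import math

def ray_sphere_intersection(origin, direction, center, radius):
    """Return the nearest point where a ray meets a sphere, or None.

    The ray starts at `origin` and travels along the unit vector
    `direction`; the sphere models the control surface around the user.
    """
    # Vector from the sphere center to the ray origin.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c  # quadratic coefficient a == 1 for a unit direction
    if disc < 0:
        return None  # the ray misses the control surface entirely
    t = (-b - math.sqrt(disc)) / 2.0
    if t < 0:
        t = (-b + math.sqrt(disc)) / 2.0  # ray origin lies inside the sphere
    if t < 0:
        return None
    return tuple(o + t * d for o, d in zip(origin, direction))

# An object 3 m in front of the user emits a ray toward the user; the
# control point forms where the ray crosses a 0.4 m control sphere.
point = ray_sphere_intersection((0.0, 0.0, 3.0), (0.0, 0.0, -1.0),
                                (0.0, 0.0, 0.0), 0.4)
```

The controller would run such a test per ray each frame; rays that miss the control surface simply produce no control point.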

In one embodiment, the controller 110 is, for example, a microcontroller unit (MCU), a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a programmable controller, a programmable logic device (PLD), another similar device, or a combination of these devices; the present invention is not limited in this regard. In addition, in one embodiment, the functions of the controller 110 may be implemented as multiple program codes. These program codes are stored in a memory and executed by the controller 110. Alternatively, in one embodiment, the functions of the controller 110 may be implemented as one or more circuits. The present invention does not limit whether the functions of the controller 110 are implemented in software or in hardware.

In one embodiment, the display 120 includes, for example, an organic light-emitting diode (OLED) display device, a mini LED display device, a micro LED display device, a quantum dot (QD) LED display device, a liquid-crystal display (LCD) device, a tiled display device, a foldable display device, or an electronic paper display (EPD). However, the disclosure is not limited thereto.

FIG. 1B is a schematic diagram of a control system according to an embodiment of the present invention. The control system 10 includes the control device 100 and a camera 130. For details of the control device 100, refer to the description of FIG. 1A; they are not repeated here.

In one embodiment, the camera 130 includes, for example, a complementary metal oxide semiconductor (CMOS) camera or a charge-coupled device (CCD) camera, and an auxiliary light unit includes an infrared illumination unit. However, the disclosure is not limited thereto. In addition, in one embodiment, the camera 130 may be disposed outside the control device 100, forming the control system 10 together with it. In another embodiment, the camera 130 may be disposed in the control device 100. That is, the disclosure does not limit where the camera 130 is disposed.

In one embodiment, the camera 130 may be used to capture an image of the user in the real world. The controller 110 may then perform image processing on the captured image to determine the user's intention. In other words, the controller 110 may generate a user command related to the user's intention based on the image captured by the camera 130.

In one embodiment, the camera 130 may capture a user image. The user image includes the user's hand in the real world (also referred to as the real hand) or a handheld remote controller. The controller 110 may receive the user image from the camera 130. Based on the user image, the controller 110 may determine whether the user's virtual hand or control anchor in the virtual world contacts the first control point on the control surface. For example, the user's virtual hand or control anchor in the virtual world may correspond to the hand or handheld remote controller in the real world, so that operations can be performed in the virtual world. When the user's virtual hand or control anchor contacts the first control point in the virtual world, the user can perform various controls on the object through the first control point. In other words, in response to the virtual hand or control anchor contacting the first control point, the controller 110 may perform the first control on the object. It should be added that, for convenience of description, a virtual hand is used in the description below, but the disclosure does not limit whether manipulation in the virtual world is performed by a virtual hand, a control anchor, or another object with a similar function.

In one embodiment, when the user's virtual hand approaches or touches the first control point, the first control point may be displayed in a highlighted (to-be-selected) state. In one embodiment, when the user's virtual hand approaches or touches the first control point, the image around the object corresponding to the first control point, or the first ray emitted by the object, may display a prompt indication, such as a change in image color, increased brightness, outline enhancement, displayed text or an animated explanation, a played sound effect, or a change in the first ray such as a color change, highlighting, or pulsing. In this way, the user can clearly know that the object can currently be controlled through the first control point.

In one embodiment, the control surface may be, for example, a plane, a cube surface, a cylindrical surface, a spherical surface, or an ellipsoidal surface, but the disclosure is not limited thereto. In one embodiment, the control surface distance between the control surface and the user may be a preset distance, for example 40 cm, but the disclosure is not limited thereto. In one embodiment, the control surface distance may be determined based on the user's personal parameters. In one embodiment, the user's personal parameters may include, for example, the user's arm length, forearm length, height, head length, or other body data; the disclosure is not limited thereto. Since human body proportions follow certain regularities, the control surface distance best suited to the user can be calculated from any one kind of body data or a combination of several. In one embodiment, the center (e.g., center of gravity, center of a circle, or center of a sphere) or focus of the cube surface, cylindrical surface, spherical surface, or ellipsoidal surface may coincide with the user. For example, the control surface may be a spherical surface whose center coincides with the user's center point and whose radius is less than or equal to the length of the user's arm or forearm. Alternatively, the control surface may be two spherical surfaces whose centers are located at the center points of the user's left and right shoulders, respectively, and whose radii are less than or equal to the length of the user's arm or forearm. In other words, the control surface may be formed around the user so that the user can easily touch it. In addition, the user can choose a suitable control surface shape according to his or her own posture or application scenario. Furthermore, the user can also adjust the position and range (size) of the control surface according to the actual usage scenario; the implementation will be described in detail in subsequent embodiments.
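A minimal sketch of choosing the control surface distance from personal parameters. The 15% height-to-forearm ratio is an assumed anthropometric stand-in; the text only states that body proportions allow such an estimate and gives 40 cm as a preset default:

```python
def control_surface_radius(height_cm=None, forearm_cm=None, default_cm=40.0):
    """Pick a control-surface radius the user can comfortably reach.

    Illustrative heuristic only: if the forearm length is known, use it
    directly (the radius should not exceed it); otherwise estimate it
    from height using a rough anthropometric ratio (forearm ~ 15% of
    height, an assumed value), falling back to the 40 cm preset when no
    personal data is available.
    """
    if forearm_cm is not None:
        return forearm_cm
    if height_cm is not None:
        return 0.15 * height_cm  # assumed proportion, not from the patent
    return default_cm
```

A real implementation would presumably clamp the result against the measured arm or forearm length, since the text requires the radius not to exceed it.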

In one embodiment, a sensor is provided in the control device 100, and the sensor can detect the position at which the user wears the control device 100, thereby automatically obtaining the user's personal parameters. In another embodiment, the camera 130 may capture a posture image including the user's posture, and the controller 110 may automatically calculate the user's personal parameters based on the posture image. In yet another embodiment, the user may input his or her personal parameters into the control device 100. In other words, the disclosure does not limit how the user's personal parameters are obtained.

It is worth mentioning that the user controls the object through a control point on the nearby control surface, rather than directly controlling the distant object. In this way, the user can easily and accurately control the object, improving the user experience.

FIG. 2 is a schematic diagram of a usage scenario according to an embodiment of the present invention. Referring to FIG. 1A to FIG. 2, the scenario 200 includes a user U, an object 220 (also referred to as a cat), and an object 230 (also referred to as a button area) in a virtual world. The objects 220 and 230 may each include multiple control regions, and each control region of the objects 220 and 230 may respectively emit a ray 221, 222, 223, 231, 232, 233 toward the user U. In addition, a control surface 210 is formed around the user U.

In one embodiment, the control surface 210 is a virtual surface, and/or any of the rays 221, 222, 223, 231, 232, 233 is a virtual ray. That is, the user U may not normally see the control surface 210 or the rays 221, 222, 223, 231, 232, 233. Only when the user U's hand approaches or touches the hidden control surface 210 or a control point do the control surface 210 or the rays 221, 222, 223, 231, 232, 233 appear. When the user U's hand moves away from the control surface 210 or the control point, the control surface 210 or the rays 221, 222, 223, 231, 232, 233 may return to the hidden state. In other words, the controller 110 may decide whether to display or hide the control surface 210 or the rays 221, 222, 223, 231, 232, 233 based on the hand distance between the user U's virtual hand and the control surface 210, or between the virtual hand and a control point, in the virtual world. In this way, the control surface 210 or the rays 221, 222, 223, 231, 232, 233 appear only when the user U needs to control an object, and thus do not disturb the user U's visual experience. In one embodiment, an image effect on the object itself, or a played sound effect, may be used instead of displaying the control surface 210 and the rays 221, 222, 223, 231, 232, 233. In other words, the disclosure does not limit how the control surface 210 and the rays 221, 222, 223, 231, 232, 233 are actually presented; FIG. 2 is merely one implementation shown for convenience of description.
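The show/hide decision can be sketched as a simple distance test between the virtual hand and the control points. The 10 cm threshold and the function names are assumptions for illustration; the text does not fix a value:

```python
SHOW_DISTANCE_CM = 10.0  # assumed threshold; the text leaves it open

def surface_visible(hand_pos, control_points, threshold=SHOW_DISTANCE_CM):
    """Reveal the hidden control surface only when the virtual hand is
    within `threshold` of some control point, per the show/hide
    behaviour described above; otherwise the surface stays hidden."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return any(dist(hand_pos, p) <= threshold for p in control_points)
```

Running this test each frame yields the described behaviour: the surface fades in as the hand approaches and fades out again as it withdraws, so a resting hand never triggers anything.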

It is worth mentioning that while the control surface 210 and the rays 221, 222, 223, 231, 232, 233 are hidden, the user U's virtual hand may be placed arbitrarily. That is, since the control surface 210 and the rays 221, 222, 223, 231, 232, 233 are hidden at this time, the user U's virtual hand will not trigger control of a control point on the control surface 210. In this way, when no control is needed, the user U's virtual hand can rest comfortably without restriction, improving the user experience.

In one embodiment, the object 220 (the cat) may include multiple control regions. For example, the head of the object 220 is a first control region, the body of the object 220 is a second control region, and the feet of the object 220 are a third control region. The first control region may emit a ray 221 (also referred to as a first ray), the second control region may emit a ray 222 (also referred to as a second ray), and the third control region may emit a ray 223 (also referred to as a third ray). In one embodiment, the starting points of the rays 221, 222, 223 may be located at the center points of the first, second, and third control regions, respectively, but the disclosure is not limited thereto. The rays 221, 222, 223 may respectively intersect the control surface 210, forming a first control point, a second control point, and a third control point.

It should be added that, in some embodiments, the user U may click the first control point so that the object 220 reacts as if its head were being patted, click the second control point so that the object 220 reacts as if it were being tickled, or click the third control point to make the object 220 move or rotate. When the object 220 moves or rotates, the multiple control points corresponding to the object 220 also move correspondingly on the control surface 210. In other words, the controller 110 may perform different controls on the object 220 according to the different control points corresponding to it, and may move the positions of the control points corresponding to the object 220 on the control surface 210 according to the movement or rotation of the object 220. In this way, the user U can exercise various controls over the object 220 through its multiple control points.
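For a straight ray aimed at the user, keeping a control point attached to a moving object reduces to re-intersecting the re-aimed ray with the control sphere each frame. A minimal sketch under those two assumptions (user-centered spherical control surface, ray aimed at its center):

```python
def update_control_point(object_pos, user_pos, radius):
    """Recompute a control point after the object moves: the ray is
    re-aimed from the object toward the user, and the control point is
    where it meets the control sphere of the given radius centered on
    the user. A ray aimed at the sphere center always hits the near
    side of the sphere at distance `radius` from the center."""
    d = [u - o for u, o in zip(user_pos, object_pos)]
    norm = sum(c * c for c in d) ** 0.5  # distance from object to user
    unit = [c / norm for c in d]         # unit direction of the ray
    # Near-side intersection: step back from the center against the ray.
    return tuple(u - radius * uv for u, uv in zip(user_pos, unit))
```

Calling this per frame as the object translates or rotates reproduces the behaviour above: the control point slides along the control surface to track the object.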

In one embodiment, the object 230 (the button area) may include multiple control regions. For example, from top to bottom, the object 230 may include a first button, a second button, and a third button. The first button may emit a ray 231 (also referred to as a first ray), the second button may emit a ray 232 (also referred to as a second ray), and the third button may emit a ray 233 (also referred to as a third ray). The rays 231, 232, 233 may respectively intersect the control surface 210, forming a first control point, a second control point, and a third control point.

It should be added that, in some embodiments, the first, second, and third control points corresponding to the object 230 may form a control group, so that the first, second, and third control points are adjusted synchronously (for example, adjusting the display mode, the control point distance between control points, or the control point position of an individual control point), making selection and control easier for the user.

For example, the control points in a control group may be displayed or hidden synchronously. For instance, when the control group is displayed and the user's virtual hand approaches, touches, or selects the control group, the controller 110 may fade out the content on the control surface 210 other than the control group. That is, when a control group is selected, the control surface 210 may display only the one or more selected control groups, helping the user U focus on the content of the control group.

As another example, a magnification (equivalent to setting the control point distance) may be applied to the control group synchronously. For instance, when the control group is selected, the controller 110 may synchronously enlarge the control point distances within the control group so that the control points are distributed more sparsely than before, for example so that each control point distance is at least 5 cm, but the disclosure is not limited thereto. It should be noted that the way the control point distance is adjusted will be described in detail in subsequent embodiments.
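The synchronized magnification can be sketched as scaling every control point in the group away from the group's centroid; the function name and uniform scale factor are illustrative assumptions:

```python
def magnify_group(points, factor):
    """Spread a control group's points about their centroid so they are
    easier to pick individually: each point moves along the line from
    the centroid to itself, scaled by `factor` (> 1 spreads them out)."""
    n = len(points)
    dims = len(points[0])
    centroid = tuple(sum(p[i] for p in points) / n for i in range(dims))
    return [tuple(c + factor * (p[i] - c) for i, c in enumerate(centroid))
            for p in points]
```

A controller could pick `factor` adaptively, e.g. the smallest value that pushes every pairwise control point distance above the 5 cm floor mentioned above.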

As yet another example, the offset positions of the control points (the control point positions) in the control group may be set synchronously. For instance, when the control group is selected, the controller 110 may synchronously move the control points in the control group to a more spacious or fixed region on the control surface 210. In this way, by adjusting the control point positions of the control points in the control group, the user U can control the control points precisely.

As a further example, the control points in the control group may be displayed on the control surface 210 in the form of a keyboard, so that the user can conveniently touch each key on the keyboard to control each button on the object 230. In this way, the user U can intuitively input various commands (control commands) to the object 230, increasing the convenience of manipulation in the virtual world and improving the user's input speed.

FIG. 3A is a schematic diagram of a manipulation gesture according to an embodiment of the present invention. FIG. 3B is a schematic diagram of a manipulation gesture according to an embodiment of the present invention. FIG. 3C is a schematic diagram of a manipulation gesture according to an embodiment of the present invention. Referring to FIG. 1A to FIG. 3C, the gestures 300A, 300B, and 300C respectively illustrate various postures of the virtual hand H of the user U in the virtual world when performing control.

In one embodiment, the camera 130 may be used to capture an image of the posture of the user U's real hand in the real world, and the controller 110 may perform image processing based on the image captured by the camera 130. Then, the controller 110 may transmit a signal to the display 120 so that the virtual hand H of the user U in the virtual world performs a motion corresponding to that of the real hand in the real world. However, the disclosure is not limited thereto.

手勢300A、手勢300B以及手勢300C各自包括手指F1(又稱為第一手指或拇指)、手指F2(又稱為第二手指或食指)以及捏取(Pinch)點PP。在一實施例中,捏取點PP可設置於手指F1至手指F2的連線中較靠近手指F1的位置。一般而言,使用者U的虛擬手H在捏取時,手指F1的移動量會小於手指F2的移動量。如此一來,使用者U可較輕易地完成對捏取點PP的捏取。The gestures 300A, 300B, and 300C each include a finger F1 (also called the first finger or thumb), a finger F2 (also called the second finger or index finger), and a pinch point PP. In one embodiment, the pinch point PP can be set at a position on the line connecting finger F1 and finger F2 that is closer to finger F1. Generally speaking, when the virtual hand H of the user U performs a pinch, the movement of finger F1 is smaller than that of finger F2. In this way, the user U can more easily complete the pinch at the pinch point PP.
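As a rough illustration of this placement, the pinch point can be modeled as an interpolated point on the thumb-to-index segment. The sketch below is not from the patent itself; the function name, the `bias` parameter, and the use of 3D coordinate tuples are assumptions for illustration.

```python
def pinch_point(thumb_tip, index_tip, bias=0.3):
    # Hypothetical placement of the pinch point PP on the segment from
    # finger F1 (thumb) to finger F2 (index). A bias below 0.5 puts PP
    # closer to the thumb, so the index finger covers most of the
    # distance during a pinch, as described above.
    return tuple(t + bias * (i - t) for t, i in zip(thumb_tip, index_tip))
```

With `bias=0.3`, `pinch_point((0, 0, 0), (1, 0, 0))` yields a point nearer to the thumb tip than to the index tip.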

在一實施例中,使用者U可經由對捏取點PP進行捏取,來對虛擬世界中的物體220或物體230進行選取或進行確認。舉例而言,當捏取點PP與物體220對應的控制點重疊或為唯一最接近該控制點時,使用者U可對捏取點PP進行捏取,以對物體220進行選取或進行確認。在一實施例中,使用者U可對捏取點PP捏取一段預設時間或捏取後再放開,以出現子視窗功能或進行確認。換句話說,使用者U可對捏取點PP進行捏取,以觸發物體220對應的控制點,進而對物體220進行控制。In one embodiment, the user U can select or confirm the object 220 or the object 230 in the virtual world by pinching the pinch point PP. For example, when the pinch point PP overlaps with, or is uniquely closest to, the control point corresponding to the object 220, the user U can pinch the pinch point PP to select or confirm the object 220. In one embodiment, the user U can pinch the pinch point PP for a preset time, or pinch and then release it, to bring up a sub-window function or to confirm. In other words, the user U can pinch the pinch point PP to trigger the control point corresponding to the object 220, thereby controlling the object 220.

值得一提的是,使用者U在對捏取點PP進行捏取時,手指F1以及手指F2以外的其他手指可保持任意的姿勢。也就是說,使用者U可隨意地挑選自己感到舒適的姿勢(如圖3A或圖3B或其他手勢),並進行捏取的動作。並且,在一實施例中,如圖3C所示,在使用者U的手指F1以及手指F2接觸到捏取點PP後,手指F1以及手指F2可繼續地向手心收攏,並保持握拳的姿勢。並且,經由保持握拳的姿勢,使用者U可對捏取點PP維持選取的狀態。然而,本揭露並不以此為限。舉例而言,使用者可以透過其他手勢,例如手部往上下左右前後的任一方向滑動,以選擇、觸發或取消選擇物體220對應的控制點。如此一來,使用者U在虛擬世界中,能夠不受手勢的侷限,可以最舒適的姿態來進行控制,從而提升使用體驗。It is worth mentioning that when the user U is pinching the pinch point PP, the fingers other than the fingers F1 and F2 can maintain any posture. That is to say, the user U can arbitrarily choose a posture that he feels comfortable with (such as FIG. 3A or FIG. 3B or other gestures) and perform the pinching action. Moreover, in one embodiment, as shown in FIG. 3C , after the fingers F1 and F2 of the user U touch the pinch point PP, the fingers F1 and F2 can continue to be gathered toward the palm and maintain a fist posture. Moreover, by maintaining a fist posture, the user U can maintain the selected state of the pinch point PP. However, the present disclosure is not limited to this. For example, the user can use other gestures, such as sliding the hand in any direction up, down, left, right, forward, or backward, to select, trigger, or cancel the selection of the control point corresponding to the object 220. In this way, the user U can control the object 220 in the virtual world without being restricted by gestures, and can control the object in the most comfortable posture, thereby improving the user experience.

圖4是依照本發明的一實施例的一種操控情境的示意圖。參照圖1A至圖4,情境400繪示在虛擬世界中的使用者U的虛擬手H的捏取後並進行拖移的動作。在一實施例中,使用者U進行捏取後,虛擬手H的手指F1以及手指F2可保持在捏取的狀態,以對控制的對象(例如物體220)進行拖移。FIG4 is a schematic diagram of a control scenario according to an embodiment of the present invention. Referring to FIG1A to FIG4 , scenario 400 shows a user U's virtual hand H pinching and dragging in a virtual world. In one embodiment, after the user U pinches, the finger F1 and the finger F2 of the virtual hand H can remain in the pinching state to drag the controlled object (e.g., object 220).

需要說明的是,如圖4所示,使用者U在進行拖移時,虛擬手H的手腕可任意地翻轉。也就是說,虛擬手H的手腕不用維持在特定的角度。若是在手腕維持在特定的角度進行拖移,使用者U的姿勢會受到限制。換句話說,經由僅判斷手指F1以及手指F2之間的相對關係(例如距離),來維持捏取的狀態,對於使用者U來說可更靈活地進行操控。如此一來,使用者U可在符合人體工學的情況下,進行各種操控,進而提升使用體驗。It should be noted that, as shown in FIG4 , when the user U is dragging, the wrist of the virtual hand H can be turned arbitrarily. In other words, the wrist of the virtual hand H does not need to be maintained at a specific angle. If the wrist is maintained at a specific angle during dragging, the posture of the user U will be restricted. In other words, by only judging the relative relationship (such as distance) between the finger F1 and the finger F2 to maintain the pinching state, the user U can operate more flexibly. In this way, the user U can perform various operations in an ergonomic manner, thereby improving the user experience.
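One way to realize this wrist-independent behavior is to derive the pinch state purely from the thumb-index distance. The sketch below is an assumed implementation for illustration; the threshold value and function name are not taken from the patent.

```python
import math

def is_pinching(thumb_tip, index_tip, threshold=0.02):
    # The pinch state depends only on the relative distance between
    # finger F1 and finger F2; wrist orientation never enters the test,
    # so the user can freely rotate the wrist while dragging.
    return math.dist(thumb_tip, index_tip) < threshold
```

Because no wrist or palm pose appears in the predicate, a drag remains active through arbitrary wrist rotations, as described above.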

圖5A是依照本發明的一實施例的一種操控情境的示意圖。參照圖1A至圖5A,情境500A示意地繪示使用者U、控制面CTR以及物體OBJ之間的關係。如圖5A所示,物體OBJ可包括控制區域E1(又稱為第一控制區域)、控制區域E2(又稱為第二控制區域)、控制區域E3(又稱為第三控制區域)以及控制區域E4(又稱為第四控制區域)。控制區域E1~E4可位於物體OBJ的表面,但本揭露並不以此為限。多條射線(例如第一射線至第四射線)可分別地從控制區域E1~E4面向使用者U的方向發射。在一實施例中,多條射線的起點可分別地位於控制區域E1~E4的中心點,但本揭露並不以此為限。並且,對應於控制區域E1~E4的多條射線可分別地交會於控制面CTR,從而形成控制點E1’(又稱為第一控制點)、控制點E2’(又稱為第二控制點)、控制點E3’(又稱為第三控制點)以及控制點E4’(又稱為第四控制點)。FIG. 5A is a schematic diagram of a control scenario according to an embodiment of the present invention. Referring to FIG. 1A to FIG. 5A, scenario 500A schematically illustrates the relationship between the user U, the control surface CTR, and the object OBJ. As shown in FIG. 5A, the object OBJ may include a control area E1 (also referred to as the first control area), a control area E2 (also referred to as the second control area), a control area E3 (also referred to as the third control area), and a control area E4 (also referred to as the fourth control area). The control areas E1~E4 may be located on the surface of the object OBJ, but the present disclosure is not limited thereto. Multiple rays (e.g., the first ray to the fourth ray) may be emitted from the control areas E1~E4 toward the user U, respectively. In one embodiment, the starting points of the multiple rays may be located at the center points of the control areas E1~E4, respectively, but the present disclosure is not limited thereto. Furthermore, the multiple rays corresponding to the control areas E1~E4 may intersect the control plane CTR respectively, thereby forming control point E1' (also called the first control point), control point E2' (also called the second control point), control point E3' (also called the third control point), and control point E4' (also called the fourth control point).

再者,控制區域E1~E4彼此之間的距離可稱為控制區域距離,且控制點E1’~E4’彼此之間的距離可稱為控制點距離。舉例而言,控制區域E1與控制區域E2之間的距離可標示為控制區域距離D1A,且控制點E1’與控制點E2’之間的距離可標示為控制點距離D2A。Furthermore, the distance between the control areas E1-E4 may be referred to as the control area distance, and the distance between the control points E1'-E4' may be referred to as the control point distance. For example, the distance between the control area E1 and the control area E2 may be indicated as the control area distance D1A, and the distance between the control point E1' and the control point E2' may be indicated as the control point distance D2A.

此外,對應於控制區域E1~E4的多條射線可全部地交會於投影點P。舉例而言,控制區域E1所發射的射線(又稱為第一射線)與控制區域E2所發射的射線(又稱為第二射線)均交會於投影點P。也就是說,對應於控制區域E1~E4的多條射線的起點分別為控制區域E1~E4的中心,且對應於控制區域E1~E4的多條射線的終點均為投影點P。In addition, the multiple rays corresponding to the control areas E1~E4 may all intersect at the projection point P. For example, the ray emitted from the control area E1 (also referred to as the first ray) and the ray emitted from the control area E2 (also referred to as the second ray) both intersect at the projection point P. In other words, the starting points of the multiple rays corresponding to the control areas E1~E4 are the centers of the control areas E1~E4, respectively, and the end points of these rays are all the projection point P.

值得一提的是,控制點E1’~E4’是基於對應的射線交會於控制面而形成,且控制點E1’~E4’對應的射線是基於作為起點的控制區域E1~E4的中心以及作為終點的投影點P而形成。換句話說,當投影點P的位置改變時,控制點E1’~E4’之間的控制點距離(例如控制點距離D2A)也會相應地跟著改變。It is worth mentioning that the control points E1'~E4' are formed based on the intersection of the corresponding rays on the control surface, and the rays corresponding to the control points E1'~E4' are formed based on the center of the control area E1~E4 as the starting point and the projection point P as the end point. In other words, when the position of the projection point P changes, the control point distance between the control points E1'~E4' (for example, the control point distance D2A) will also change accordingly.

舉例而言,當投影點P位於使用者的後方且再往使用者的後方(例如圖5A的左方)移動時(即距離使用者U較原先的投影點P更遠),控制點E1’~E4’會往彼此靠近。換句話說,控制點E1’~E4’之間的控制點距離(例如控制點距離D2A)會減少。相反地,當投影點P位於使用者的後方且往使用者的前方(例如圖5A的右方)移動時(即距離使用者U較原先的投影點P更近),控制點E1’~E4’會往彼此遠離。換句話說,控制點E1’~E4’之間的控制點距離(例如控制點距離D2A)會增加。因此,在一實施例中,使用者U可經由調整投影點P的位置(即使用者U與投影點P之間的投影點距離),來調整控制面CTR上所顯示的控制點E1’~E4’之間的控制點距離(即顯示的倍率)。也就是說,控制器110可基於投影點P與使用者U的位置,決定控制點E1’與控制點E2’之間的控制點距離D2A。For example, when the projection point P is located behind the user and moves to the rear of the user (e.g., the left side of FIG. 5A ) (i.e., the distance from the user U is farther than the original projection point P), the control points E1’~E4’ will move closer to each other. In other words, the control point distance between the control points E1’~E4’ (e.g., the control point distance D2A) will decrease. On the contrary, when the projection point P is located behind the user and moves to the front of the user (e.g., the right side of FIG. 5A ) (i.e., the distance from the user U is closer than the original projection point P), the control points E1’~E4’ will move away from each other. In other words, the control point distance between the control points E1’~E4’ (e.g., the control point distance D2A) will increase. Therefore, in one embodiment, the user U can adjust the control point distance (i.e., the display magnification) between the control points E1'-E4' displayed on the control surface CTR by adjusting the position of the projection point P (i.e., the projection point distance between the user U and the projection point P). In other words, the controller 110 can determine the control point distance D2A between the control point E1' and the control point E2' based on the position of the projection point P and the user U.

另外,由於鄰近於使用者U的控制面CTR位於投影點P與物體OBJ之間,控制點E1’~E4’之間的控制點距離(例如控制點距離D2A)會小於控制區域E1~E4之間的控制區域距離(例如控制區域距離D1A)。也就是說,控制器110可響應於使用者U位於投影點P與物體OBJ之間,設定第一控制點與第二控制點之間的控制點距離D2A小於控制區域E1與第二控制區域E2之間的控制區域距離D1A。In addition, since the control plane CTR adjacent to the user U is located between the projection point P and the object OBJ, the control point distance between the control points E1'-E4' (e.g., the control point distance D2A) is smaller than the control area distance between the control areas E1-E4 (e.g., the control area distance D1A). In other words, the controller 110 can set the control point distance D2A between the first control point and the second control point to be smaller than the control area distance D1A between the control area E1 and the second control area E2 in response to the user U being located between the projection point P and the object OBJ.
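The projection described above can be sketched as a simple ray-plane intersection: each ray starts at a control region's center and ends at the projection point P, and the control point is where that ray crosses the control surface CTR. The code below is an illustrative model only (the patent does not prescribe an implementation); it assumes a planar CTR given by a point and a normal, and the coordinates chosen mimic the FIG. 5A configuration.

```python
import math

def control_point(region_center, proj_point, plane_point, plane_normal):
    # Intersect the ray from a control region's center toward the
    # projection point P with the (planar) control surface CTR.
    d = [p - c for c, p in zip(region_center, proj_point)]
    denom = sum(di * ni for di, ni in zip(d, plane_normal))
    if abs(denom) < 1e-9:
        return None  # ray parallel to the plane
    t = sum((q - c) * ni
            for c, q, ni in zip(region_center, plane_point, plane_normal)) / denom
    return tuple(c + t * di for c, di in zip(region_center, d))

# FIG. 5A-style layout: user at the origin, object regions E1/E2 in
# front at x = 5, projection point P behind the user at x = -3, and the
# control plane CTR at x = 0.5, between P and the object.
P = (-3.0, 0.0, 0.0)
CTR_POINT, CTR_NORMAL = (0.5, 0.0, 0.0), (1.0, 0.0, 0.0)
e1p = control_point((5.0, 1.0, 0.0), P, CTR_POINT, CTR_NORMAL)
e2p = control_point((5.0, -1.0, 0.0), P, CTR_POINT, CTR_NORMAL)
d2a = math.dist(e1p, e2p)  # control point distance D2A
d1a = 2.0                  # control region distance D1A
```

In this layout, with the user between P and the object, the computed control point distance D2A comes out smaller than the control region distance D1A, consistent with the relation stated above.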

附帶一提的是,控制面CTR的位置及範圍(大小)可被進一步調整。舉例而言,圖5A的控制面CTR可包括原點C與限制範圍,且在限制範圍之外的射線及對應的控制點既不顯示也不能被控制。在一實施例中,原點C可設置於控制面CTR的中心位置,但本揭露並不以此為限。在一實施例中,當使用者U想對控制面CTR的位置進行調整時,可用單手(或雙手)的虛擬手H去拖移控制面CTR的原點C。舉例而言,使用者U可以將控制面CTR拉近或遠離使用者U。例如,當使用者U的雙手同時往前延伸時,可將控制面CTR(包括原點C)的位置往使用者U的前方移動(即距離使用者U較原先的控制面CTR更遠)。也就是說,控制器110可響應於原點C的位置,設定控制面CTR與使用者U之間的控制面距離。Incidentally, the position and range (size) of the control surface CTR can be further adjusted. For example, the control surface CTR of FIG. 5A may include an origin C and a restricted range, and the rays and corresponding control points outside the restricted range are neither displayed nor controllable. In one embodiment, the origin C may be set at the center of the control surface CTR, but the present disclosure is not limited thereto. In one embodiment, when the user U wants to adjust the position of the control surface CTR, the user U can drag the origin C of the control surface CTR with one virtual hand H (or both hands). For example, the user U may move the control surface CTR closer to or farther from the user U. For example, when both hands of the user U extend forward at the same time, the position of the control surface CTR (including the origin C) may be moved forward of the user U (i.e., farther from the user U than the original control surface CTR). That is, the controller 110 may set the control surface distance between the control surface CTR and the user U in response to the position of the origin C.

再舉例而言,使用者U可以將控制面CTR往與使用者U距離不變的任一方向(例如使用者U的上下左右)移動。例如,當使用者U的雙手同時往左移動時,可將控制面CTR(包括原點C)的位置往使用者U的左方移動。再舉例而言,當雙手往不同方向移動時,可以將控制面CTR往雙手中點的方向移動。例如,當使用者U的左手往使用者U的左方移動,而右手往使用者U的上方移動時,可將控制面CTR往左上方移動。如此一來,使用者U可以依據實際使用情境,藉由拖移控制面CTR的原點C的位置,來移動控制面CTR的位置,進而調整控制面CTR所涵蓋的控制點範圍。For another example, the user U can move the control surface CTR to any direction (such as up, down, left, and right of the user U) whose distance from the user U remains unchanged. For example, when both hands of the user U move to the left at the same time, the position of the control surface CTR (including the origin C) can be moved to the left of the user U. For another example, when both hands move to different directions, the control surface CTR can be moved in the direction of the center of both hands. For example, when the left hand of the user U moves to the left of the user U and the right hand moves to the top of the user U, the control surface CTR can be moved to the upper left. In this way, the user U can move the position of the control surface CTR by dragging the position of the origin C of the control surface CTR according to the actual usage scenario, thereby adjusting the range of control points covered by the control surface CTR.
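The two-handed movement rule can be sketched as taking the midpoint of the two hand displacements. This is an assumed formulation for illustration; the patent only describes the behavior, not a formula.

```python
def plane_move_delta(left_hand_delta, right_hand_delta):
    # Move the control surface CTR (including its origin C) toward the
    # midpoint of the two hands' displacement vectors.
    return tuple((l + r) / 2 for l, r in zip(left_hand_delta, right_hand_delta))
```

For example, a left hand moving left `(-1, 0, 0)` and a right hand moving up `(0, 1, 0)` give `(-0.5, 0.5, 0.0)`, i.e., an upper-left movement of CTR, matching the example above.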

進一步而言,在一實施例中,當使用者U想對控制面CTR的範圍大小進行調整時,可用單手或雙手的虛擬手H去縮放控制面CTR的限制範圍。舉例而言,使用者U可以將控制面CTR的範圍放大或縮小。例如,當使用者U的雙手往遠離彼此的方向移動時,可以擴大控制面CTR的限制範圍,以包含到較多控制點(例如包含到圖5A中的控制點E1’~E4’之外的其他控制點)。也就是說,響應於使用者U擴大控制面的限制範圍CTR,本來位於限制範圍外的無效控制點可被移動至限制範圍內,從而將無效控制點設定為有效控制點。Furthermore, in one embodiment, when the user U wants to adjust the range size of the control surface CTR, the user can use one or both virtual hands H to scale the restricted range of the control surface CTR. For example, the user U can enlarge or reduce the range of the control surface CTR. For example, when the hands of the user U move away from each other, the restricted range of the control surface CTR can be expanded to include more control points (for example, to include other control points other than the control points E1'~E4' in FIG. 5A). That is, in response to the user U expanding the restricted range CTR of the control surface, the invalid control point originally located outside the restricted range can be moved into the restricted range, thereby setting the invalid control point as a valid control point.

再例如,當使用者U的雙手往靠近彼此的方向移動時,可以縮減控制面CTR的限制範圍,以包含到較少的控制點(例如減少到只包含控制點E2’與E3’)。也就是說,響應於使用者U縮減控制面CTR的限制範圍,本來位於限制範圍內的有效控制點可被移動至限制範圍外,從而將有效控制點設定為無效控制點。如此一來,使用者U可以依據實際使用情境,調整控制面CTR的位置與範圍,以更精準與快速地選擇控制面CTR上的控制點。For another example, when the hands of the user U move toward each other, the restricted range of the control surface CTR can be reduced to include fewer control points (for example, reduced to only control points E2' and E3'). In other words, in response to the user U reducing the restricted range of the control surface CTR, the valid control points originally located in the restricted range can be moved outside the restricted range, thereby setting the valid control points as invalid control points. In this way, the user U can adjust the position and range of the control surface CTR according to the actual usage scenario to select the control points on the control surface CTR more accurately and quickly.

圖5B是依照本發明的一實施例的一種操控情境的示意圖。參照圖1A至圖5B,情境500B示意地繪示使用者U、控制面CTR以及物體OBJ之間的關係。相較於在圖5A中,使用者U設置於投影點P以及物體OBJ之間,在圖5B中,物體OBJ設置於使用者U以及投影點P之間。其中控制區域E1與控制區域E2之間的距離可標示為控制區域距離D1B,且控制點E1’與控制點E2’之間的距離可標示為控制點距離D2B。另外,圖5B中繪示了限制範圍R,在後續的實施例中會進行說明。關於使用者U、控制面CTR以及物體OBJ的細節可參照圖5A的描述,在此不多加贅述。FIG. 5B is a schematic diagram of a control scenario according to an embodiment of the present invention. Referring to FIG. 1A to FIG. 5B , scenario 500B schematically illustrates the relationship between the user U, the control surface CTR, and the object OBJ. Compared to FIG. 5A , in which the user U is disposed between the projection point P and the object OBJ, in FIG. 5B , the object OBJ is disposed between the user U and the projection point P. The distance between the control area E1 and the control area E2 can be indicated as the control area distance D1B, and the distance between the control point E1' and the control point E2' can be indicated as the control point distance D2B. In addition, FIG. 5B illustrates a restriction range R, which will be described in subsequent embodiments. For details about the user U, the control surface CTR, and the object OBJ, please refer to the description of FIG. 5A , which will not be elaborated here.

類似於圖5A,當圖5B的投影點P的位置改變時,控制點E1’~E4’之間的控制點距離(例如控制點距離D2B)也會相應地跟著改變。舉例而言,當投影點P位於使用者的前方且再往使用者的前方(例如圖5B的右方)移動時(即距離使用者U較原先的投影點P更遠),控制點E1’~E4’會往彼此靠近。換句話說,控制點E1’~E4’之間的控制點距離(例如控制點距離D2B)會減少。相反地,當投影點P位於使用者的前方且往使用者的後方(例如圖5B的左方)移動時(即距離使用者U較原先的投影點P更近),控制點E1’~E4’會往彼此遠離。換句話說,控制點E1’~E4’之間的控制點距離(例如控制點距離D2B)會增加。因此,在一實施例中,使用者U可經由調整投影點P的位置(即使用者U與投影點P之間的投影點距離),來調整控制面CTR上所顯示的控制點E1’~E4’之間的控制點距離(即顯示的倍率)。也就是說,控制器110可基於投影點P與使用者U的位置,決定控制點E1’與控制點E2’之間的控制點距離D2B。Similar to FIG. 5A, when the position of the projection point P of FIG. 5B changes, the control point distances between the control points E1'~E4' (e.g., the control point distance D2B) will also change accordingly. For example, when the projection point P is located in front of the user and moves farther to the front of the user (e.g., the right side of FIG. 5B) (i.e., farther from the user U than the original projection point P), the control points E1'~E4' will move closer to each other. In other words, the control point distance between the control points E1'~E4' (e.g., the control point distance D2B) will decrease. On the contrary, when the projection point P is located in front of the user and moves toward the back of the user (e.g., the left side of FIG. 5B) (i.e., closer to the user U than the original projection point P), the control points E1'~E4' will move away from each other. In other words, the control point distance between the control points E1'~E4' (e.g., the control point distance D2B) will increase. Therefore, in one embodiment, the user U can adjust the control point distance between the control points E1'~E4' displayed on the control surface CTR (i.e., the display magnification) by adjusting the position of the projection point P (i.e., the projection point distance between the user U and the projection point P). In other words, the controller 110 can determine the control point distance D2B between the control point E1' and the control point E2' based on the positions of the projection point P and the user U.

另外,由於物體OBJ設置於投影點P以及鄰近於使用者U的控制面CTR之間,控制點E1’~E4’之間的控制點距離(例如控制點距離D2B)會大於控制區域E1~E4之間的控制區域距離(例如控制區域距離D1B)。也就是說,控制器110可響應於物體OBJ位於投影點P與使用者U之間,設定第一控制點與第二控制點之間的控制點距離D2B大於控制區域E1與第二控制區域E2之間的控制區域距離D1B。In addition, since the object OBJ is disposed between the projection point P and the control plane CTR adjacent to the user U, the control point distance between the control points E1'-E4' (e.g., the control point distance D2B) is greater than the control area distance between the control areas E1-E4 (e.g., the control area distance D1B). In other words, the controller 110 may set the control point distance D2B between the first control point and the second control point to be greater than the control area distance D1B between the control area E1 and the second control area E2 in response to the object OBJ being located between the projection point P and the user U.

此外,在一實施例中,投影點P可以設置於使用者U、控制面CTR以及物體OBJ之間,以得到對應於物體OBJ上的控制區域呈上下左右相反分布的控制點。或者,在一實施例中,投影點P可以設置於使用者U的前方或後方的無窮遠,以得到對應於物體OBJ上的控制區域呈相同分布的控制點,即各控制點之間的控制點距離將與物體OBJ上的對應的各控制區域之間的距離相同。如此一來,經由調整投影點P的位置和使用者U與投影點P之間的投影點距離,使用者U可調整控制面CTR或控制面CTR上的任一控制群組的顯示倍率(放大率),即能夠調整控制面CTR上的控制點距離(例如圖5A的控制點距離D2A或圖5B的控制點距離D2B),從而獲得較佳的視覺體驗,同時能夠簡單且有效地在虛擬世界中對物體精準地進行控制。並且,經由調整控制面CTR(例如透過調整控制面CTR的原點C)的位置或範圍,使用者U可調整使用者U與控制面CTR之間的控制面距離以及控制面CTR所涵蓋的控制點範圍(限制範圍R),從而能處在舒適的姿態下對控制面CTR進行操控,進而增加較佳的使用體驗。In addition, in one embodiment, the projection point P can be set between the user U, the control surface CTR, and the object OBJ, to obtain control points whose distribution is inverted vertically and horizontally relative to the control areas on the object OBJ. Alternatively, in one embodiment, the projection point P can be set at infinity in front of or behind the user U, to obtain control points with the same distribution as the control areas on the object OBJ, that is, the control point distance between the control points will be the same as the distance between the corresponding control areas on the object OBJ. In this way, by adjusting the position of the projection point P and the projection point distance between the user U and the projection point P, the user U can adjust the display magnification of the control surface CTR or of any control group on the control surface CTR, that is, adjust the control point distance on the control surface CTR (for example, the control point distance D2A in FIG. 5A or the control point distance D2B in FIG. 5B), thereby obtaining a better visual experience while being able to simply and effectively control objects accurately in the virtual world. Furthermore, by adjusting the position or range of the control surface CTR (for example, by adjusting the origin C of the control surface CTR), the user U can adjust the control surface distance between the user U and the control surface CTR and the range of control points covered by the control surface CTR (the restricted range R), thereby being able to manipulate the control surface CTR in a comfortable posture and improving the user experience.

需要說明的是,為了方便說明,圖5B中控制面CTR繪示為有限的長度(有限的平面)。然而,在一些實施例中,控制面CTR為無限延伸的平面。此外,圖5B中繪示了限制範圍R,用以表示控制面CTR的有效的控制區域或有效的顯示區域。在一實施例中,限制範圍R是控制面CTR上的有效控制點的涵蓋範圍。舉例而言,限制範圍R內的控制點為有效控制點,而在限制範圍R外的控制點為無效控制點。也就是說,響應於在虛擬世界中使用者U的虛擬手接觸在限制範圍R內的有效控制點,控制器110可接受使用者U的控制指令。並且,響應於在虛擬世界中使用者U的虛擬手接觸在限制範圍R外的無效控制點,控制器110不接受使用者U的控制指令。It should be noted that, for convenience of explanation, the control surface CTR in FIG. 5B is shown with a finite length (a finite plane). However, in some embodiments, the control surface CTR is an infinitely extending plane. In addition, a restricted range R is shown in FIG. 5B to indicate the valid control area or valid display area of the control surface CTR. In one embodiment, the restricted range R is the coverage of the valid control points on the control surface CTR. For example, the control points within the restricted range R are valid control points, and the control points outside the restricted range R are invalid control points. That is, in response to the virtual hand of the user U touching a valid control point within the restricted range R in the virtual world, the controller 110 accepts the control command of the user U. Furthermore, in response to the virtual hand of the user U touching an invalid control point outside the restricted range R in the virtual world, the controller 110 does not accept the control command of the user U.

舉例而言,圖5B中繪示了由投影點P為起點並經過物體OBJ的虛線(射線),並且相交於控制面CTR的限制範圍R外,從而形成無效控制點。由於無效控制點位於控制面CTR的限制範圍R外,控制器110不會執行使用者U對於無效控制點的操作。如此一來,經由設置限制範圍R,可避免誤操作的發生,從而增加使用體驗。For example, FIG. 5B shows a dashed line (ray) starting from the projection point P and passing through the object OBJ, which intersects the control surface CTR outside the restricted range R, thereby forming an invalid control point. Since the invalid control point is located outside the restricted range R of the control surface CTR, the controller 110 will not execute the operation of the user U on the invalid control point. In this way, by setting the restricted range R, erroneous operations can be avoided, thereby improving the user experience.
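The restricted range R can be modeled, for instance, as a disc around the origin C of the control surface: control points inside accept commands, points outside are ignored. The disc shape, radius parameter, and names below are assumptions for illustration, since the patent does not fix the geometry of R.

```python
import math

def is_valid_control_point(point, origin_c, limit_radius):
    # A control point is valid only if it lies inside the restricted
    # range R; commands on points outside R are not accepted.
    return math.dist(point, origin_c) <= limit_radius
```

Enlarging `limit_radius` turns formerly invalid points valid, and shrinking it does the reverse, mirroring the range-expansion and range-reduction behavior described above.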

圖6是依照本發明的一實施例的一種控制方法的流程圖。參照圖1以及圖6,控制方法600示意地繪示控制裝置100的操作流程,但本揭露並不以此為限。在本實施例中,控制方法600包括步驟S610至步驟S640。FIG. 6 is a flowchart of a control method according to an embodiment of the present invention. Referring to FIG. 1 and FIG. 6, the control method 600 schematically illustrates the operation flow of the control device 100, but the present disclosure is not limited thereto. In this embodiment, the control method 600 includes steps S610 to S640.

在步驟S610中,控制器110可傳輸訊號至顯示器120,使得在顯示器120所顯示的虛擬世界中的使用者U的周圍形成控制面210。在步驟S620中,控制器110可傳輸訊號至顯示器120,使得在顯示器120所顯示的虛擬世界中的物體(例如物體220或物體230)發射出第一射線(例如射線221~233的其中之一)。在步驟S630中,控制器110可傳輸訊號至顯示器120,使得顯示器120可基於第一射線,在控制面210上形成(顯示)第一控制點。在步驟S640中,控制器110可傳輸訊號至顯示器120,從而讓使用者U根據第一控制點,控制物體(例如物體220或物體230)。需要說明的是,相關的實施細節可參照圖1至圖5B的描述,在此不多加贅述。如此一來,使用者U可經由控制面210上的第一控制點,簡單且直覺地對物體進行精確的控制。In step S610, the controller 110 may transmit a signal to the display 120, so that the control surface 210 is formed around the user U in the virtual world displayed by the display 120. In step S620, the controller 110 may transmit a signal to the display 120, so that an object (e.g., object 220 or object 230) in the virtual world displayed by the display 120 emits a first ray (e.g., one of rays 221-233). In step S630, the controller 110 may transmit a signal to the display 120, so that the display 120 may form (display) a first control point on the control surface 210 based on the first ray. In step S640, the controller 110 may transmit a signal to the display 120, so that the user U can control the object (e.g., the object 220 or the object 230) according to the first control point. It should be noted that the relevant implementation details can be referred to the description of FIG. 1 to FIG. 5B, and will not be elaborated here. In this way, the user U can simply and intuitively control the object accurately through the first control point on the control surface 210.
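Once steps S610-S630 have produced the control points on the control surface, step S640 reduces to mapping the user's touch to the nearest control point. The helper below is a hypothetical sketch of that dispatch (the `max_dist` tolerance and names are illustrative assumptions); the object region whose control point is returned would then receive the control command.

```python
import math

def nearest_control(control_points, touch, max_dist=0.1):
    # S640: map the user's touch to a control point. Returns the index
    # of the touched control point, or None if no point is close enough
    # (assumes a non-empty list of control points).
    candidates = [(math.dist(cp, touch), i) for i, cp in enumerate(control_points)]
    d, i = min(candidates)
    return i if d <= max_dist else None
```

For example, with two control points on the surface, a touch near the first one selects index 0, while a touch far from both selects nothing.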

綜上所述,本發明的控制裝置與控制方法採用了控制面來對虛擬世界中的物體進行控制。因此,使用者可在保持舒適的姿態下,同時能夠簡單且有效地在虛擬世界中對物體精準地進行控制。In summary, the control device and control method of the present invention use a control surface to control objects in the virtual world. Therefore, the user can maintain a comfortable posture while simply and effectively controlling objects in the virtual world with precision.

10:控制系統 100:控制裝置 110:控制器 120:顯示器 130:攝影機 200、400、500A、500B:情境 210、CTR:控制面 220、230、OBJ:物體 221、222、223、231、232、233:射線 300A、300B、300C:手勢 600:控制方法 C:原點 D1A、D1B:控制區域距離 D2A、D2B:控制點距離 E1、E2、E3、E4:控制區域 E1’、E2’、E3’、E4’:控制點 F1、F2:手指 H:虛擬手 P:投影點 PP:捏取點 R:限制範圍 S610、S620、S630、S640:步驟 U:使用者 10: Control system 100: Control device 110: Controller 120: Display 130: Camera 200, 400, 500A, 500B: Context 210, CTR: Control surface 220, 230, OBJ: Object 221, 222, 223, 231, 232, 233: Rays 300A, 300B, 300C: Gestures 600: Control method C: Origin D1A, D1B: Control area distance D2A, D2B: Control point distance E1, E2, E3, E4: Control area E1’, E2’, E3’, E4’: Control point F1, F2: Fingers H: Virtual hand P: Projection point PP: Pinch point R: Restriction range S610, S620, S630, S640: Steps U: User

圖1A是依照本發明的一實施例的一種控制裝置的示意圖。 圖1B是依照本發明的一實施例的一種控制系統的示意圖。 圖2是依照本發明的一實施例的一種使用情境的示意圖。 圖3A是依照本發明的一實施例的一種操控手勢的示意圖。 圖3B是依照本發明的一實施例的一種操控手勢的示意圖。 圖3C是依照本發明的一實施例的一種操控手勢的示意圖。 圖4是依照本發明的一實施例的一種操控情境的示意圖。 圖5A是依照本發明的一實施例的一種操控情境的示意圖。 圖5B是依照本發明的一實施例的一種操控情境的示意圖。 圖6是依照本發明的一實施例的一種控制方法的流程圖。 FIG. 1A is a schematic diagram of a control device according to an embodiment of the present invention. FIG. 1B is a schematic diagram of a control system according to an embodiment of the present invention. FIG. 2 is a schematic diagram of a use scenario according to an embodiment of the present invention. FIG. 3A is a schematic diagram of a control gesture according to an embodiment of the present invention. FIG. 3B is a schematic diagram of a control gesture according to an embodiment of the present invention. FIG. 3C is a schematic diagram of a control gesture according to an embodiment of the present invention. FIG. 4 is a schematic diagram of a control scenario according to an embodiment of the present invention. FIG. 5A is a schematic diagram of a control scenario according to an embodiment of the present invention. FIG. 5B is a schematic diagram of a control scenario according to an embodiment of the present invention. FIG. 6 is a flow chart of a control method according to an embodiment of the present invention.

100:控制裝置 100: Control device

110:控制器 110: Controller

120:顯示器 120: Display

Claims (14)

一種控制裝置,適於控制虛擬世界中的物體,其中所述控制裝置包括:顯示器,用以顯示所述虛擬世界;以及控制器,耦接所述顯示器,其中所述控制器用以:在所述虛擬世界中,在使用者的周圍形成控制面,其中所述物體與所述使用者之間的距離大於所述控制面與所述使用者之間的距離;從所述物體發射出第一射線;基於所述第一射線,在所述控制面上形成第一控制點;根據所述第一控制點,對所述物體進行第一控制;以及根據在所述虛擬世界中所述使用者的虛擬手與所述控制面或與所述第一控制點之間的手部距離,判斷將所述控制面或所述第一射線隱藏。 A control device is suitable for controlling an object in a virtual world, wherein the control device comprises: a display for displaying the virtual world; and a controller coupled to the display, wherein the controller is used to: form a control surface around a user in the virtual world, wherein the distance between the object and the user is greater than the distance between the control surface and the user; emit a first ray from the object; form a first control point on the control surface based on the first ray; perform a first control on the object according to the first control point; and determine whether to hide the control surface or the first ray according to the hand distance between the virtual hand of the user and the control surface or the first control point in the virtual world. 如請求項1所述的控制裝置,其中所述控制器更用以:接收使用者影像,其中所述使用者影像包括在真實世界中的所述使用者的手或手持遙控器;根據所述使用者影像,判斷在所述虛擬世界中所述使用者的手或所述手持遙控器所對應的控制錨點是否接觸所述第一控制點;以及響應於所述控制錨點接觸所述第一控制點,對所述物體進行 所述第一控制。 The control device as described in claim 1, wherein the controller is further used to: receive a user image, wherein the user image includes the user's hand or handheld remote control in the real world; determine whether the control anchor corresponding to the user's hand or the handheld remote control in the virtual world touches the first control point based on the user image; and perform the first control on the object in response to the control anchor touching the first control point. 如請求項1所述的控制裝置,其中所述控制器更用以:根據所述使用者的個人參數,決定所述控制面與所述使用者之間的控制面距離。 A control device as described in claim 1, wherein the controller is further used to: determine the control surface distance between the control surface and the user according to the personal parameters of the user. 
如請求項1所述的控制裝置,其中所述控制器更用以:基於所述第一射線與所述控制面的交會點,在所述控制面上形成所述第一控制點。 A control device as described in claim 1, wherein the controller is further used to: form the first control point on the control surface based on the intersection point of the first ray and the control surface. 如請求項1所述的控制裝置,其中所述控制器更用以:根據在所述虛擬世界中所述使用者的所述虛擬手與所述控制面或與所述第一控制點之間的所述手部距離,判斷將所述控制面或所述第一射線顯示。 A control device as described in claim 1, wherein the controller is further used to: determine whether to display the control surface or the first ray according to the hand distance between the virtual hand of the user and the control surface or the first control point in the virtual world. 如請求項1所述的控制裝置,其中所述控制面包括平面、立方體表面、圓柱面、圓球面或橢圓球面。 A control device as described in claim 1, wherein the control surface includes a plane, a cubic surface, a cylindrical surface, a spherical surface, or an elliptical spherical surface. 如請求項1所述的控制裝置,其中所述物體包括第一控制區域以及第二控制區域,且所述控制器更用以:從所述第一控制區域發射所述第一射線;從所述第二控制區域發射第二射線,其中所述第一射線以及所述第二射線交會於投影點;基於所述第二射線,在所述控制面上形成第二控制點;以及 基於所述投影點與所述使用者的位置,決定所述第一控制點與所述第二控制點之間的控制點距離。 A control device as described in claim 1, wherein the object includes a first control area and a second control area, and the controller is further used to: emit the first ray from the first control area; emit the second ray from the second control area, wherein the first ray and the second ray intersect at a projection point; form a second control point on the control surface based on the second ray; and determine the control point distance between the first control point and the second control point based on the projection point and the position of the user. 
如請求項7所述的控制裝置,其中所述控制器更用以:響應於所述使用者位於所述投影點與所述物體之間,設定所述第一控制點與所述第二控制點之間的所述控制點距離小於所述第一控制區域與所述第二控制區域之間的控制區域距離;或者響應於所述物體位於所述投影點與所述使用者之間,設定所述第一控制點與所述第二控制點之間的所述控制點距離大於所述第一控制區域與所述第二控制區域之間的所述控制區域距離。 The control device as claimed in claim 7, wherein the controller is further used to: in response to the user being located between the projection point and the object, set the control point distance between the first control point and the second control point to be smaller than the control area distance between the first control area and the second control area; or in response to the object being located between the projection point and the user, set the control point distance between the first control point and the second control point to be larger than the control area distance between the first control area and the second control area. 如請求項1所述的控制裝置,其中所述物體包括第一控制區域以及第二控制區域,且所述控制器更用以:從所述第一控制區域發射所述第一射線;從所述第二控制區域發射第二射線;基於所述第二射線,在所述控制面上形成第二控制點;以及將第一控制點與第二控制點組成控制群組,以同步地調整所述第一控制點以及所述第二控制點。 A control device as described in claim 1, wherein the object includes a first control area and a second control area, and the controller is further used to: emit the first ray from the first control area; emit the second ray from the second control area; form a second control point on the control surface based on the second ray; and form a control group with the first control point and the second control point to synchronously adjust the first control point and the second control point. 如請求項9所述的控制裝置,其中所述控制器更用以:響應於所述控制群組被選取,淡化顯示所述控制面上除了所述控制群組以外的內容。 A control device as described in claim 9, wherein the controller is further used to: in response to the control group being selected, fade out the content other than the control group on the control surface. 
The control device as claimed in claim 1, wherein the control surface comprises a restriction range, and the controller is further configured to: in response to the virtual hand of the user in the virtual world touching a valid control point inside the restriction range, accept a control command of the user; and, in response to the virtual hand of the user in the virtual world touching an invalid control point outside the restriction range, reject the control command of the user.

The control device as claimed in claim 11, wherein the controller is further configured to: in response to the user enlarging the restriction range such that the invalid control point is moved into the restriction range, set the invalid control point as the valid control point; and, in response to the user shrinking the restriction range such that the valid control point is moved out of the restriction range, set the valid control point as the invalid control point.
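The valid/invalid distinction in the two claims above amounts to a point-in-range test evaluated against the current restriction range: because the same test is re-applied after the user resizes the range, control points are reclassified automatically, which is exactly the behavior claim 12 describes. A hedged sketch, assuming a circular restriction range on the control surface (names hypothetical):

```python
import math

def accept_command(control_point, range_center, range_radius):
    """Accept a user's control command only if the touched control
    point lies inside the restriction range on the control surface."""
    return math.dist(control_point, range_center) <= range_radius
```

A point at distance 3 from the center is an invalid control point while the radius is 2 (commands rejected), and becomes a valid one once the user enlarges the radius to 4 — no separate reclassification step is needed.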
A control method, adapted to control an object in a virtual world, the control method comprising: forming a control surface around a user, wherein a distance between the object and the user is greater than a distance between the control surface and the user; emitting a first ray from the object; forming a first control point on the control surface based on the first ray; controlling the object according to the first control point; and determining whether to hide the control surface or the first ray according to a hand distance, in the virtual world, between a virtual hand of the user and the control surface or the first control point.

The control method as claimed in claim 13, further comprising: receiving a user image, wherein the user image includes a hand of the user or a handheld remote controller in the real world; determining, according to the user image, whether a control anchor point corresponding to the hand of the user or to the handheld remote controller in the virtual world touches the first control point; and, in response to the control anchor point touching the first control point, performing a first control on the object.
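Under one plausible reading, both the hide decision in claim 13 and the anchor-contact decision in claim 14 reduce to distance thresholds between the virtual hand and the first control point. The sketch below illustrates that reading only; the threshold values and all names are invented for illustration and are not prescribed by the claims.

```python
import math

def update_hand_state(hand_pos, control_point, show_dist=0.5, touch_dist=0.05):
    """Return (visible, touching): display the control surface and ray
    only while the virtual hand is near the control point, and treat a
    very small hand distance as the control anchor point touching it
    (which would trigger the first control on the object)."""
    d = math.dist(hand_pos, control_point)
    return d <= show_dist, d <= touch_dist
```

A hand hovering 0.1 units away keeps the surface visible without triggering control; moving onto the point triggers it; moving a full unit away hides the surface and ray.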
TW111149610A 2022-12-23 2022-12-23 Control device and control method TWI870744B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW111149610A TWI870744B (en) 2022-12-23 2022-12-23 Control device and control method

Publications (2)

Publication Number Publication Date
TW202427164A TW202427164A (en) 2024-07-01
TWI870744B true TWI870744B (en) 2025-01-21

Family

ID=92929032

Family Applications (1)

Application Number Title Priority Date Filing Date
TW111149610A TWI870744B (en) 2022-12-23 2022-12-23 Control device and control method

Country Status (1)

Country Link
TW (1) TWI870744B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102760308A (en) * 2012-05-25 2012-10-31 任伟峰 Method and device for node selection of object in three-dimensional virtual reality scene
CN111309142A (en) * 2018-12-11 2020-06-19 托比股份公司 Method and device for switching input modality of display device
US10747336B1 (en) * 2017-07-31 2020-08-18 Amazon Technologies, Inc. Defining operating areas for virtual reality systems using sensor-equipped operating surfaces
CN112181551A (en) * 2020-08-31 2021-01-05 华为技术有限公司 Information processing method and related equipment
CN111880648B (en) * 2020-06-19 2022-01-28 华为技术有限公司 Three-dimensional element control method and terminal
TW202217516A (en) * 2020-10-29 2022-05-01 未來市股份有限公司 Method and system of modifying position of cursor
