
WO2016106541A1 - Touch operation method, touch operation component, and electronic device - Google Patents

Touch operation method, touch operation component, and electronic device

Info

Publication number
WO2016106541A1
WO2016106541A1 (PCT/CN2014/095489)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
touch sensor
signal
triggering
detecting
Prior art date
Application number
PCT/CN2014/095489
Other languages
English (en)
French (fr)
Inventor
杨松龄
刘自鸿
Original Assignee
深圳市柔宇科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市柔宇科技有限公司
Priority to CN201480031861.4A priority Critical patent/CN105518588B/zh
Priority to PCT/CN2014/095489 priority patent/WO2016106541A1/zh
Priority to EP14909353.6A priority patent/EP3242189A4/en
Priority to JP2017534917A priority patent/JP6470416B2/ja
Priority to KR1020177020815A priority patent/KR20170094451A/ko
Priority to US15/540,727 priority patent/US20170364197A1/en
Publication of WO2016106541A1 publication Critical patent/WO2016106541A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0362Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • The present invention relates to the field of device control, and in particular to a touch operation method, a touch operation component, and an electronic device having the touch operation component.
  • At present, because touch operation is intuitive, it has been widely applied to various electronic devices to make device control more convenient.
  • Such devices rely on a touch-screen display to provide a rich operation interface: according to the specific function options presented on the interface, the user can intuitively touch the corresponding display area to trigger the corresponding function.
  • However, for devices whose display area cannot be operated directly, such as an immersive head-mounted display device, physical buttons are generally used to provide the operating functions, but this approach has the following problems:
  • The user of an immersive device cannot see the control buttons, so the button layout and the corresponding control functions must be learned before the device is used; moreover, each physical button serves only a single function, which makes the device harder to use and lowers the user's acceptance of a new device.
  • In addition, for product-design reasons only a limited number of physical buttons can be provided, which necessarily reduces the functionality available to the user.
  • In summary, for products of different forms, the prior art cannot provide a control method that is compatible with the product's characteristics while also improving the user's control experience.
  • An object of the present invention is to provide a touch operation method, a touch operation component, and an electronic device having the touch operation component to solve the above problems in the prior art.
  • The invention provides a touch operation component, comprising:
  • a first touch sensor and a second touch sensor, each for generating a touch signal in response to a touch; and
  • a determination module connected to the first and second touch sensors, for determining, when it is detected that the first touch sensor generates a touch signal, whether the second touch sensor generates a touch signal and, if so, triggering a first operation; and for determining, when it is detected that the second touch sensor generates a touch signal, whether the first touch sensor generates a touch signal and, if so, triggering a second operation.
  • The present invention also provides an electronic device comprising any of the above touch operation components.
  • The invention also provides a touch operation method, comprising the following steps:
  • the first touch sensor and the second touch sensor generate touch signals in response to touches;
  • when it is detected that the first touch sensor generates a touch signal, determining whether the second touch sensor generates a touch signal and, if so, triggering the first operation;
  • when it is detected that the second touch sensor generates a touch signal, determining whether the first touch sensor generates a touch signal and, if so, triggering the second operation.
  • Through the combination of the first touch sensor and the second touch sensor, the touch operation component, electronic device, and touch operation method provided by the embodiments of the invention trigger different operations depending on the order in which the touch signals are detected. More operations thus become available within a limited physical structure, and because the operations so defined do not depend on an operation interface, the user's operating experience is improved.
  • FIG. 1 is a schematic block diagram of a touch operation component according to an embodiment of the invention.
  • FIG. 2 is a schematic top view of a touch operation component according to an embodiment of the invention.
  • FIG. 3 is a cross-sectional view of the touch operation component of FIG. 2.
  • FIG. 4 is a schematic diagram of a touch operation according to an embodiment of the present invention.
  • FIG. 5 is a flowchart of a touch operation method according to an embodiment of the present invention.
  • FIG. 6 is a flowchart of a touch operation method according to another embodiment of the present invention.
  • Referring to FIG. 1, the touch operation component includes a first touch sensor 11, a second touch sensor 20, and a determination module 30.
  • The first touch sensor 11 and the second touch sensor 20 are configured to generate a touch signal in response to a touch, that is, a touch signal is generated when a touch is detected.
  • The determination module 30 is configured to determine, when a touch signal generated by the first touch sensor 11 is detected, whether the second touch sensor 20 generates a touch signal and, if so, to trigger the first operation.
  • The determination module 30 is further configured to determine, when it detects that the second touch sensor 20 generates a touch signal, whether the first touch sensor 11 generates a touch signal and, if so, to trigger the second operation.
  • The touch operation component provided by this embodiment can therefore combine the first touch sensor 11 and the second touch sensor 20 to trigger different operations according to the order in which they are touched.
  • In a preferred embodiment, the first operation and the second operation may be different operations of the same manipulation function.
  • The manipulation function may be, for example, volume adjustment, video fast-forward/rewind, or menu entry/exit.
  • To explain this further, a specific description is given below with reference to the embodiment shown in FIG. 2. As shown in the figure, the first touch sensor 11 is circular, and the second touch sensor 20 is annular, surrounding the first touch sensor 11 and spaced apart from it.
  • When the user's finger slides from the first touch sensor 11 to the second touch sensor 20, the first touch sensor 11 generates a touch signal first and the second touch sensor 20 generates a touch signal afterwards; the determination module 30 determines that the two touch signals have been detected in that order and triggers the first operation, which may be entering the next-level menu of the function menu item where the current focus is located.
  • Correspondingly, when the user's finger slides from the second touch sensor 20 to the first touch sensor 11, the second touch sensor 20 generates a touch signal first and the first touch sensor 11 generates a touch signal afterwards; the determination module 30 determines that the two touch signals have been detected in that order and triggers the second operation, which may correspondingly be returning to the previous-level menu.
  • Of course, the first operation may also be increasing the volume with the corresponding second operation being decreasing the volume, or the first operation may be video fast-forward with the corresponding second operation being video rewind.
  • In a preferred embodiment, the determination module 30 is further configured to: when it detects that the later-generated touch signal of the first touch sensor or the second touch sensor is a continuous signal, keep performing the corresponding second operation or first operation until that continuous signal stops. In this way, the user's operation can be simplified in certain scenarios.
  • For example, when the first operation is increasing the volume and the user slides a finger from the first touch sensor 11 to the second touch sensor 20 and keeps it on the second touch sensor 20 for a period of time, the first touch sensor 11 generates a touch signal first and the second touch sensor 20 then generates a touch signal that lasts for a period of time, i.e., a continuous signal. The determination module 30 determines that the two touch signals have been detected in sequence and triggers the first operation of increasing the volume; because the second touch sensor 20 keeps generating the touch signal, the determination module 30 keeps increasing the volume according to the continuous signal until the user's finger leaves the second touch sensor 20. In this way, the user can increase or decrease the volume more conveniently.
  • Of course, the application scenario is not limited to volume adjustment; it may also be adjusting video progress or any other scenario suitable for this function.
  • Further, when there are multiple second touch sensors, multiple different operations can be triggered; for example, with N second touch sensors, 2N different operations can be defined by combining each of them with the first touch sensor.
  • As shown in FIG. 2, a plurality of separate second touch sensors 20A, 20B, and 20C are disposed in the region occupied by the second touch sensor 20 in the above embodiment. Combined with the first touch sensor 11, the second touch sensors 20A, 20B, and 20C may be defined, respectively, as the volume adjustment function, the video fast-forward/rewind function, and the menu entry/exit function.
  • That is, when the user's finger slides from the first touch sensor 11 to the second touch sensor 20A, the determination module 30 first receives the touch signal generated by the first touch sensor 11 and then receives the touch signal generated by the second touch sensor 20A, and it triggers the volume-increase operation. Conversely, when the user's finger slides from the second touch sensor 20A to the first touch sensor 11, the second touch sensor 20A generates a touch signal first and the first touch sensor 11 generates a touch signal afterwards, and the determination module 30 triggers the volume-decrease operation.
  • When the user's finger slides from the first touch sensor 11 to the second touch sensor 20B, the determination module 30 receives the touch signal generated by the first touch sensor 11 and then the touch signal generated by the second touch sensor 20B, and it triggers the video fast-forward operation. Conversely, when the finger slides from the second touch sensor 20B to the first touch sensor 11, the second touch sensor 20B generates a touch signal first and the first touch sensor 11 generates a touch signal afterwards, and the determination module 30 triggers the video rewind operation.
  • Similarly, the first touch sensor 11 and the second touch sensor 20C operate in the manner described above with respect to FIG. 2, and the details are not repeated here.
  • When the first touch sensor 11 and the second touch sensor 20 are placed within reach of one hand, the manipulation can be completed with a single continuous action, such as the slide across the two touch sensors in the above embodiment.
  • In other embodiments, when the first and second touch sensors are placed in regions far apart from each other, operations can be triggered with both hands together. In this way, multiple operations can be defined without relying on an interface menu, which reduces the need for complicated menu settings while keeping the operations clearly defined and convenient for the user.
  • Preferably, the determination module 30 is further configured to determine whether the successively detected touch signals generated by the first touch sensor 11 and the second touch sensor 20 fall within a preset effective time and, only if so, to trigger the first operation; and likewise to determine whether the successively detected touch signals generated by the second touch sensor 20 and the first touch sensor 11 fall within the preset effective time and, only if so, to trigger the second operation.
  • That is, when the determination module 30 detects that the first touch sensor 11 generates a touch signal, it determines whether the second touch sensor 20 generates a touch signal within the preset effective time and, if so, triggers the first operation; when the determination module 30 detects that the second touch sensor 20 generates a touch signal, it determines whether the first touch sensor 11 generates a touch signal within the preset effective time and, if so, triggers the second operation. Requiring the time difference between the successive touch signals generated by the first touch sensor 11 and the second touch sensor 20 to fall within the preset effective time before triggering the corresponding first or second operation ensures that the user's operations are valid and accurate.
  • The touch operation component 1 further includes a manipulation button 12 for triggering a third operation when pressed.
  • The manipulation button 12 can be disposed together with the first touch sensor 11 or the second touch sensor 20 to form a manipulation module 10.
  • As shown in FIG. 3, a cross-sectional view of the touch operation component of this embodiment, the first touch sensor 11 is disposed directly on the manipulation button 12; however, the invention is not limited to this, and the second touch sensor may instead be placed on the manipulation button according to the specific product form, characteristics, or other considerations.
  • In addition, the first touch sensor 11 or the second touch sensor 20 may be disposed on the manipulation button 12 indirectly through other components, as long as pressing the sensor presses the button. When the user presses the first touch sensor 11 so that the button 12 is pressed down, the third operation is triggered; the third operation may be defined, for example, as selecting the current item in the interface menu, or as a power on/off operation.
  • Preferably, the manipulation button 12 can be a push-button rotary encoder, which triggers the third operation when pressed and also detects the rotation angle to trigger a fourth operation. As shown in FIG. 3, the manipulation button can rotate about the X axis.
  • Specifically, the fourth operation can be volume adjustment, video fast-forward/rewind, or the like, thereby providing a more convenient and flexible adjustment method according to the control characteristics of the specific product and content. Since push-button rotary encoders are a mature, existing technology, their principle is not described again here.
  • In preferred embodiments of the invention, the touch operation component can thus use the first touch sensor in combination with the second touch sensor to define different operations for different manipulation functions, and can additionally provide button- and knob-style control modes by incorporating a push-button rotary encoder, so that different control modes are offered for different product control characteristics and the user's control experience is optimized.
  • In another embodiment of the invention, the determination module 30 may further be configured to: when the touch signal detected on the first touch sensor 11 is a path slide, trigger a fifth operation, then determine whether the second touch sensor 20 detects a touch signal and, if so, trigger a sixth operation; and, when the touch signal detected on the second touch sensor 20 is a path slide, trigger the fifth operation, then determine whether the first touch sensor 11 detects a touch signal and, if so, trigger the sixth operation.
  • For example, the fifth operation may be a focus movement and the sixth operation may be a "confirm" operation, for instance on a manipulation interface that is currently called up.
  • Specifically, when a touch operation component that includes the manipulation button is used, the manipulation interface can be called up by pressing the manipulation button; that is, the third operation can be associated with different specific operations depending on the currently displayed content or state.
  • Based on the displayed manipulation interface, the user can perform a path slide Y->Z on the first touch sensor 11 with a finger, as shown in FIG. 4, which triggers focus movement among the manipulation options on the manipulation interface. When the focus reaches the option to be manipulated, moving the finger onto the second touch sensor 20 with a path slide Z->K triggers the "confirm" operation and enters the next-level menu of that option.
  • Likewise, if that next-level menu contains multiple manipulation options, the user can continue with a path slide K->M on the second touch sensor 20, which triggers focus movement among the options of that menu.
  • When the focus reaches the option to be manipulated, moving the finger onto the first touch sensor 11 with a path slide M->N triggers the "confirm" operation and enters the next-level menu of that option. If the option has no next-level menu, the "confirm" operation triggered by the path slide M->N simply executes that option.
  • In the manner described above, the user can trigger multiple operations on different touch sensors through one continuous action. Building on this, in other embodiments operations such as exit or return may be performed by detecting that the user's finger has left the touch sensor.
  • A return operation may also be triggered by performing the "confirm" operation again without an intervening path slide. For example, if the user has performed a path slide K->M on the second touch sensor 20 to place the focus on the desired option and slid the finger onto the first touch sensor 11 with a path slide M->N to enter that option's next-level menu, but then finds nothing there to manipulate, the user can slide the finger directly back onto the second touch sensor 20 with a path slide N->T to trigger a return to the previous menu.
  • The touch operation component provided by the above embodiments of the present invention can use different touch operations on the first and second touch sensors to define different operations for different manipulation functions and, by further combining them with the path-slide touch mode, allows the user to trigger multiple operations with one continuous gesture. It can also incorporate a push-button rotary encoder to additionally provide button- and knob-style control modes, offering different control modes for different product control characteristics and optimizing the user's control experience.
  • The invention also provides an electronic device comprising any of the above touch operation components.
  • The electronic device includes, but is not limited to, a mobile phone, a tablet (PAD), a computer, a remote control, headphones, a head-mounted display device, and the like.
  • The invention also provides a touch operation method, as shown in FIG. 5, comprising the following steps:
  • S101: the first touch sensor and the second touch sensor generate touch signals in response to touches;
  • S102: when it is detected that the first touch sensor generates a touch signal, determining whether the second touch sensor generates a touch signal and, if so, triggering the first operation;
  • S103: when it is detected that the second touch sensor generates a touch signal, determining whether the first touch sensor generates a touch signal and, if so, triggering the second operation.
  • It should be understood that the execution order of steps S102 and S103 in FIG. 5 is only one embodiment and is not intended to limit the present invention; steps S102 and S103 may also be performed in reverse order or in parallel, among other combinations that a person skilled in the art can flexibly devise.
  • Preferably, there are multiple second touch sensors, each corresponding to a different function, and the first operation and the second operation correspond to different operations of the same function.
  • Preferably, the method further includes: when it is detected that the later-generated touch signal of the first touch sensor or the second touch sensor is a continuous signal, continuing to perform the corresponding second operation or first operation.
  • Preferably, steps S102 and S103 may specifically include: determining whether the successively detected touch signals generated by the first touch sensor and the second touch sensor fall within a preset effective time and, only if so, triggering the first operation; and determining whether the successively detected touch signals generated by the second touch sensor and the first touch sensor fall within the preset effective time and, only if so, triggering the second operation.
  • The method may further include: the first touch sensor or the second touch sensor is disposed directly or indirectly on a manipulation button, and when the manipulation button is pressed, a third operation is triggered.
  • Preferably, when the manipulation button is a push-button rotary encoder, it is used to trigger the third operation and is also used to detect the rotation angle to trigger a fourth operation.
  • A touch operation method provided by another embodiment of the invention includes the following steps:
  • Step S201: the first touch sensor and the second touch sensor generate touch signals in response to touches;
  • Step S202: when it is detected that the first touch sensor generates a path-slide touch signal, triggering a fifth operation, then determining whether the second touch sensor generates a touch signal and, if so, triggering a sixth operation;
  • Step S203: when it is detected that the second touch sensor generates a path-slide touch signal, triggering the fifth operation, then determining whether the first touch sensor generates a touch signal and, if so, triggering the sixth operation.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A touch operation component includes: a first touch sensor (11) and a second touch sensor (20), each for generating a touch signal in response to a touch; and a determination module (30) connected to the first touch sensor (11) and the second touch sensor (20), for determining, when a touch signal generated by the first touch sensor (11) is detected, whether the second touch sensor (20) generates a touch signal and, if so, triggering a first operation; and for determining, when a touch signal generated by the second touch sensor (20) is detected, whether the first touch sensor (11) generates a touch signal and, if so, triggering a second operation. By combining the first touch sensor (11) with the second touch sensor (20), the operation component provides a way to trigger different operations according to the order in which the touch signals are detected, so that more operations become available within a limited physical structure and the operating experience is improved. An electronic device and a touch operation method are also provided.

Description

Touch operation method, touch operation component, and electronic device
Technical Field
The present invention relates to the field of device control, and in particular to a touch operation method, a touch operation component, and an electronic device having the touch operation component.
Background Art
At present, because touch operation is intuitive, it has been widely applied to various electronic devices to make device control more convenient. Such devices rely on a touch-screen display to provide a rich manipulation interface: according to the specific function options presented on that interface, the user can intuitively touch the corresponding display area to trigger the corresponding function.
However, for devices whose display area cannot be operated directly, such as immersive head-mounted display devices, physical buttons are generally used to provide the operating functions. This approach has the following problems:
1. The control buttons cannot be seen while an immersive device is in use, so the user must learn the button layout and the corresponding control functions before using the device; moreover, each physical button serves only a single function, which makes the device harder to use and lowers the user's acceptance of a new device.
2. For product-design reasons, only a limited number of physical buttons can be provided, which necessarily reduces the functionality available to the user.
In summary, for products of different forms, the prior art cannot provide a control method that is compatible with the product's characteristics while also improving the user's control experience.
Summary of the Invention
An object of the present invention is to provide a touch operation method, a touch operation component, and an electronic device having the touch operation component, so as to solve the above problems in the prior art.
The present invention provides a touch operation component, comprising:
a first touch sensor and a second touch sensor, each for generating a touch signal in response to a touch; and
a determination module connected to the first and second touch sensors, for determining, when it is detected that the first touch sensor generates a touch signal, whether the second touch sensor generates a touch signal and, if so, triggering a first operation; and for determining, when it is detected that the second touch sensor generates a touch signal, whether the first touch sensor generates a touch signal and, if so, triggering a second operation.
The present invention also provides an electronic device comprising any of the above touch operation components.
The present invention also provides a touch operation method, comprising the following steps:
the first touch sensor and the second touch sensor generate touch signals in response to touches;
when it is detected that the first touch sensor generates a touch signal, determining whether the second touch sensor generates a touch signal and, if so, triggering a first operation;
when it is detected that the second touch sensor generates a touch signal, determining whether the first touch sensor generates a touch signal and, if so, triggering a second operation.
Through the combination of the first touch sensor and the second touch sensor, the touch operation component, electronic device, and touch operation method provided by the embodiments of the present invention offer a way to trigger different operations according to the order in which the touch signals are detected. More operations thus become available within a limited physical structure, and because the operations so defined do not depend on an operation interface, the user's operating experience is improved.
Brief Description of the Drawings
FIG. 1 is a schematic block diagram of a touch operation component according to an embodiment of the present invention.
FIG. 2 is a schematic top view of a touch operation component according to an embodiment of the present invention.
FIG. 3 is a schematic cross-sectional view of the touch operation component of FIG. 2.
FIG. 4 is a schematic diagram of a touch operation according to an embodiment of the present invention.
FIG. 5 is a flowchart of a touch operation method according to an embodiment of the present invention.
FIG. 6 is a flowchart of a touch operation method according to another embodiment of the present invention.
Detailed Description of the Embodiments
To make the objects, technical solutions, and advantages of the present invention clearer, the invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention and not to limit it.
In the preferred embodiments described below, the same reference numeral is used consistently for the same component.
Referring to FIG. 1, a schematic block diagram of a touch operation component 1 according to an embodiment of the present invention, the touch operation component includes a first touch sensor 11, a second touch sensor 20, and a determination module 30. The first touch sensor 11 and the second touch sensor 20 are used to generate a touch signal in response to a touch, that is, a touch signal is generated when a touch is detected. The determination module 30 is used to determine, when a touch signal generated by the first touch sensor 11 is detected, whether the second touch sensor 20 generates a touch signal and, if so, to trigger a first operation; the determination module 30 is also used to determine, when it detects that the second touch sensor 20 generates a touch signal, whether the first touch sensor 11 generates a touch signal and, if so, to trigger a second operation. The touch operation component provided by this embodiment can thus combine the first touch sensor 11 and the second touch sensor 20 to trigger different operations according to the order in which they are touched.
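The ordering logic described above can be pictured as a small event handler that remembers which sensor reported a touch first. The following is a minimal illustrative sketch, not code from the patent; the sensor identifiers, callback names, and the concrete operations are assumptions made for the example.

```python
# Minimal sketch of the determination module's ordering logic (illustrative only).
FIRST_SENSOR = "sensor_1"    # e.g. the circular first touch sensor 11
SECOND_SENSOR = "sensor_2"   # e.g. the annular second touch sensor 20


class DeterminationModule:
    def __init__(self, on_first_operation, on_second_operation):
        # The "first operation" and "second operation" are injected as callbacks,
        # since the patent leaves their concrete meaning to the product (menu, volume, ...).
        self.on_first_operation = on_first_operation
        self.on_second_operation = on_second_operation
        self.pending = None  # which sensor was touched first, if any

    def handle_touch(self, sensor_id):
        """Called whenever a sensor reports a touch signal."""
        if self.pending is None:
            self.pending = sensor_id          # remember which sensor fired first
            return
        if self.pending == FIRST_SENSOR and sensor_id == SECOND_SENSOR:
            self.on_first_operation()         # order: first sensor -> second sensor
        elif self.pending == SECOND_SENSOR and sensor_id == FIRST_SENSOR:
            self.on_second_operation()        # order: second sensor -> first sensor
        self.pending = None                   # reset for the next gesture


if __name__ == "__main__":
    module = DeterminationModule(
        on_first_operation=lambda: print("first operation: enter next-level menu"),
        on_second_operation=lambda: print("second operation: return to previous menu"),
    )
    # A finger sliding from the first sensor to the second sensor:
    module.handle_touch(FIRST_SENSOR)
    module.handle_touch(SECOND_SENSOR)   # -> first operation
```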
In a preferred embodiment, the first operation and the second operation may be different operations of the same manipulation function, for example volume adjustment, video fast-forward/rewind, or menu entry/exit. To explain this further, a specific description is given below with reference to the embodiment shown in FIG. 2. As shown in the figure, the first touch sensor 11 is circular, and the second touch sensor 20 is annular, surrounding the first touch sensor 11 and spaced apart from it. When the user's finger slides from the first touch sensor 11 to the second touch sensor 20, the first touch sensor 11 generates a touch signal first and the second touch sensor 20 generates a touch signal afterwards; the determination module 30 then determines that the two touch signals have been detected in that order and triggers the first operation, which may be entering the next-level menu of the function menu item where the current focus is located. Correspondingly, when the user's finger slides from the second touch sensor 20 to the first touch sensor 11, the second touch sensor 20 generates a touch signal first and the first touch sensor 11 generates a touch signal afterwards; the determination module 30 then determines that the two touch signals have been detected in that order and triggers the second operation, which may correspondingly be returning to the previous-level menu. Of course, the first operation may also be increasing the volume with the corresponding second operation being decreasing the volume, or the first operation may be video fast-forward with the corresponding second operation being video rewind.
In a preferred embodiment, the determination module 30 is further used to: when it detects that the later-generated touch signal of the first touch sensor or the second touch sensor is a continuous signal, keep performing the corresponding second operation or first operation until that later continuous signal stops. In this way, the user's operation can be simplified in certain scenarios. For example, when the first operation is increasing the volume and the user slides a finger from the first touch sensor 11 to the second touch sensor 20 and keeps it on the second touch sensor 20 for a period of time, the first touch sensor 11 generates a touch signal first and the second touch sensor 20 then generates a touch signal that lasts for a period of time, i.e., a continuous signal. The determination module 30 determines that the two touch signals have been detected in sequence and triggers the first operation of increasing the volume, and because the second touch sensor 20 keeps generating the touch signal, the determination module 30 keeps increasing the volume according to the continuous signal until the user's finger leaves the second touch sensor 20. In this way, the user can increase or decrease the volume more conveniently. Of course, the specific application scenario is not limited to volume adjustment; it may also be adjusting video progress or any other scenario suitable for this function.
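One way to read the "continuous signal" behaviour is as a hold-to-repeat loop: after the ordered pair of touches triggers an operation, that operation keeps repeating for as long as the later-touched sensor still reports contact. The sketch below is an interpretation under that assumption; the helper names (`is_touched`, `apply_step`) and the polling interval are invented for the example.

```python
import time


def repeat_while_held(is_touched, apply_step, poll_interval_s=0.1):
    """Keep applying a step (e.g. volume up) while the later-touched sensor stays touched.

    is_touched:  callable returning True while the later sensor still reports contact.
    apply_step:  callable performing one increment of the triggered operation.
    """
    apply_step()                    # the operation was already triggered once by the ordered touches
    while is_touched():             # later signal is "continuous": keep going
        time.sleep(poll_interval_s)
        apply_step()                # repeat until the finger leaves the sensor


if __name__ == "__main__":
    # Toy stand-ins: pretend the finger stays on the second sensor for a few polls.
    remaining = {"polls": 3}
    volume = {"level": 10}

    def still_touched():
        remaining["polls"] -= 1
        return remaining["polls"] > 0

    def volume_up():
        volume["level"] += 1
        print("volume ->", volume["level"])

    repeat_while_held(still_touched, volume_up, poll_interval_s=0.01)
```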
Further, when there are multiple second touch sensors, multiple different operations can be triggered; for example, with N second touch sensors, 2N different operations can be defined by combining each of them with the first touch sensor. As shown in FIG. 2, a plurality of separate second touch sensors 20A, 20B, and 20C are disposed in the region occupied by the second touch sensor 20 in the above embodiment. Combined with the first touch sensor 11, the second touch sensors 20A, 20B, and 20C may be defined, respectively, as the volume adjustment function, the video fast-forward/rewind function, and the menu entry/exit function. That is, when the user's finger slides from the first touch sensor 11 to the second touch sensor 20A, the determination module 30 first receives the touch signal generated by the first touch sensor 11 and then receives the touch signal generated by the second touch sensor 20A, and it triggers the volume-increase operation. Conversely, when the user's finger slides from the second touch sensor 20A to the first touch sensor 11, the second touch sensor 20A generates a touch signal first and the first touch sensor 11 generates a touch signal afterwards, and the determination module 30 triggers the volume-decrease operation. When the user's finger slides from the first touch sensor 11 to the second touch sensor 20B, the determination module 30 first receives the touch signal generated by the first touch sensor 11 and then receives the touch signal generated by the second touch sensor 20B, and it triggers the video fast-forward operation. Conversely, when the user's finger slides from the second touch sensor 20B to the first touch sensor 11, the second touch sensor 20B generates a touch signal first and the first touch sensor 11 generates a touch signal afterwards, and the determination module 30 triggers the video rewind operation. Similarly, the first touch sensor 11 and the second touch sensor 20C operate in the manner described above with respect to FIG. 2, and the details are not repeated here.
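The 2N-operation idea reduces to a lookup keyed on which second sensor was involved and in which direction the slide crossed it. Below is a small, table-driven illustrative sketch; the sensor names and operation labels mirror the FIG. 2 example but are otherwise assumptions.

```python
# Direction-aware lookup: (second sensor, order of touches) -> operation label.
# With N = 3 second sensors this defines 2N = 6 operations, as in FIG. 2.
OPERATIONS = {
    ("20A", "first->second"): "volume up",
    ("20A", "second->first"): "volume down",
    ("20B", "first->second"): "video fast-forward",
    ("20B", "second->first"): "video rewind",
    ("20C", "first->second"): "enter menu",
    ("20C", "second->first"): "exit menu",
}


def resolve_operation(first_touched, second_touched):
    """Return the operation for a slide between sensor '11' and one of the second sensors."""
    if first_touched == "11":
        return OPERATIONS.get((second_touched, "first->second"))
    if second_touched == "11":
        return OPERATIONS.get((first_touched, "second->first"))
    return None  # the gesture did not involve the central first sensor


if __name__ == "__main__":
    print(resolve_operation("11", "20B"))   # video fast-forward
    print(resolve_operation("20A", "11"))   # volume down
```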
It can be understood that, for the above manipulation functions, the first operation and the second operation can be specifically defined according to the application characteristics of different electronic devices. A person skilled in the art can implement this according to the specific application principles of the above embodiments.
When the first touch sensor 11 and the second touch sensor 20 are placed within reach of one hand, the manipulation can be completed with a single continuous action, such as the slide across the two touch sensors in the above embodiment. In other embodiments, when the first and second touch sensors are placed in regions far apart from each other, the triggering operations can be performed with both hands together. In this way, multiple operations can be defined without relying on an interface menu, which reduces the need for complicated menu settings while keeping the operations clearly defined and convenient for the user.
Preferably, the determination module 30 is further used to determine whether the successively detected touch signals generated by the first touch sensor 11 and the second touch sensor 20 fall within a preset effective time and, only if so, to trigger the first operation; and likewise to determine whether the successively detected touch signals generated by the second touch sensor 20 and the first touch sensor 11 fall within the preset effective time and, only if so, to trigger the second operation. That is, when the determination module 30 detects that the first touch sensor 11 generates a touch signal, it determines whether the second touch sensor 20 generates a touch signal within the preset effective time and, if so, triggers the first operation; when the determination module 30 detects that the second touch sensor 20 generates a touch signal, it determines whether the first touch sensor 11 generates a touch signal within the preset effective time and, if so, triggers the second operation. Requiring the time difference between the successively triggered touch signals of the first touch sensor 11 and the second touch sensor 20 to fall within the preset effective time before triggering the corresponding first or second operation ensures that the user's operations are valid and accurate.
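The preset effective time can be enforced by timestamping the first touch and discarding the pair if the second touch arrives too late. The sketch below extends the earlier ordering idea with such a timeout; the 0.5 s window is an arbitrary assumed value, since the patent only speaks of a "preset effective time".

```python
import time

EFFECTIVE_TIME_S = 0.5   # assumed example value; the patent does not fix a number


class TimedDetector:
    def __init__(self):
        self.pending_sensor = None
        self.pending_time = 0.0

    def handle_touch(self, sensor_id, now=None):
        """Return 'first_op', 'second_op', or None, honouring the effective-time window."""
        now = time.monotonic() if now is None else now
        if self.pending_sensor is None or (now - self.pending_time) > EFFECTIVE_TIME_S:
            # No pending touch, or the earlier touch has expired: start a new sequence.
            self.pending_sensor, self.pending_time = sensor_id, now
            return None
        earlier, self.pending_sensor = self.pending_sensor, None
        if earlier == "first" and sensor_id == "second":
            return "first_op"
        if earlier == "second" and sensor_id == "first":
            return "second_op"
        return None


if __name__ == "__main__":
    d = TimedDetector()
    d.handle_touch("first", now=0.0)
    print(d.handle_touch("second", now=0.3))   # within the window -> 'first_op'
    d.handle_touch("first", now=10.0)
    print(d.handle_touch("second", now=11.0))  # too late -> None (treated as a new sequence start)
```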
The touch operation component 1 further includes a manipulation button 12 for triggering a third operation when pressed. The manipulation button 12 can be disposed together with the first touch sensor 11 or the second touch sensor 20 to form a manipulation module 10. As shown in FIG. 3, a cross-sectional view of the touch operation component provided by this embodiment, the first touch sensor 11 of the touch operation component is disposed directly on the manipulation button 12; however, the invention is not limited to this, and the second touch sensor may instead be placed on the manipulation button according to the specific product form, characteristics, or other considerations. In addition, the first touch sensor 11 or the second touch sensor 20 may be disposed on the manipulation button 12 indirectly through other components, as long as pressing the sensor presses the button. When the user presses the first touch sensor 11 so that the button 12 is pressed down, the third operation is triggered; the third operation may be defined, for example, as selecting the current item in the interface menu, or as a power on/off operation.
Preferably, the manipulation button 12 can be a push-button rotary encoder, which is used to trigger the third operation when pressed and is also used to detect the rotation angle to trigger a fourth operation. As shown in FIG. 3, the manipulation button can rotate about the X axis. Specifically, the fourth operation can be volume adjustment, video fast-forward/rewind, or the like, thereby providing a more convenient and flexible adjustment method according to the control characteristics of the specific product and content. Since push-button rotary encoders are an existing, mature technology, their principle is not described again here.
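A push-button rotary encoder typically exposes two kinds of events: a press and a rotation step (with an angle or sign). The sketch below shows one plausible way to route those events to the third and fourth operations; the event encoding and the volume-step granularity are assumptions, not something the patent specifies.

```python
def handle_encoder_event(event, on_press, on_rotate):
    """Dispatch a push-button rotary encoder event.

    event:      dict such as {"type": "press"} or {"type": "rotate", "angle_deg": 15}
    on_press:   callable implementing the third operation (e.g. confirm / power)
    on_rotate:  callable taking the rotation angle, implementing the fourth operation
    """
    if event["type"] == "press":
        on_press()                              # third operation
    elif event["type"] == "rotate":
        on_rotate(event["angle_deg"])           # fourth operation, e.g. volume or seek


if __name__ == "__main__":
    volume = {"level": 10}

    def confirm():
        print("third operation: confirm current menu item")

    def adjust(angle_deg):
        volume["level"] += angle_deg // 15      # assumed: one step per 15 degrees of rotation
        print("fourth operation: volume ->", volume["level"])

    handle_encoder_event({"type": "press"}, confirm, adjust)
    handle_encoder_event({"type": "rotate", "angle_deg": 30}, confirm, adjust)
```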
In preferred embodiments of the present invention, the touch operation component can thus use the first touch sensor in combination with the second touch sensor to define different operations for different manipulation functions, and can additionally provide button- and knob-style control modes by incorporating a push-button rotary encoder, so that different control modes are offered for different product control characteristics and the user's control experience is optimized.
In another embodiment of the present invention, the determination module 30 may further be used to: when the touch signal detected by the first touch sensor 11 is a path slide, trigger a fifth operation, then determine whether the second touch sensor 20 detects a touch signal and, if so, trigger a sixth operation; and, when the touch signal detected by the second touch sensor 20 is a path slide, trigger the fifth operation, then determine whether the first touch sensor 11 detects a touch signal and, if so, trigger the sixth operation. For example, the fifth operation may be a focus movement and the sixth operation may be a "confirm" operation, for instance on a manipulation interface that is currently called up. Specifically, when a touch operation component that includes the manipulation button is used, the manipulation interface can be called up by pressing the manipulation button; that is, the third operation can be associated with different specific operations depending on the currently displayed content or state. Based on the displayed manipulation interface, the user can perform a path slide Y->Z on the first touch sensor 11 with a finger, as shown in FIG. 4, which triggers focus movement among the manipulation options on the manipulation interface. When the focus reaches the option to be manipulated, moving the finger onto the second touch sensor 20 with a path slide Z->K triggers the "confirm" operation and enters the next-level menu of that option. Likewise, if that next-level menu contains multiple manipulation options, the user can continue with a path slide K->M on the second touch sensor 20, which, according to the path slide K->M, triggers focus movement among the options of that menu; when the focus reaches the option to be manipulated, moving the finger onto the first touch sensor 11 with a path slide M->N triggers the "confirm" operation and enters the next-level menu of that option. If the option has no next-level menu, the "confirm" operation triggered by the path slide M->N executes that option. In the manner described above, the user can trigger multiple operations on different touch sensors through one continuous action. Building on this effect, in other embodiments operations such as exit or return may also be performed by detecting that the user's finger has left the touch sensor, or a return operation may be triggered by performing the "confirm" operation again without an intervening path slide. For example, if the user has performed a path slide K->M on the second touch sensor 20 to place the focus on the desired option and slid the finger onto the first touch sensor 11 with a path slide M->N to enter that option's next-level menu, but then finds nothing there to manipulate, the user can slide the finger directly back onto the second touch sensor 20 with a path slide N->T to trigger a return to the previous menu. The specific embodiments described above merely illustrate the content of the present invention and are not limiting.
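The path-slide behaviour can be modelled as a tiny menu navigator: sliding within one sensor moves the focus (fifth operation), crossing over to the other sensor confirms (sixth operation), and crossing back without an intervening focus move acts as a return. The following sketch is an illustrative interpretation of that flow; the menu contents and method names are invented for the example.

```python
class MenuNavigator:
    """Toy model of the Y->Z / Z->K / M->N / N->T gesture flow described above."""

    def __init__(self, root_menu):
        self.stack = [root_menu]   # menu hierarchy; top of stack is the current level
        self.focus = 0
        self.moved_since_confirm = False

    def slide_within_sensor(self, steps):
        """Fifth operation: a path slide on one sensor moves the focus."""
        options = self.stack[-1]["options"]
        self.focus = (self.focus + steps) % len(options)
        self.moved_since_confirm = True
        return options[self.focus]["label"]

    def cross_to_other_sensor(self):
        """Sixth operation ('confirm') when the slide crosses onto the other sensor.

        If no focus movement happened since the last confirm, treat it as 'return'.
        """
        if not self.moved_since_confirm and len(self.stack) > 1:
            self.stack.pop()                      # N->T style return to the previous menu
            self.focus = 0
            return "returned to previous menu"
        self.moved_since_confirm = False
        selected = self.stack[-1]["options"][self.focus]
        submenu = selected.get("submenu")
        if submenu:
            self.stack.append(submenu)            # enter the next-level menu
            self.focus = 0
            return "entered: " + selected["label"]
        return "executed: " + selected["label"]   # leaf option: execute it


if __name__ == "__main__":
    menu = {"options": [{"label": "Settings",
                         "submenu": {"options": [{"label": "Brightness"}]}},
                        {"label": "Play"}]}
    nav = MenuNavigator(menu)
    print(nav.slide_within_sensor(0))      # focus on "Settings"            (Y->Z)
    print(nav.cross_to_other_sensor())     # confirm: enter Settings        (Z->K)
    print(nav.cross_to_other_sensor())     # no move since confirm -> return (N->T)
```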
The touch operation component provided by the above embodiments of the present invention can use different touch operations on the first and second touch sensors to define different operations for different manipulation functions and, by further combining them with the path-slide touch mode, allows the user to trigger multiple operations with one continuous gesture. It can also incorporate a push-button rotary encoder to additionally provide button- and knob-style control modes, offering different control modes for different product control characteristics and optimizing the user's control experience.
The present invention also provides an electronic device comprising any of the above touch operation components. The electronic device includes, but is not limited to, a mobile phone, a tablet (PAD), a computer, a remote control, headphones, a head-mounted display device, and the like.
The present invention also provides a touch operation method, as shown in FIG. 5, comprising the following steps:
S101: the first touch sensor and the second touch sensor generate touch signals in response to touches;
S102: when it is detected that the first touch sensor generates a touch signal, determining whether the second touch sensor generates a touch signal and, if so, triggering a first operation;
S103: when it is detected that the second touch sensor generates a touch signal, determining whether the first touch sensor generates a touch signal and, if so, triggering a second operation.
It can be understood that the execution order of steps S102 and S103 in FIG. 5 is only one embodiment and is not intended to limit the present invention; steps S102 and S103 may also be performed in reverse order or in parallel, among other combinations that a person skilled in the art can flexibly devise.
Preferably, there are multiple second touch sensors, each corresponding to a different function, and the first operation and the second operation correspond to different operations of the same function.
Preferably, the method further includes: when it is detected that the later-generated touch signal of the first touch sensor or the second touch sensor is a continuous signal, continuing to perform the corresponding second operation or first operation.
Preferably, steps S102 and S103 may specifically include: determining whether the successively detected touch signals generated by the first touch sensor and the second touch sensor fall within a preset effective time and, only if so, triggering the first operation; and determining whether the successively detected touch signals generated by the second touch sensor and the first touch sensor fall within the preset effective time and, only if so, triggering the second operation. The method may further include: the first touch sensor or the second touch sensor is disposed directly or indirectly on a manipulation button, and when the manipulation button is pressed, a third operation is triggered.
Preferably, when the manipulation button is a push-button rotary encoder, it is used to trigger the third operation and is also used to detect the rotation angle to trigger a fourth operation.
A touch operation method provided by yet another embodiment of the present invention comprises the following steps:
Step S201: the first touch sensor and the second touch sensor generate touch signals in response to touches;
Step S202: when it is detected that the first touch sensor generates a path-slide touch signal, triggering a fifth operation, then determining whether the second touch sensor generates a touch signal and, if so, triggering a sixth operation;
Step S203: when it is detected that the second touch sensor generates a path-slide touch signal, triggering the fifth operation, then determining whether the first touch sensor generates a touch signal and, if so, triggering the sixth operation.
Since the specific principles of this method can be understood by reference to the principles of the touch operation component described above, they are not repeated here.
The above descriptions are merely preferred embodiments of the present invention and are not intended to limit it; any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.

Claims (17)

  1. A touch operation component, characterized by comprising:
    a first touch sensor and a second touch sensor, each for generating a touch signal in response to a touch; and
    a determination module connected to the first and second touch sensors, for determining, when a touch signal generated by the first touch sensor is detected, whether the second touch sensor generates a touch signal and, if so, triggering a first operation; and for determining, when a touch signal generated by the second touch sensor is detected, whether the first touch sensor generates a touch signal and, if so, triggering a second operation.
  2. The touch operation component according to claim 1, characterized in that the first operation and the second operation correspond to different operations of the same manipulation function.
  3. The touch operation component according to claim 1 or 2, characterized by comprising a plurality of the second touch sensors, each second touch sensor corresponding to a different manipulation function.
  4. The touch operation component according to claim 1, characterized in that the determination module is further configured to: when it detects that the later-generated touch signal of the first touch sensor or the second touch sensor is a continuous signal, continue to perform the corresponding second operation or first operation.
  5. The touch operation component according to claim 1, characterized in that the determination module is further configured to determine whether the successively detected touch signals generated by the first touch sensor and the second touch sensor fall within a preset effective time and, only if so, trigger the first operation; and to determine whether the successively detected touch signals generated by the second touch sensor and the first touch sensor fall within the preset effective time and, only if so, trigger the second operation.
  6. The touch operation component according to claim 1, characterized by further comprising a manipulation button, wherein a third operation is triggered when the manipulation button is pressed.
  7. The touch operation component according to claim 6, characterized in that the first touch sensor or the second touch sensor is disposed directly or indirectly on the manipulation button.
  8. The touch operation component according to claim 6, characterized in that the manipulation button is a push-button rotary encoder, used to trigger the third operation and also used to detect the rotation angle to trigger a fourth operation.
  9. The touch operation component according to any one of claims 1 and 6 to 8, characterized in that
    the determination module is further configured to: when it detects that the first touch sensor generates a path-slide touch signal, trigger a fifth operation and determine whether the second touch sensor generates a touch signal and, if so, trigger a sixth operation; and, when it detects that the second touch sensor generates a path-slide touch signal, trigger the fifth operation and determine whether the first touch sensor generates a touch signal and, if so, trigger the sixth operation.
  10. An electronic device, characterized by comprising the touch operation component according to any one of claims 1 to 9.
  11. A touch operation method, characterized by comprising the following steps:
    A. the first touch sensor and the second touch sensor generate touch signals in response to touches;
    B. when it is detected that the first touch sensor generates a touch signal, determining whether the second touch sensor generates a touch signal and, if so, triggering a first operation;
    C. when it is detected that the second touch sensor generates a touch signal, determining whether the first touch sensor generates a touch signal and, if so, triggering a second operation.
  12. The method according to claim 11, characterized in that there are a plurality of the second touch sensors, each second touch sensor corresponding to a different manipulation function, and the first operation and the second operation correspond to different operations of the same manipulation function.
  13. The method according to claim 11, characterized in that the method further comprises: when it is detected that the later-generated touch signal of the first touch sensor or the second touch sensor is a continuous signal, continuing to perform the corresponding second operation or first operation.
  14. The method according to claim 11, characterized in that the method further comprises: determining whether the successively detected touch signals generated by the first touch sensor and the second touch sensor fall within a preset effective time and, only if so, triggering the first operation; and determining whether the successively detected touch signals generated by the second touch sensor and the first touch sensor fall within the preset effective time and, only if so, triggering the second operation.
  15. The method according to claim 11, characterized in that the method further comprises:
    the first touch sensor or the second touch sensor is disposed directly or indirectly on a manipulation button, and when the manipulation button is pressed, a third operation is triggered.
  16. The method according to claim 15, characterized in that the method further comprises: the manipulation button is a push-button rotary encoder, used to trigger the third operation and also used to detect the rotation angle to trigger a fourth operation.
  17. The method according to claim 11, characterized in that
    step B comprises: when it is detected that the first touch sensor generates a path-slide touch signal, triggering a fifth operation, then determining whether the second touch sensor generates a touch signal and, if so, triggering a sixth operation; and
    step C comprises: when it is detected that the second touch sensor generates a path-slide touch signal, triggering the fifth operation, then determining whether the first touch sensor generates a touch signal and, if so, triggering the sixth operation.
PCT/CN2014/095489 2014-12-30 2014-12-30 一种触控操作方法、触控操作组件及电子设备 WO2016106541A1 (zh)

Priority Applications (6)

Application Number Priority Date Filing Date Title
CN201480031861.4A CN105518588B (zh) 2014-12-30 2014-12-30 一种触控操作方法、触控操作组件及电子设备
PCT/CN2014/095489 WO2016106541A1 (zh) 2014-12-30 2014-12-30 一种触控操作方法、触控操作组件及电子设备
EP14909353.6A EP3242189A4 (en) 2014-12-30 2014-12-30 Touch operation method, touch operation assembly and electronic device
JP2017534917A JP6470416B2 (ja) 2014-12-30 2014-12-30 タッチ操作方法、タッチ操作コンポーネント及び電子デバイス
KR1020177020815A KR20170094451A (ko) 2014-12-30 2014-12-30 터치 동작 방법, 터치 동작 어셈블리 및 전자 디바이스
US15/540,727 US20170364197A1 (en) 2014-12-30 2014-12-30 Touch operating methods, touch operation assembly, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2014/095489 WO2016106541A1 (zh) 2014-12-30 2014-12-30 一种触控操作方法、触控操作组件及电子设备

Publications (1)

Publication Number Publication Date
WO2016106541A1 true WO2016106541A1 (zh) 2016-07-07

Family

ID=55725005

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/095489 WO2016106541A1 (zh) 2014-12-30 2014-12-30 一种触控操作方法、触控操作组件及电子设备

Country Status (6)

Country Link
US (1) US20170364197A1 (zh)
EP (1) EP3242189A4 (zh)
JP (1) JP6470416B2 (zh)
KR (1) KR20170094451A (zh)
CN (1) CN105518588B (zh)
WO (1) WO2016106541A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019533403A (ja) * 2016-09-14 2019-11-14 シェンジェン ロイオル テクノロジーズ カンパニー リミテッドShenzhen Royole Technologies Co., Ltd. ヘッドホンアセンブリ、このヘッドホンアセンブリを具備するヘッドマウントヘッドホン及びヘッドマウント表示装置

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108900941B (zh) * 2018-07-10 2020-03-31 上海易景信息科技有限公司 耳机的音量控制方法及装置
KR102236950B1 (ko) * 2019-04-17 2021-04-06 주식회사 비엘디 터치패드 모듈
CN111556396A (zh) * 2020-04-28 2020-08-18 歌尔科技有限公司 一种tws耳机和触控方法
US11788729B2 (en) * 2020-11-10 2023-10-17 Midea Group Co., Ltd. Cooking appliance with integrated touch sensing controls
CN116473344A (zh) * 2022-01-14 2023-07-25 浙江捷昌线性驱动科技股份有限公司 电动桌的控制方法

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101377711A (zh) * 2007-08-28 2009-03-04 Lg电子株式会社 移动终端
US20090273583A1 (en) * 2008-05-05 2009-11-05 Sony Ericsson Mobile Communications Ab Contact sensitive display
CN102073403A (zh) * 2009-11-24 2011-05-25 联发科技股份有限公司 触摸感应装置和用于提供侧触摸板的方法
CN102890580A (zh) * 2012-09-06 2013-01-23 百度在线网络技术(北京)有限公司 移动终端和移动终端中光标位置选择的方法
CN103593085A (zh) * 2012-08-14 2014-02-19 联想(新加坡)私人有限公司 使用第一触摸接口和第二触摸接口来检测触摸事件
CN103902119A (zh) * 2012-12-24 2014-07-02 乐金显示有限公司 触摸感测装置

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4020246B2 (ja) * 2002-03-26 2007-12-12 ポリマテック株式会社 タッチパッド装置
KR100767686B1 (ko) * 2006-03-30 2007-10-17 엘지전자 주식회사 터치휠을 구비한 단말기 및 이를 위한 명령 입력 방법
US20090091536A1 (en) * 2007-10-05 2009-04-09 Microsoft Corporation Dial Pad Data Entry
CN102150108A (zh) * 2008-06-27 2011-08-10 诺基亚公司 触摸板
JP2010287007A (ja) * 2009-06-11 2010-12-24 Sony Corp 入力装置および入力方法
US20100315349A1 (en) * 2009-06-12 2010-12-16 Dave Choi Vehicle commander control switch, system and method
JP2011150413A (ja) * 2010-01-19 2011-08-04 Sony Corp 情報処理装置、操作入力方法及び操作入力プログラム
JP5581904B2 (ja) * 2010-08-31 2014-09-03 日本精機株式会社 入力装置
JP5376187B2 (ja) * 2010-09-24 2013-12-25 トヨタ自動車株式会社 物体検出装置及び物体検出プログラム
JP2013073513A (ja) * 2011-09-28 2013-04-22 Kyocera Corp 電子機器、制御方法、及び制御プログラム
JP5412692B2 (ja) * 2011-10-04 2014-02-12 株式会社モルフォ 画像処理装置、画像処理方法、画像処理プログラム及び記録媒体
JP5808705B2 (ja) * 2012-03-29 2015-11-10 シャープ株式会社 情報入力装置
US20150130712A1 (en) * 2012-08-10 2015-05-14 Mitsubishi Electric Corporation Operation interface device and operation interface method
JP5983225B2 (ja) * 2012-09-17 2016-08-31 株式会社デンソー 入力装置、および入力システム
US9013452B2 (en) * 2013-03-25 2015-04-21 Qeexo, Co. Method and system for activating different interactive functions using different types of finger contacts
KR101444091B1 (ko) * 2013-08-06 2014-09-26 엘지전자 주식회사 이동 단말기 및 그것의 제어방법
US9927927B2 (en) * 2014-05-05 2018-03-27 Atmel Corporation Implementing a virtual controller outside an area of a touch sensor
US20160203036A1 (en) * 2015-01-09 2016-07-14 Ecorithm, Inc. Machine learning-based fault detection system
US10664092B2 (en) * 2016-09-09 2020-05-26 Htc Corporation Portable electronic device, operating method for the same, and non-transitory computer readable recording medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101377711A (zh) * 2007-08-28 2009-03-04 Lg电子株式会社 移动终端
US20090273583A1 (en) * 2008-05-05 2009-11-05 Sony Ericsson Mobile Communications Ab Contact sensitive display
CN102073403A (zh) * 2009-11-24 2011-05-25 联发科技股份有限公司 触摸感应装置和用于提供侧触摸板的方法
CN103593085A (zh) * 2012-08-14 2014-02-19 联想(新加坡)私人有限公司 使用第一触摸接口和第二触摸接口来检测触摸事件
CN102890580A (zh) * 2012-09-06 2013-01-23 百度在线网络技术(北京)有限公司 移动终端和移动终端中光标位置选择的方法
CN103902119A (zh) * 2012-12-24 2014-07-02 乐金显示有限公司 触摸感测装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3242189A4 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019533403A (ja) * 2016-09-14 2019-11-14 シェンジェン ロイオル テクノロジーズ カンパニー リミテッドShenzhen Royole Technologies Co., Ltd. ヘッドホンアセンブリ、このヘッドホンアセンブリを具備するヘッドマウントヘッドホン及びヘッドマウント表示装置

Also Published As

Publication number Publication date
EP3242189A1 (en) 2017-11-08
JP6470416B2 (ja) 2019-02-13
KR20170094451A (ko) 2017-08-17
EP3242189A4 (en) 2018-08-08
JP2018500691A (ja) 2018-01-11
US20170364197A1 (en) 2017-12-21
CN105518588A (zh) 2016-04-20
CN105518588B (zh) 2019-09-27

Similar Documents

Publication Publication Date Title
WO2016106541A1 (zh) 一种触控操作方法、触控操作组件及电子设备
WO2017096944A1 (zh) 基于虚拟键盘的内容输入方法、装置及触控设备
US8855966B2 (en) Electronic device having proximity sensor and method for controlling the same
RU2687037C1 (ru) Способ, устройство быстрого разделения экрана, электронное устройство, ui отображения и носитель хранения
EP2508972B1 (en) Portable electronic device and method of controlling same
US9007314B2 (en) Method for touch processing and mobile terminal
US20130016055A1 (en) Wireless transmitting stylus and touch display system
EP2508970B1 (en) Electronic device and method of controlling same
US20150054630A1 (en) Remote Controller and Information Processing Method and System
WO2016165077A1 (zh) 可穿戴设备及其触摸屏、触摸操作方法和图形用户界面
KR20120073104A (ko) 정보 처리 장치, 정보 처리 방법 및 컴퓨터 프로그램 기억 장치
TW201426518A (zh) 觸感回饋系統及其提供觸感回饋的方法
KR20120119440A (ko) 전자기기에서 사용자의 제스처를 인식하는 방법
US10095277B2 (en) Electronic apparatus and display control method thereof
CN103777861A (zh) 终端和用于在终端中控制触摸操作的方法
KR20160019762A (ko) 터치 스크린 한손 제어 방법
WO2022228261A1 (zh) 显示方法和电子设备
KR102259434B1 (ko) 다기능 터치 펜
TWI544353B (zh) 使用者介面的輸入控制系統及方法
AU2017407719B2 (en) Method for adjusting scrolling speed of interface, related device, and computer program product
CN103246444A (zh) 以外围设备操作屏幕显示菜单的方法
KR20120129621A (ko) 휴대용 전기전자 기기의 사용자 인터페이스 제어 장치 및 방법
TWI475421B (zh) 觸控指令整合方法及觸控系統
TWI439923B (zh) 以週邊裝置操作螢幕顯示選單的方法
TWI567593B (zh) 能夠自動切換操作模式的多功能滑鼠裝置及其方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14909353

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017534917

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 15540727

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2014909353

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 20177020815

Country of ref document: KR

Kind code of ref document: A