
WO2018083765A1 - Input operation method, input operation device, display controller, and information content receiving system - Google Patents

Input operation method, input operation device, display controller, and information content receiving system Download PDF

Info

Publication number
WO2018083765A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
input operation
information
touch surface
medium
Prior art date
Application number
PCT/JP2016/082659
Other languages
French (fr)
Japanese (ja)
Inventor
吉澤 和彦
橋本 康宣
清水 宏
具徳 野村
奥 万寿男
Original Assignee
Maxell, Ltd. (マクセル株式会社)
Priority date
Filing date
Publication date
Application filed by Maxell, Ltd. (マクセル株式会社)
Priority to PCT/JP2016/082659 priority Critical patent/WO2018083765A1/en
Publication of WO2018083765A1 publication Critical patent/WO2018083765A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present invention relates to an input operation method, an input operation device, a display control device, and an information content receiving system, and more particularly to an input operation technique using a touch panel.
  • Patent Document 1 discloses that “a processing unit displays image data on a display device, and a cursor is positioned at one of a plurality of objects arranged in a display area of the image data.
  • the control unit detects the movement of the designated position on the touch panel from the start to the end of the drag operation on the touch panel, and determines the movement direction of the cursor based on the detected movement of the designated position.
  • the processing unit selects one of a plurality of objects based on the movement direction of the cursor output from the control unit, and displays the position of the selected object with the cursor (summary extract).
  • in Patent Document 1, a touch-down operation "down" registers the starting point of the touch position, and in a drag operation "move", the direction and distance by which the user's instruction medium has moved across the touch panel are calculated according to a predetermined procedure, after which the cursor moves between objects on the display screen. The determination of object movement is made for each adjacent object, and multiple determinations are required to move to an arbitrary object.
  • such an operation requires multiple drag operations, calculations, and determinations before the cursor reaches a distant object desired by the user, so there is a problem with responsiveness to user instructions.
  • the present invention has been made in view of the above points, and an object thereof is to provide an input operation technique that realizes a simple and intuitive object selection method.
  • the present invention has the configuration described in the claims.
  • as one example, the present invention provides an input operation method for displaying, on a display screen, a display pointer that indicates the range targeted by an input operation, using a touch panel that detects a capacitance corresponding to the degree of contact between an instruction medium for inputting a user's instruction and a touch surface.
  • the method acquires indicated position information that includes two-dimensional position information, expressed in a two-dimensional coordinate system within the touch surface, of the indication point at which the instruction medium is detected, together with information indicating the degree of contact between the instruction medium and the touch surface when the instruction medium is detected; determines the size of the display pointer based on the information indicating the degree of contact; and, referring to operation display system information that associates the two-dimensional coordinate system of the display screen with the two-dimensional coordinate system within the touch surface, displays the display pointer of the determined size with its center aligned to the point obtained by converting the two-dimensional position information of the indication point into the display screen's coordinate system.
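As an illustration only (not taken from the specification), the indicated position information and the pointer-sizing step might be modelled as in the following Python sketch. The field names, the normalized distance range, and the linear size mapping are assumptions.

```python
from dataclasses import dataclass

@dataclass
class IndicatedPosition:
    """Indicated position information reported by the touch panel.

    a, b : indication point in the touch-surface coordinate system (0..1).
    d    : degree of contact/approach; assumed normalized so that d == 0 means
           contact and d == 1 means the farthest detectable hover.
    """
    a: float
    b: float
    d: float

def pointer_size(d: float, r_min: float = 8.0, r_max: float = 120.0) -> float:
    """The display pointer grows with the distance d and shrinks as the
    instruction medium approaches the touch surface."""
    d = min(max(d, 0.0), 1.0)              # clamp to the assumed range
    return r_min + (r_max - r_min) * d     # linear mapping (an assumption)

# Example: a hover halfway between contact and the detection limit
print(pointer_size(IndicatedPosition(0.4, 0.7, 0.5).d))   # -> 64.0
```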
  • Block diagram showing a schematic configuration of the information content receiving system according to the first embodiment (FIG. 1)
  • Schematic configuration diagram of the input operation device 1 (FIG. 2A)
  • Hardware configuration diagram of the information content receiving device 2 (FIG. 2B)
  • Side view of the touch sensor unit during a user selection operation (FIG. 3A)
  • Top view of the touch sensor unit during a user selection operation (FIG. 3B)
  • Diagram showing the operation screen during a user selection operation (FIG. 3C)
  • Flowchart showing the flow of processing of the information content receiving device and the display device (FIG. 5)
  • Diagrams showing the operation screen when the distance between the touch panel and the user instruction medium is comparatively large (FIG. 6A) and comparatively small (FIG. 6B)
  • Diagram showing the menu operation screen with terrestrial broadcasting selected (FIG. 7A)
  • Diagram showing the terrestrial broadcast operation screen (FIG. 7B)
  • FIG. 1 is a block diagram showing a schematic configuration of an information content receiving system according to the first embodiment.
  • FIG. 2A is a schematic configuration diagram of the input operation device 1.
  • FIG. 2B is a hardware configuration diagram of the information content receiving device 2.
  • the information content receiving system according to the present embodiment includes an input operation device 1, an information content receiving device 2, and a display device 3, which are connected by wired communication or wireless communication. In this embodiment, it is assumed that a wireless communication connection is established.
  • although the input operation device 1 has a function as a remote controller for the information content receiving device 2, the input operation device 1 of the present embodiment includes a touch panel 11 in addition to the input keys 12 to enable various operations.
  • the input operation device 1 is configured by accommodating in the main body 10 an input operation control unit 110 and a communication unit 120 that transmits and receives information to and from an external device.
  • the input operation control unit 110 is configured by connecting a CPU 101, a ROM 102, a RAM 103, and an I / F 105 to a bus 106.
  • a touch panel 11, input keys 12, and a communication unit 120 are connected to the I / F 105.
  • the communication unit 120 may include a Bluetooth interface for transmitting / receiving a signal to / from an external device (in this embodiment, the information content receiving device 2) using, for example, Bluetooth (registered trademark).
  • the CPU 101 is a control unit for controlling the input operation device 1 and executes predetermined processing according to a program stored in advance in the ROM 102, for example.
  • One of the predetermined processes is, for example, a process of discriminating or analyzing a user operation on the touch panel 11 and / or the input key 12 and generating information corresponding to the operation.
  • the information generated by the CPU 101 is output to the communication unit 120 via the I / F 105, and is transmitted from the communication unit 120 to the information content receiving device 2 wirelessly using Bluetooth or the like.
  • Information on operations on the touch panel 11 and / or the input keys 12 and / or information generated by the CPU 101 may be output to the RAM 103 via the I / F 105 and temporarily stored in the RAM 103.
  • a touch panel 11 (corresponding to a touch sensor) and one or more input keys 12 are provided on the upper surface of the main body 10 (see FIG. 1).
  • the touch panel 11 acquires two-dimensional position information of an instruction point 14 (indicated by an X in the drawing for convenience of explanation; it is not actually provided on the input operation device 1) indicated on the touch panel 11 by a user instruction medium 13 (for example, a finger or a stylus pen) used to convey the user's operation intention, and the input operation device 1 transmits the acquired two-dimensional position information to the information content receiving device 2.
  • This two-dimensional position information is expressed by a two-dimensional coordinate system in the touch surface of the touch panel 11.
  • the instruction point 14 may indicate a position at which the touch panel 11 detects the user instruction medium 13 while the medium is not in contact with the touch panel 11, or a position at which the user instruction medium 13 is detected while it is in contact with the touch panel 11.
  • the touch panel 11 is configured using a touch sensor that detects a capacitance corresponding to the degree of contact or approach between the user instruction medium 13 and the touch surface, and outputs information indicating the degree of contact or approach between the user instruction medium 13 and the touch surface when the user instruction medium 13 is detected. This degree of contact or approach covers both the separation distance between the touch panel 11 and the user instruction medium 13 in the non-contact state and the contact state itself.
  • the input operation method according to the present embodiment is executed by detecting an action in which the user instruction medium 13 is brought close to the touch panel 11 and brought into contact with the touch panel 11.
  • the two-dimensional position information of the indication point 14 in the touch-surface coordinate system and the information indicating the degree of contact or approach are collectively referred to as indicated position information.
  • the input operation control unit 110 outputs the designated position information output from the touch panel 11 to the communication unit 120, and the communication unit 120 transmits the designated position information to the information content receiving device 2.
  • the information content receiving device 2 is connected to the information storage device 4, the broadcast receiving medium 5 that receives broadcast waves, the communication network 6, and the display device 3, in addition to the input operation device 1. The information content receiving device 2 receives at least one item of information content from the information storage device 4, the broadcast receiving medium 5, or the communication network 6 in accordance with instructions from the input operation device 1, performs the processing described later, and displays the result on the display device 3.
  • the information content receiving device 2 includes a stored information input / output unit 24, a broadcast receiving unit 25, and a network transmission / reception unit 26, which constitute the functional units that receive information content, and a display control device that performs the processing described later.
  • the display control device includes, for example, an input operation processing unit 21, an operation screen generation unit 22, a central control unit 23, and a display output unit 27.
  • the input operation processing unit 21 receives the indicated position information of the instruction point 14 transmitted from the input operation device 1 by radio using, for example, Bluetooth, and demodulates the modulation applied for the radio transmission. If the indicated position information is encoded, a decoding process is also performed. The input operation processing unit 21 outputs the demodulated / decoded indicated position information to the display pointer processing unit 23a of the central control unit 23, and outputs a control signal to the operation screen generation unit 22 in response to, or based on, reception of the indicated position information.
  • the stored information input / output unit 24 reads the information content stored in the information storage device 4 and sends the information content to the central control unit 23.
  • the information content is a moving image, a photograph, or a sound. Since information content has a large amount of information in its original form and is not suitable for storage as it is, it is generally subjected to code compression processing such as that specified by MPEG (Motion Picture Experts Group) for moving images and JPEG (Joint Photographic Experts Group) for photographs.
  • the central control unit 23 performs a decoding / decompression process on the code-compressed information content, restores the original information form, and outputs it to the display output unit 27.
  • the broadcast receiving unit 25 receives terrestrial and satellite broadcast signals from the broadcast receiving medium 5, demodulates them, and obtains information content code-compressed by MPEG or the like.
  • the central control unit 23 restores the information content to the form of the original information and outputs it to the display output unit 27.
  • the network transmission / reception unit 26 receives information content that is code-compressed from the communication network 6 according to a well-known communication protocol such as HTTP (Hypertext Transfer Protocol) or RTP (Real-time Transport Protocol).
  • the central control unit 23 restores the information content to the form of the original information and outputs it to the display output unit 27.
  • the central control unit 23 can also store the information content obtained by the broadcast receiving unit 25 and the network transmission / reception unit 26 as they are or re-encode them in the information storage device 4 via the stored information input / output unit 24.
  • the central control unit 23 can handle information contents other than moving images, photographs, and sounds.
  • the central control unit 23 processes web information such as HTML (Hyper Text Markup Language) via the network transmission / reception unit 26 and the communication network 6.
  • the central control unit 23 also outputs the processed web information to the display output unit 27.
  • the central control unit 23 includes a display pointer processing unit 23a.
  • the display pointer processing unit 23a determines the display position of the display pointer in the display screen of the display device 3 based on the indicated position information output from the input operation processing unit 21. Further, the display pointer processing unit 23a determines the size of the display pointer based on the information indicating the degree of contact or approach included in the indicated position information.
  • the display pointer 33 indicates a range in the display screen selected by the user through the input operation.
  • the display pointer processing unit 23a refers to operation display system information in which the two-dimensional coordinate system of the display screen of the display device 3 is associated with the two-dimensional coordinate system in the surface of the touch panel 11, converts the two-dimensional position of the indication point 14 into the two-dimensional coordinate system of the display screen, and outputs to the operation screen generation unit 22 display pointer information for displaying the display pointer 33 of the determined size with its center 36 aligned to the converted point.
  • An area covered by the display pointer 33 in the screen is a range to be input by the user.
  • the operation screen generation unit 22 generates a base screen on which a plurality of objects 34 are arranged using the control signal output from the input operation processing unit 21 as a trigger. Further, the operation screen generation unit 22 generates an operation screen 32 by combining the base screen with the display pointer 33 generated based on the display pointer information output from the central control unit 23.
  • the display output unit 27 synthesizes the operation screen 32 from the operation screen generation unit 22 and the main screen 31 from the central control unit 23 and outputs them to the display device 3 via, for example, an HDMI interface.
  • the entire screen area of the display device 3 is referred to as a main screen 31, and the operation screen 32 is described using an example of a screen area smaller than the main screen 31.
  • the operation screen 32 may be configured using the entire screen area of the display device 3.
  • a display pointer 33 is displayed in the operation screen 32. Therefore, the operation screen generation unit 22 outputs information indicating the size of the generated operation screen 32 to the display pointer processing unit 23a, and the display pointer processing unit 23a determines the display position of the display pointer 33 within the operation screen 32.
  • the display device 3 displays information content on the main screen 31 and input operation information on the operation screen 32.
  • the outline of the finite area including the display pointer 33 is shown for convenience of explanation and is not actually displayed on the operation screen 32; objects that fall within the area of the display pointer 33 are highlighted, so that one or more highlighted objects 35 are displayed differently from the other objects.
  • the display output unit 27 compares the position of the object on the operation screen 32 with the position of the display pointer 33 superimposed thereon, identifies the object on which the display pointer 33 overlaps, and highlights the object.
  • highlight display is a mode in which an object on which the display pointer is superimposed is displayed differently from objects on which the display pointer is not superimposed; for example, it may be a blinking display in which the object on which the display pointer is superimposed blinks.
  • the highlighted object 35 is an object that is a candidate for user selection, and the input operation is completed when the user selects the highlighted object 35.
  • This selection operation is performed when the degree of approach between the touch panel 11 and the user instruction medium 13 gradually increases, the size of the display pointer decreases accordingly, and the number of objects on which the display pointer is superimposed becomes one.
  • at this time, the display output unit 27 may output a selection operation signal indicating that one object has been selected to the central control unit 23, and the central control unit 23 may determine that object to be the object selected by the user.
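A sketch of the highlight-and-select decision described above, under assumptions not stated in the text: objects are axis-aligned rectangles, the display pointer is a circle, and "superimposed" is read as circle/rectangle intersection (one later passage instead requires full containment).

```python
import math

def highlighted(objects, center, radius):
    """Return the names of objects the circular display pointer is superimposed on.
    objects: list of (name, x, y, width, height); center: (cx, cy)."""
    cx, cy = center
    hits = []
    for name, x, y, w, h in objects:
        nx = min(max(cx, x), x + w)      # nearest point of the rectangle
        ny = min(max(cy, y), y + h)      # to the pointer centre
        if math.hypot(cx - nx, cy - ny) <= radius:
            hits.append(name)
    return hits

def try_select(objects, center, radius, medium_in_contact):
    """Selection completes when exactly one object is highlighted and the
    instruction medium presses (contacts) the touch surface."""
    hits = highlighted(objects, center, radius)
    return hits[0] if medium_in_contact and len(hits) == 1 else None

# Example layout: two 100x60 objects side by side
objs = [("1CH", 50, 50, 100, 60), ("2CH", 200, 50, 100, 60)]
print(highlighted(objs, (120, 80), 90))        # both fall under a large pointer
print(try_select(objs, (100, 80), 30, True))   # small pointer + contact -> "1CH"
```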
  • FIG. 2B is a diagram illustrating a hardware configuration example of the information content receiving apparatus.
  • the information content receiving device 2 includes a control bus 201 and an I / O bus 210.
  • the control bus 201 is connected to a graphic board 202, a video / audio board 203, a CPU 204, a RAM 205, and a ROM 206.
  • a CPU 204, USB I / F 211, tuner 212, LAN I / F 213, communication unit 215, and power supply unit 214 are connected to the I / O bus 210.
  • An HDMI signal is input to the graphic board 202.
  • the power supply unit 214 supplies power to each unit such as the graphic board 202, the video / audio board 203, the CPU 204, the RAM 205, the ROM 206, the USB I / F 211, the tuner 212, the LAN I / F 213, and the communication unit 215.
  • the USB I / F 211 corresponds to the stored information input / output unit 24 in FIG. 1, and reads information content from the information storage device 4 such as an external HDD having a USB terminal, for example, and transmits it to the CPU 204.
  • the tuner 212 corresponds to the broadcast receiving unit 25, and receives information content transmitted by digital broadcast through the broadcast receiving medium 5 such as an antenna, for example, demodulates and decodes as necessary, and transmits it to the CPU 204.
  • the LAN I / F 213 corresponds to the network transmission / reception unit 26 of FIG. 1 and accesses a desired Web site via the communication network 6 to acquire information content.
  • the CPU 204 corresponds to the central control unit 23 in FIG. 1, and decodes the encoded video / audio data included in the information content received or acquired by the USB I/F 211, the tuner 212, and the LAN I/F 213, using the video / audio board 203 and the RAM 205.
  • the decoding program is stored in the ROM 206, for example, and the CPU 204 reads the program stored in the ROM 206 and executes the decoding process. Further, the CPU 204 performs frame rate conversion processing, scaling processing, brightness correction processing, contrast enhancement processing, and the like using the video / audio board 203 and / or the RAM 205 as necessary.
  • the communication unit 215 corresponds to the input operation processing unit 21 in FIG. 1 and includes, for example, a Bluetooth I / F for communicating with the input operation device 1 in FIG. 1 by Bluetooth.
  • the communication unit 215 receives the designated position information transmitted from the input operation device 1 by radio, performs demodulation / decoding conforming to the Bluetooth standard or specification, and outputs the demodulated information to the CPU 204. Further, the CPU 204 generates a control signal based on the indicated position information and transmits it to the graphic board 202.
  • the CPU 204 further includes the display pointer processing unit 23a shown in FIG. 1, and executes the process for determining the display position and size of the display pointer 33 described above using the indicated position information from the communication unit 215.
  • the graphic board 202 corresponds to the operation screen generation unit 22 in FIG. 1 and has a part of the function of the display output unit 27.
  • the graphic board 202 generates an operation screen 32 including the object 34 and the display pointer 33 using the control signal from the CPU 204, the indicated position information, and the information on the display pointer 33 from the CPU 204.
  • the graphic board 202 receives the information content received or acquired by the USB I/F 211, the tuner 212, or the LAN I/F 213 and decoded by the CPU 204, uses the information content as the main screen 31, performs a process for combining the main screen 31 with the operation screen 32, and outputs the result to the HDMI I/F 216.
  • the processing of the CPU 204 and the processing of the graphic board 202 are executed using a program stored in the ROM 206.
  • the HDMI I/F 216 has part of the function of the display output unit 27 in FIG. 1; it converts the image obtained by combining the operation screen 32 output from the graphic board 202 with the content into a format or signal form that conforms to the HDMI standard or specification, and outputs it to the display device 3.
  • the HDMI I/F 216 includes a CEC line, and commands or messages may be exchanged with the display device 3 via the CEC line.
  • FIGS. 3A to 3C are diagrams showing the relationship between the touch panel 11 of the input operation device 1 and the operation screen 32 of the display device 3 during a user selection operation according to the first embodiment.
  • FIG. 3A shows a side view of the touch sensor unit.
  • FIG. 3B shows a top view of the touch sensor unit.
  • FIG. 3C shows an operation screen.
  • FIG. 3A shows a vertical distance d between the touch panel 11 provided in the input operation device 1 and the user instruction medium 13.
  • This distance d is a parameter indicating the degree of contact or approach between the touch panel 11 and the user instruction medium 13.
  • the touch panel 11 uses a touch sensor having a configuration for detecting a change in capacitance according to the distance d from the user instruction medium 13.
  • as the distance d decreases, the capacitance value detected by the touch panel 11 increases, which indicates that the user instruction medium 13 is approaching the touch surface.
  • the display pointer processing unit 23a decreases the size of the display pointer 33 as the distance d decreases.
  • the degree of contact or approach between the touch panel 11 and the user instruction medium 13 can be detected not only by the change in capacitance described above but also by measuring the level of pressure that the user instruction medium 13 applies to the touch panel 11.
  • in that case, the degree of contact or approach can be detected by converting the pressure level into the distance d.
  • for example, the change in capacitance may be assigned to positive values of the distance d and the change in pressure to negative values of the distance d, so that both can be represented by a single variable.
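One way to realize the single-variable convention just described (hover depth as a positive d derived from capacitance, press depth as a negative d derived from pressure) is sketched below; the inverse-capacitance model and the scale constants are assumptions, not values from the specification.

```python
def degree_of_contact(capacitance: float, pressure: float,
                      k_cap: float = 1.0, k_press: float = 1.0) -> float:
    """Fold hover distance and contact pressure into one signed distance d.

    - Not pressing: a larger capacitance means the medium is closer, so d is
      modelled as inversely proportional to the capacitance (positive value).
    - Pressing: d becomes negative, with harder presses more negative.
    """
    if pressure > 0.0:
        return -k_press * pressure        # contact state
    if capacitance > 0.0:
        return k_cap / capacitance        # non-contact (hover) state
    return float("inf")                   # instruction medium not detected

print(degree_of_contact(capacitance=4.0, pressure=0.0))   # hover -> 0.25
print(degree_of_contact(capacitance=9.0, pressure=2.0))   # press -> -2.0
```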
  • when the touch panel 11 is viewed from the top, its rectangular area can be represented by the four coordinate points (0, 0), (0, 1), (1, 0), and (1, 1).
  • the indicated position information, which is the position information of the user instruction medium 13 with respect to the touch panel 11, consists of the coordinates (a, b) in the x-y coordinate system on the two-dimensional plane including the touch surface (where 0 ≤ a ≤ 1 and 0 ≤ b ≤ 1) and the distance d from the touch surface.
  • FIG. 3C is a diagram showing the operation screen 32, and is a diagram showing how the indicated position information is used.
  • the area of the operation screen 32 is also represented by four coordinate points (0, 0), (0, s), (t, 0), and (t, s), similar to the touch surface of the touch panel 11.
  • the display pointer processing unit 23a has operation display system information that associates the xy coordinates of the two-dimensional plane including the touch surface with the xa-ya coordinates that specify the position on the screen displayed on the display device 3.
  • the point on the screen of the display device 3 corresponding to the coordinates (a, b) of the indication point 14 acquired from the input operation device 1 is calculated.
  • the point corresponding to the indication point 14 is the center 36 of the display pointer.
  • the center 36 of the display pointer can be represented by the coordinates (A, B) of the area of the operation screen 32.
  • A and B are obtained by the following equation (1): A = t × a, B = s × b ... (1)
  • the operation display system information corresponds to the values of t and s together with equation (1).
  • an arbitrary point on the operation screen 32 can be indicated on the touch panel 11 regardless of hardware restrictions such as the shape of the touch panel 11.
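A minimal sketch of this mapping in Python, treating equation (1) as the linear form given above; the bounds check and the example screen size are illustrative only.

```python
def touch_to_operation_screen(a: float, b: float, t: float, s: float):
    """Map an indication point (a, b) on the touch surface, whose corners are
    (0, 0), (0, 1), (1, 0), (1, 1), to the display-pointer centre (A, B) on the
    operation screen whose corners are (0, 0), (0, s), (t, 0), (t, s)."""
    if not (0.0 <= a <= 1.0 and 0.0 <= b <= 1.0):
        raise ValueError("indication point lies outside the touch surface")
    return t * a, s * b    # assumed form of equation (1)

# Example: an operation screen occupying a 960 x 540 region
print(touch_to_operation_screen(0.25, 0.5, t=960, s=540))   # -> (240.0, 270.0)
```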
  • FIG. 4 is a flowchart showing the flow of processing according to the present embodiment, specifically the flow of processing in the input operation control unit 110 of the input operation device 1 for automatically determining whether or not the operation screen 32 is superimposed on the main screen 31.
  • when the touch panel 11 detects the motion of the user instruction medium 13 (S101 / YES), the input operation control unit 110 generates a detection signal indicating that the user instruction medium 13 has been detected and outputs it to the information content receiving device 2 via the communication unit 120 (S102).
  • when the input operation processing unit 21 of the information content receiving device 2 receives the detection signal, it outputs the detection signal, or a control signal based on the detection signal, to the operation screen generation unit 22 and the central control unit 23.
  • the operation screen generator 22 outputs a display ON signal for turning on the display of the operation screen 32 to the display output unit 27 using a detection signal or a control signal based on the detection signal as a trigger.
  • as a setting for detecting the interval between user operations, a timer built into the input operation device 1, for example, is initialized, and measurement of the elapsed time T since the most recent user operation is started (S103).
  • when a user operation is detected, the input operation control unit 110 transmits operation information indicating the content of the detected user operation to the information content receiving device 2 (S105), and the input operation processing unit 21 of the information content receiving device 2 receives it. Thereafter, processing corresponding to the operation information is executed by the information content receiving device 2; if this operation information is the indicated position information, display pointer display processing is executed.
  • when the elapsed time T exceeds the threshold Td (T > Td), the input operation device 1 sends a display OFF signal to the information content receiving device 2 to turn off the display of the operation screen 32 (S107). The input operation processing unit 21 of the information content receiving device 2 then sends the display OFF signal to the display output unit 27 via the operation screen generation unit 22.
  • as a result, when no operation is being performed, the operation screen 32 is not displayed superimposed on the main screen 31, so viewing of the information content displayed on the main screen 31 is not unnecessarily obstructed by the operation screen 32.
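A compact sketch of the Fig. 4 timing logic on the input-operation-device side (S101 to S107); the message names and the use of time.monotonic() are illustrative, not from the specification.

```python
import time

class OperationScreenTimer:
    """Hide the operation screen when no user operation occurs for Td seconds."""

    def __init__(self, send, td_seconds: float = 5.0):
        self.send = send                  # callback that transmits to the receiver
        self.td = td_seconds
        self.last_op = None               # None: operation screen currently OFF

    def on_medium_detected(self):         # S101 -> S102 / S103
        self.send("DISPLAY_ON")
        self.last_op = time.monotonic()   # start measuring elapsed time T

    def on_user_operation(self, operation_info):   # S105
        self.send(operation_info)
        self.last_op = time.monotonic()   # T is measured from the most recent operation

    def tick(self):                       # called periodically; S107 when T > Td
        if self.last_op is not None and time.monotonic() - self.last_op > self.td:
            self.send("DISPLAY_OFF")
            self.last_op = None

# Example wiring with a stand-in transmitter
timer = OperationScreenTimer(send=print, td_seconds=5.0)
timer.on_medium_detected()
```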
  • FIG. 5 is a flowchart showing a processing flow of the information content receiving device 2 and the display device 3.
  • when the input operation processing unit 21 receives a display ON signal from the input operation device 1 (YES in S201), it outputs the display ON signal to the operation screen generation unit 22, and the operation screen generation unit 22 displays the operation screen 32 superimposed on the main screen 31 via the display output unit 27 (S202). If the input operation processing unit 21 does not receive the display ON signal (NO in S201), it waits.
  • when the input operation processing unit 21 receives the indicated position information (YES in S203), the display pointer processing unit 23a reads the distance d from the indicated position information and determines the size of the display pointer (S204). Since a circular display pointer is used in this embodiment, the display pointer processing unit 23a determines the radius of the display pointer according to the value of the distance d: the radius is increased when the distance d is large and decreased when the distance d is small.
  • the display pointer processing unit 23a then reads the two-dimensional position information of the instruction point 14 from the indicated position information, calculates the point on the operation screen 32 corresponding to the coordinates of the instruction point 14, sets this point as the center 36 of the display pointer 33, and outputs the data to the display output unit 27.
  • the display output unit 27 displays the specified object 34 in a highlighted manner while aligning and displaying the center 36 of the display pointer 33 at a position corresponding to the instruction point 14 on the operation screen 32 (S206).
  • the process then returns to S203 to continue the processing of the display pointer 33 within the operation screen 32. If a display OFF signal is received, the display of the operation screen 32, and with it the display pointer 33, is ended.
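The receiver-side loop of Fig. 5 (S201 to S206) could be organized as below; the message format and the stand-in position handler are assumptions used only to make the sketch runnable.

```python
def receiver_loop(messages, handle_position):
    """Process messages from the input operation device (Fig. 5, S201-S206).

    messages        : iterable of ("DISPLAY_ON",), ("POSITION", a, b, d), or ("DISPLAY_OFF",)
    handle_position : callback that sizes and places the display pointer and
                      highlights the objects under it (S204-S206).
    """
    showing = False
    for msg in messages:
        if not showing:
            if msg[0] == "DISPLAY_ON":           # S201 -> S202
                print("operation screen superimposed on main screen")
                showing = True
            continue                              # otherwise keep waiting
        if msg[0] == "POSITION":                  # S203 -> S204..S206
            _, a, b, d = msg
            handle_position(a, b, d)
        elif msg[0] == "DISPLAY_OFF":             # end display of screen and pointer
            print("operation screen and display pointer hidden")
            showing = False

# Example run with a stand-in position handler
receiver_loop(
    [("DISPLAY_ON",), ("POSITION", 0.3, 0.4, 0.8), ("POSITION", 0.3, 0.4, 0.1), ("DISPLAY_OFF",)],
    handle_position=lambda a, b, d: print(f"pointer at ({a}, {b}) with degree {d}"),
)
```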
  • FIG. 6 is a diagram showing two operation screens 32 when the distance d between the touch panel 11 and the user instruction medium 13 is different.
  • FIG. 6A shows a case where the distance d is relatively large, and FIG. 6B shows a case where the distance d is relatively small.
  • the area of the display pointer 33a is relatively large, and a plurality of objects 34 are included in the area of the display pointer 33a.
  • among these objects 34, there are a plurality of highlighted objects 35.
  • the object 34 that is entirely included in the area of the display pointer 33a is the highlighted object 35.
  • the user brings the user instruction medium 13 closer to the touch panel 11 in order to select a desired object. This is shown in FIG. 6B.
  • in this state, the user has brought the user instruction medium 13 closer to the touch panel 11, and the area of the display pointer 33b is small. As a result of adjusting the position of the user instruction medium 13 so that the highlighted object 35 becomes the desired object, only one highlighted object 35 remains within the area of the display pointer 33b.
  • the user further presses the user instruction medium 13 in contact with the touch panel 11 to select the highlighted object 35.
  • in this way, the size of the area of the display pointer 33 changes according to the user's selection operation, the display pointer 33 is narrowed down as it approaches the target, and coordination with the user's operation is further improved.
  • note that when only one object is included in the display pointer 33b, that object may be selected.
  • FIG. 7A is a diagram showing a menu operation screen in a state where terrestrial waves are selected.
  • FIG. 7B is a diagram showing a terrestrial broadcast operation screen.
  • on the menu operation screen 32a, objects 34 are arranged for selecting terrestrial broadcasting, for selecting satellite broadcasting, for selecting a BD (Blu-ray Disc) / DVD (Digital Versatile Disk) device, and for selecting contents to be displayed such as news, movies, music, and photos.
  • the object 35 highlighted by the display pointer 33 is selected.
  • This object 35 is a terrestrial broadcast object.
  • the terrestrial broadcast application operation screen 32b shown in FIG. 7B is displayed, that is, the menu operation screen 32a is changed to the terrestrial broadcast application operation screen 32b.
  • a channel selection object of 1CH to 12CH and an object that can directly select a program guide or data broadcast are arranged.
  • the objects indicated by the arrow symbols (previous / next) page to the previous page or the next page; by using them, the number of objects that can be displayed at one time can be limited to improve visibility for the user.
  • objects indicated by these arrow symbols are also present in FIG. 7A.
  • FIG. 8 is also a diagram showing an example of the operation screen.
  • FIG. 8A shows a state where SNS is selected on the menu operation screen 32a, and FIG. 8B shows a state where the SNS (Social Network Service) login screen is displayed on the main screen 31.
  • an SNS login screen 31a appears on the main screen 31, as shown in FIG. 8B.
  • the menu operation screen 32a is automatically hidden.
  • on the main screen 31, character input frames 37 such as an ID (Identification) input frame and a password input frame, a login execution button 38, and a cursor 39 used to select the character input frames 37 and the execution button 38 are arranged.
  • the SNS login screen 31a is similar to a screen used in a PC (Personal Computer) or the like, and the character input frame 37, the execution button 38, and the cursor 39 have the same functions.
  • characters are input by the input key 12 of the input operation device 1.
  • the cursor 39 can be moved by moving the user instruction medium 13 on the touch panel 11 as described above. At this time, the cursor may be moved based on the relative movement information of the user instruction medium 13, without making the coordinate systems of the touch panel 11 and the main screen 31 identical (see the sketch below).
  • by contrast, the coordinate representation of the area of the operation screen 32 has a one-to-one correspondence with the coordinate representation of the area of the touch panel 11, and the center 36 of the display pointer is determined to lie at the single coordinate position corresponding to the indication point 14.
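For the login-screen cursor, the text allows moving the cursor by relative motion rather than by the one-to-one mapping used for the display pointer; below is a minimal sketch, with the gain factor and screen size as assumptions.

```python
def move_cursor(cursor, previous_touch, current_touch, gain=600.0,
                screen_w=1920, screen_h=1080):
    """Move the cursor 39 by the relative motion of the instruction medium.

    cursor, previous_touch, current_touch are (x, y) pairs; touch coordinates
    are in the 0..1 touch-surface system, the cursor in screen pixels."""
    dx = (current_touch[0] - previous_touch[0]) * gain
    dy = (current_touch[1] - previous_touch[1]) * gain
    x = min(max(cursor[0] + dx, 0), screen_w - 1)   # keep the cursor on screen
    y = min(max(cursor[1] + dy, 0), screen_h - 1)
    return (x, y)

print(move_cursor((400, 300), (0.50, 0.50), (0.55, 0.40)))   # -> (430.0, 240.0)
```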
  • the size of the display pointer 33 changes according to the degree of contact or approach between the touch panel 11 and the user instruction medium 13, that is, the distance d from the upper surface of the touch panel 11 to the user instruction medium 13.
  • the display pointer when the distance d is relatively far is indicated by reference numeral 33a
  • the display pointer when the distance d is relatively close is indicated by reference numeral 33b.
  • the display pointer 33 increases when the distance d is long, and decreases when the distance d is short.
  • the degree of contact or approach between the touch panel 11 and the user instruction medium 13 is represented by the size of the display pointer 33 on the operation screen 32, and gives intuitive visibility to the user.
  • the shape of the display pointer 33 has been described as being based on a circle, but other shapes, such as polygon-based shapes or the shape of an animated character, may be more familiar to the user. Further, the display pointer 33 is preferably translucent, as shown in the figures, and colored so as to be easily distinguished from the background.
  • when the degree of contact or approach between the touch panel 11 and the user instruction medium 13 is intermediate, several objects 34 are included in the area of the display pointer 33.
  • an object close to the center 36 of the display pointer is highlighted as the highlighted object 35, so that the user can easily understand whether the object is a desired object.
  • when the highlighted object 35 is the desired object, the user performs a selection operation by pressing the user instruction medium 13 onto the touch panel 11. If the highlighted object 35 is not the desired object, the user moves the user instruction medium 13 on the touch panel 11 so that the desired object becomes the highlighted object 35.
  • by bringing the user instruction medium 13 closer to the touch panel 11, the size of the display pointer 33 can be reduced.
  • the highlighted object 35 can be displayed in a different display form, for example by changing its color or giving it a 3D appearance, which is useful for clearly showing the selection candidate to the user.
  • with the display pointer 33 and the highlighted object 35, the user can perform the selection operation while viewing the operation screen 32 of the display device 3; there is no need to look directly at the touch panel 11 or the user instruction medium 13.
  • moreover, the operation screen 32 is displayed superimposed on the main screen 31 on which the information content is displayed, so the user does not need to take their eyes off the information content during the selection operation.
  • the present embodiment is also effective against shaking of the user instruction medium 13.
  • shaking caused by the user's own movement directly shakes the user instruction medium 13, and as a result the display pointer 33 fluctuates.
  • however, because the highlighted object 35 is presented to the user, the same object can continue to be highlighted as long as the amount of fluctuation or movement stays within the range of the object size.
  • for example, the magnitude of the shake or movement may be compared with the size of the object, and the highlighted object 35 may be changed only when the shake or movement is larger than the object.
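The jitter tolerance described above might be implemented as a simple hysteresis rule: keep the current highlighted object unless the pointer centre has moved by more than roughly the object size. The comparison metric below is an assumption.

```python
import math

def stable_highlight(current_highlight, candidate, center, last_center, object_size):
    """Switch the highlighted object only when the pointer centre has moved
    farther than the object size since the current highlight was set."""
    if current_highlight is None:
        return candidate, center
    moved = math.hypot(center[0] - last_center[0], center[1] - last_center[1])
    if candidate != current_highlight and moved > object_size:
        return candidate, center           # genuine move: follow the new object
    return current_highlight, last_center  # small shake: keep the same highlight

# A 12-pixel tremor does not change the highlight of an 80-pixel object
print(stable_highlight("1CH", "2CH", (212, 80), (200, 80), 80))   # -> ('1CH', (200, 80))
```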
  • FIG. 9 is a diagram illustrating the input operation device according to the second embodiment.
  • FIG. 10 is a flowchart according to the input operation processing unit 21 according to the second embodiment.
  • FIG. 9 shows the input operation device 1a used in the second embodiment, which adds a holding sensor 15 to the input operation device 1 shown in FIG. 1.
  • the holding sensor 15 is installed outside the main body 10 of the input operation device 1a and is realized by, for example, a pressure sensor.
  • the holding sensor 15 detects the action of the user grasping the input operation device 1a with the intention of operating the information content receiving device 2.
  • the holding sensor 15 detects the contact and outputs a detection signal.
  • the detection signal is output to the CPU 101 (see FIG. 2A).
  • the holding sensor 15 is configured to continue outputting the detection signal, or to output it at a constant period, while it detects that the user is holding the input operation device 1a. For this reason, while the user is holding the input operation device 1a, holding information is transmitted continuously or periodically from the input operation device 1a to the information content receiving device 2.
  • FIG. 10 is a flowchart showing a processing operation in the present embodiment of the input operation processing unit 21 of the information content receiving device 2.
  • when the input operation processing unit 21 receives the holding information from the input operation device 1a while the display of the operation screen 32 is OFF (S108a / YES), it sends a display ON signal to the display output unit 27 via the operation screen generation unit 22 to turn on the display of the operation screen 32, and the operation screen 32 is displayed (S112).
  • when the transmission of the holding detection signal from the input operation device 1a is interrupted (S108b / NO), the input operation processing unit 21 outputs a display OFF signal to the display output unit 27 via the operation screen generation unit 22 and hides the operation screen 32 (S117).
  • as a result, when no user operation is being performed, the operation screen 32 is not displayed superimposed on the main screen 31, and part of the information content is no longer unnecessarily hidden from view by the operation screen.
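A sketch of this second-embodiment rule on the receiver side: the operation screen is shown while holding information keeps arriving and hidden once it stops. The timeout used to decide that transmission has been interrupted is an assumption.

```python
import time

class HoldControlledScreen:
    """Show the operation screen while the remote reports that it is being held."""

    def __init__(self, show, hide, timeout=1.0):
        self.show, self.hide, self.timeout = show, hide, timeout
        self.last_hold = None             # time of the last holding report

    def on_holding_info(self):            # S108a: holding information received
        if self.last_hold is None:
            self.show()                   # S112: turn the operation screen ON
        self.last_hold = time.monotonic()

    def tick(self):                       # S108b: has transmission been interrupted?
        if self.last_hold is not None and time.monotonic() - self.last_hold > self.timeout:
            self.hide()                   # S117: hide the operation screen
            self.last_hold = None

screen = HoldControlledScreen(show=lambda: print("screen ON"),
                              hide=lambda: print("screen OFF"))
screen.on_holding_info()
```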
  • FIG. 11 is a diagram illustrating an input operation device according to the third embodiment.
  • in FIG. 11, the same components as those of the input operation device 1 shown in FIG. 1 are denoted by the same reference numerals.
  • the input operation device 1b includes a display panel 16 with a touch panel and a home button 17.
  • the touch panel 11 and the input key 12 are configured on one display panel 16 with a touch panel.
  • the touch panel-equipped display panel 16 is, for example, electronic paper or an LCD (Liquid Crystal Display) with a built-in touch cell.
  • the touch panel 11 and the input keys 12 are arranged in respective areas into which the display panel 16 with a touch panel is divided.
  • the upper part is an area where the input keys 12 are arranged, and the lower part is an area corresponding to the touch panel 11.
  • the touch panel of the display panel 16 with a touch panel is provided uniformly over the entire upper surface area of the input operation device 1b, so the area where the input keys 12 are arranged and the area corresponding to the touch panel 11 can be created as one integrated touch panel. For this reason, this embodiment can implement the input keys 12 and the touch panel 11 with a single touch panel.
  • the home button 17 is provided to make a transition to a higher screen when the input operation device 1b is applied to other applications.
  • in the above description, the input operation device 1, the information content receiving device 2, and the display device 3 have been described as separate devices. However, the embodiment of the present invention is not limited to configuring the three devices separately as described above; for example, the information content receiving device 2 and the display device 3 may be integrated.
  • FIG. 12 is a diagram showing an information content receiving system in which a tablet terminal and a projector are connected.
  • such an information content display system can be used when displaying information content produced with the tablet terminal 7, for example moving images, drawings, and illustrations captured or created with the tablet terminal 7, on the projector 8.
  • the tablet terminal 7 shown in FIG. 12 includes a touch panel that can measure capacitance.
  • the tablet terminal 7 has the same configuration as that of the input operation device 1 and the information content receiving device 2 in FIG. 1, and receives and stores information content. Then, the tablet terminal 7 and the projector 8 may be connected to each other by wireless communication, for example, Bluetooth (registered trademark), and information based on the input operation received by the tablet terminal 7 may be transmitted to the projector 8 and displayed.
  • the tablet terminal 7 corresponds to a device in which the input operation device 1 and the information content receiving device 2 are configured as an integrated device, and the projector 8 corresponds to the display device 3.
  • each of the functions described above may be realized by software, by having a microprocessor unit or the like interpret and execute an operation program that realizes each function.
  • hardware and software may be used together.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An input operation method using an instruction medium 13 for inputting a user's instruction and a touch panel 11 for detecting static capacitance that corresponds to the degree of contact with a touch surface, said method acquiring instructed position information that includes two-dimensional information in a touch plane pertaining to an instructed point 14 at which the touch panel 11 has detected the instruction medium and information that indicates the degree of contact of the instruction medium 13 with the touch plane, determining the size of a display pointer 33 on the basis of the information that indicates the degree of contact, referring to operation display information in which the two-dimensional coordinate system of a display screen and a two-dimensional coordinate system in the touch plane are correlated, and displaying two-dimensional position information pertaining to the instructed point 14, with the center of the display pointer of the determined size adjusted to a point on the display screen that is converted into the two-dimensional coordinate system.

Description

Input operation method, input operation device, display control device, and information content receiving system
 The present invention relates to an input operation method, an input operation device, a display control device, and an information content receiving system, and more particularly to an input operation technique using a touch panel.
 As background art in this technical field, Patent Document 1 describes an input operation device in which "a processing unit displays image data on a display device, and displays a cursor at the position of one of a plurality of objects arranged in a display area of the image data. A control unit detects the movement of the designated position on the touch panel from the start to the end of a drag operation on the touch panel, and determines the movement direction of the cursor based on the detected movement of the designated position. The processing unit selects one of the plurality of objects based on the movement direction of the cursor output from the control unit, and displays the position of the selected object with the cursor" (summary excerpt).
International Publication No. 2015/108155
 In Patent Document 1, a touch-down operation "down" registers the starting point of the touch position, and in a drag operation "move", the direction and distance by which the user's instruction medium has moved across the touch panel are calculated according to a predetermined procedure, after which the cursor moves between objects on the display screen. The determination of object movement is made for each adjacent object, and multiple determinations are required to move to an arbitrary object.
 Such an operation requires multiple drag operations, calculations, and determinations before the cursor reaches a distant object desired by the user, so there is a problem with responsiveness to user instructions.
 The present invention has been made in view of the above points, and an object thereof is to provide an input operation technique that realizes a simple and intuitive object selection method.
 In order to solve the above problems, the present invention has the configuration described in the claims. As one example, the present invention is an input operation method for displaying, on a display screen, a display pointer indicating the range targeted by an input operation, using a touch panel that detects a capacitance corresponding to the degree of contact between an instruction medium for inputting a user's instruction and a touch surface. The method includes: a step of acquiring indicated position information that specifies the position of the instruction medium with respect to the touch surface, the indicated position information including two-dimensional position information, expressed in a two-dimensional coordinate system within the touch surface, of the indication point at which the instruction medium is detected, and information indicating the degree of contact between the instruction medium and the touch surface when the instruction medium is detected; a step of determining, based on the information indicating the degree of contact, the size of the display pointer indicating the range targeted by the input operation on the display screen; and a step of referring to operation display system information that associates the two-dimensional coordinate system of the display screen with the two-dimensional coordinate system within the touch surface, and displaying the display pointer of the determined size with its center aligned to the point obtained by converting the two-dimensional position information of the indication point into the two-dimensional coordinate system of the display screen.
 According to the present invention, it is possible to provide an input operation technique that realizes a simple and intuitive object selection method. The purpose, configuration, and effects other than those described above will be clarified below.
 Brief description of the drawings:
 FIG. 1 is a block diagram showing a schematic configuration of the information content receiving system according to the first embodiment.
 FIG. 2A is a schematic configuration diagram of the input operation device 1.
 FIG. 2B is a hardware configuration diagram of the information content receiving device 2.
 FIG. 3A is a side view of the touch sensor unit during a user selection operation.
 FIG. 3B is a top view of the touch sensor unit during a user selection operation.
 FIG. 3C is a diagram showing the operation screen during a user selection operation.
 FIG. 4 is a flowchart showing the flow of processing according to the first embodiment, in which whether or not to superimpose the operation screen on the main screen is determined automatically.
 FIG. 5 is a flowchart showing the flow of processing of the information content receiving device and the display device.
 FIG. 6A is a diagram showing the operation screen when the distance between the touch panel and the user instruction medium is relatively large.
 FIG. 6B is a diagram showing the operation screen when the distance between the touch panel and the user instruction medium is relatively small.
 FIG. 7A is a diagram showing the menu operation screen with terrestrial broadcasting selected.
 FIG. 7B is a diagram showing the terrestrial broadcast operation screen.
 FIG. 8A is a diagram showing the menu operation screen with SNS selected.
 FIG. 8B is a diagram showing the screen with the SNS login screen displayed on the main screen.
 FIG. 9 is a diagram showing the input operation device according to the second embodiment.
 FIG. 10 is a flowchart relating to the input operation processing unit according to the second embodiment.
 FIG. 11 is a diagram showing the input operation device according to the third embodiment.
 FIG. 12 is a diagram showing an information content receiving system in which a tablet terminal and a projector are connected.
 Hereinafter, embodiments of the present invention will be described with reference to the drawings.
<First embodiment>
FIG. 1 is a block diagram showing a schematic configuration of an information content receiving system according to the first embodiment. FIG. 2A is a schematic configuration diagram of the input operation device 1. FIG. 2B is a hardware configuration diagram of the information content receiving device 2. The information content receiving system according to the present embodiment includes an input operation device 1, an information content receiving device 2, and a display device 3, which are connected by wired communication or wireless communication. In this embodiment, it is assumed that a wireless communication connection is established. Although the input operation device 1 has a function as a remote controller for the information content receiving device 2, the input operation device 1 of the present embodiment includes a touch panel 11 in addition to the input keys 12 to enable various operations. .
 図2Aに示すように、入力操作装置1は、入力操作制御部110及び外部装置と情報の送受信を行う通信部120を本体10内に収容して構成される。 As shown in FIG. 2A, the input operation device 1 is configured by accommodating in the main body 10 an input operation control unit 110 and a communication unit 120 that transmits and receives information to and from an external device.
 入力操作制御部110は、CPU101、ROM102、RAM103、及びI/F105がバス106に接続されて構成される。I/F105には、タッチパネル11、入力キー12、及び通信部120が接続される。通信部120は、例えばBluetooth(登録商標)により外部の機器(本実施形態では情報コンテンツ受信装置2)と信号の送受信を行うためのBluetoothインタフェースを含んでいてもよい。 The input operation control unit 110 is configured by connecting a CPU 101, a ROM 102, a RAM 103, and an I / F 105 to a bus 106. A touch panel 11, input keys 12, and a communication unit 120 are connected to the I / F 105. The communication unit 120 may include a Bluetooth interface for transmitting / receiving a signal to / from an external device (in this embodiment, the information content receiving device 2) using, for example, Bluetooth (registered trademark).
 CPU101は、入力操作装置1を制御するための制御部であり、例えばROM102に予め格納されたプログラムに従って所定の処理を実行する。この所定の処理の一つは、例えば、タッチパネル11及び/又は入力キー12に対するユーザの操作を判別または解析し、この操作に応じた情報を生成する処理である。CPU101で生成した情報は、I/F105を介して通信部120に出力され、通信部120より情報コンテンツ受信装置2へBluetooth等を用いて無線により送信される。タッチパネル11及び/又は入力キー12に対する操作の情報、及び/またはCPU101で生成した情報を、I/F105を介してRAM103に出力し、このRAM103に一時的に記憶しておいてもよい。 The CPU 101 is a control unit for controlling the input operation device 1 and executes predetermined processing according to a program stored in advance in the ROM 102, for example. One of the predetermined processes is, for example, a process of discriminating or analyzing a user operation on the touch panel 11 and / or the input key 12 and generating information corresponding to the operation. The information generated by the CPU 101 is output to the communication unit 120 via the I / F 105, and is transmitted from the communication unit 120 to the information content receiving device 2 wirelessly using Bluetooth or the like. Information on operations on the touch panel 11 and / or the input keys 12 and / or information generated by the CPU 101 may be output to the RAM 103 via the I / F 105 and temporarily stored in the RAM 103.
 本体10の上面には、タッチパネル11(タッチセンサに相当する)と一つ以上の入力キー12を備える(図1参照)。 A touch panel 11 (corresponding to a touch sensor) and one or more input keys 12 are provided on the upper surface of the main body 10 (see FIG. 1).
 The touch panel 11 acquires two-dimensional position information of an indication point 14 (indicated by X in the figure; the X is shown only for convenience of explanation and is not actually provided on the input operation device 1) on the touch panel 11 indicated by a user instruction medium 13 (for example, a finger or a stylus pen) used to convey the user's operation intention, and the input operation device 1 transmits the acquired two-dimensional position information to the information content receiving device 2. This two-dimensional position information is expressed in a two-dimensional coordinate system of the touch surface of the touch panel 11. The indication point 14 may indicate either a position at which the touch panel 11 detects the user instruction medium 13 while the user instruction medium 13 is not in contact with the touch panel 11, or a position at which the touch panel 11 detects the user instruction medium 13 while the user instruction medium 13 is in contact with the touch panel 11.
 The touch panel 11 is configured using a touch sensor that detects a capacitance corresponding to the degree of contact or approach between the user instruction medium 13 and the touch surface, and outputs information indicating the degree of contact or approach between the user instruction medium 13 and the touch surface at the time the user instruction medium 13 is detected. This degree of contact or approach covers both the magnitude of the separation distance between the touch panel 11 and the user instruction medium 13 in the non-contact state and the contact state itself. The input operation method according to this embodiment is carried out by detecting the action of bringing the user instruction medium 13 close to, and then into contact with, the touch panel 11.
 上記指示ポイント14のタッチ面内の2次元座標系及び接触或いは接近の度合いを示す情報を合わせて指示位置情報と称する。入力操作制御部110は、タッチパネル11が出力した指示位置情報を通信部120へ出力し、通信部120は情報コンテンツ受信装置2へ指示位置情報を送信する。 The two-dimensional coordinate system in the touch surface of the indication point 14 and information indicating the degree of contact or approach are collectively referred to as indication position information. The input operation control unit 110 outputs the designated position information output from the touch panel 11 to the communication unit 120, and the communication unit 120 transmits the designated position information to the information content receiving device 2.
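 For illustration, the following Python sketch shows one way the indication position information described above might be packaged before being handed to the communication unit 120; the class name, field names, and JSON encoding are assumptions made for this example and are not prescribed by the embodiment.

```python
from dataclasses import dataclass
import json

@dataclass
class IndicationPosition:
    """Indication position information sent from the input operation device 1.

    x, y      : coordinates of the indication point 14 in the two-dimensional
                coordinate system of the touch surface (0 <= x, y <= 1)
    proximity : information indicating the degree of contact or approach
                between the user instruction medium 13 and the touch surface
    """
    x: float
    y: float
    proximity: float

    def to_payload(self) -> bytes:
        # Hypothetical serialization before handing the data to the
        # communication unit 120 (e.g. over a Bluetooth link).
        return json.dumps({"x": self.x, "y": self.y, "p": self.proximity}).encode()
```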
 情報コンテンツ受信装置2は、入力操作装置1のほか、情報蓄積装置4、放送波を受信する放送受信媒体5、通信ネットワーク6、及び表示装置3と通信接続される。そして情報コンテンツ受信装置2は、入力操作装置1の指示に従い、情報蓄積装置4、放送受信媒体5、通信ネットワーク6からの少なくとも一つの情報コンテンツを受信し、後述する処理を行った後、情報コンテンツを表示装置3に表示させる。 The information content receiver 2 is connected to the information storage device 4, the broadcast receiving medium 5 that receives broadcast waves, the communication network 6, and the display device 3 in addition to the input operation device 1. Then, the information content receiving device 2 receives at least one information content from the information storage device 4, the broadcast receiving medium 5, and the communication network 6 in accordance with the instruction of the input operation device 1, performs the processing described later, Is displayed on the display device 3.
 Broadly, the information content receiving device 2 includes a stored information input/output unit 24, a broadcast receiving unit 25, and a network transmitting/receiving unit 26, which constitute the functional units that receive information content, and a display control device that controls the display of the display pointer and objects described later as well as of the information content.
 上記表示制御装置は、例えば、入力操作処理部21、操作画面生成部22、中央制御部23、及び表示出力部27を含んで構成される。 The display control device includes, for example, an input operation processing unit 21, an operation screen generation unit 22, a central control unit 23, and a display output unit 27.
 The input operation processing unit 21 receives the indication position information of the indication point 14 transmitted wirelessly from the input operation device 1 using, for example, Bluetooth, and demodulates the modulation applied for the wireless transmission. If the indication position information is encoded, it also performs decoding. The input operation processing unit 21 outputs the demodulated and decoded indication position information to the display pointer processing unit 23a of the central control unit 23, and outputs a control signal, generated in response to or based on the reception of the indication position information, to the operation screen generation unit 22.
 蓄積情報入出力部24は、情報蓄積装置4に蓄えられた情報コンテンツを読み出し、該情報コンテンツを、中央制御部23に送る。 The stored information input / output unit 24 reads the information content stored in the information storage device 4 and sends the information content to the central control unit 23.
 As an example, the information content is a moving image, a photograph, or audio. Because such content contains a large amount of information in its original form and is not suitable for storage, it is generally subjected to compression coding, for example as specified by MPEG (Moving Picture Experts Group) for moving images and JPEG (Joint Photographic Experts Group) for photographs.
 中央制御部23は、符号圧縮された情報コンテンツの復号伸張処理を行い、原情報の形態に復元して表示出力部27に出力する。 The central control unit 23 performs a decoding / decompression process on the code-compressed information content, restores the original information form, and outputs it to the display output unit 27.
 放送受信部25は、放送受信媒体5より地上波及び衛星波の放送信号を受信し、復調処理し、MPEG等で符号圧縮された情報コンテンツを得る。中央制御部23は、該情報コンテンツを原情報の形態に復元して、表示出力部27に出力する。 The broadcast receiving unit 25 receives terrestrial and satellite broadcast signals from the broadcast receiving medium 5, demodulates them, and obtains information content code-compressed by MPEG or the like. The central control unit 23 restores the information content to the form of the original information and outputs it to the display output unit 27.
 ネットワーク送受信部26は、HTTP(Hypertext Transfer Protocol)やRTP(Real-time Transport Protocol)などのよく知られた通信プロトコルに従い、通信ネットワーク6から符号圧縮された情報コンテンツを受信する。中央制御部23は、該情報コンテンツを原情報の形態に復元して、表示出力部27に出力する。 The network transmission / reception unit 26 receives information content that is code-compressed from the communication network 6 according to a well-known communication protocol such as HTTP (Hypertext Transfer Protocol) or RTP (Real-time Transport Protocol). The central control unit 23 restores the information content to the form of the original information and outputs it to the display output unit 27.
 中央制御部23は、放送受信部25やネットワーク送受信部26で得る情報コンテンツをそのままもしくは再符号化して蓄積情報入出力部24を経由して、情報蓄積装置4に蓄積させることもできる。 The central control unit 23 can also store the information content obtained by the broadcast receiving unit 25 and the network transmission / reception unit 26 as they are or re-encode them in the information storage device 4 via the stored information input / output unit 24.
 さらに中央制御部23は、動画像、写真、音声以外の情報コンテンツを取り扱うこともできる。例えば、中央制御部23はネットワーク送受信部26、通信ネットワーク6を介し、HTML(Hyper Text Markup Language)などのウェブ情報を処理する。中央制御部23は該処理されたウェブ情報も、表示出力部27に出力する。 Furthermore, the central control unit 23 can handle information contents other than moving images, photographs, and sounds. For example, the central control unit 23 processes web information such as HTML (Hyper Text Markup Language) via the network transmission / reception unit 26 and the communication network 6. The central control unit 23 also outputs the processed web information to the display output unit 27.
 The central control unit 23 includes a display pointer processing unit 23a. The display pointer processing unit 23a determines the display position of the display pointer within the display screen of the display device 3 based on the indication position information output from the input operation processing unit 21. The display pointer processing unit 23a also determines the size of the display pointer based on the information indicating the degree of contact or approach contained in the indication position information. Here, the display pointer 33 indicates the range within the display screen selected by the user through the input operation. Furthermore, the display pointer processing unit 23a refers to operation display system information that associates the two-dimensional coordinate system of the display screen of the display device 3 with the two-dimensional coordinate system of the surface of the touch panel 11, and outputs to the operation screen generation unit 22 display pointer information for displaying the display pointer 33 of the determined size with its center 36 aligned with the point obtained by converting the two-dimensional position information of the indication point 14 into the two-dimensional coordinate system of the display screen. The area of the screen covered by the display pointer 33 is the range that the user targets with the input operation.
 操作画面生成部22は、入力操作処理部21から出力された制御信号をトリガとして、複数のオブジェクト34が配列されたベース画面を生成する。更に、操作画面生成部22は、このベース画面に、中央制御部23から出力された表示ポインタ情報に基づいて生成した表示ポインタ33を合成して操作画面32を生成する。 The operation screen generation unit 22 generates a base screen on which a plurality of objects 34 are arranged using the control signal output from the input operation processing unit 21 as a trigger. Further, the operation screen generation unit 22 generates an operation screen 32 by combining the base screen with the display pointer 33 generated based on the display pointer information output from the central control unit 23.
 The display output unit 27 combines the operation screen 32 from the operation screen generation unit 22 with the main screen 31 from the central control unit 23 and outputs the result to the display device 3 via, for example, an HDMI interface. In this embodiment, the entire screen area of the display device 3 is referred to as the main screen 31, and the operation screen 32 is described as occupying a screen area smaller than the main screen 31; however, the operation screen 32 may be the same size as the main screen 31, that is, it may occupy the entire screen area of the display device 3. In this embodiment the display pointer 33 is displayed within the operation screen 32. The operation screen generation unit 22 therefore outputs information indicating the size of the generated operation screen 32 to the display pointer processing unit 23a, and the display pointer processing unit 23a determines the display position of the display pointer 33 within the operation screen 32.
 表示装置3は、情報コンテンツを主画面31に、入力操作情報を操作画面32に表示する。 The display device 3 displays information content on the main screen 31 and input operation information on the operation screen 32.
 On the operation screen 32, one or more objects 34, each corresponding to a predetermined process, are arranged, and a display pointer 33 of finite area is displayed whose center 36 (indicated by X in the figure; the X is shown only for convenience of explanation and is not actually displayed on the operation screen 32) corresponds to the indication point 14 of the input operation device 1. Among the objects lying within the area of the display pointer 33, one or more are shown as highlighted objects 35, displayed differently from the other objects. The display output unit 27 compares the positions of the objects on the operation screen 32 with the position of the display pointer 33 superimposed on them, identifies the objects overlapped by the display pointer 33, and highlights them. Highlighting is one way of displaying an object overlapped by the display pointer in a display mode different from objects not overlapped by it; another example is a blinking display in which the overlapped object flashes.
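 As a concrete illustration of the overlap test described above, the following Python sketch is one assumed implementation that treats each object as an axis-aligned rectangle and highlights the objects lying entirely inside the circular display pointer 33 (the criterion used later in FIG. 6A); the data structures and function name are hypothetical.

```python
import math
from typing import List, Tuple

Rect = Tuple[float, float, float, float]  # (left, top, width, height) of an object 34

def objects_to_highlight(objects: List[Rect],
                         center: Tuple[float, float],
                         radius: float) -> List[int]:
    """Return the indices of objects wholly contained in the circular display pointer."""
    cx, cy = center
    highlighted = []
    for i, (left, top, w, h) in enumerate(objects):
        corners = [(left, top), (left + w, top), (left, top + h), (left + w, top + h)]
        # An axis-aligned rectangle lies inside the circle iff all four corners do.
        if all(math.hypot(px - cx, py - cy) <= radius for px, py in corners):
            highlighted.append(i)
    return highlighted
```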
 ハイライトされたオブジェクト35は、ユーザの選択対象候補となるオブジェクトであり、該ハイライトされたオブジェクト35をユーザが選択することにより入力操作が完了する。 The highlighted object 35 is an object that is a candidate for user selection, and the input operation is completed when the user selects the highlighted object 35.
 This selection operation may proceed as follows: as the degree of approach between the touch panel 11 and the user instruction medium 13 gradually increases, the size of the display pointer decreases accordingly, and when only one object remains overlapped by the display pointer, the display output unit 27 outputs to the central control unit 23 a selection operation signal indicating that this single object has been selected, and the central control unit 23 determines that object to be the object selected by the user.
 図2Bは情報コンテンツ受信装置のハードウェア構成例を示す図である。情報コンテンツ受信装置2は、制御バス201とI/Oバス210とを備える。制御バス201には、グラフィックボード202、映像音声ボード203、CPU204、RAM205、ROM206が接続される。I/Oバス210には、CPU204、USBI/F211、チューナ212、及びLANI/F213と通信部215、及び電源ユニット214が接続されて構成される。グラフィックボード202には、HDMI信号が入力される。 FIG. 2B is a diagram illustrating a hardware configuration example of the information content receiving apparatus. The information content receiving device 2 includes a control bus 201 and an I / O bus 210. The control bus 201 is connected to a graphic board 202, a video / audio board 203, a CPU 204, a RAM 205, and a ROM 206. A CPU 204, USB I / F 211, tuner 212, LAN I / F 213, communication unit 215, and power supply unit 214 are connected to the I / O bus 210. An HDMI signal is input to the graphic board 202.
 電源ユニット214は、グラフィックボード202、映像音声ボード203、CPU204、RAM205、ROM206、USBI/F211、チューナ212、LANI/F213、通信部215等の各部に電源を供給する。USBI/F211は、図1の蓄積情報入出力部24に対応するものであり、例えばUSB端子を備えた外付けHDD等の情報蓄積装置4から情報コンテンツを読み出してCPU204に送信する。チューナ212は放送受信部25に対応するものであり、例えばアンテナ等の放送受信媒体5を通してデジタル放送で送信された情報コンテンツを受信し、復調及び必要に応じ復号してCPU204に送信する。LANI/F213は、図1のネットワーク送受信部26に対応するものであり、通信ネットワーク6を介して所望のWebサイトにアクセスして情報コンテンツを取得する。 The power supply unit 214 supplies power to each unit such as the graphic board 202, the video / audio board 203, the CPU 204, the RAM 205, the ROM 206, the USB I / F 211, the tuner 212, the LAN I / F 213, and the communication unit 215. The USB I / F 211 corresponds to the stored information input / output unit 24 in FIG. 1, and reads information content from the information storage device 4 such as an external HDD having a USB terminal, for example, and transmits it to the CPU 204. The tuner 212 corresponds to the broadcast receiving unit 25, and receives information content transmitted by digital broadcast through the broadcast receiving medium 5 such as an antenna, for example, demodulates and decodes as necessary, and transmits it to the CPU 204. The LAN I / F 213 corresponds to the network transmission / reception unit 26 of FIG. 1 and accesses a desired Web site via the communication network 6 to acquire information content.
 The CPU 204 corresponds to the central control unit 23 of FIG. 1, and decodes the encoded video/audio data contained in the information content received or acquired by the USB I/F 211, the tuner 212, and the LAN I/F 213, using the video/audio board 203 and the RAM 205. The program for this decoding is stored, for example, in the ROM 206, and the CPU 204 reads the program stored in the ROM 206 and executes the decoding process. The CPU 204 also performs frame rate conversion, scaling, brightness correction, contrast enhancement, and similar processing as necessary, using the video/audio board 203 and/or the RAM 205.
 通信部215は、図1の入力操作処理部21に対応するものであり、例えばBluetoothにより図1の入力操作装置1と通信するためのBluetooth I/Fを含んでいる。そして通信部215は、入力操作装置1から無線により送信された指示位置情報を受信し、Bluetoothの規格或いは仕様に適合した復調・復号を行ってCPU204に出力する。CPU204は更に、この指示位置情報に基づき制御信号を生成してグラフィックボード202に送信する。 The communication unit 215 corresponds to the input operation processing unit 21 in FIG. 1 and includes, for example, a Bluetooth I / F for communicating with the input operation device 1 in FIG. 1 by Bluetooth. The communication unit 215 receives the designated position information transmitted from the input operation device 1 by radio, performs demodulation / decoding conforming to the Bluetooth standard or specification, and outputs the demodulated information to the CPU 204. Further, the CPU 204 generates a control signal based on the indicated position information and transmits it to the graphic board 202.
 CPU204は更に図1の表示ポインタ処理部23aを含んでおり、通信部215からの指示位置情報を用いて上述した表示ポインタ33の表示位置、サイズを決定するための処理を実行する。またグラフィックボード202は、図1の操作画面生成部22に対応するとともに、表示出力部27の機能の一部を有するものである。そしてグラフィックボード202は、CPU204からの制御信号、上記指示位置情報とCPU204からの表示ポインタ33の情報を用いて、オブジェクト34及び表示ポインタ33を含む操作画面32を生成する。またグラフィックボード202は、上記USBI/F211、チューナ212またはLANI/F213で受信或いは取得され、かつCPU204で復号された情報コンテンツが入力され、該情報コンテンツを主画面31とし、この主画面31に操作画面32を合成する処理を行い、HDMII/F216に出力する。このCPU204の処理とグラフィックボード202の処理は、ROM206に記憶されたプログラムを用いて実行される。 The CPU 204 further includes the display pointer processing unit 23a shown in FIG. 1, and executes the process for determining the display position and size of the display pointer 33 described above using the indicated position information from the communication unit 215. The graphic board 202 corresponds to the operation screen generation unit 22 in FIG. 1 and has a part of the function of the display output unit 27. The graphic board 202 generates an operation screen 32 including the object 34 and the display pointer 33 using the control signal from the CPU 204, the indicated position information, and the information on the display pointer 33 from the CPU 204. The graphic board 202 receives information content received or acquired by the USB I / F 211, the tuner 212 or the LAN I / F 213 and decrypted by the CPU 204. The information content is used as a main screen 31, and the main screen 31 is operated. A process for synthesizing the screen 32 is performed and output to the HDMII / F216. The processing of the CPU 204 and the processing of the graphic board 202 are executed using a program stored in the ROM 206.
 The HDMI I/F 216 has part of the functions of the display output unit 27 of FIG. 1; it converts the image in which the operation screen 32 output from the graphic board 202 and the content are combined into a format or signal form conforming to the HDMI standard or specification, and outputs it to the display device 3. The HDMI I/F 216 includes a CEC line, and commands or messages may be exchanged with the display device 3 via this CEC line.
 図3A~図3Cは、ユーザ選択動作の第一の実施形態であって、入力操作装置1のタッチパネル11と表示装置3の操作画面32の関係を示す図である。このうち図3Aは、タッチセンサ部側面図を示す。図3Bはタッチセンサ部上面図を示す。図3Cは操作画面を示す。 3A to 3C are diagrams showing a relationship between the touch panel 11 of the input operation device 1 and the operation screen 32 of the display device 3 according to the first embodiment of the user selection operation. 3A shows a side view of the touch sensor unit. FIG. 3B shows a top view of the touch sensor unit. FIG. 3C shows an operation screen.
 FIG. 3A shows the vertical distance d between the touch panel 11 provided in the input operation device 1 and the user instruction medium 13. This distance d is a parameter indicating the degree of contact or approach between the touch panel 11 and the user instruction medium 13. The touch panel 11 uses a touch sensor configured to detect a change in capacitance corresponding to the distance d to the user instruction medium 13. Consequently, even in the hover state, that is, the state in which the user instruction medium 13 is not in contact with the touch panel 11 and the touch panel 11 detects its movement, the capacitance value detected by the touch panel 11 increases as the distance d decreases, indicating that the user instruction medium 13 is approaching the touch surface. The display pointer processing unit 23a makes the size of the display pointer 33 smaller as the distance d becomes smaller.
 The degree of contact or approach between the touch panel 11 and the user instruction medium 13 can also be detected by measuring the level of pressure that the user instruction medium 13 applies to the touch panel 11, in addition to the method of detecting it from the change in capacitance described above. Either of these methods may be used alone, or the two may be combined. When the pressure-based method is used alone, the degree of contact or approach can be detected by converting the pressure level into the distance d. When both the capacitance-based method and the pressure-based method are used, it is also possible to express both quantities with a single variable by treating the change in capacitance as a positive value of the distance d and the change in pressure as a negative value of the distance d.
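 A minimal sketch of this single-variable representation is given below, assuming that hover capacitance is converted to a positive distance d and contact pressure to a negative value; the conversion constants and function name are illustrative assumptions only.

```python
def combined_distance(capacitance: float, pressure: float,
                      cap_at_contact: float = 100.0,   # assumed reading at contact
                      cap_to_mm: float = 0.5,          # assumed capacitance-to-distance scale
                      pressure_to_mm: float = 0.02) -> float:
    """Express hover distance and press depth as a single signed value d.

    d > 0 : hovering (larger d means farther from the touch surface)
    d = 0 : just touching
    d < 0 : touching and pressing (more negative means a harder press)
    """
    if pressure > 0.0:
        # Contact state: the pressure level is mapped to a negative distance.
        return -pressure * pressure_to_mm
    # Hover state: capacitance rises as the medium approaches, so the remaining
    # gap shrinks as the reading approaches cap_at_contact.
    return max(0.0, (cap_at_contact - capacitance) * cap_to_mm)
```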
 As shown in FIG. 3B, viewed from above, the rectangular area of the touch panel 11 can be expressed by the four coordinate points (0,0), (0,1), (1,0), and (1,1).
 The indication position information, which is the position information of the user instruction medium 13 relative to the touch panel 11, can be specified using the coordinates (a, b) in the x-y coordinate system of the two-dimensional plane containing the touch surface (where 0 ≤ a ≤ 1 and 0 ≤ b ≤ 1) and the distance d from the touch surface.
 図3Cは操作画面32を示す図であり、指示位置情報がどのように使われるのかを示す図である。操作画面32の領域も、タッチパネル11のタッチ面と同様4つの座標点(0,0)、(0,s)、(t,0)、(t,s)で表される。 FIG. 3C is a diagram showing the operation screen 32, and is a diagram showing how the indicated position information is used. The area of the operation screen 32 is also represented by four coordinate points (0, 0), (0, s), (t, 0), and (t, s), similar to the touch surface of the touch panel 11.
 The display pointer processing unit 23a holds in advance operation display system information that associates the x-y coordinates of the two-dimensional plane containing the touch surface with the xa-ya coordinates that specify positions on the screen displayed by the display device 3, and calculates the point on the screen of the display device 3 corresponding to the coordinates (a, b) of the indication point 14 acquired from the input operation device 1. In FIG. 3C, the point corresponding to the indication point 14 is the center 36 of the display pointer. The center 36 of the display pointer can be expressed by the coordinates (A, B) in the area of the operation screen 32. Here, A and B are obtained by the following equation (1):

  A = a × t,  B = b × s  (where 0 ≤ a ≤ 1, 0 ≤ b ≤ 1)   ... (1)

 In equation (1), the operation display system information corresponds to the values of t and s and to equation (1) itself. By using equation (1), an arbitrary point on the operation screen 32 can be indicated on the touch panel 11 regardless of the hardware constraint of the shape of the touch panel 11.
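 Assuming equation (1) is the linear scaling shown above, the coordinate conversion can be sketched as follows; the function name is illustrative.

```python
def touch_to_screen(a: float, b: float, t: float, s: float) -> tuple:
    """Convert an indication point (a, b) on the touch surface (unit square) to the
    display pointer center (A, B) on an operation screen of size t x s, per equation (1)."""
    if not (0.0 <= a <= 1.0 and 0.0 <= b <= 1.0):
        raise ValueError("the indication point must lie within the touch surface")
    return a * t, b * s

# Example: a touch at (0.25, 0.5) on a 1280 x 720 operation screen
# places the pointer center at (320.0, 360.0).
A, B = touch_to_screen(0.25, 0.5, 1280, 720)
```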
 FIG. 4 is a flowchart showing the flow of processing according to this embodiment, namely the processing in the input operation control unit 110 of the input operation device 1 for automatically determining whether or not to display the operation screen 32 superimposed on the main screen 31.
 As shown in this flowchart, first, when the touch panel 11 detects movement of the user instruction medium 13 (S101/YES), the input operation control unit 110 generates a detection signal indicating that the user instruction medium 13 has been detected and outputs it to the information content receiving device 2 via the communication unit 120 (S102).
 情報コンテンツ受信装置2の入力操作処理部21は検知信号を受信すると、該検知信号或いはこれに基づく制御信号を操作画面生成部22及び中央制御部23に出力する。操作画面生成部22は、検知信号或いはこれに基づく制御信号をトリガとして操作画面32の表示をONさせるための表示ON信号を表示出力部27に出力する。 When the input operation processing unit 21 of the information content receiving apparatus 2 receives the detection signal, the input operation processing unit 21 outputs the detection signal or a control signal based on the detection signal to the operation screen generation unit 22 and the central control unit 23. The operation screen generator 22 outputs a display ON signal for turning on the display of the operation screen 32 to the display output unit 27 using a detection signal or a control signal based on the detection signal as a trigger.
 次にユーザ操作時間間隔を検知するための設定、例えば入力操作装置1に内蔵されたタイマーを初期化し、直近でユーザが操作してからの経過時間Tの計測を開始する(S103)。 Next, a setting for detecting the user operation time interval, for example, a timer built in the input operation device 1 is initialized, and measurement of the elapsed time T since the user operated most recently is started (S103).
 次のユーザ操作が行われたことをタッチパネル11が検知した場合(S104/YES)、入力操作制御部110は情報コンテンツ受信装置2に対して検知したユーザ操作の内容を示す操作情報を送信し(S105)、情報コンテンツ受信装置2の入力操作処理部21が受信する。その後操作情報に従った処理が情報コンテンツ受信装置2で実行される。この操作情報が指示位置情報の場合は、表示ポインタの表示処理が実行される。 When the touch panel 11 detects that the next user operation has been performed (S104 / YES), the input operation control unit 110 transmits operation information indicating the content of the detected user operation to the information content receiving device 2 ( S105), the input operation processing unit 21 of the information content receiving device 2 receives the information. Thereafter, processing according to the operation information is executed by the information content receiving device 2. If this operation information is the indicated position information, display pointer display processing is executed.
 その後、再度ユーザ操作間隔検知設定(S103)に戻る。ユーザ操作が行われる間、この処理が繰り返される。 Thereafter, the process returns to the user operation interval detection setting (S103) again. This process is repeated while a user operation is performed.
 ユーザ操作が検知されない状態(S104/NO)では、直近でユーザが操作してからの経過時間Tと閾値(Td)との大小を判定する(S106)。S106において、経過時間Tが閾値(Td)以下の場合(T=<Td)、ユーザ操作をさらに待ち受ける。 In the state where the user operation is not detected (S104 / NO), the magnitude of the elapsed time T and the threshold value (Td) since the most recent user operation is determined (S106). In S106, when the elapsed time T is equal to or less than the threshold value (Td) (T = <Td), the user operation is further waited.
 S106において、経過時間Tが閾値(Td)より大きい場合(T>Td)には、操作画面32の表示をOFFさせるべく、入力操作装置1が表示OFF信号を情報コンテンツ受信装置2に送る(S107)。そして情報コンテンツ受信装置2の入力操作処理部21は、操作画面生成部22を介して表示出力部27に表示OFF信号を送る。 In S106, when the elapsed time T is larger than the threshold value (Td) (T> Td), the input operation device 1 sends a display OFF signal to the information content receiving device 2 to turn off the display of the operation screen 32 (S107). ). Then, the input operation processing unit 21 of the information content receiving device 2 sends a display OFF signal to the display output unit 27 via the operation screen generation unit 22.
 As a result, when no user operation is being performed, the operation screen 32 is not displayed superimposed on the main screen 31, so viewing of part of the information content displayed on the main screen 31 is not unnecessarily obstructed by the operation screen 32.
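 The automatic show/hide behaviour of FIG. 4 might be summarised as in the following sketch; the polling interval, the threshold Td, and the touch_panel and send_* interfaces are assumptions standing in for the actual hardware and the transmission through the communication unit 120.

```python
import time

def operation_screen_loop(touch_panel, send_display_on, send_operation_info,
                          send_display_off, Td: float = 5.0) -> None:
    """Sketch of the flow of FIG. 4 (S101-S107) inside the input operation device 1."""
    # S101-S102: wait until the user instruction medium is detected, then ask the
    # information content receiving device 2 to show the operation screen.
    while not touch_panel.medium_detected():
        time.sleep(0.01)
    send_display_on()

    while True:
        last_operation = time.monotonic()              # S103: reset the interval timer
        while True:
            op = touch_panel.poll_operation()          # S104: wait for the next operation
            if op is not None:
                send_operation_info(op)                # S105: forward the operation content
                break                                  # back to S103
            if time.monotonic() - last_operation > Td: # S106: T > Td ?
                send_display_off()                     # S107: hide the operation screen
                return
            time.sleep(0.01)
```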
 次に情報コンテンツ受信装置2及び表示装置3の処理について説明する。図5は情報コンテンツ受信装置2及び表示装置3の処理の流れを示すフローチャートである。 Next, processing of the information content receiving device 2 and the display device 3 will be described. FIG. 5 is a flowchart showing a processing flow of the information content receiving device 2 and the display device 3.
 入力操作処理部21が入力操作装置1から表示ON信号を受信すると(S201でYESの場合)、入力操作処理部21は操作画面生成部22に表示ON信号を出力し、操作画面生成部22は表示出力部27を介して操作画面32を主画面31に重畳して表示させる(S202)。入力操作処理部21が表示ON信号を受信しない場合(S201でNOの場合)、待機する。 When the input operation processing unit 21 receives a display ON signal from the input operation device 1 (YES in S201), the input operation processing unit 21 outputs a display ON signal to the operation screen generation unit 22, and the operation screen generation unit 22 The operation screen 32 is displayed superimposed on the main screen 31 via the display output unit 27 (S202). If the input operation processing unit 21 does not receive the display ON signal (NO in S201), it waits.
 入力操作処理部21が指示位置情報を受信すると(S203でYESの場合)、表示ポインタ処理部23aは指示位置情報から距離dを読みとり、表示ポインタのサイズを決定する(S204)。本実施形態では円形の表示ポインタを用いるので、表示ポインタ処理部23aは表示ポインタの半径を距離dの値に応じて決定する。より詳しくは距離dが大きいときは表示ポインタの半径を大きく、距離dが小さいときは表示ポインタの半径を小さくする。 When the input operation processing unit 21 receives the designated position information (YES in S203), the display pointer processing unit 23a reads the distance d from the designated position information and determines the size of the display pointer (S204). Since a circular display pointer is used in this embodiment, the display pointer processing unit 23a determines the radius of the display pointer according to the value of the distance d. More specifically, the radius of the display pointer is increased when the distance d is large, and the radius of the display pointer is decreased when the distance d is small.
 さらに表示ポインタ処理部23aは、指示位置情報から指示ポイント14の2次元位置情報を読取り、操作画面32における指示ポイント14の座標に相当する点を算出し、この点を表示ポインタ33の中心36とし表示出力部27に出力する。 Further, the display pointer processing unit 23 a reads the two-dimensional position information of the instruction point 14 from the instruction position information, calculates a point corresponding to the coordinates of the instruction point 14 on the operation screen 32, and sets this point as the center 36 of the display pointer 33. The data is output to the display output unit 27.
 Further, when the center 36 of the display pointer is aligned with the point corresponding to the indication point 14 on the operation screen 32, the display pointer processing unit 23a identifies the objects 34 included in the display pointer 33, that is, lying inside the circular display pointer 33, and outputs them to the display output unit 27 (S205).
 表示出力部27は、操作画面32における指示ポイント14に相当する位置に表示ポインタ33の中心36を合わせて表示するとともに、特定されたオブジェクト34をハイライト表示する(S206)。 The display output unit 27 displays the specified object 34 in a highlighted manner while aligning and displaying the center 36 of the display pointer 33 at a position corresponding to the instruction point 14 on the operation screen 32 (S206).
 If a display OFF signal has not been received, the process returns to S203 and the processing of the display pointer 33 within the operation screen 32 continues; if a display OFF signal has been received, the display of the operation screen 32, and with it the display processing of the display pointer 33, is terminated.
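 Tying these steps together, the following sketch mirrors the receiving-side flow of FIG. 5 under the same assumptions as the earlier sketches (touch_to_screen and objects_to_highlight refer to them); the message format, the screen interface, and the linear radius law are illustrative and not prescribed by the embodiment.

```python
def pointer_radius(d: float, r_min: float = 20.0, r_max: float = 200.0,
                   d_max: float = 30.0) -> float:
    """S204: a larger distance d gives a larger pointer radius (assumed linear law)."""
    d = min(max(d, 0.0), d_max)
    return r_min + (r_max - r_min) * d / d_max

def pointer_display_loop(receive_message, screen, objects, t, s) -> None:
    """Sketch of the receiving-side flow of FIG. 5 (S201-S207)."""
    while True:
        msg = receive_message()
        if msg.kind == "DISPLAY_ON":                      # S201-S202: show the operation screen
            screen.show_operation_screen()
        elif msg.kind == "POSITION":                      # S203: indication position information
            radius = pointer_radius(msg.d)                # S204: size from the distance d
            center = touch_to_screen(msg.x, msg.y, t, s)  # convert to screen coordinates
            hits = objects_to_highlight(objects, center, radius)  # S205: objects inside pointer
            screen.draw_pointer(center, radius)           # S206: draw pointer and highlight
            screen.highlight(hits)
        elif msg.kind == "DISPLAY_OFF":                   # S207: hide and finish
            screen.hide_operation_screen()
            return
```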
 Several screen display examples are described below. FIG. 6 shows two operation screens 32 for different distances d between the touch panel 11 and the user instruction medium 13: FIG. 6A shows the case where the distance d is relatively large, and FIG. 6B the case where it is relatively small.
 図6Aの操作画面32では、表示ポインタ33aの領域は比較的大きく、表示ポインタ33aの領域内に複数のオブジェクト34が含まれる。これら複数のオブジェクト34の中から、複数のハイライトされたオブジェクト35が存在する。本例では、全体が表示ポインタ33aの領域に含まれるオブジェクト34をハイライトされたオブジェクト35としている。 6A, the area of the display pointer 33a is relatively large, and a plurality of objects 34 are included in the area of the display pointer 33a. Among these objects 34, there are a plurality of highlighted objects 35. In this example, the object 34 that is entirely included in the area of the display pointer 33a is the highlighted object 35.
 この時ユーザは、所望とするオブジェクトを選択するために、タッチパネル11にユーザ指示媒体13を更に近づける。その様子を図6Bに示す。 At this time, the user brings the user instruction medium 13 closer to the touch panel 11 in order to select a desired object. This is shown in FIG. 6B.
 In FIG. 6B, the user has brought the user instruction medium 13 closer to the touch panel 11, and the area of the display pointer 33b has become smaller. Only one object is highlighted within the area of the display pointer 33b; this is the result of the user adjusting the position of the user instruction medium 13 so that the highlighted object 35 is the desired object. The user then brings the user instruction medium 13 into contact with the touch panel 11 and presses it, thereby selecting the highlighted object 35.
 In this way, the size of the area of the display pointer 33 changes in accordance with the user's selection action, and the display pointer 33 is narrowed down as the user closes in on the target, which further improves the coordination between the display and the user's action.
 他例として、表示ポインタ33b内にオブジェクトが1つしか含まれなくなった時点で、そのオブジェクトを選択されたオブジェクトとしてもよい。 As another example, when only one object is included in the display pointer 33b, the object may be selected.
 次に、オブジェクト34の具体的な表示例及び操作例について図7を用いて説明する。図7Aは地上波が選択された状態のメニュー操作画面を示す図である。図7Bは、地上波放送操作画面を示す図である。 Next, a specific display example and operation example of the object 34 will be described with reference to FIG. FIG. 7A is a diagram showing a menu operation screen in a state where terrestrial waves are selected. FIG. 7B is a diagram showing a terrestrial broadcast operation screen.
 図7Aに示すように、メニュー操作画面32aには、地上波放送を選択するオブジェクト、衛星放送を選択するオブジェクト、BD(Blu-ray Disc)/DVD(Digital Versatile Disk)装置を選択するオブジェクトや、ニュース、映画、音楽、写真など、表示させたいコンテンツを選択させるためのオブジェクト34が配置される。 As shown in FIG. 7A, the menu operation screen 32a includes an object for selecting terrestrial broadcast, an object for selecting satellite broadcast, an object for selecting a BD (Blu-ray Disc) / DVD (Digital Versatile Disk) device, An object 34 for selecting contents to be displayed such as news, movies, music, and photos is arranged.
 図7Aで示した例では、表示ポインタ33によりハイライトされたオブジェクト35が選択される。このオブジェクト35は、地上波放送のオブジェクトである。この結果同図7Bの地上波放送アプリの操作画面32bが表示される、即ちメニュー操作画面32aから地上波放送アプリの操作画面32bに遷移する。 In the example shown in FIG. 7A, the object 35 highlighted by the display pointer 33 is selected. This object 35 is a terrestrial broadcast object. As a result, the terrestrial broadcast application operation screen 32b shown in FIG. 7B is displayed, that is, the menu operation screen 32a is changed to the terrestrial broadcast application operation screen 32b.
 図7Bの地上波放送アプリの操作画面32bには、1CH~12CHのチャンネル選択オブジェクトと番組表やデータ放送を直接選択できるオブジェクトが配置されている。また矢印の記号(←、→)のオブジェクトは、前のページや次のページにページ送りするオブジェクトで、一度に表示するオブジェクトの数を制限できるようにして、ユーザの視認性を上げている。該矢印の記号(←、→)のオブジェクトは、図7Aにもある。 In the operation screen 32b of the terrestrial broadcast application in FIG. 7B, a channel selection object of 1CH to 12CH and an object that can directly select a program guide or data broadcast are arranged. Also, the objects indicated by the arrow symbols (←, →) are objects that are paged to the previous page or the next page, and the number of objects that can be displayed at one time can be limited to improve the visibility of the user. The object indicated by the arrow symbol (←, →) is also in FIG. 7A.
 図8も操作画面例を示す図であって、図8Aはメニュー操作画面32aでSNSが選択された状態を示し、図8Bは、主画面31にSNS(Social Network Service)ログイン画面が表示された状態を示す。 FIG. 8 is also a diagram showing an example of the operation screen. FIG. 8A shows a state where the SNS is selected on the menu operation screen 32a, and FIG. 8B shows that the SNS (Social Network Service) login screen is displayed on the main screen 31. Indicates the state.
 When the SNS application is selected, an SNS login screen 31a appears on the main screen 31, as shown in FIG. 8B. At this point, the menu operation screen 32a is automatically hidden. On the main screen 31 are arranged character input boxes 37 such as an ID (identification) input box and a password input box, a login execution button 38, and a cursor 39 used to select the character input boxes 37 and the execution button 38.
 SNSログイン画面31aは、PC(Personal Computer)などで用いられている画面に近いものであり、文字入力枠37、実行ボタン38、カーソル39も同様の機能をもつ。 The SNS login screen 31a is similar to a screen used in a PC (Personal Computer) or the like, and the character input frame 37, the execution button 38, and the cursor 39 have the same functions.
 文字入力枠37には、入力操作装置1の入力キー12により文字入力を行う。またカーソル39の移動は、前記同様、ユーザ指示媒体13をタッチパネル11上で動かすことで行うことができる。この時、タッチパネル11と主画面31とは、座標系の同一化を行なわず、ユーザ指示媒体13の相対的な移動情報によって、カーソルを移動させてもよい。 In the character input frame 37, characters are input by the input key 12 of the input operation device 1. The cursor 39 can be moved by moving the user instruction medium 13 on the touch panel 11 as described above. At this time, the touch panel 11 and the main screen 31 may move the cursor based on the relative movement information of the user instruction medium 13 without making the coordinate system identical.
 本例によれば、操作画面32のメニュー画面からのオブジェクト選択のみならず、文字入力等を必要とするアプリケーションも入力操作が可能となる。 According to this example, not only an object selection from the menu screen of the operation screen 32 but also an application that requires character input or the like can be input.
 As described above, in this embodiment the coordinate representation of the area of the operation screen 32 corresponds one-to-one with the coordinate representation of the area of the touch panel 11, and the center 36 of the display pointer is determined as the single coordinate position corresponding to the indication point 14.
 更に本実施形態では、タッチパネル11とユーザ指示媒体13の接触或いは接近の度合い、即ちタッチパネル11の上面からユーザ指示媒体13までの距離dに応じ、表示ポインタ33の大きさが変わる。図中では、距離dが比較的遠い場合の表示ポインタを符号33aで、比較的近い場合の表示ポインタを符号33bで示している。図示のように距離dが遠いときには表示ポインタ33が大きくなり、近いときには小さくなる。 Furthermore, in the present embodiment, the size of the display pointer 33 changes according to the degree of contact or approach between the touch panel 11 and the user instruction medium 13, that is, the distance d from the upper surface of the touch panel 11 to the user instruction medium 13. In the drawing, the display pointer when the distance d is relatively far is indicated by reference numeral 33a, and the display pointer when the distance d is relatively close is indicated by reference numeral 33b. As shown in the figure, the display pointer 33 increases when the distance d is long, and decreases when the distance d is short.
 これによりタッチパネル11とユーザ指示媒体13の接触或いは接近の度合いが、操作画面32では、表示ポインタ33の大きさで表され、ユーザに直感的な視認性を与える。 Thus, the degree of contact or approach between the touch panel 11 and the user instruction medium 13 is represented by the size of the display pointer 33 on the operation screen 32, and gives intuitive visibility to the user.
 In the figures, the shape of the display pointer 33 is drawn as being based on a circle, but it may instead be based on a polygon, or take a more user-friendly form such as the shape of an animated character. The display pointer 33 is also preferably semi-transparent, as illustrated, and colored so as to be easily distinguished from the background.
 表示ポインタ33は、タッチパネル11とユーザ指示媒体13の接触或いは接近の度合いが中位なとき、表示ポインタ33の領域内には、幾つかのオブジェクト34が含まれる。本例では、これらの幾つかのオブジェクト34の中から、表示ポインタの中心36に近いオブジェクトがハイライトされたオブジェクト35として、強調表示され、ユーザに所望するオブジェクトかどうかをわかりやすく提示する。 The display pointer 33 includes several objects 34 in the area of the display pointer 33 when the degree of contact or proximity between the touch panel 11 and the user instruction medium 13 is medium. In this example, among these several objects 34, an object close to the center 36 of the display pointer is highlighted as the highlighted object 35, so that the user can easily understand whether the object is a desired object.
 ユーザは、該ハイライトされたオブジェクト35が所望のオブジェクトである場合、ユーザ指示媒体13をタッチパネル11に押下することにより選択動作を実行する。もし該ハイライトされたオブジェクト35が所望のオブジェクトでない場合、所望のオブジェクトがハイライトされたオブジェクト35となるように、ユーザ指示媒体13をタッチパネル11上で移動させる。 When the highlighted object 35 is a desired object, the user performs a selection operation by pressing the user instruction medium 13 on the touch panel 11. If the highlighted object 35 is not a desired object, the user instruction medium 13 is moved on the touch panel 11 so that the desired object becomes the highlighted object 35.
 If the area of the display pointer 33 is large and contains so many objects 34 that the user finds it difficult to judge whether the highlighted object 35 is the desired one, the user can reduce the size of the display pointer 33 by bringing the user instruction medium 13 closer to the touch panel 11.
 When one object 34 has been selected by such a selection action, giving the highlighted object 35 a different display form, for example changing its color or displaying it in 3D, is also useful for visually showing a response to the user's selection action.
 In this way, the user can perform the selection action while watching the display pointer 33 and the highlighted object 35 on the operation screen 32 of the display device 3, with no need to look directly at the touch panel 11 or the user instruction medium 13. Moreover, the operation screen 32 is displayed superimposed on the main screen 31 on which the information content is displayed, so there is no need to take one's eyes off the information content during the selection action.
 This embodiment is also effective against shaking of the user instruction medium 13. When the user performs an object selection action with the user instruction medium 13, any tremor in the user's own movement directly becomes shaking of the user instruction medium 13, which causes the display pointer 33 to fluctuate. In this embodiment, however, the highlighted object 35 is presented to the user, so as long as the magnitude of the shaking or movement is within the size of an object, the same object 35 can remain highlighted. It suffices to compare the magnitude of the shaking or movement with the size of the object and change the highlighted object 35 only when the former is larger.
 Filtering out the fine movements of the shaking so as to reduce the fluctuation of the display pointer 33 is also effective for improving visibility for the user.
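 One assumed way to realise this filtering and the object-size hysteresis described above is sketched below; the smoothing factor and class interface are illustrative.

```python
class PointerStabilizer:
    """Reduce visible shaking of the display pointer 33 caused by hand tremor."""

    def __init__(self, smoothing: float = 0.3):
        self.smoothing = smoothing   # exponential smoothing factor (assumed value)
        self.filtered = None         # low-pass-filtered pointer center
        self.anchor = None           # center at which the current highlight was chosen
        self.highlight = None        # currently highlighted object

    def update(self, raw_center, candidate_highlight, object_size):
        # Low-pass filter the raw center to suppress fine tremor of the medium.
        if self.filtered is None:
            self.filtered = raw_center
        else:
            fx, fy = self.filtered
            rx, ry = raw_center
            self.filtered = (fx + self.smoothing * (rx - fx),
                             fy + self.smoothing * (ry - fy))

        # Keep the current highlight unless the movement since it was chosen
        # exceeds the size of an object.
        moved = (self.anchor is None or
                 abs(self.filtered[0] - self.anchor[0]) > object_size or
                 abs(self.filtered[1] - self.anchor[1]) > object_size)
        if moved:
            self.highlight = candidate_highlight
            self.anchor = self.filtered
        return self.filtered, self.highlight
```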
<Second embodiment>
 The second embodiment will be described with reference to FIGS. 9 and 10. The second embodiment differs from the first embodiment in that the operation screen 32 is turned ON and OFF manually. FIG. 9 is a diagram showing the input operation device according to the second embodiment. FIG. 10 is a flowchart relating to the input operation processing unit 21 according to the second embodiment.
 FIG. 9 shows an input operation device 1a used in the second embodiment, which is the input operation device 1 of FIG. 1 further provided with a holding sensor 15.
 The holding sensor 15 is installed on the outside of the main body 10 of the input operation device 1a and is realized, for example, by a pressure sensor; it detects the action of the user grasping the input operation device 1a with the intention of operating the information content receiving device 2. When the user holds the input operation device 1a, the user's hand comes into contact with the holding sensor 15, and the holding sensor 15 detects this contact and outputs a detection signal. Based on the detection signal, the CPU 101 (see FIG. 2A) generates holding information indicating that the user is holding the input operation device 1a, and transmits it from the input operation device 1a to the information content receiving device 2 via the communication unit 120. The holding sensor 15 is configured to continue outputting the detection signal, or to output it at a fixed period, while it detects that the user is holding the input operation device 1a. Therefore, while the user holds the input operation device 1a, holding information continues to be transmitted continuously or periodically from the input operation device 1a to the information content receiving device 2.
 図10は、情報コンテンツ受信装置2の入力操作処理部21の本実施形態における処理動作を示すフローチャートである。 FIG. 10 is a flowchart showing a processing operation in the present embodiment of the input operation processing unit 21 of the information content receiving device 2.
 When the input operation processing unit 21 receives holding information from the input operation device 1a while the operation screen 32 is OFF (S108a/YES), the input operation processing unit 21 sends a display ON signal to the display output unit 27 via the operation screen generation unit 22 so as to turn on the display of the operation screen 32, and the operation screen 32 is displayed (S112).
 この状態で入力操作装置1aからユーザ操作による指示情報を受信した場合(S114/YES)、その指示情報を操作画面生成部22、表示ポインタ処理部23a、及び中央制御部23で取得して処理、ユーザ操作に合った表示が行われる(S115)。 When instruction information by a user operation is received from the input operation device 1a in this state (S114 / YES), the instruction information is acquired and processed by the operation screen generation unit 22, the display pointer processing unit 23a, and the central control unit 23. A display suitable for the user operation is performed (S115).
 入力操作装置1aの保持状態が継続する間、即ち入力操作処理部21が保持情報を受信している間(S108b/YES)は、ユーザ操作検知(S114)へ戻り、操作画面32の表示が続く。 While the holding state of the input operation device 1a continues, that is, while the input operation processing unit 21 receives the holding information (S108b / YES), the process returns to the user operation detection (S114) and the display of the operation screen 32 continues. .
 When the transmission of the holding detection signal from the input operation device 1a ceases (S108b/NO), the input operation processing unit 21 outputs a display OFF signal to the display output unit 27 via the operation screen generation unit 22, and the operation screen 32 is hidden (S117).
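 A compact sketch of this holding-based show/hide flow (S108a to S117) on the receiving side might look like the following; the message interface, the screen object, and the timeout used to detect that holding information has stopped arriving are assumptions for illustration.

```python
import time

def hold_based_operation_screen(receive_message, screen, handle_instruction,
                                hold_timeout: float = 0.5) -> None:
    """Sketch of FIG. 10: show the operation screen only while holding information
    keeps arriving from the input operation device 1a."""
    # S108a: wait for the first holding information.
    while receive_message(timeout=None).kind != "HOLD":
        pass
    screen.show_operation_screen()                     # S112
    last_hold = time.monotonic()

    while True:
        msg = receive_message(timeout=0.05)
        if msg is not None:
            if msg.kind == "HOLD":                     # S108b: holding continues
                last_hold = time.monotonic()
            elif msg.kind == "INSTRUCTION":            # S114-S115: process the user operation
                handle_instruction(msg)
        if time.monotonic() - last_hold > hold_timeout:
            screen.hide_operation_screen()             # S117: holding information has stopped
            return
```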
 In this embodiment as well, when no user operation is being performed, the operation screen 32 is not displayed superimposed on the main screen 31, so viewing of part of the information content is not unnecessarily obstructed by the operation screen.
 また本実施形態では、入力操作装置1aを掴むというユーザの直接的な行動を検知し、この検知状態を用いて操作画面が表示できるので、操作性をより向上させることができる。 Further, in this embodiment, since the user's direct action of grasping the input operation device 1a is detected and the operation screen can be displayed using this detection state, the operability can be further improved.
<Third embodiment>
 A third embodiment will be described with reference to FIG. 11. FIG. 11 is a diagram showing an input operation device according to the third embodiment. In the input operation device 1b shown in FIG. 11, components identical to those of the input operation device 1 shown in FIG. 1 are given the same reference numbers.
 図11に示すように第三実施形態に係る入力操作装置1bは、タッチパネル付表示パネル16、及びホームボタン17を備える。 As shown in FIG. 11, the input operation device 1b according to the third embodiment includes a display panel 16 with a touch panel and a home button 17.
 本実施形態では、タッチパネル11と入力キー12が一つのタッチパネル付表示パネル16上に構成される。タッチパネル付表示パネル16は、例えば電子ペーパであったり、タッチセルを内蔵したLCD(Liquid Crystal Display)であったりする。 In the present embodiment, the touch panel 11 and the input key 12 are configured on one display panel 16 with a touch panel. The touch panel-equipped display panel 16 is, for example, electronic paper or an LCD (Liquid Crystal Display) with a built-in touch cell.
 タッチパネル11と入力キー12は、タッチパネル付表示パネル16を分割し、各分割領域内に其々配置される。図11では、上部が入力キー12を配列した領域であり、下部がタッチパネル11に相当する領域である。 The touch panel 11 and the input key 12 divide the display panel 16 with a touch panel and are arranged in each divided area. In FIG. 11, the upper part is an area where the input keys 12 are arranged, and the lower part is an area corresponding to the touch panel 11.
 The touch panel of the display panel 16 with a touch panel is provided uniformly over the entire upper surface of the input operation device 1b, so the area in which the input keys 12 are arranged and the area corresponding to the touch panel 11 can be formed from a single integrated touch panel. For this reason, in this embodiment the input operation device 1b can be realized as an application on a device that is also used for other purposes, such as a smartphone or an electronic book reader.
 The home button 17 is provided so that, when the input operation device 1b is also used for other applications, the screen can be switched to a higher-level screen.
 In the above embodiments, the input operation device 1, the information content receiving device 2, and the display device 3 have been described as separate devices. However, embodiments of the present invention are not limited to a configuration in which the three devices are separate bodies; a device in which the information content receiving device 2 and the display device 3 are integrated may also be used.
 The input operation device 1 and the information content receiving device 2 may also be configured as one integrated device, with the display device 3 as a separate device. This further example will be described with reference to FIG. 12. FIG. 12 is a diagram showing an information content receiving system in which a tablet terminal and a projector are connected. When information content created on the tablet terminal 7, for example moving images, drawings, or illustrations captured or produced on the tablet terminal 7, is displayed, the system may also be called an information content display system.
 The tablet terminal 7 shown in FIG. 12 includes a touch panel capable of measuring capacitance. Internally, the tablet terminal 7 has the same configuration as the input operation device 1 and the information content receiving device 2 of FIG. 1, and receives and stores information content. The tablet terminal 7 and the projector 8 may then be communicatively connected by wireless communication, for example Bluetooth (registered trademark), and information based on input operations accepted by the tablet terminal 7 may be transmitted to the projector 8 and displayed. In this case, the tablet terminal 7 corresponds to a device in which the input operation device 1 and the information content receiving device 2 are configured as one integrated device, and the projector 8 corresponds to the display device 3.
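 As one way to picture the link between the tablet terminal 7 and the projector 8, the sketch below packages the indicated position information and hands it to a transport object. The SerialLink class, the JSON message layout, and the field names are assumptions made for illustration; they are not part of the described system or of any particular Bluetooth profile.

```python
import json

class SerialLink:
    """Minimal stand-in transport; a real system would wrap its Bluetooth or Wi-Fi
    connection behind an object like this."""
    def send(self, payload: bytes) -> None:
        print("would transmit:", payload)

def send_indicated_position(link: SerialLink, x: float, y: float, proximity: float) -> None:
    """Package the indicated position information (2-D coordinates on the touch surface
    plus the degree of contact or approach) and push it to the display side."""
    message = {"type": "indicated_position", "x": x, "y": y, "proximity": proximity}
    link.send(json.dumps(message).encode("utf-8"))

if __name__ == "__main__":
    # Example: one touch report, with coordinates normalised to the touch surface.
    send_indicated_position(SerialLink(), x=0.42, y=0.77, proximity=0.3)
```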
 The above embodiments are not intended to limit the present invention; for example, part of the configuration of one embodiment may be replaced with that of another embodiment, and the configuration of one embodiment may be added to the configuration of another. All such variations belong to the scope of the present invention. Furthermore, the numerical values, messages, and the like appearing in the text and drawings are merely examples, and using different values does not impair the effects of the present invention.
 Still further, some or all of the functions and the like of the present invention described above may be realized in hardware, for example by designing them as integrated circuits. They may also be realized in software by having a microprocessor unit or the like interpret and execute operation programs that implement the respective functions. Hardware and software may also be used together.
DESCRIPTION OF SYMBOLS
 1 Input operation device
 2 Information content receiving device
 3 Display device
 4 Information storage device
 5 Broadcast receiving medium
 6 Communication network
 7 Tablet terminal
 8 Projector
 11 Touch sensor unit
 12 Input key
 13 User instruction medium
 14 Instruction point
 15 Holding sensor unit
 16 Display panel with touch panel
 17 Home button
 21 Input operation processing unit
 22 Operation screen generation unit
 23 Central control unit
 24 Stored information input/output unit
 25 Broadcast receiving unit
 26 Network transmitting/receiving unit
 27 Display output unit
 31 Main screen portion
 32, 32a, 32b Operation screen portion
 33, 33a, 33b Display pointer
 34 Object
 35 Highlighted object
 36 Center of display pointer
 37 Character input frame
 38 Execute button
 39 Cursor

Claims (10)

  1.  An input operation method for displaying, on a display screen, a display pointer indicating a range that is the target of an input operation, by using a touch panel that detects a degree of contact or approach between an instruction medium for inputting a user's instruction and a touch surface, the method comprising:
     a step of acquiring indicated position information that specifies the position of the instruction medium with respect to the touch surface, the indicated position information including two-dimensional position information expressing, in a two-dimensional coordinate system of the touch surface, the coordinates of an instruction point at which the instruction medium is detected on the touch surface, and information indicating the degree of contact or approach between the instruction medium and the touch surface at the time the instruction medium is detected;
     a step of determining, based on the information indicating the degree of contact or approach, a size of the display pointer indicating the range that is the target of the input operation within the display screen; and
     a step of referring to operation display system information that associates the two-dimensional coordinate system of the display screen with the two-dimensional coordinate system of the touch surface, and displaying the display pointer of the determined size with its center aligned with the point obtained by converting the two-dimensional position information of the instruction point into the two-dimensional coordinate system of the display screen.
  2.  The input operation method according to claim 1, wherein
     the information indicating the degree of contact or approach is information indicating the distance between the instruction medium and the touch surface, and
     the size of the display pointer is determined to be smaller as the distance becomes smaller.
  3.  The input operation method according to claim 1, wherein
     the information indicating the degree of contact or approach is information indicating the magnitude of the pressure with which the instruction medium presses the touch surface, and
     the size of the display pointer is determined to be smaller as the pressure becomes larger.
  4.  The input operation method according to claim 1, wherein
     at least one object that is a candidate for selection by the user is further displayed on the display screen, and
     when the display pointer is displayed superimposed on an object, the object on which the display pointer is superimposed is displayed in a display mode different from that of an object on which the display pointer is not superimposed.
  5.  The input operation method according to claim 4, further comprising
     a step of, when the degree of contact or approach between the touch panel and the instruction medium becomes stronger, the size of the display pointer becomes correspondingly smaller, and the objects on which the display pointer is superimposed are reduced to a single object, determining that object as the object selected by the user.
  6.  An input operation device comprising:
     a touch panel that detects a degree of contact or approach between an instruction medium for inputting a user's instruction and a touch surface;
     an input operation control unit that outputs indicated position information specifying the position of the instruction medium with respect to the touch surface, the indicated position information including two-dimensional position information expressing, in a two-dimensional coordinate system of the touch surface, the coordinates of an instruction point at which the instruction medium is detected on the touch surface, and information indicating the degree of contact or approach between the instruction medium and the touch surface at the time the instruction medium is detected; and
     a communication unit that transmits the indicated position information to a display control device that determines, based on the indicated position information, a size of a display pointer indicating a range that is the target of an input operation on a display screen and a display position of the display pointer within the display screen.
  7.  The input operation device according to claim 6, further comprising:
     a main body that incorporates the input operation control unit and the communication unit, the touch surface being arranged on an upper surface of the main body; and
     a holding sensor that is arranged on a side surface of the main body and detects that the user is holding the main body,
     wherein, when the input operation control unit acquires from the holding sensor a holding detection signal indicating detection that the user is holding the main body, the communication unit transmits the holding detection signal to the display control device.
  8.  A display control device that displays, on a display screen, a display pointer indicating a range that is the target of a user's input operation, the display control device comprising:
     an input operation processing unit that acquires indicated position information output by a touch panel that detects a degree of contact or approach between an instruction medium for inputting the user's instruction and a touch surface, the indicated position information including two-dimensional position information expressing, in a two-dimensional coordinate system of the touch surface, the coordinates of an instruction point at which the instruction medium is detected on the touch surface, and information indicating the degree of contact or approach between the instruction medium and the touch surface at the time the instruction medium is detected; and
     a display pointer processing unit that determines, based on the information indicating the degree of contact or approach, a size of the display pointer indicating the range that is the target of the input operation within the display screen, refers to operation display system information that associates the two-dimensional coordinate system of the display screen with the two-dimensional coordinate system of the touch surface, and displays the display pointer of the determined size with its center aligned with the point obtained by converting the two-dimensional position information of the instruction point into the two-dimensional coordinate system of the display screen.
  9.  An information content receiving system in which an input operation device that accepts a user's input operation, an information content receiving device that receives information content, and a display device having a display screen that displays the information content are communicatively connected, wherein
     the input operation device comprises:
     a touch panel that detects a degree of contact or approach between an instruction medium for inputting the user's instruction and a touch surface;
     an input operation control unit that outputs indicated position information specifying the position of the instruction medium with respect to the touch surface, the indicated position information including two-dimensional position information expressing, in a two-dimensional coordinate system of the touch surface, the coordinates of an instruction point at which the instruction medium is detected on the touch surface, and information indicating the degree of contact or approach between the instruction medium and the touch surface at the time the instruction medium is detected; and
     a communication unit that transmits the indicated position information to the information content receiving device,
     the information content receiving device comprises:
     at least one of a broadcast receiving unit that receives broadcast waves, a network transmitting/receiving unit that receives information content from a communicatively connected network, and a stored information input unit that accepts input of the information content from an information storage device that stores information content; and
     a display control device that displays, on the display screen, a display pointer indicating a range that is the target of the user's input operation, and
     the display control device includes:
     an input operation processing unit that acquires the indicated position information output by the touch panel that detects the degree of contact or approach between the instruction medium for inputting the user's instruction and the touch surface, the indicated position information including the two-dimensional position information expressing, in the two-dimensional coordinate system of the touch surface, the coordinates of the instruction point at which the instruction medium is detected on the touch surface, and the information indicating the degree of contact or approach between the instruction medium and the touch surface at the time the instruction medium is detected;
     an operation screen generation unit that generates an operation screen in which at least one object that is a candidate for selection by the user is arranged;
     a display pointer processing unit that determines, based on the information indicating the degree of contact or approach, a size of the display pointer indicating the range that is the target of the input operation within the operation screen, refers to operation display system information that associates the two-dimensional coordinate system of the display screen with the two-dimensional coordinate system of the touch surface, and displays the display pointer of the determined size with its center aligned with the point obtained by converting the two-dimensional position information of the instruction point into the two-dimensional coordinate system of the display screen; and
     a display output unit that displays an object on which the display pointer is superimposed within the operation screen in a display mode different from that of an object on which the display pointer is not superimposed.
  10.  The information content receiving system according to claim 9, wherein
     the input operation device further comprises a main body that incorporates the input operation control unit and the communication unit, the touch panel being arranged on an upper surface of the main body, and a holding sensor that is arranged on a side surface of the main body and detects that the user is holding the main body,
     when the input operation control unit acquires from the holding sensor a holding detection signal indicating detection that the user is holding the main body, the communication unit transmits the holding detection signal to the display control device, and
     the operation screen generation unit generates the operation screen upon receiving the holding detection signal, and display of the operation screen is started.
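
 To make the coordinate conversion and the proximity-dependent pointer sizing recited in claims 1 and 2 easier to picture, the following sketch shows one possible realization. The linear size formula, the coordinate ranges, and the millimetre thresholds are assumptions chosen for the example and are not required by the claims.

```python
def touch_to_screen(tx: float, ty: float,
                    touch_size=(1024, 768), screen_size=(1920, 1080)):
    """Convert an instruction point from the touch-surface coordinate system to the
    display-screen coordinate system (the role of the operation display system information)."""
    sx = tx * screen_size[0] / touch_size[0]
    sy = ty * screen_size[1] / touch_size[1]
    return sx, sy

def pointer_size_from_distance(distance_mm: float,
                               max_distance_mm: float = 30.0,
                               min_size_px: float = 16.0,
                               max_size_px: float = 160.0) -> float:
    """Smaller distance (stronger approach) gives a smaller pointer, as in claim 2."""
    ratio = max(0.0, min(distance_mm / max_distance_mm, 1.0))
    return min_size_px + ratio * (max_size_px - min_size_px)

def place_pointer(tx: float, ty: float, distance_mm: float):
    """Return the centre position and size of the display pointer for one touch report."""
    centre = touch_to_screen(tx, ty)
    size = pointer_size_from_distance(distance_mm)
    return centre, size

if __name__ == "__main__":
    # Example: instruction point near the centre of the touch surface, 12 mm above it.
    print(place_pointer(tx=512, ty=384, distance_mm=12.0))
```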
PCT/JP2016/082659 2016-11-02 2016-11-02 Input operation method, input operation device, display controller, and information content receiving system WO2018083765A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/082659 WO2018083765A1 (en) 2016-11-02 2016-11-02 Input operation method, input operation device, display controller, and information content receiving system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/082659 WO2018083765A1 (en) 2016-11-02 2016-11-02 Input operation method, input operation device, display controller, and information content receiving system

Publications (1)

Publication Number Publication Date
WO2018083765A1 true WO2018083765A1 (en) 2018-05-11

Family

ID=62075878

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/082659 WO2018083765A1 (en) 2016-11-02 2016-11-02 Input operation method, input operation device, display controller, and information content receiving system

Country Status (1)

Country Link
WO (1) WO2018083765A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115480653A (en) * 2021-05-26 2022-12-16 北京搜狗科技发展有限公司 Input method, device and device for input

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000270236A (en) * 1999-03-15 2000-09-29 Nippon Hoso Kyokai <Nhk> REMOTE CONTROL TERMINAL DEVICE AND RECEIVING DEVICE FOR REMOTE OPERATION OF TELEVISION RECEIVER, AND STORAGE MEDIUM FOR REMOTE OPERATION
JP2008027287A (en) * 2006-07-24 2008-02-07 Navitime Japan Co Ltd Map display system, map display device, map display method and map distribution server
JP2010061224A (en) * 2008-09-01 2010-03-18 Honda Motor Co Ltd Input/output device for automobile
JP2012238079A (en) * 2011-05-10 2012-12-06 Kyocera Corp Input device and electronic apparatus
JP2013093066A (en) * 2013-02-21 2013-05-16 Sharp Corp Display device, and program
JP2013109668A (en) * 2011-11-22 2013-06-06 Sony Computer Entertainment Inc Information processing apparatus and information processing method
JP2015060303A (en) * 2013-09-17 2015-03-30 船井電機株式会社 Information processor



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16920478

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16920478

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP