
WO2013084687A1 - Information processing device - Google Patents

Information processing device

Info

Publication number
WO2013084687A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
information
information processing
approach
processing apparatus
Prior art date
2011-12-09
Application number
PCT/JP2012/079716
Other languages
French (fr)
Japanese (ja)
Inventor
落合 昇
Original Assignee
NEC Corporation (日本電気株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2011-12-09
Filing date
2012-11-16
Publication date
2013-06-13
Application filed by NEC Corporation (日本電気株式会社)
Publication of WO2013084687A1 publication Critical patent/WO2013084687A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0489 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
    • G06F 3/04892 Arrangements for controlling cursor position based on codes indicative of cursor displacements from one discrete location to another, e.g. using cursor control keys associated to different directions or using the tab key
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 Indexing scheme relating to G06F3/048
    • G06F 2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Character Discrimination (AREA)

Abstract

Provided is an information processing device (100) in which: a detection unit (120) detects contact with or approach to a display surface of a display unit (110); an area determination unit (130) determines, on the basis of the detection result of the detection unit (120), an input area within the display surface of the display unit (110) into which information is to be input; and a control unit (140) inputs the information to the information processing device (100) on the basis of the contact or approach detected by the detection unit (120) in the input area determined by the area determination unit (130).

Description

Information processing device
The present invention relates to an information processing apparatus, an information processing method, and a program for processing information.
In recent years, for information processing apparatuses equipped with a display having a touch panel function, techniques have been proposed that allow desired characters and symbols to be input by performing handwriting input within a predetermined area of the display (see, for example, Patent Document 1).
Patent Document 1: JP 2002-92547 A
However, with the technique described above, a desired character or symbol must be handwritten within an area of the display that has been set in advance, which impairs the ease of input.
An object of the present invention is to provide an information processing apparatus, an information processing method, and a program that solve the problem described above.
The information processing apparatus of the present invention is an information processing apparatus comprising: a display unit; a detection unit that detects contact with or approach to the display surface of the display unit; an area determination unit that determines, based on the detection result of the detection unit, an input area for inputting information within the display surface of the display unit; and a control unit that inputs the information to the information processing apparatus based on the contact or approach detected by the detection unit within the input area determined by the area determination unit.
The information processing method of the present invention is an information processing method performed by an information processing apparatus including a display unit, the method comprising: a process of detecting contact with or approach to the display surface of the display unit; a process of determining, based on the result of the detection, an input area for inputting information within the display surface of the display unit; and a process of inputting the information to the information processing apparatus based on the contact or approach detected within the determined input area.
The program of the present invention is a program for causing an information processing apparatus including a display unit to execute: a procedure of detecting contact with or approach to the display surface of the display unit; a procedure of determining, based on the result of the detection, an input area for inputting information within the display surface of the display unit; and a procedure of inputting the information to the information processing apparatus based on the contact or approach detected within the determined input area.
As described above, the present invention makes it possible to input desired characters and symbols easily.
FIG. 1 is a diagram showing an embodiment of an information processing apparatus according to the present invention.
FIG. 2 is a diagram showing an example of the association of character information and command information with pattern information stored in the pattern storage unit shown in FIG. 1.
FIG. 3 is a flowchart for explaining an information processing method in the information processing apparatus shown in FIG. 1.
FIG. 4 is a diagram showing an example of a screen displayed on the display unit when the operation mode is not the character input mode.
FIG. 5 is a diagram showing another example of a screen displayed on the display unit when the operation mode is not the character input mode.
FIG. 6 is a diagram showing an example of the screen displayed on the display unit after the processing of steps 2 to 5 has been performed while the screen shown in FIG. 5 is displayed on the display unit shown in FIG. 1.
FIG. 7 is a diagram showing an example of a state in which the user touches the display surface of the display unit shown in FIG. 1 with a finger.
FIG. 8 is a diagram showing the input area as coordinates.
FIG. 9 is a diagram showing an example of the input area determined by the area determination unit when the detection unit shown in FIG. 1 detects contact or approach at the lower right of the display unit.
FIG. 10 is a diagram showing an example of the input area determined by the area determination unit when the detection unit shown in FIG. 1 detects contact or approach over the whole of the display unit.
An embodiment of the present invention will be described below with reference to the drawings.
FIG. 1 is a diagram showing an embodiment of an information processing apparatus according to the present invention.
As shown in FIG. 1, the information processing apparatus 100 of this embodiment includes a display unit 110, a detection unit 120, an area determination unit 130, a control unit 140, a pattern storage unit 150, and a pattern recognition unit 160.
The display unit 110 is a display that presents information based on instructions from the control unit 140. The display unit 110 also displays the character information that the control unit 140 has input to the information processing apparatus 100.
The detection unit 120 detects contact with or approach of an object to the display surface of the display unit 110. For example, the detection unit 120 may be a contact sensor or a proximity sensor. The detection unit 120 detects the position on the display surface at which the contact or approach occurred and notifies the area determination unit 130 of that position (for example, as coordinates). The display unit 110 and the detection unit 120 may also be integrated, as in a touch panel.
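For illustration only (the patent specifies behavior, not an implementation), the role of the detection unit in reporting positions to the area determination unit might be sketched as follows in Python; every name in this sketch is hypothetical:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class TouchEvent:
    """One contact-or-approach sample on the display surface."""
    x: float        # horizontal position on the display surface
    y: float        # vertical position on the display surface
    touching: bool  # True for contact; False for mere approach (proximity)

class DetectionUnit:
    """Rough analogue of detection unit 120: detects contact or approach
    and notifies a listener (here, area determination unit 130) of the position."""

    def __init__(self, listener: Callable[[TouchEvent], None]) -> None:
        self._listener = listener

    def on_sensor_sample(self, x: float, y: float, touching: bool) -> None:
        # Would be called by the touch-panel or proximity-sensor driver.
        self._listener(TouchEvent(x, y, touching))
```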
The area determination unit 130 determines an input area for inputting information within the display surface of the display unit 110, based on the detection result of the detection unit 120 when handwriting input is first performed in the character input mode. Specifically, when handwriting input is first performed in the character input mode, the area determination unit 130 determines as the input area an area containing the moved positions notified from the detection unit 120. The area determination unit 130 also notifies the pattern recognition unit 160 of the determined input area.
The control unit 140 inputs information to the information processing apparatus 100 based on the contact or approach detected by the detection unit 120 within the input area determined by the area determination unit 130. Specifically, the control unit 140 inputs the character information output from the pattern recognition unit 160 to the information processing apparatus 100. When command information is output from the pattern recognition unit 160, the control unit 140 performs command processing (control) based on that command information.
The pattern storage unit 150 stores associations between character information and pattern information indicating movement patterns, on the display surface, of the approach or contact detected by the detection unit 120. As the pattern information, the pattern storage unit 150 stores the movement pattern of the approach or contact on the display surface from when the detection unit 120 detects the contact or approach until it no longer detects it. The pattern storage unit 150 also stores associations between pattern information and command information for controlling input.
FIG. 2 is a diagram showing an example of the association of character information and command information with pattern information stored in the pattern storage unit 150 shown in FIG. 1.
As shown in FIG. 2, letters, numbers, symbols, and command information are stored in the pattern storage unit 150 shown in FIG. 1 in association with pattern information indicating movement patterns. FIG. 2 shows an example in which each movement pattern is a one-stroke pattern (a movement pattern from when the detection unit 120 detects contact or approach until it no longer detects that contact or approach). The pattern storage unit 150 may additionally store associations between the pattern information of movement patterns and character information and the like as used in general handwriting input.
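As an illustration, the associations of FIG. 2 could be held as lookup tables keyed by a coarse encoding of the one-stroke movement. The direction-sequence encoding and the specific entries below are assumptions made for this sketch, not values from the patent:

```python
# Hypothetical encoding: a one-stroke pattern is reduced to the sequence of
# coarse directions ("U", "D", "L", "R") traced from touch-down to lift-off.
CHARACTER_PATTERNS = {
    ("D", "R"): "L",            # a stroke down then right draws an "L"
    ("R", "D", "L", "U"): "O",  # a closed clockwise loop approximates an "O"
}

COMMAND_PATTERNS = {
    ("L",): "BACKSPACE",        # a single leftward flick
    ("D", "L"): "RETURN",       # down then left, like a carriage-return arrow
}
```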
Based on the movement pattern detected by the detection unit 120 within the input area notified from the area determination unit 130, the pattern recognition unit 160 reads from the pattern storage unit 150 the character information or command information corresponding to that movement pattern, thereby recognizing it. The pattern recognition unit 160 may be capable of recognizing not only one-stroke patterns but also general character information such as kana. The pattern recognition unit 160 outputs the recognized character information or command information to the control unit 140.
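Continuing the sketch, recognition then reduces to quantizing the detected movement and looking it up in the tables above; the helper names are hypothetical:

```python
def quantize(points):
    """Reduce a stroke (a list of (x, y) samples) to a coarse direction sequence."""
    dirs = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if dx == 0 and dy == 0:
            continue                          # skip stationary samples
        if abs(dx) >= abs(dy):
            step = "R" if dx > 0 else "L"
        else:
            step = "D" if dy > 0 else "U"     # screen y grows downward
        if not dirs or dirs[-1] != step:
            dirs.append(step)                 # collapse runs of one direction
    return tuple(dirs)

def recognize(points):
    """Rough analogue of pattern recognition unit 160: map a detected movement
    pattern to character information or command information, if any."""
    key = quantize(points)
    if key in CHARACTER_PATTERNS:
        return "CHAR", CHARACTER_PATTERNS[key]
    if key in COMMAND_PATTERNS:
        return "COMMAND", COMMAND_PATTERNS[key]
    return None, None
```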
An information processing method in the information processing apparatus 100 shown in FIG. 1 is described below.
FIG. 3 is a flowchart for explaining an information processing method in the information processing apparatus 100 shown in FIG. 1.
First, when an application having a character input function is started in step 1, the control unit 140 enters, in step 2, a state of waiting for the detection unit 120 to detect contact or approach (waiting for a tap event). Examples of using an application having this character input function include filling in an input form using a browser and filling in the input field of a notepad application; the following description uses these as examples.
Thereafter, when the detection unit 120 detects contact or approach, the control unit 140 determines, in step 3, whether the current operation mode is the character input mode.
If it is not the character input mode, the control unit 140 changes the operation mode to the character input mode in step 4.
Then, in step 5, the control unit 140 updates the screen displayed by the display unit 110, and the process of step 2 is performed again.
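The loop of steps 2 to 5, together with the hand-off to steps 6 to 9 described below, might be sketched as follows. This is a simplification under assumed names: `app` stands in for the application state, `recognize` is the sketch above, and `determine_input_area` is sketched after FIG. 8; unlike the patent, which fixes the input area from the first handwritten character, this sketch recomputes it on every stroke:

```python
def on_tap_event(app, stroke):
    """Sketch of the dispatch in FIG. 3; 'stroke' is the list of (x, y)
    positions reported by the detection unit for one contact or approach."""
    if not app.char_input_mode:        # step 3: in character input mode already?
        app.char_input_mode = True     # step 4: switch the operation mode
        app.redraw_screen()            # step 5: update the displayed screen
        return                         # back to step 2: wait for the next event
    app.input_area = determine_input_area(stroke)  # step 6 (see FIG. 8 sketch)
    kind, value = recognize(stroke)                # step 7: pattern recognition
    if kind == "CHAR":                             # step 8: character or command?
        app.insert_character(value)                # input the character
    elif kind == "COMMAND":                        # step 9: command processing,
        app.execute_command(value)                 # e.g. return/space/backspace
    app.redraw_screen()                            # step 5 again, then step 2
```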
FIG. 4 is a diagram showing an example of a screen displayed on the display unit 110 when the operation mode is not the character input mode.
As shown in FIG. 4, a predetermined input form is displayed on the display unit 110. Here, for example, when the user touches or approaches with a finger or the like the cursor displayed in the lower-right area of the input form shown in FIG. 4 (the square filled in black in FIG. 4), which marks the position where the next input character will appear, and the detection unit 120 detects that contact or approach, the control unit 140 changes the operation mode to the character input mode.
FIG. 5 is a diagram showing another example of a screen displayed on the display unit 110 when the operation mode is not the character input mode.
As shown in FIG. 5, when an application such as a notepad is running, a predetermined notepad screen is displayed on the display unit 110. Here, for example, when the user touches or approaches with a finger or the like the position in the notepad where a character is to be entered (the position following the already-entered text 「ああああ」), and the detection unit 120 detects that contact or approach, the control unit 140 changes the operation mode to the character input mode.
FIG. 6 is a diagram showing an example of the screen displayed on the display unit 110 after the processing of steps 2 to 5 has been performed while the screen shown in FIG. 5 is displayed on the display unit 110 shown in FIG. 1.
As shown in FIG. 6, a character input mode screen is displayed on the display unit 110.
The entire display unit 110 may be made available for character input. Alternatively, as shown in FIG. 6, strips of a predetermined width extending inward from each of the four sides of the display unit 110 (the hatched portions in FIG. 6) may be reserved for moving the cursor and excluded from character input. In that case, when the detection unit 120 detects contact or approach in the upper hatched strip of the display unit 110, the cursor moves upward; in the lower hatched strip, the cursor moves downward; in the right hatched strip, the cursor moves to the right; and in the left hatched strip, the cursor moves to the left.
Further, when the detection unit 120 detects contact or approach at any of the four corners of the display unit 110, within the hatched strips, the character input mode ends.
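This partition of the screen can be pictured as a hit test, sketched below; the strip width `EDGE` is an assumed constant, not a value given in the patent:

```python
EDGE = 40  # hypothetical width, in pixels, of the hatched strips of FIG. 6

def classify_touch(x, y, width, height):
    """Map a touch position to the behavior described for FIG. 6."""
    top, bottom = y < EDGE, y > height - EDGE
    left, right = x < EDGE, x > width - EDGE
    if (top or bottom) and (left or right):
        return "END_INPUT_MODE"    # any of the four corners ends the mode
    if top:
        return "CURSOR_UP"
    if bottom:
        return "CURSOR_DOWN"
    if right:
        return "CURSOR_RIGHT"
    if left:
        return "CURSOR_LEFT"
    return "HANDWRITING"           # interior: character (handwriting) input
```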
At this time, an indication that the apparatus is in the character input mode (that character input is in progress) may be displayed somewhere on the display screen of the display unit 110.
When the detection unit 120 detects contact or approach in any other part of the screen, a character is input based on the movement pattern of that contact or approach, through the processing described later.
Note that the hatching shown in FIG. 6 is preferably not displayed on the actual display unit 110.
On the other hand, if the operation mode is already the character input mode in step 3, the area determination unit 130 determines, in step 6, the input area for handwriting input based on the contact or approach position detected by the detection unit 120 in step 2.
FIG. 7 is a diagram showing an example of a state in which the user touches the display surface of the display unit 110 shown in FIG. 1 with a finger.
When, for the first time in the character input mode, the finger is moved while in contact with (or in proximity to) the display surface of the display unit 110 as indicated by the arrow in FIG. 7, the area determination unit 130 determines the input area based on this movement pattern. In other words, the area determination unit 130 determines the input area based on the position (including the size) of the first character input by handwriting in the character input mode.
FIG. 8 is a diagram showing the input area as coordinates.
As shown in FIG. 8, the area determination unit 130 determines as the input area the portion bounded by the coordinates (x1, y1), (x1, y2), (x2, y1), and (x2, y2) (the portion enclosed by the broken line in FIG. 8) that contains the positions of the movement for which the detection unit 120 first detected contact or approach in the character input mode.
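One way to realize this determination is as a padded bounding box over the sample points of the first stroke; this is the `determine_input_area` helper referenced in the earlier dispatch sketch, and the `margin` is an assumption:

```python
def determine_input_area(stroke, margin=10):
    """Rough analogue of area determination unit 130: enclose the first
    handwritten stroke in the box (x1, y1)-(x2, y2) of FIG. 8."""
    xs = [x for x, _ in stroke]
    ys = [y for _, y in stroke]
    x1, y1 = min(xs) - margin, min(ys) - margin
    x2, y2 = max(xs) + margin, max(ys) + margin
    return x1, y1, x2, y2  # later strokes are interpreted inside this box
```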
FIG. 9 is a diagram showing an example of the input area determined by the area determination unit 130 when the detection unit 120 shown in FIG. 1 detects contact or approach at the lower right of the display unit 110.
As shown in FIG. 9, when the detection unit 120 first detects contact or approach at the lower right of the display unit 110 in the character input mode, the area determination unit 130 determines the portion enclosed by the broken line in FIG. 9 as the input area.
FIG. 10 is a diagram showing an example of the input area determined by the area determination unit 130 when the detection unit 120 shown in FIG. 1 detects contact or approach over the whole of the display unit 110.
As shown in FIG. 10, when the detection unit 120 first detects contact or approach over the whole of the display unit 110 in the character input mode, the area determination unit 130 determines the portion enclosed by the broken line in FIG. 10 as the input area.
In this way, the area determination unit 130 determines the input area according to the position at which the detection unit 120 detected contact with or approach to the display unit 110.
Note that in FIGS. 7, 9, and 10 the trajectory of the touching (approaching) finger is indicated by a solid line with an arrow; this is shown only for convenience of explanation and does not appear on the actual display unit 110. In other words, the display unit 110 displays the same screen immediately before the detection unit 120 detects contact or approach and while the detection unit 120 is detecting that contact or approach.
When the area determination unit 130 has determined the input area, the pattern recognition unit 160 performs pattern recognition within that input area in step 7. That is, based on the movement pattern of the contact or approach detected by the detection unit 120 within the input area determined by the area determination unit 130, the pattern recognition unit 160 reads character information or command information from the pattern storage unit 150 and thereby performs pattern recognition.
For example, when the associations shown in FIG. 2 are stored in the pattern storage unit 150 and the movement pattern of the contact or approach detected by the detection unit 120 is the movement pattern shown in FIG. 7, the letter "A" is read from the pattern storage unit 150. The pattern recognition unit 160 thus recognizes that the movement pattern of the contact or approach detected by the detection unit 120 represents "A".
The pattern recognition unit 160 then outputs the recognized character information or command information to the control unit 140.
Subsequently, in step 8, the control unit 140 determines whether the information output from the pattern recognition unit 160 is character information or command information.
If the information output from the pattern recognition unit 160 is character information, the control unit 140 inputs that character to the information processing apparatus 100 and, in step 5, updates the display screen of the display unit 110 by displaying the input character on the display unit 110.
Here, when the information output from the pattern recognition unit 160 is character information, ordinary character conversion may be applied to the character. That is, the information processing apparatus 100 may include a language dictionary (not shown) and a language conversion unit (not shown), and the language conversion unit may read conversion candidates from the language dictionary based on the character and perform language conversion (for example, conversion from kana to kanji), as sketched below.
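A minimal sketch of such a conversion step, assuming a dictionary keyed by kana readings; the dictionary entries are illustrative only:

```python
# Hypothetical language dictionary mapping a kana reading to kanji candidates.
LANGUAGE_DICT = {
    "にほん": ["日本", "二本"],
    "きょう": ["今日", "京"],
}

def conversion_candidates(kana: str) -> list:
    """Sketch of the optional language conversion unit: list conversion
    candidates for a recognized kana string; the caller lets the user pick."""
    return LANGUAGE_DICT.get(kana, [kana])  # fall back to the kana itself
```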
When the display unit 110 displays an input character, the character may be converted into a predetermined display font before being displayed, or the recognized pattern may be displayed as it is. The size of the displayed characters may also be adjusted according to the running application.
On the other hand, if the information output from the pattern recognition unit 160 is command information, the control unit 140 performs, in step 9, command processing based on that command information (for example, input of a return, a space, or a backspace). Then, in step 5, the display unit 110 updates its display according to that processing, and the process of step 2 is performed again.
In this way, because the area in which characters can be input is determined from the position at which the detection unit 120 detects contact or approach, handwriting input of characters and the like can be performed without concern for the position or size of the input area. This also has the advantage that the input area can be set freely each time handwriting input is performed.
Furthermore, because the display unit 110 does not display the trajectory of the contact or approach detected by the detection unit 120, the screen displayed in the background does not become difficult to see during handwriting input.
The processing performed by each component of the information processing apparatus 100 described above may be performed by logic circuits fabricated for the respective purposes. Alternatively, a computer program (hereinafter referred to as a program) describing the processing contents as procedures may be recorded on a recording medium readable by the information processing apparatus 100, and the program recorded on the recording medium may be read into and executed by the information processing apparatus 100. Recording media readable by the information processing apparatus 100 include removable recording media such as floppy (registered trademark) disks, magneto-optical disks, DVDs, and CDs, as well as memories such as ROM and RAM and HDDs built into the information processing apparatus 100. The program recorded on the recording medium is read by the control unit 140 of the information processing apparatus 100, and processing equivalent to that described above is performed under the control of the control unit 140. Here, the control unit 140 operates as a computer that executes the program read from the recording medium on which the program is recorded.
While the present invention has been described above with reference to an embodiment, the present invention is not limited to that embodiment. Various changes that can be understood by those skilled in the art may be made to the configuration and details of the present invention within the scope of the present invention.
This application claims priority based on Japanese Patent Application No. 2011-269955 filed on December 9, 2011, the entire disclosure of which is incorporated herein.

Claims (8)

  1.  An information processing apparatus comprising:
      a display unit;
      a detection unit that detects contact with or approach to a display surface of the display unit;
      an area determination unit that determines, based on a detection result of the detection unit, an input area for inputting information within the display surface of the display unit; and
      a control unit that inputs the information to the information processing apparatus based on the contact or approach detected by the detection unit within the input area determined by the area determination unit.
  2.  The information processing apparatus according to claim 1, wherein the display unit displays the same screen immediately before the detection unit detects contact or approach and while the detection unit is detecting the contact or approach.
  3.  The information processing apparatus according to claim 1 or 2, further comprising:
      a pattern storage unit that stores associations between character information and pattern information indicating movement patterns, on the display surface, of the contact or approach detected by the detection unit; and
      a pattern recognition unit that reads the character information from the pattern storage unit based on the movement pattern detected by the detection unit within the input area determined by the area determination unit,
      wherein the control unit inputs the character information read by the pattern recognition unit.
  4.  The information processing apparatus according to claim 3, wherein the pattern storage unit stores, as the pattern information, the movement pattern of the contact or approach on the display surface from when the detection unit detects the contact or approach until it no longer detects the contact or approach.
  5.  The information processing apparatus according to claim 3 or 4, wherein:
      the pattern storage unit further stores associations between the pattern information and command information for controlling the input;
      the pattern recognition unit reads the command information from the pattern storage unit based on the movement pattern detected by the detection unit within the input area determined by the area determination unit; and
      the control unit controls the input based on the command information read by the pattern recognition unit.
  6.  The information processing apparatus according to any one of claims 1 to 5, wherein the display unit displays the character information input by the control unit.
  7.  An information processing method performed by an information processing apparatus including a display unit, the method comprising:
      a process of detecting contact with or approach to a display surface of the display unit;
      a process of determining, based on a result of the detection, an input area for inputting information within the display surface of the display unit; and
      a process of inputting the information to the information processing apparatus based on the contact or approach detected within the determined input area.
  8.  A program for causing an information processing apparatus including a display unit to execute:
      a procedure of detecting contact with or approach to a display surface of the display unit;
      a procedure of determining, based on a result of the detection, an input area for inputting information within the display surface of the display unit; and
      a procedure of inputting the information to the information processing apparatus based on the contact or approach detected within the determined input area.
PCT/JP2012/079716 2011-12-09 2012-11-16 Information processing device WO2013084687A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-269955 2011-12-09
JP2011269955 2011-12-09

Publications (1)

Publication Number Publication Date
WO2013084687A1 (en) 2013-06-13

Family

ID=48574065

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/079716 WO2013084687A1 (en) 2011-12-09 2012-11-16 Information processing device

Country Status (2)

Country Link
JP (1) JPWO2013084687A1 (en)
WO (1) WO2013084687A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03263281A (en) * 1990-03-14 1991-11-22 Oki Electric Ind Co Ltd Handwritten line graphic input device
JPH06266886A (en) * 1993-03-15 1994-09-22 Toshiba Corp Character recognizing processor for hand-written character inputting device and its method
JP2003178259A (en) * 2001-09-19 2003-06-27 Ricoh Co Ltd Information processor, control method for information processor and program for executing this method by computer

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016157303A (en) * 2015-02-25 2016-09-01 京セラ株式会社 Electronic apparatus

Also Published As

Publication number Publication date
JPWO2013084687A1 (en) 2015-04-27

Similar Documents

Publication Publication Date Title
JP4560062B2 (en) Handwriting determination apparatus, method, and program
JP5350437B2 (en) Touch sensor system
US9569106B2 (en) Information processing apparatus, information processing method and computer program
JP6902234B2 (en) Methods for inserting characters into strings and corresponding digital devices
US20110320978A1 (en) Method and apparatus for touchscreen gesture recognition overlay
JP5947887B2 (en) Display control device, control program, and display device control method
KR102402397B1 (en) Systems and Methods for Multi-Input Management
US20140123049A1 (en) Keyboard with gesture-redundant keys removed
JP5951886B2 (en) Electronic device and input method
US20090322692A1 (en) Character input apparatus and character input method
US10416868B2 (en) Method and system for character insertion in a character string
JP5550598B2 (en) Handwritten character input device
KR20080029028A (en) Character input method of terminal with touch screen
WO2017186350A1 (en) System and method for editing input management
JP2014056389A (en) Character recognition device, character recognition method and program
US9811238B2 (en) Methods and systems for interacting with a digital marking surface
JP2000137571A (en) Handwriting input device and recording medium recording handwriting input processing program
US20100245266A1 (en) Handwriting processing apparatus, computer program product, and method
WO2013084687A1 (en) Information processing device
TW201423563A (en) Apparatus and method for processing handwriting input
JP6699521B2 (en) Input device
JP6032982B2 (en) Input support device, input support method, input support program, and recording medium
JP6455856B2 (en) Handwritten character input device and computer program
JP2016200896A (en) Character input method, device and program
KR20150016854A (en) The simple method of correction of the typograghical error while text input at electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 12855810; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2013548165; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 12855810; Country of ref document: EP; Kind code of ref document: A1)